CompaRe | Allgemeine und Vergleichende Literaturwissenschaft
Literature, measured
(2016)
There comes a moment, in digital humanities talks, when someone raises a hand and says: "Ok. Interesting. But is it really new?" Good question... And let's leave aside the obvious lines of defense, such as "but the field is still only at its beginning!", or "and traditional literary criticism, is that always new?" All true, and all irrelevant; because the digital humanities have presented themselves as a radical break with the past, and must therefore produce evidence of such a break. And the evidence, let's be frank, is not strong. What there is, moreover, comes in a variety of forms, beginning with the slightly paradoxical fact that, in a new approach, not everything has to be new. When "Network Theory, Plot Analysis" pointed out, in passing, that a network of Hamlet had Hamlet at its center, the New York Times gleefully mentioned the passage as an unmistakable sign of stupidity. Maybe; but the point, of course, was not to present Hamlet's centrality as a surprise; it was exactly the opposite: had the new approach not found Hamlet at the center of the play, its plausibility would have disintegrated. Before using network theory for dramatic analysis, I had to test it, and prove that it corroborated the main results of previous research.
The main principle of holism – "the whole is more than the sum of its parts" – can be traced back to ancient philosophical studies. Although the term itself was coined by Jan Christiaan Smuts in 1926, the earliest formulations can already be found in Taoism, in the philosophy of Lao Tzu, as well as in Aristotle's 'Metaphysics'. However, a complete and profound sense of the principle has only been revealed in such theories as Gestalt psychology (Kurt Koffka, Max Wertheimer and others), the general systems theory (Ludwig von Bertalanffy), and the theory of complexity (synergetics) as formulated by the Moscow school of synergetics (Sergey Pavlovich Kurdyumov), to name just a few. Thinking in this direction, from the whole to the parts (subsystems), is quite unusual for classical science, which, in its course of analysis, usually moves from distinct parts to the whole. In synergetics, according to Hermann Haken, order parameters determine how parts (subsystems) of complex systems behave. A select few order parameters, as Haken says, encompass the complex behavior of diverse parts and, therefore, lead to an enormous reduction of complexity in the description of a given system.
The present paper aims to elucidate the conceptual structure of the aesthetics of literature. Following Fechner's "aesthetics from below" (1876) and adopting a method introduced by Jacobsen, Buchta, Kohler, and Schroeger (2004), we asked 1544 German-speaking research participants to list adjectives that they use to label aesthetic dimensions of literature in general and of individual literary forms and genres in particular (novels, short stories, poems, plays, comedies). According to our analyses of frequency, mean list rank, and the Cognitive Salience Index, beautiful and suspenseful rank highest across all target categories. For plays/comedies, funny and sad turned out to be the most relevant terms; for novels and short stories, suspenseful, interesting, and romantic; and for poetry, romantic, along with the music-related terms harmonious, rhythmic, and melodious. A comparison of our results with analogous studies for visual aesthetics and music yielded a comprehensive map of the distribution of aesthetic appeal dimensions across sensory modalities and aesthetic domains, with poetry and music showing the greatest overlap.
The Emotions of London
(2016)
A few years ago, a group formed by Ben Allen, Cameron Blevins, Ryan Heuser, and Matt Jockers decided to use topic modeling to extract geographical information from nineteenth-century novels. Though the study was eventually abandoned, it had revealed that London-related topics had become significantly more frequent in the course of the century, and when some of us were later asked to design a crowd-sourcing experiment, we decided to add a further dimension to those early findings, and see whether London place-names could become the cornerstone for an emotional geography of the city.
If reductionism and a search for deterministic, predictive 'laws' of nature represented the dominant research strategy – and world view – of the scientific community during the 20th century, 'emergence' has become a major theme, if not the dominant approach in the 21st century, reflecting a major shift of focus toward the study of complexity and complex systems. However, this important 'climate change' in the scientific enterprise has been accompanied by much confusion and debate about what exactly emergence is. How do you know it when you see it? Or don't see it? What are its defining properties? Is it possible to predict emergence? And is there more to emergence than meets the eye? Beyond these meta-theoretical issues, there is a deep question that is often skirted, or even ignored. How do we explain emergence? Why does emergence emerge? Here, I will briefly recount the history of this important concept and will address some of the many questions that surround it. I will also consider the distinction between reductionist and holistic approaches to the subject, as well as the distinction between epistemological and ontological emergence (that is, the ability to deduce or predict emergence versus the concrete reality of an emergent phenomenon). I will argue that living systems are irreducibly emergent in both senses and that biological evolution has quintessentially been a creative emergent process that is fully consistent with modern (Darwinian) evolutionary theory. Furthermore, as I will explain, novel 'synergies' of various kinds have been responsible for the 'progressive' evolution of more complex living systems over time. The selective advantages associated with emergent, synergistic effects have played a major causal role in the evolutionary process.
'Synergetics', a fascinating interdisciplinary science initially proposed by Hermann Haken in the late 1960s, is a framework for understanding the interaction effects of very large complex systems, with an emphasis on explaining how self-organized macroscopic phenomena can emerge as a result of these underlying interactions. An especially exciting aspect is that entirely new and distinct properties of the system can emerge somewhat spontaneously. The approach has seen great success in a host of fields ranging from physics and chemistry to brain science and economics.
In her article, Karin Littau proposes a material or medial turn in the humanities and social sciences to end the neglect of the material basis to every act of communication, including translation. This proposal is warmly welcomed. As a comparatist who has for some time been trying to build bridges between literary studies and book history, I strongly support Littau's point of view – all the more since I am less optimistic regarding the general acceptance of such ideas in the humanities, and especially in literary and translation studies. I am not so sure that McLuhan and the other authorities for the importance of mediality and technicity whom Littau quotes (e.g. Kittler, Ong, and Gumbrecht) have really provoked a "crisis in the self-understanding of the human sciences". For brevity's sake, in my response below, I leave aside literary studies to focus on translation studies.
Of the novelties introduced by digitization in the study of literature, the size of the archive is probably the most dramatic: we used to work on a couple of hundred nineteenth-century novels, and now we can analyze thousands of them, tens of thousands, tomorrow hundreds of thousands. It's a moment of euphoria, for quantitative literary history: like having a telescope that makes you see entirely new galaxies. And it's a moment of truth: so, have the digital skies revealed anything that changes our knowledge of literature? This is not a rhetorical question. In the famous 1958 essay in which he hailed "the advent of a quantitative history" that would "break with the traditional form of nineteenth-century history", Fernand Braudel mentioned as its typical materials "demographic progressions, the movement of wages, the variations in interest rates [...] productivity [...] money supply and demand." These were all quantifiable entities, clearly enough; but they were also completely new objects compared to the study of legislation, military campaigns, political cabinets, diplomacy, and so on. It was this double shift that changed the practice of history; not quantification alone. In our case, though, there is no shift in materials: we may end up studying 200,000 novels instead of 200; but they're all still novels. Where exactly is the novelty?