Modern. When new terms enter the contemporary lexicon, it’s natural to find them kind of annoying at first. (I remember when ‘selfie’ was new and now I have no shame in using it, or taking them.) I’ve been annoyed with two in particular: 1) ‘self-care’; 2) ‘mindfulness.’* Yes, part of the annoyance is that when new catchwords arise they seem to be everywhere; the combination of their novelty and their ubiquity is probably what really rubs me the wrong way. But also: a new term needs time and repeated use to develop meaning, and that process of negotiation can reveal subtleties which complicate the word’s original intent. And there are some reasons to be distrustful of ‘self-care.’ The whole point, as I understand it, of self-care as an idea is that you disrupt your work habits (which you’ve feverishly developed in order to become, or remain, gainfully employed in the unstable economic landscape of 2020) in order to spend some time doing things that you actually like, which make you feel rejuvenated, and return to you some of your innate creative abilities. Okay! Well, obviously, the first issue is that time to yourself – the time to contemplate, to ‘do nothing’ (see: Jenny Odell) – should, by all rights, be a bigger part of our lives in the first place. Secondly, self-care is often pitched not as a rebellion against the commodification and infestation of our private lives, but rather as its tool; i.e. self-care is supposed to rejuvenate us so that we can get back to work. At the “Facing Race” conference (Nov. 2016), Roxane Gay put it well (paraphrase via live-tweet): “I can’t stand the phrase self-care. It’s what women can do, but don’t do. We do it so we can work, but what comes next?”
Lastly, self-care is regularly figured as a consumerist activity; you should try searching “self-care face mask” on twitter. Self-care as the deliberate derailing of learned habits of overwork is itself a good thing, I think. But it’s hard to practice. And as a result, self-care has entered the zeitgeist as something quite frivolous, a superficial manifestation of something that is mostly invisible; a negotiation with yourself, and your self-perception. Likewise, ‘mindfulness.’ The point of this, again, as I understand it, is to consciously pay attention to what is happening in the very moment; including, if not particularly, your own internal, emotional landscape. To put it oversimplistically, we only really have what we are experiencing right now. Sure: we have indications of the future; and we have records of the past. But we are experiencing the present. Mindfulness as a practice is intended to remind us of this, and to encourage us to engage in the present fully, and to perceive its granularities. And to give us the ability to understand when we are being drawn into behaviours which are not totally within our control.
When it comes down to it, I love twitter. Over the years it has brought me community and a sense of belonging in a field that is often quite severe towards its members. I like its pluralism; I thank it for giving me more perspectives on certain issues. I think it can be empowering. In Classics, it’s where a lot of the social justice work starts. And because my personal life is deeply intertwined with my professional life, it has also been good for my work. I never want to write a screed against its use, and indeed, despite its documented toxicities, I still find myself encouraging people to use it so that they can get their work out into the world. But for all its functionalities, I don’t always like how I feel when I use it. I don’t like mindlessly scrolling; and I don’t like the possibility that at any given moment of casual scrolling, I can be made to feel all sorts of negative emotions that were not there seconds ago (and twitter privileges emotionally volatile content). It’s a turbulence which I volunteer for, but I don’t have to. I don’t have to participate in the parts that are engineering me.
I don’t want to leave twitter. I took a hiatus last summer to work on my book, and I hated it. As much as I want to have time that is my own, I also want to engage with the internet soul. So, here’s what I’ve been thinking. Snail mail (physical letters! some things, they tell me, still exist in “material reality”, whatever that means) only arrives once a day. You check it, and then you know what you’ve got, and there won’t be another thing to check till tomorrow. You get on with your day. But twitter (and email, don’t get me started) can come for you whenever you open that app. Sometimes, I think about social media in terms of the functionality of Stardew Valley. Long story short, this is a very charming, and calming, farm simulator, which operates on a calendar with days and seasons. Every morning when you wake up in game, the fruits and vegetables whose seeds you had planted previously have produced new growth, which you can harvest. But this harvesting should only take up a little part of the day. After which, you can explore the world, talk to the characters, maybe go fishing or mining.
Yes, it’s a farming simulator, but even this game understands there’s more to life than your occupation! I want to treat social media and work emails like this. Harvest (i.e. open, and deal with?) once or twice a day. What I’m doing right now is letting every twitter or email notification take my attention whenever it sends me something, and this is the equivalent of virtually sitting in my field and staring at my crops until they tell me I can harvest them. Actually, the more I think about it, video games in general have a built-in mindfulness which reality sometimes does not. You, the protagonist, receive missions, but you choose in which order, when, or even if you want to do them. You can dissent from tasks given to you, you can (usually) take your sweet time and indulge in as many side quests as you want. We can learn something from this. There’s an intentionality which we often (or at least I do, I’ll speak for myself) willingly give up. But you can always get it back.
* ‘Self-care’ as a term actually appears with the meaning ‘self-interest’ as early as the 16th c., when it was used in the English poet George Turberville’s translation of Ovid’s Heroides (specifically: 19.205). ‘Mindfulness’ too has a long history, appearing in English as “the quality or state of being conscious or aware of something; attention” in the 16th c. (see Oxford English Dictionary). These terms are ‘new’ to the extent that they have reappeared in the context of a specific socio-cultural moment, in which the modern human life is structured according to 21st c. philosophies of productivity.
- Katherine Blouin: papyrology as parenting.
- Sane and incisive analysis of the boring Oscar nominations on this week’s Keep It.
- Stephanie Pepper: Geralt vs. Geralt.
Excerpt. Hayden White 2010*: 114: “The kind of understanding we get from following his story is different from the kind of understanding we might get from following his arguments. We can dissent from the argument while assenting, in such a way as to increase our comprehension of the facts, to the story itself.”
*repr. of “The Structure of Historical Narrative” (1972)
Daily Life. I recently fell in love with cycling again because of Boston’s city bikes. It’s good stuff.
Ancient. Last weekend was the annual meeting of the Society for Classical Studies. Since I was still back in the UK with my family over the New Year, I missed most of it, but I was there for the last day to take part in the panel commemorating Prof. Tom Habinek, who sadly died last year. Tom was my PhD advisor, and a major influence in the field of Roman Studies. The event was very poignant, but fitting. On Sunday evening I posted my part of the panel, which you can read here: “Tom Habinek on ‘generativity’.”
Modern. In an essay originally published in 1971, “The Culture of Criticism”, Hayden White describes the frustrations of Ernst Gombrich, Erich Auerbach, and Karl Popper (respectively: art historian, philologist and literary critic, philosopher of science) with the avant-gardists as typified by, for example, the abstract expressionist, Jackson Pollock. Each of these scholars held an attachment to realism; in some cases considering realism, in historiography and art alike, to be a means of resisting authoritarianism, with its power to overwrite the experience of reality by means of ideology. White (2010*: 105) writes that for these critics, historical, literary, or artistic realism, i.e. an attempt to represent reality as it actually is or was, “results from the controlled interplay of human consciousness with a shifting social and natural milieu.” Given that realism is supposed to reflect the human perception of reality, the avant-garde is taken by these critics to be a frustration of perception rather than a refinement of it. More than that, this break with tradition is a challenge to perception itself. White writes (2010: 107):
“The surfaces of the external world, so laboriously charted over the last three thousand years, suddenly explode; perception loses its power as a restraint on imagination; the fictive sense dissolves — and modern man teeters on the verge of the abyss of subjective longing, which, Auerbach implies, must lead him finally to an enslavement once more by myth.”
(The fear of “myth” — figured as an antitype to so-called “rationality” in tandem with “realism” — has probably produced a number of negative results itself.) By the end of this essay, White (2010: 108-110) points to one of the real comforts of realism, one which lies in its hierarchical nature. Realistic art or narrative reflects a grammatically syntactical worldview, i.e. a mode of composition which privileges certain ideas over others, and arranges information around that privilege; whereas artefacts of the avant-garde might be interpreted as paratactical — presenting discrete elements “side by side” (= παρά) in a “democracy of lateral coexistence” (2010: 109).
In Washington DC last weekend, I found myself face-to-face with Hans Hofmann’s Oceanic (1958) in the Hirshhorn Museum. I was really struck by the large heaps of paint in certain parts of the work, which I have now affectionately come to call “globs.” It feels appropriate!
Inspired by that visit, when I returned to Boston I wanted to go and look closely at more oil paintings in the MFA. Last night we got up close with some more excellent globs from Lee Krasner (Sunspots, 1963) and Joan Mitchell (Chamonix, c. 1962):
Digitization is vital, and I depend on it for my teaching and my scholarship, and I would never want digital resources to be taken away from me. But there is pretty much nothing like looking a glob straight in the eye, if you get the chance to. You can get a general sense of texture from a photograph. But the glob is just so noticeable IRL. Krasner applied oils straight from the tube onto the canvas for Sunspots, and you can tell. Looking at that painting tells the story of its making. As for Mitchell’s Chamonix, you can see the movement of her body in its wide, energetic strokes. Each is a record of embodiment, one which figurative, narrative, and supposedly veristic accounts tend to leave invisible. Back to Hayden White (2010: 110) one last time:
“The avant-garde insists on a transformation of social and cultural practice that will not end in the substitution of a new elite for an old one, a new protocol of domination for the earlier ones, nor the institution of new privileged positions for old ones — whether of privileged positions in space (as in the old perspectival painting and sculpture), of privileged moments in time (as one finds in the older narrative art of fiction and conventional historiography), of privileged places in society, in privileged areas in the consciousness (as in the conservative, that is to say, orthodox Freudian psychoanalytic theory), of privileged parts of the body (as the genitally organized sexual lore insists is ‘natural’), or of privileged positions in culture (on the basis of presumed superior ‘taste’) or in politics (on the basis of a presumed superior ‘wisdom’).”
* “The Culture of Criticism” (1971) is reprinted in The Fiction of Narrative: Essays on History, Literature, and Theory (2010), edited by Robert Doran.
- worry lines: rolling into the new year.
- some people do like the Witcher.
- Erika Lee Sears: resolutions.
Excerpt. Kurt Vonnegut Jr. 1987: 44: “I thought about myself and art: that I could catch the likeness of anything I could see — with patience and the best instruments and materials. I had, after all, been an able apprentice under the most meticulous illustrator of this century, Dan Gregory. But cameras could do what he had done and what I could do. And I knew that it was this same thought which had sent Impressionists and Cubists and the Dadaists and the Surrealists and so on in their quite successful efforts to make good pictures which cameras and people like Dan Gregory could not duplicate.”
Daily Life. We spent New Year’s Eve walking along the shore at Troon.
On 5th January 2020 I took part in a commemoration of Tom Habinek at the SCS organized by James Ker, Andrew Feldherr, and Enrica Sciarrino; with Basil Dufallo, Zsuzsanna Várhelyi, Scott Lepisto, Enrica Sciarrino, and myself as panelists. Through the generosity of Hector Reyes, we were able to read Tom’s (incomplete) book manuscript on the topic of personhood and authorship. Here’s the text of my contribution to the workshop, in case of interest. Enormous thanks to everyone involved and everyone who came to the panel.
It is my task today to speak on the concept of generativity as discussed in Tom’s manuscript. When I think of Tom’s work and the influence he had on students like me, it is, indeed, particularly his theorization of generativity which I feel to have been the most impactful. In earlier works, Tom’s interest in generativity manifested in his study of social generation via cultural production and reproduction, with a focus on how ritual acts instantiated Roman community. In a key passage of The World of Roman Song (2005, p129), Tom cited the work of the anthropologist Paul Connerton, who, in How Societies Remember (1989, p62), discussed Thomas Mann’s understanding of the Freudian ego:
“We are to envisage the ego, less sharply defined and less exclusive than we commonly conceive of it, as being so to speak ‘open behind’: open to the resources of myth which are to be understood as existing for the individual not just as a grid of categories, but as a set of possibilities which can become subjective, which can be lived consciously. In this archaising attitude the life of the individual is consciously lived as a ‘sacred repetition’, as the explicit reanimation of prototypes.”
The “explicit reanimation of prototypes” is how Tom understood Roman self-construction: the invocation of ancient exemplars; the continuous citation and reinscription of Roman ancestral memory; rituals which resubstantiated the dead in the bodies of the living. Roman literary and political history demonstrates clearly that the Romans were interested in how their culture generated and regenerated itself; how the present day related to the past and preserved a tensile balance between new iterations of Roman youth, and their ancestral blueprints. All we need think of is the late Republican Brutus contemplating his ancestor, the expeller of kings; or perhaps Cicero in the Pro Caelio raising the ancient, blind Appius Claudius from the dead to speak with Cicero’s own lips and chastise Cicero’s own enemies.
In his latest work, Tom approached the question of Roman generativity from some new perspectives. In his search for an understanding of Roman personhood, he figured the Roman persona as an active process, not a passive state; I think that for Tom persona was not a noun, but a verb. “Personifying” is a practice — it is an action, it is alive. Starting from this position, Tom was able to see different kinds of ancient evidence not as discrete, disconnected elements of Roman intellectual systems, but rather mutually supportive organs of an organic, synthetic whole. Tom’s work instantiates a theorization of human culture which does not merely render literature, or law, or art objects into cynical, insensate records of elites and auteurs. His organic approach reveals that the ancient artefact is an expression and mirror of biological as well as cultural forms. To put it another way, without really knowing that they are doing so, humans make things which reflect their insides. Tom’s work makes you realize that when you read a Latin text, that text is actively trying to constitute you into a Roman reader — like a 3D printer with instructions to produce a piece of plastic in a specific way, the scientific, ethical, political scripts of the Roman text try to make us.
With this, or something like this, in his mind, Tom in this latest work proceeded to examine generativity in a number of different types of ancient evidence, ranging from the practices of Roman bride dowries to the emergence of birthday celebration as a theme in Latin love elegy. Underneath each artefact, Tom found a consistent preoccupation in the Roman attitude to cultural and biological reproduction which expressed a profound anxiety, one which can be conveyed in the form of a simple question. Will we continue to survive?
Romans expressed this anxiety in different ways, figuring reproduction, with its insistence upon a continuity of resources, as a matter of long-term survival. The fact that Roman bride dowries were reabsorbed into the natal family to allow women to marry again and to have children is, Tom suggests, an intentional defense mechanism against the failure to reproduce. As a result, generativity in Roman thought relates not only to explicit, biological reproduction (i.e. producing children), but also to making provision for a self-sustaining reabsorption of assets as part of a framework which allows such reproduction to take place. At its core, this legal provision expresses a care to conserve not just culture or biology but energy; like keeping a little something left in the storeroom in case of an unexpected hunger. Cast in this light, Roman conservatism, which is so frustratingly obvious and, frankly, obtuse sometimes (just think of Cato the Elder), seems to be not simply fanatical traditionalism, but indeed a form of conservationism.
It is an impulse to conserve that Tom saw in the Roman discourse around luxuria. The chastising of luxuria is not simply, Tom suggests, a knee-jerk political reaction against perceived excesses and hedonism, but rather a criticism of “pointless growth” — i.e. the expenditure of energy which will not return, will not be reabsorbed and thereby conserved for future use. Tom notes that criticism of luxuria in Roman texts so often employs agricultural and botanical metaphors because luxuria was a metaphysical outgrowth which defied the boundaries of the carefully proportioned Catonian fields, designed and tended to produce year after year. Incidentally, Tom made a point to note that luxuriant excess — a squandering of resources, the refusal to regenerate, to conserve, to recycle — expressed itself in many different ways: the fact that furniture, fine art, construction, urban development, and non-reproductive sex were each as bad as each other speaks to the intersection of conservatism with conservationism in the Roman attitude; i.e. having fancy pedestal tables and sideboards (Livy 39.6) is just as bad as fucking your boyfriend because you should, good Roman, be conserving your attention and energies for generative activities. Here, Tom seems to have revealed a kind of biological essentialism in Roman thought which is not usually, I think, made explicit. Tom notes that while the elegists and other figures from the Roman counter-culture were “ambivalent” about such a formulation of luxuria, they nonetheless accepted its definition; that is, while they did not play by these rules, they accepted that these indeed were the rules. Even if you are walking away from Rome rather than towards it, you are still on the road to Rome.*
Tom translates the Latin luxuria as “pointless growth”, “withering growth”, “wild growth.” An agricultural, biological symptom of “bad” growth is itself a helpful tool to reveal the nature of “good” growth, and Tom realized that, in Roman thought, “good” growth often related to an inseparable dualism: life and death. An insistence that growth (that is “good” growth, not luxuria) is actually related to death appears, Tom says, in the Pro Marcello (23): Cicero’s exhortation of Caesar to propagate new growth includes the impossible wish that Caesar could bring the dead back to life, if only that were possible. Indeed, the relationship between the living and the dead at Rome was one of Tom’s deepest preoccupations; in the book proposal for the project, Tom had focused in on a passage from Rudolph Sohm which I believe was, for him, programmatic: “the heir is treated as though he were deceased…the deceased continues to live in the person of the heir” (1907, p504). Indeed, the idea that the dead live in the face, the name, and the actions of the living is one of the vital aspects of Roman generation, regeneration, generativity. Tom’s discussion of generativity in this manuscript reveals a living organism, a beating heart underneath the details of textuality. According to his understanding, the Romans formulated their generative function as a life pulse which conserved itself, returned to itself, and, being limited, precious, did not waste itself.
*Ursula Le Guin, The Left Hand of Darkness (1969/1999, p151): “To oppose something is to maintain it. They say here ‘all roads lead to Mishnory.’ To be sure, if you turn your back on Mishnory and walk away from it, you are still on the Mishnory road.'”
With the release on the same day (Dec. 20th 2019) of both the Netflix adaptation of The Witcher and the final installment of the new Star Wars trilogy, The Rise of Skywalker, this week we got an object lesson in how cultural criticism works on a mass scale. Before we dive into either of these, I want again to invoke Jia Tolentino’s analysis of social media as a commercially driven organ, designed to privilege negative or otherwise emotionally provocative content. In Trick Mirror, Tolentino writes that over time, personal lives transforming into public assets via social media meant that “social incentives — to be liked, to be seen — were becoming economic ones” (2019: 6). She goes on: “Twitter, for all its discursive promise, was where everyone tweeted complaints at airlines and bitched about articles that had been commissioned to make people bitch” (2019: 7-8). Looking at the internet as an exercise of performativity (one that extends and magnifies the natural human performativity of the offline world), Tolentino writes that “the internet is defined by a built-in performance incentive” (2019: 8). In How to Do Nothing, Jenny Odell (2019: 18) discusses social media too, drawing in the remarks of Franco Berardi:
“Berardi, contrasting modern-day Italy with the political agitations of the 1970s, says the regime he inhabits ‘is not founded on the repression of dissent; nor does it rest on the enforcement of silence. On the contrary, it relies on the proliferation of chatter, the irrelevance of opinion and discourse, and on making thought, dissent, and critique banal and ridiculous.’ Instances of censorship, he says, ‘are rather marginal when compared to what is essentially an immense informational overload and an actual siege of attention, combined with the occupation of the sources of information by the head of the company.’ [Berardi 2011: 35] It is this financially incentivized proliferation of chatter, and the utter speed at which waves of hysteria now happen online, that has so deeply horrified me and offended my senses and cognition as a human who dwells in human, bodily time.”
The commercial incentive of online interaction is what particularly disturbs Odell; the communities and networks of social media are one thing, the design of such platforms to fulfill a capitalist purpose is another. Odell continues (2019: 60):
“Our aimless and desperate expressions on these platforms don’t do much for us, but they are hugely lucrative for advertisers and social media companies, since what drives the machine is not the content of information but the rate of engagement. Meanwhile, media companies continue churning out deliberately incendiary takes, and we’re so quickly outraged by their headlines that we can’t even consider the option of not reading and sharing them.”
All of this has a bearing on what happened this week. When Netflix dropped The Witcher last Friday, it was met with some noteworthy and negative reviews. Darren Franich and Kristen Baldwin’s “Netflix’s The Witcher is nakedly terrible: Review” (Entertainment Weekly) gave the series an F grade, with a 0/100 on Metacritic. These reviewers immediately, and justifiably, came under fire themselves given that they admitted that they did not watch the series in its entirety. Response to The Witcher has been divided: critics hate it, the public loves it. So is The Witcher any good? One of the barriers here is the general distaste for “genre” pieces. Some might avoid science fiction, fantasy, or romance just because it is labeled so. Ursula K. Le Guin took on this problem in her essay, “Genre: a word only a Frenchman could love” (reprinted in Words are My Matter, 2019: 10):
“So we have accepted a hierarchy of fictional types, with ‘literary fiction,’ not defined, but consisting almost exclusively of realism, at the top. All other kinds of fiction, the ‘genres,’ are either listed in rapidly descending order of inferiority or simply tossed into a garbage heap at the bottom. This judgemental system, like all arbitrary hierarchies, promotes ignorance and arrogance. It has seriously deranged the teaching and criticism of fiction for decades, by short-circuiting useful critical description, comparison, and assessment. It condones imbecilities on the order of ‘If it’s science fiction it can’t be good, if it’s good it can’t be science fiction.'”
In the preface to her (critically acclaimed) The Left Hand of Darkness, Le Guin had already drawn attention to the fact that science fiction, like any literature, is about its present, not the future (1969/1999: xvi):
“All fiction is metaphor. Science fiction is metaphor. What sets it apart from older forms of fiction seems to be its use of new metaphors, drawn from certain great dominants of our contemporary life — science, all the sciences, and technology, and the relativistic and the historical outlook, among them. Space travel is one of those metaphors; so is an alternative society, an alternative biology; the future is another. The future, in fiction, is a metaphor.”
The Witcher is not actually “about” magic and monsters; it’s about the relationship between storytelling and reality (Jaskier’s song vs. Geralt’s action), about the pain of isolation (Yennefer), about trying to live your life despite tempestuous circumstances (Geralt); it’s about assembling strange families, when biological ones fail (Geralt, Yennefer, Ciri). Assigning an F to The Witcher because it successfully engages with its own genre, one which you, the reviewer, do not know or care enough about to situate the object of your critique within, removes the rich layers of cultural entanglement which may make such a show worthwhile to a viewer like me. Le Guin continues (2019: 10): “If you don’t know what kind of book you’re reading and it’s not the kind you’re used to, you probably need to learn how to read it. You need to learn the genre.”
I’m not coming at this from a neutral perspective, since I voraciously played and replayed, and loved Witcher 3. But is Netflix’s The Witcher “objectively bad”? No, it’s not. It has haunting performances from Anya Chalotra (Yennefer), and Henry Cavill (Geralt) is perfection. The fight scenes are incredible. And it’s beautiful to look at. Yes, they say “destiny” too many times. But, look, it’s a romp!
On to Star Wars, then. Since we kept up our tradition of seeing the newest Star Wars on Christmas Eve, I was aware of an enormous amount of critical disappointment and fan anger regarding the latest installment before I saw the film itself. You know what? It was fine. Yes, it had a very fast pace, and it wasn’t seamless with the trilogy’s own self-mythologizing. The Star Wars universe is full of holes because of the method of its composition; to some extent the writing, and overwriting (if you think that’s what J.J. is doing) resembles the process of story development in the oral tradition of the Greek epic canon, and in its reception. Consider Odysseus in the Iliad vs. Odysseus in the Odyssey vs. Odysseus in Sophocles’ Ajax. Indeed, the empty spaces projected by Star Wars are part of its charm: it’s a perfect landscape for imaginative rethinking, whether in the form of fan fiction, fan art, or roleplaying games like Edge of the Empire. That Star Wars captures the modern imagination so strongly is somewhat ironically reflected in the strength of the vitriol against it (and in the fan art. Peruse #reylo only if you dare).
All of this might be fine if it really were so simple. The emotional economy of the internet has a role to play here, but in this case we end up in a different place than we did with The Witcher. Anthony Breznican of Vanity Fair recorded J.J. Abrams’ public response to the backlash against TROS:
“After a screening at the Academy of Motion Picture Arts and Sciences on Friday, I [=Breznican] asked Abrams what he would say to those who are unhappy. Are they not getting something? Is there a problem in the fandom? ‘No, I would say that they’re right,’ he answered quickly. ‘The people who love it more than anything are also right.’ The director had just returned from a global tour with the film, where he also fielded questions about that mixed reaction. ‘I was asked just seven hours ago in another country, “So how do you go about pleasing everyone?” I was like, “What…?” Not to say that that’s what anyone should try to do anyway, but how would one go about it? Especially with Star Wars.’ With a series like this, spanning more than four decades, nearly a dozen films, several TV shows, and countless novels, comics, and video games, the fanbase is so far-reaching that discord may be inevitable. ‘We knew starting this that any decision we made — a design decision, a musical decision, a narrative decision — would please someone and infuriate someone else,’ Abrams said. ‘And they’re all right.’”
You can see how the viewers’ response to Star Wars might be taken as a reflection of contemporary political and cultural life in the US. In the New York Times, Annalee Newitz affirmed Le Guin’s view that cultural artefacts, sci-fi or not, are reflective of the society which produces and consumes them:
“Star Wars became a new national mythos; it rebooted America’s revolutionary origin story and liberty-or-death values using the tropes of science fiction. Now, however, the movies no longer strike the same chord. Just as America’s political system is falling into disarray again, our cultural mythmaking machine is faltering as well.”
How and why we critique Star Wars may well reflect some deeper truth about the times we live in, but there’s another dark side to all this (get it?). To some extent the divided criticism is irrelevant, given that TROS earned an enormous amount of money. Indeed, the controversy only helped bring in the dollars (not to mention all the baby yodas hiding under the xmas trees this year). We entrusted our storytelling to a capitalist behemoth, and it’s disconcerting that cultural criticism has no impact on its forward march. Some have suggested that the F rating which Entertainment Weekly gave The Witcher was motivated by a desire to get more eyeballs (and more $) by artificially stirring up controversy. Given that the internet runs on divisiveness and ire (these are our social currencies), that might have been an economically shrewd move. But was it good cultural criticism?
Ancient and Modern. In the De Fato (10-11), Cicero discusses whether it is possible for the individual to overcome their nature. Here comes the Loeb:
Stilponem, Megaricum philosophum, acutum sane hominem et probatum temporibus illis accepimus. Hunc scribunt ipsius familiares et ebriosum et mulierosum fuisse, neque haec scribunt vituperantes sed potius ad laudem, vitiosam enim naturam ab eo sic edomitam et compressam esse doctrina ut nemo umquam vinolentum illum, nemo in eo libidinis vestigium viderit. Quid? Socratem nonne legimus quemadmodum notarit Zopyrus physiognomon, qui se profitebatur hominum mores naturasque ex corpore oculis vultu fronte pernoscere? stupidum esse Socratem dixit et bardum quod iugula concava non haberet—obstructas eas partes et obturatas esse dicebat; addidit etiam mulierosum, in quo Alcibiades cachinnum dicitur sustulisse.  Sed haec ex naturalibus causis vitia nasci possunt, exstirpari autem et funditus tolli, ut is ipse qui ad ea propensus fuerit a tantis vitiis avocetur, non est positum in naturalibus causis, sed in voluntate studio disciplina; quae tollentur omnia si vis et natura fati…firmabitur.
“The Megarian philosopher Stilpo, we are informed, was undoubtedly a clever person and highly esteemed in his day. Stilpo is described in the writings of his own associates as having been fond of liquor and of women, and they do not record this as a reproach but rather to add to his reputation, for they say that he had so completely mastered and suppressed his vicious nature by study that no one ever saw him the worse for liquor or observed in him a single trace of licentiousness. Again, do we not read how Socrates was stigmatized by the ‘physiognomist’ Zopyrus, who professed to discover men’s entire characters and natures from their body, eyes, face and brow? He said that Socrates was stupid and thick-witted because he had not got hollows in the neck above the collarbone—he used to say that these portions of his anatomy were blocked and stopped up. He also added that he was addicted to women—at which Alcibiades is said to have given a loud guffaw!  But it is possible that these defects may be due to natural causes; but their eradication and entire removal, recalling the man himself from the serious vices to which he was inclined, does not rest with natural causes, but with will, effort, training; and if the potency and the existence of fate is proved…all of these will be done away with.”
In this passage, Cicero describes some of the quote-unquote defects which naturally arise in humans. Stilpo (4th c. BCE) reportedly had a natural proclivity for alcohol and sex with women; he was, according to friends, ebriosus (“addicted to drink”) and mulierosus* (“addicted to women”). But, Cicero says, Stilpo was able to master his nature with philosophical training (doctrina), and was never seen drunk again, and showed no outward sign of lust. Zopyrus (5th c. BCE) applied physiognomy, i.e. the theory that human character can be read in the condition of the body, to Socrates and concluded from the philosopher’s body that he could only be an idiot. Oh, and that he must also be “addicted to women” (mulierosus again). Cicero writes that nature may be responsible for giving us certain tendencies. But, he says, it is human agency that can overcome them: “will” (voluntas), “effort” (studium), and “training” (disciplina). This passage, of course, contains an oversimplistic attitude to addiction as well as an ableist assumption that bodily imperfection is a mirror of morality or intellect. It’s also quite clear that these anecdotes are designed to reflect male power in the context of elite competition: the detail that the notorious party animal, Alcibiades, laughed at Zopyrus calling Socrates names suggests a symposiastic setting (Phaedo’s dialogue, Zopyrus, dramatized a debate between the physiognomist and Socrates). Putting those things aside, what do we make of Cicero’s claim that we can overcome our nature?
In the recent (and superb) How to Do Nothing (2019), Jenny Odell cites this passage of Cicero’s De Fato (pp. 71–72) in the context of arguing for the creation of a “third space” of attention — one which reframes human interaction with reality as a kind of rejection of market forces and commercially-run social media. The book as a whole is a meditation on and a protreptic towards a modern kind of recusatio, i.e. the technique of saying “I would prefer not to.” Odell asks her reader to refuse to internalize the contemporary narrative of productivity, and to reclaim time and space to “do nothing.” (There are a lot of classical references throughout — Seneca, Epicurus, Diogenes. And Cicero’s cum dignitate otium is clearly a spiritual forebear.) Here’s what Odell says about this passage of Cicero (p. 72):
“If we believed that everything were merely a product of fate, or disposition, Cicero reasons, no one would be accountable for anything and therefore there could be no justice. In today’s terms, we’d all just be algorithms. Furthermore, we’d have no reason to try to make ourselves better or different from our natural inclinations. VOLUNTATE, STUDIO, DISCIPLINA — it is through these things that we find and inhabit the third space, and more important, how we stay there. In a situation that would have us answer yes or no (on its terms), it takes work, and will, to keep answering something else.”
The possibility Cicero suggests of escaping (or mitigating) the frailties of human psychology and embodiment relies on the intentional application of the mind (or soul). Odell would have us apply ourselves in this way as an act of resistance against cynical structures of social influence. The concept of “will” (voluntas) invokes the notion of presence — or attention — the ability to be here in the moment, to have an appreciation for the moment in all its granularities. To “focus” (studium). As for the “training” (disciplina), this obviously could take a number of forms. But evidently self-awareness, and awareness of the churning forces around you, is at the core of this idea.
*Mulierosus is quite an unusual Latin word! It only appears in extant classical Latin three times. According to Aulus Gellius (4.9.2), quoting “mulierosus” as discussed by the Pythagorean magician, Nigidius Figulus, the Latin suffix –osus indicates an excess of the characteristic in question.
- may divorce be with you | may reports be with you | may the chords be with us | may four score be with you.
- Lindy West: “Shit, actually.”
- Lorraine Sorlet: a nap.
- David Shrigley: the game.
Excerpt. Suzanne McConnell on Kurt Vonnegut Jr., 2019: 134-135: “By its nature, literary fiction ‘teaches’: it shows how people feel, think, respond, vary; how circumstances affect them; how their brains, personalities, surroundings and culture make them tick. How an experience strikes a particular person a certain way, and another differently. How a person feels inside as opposed to how they act or are perceived. And so on. All writing teaches — communicates something about something. [p135] Even bad writing. So if you’re writing, you’re teaching. You can’t help it. But then there’s intentional teaching through writing.”*
*Austin came into the room to point to a passage written by Vonnegut (on teachers and teaching) which was quoted on this page. So I thank him for the excerpt this week.
Daily Life. Max helped me grade.
Earlier this week I went to an independent cinema in Boston in the rain to see a matinee screening of Marriage Story, even though it’s on Netflix and I could have watched it from my couch. I’m glad I did. I really adore going to the movies on my own, especially if there’s hardly anyone else with me in the screening. And giving my attention to only one point of focus feels meditative. As the entire internet knows by now, I am very fond of Adam Driver. I’ve seen an embarrassing number of his films. (But not Silence (2016), because, and I’m sorry to say this, I just can’t deal with Andrew Garfield. Also, I didn’t see Lincoln (2012), in which he has a minor (yet acclaimed) role, because, god, who has the time. And I passed on Midnight Special (2016).) As I was watching Marriage Story, I was thinking about how I would rank my favourite Adam Driver performances, as though ranking an actor’s performances is something I should be doing, or even be thinking about doing. I don’t know if I can really give a numerical value to this, but here’s what I’ve got:
- Girls (2012-2017). A number of recent profiles have been unable to resist mentioning that a character in Girls (“Jessa”, played by Jemima Kirke) once said of Adam Driver’s character (“Adam Sackler”): “He does sort of look like the original man” (New Yorker, Oct. 21st; The Washington Post, Dec. 6th; The Observer, Dec. 8th). Even though Driver was nominated for an Oscar recently (for BlacKkKlansman, 2018), part of me still thinks his earlier work in Girls is the best because he seems at ease in this role, and the outcome of that ease is a performance which is naturalistic and, quite frankly, funny. In general, Driver’s performances can be characterized as tensile: while calm, he always seems on the edge of a fit of rage. This was true long before Kylo Ren. And in Girls it was much more nuanced. For me, Driver is at his best in season 3 of Girls, where his story arc takes him into the theatre; “Adam Sackler” plays Bronterre O’Brien Price in a Broadway production of George Bernard Shaw’s “Major Barbara.” Driver has a theatre company and stage background, which explains why he gravitates towards narratives about the theatre, and thrives (imho) in those roles (see: Marriage Story). Writers like to write about writing, actors like to act about acting. Television allows actors to really inhabit a role: character can be developed more slowly and deliberately; there’s time – and space – for more emotional depth. So even though there are so many great Adam Driver films, I still think of Girls as some of his best work.
- Marriage Story (2019). First of all, I reject this film’s thesis, namely that Los Angeles is a vapid cultural wasteland (as pointed out, and also rejected, by Ira Madison III this week on Keep It). I loved LA. Anyway, Marriage Story might well be Adam Driver’s best performance to date. Driver does best, I think, when he can be reactive — his trademark intensity and tensility structures his performance most clearly in the silences. Highlights of this film: the claustrophobia of the legal proceedings, particularly that one scene with Alan Alda; the knife scene (!); and, of course, Driver singing “Being Alive” from Stephen Sondheim’s Company (1970). Adam Driver actually sings more often than you might think. In Hungry Hearts (2014), he sings in Italian; he infamously contributes to a song in Inside Llewyn Davis (2013), alongside Justin Timberlake and Oscar Isaac; and he sings as the character “Art the Artist” in two very remarkable episodes of Bob’s Burgers (“The Bleakening,” Parts 1 + 2, in season 8).
- Paterson (2016). This quiet, contemplative film is a good watch if you feel like indulging in long, lingering close-ups of Adam Driver’s interesting face. Adam Driver plays a bus driver (get it?) whose name is Paterson and who lives in Paterson, NJ. He’s married to the most beautiful and cutest woman alive, a character played by Iranian actress Golshifteh Farahani, who paints everything black and white, and has a precocious dog named Marvin. The film is an imaginative reception of William Carlos Williams’ Paterson, and dramatizes Williams’ conception of a poetic continuum of mind and matter, extending between human consciousness and the mundanities of daily life. Driver plays his role tranquilly, passively. (Again, that trademark tensility works best in profound, deliberate silences.) The interplay between this character’s poesis and his observations of daily life plays out through a series of coincidences and minor dramas which break through the cyclical rhythm of his bus route. Some really wonderful and memorable performances by Barry Shabaka Henley, William Jackson Harper, and Chasten Harmon. Honestly, I love this one.
- The Man Who Killed Don Quixote (2019). OH BOY, what a hot mess. This Terry Gilliam film had an extremely troubled production, and it shows. The plot is bonkers, a lot of the dialogue is nonsense, the characters are underdeveloped (the female characters especially so), and, as A. O. Scott wrote in the NY Times: “the romanticism has a creepy side.” Plus, it’s long! This film attempts in a rather ham-fisted way to entangle itself with and enact the themes of Cervantes; Jim Jarmusch’s Paterson, and its entanglement with William Carlos Williams, is a much more subtle and successful version of this. But if you can make it through this weird (and, overall, basically bad) thing, there is some really lush cinematography and glamour…
- Star Wars (2015, 2017; 2019). It’s interesting. I think Adam Driver plays the role of Kylo Ren well, but Driver’s overexposed association with this character has the effect of flattening out his perceived range. Put another way, those who know Driver principally as the villain in this Disney merch fest will (with some cause) think that all he can do is stomp around and have “hissy fits.” But, listen, I think he does good things with the role. Yes, I audibly gasped when he took off that helmet in The Force Awakens (2015). That “You need a teacher” line is unfortunate. And I don’t know why he had to be wearing such high-waisted trousers in The Last Jedi (2017). He has a uniquely hunched physicality that makes his fight choreography very interesting to watch. What troubles me most, I suppose, is the oversimplicity of Kylo Ren’s character. (Honestly, there is no “there” there for much of Star Wars.) While his struggle with “dark” and “light” is asserted by the films, the received image of Kylo Ren in pop culture is of an angry young man who violently rejects the limits placed upon him by his social context. In 2019, we have a lot of angry young men who use violence to reject limits. In the context of his broader filmography, Adam Driver’s portrayal of anger is actually quite nuanced; elsewhere his performances of rage contain a self-awareness which acknowledges and indeed urges that anger must be resolved and exorcised somehow. There’s an interesting scene of rather violent rage between Jemima Kirke (who plays “Jessa”) and Adam Driver in season 5 of Girls: it’s intense, it’s absurd. But its intense absurdity is precisely the frame of critique which is needed when depicting acts of anger — there must be some mechanism of judgement that presents a means of resolution. In Kylo Ren’s case, we’ll find out soon how his anger is resolved. But in the meantime, I wonder, as others have, about presenting young male anger as a piece of merchandise.
Star Wars reified Driver’s place in the modern canon, but he’s been busy doing other work in order to insist that there’s more to him than this.
- There are also some nice small roles and appearances. As a dog lover, I find this short W Magazine interview where Adam Driver talks about his dog, Moose, extremely charming; in the background to that video, there is a woman extremely losing her shit, which I also find very sweet. Driver plays a small but noticeable role as a fuckboi in Noah Baumbach’s Frances Ha (2012), starring the extremely charismatic Greta Gerwig. The next year, Driver sweetly played “love interest” to the force of nature, Mia Wasikowska, in Tracks (2013); there’s a very sad scene in this film involving dogs, btw, so watch out. While he received his Oscar nomination for BlacKkKlansman (2018), and he does give a very subtle performance in it, I wouldn’t actually say it’s his best (that is often the way with Oscars, isn’t it?). I reject the thesis regarding millennials in Noah Baumbach’s While We’re Young (2014), but he plays this odious part fairly well (the stand-out scene to me in that film is when Naomi Watts, high on Ayahuasca, mistakes Adam Driver for Ben Stiller — that moment is so poignant and intimate).
- With the good, there’s the not so good. We’ve already talked about The Man Who Killed Don Quixote (2019). Logan Lucky (2017), which put Adam Driver alongside Daniel Craig (doing an accent even before Knives Out) and Channing Tatum, should have been more of a romp but the whole thing fell flat, and he has a rather underwhelming presence. I wanted to like Jim Jarmusch’s The Dead Don’t Die (2019), but I didn’t — it lacked subtlety and the sharpness of parody; plus, I know that Tilda Swinton basically is a real life alien (spoiler alert) but she has apparently learned nothing about appropriating Asian culture. I watched The Report (2019) a few days ago and I’ve already forgotten about it: not only is it forgettable, it lacks the high drama of its genre, and, more worryingly, presents certain politicians in the guise of heroes (again: lacking subtlety). I think I did watch This Is Where I Leave You (2014) but, again, I remember nothing about it, and it’s just about what you would expect given the genre — although I think there are broad comedies of this type which have more heart.