On ‘self-care’ and ‘mindfulness’; Hayden White; tempora cum causis (12)

Modern. When new terms enter the contemporary lexicon, it’s natural to find them kind of annoying at first. (I remember when ‘selfie’ was new, and now I have no shame in using the word, or taking them.) I’ve been annoyed with two in particular: 1) ‘self-care’; 2) ‘mindfulness.’* Yes, part of the annoyance is that when new catchwords arise they seem to be everywhere; the combination of their novelty and their ubiquity is probably what really rubs me the wrong way. But also: a new term needs time and repeated use to develop meaning, and that process of negotiation can reveal subtleties which complicate the word’s original intent. And there are some reasons to be distrustful of ‘self-care.’ The whole point of self-care as an idea, as I understand it, is that you disrupt your work habits (which you’ve feverishly developed in order to become, or remain, gainfully employed in the unstable economic landscape of 2020) in order to spend some time doing things that you actually like, which make you feel rejuvenated, and return to you some of your innate creative abilities. Okay! Well, obviously, the first issue is that time to yourself – the time to contemplate, to ‘do nothing’ (see: Jenny Odell) – should, by all rights, be a bigger part of our lives in the first place. Secondly, self-care is often pitched not as a rebellion against the commodification and infestation of our private lives, but rather as its tool; i.e. self-care is supposed to rejuvenate us so that we can get back to work. At the “Facing Race” conference (Nov. 2016), Roxane Gay put it well (paraphrase via live-tweet): “I can’t stand the phrase self-care. It’s what women can do, but don’t do. We do it so we can work, but what comes next?”

Lastly, self-care is regularly figured as a consumerist activity; you should try searching “self-care face mask” on twitter. Self-care as the deliberate derailing of learned habits of overwork is itself a good thing, I think. But it’s hard to practice. And as a result, self-care has entered the zeitgeist as something quite frivolous, a superficial manifestation of something that is mostly invisible: a negotiation with yourself, and with your self-perception. Likewise, ‘mindfulness.’ The point of this, again, as I understand it, is to consciously pay attention to what is happening in the present moment, including, perhaps especially, your own internal, emotional landscape. To put it oversimplistically, we only really have what we are experiencing right now. Sure: we have indications of the future; and we have records of the past. But we are experiencing the present. Mindfulness as a practice is intended to remind us of this, to encourage us to engage in the present fully and to perceive its granularities, and to give us the ability to understand when we are being drawn into behaviours which are not totally within our control.

When it comes down to it, I love twitter. Over the years it has brought me community and a sense of belonging in a field that is often quite severe towards its members. I like its pluralism; I thank it for giving me more perspectives on certain issues. I think it can be empowering. In Classics, it’s where a lot of the social justice work starts. And because my personal life is deeply intertwined with my professional life, it has also been good for my work. I never want to write a screed against its use, and indeed, despite its documented toxicities, I still find myself encouraging people to use it so that they can get their work out into the world. But for all its functionality, I don’t always like how I feel when I use it. I don’t like mindlessly scrolling; and I don’t like the possibility that at any given moment of casual scrolling, I can be made to feel all sorts of negative emotions that were not there seconds ago (and twitter privileges emotionally volatile content). It’s a turbulence which I volunteer for, but I don’t have to. I don’t have to participate in the parts that are engineering me.

I don’t want to leave twitter. I took a hiatus last summer to work on my book, and I hated it. As much as I want to have time that is my own, I also want to engage with the internet’s soul. So, here’s what I’ve been thinking. Snail mail (physical letters! some things, they tell me, still exist in “material reality”, whatever that means) only arrives once a day. You check it, and then you know what you’ve got, and there won’t be another thing to check till tomorrow. You get on with your day. But twitter (and email, don’t get me started) can come for you whenever you open that app. Sometimes, I think about social media in terms of the functionality of Stardew Valley. Long story short, this is a very charming, and calming, farm simulator, which operates on a calendar with days and seasons. Every morning when you wake up in game, the fruits and vegetables whose seeds you planted previously have produced new growth, which you can harvest. But this harvesting should only take up a small part of the day, after which you can explore the world, talk to the characters, maybe go fishing or mining.

Yes, it’s a farming simulator, but even this game understands there’s more to life than your occupation! I want to treat social media and work emails like this. Harvest (i.e. open, and deal with?) once or twice a day. What I’m doing right now is letting every twitter or email notification take my attention the moment it arrives, which is the equivalent of virtually sitting in my field and staring at my crops until they tell me I can harvest them. Actually, the more I think about it, video games in general have a built-in mindfulness which reality sometimes does not. You, the protagonist, receive missions, but you choose in which order, when, or even if you want to do them. You can dissent from tasks given to you, and you can (usually) take your sweet time and indulge in as many side quests as you want. We can learn something from this. There’s an intentionality which we often (or at least I do, I’ll speak for myself) willingly give up. But you can always get it back.
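For the technically inclined, the difference between the two attention models is the old one between push and pull, and it can be sketched in a few lines of toy code. Everything here is made up for illustration (no real twitter or email API is involved):

```python
# A toy sketch of two attention models:
#   push    - every message interrupts the moment it arrives
#   harvest - messages accumulate quietly; you open the batch once, then move on

import datetime
import queue

inbox = queue.Queue()  # where unread messages pile up, like ripening crops

def push(message: str) -> None:
    """The default model: each notification claims attention immediately."""
    print(f"[interrupt] {message}")

def enqueue(message: str) -> None:
    """The Stardew model: let messages accumulate without being seen."""
    inbox.put((datetime.datetime.now(), message))

def harvest() -> None:
    """Deal with everything in one sitting, once or twice a day."""
    while not inbox.empty():
        arrived, message = inbox.get()
        print(f"[harvested, arrived {arrived:%H:%M}] {message}")

messages = ["@ mention", "newsletter", "meeting moved"]

# Push model: three separate interruptions scattered across the day.
for m in messages:
    push(m)

# Harvest model: the same three messages, one deliberate check.
for m in messages:
    enqueue(m)
harvest()
```

The information that arrives is identical in both cases; what changes is who decides when attention gets spent.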

* ‘Self-care’ as a term actually appears with the meaning ‘self-interest’ as early as the 16th c., where it was used in the English poet George Turberville’s translation of Ovid’s Heroides (specifically: 19.205). ‘Mindfulness’ too has a long history, appearing in English as “the quality or state of being conscious or aware of something; attention” in the 16th c. (see Oxford English Dictionary). These terms are ‘new’ to the extent that they have reappeared in the context of a specific socio-cultural moment, in which modern human life is structured according to 21st c. philosophies of productivity.

Internet.

Excerpt. Hayden White 2010*: 114: “The kind of understanding we get from following his story is different from the kind of understanding we might get from following his arguments. We can dissent from the argument while assenting, in such a way as to increase our comprehension of the facts, to the story itself.” 

*repr. of “The Structure of Historical Narrative” (1972)

Daily Life. I recently fell in love with cycling again because of Boston’s city bikes. It’s good stuff. 

Tom Habinek, realism vs. the ‘glob’, Kurt Vonnegut Jr.; tempora cum causis (11)

Ancient. Last weekend was the annual meeting of the Society for Classical Studies. Since I was still back in the UK with my family over the New Year, I missed most of it, but I was there for the last day to take part in the panel commemorating Prof. Tom Habinek, who sadly died last year. Tom was my PhD advisor, and a major influence in the field of Roman Studies. The event was very poignant, but fitting. On Sunday evening I posted my part of the panel, which you can read here: “Tom Habinek on ‘generativity’.” 

Modern. In an essay originally published in 1971, “The Culture of Criticism”, Hayden White describes the frustrations of Ernst Gombrich, Erich Auerbach, and Karl Popper (respectively: art historian, philologist and literary critic, philosopher of science) with the avant-gardists as typified by, for example, the abstract expressionist, Jackson Pollock. Each of these scholars held an attachment to realism; in some cases considering realism, in historiography and art alike, to be a means of resisting authoritarianism, with its power to overwrite the experience of reality by means of ideology. White (2010*: 105) writes that for these critics, historical, literary, or artistic realism, i.e. an attempt to represent reality as it actually is or was, “results from the controlled interplay of human consciousness with a shifting social and natural milieu.” Given that realism is supposed to reflect the human perception of reality, the avant-garde is taken by these critics to be a frustration of perception rather than a refinement of it. More than this, the avant-garde’s break with tradition challenges perception itself. White writes (2010: 107):

“The surfaces of the external world, so laboriously charted over the last three thousand years, suddenly explode; perception loses its power as a restraint on imagination; the fictive sense dissolves — and modern man teeters on the verge of the abyss of subjective longing, which, Auerbach implies, must lead him finally to an enslavement once more by myth.”

(The fear of “myth” — figured as an antitype to so-called “rationality” in tandem with “realism” — has probably produced a number of negative results itself.) By the end of this essay, White (2010: 108-110) points to one of the real comforts of realism, one which lies in its hierarchical nature. Realistic art or narrative reflects a syntactical worldview, i.e. a mode of composition which privileges certain ideas over others, and arranges information around that privilege; whereas artefacts of the avant-garde might be interpreted as paratactical — presenting discrete elements “side by side” (= παρά) in a “democracy of lateral coexistence” (2010: 109).

In Washington DC last weekend, I found myself face-to-face with Hans Hofmann’s Oceanic (1958) in the Hirshhorn Museum. I was really struck by the large heaps of paint in certain parts of the work, which I have now affectionately come to call “globs.” It feels appropriate!

Inspired by that visit, when I returned to Boston I wanted to go and look closely at more oil paintings in the MFA. Last night we got up close with some more excellent globs from Lee Krasner (Sunspots, 1963) and Joan Mitchell (Chamonix, c. 1962).

Digitization is vital, and I depend on it for my teaching and my scholarship, and I would never want digital resources to be taken away from me. But there is pretty much nothing like looking a glob straight in the eye, if you get the chance to. You can get a general sense of texture from a photograph. But the glob is just so noticeable IRL. Krasner applied oils straight from the tube onto the canvas for Sunspots, and you can tell. Looking at that painting tells the story of its making. As for Mitchell’s Chamonix, you can see the movement of her body in its wide, energetic strokes. Each is a record of embodiment, one which figurative, narrative, and supposedly veristic accounts tend to leave invisible. Back to Hayden White (2010: 110) one last time:

“The avant-garde insists on a transformation of social and cultural practice that will not end in the substitution of a new elite for an old one, a new protocol of domination for the earlier ones, nor the institution of new privileged positions for old ones — whether of privileged positions in space (as in the old perspectival painting and sculpture), of privileged moments in time (as one finds in the older narrative art of fiction and conventional historiography), of privileged places in society, in privileged areas in the consciousness (as in the conservative, that is to say, orthodox Freudian psychoanalytic theory), of privileged parts of the body (as the genitally organized sexual lore insists is ‘natural’), or of privileged positions in culture (on the basis of presumed superior ‘taste’) or in politics (on the basis of a presumed superior ‘wisdom’).”

* “The Culture of Criticism” (1971) is reprinted in The Fiction of Narrative: Essays on History, Literature, and Theory (2010), edited by Robert Doran.

Internet.

Excerpt. Kurt Vonnegut Jr. 1987: 44: “I thought about myself and art: that I could catch the likeness of anything I could see — with patience and the best instruments and materials. I had, after all, been an able apprentice under the most meticulous illustrator of this century, Dan Gregory. But cameras could do what he had done and what I could do. And I knew that it was this same thought which had sent Impressionists and Cubists and the Dadaists and the Surrealists and so on in their quite successful efforts to make good pictures which cameras and people like Dan Gregory could not duplicate.” 

Daily Life. We spent New Year’s Eve walking along the shore at Troon. 


Tom Habinek on “generativity” (SCS 2020)

On 5th January 2020 I took part in a commemoration of Tom Habinek at the SCS organized by James Ker, Andrew Feldherr, and Enrica Sciarrino; with Basil Dufallo, Zsuzsanna Várhelyi, Scott Lepisto, Enrica Sciarrino, and myself as panelists. Thanks to the generosity of Hector Reyes, we were able to read Tom’s (incomplete) book manuscript on the topic of personhood and authorship. Here’s the text of my contribution to the workshop, in case of interest. Enormous thanks to everyone involved and everyone who came to the panel.


It is my task today to speak on the concept of generativity as discussed in Tom’s manuscript. When I think of Tom’s work and the influence he had on students like me, it is, indeed, particularly his theorization of generativity which I feel to have been the most impactful. In earlier works, Tom’s interest in generativity manifested in his study of social generation via cultural production and reproduction, with a focus on how ritual acts instantiated Roman community. In a key passage of The World of Roman Song (2005, p129), Tom cited the work of the anthropologist, Paul Connerton, who, in How Societies Remember (1989, p62) discussed Thomas Mann’s understanding of the Freudian ego: 

“We are to envisage the ego, less sharply defined and less exclusive than we commonly conceive of it, as being so to speak ‘open behind’: open to the resources of myth which are to be understood as existing for the individual not just as a grid of categories, but as a set of possibilities which can become subjective, which can be lived consciously. In this archaising attitude the life of the individual is consciously lived as a ‘sacred repetition’, as the explicit reanimation of prototypes.”

The “explicit reanimation of prototypes” is how Tom understood Roman self-construction: the invocation of ancient exemplars; the continuous citation and reinscription of Roman ancestral memory; rituals which resubstantiated the dead in the bodies of the living. Roman literary and political history demonstrates clearly that the Romans were interested in how their culture generated and regenerated itself; how the present day related to the past and preserved a tensile balance between new iterations of Roman youth, and their ancestral blueprints. All we need think of is the late Republican Brutus contemplating his ancestor, the expeller of kings; or perhaps Cicero in the Pro Caelio raising the ancient, blind Appius Claudius from the dead to speak with Cicero’s own lips and chastise Cicero’s own enemies.

In his latest work Tom approached the question of Roman generativity from some new perspectives. In his search for an understanding of Roman personhood, he figured the Roman persona as an active process, not a passive state; I think that for Tom persona was not a noun, but a verb. “Personifying” is a practice — it is an action, it is alive. Starting from this position, Tom was able to see different kinds of ancient evidence not as discrete, disconnected elements of Roman intellectual systems, but rather mutually supportive organs of an organic, synthetic whole. Tom’s work instantiates a theorization of human culture which does not merely render literature, or law, or art objects into cynical, insensate records of elites and auteurs. His organic approach reveals that the ancient artefact is an expression and mirror of biological as well as cultural forms. To put it another way, without really knowing that they are doing so, humans make things which reflect their insides. Tom’s work makes you realize that when you read a Latin text, that text is actively trying to constitute you into a Roman reader — like a 3D printer with instructions to produce a piece of plastic in a specific way, the scientific, ethical, and political scripts of the Roman text try to make us.

With this, or something like this, in his mind, Tom in this latest work proceeded to examine generativity in a number of different types of ancient evidence, ranging from the practices of Roman bride dowries to the emergence of birthday celebration as a theme in Latin love elegy. Underneath each artefact, Tom found a consistent preoccupation in the Roman attitude to cultural and biological reproduction, one which expressed a profound anxiety that can be conveyed in the form of a simple question: will we continue to survive?

Romans expressed this anxiety in different ways, figuring reproduction, with its insistence upon a continuity of resources, as a matter of long-term survival. The fact that Roman bride dowries are reabsorbed into the natal family to allow women to marry again and to have children is, Tom suggests, an intentional defense mechanism against the failure to reproduce. As a result, generativity in Roman thought relates not only to explicit, biological reproduction (i.e. producing children), but also to making provision for a self-sustaining reabsorption of assets as part of a framework which allows such reproduction to take place. At its core, this legal provision expresses a care to conserve not just culture or biology but energy; like keeping a little something in the storeroom in case of an unexpected hunger. Cast in this light, Roman conservatism, which is so frustratingly obvious and, frankly, obtuse sometimes (just think of Cato the Elder), seems to be not simply fanatical traditionalism, but indeed a form of conservationism.

It is an impulse to conserve that Tom saw in the Roman discourse around luxuria. The chastising of luxuria is not simply, Tom suggests, a knee-jerk political reaction against perceived excesses and hedonism, but rather a criticism of “pointless growth” — i.e. the expenditure of energy which will not return, will not be reabsorbed and thereby conserved for future use. Tom notes that criticisms of luxuria in Roman texts so often employ agricultural and botanical metaphors because luxuria was a metaphysical outgrowth which defied the boundaries of the carefully proportioned Catonian fields, designed and tended to produce year after year. Incidentally, Tom made a point to note that luxuriant excess — a squandering of resources, the refusal to regenerate, to conserve, to recycle — expressed itself in many different ways: the fact that furniture, fine art, construction, urban development, and non-reproductive sex were each as bad as each other speaks to the intersection of conservatism with conservationism in the Roman attitude; i.e. having fancy pedestal tables and sideboards (Livy 39.6) is just as bad as fucking your boyfriend because you should, good Roman, be conserving your attention and energies for generative activities. Here, Tom seems to have revealed a kind of biological essentialism in Roman thought which is not usually, I think, made explicit. Tom notes that while the elegists and other figures from the Roman counter-culture were “ambivalent” about such a formulation of luxuria, they nonetheless accepted its definition; that is, while they did not play by these rules, they accepted that these indeed were the rules. Even if you are walking away from Rome rather than towards it, you are still on the road to Rome.*

Tom translates the Latin luxuria as “pointless growth”, “withering growth”, “wild growth.” An agricultural, biological symptom of “bad” growth is itself a helpful tool to reveal the nature of “good” growth, and Tom realized that, in Roman thought, “good” growth often related to an inseparable dualism: life and death. An insistence that growth (that is, “good” growth, not luxuria) is actually related to death appears, Tom says, in the Pro Marcello (23): Cicero’s exhortation of Caesar to propagate new growth includes the wish that Caesar could bring the dead back to life, if only that were possible. Indeed, the relationship between the living and the dead at Rome was one of Tom’s deepest preoccupations; in the book proposal for the project, Tom had focused on a passage from Rudolph Sohm which I believe was, for him, programmatic: “the heir is treated as though he were the deceased…the deceased continues to live in the person of the heir” (1907, p504). Indeed, the idea that the dead live in the face, the name, and the actions of the living is one of the vital aspects of Roman generation, regeneration, generativity. Tom’s discussion of generativity in this manuscript reveals a living organism, a beating heart underneath the details of textuality. According to his understanding, the Romans formulated their generative function as a life pulse which conserved itself, returned to itself, and, being limited, precious, did not waste itself.

*Ursula Le Guin, The Left Hand of Darkness (1969/1999, p151): “To oppose something is to maintain it. They say here ‘all roads lead to Mishnory.’ To be sure, if you turn your back on Mishnory and walk away from it, you are still on the Mishnory road.”

The Witcher and Star Wars IX; tempora cum causis (10)


With the release on the same day (Dec. 20th 2019) of both the Netflix adaptation of The Witcher and the final installment of the new Star Wars trilogy, The Rise of Skywalker, this week we got an object lesson in how cultural criticism works on a mass scale. Before we dive into either of these, I want again to invoke Jia Tolentino’s analysis of social media as a commercially driven organ, designed to privilege negative or otherwise emotionally provocative content. In Trick Mirror, Tolentino writes that over time, personal lives transforming into public assets via social media meant that “social incentives — to be liked, to be seen — were becoming economic ones” (2019: 6). She goes on: “Twitter, for all its discursive promise, was where everyone tweeted complaints at airlines and bitched about articles that had been commissioned to make people bitch” (2019: 7-8). Looking at the internet as an exercise of performativity (one that extends and magnifies the natural human performativity of the offline world), Tolentino writes that “the internet is defined by a built-in performance incentive” (2019: 8). In How to Do Nothing, Jenny Odell (2019: 18) discusses social media too, drawing on the remarks of Franco Berardi:

“Berardi, contrasting modern-day Italy with the political agitations of the 1970s, says the regime he inhabits ‘is not founded on the repression of dissent; nor does it rest on the enforcement of silence. On the contrary, it relies on the proliferation of chatter, the irrelevance of opinion and discourse, and on making thought, dissent, and critique banal and ridiculous.’ Instances of censorship, he says, ‘are rather marginal when compared to what is essentially an immense informational overload and an actual siege of attention, combined with the occupation of the sources of information by the head of the company.’ [Berardi 2011: 35] It is this financially incentivized proliferation of chatter, and the utter speed at which waves of hysteria now happen online, that has so deeply horrified me and offended my senses and cognition as a human who dwells in human, bodily time.”

The commercial incentive of online interaction is what particularly disturbs Odell: the communities and networks of social media are one thing; the design of such platforms to fulfill a capitalist purpose is another. Odell continues (2019: 60):

“Our aimless and desperate expressions on these platforms don’t do much for us, but they are hugely lucrative for advertisers and social media companies, since what drives the machine is not the content of information but the rate of engagement. Meanwhile, media companies continue churning out deliberately incendiary takes, and we’re so quickly outraged by their headlines that we can’t even consider the option of not reading and sharing them.”

All of this has a bearing on what happened this week. When Netflix dropped The Witcher last Friday, it was met with some noteworthy and negative reviews. Darren Franich and Kristen Baldwin’s “Netflix’s The Witcher is nakedly terrible: Review” (Entertainment Weekly) gave the series an F grade, with a 0/100 on Metacritic. These reviewers immediately, and justifiably, came under fire themselves, given that they admitted that they did not watch the series in its entirety. Response to The Witcher has been divided: critics hate it, the public loves it. So is The Witcher any good? One of the barriers here is the general distaste for “genre” pieces. Some might avoid science fiction, fantasy, or romance just because it is labelled so. Ursula K. Le Guin took on this problem in her essay, “Genre: a word only a Frenchman could love” (reprinted in Words are My Matter, 2019: 10):

“So we have accepted a hierarchy of fictional types, with ‘literary fiction,’ not defined, but consisting almost exclusively of realism, at the top. All other kinds of fiction, the ‘genres,’ are either listed in rapidly descending order of inferiority or simply tossed into a garbage heap at the bottom. This judgemental system, like all arbitrary hierarchies, promotes ignorance and arrogance. It has seriously deranged the teaching and criticism of fiction for decades, by short-circuiting useful critical description, comparison, and assessment. It condones imbecilities on the order of ‘If it’s science fiction it can’t be good, if it’s good it can’t be science fiction.'” 

In the preface to her (critically acclaimed) The Left Hand of Darkness, Le Guin had already drawn attention to the fact that science fiction, like any literature, is about its present, not the future (1969/1999: xvi):

“All fiction is metaphor. Science fiction is metaphor. What sets it apart from older forms of fiction seems to be its use of new metaphors, drawn from certain great dominants of our contemporary life — science, all the sciences, and technology, and the relativistic and the historical outlook, among them. Space travel is one of those metaphors; so is an alternative society, an alternative biology; the future is another. The future, in fiction, is a metaphor.”

The Witcher is not actually “about” magic and monsters; it’s about the relationship between storytelling and reality (Jaskier’s song vs. Geralt’s action), about the pain of isolation (Yennefer), about trying to live your life despite tempestuous circumstances (Geralt); it’s about assembling strange families, when biological ones fail (Geralt, Yennefer, Ciri). Assigning an F to The Witcher because it successfully engages with its own genre, one which you, the reviewer, do not know or care enough about to situate the object of your critique within, removes the rich layers of cultural entanglement which may make such a show worthwhile to a viewer like me. Le Guin continues (2019: 10): “If you don’t know what kind of book you’re reading and it’s not the kind you’re used to, you probably need to learn how to read it. You need to learn the genre.”

I’m not coming at this from a neutral perspective, since I voraciously played, replayed, and loved Witcher 3. But is Netflix’s The Witcher “objectively bad”? No, it’s not. It has haunting performances from Anya Chalotra (Yennefer), and Henry Cavill (Geralt) is perfection. The fight scenes are incredible. And it’s beautiful to look at. Yes, they say “destiny” too many times. But, look, it’s a romp!

On to Star Wars, then. Since we kept up our tradition of seeing the newest Star Wars on Christmas Eve, I was aware of an enormous amount of critical disappointment and fan anger regarding the latest installment before I saw the film itself. You know what? It was fine. Yes, it had a very fast pace, and it wasn’t seamless with the trilogy’s own self-mythologizing. The Star Wars universe is full of holes because of the method of its composition; to some extent the writing, and overwriting (if you think that’s what J.J. is doing), resembles the process of story development in the oral tradition of the Greek epic canon, and in its reception. Consider Odysseus in the Iliad vs. Odysseus in the Odyssey vs. Odysseus in Sophocles’ Ajax. Indeed, the empty spaces projected by Star Wars are part of its charm: it’s a perfect landscape for imaginative rethinking, whether in the form of fan fiction, fan art, or roleplaying games like Edge of the Empire. That Star Wars captures the modern imagination so strongly is somewhat ironically reflected in the strength of the vitriol against it (and in the fan art; peruse #reylo only if you dare).

All of this might be fine if it really were so simple. The emotional economy of the internet has a role to play here, but in this case we end up in a different place than we did with The Witcher. Anthony Breznican of Vanity Fair recorded J.J. Abrams’ public response to the backlash against TROS:

“After a screening at the Academy of Motion Picture Arts and Sciences on Friday, I [=Breznican] asked Abrams what he would say to those who are unhappy. Are they not getting something? Is there a problem in the fandom? ‘No, I would say that they’re right,’ he answered quickly. ‘The people who love it more than anything are also right.’ The director had just returned from a global tour with the film, where he also fielded questions about that mixed reaction. ‘I was asked just seven hours ago in another country, “So how do you go about pleasing everyone?” I was like, “What…?” Not to say that that’s what anyone should try to do anyway, but how would one go about it? Especially with Star Wars.’ With a series like this, spanning more than four decades, nearly a dozen films, several TV shows, and countless novels, comics, and video games, the fanbase is so far-reaching that discord may be inevitable. ‘We knew starting this that any decision we made — a design decision, a musical decision, a narrative decision — would please someone and infuriate someone else,’ Abrams said. ‘And they’re all right.’”

You can see how the viewers’ response to Star Wars might be taken as a reflection of contemporary political and cultural life in the US. In the New York Times, Annalee Newitz affirmed Le Guin’s view that cultural artefacts, sci-fi or not, are reflective of the society which produces and consumes them:

Star Wars became a new national mythos; it rebooted America’s revolutionary origin story and liberty-or-death values using the tropes of science fiction. Now, however, the movies no longer strike the same chord. Just as America’s political system is falling into disarray again, our cultural mythmaking machine is faltering as well.”

How and why we critique Star Wars may well reflect some deeper truth about the times we live in, but there’s another dark side to all this (get it?). To some extent the divided criticism is irrelevant, given that TROS earned an enormous amount of money. Indeed, the controversy only helped bring in the dollars (not to mention all the baby yodas hiding under the xmas trees this year). We entrusted our storytelling to a capitalist behemoth, and it’s disconcerting that cultural criticism has no impact on its forward march. Some have suggested that the F rating which Entertainment Weekly gave The Witcher was motivated by a desire to get more eyeballs (and more $) by artificially stirring up controversy. Given that the internet runs on divisiveness and ire (these are our social currencies), that might have been an economically shrewd move. But was it good cultural criticism?

Jenny Odell on Cicero, Suzanne McConnell on Kurt Vonnegut Jr.; tempora cum causis (9)

Ancient and Modern. In the De Fato (10-11), Cicero discusses whether it is possible for the individual to overcome their nature. Here comes the Loeb:

Stilponem, Megaricum philosophum, acutum sane hominem et probatum temporibus illis accepimus. Hunc scribunt ipsius familiares et ebriosum et mulierosum fuisse, neque haec scribunt vituperantes sed potius ad laudem, vitiosam enim naturam ab eo sic edomitam et compressam esse doctrina ut nemo umquam vinolentum illum, nemo in eo libidinis vestigium viderit. Quid? Socratem nonne legimus quemadmodum notarit Zopyrus physiognomon, qui se profitebatur hominum mores naturasque ex corpore oculis vultu fronte pernoscere? stupidum esse Socratem dixit et bardum quod iugula concava non haberet—obstructas eas partes et obturatas esse dicebat; addidit etiam mulierosum, in quo Alcibiades cachinnum dicitur sustulisse. [11] Sed haec ex naturalibus causis vitia nasci possunt, exstirpari autem et funditus tolli, ut is ipse qui ad ea propensus fuerit a tantis vitiis avocetur, non est positum in naturalibus causis, sed in voluntate studio disciplina; quae tollentur omnia si vis et natura fati…firmabitur.

“The Megarian philosopher Stilpo, we are informed, was undoubtedly a clever person and highly esteemed in his day. Stilpo is described in the writings of his own associates as having been fond of liquor and of women, and they do not record this as a reproach but rather to add to his reputation, for they say that he had so completely mastered and suppressed his vicious nature by study that no one ever saw him the worse for liquor or observed in him a single trace of licentiousness. Again, do we not read how Socrates was stigmatized by the ‘physiognomist’ Zopyrus, who professed to discover men’s entire characters and natures from their body, eyes, face and brow? He said that Socrates was stupid and thick-witted because he had not got hollows in the neck above the collarbone—he used to say that these portions of his anatomy were blocked and stopped up. He also added that he was addicted to women—at which Alcibiades is said to have given a loud guffaw! [11] But it is possible that these defects may be due to natural causes; but their eradication and entire removal, recalling the man himself from the serious vices to which he was inclined, does not rest with natural causes, but with will, effort, training; and if the potency and the existence of fate is proved…all of these will be done away with.”

In this passage, Cicero describes some of the quote-unquote defects which naturally arise in humans. Stilpo (4th c. BCE) reportedly had a natural proclivity for alcohol and sex with women; he was, according to friends, ebriosus (“addicted to drink”) and mulierosus* (“addicted to women”). But, Cicero says, Stilpo was able to master his nature with philosophical training (doctrina), and was never seen drunk again, and showed no outward sign of lust. Zopyrus (5th c. BCE) applied physiognomy, i.e. the theory that human character can be read in the condition of the body, to Socrates and concluded from the philosopher’s body that he could only be an idiot. Oh, and that he must also be “addicted to women” (mulierosus again). Cicero writes that nature may be responsible for giving us certain tendencies. But, he says, it is human agency that can overcome them: “will” (voluntas), “effort” (studium), and “training” (disciplina). This passage, of course, contains an oversimplistic attitude to addiction as well as an ableist assumption that bodily imperfection is a mirror of morality or intellect. It’s also quite clear that these anecdotes are designed to reflect male power in the context of elite competition: the detail that the notorious party animal, Alcibiades, laughed at Zopyrus calling Socrates names suggests a symposiastic setting (Phaedo’s dialogue, Zopyrus, dramatized a debate between the physiognomist and Socrates). Putting those things aside, what do we make of Cicero’s claim that we can overcome our nature?

In the recent (and superb) How to Do Nothing (2019), Jenny Odell cites this passage of Cicero’s De Fato (pp71-72) in the context of arguing for the creation of a “third space” of attention — one which reframes human interaction with reality as a kind of rejection of market forces and commercially-run social media. The book as a whole is a meditation on and a protreptic towards a modern kind of recusatio, i.e. the technique of saying “I would prefer not to.” Odell asks her reader to refuse to internalize the contemporary narrative of productivity, and to reclaim time and space to “do nothing.” (There are a lot of classical references throughout — Seneca, Epicurus, Diogenes. And Cicero’s cum dignitate otium is clearly a spiritual forebear.) Here’s what Odell says about this passage of Cicero (p72):

“If we believed that everything were merely a product of fate, or disposition, Cicero reasons, no one would be accountable for anything and therefore there could be no justice. In today’s terms, we’d all just be algorithms. Furthermore, we’d have no reason to try to make ourselves better or different from our natural inclinations. VOLUNTATE, STUDIO, DISCIPLINA — it is through these things that we find and inhabit the third space, and more important, how we stay there. In a situation that would have us answer yes or no (on its terms), it takes work, and will, to keep answering something else.”

The possibility of escaping (or mitigating) the frailties of human psychology and embodiment which Cicero suggests relies on the intentional application of the mind (or soul). Odell would have us apply ourselves in this way as an act of resistance against cynical structures of social influence. The concept of “will” (voluntas) invokes the notion of presence — or attention — the ability to be here in the moment, to have an appreciation for the moment in all its granularities. To “focus” (studium). As for the “training” (disciplina), this obviously could take a number of forms. But evidently self-awareness, and awareness of the churning forces around you, is at the core of this idea.

*Mulierosus is quite an unusual Latin word! It only appears in extant classical Latin three times. According to Aulus Gellius (4.9.2), quoting “mulierosus” as discussed by the Pythagorean magician, Nigidius Figulus, the Latin suffix –osus indicates an excess of the characteristic in question.

Internet.

Excerpt. Suzanne McConnell on Kurt Vonnegut Jr., 2019: 134-135: “By its nature, literary fiction ‘teaches’: it shows how people feel, think, respond, vary; how circumstances affect them; how their brains, personalities, surroundings and culture make them tick. How an experience strikes a particular person a certain way, and another differently. How a person feels inside as opposed to how they act or are perceived. And so on. All writing teaches — communicates something about something. [p135] Even bad writing. So if you’re writing, you’re teaching. You can’t help it. But then there’s intentional teaching through writing.”*

*Austin came into the room to point to a passage written by Vonnegut (on teachers and teaching) which was quoted on this page. So I thank him for the excerpt this week.

Daily Life. Max helped me grade. 


On Adam Driver.


Earlier this week I went to an independent cinema in Boston in the rain to see a matinee screening of Marriage Story, even though it’s on Netflix and I could have watched it from my couch. I’m glad I did. I really adore going to the movies on my own, especially if there’s hardly anyone else with me in the screening. And giving my attention to only one point of focus feels meditative. As the entire internet knows by now, I am very fond of Adam Driver. I’ve seen an embarrassing number of his films. (But not Silence (2016), because, and I’m sorry to say this, I just can’t deal with Andrew Garfield. Also, I didn’t see Lincoln (2012), in which he has a minor (yet acclaimed) role, because, god, who has the time. And I passed on Midnight Special (2016).) As I was watching Marriage Story, I was thinking about how I would rank my favourite Adam Driver performances, as though ranking an actor’s performances is something I should be doing, or even be thinking about doing. I don’t know if I can really give a numerical value to this, but here’s what I’ve got:

  • Girls (2012-2017). A number of recent profiles have been unable to resist noting that a character in Girls (“Jessa”, played by Jemima Kirke) once said of Adam Driver’s character (“Adam Sackler”): “He does sort of look like the original man” (New Yorker, Oct. 21st; The Washington Post, Dec. 6th; The Observer, Dec. 8th). Even though Driver was nominated for an Oscar recently (for BlacKkKlansman, 2018), part of me still thinks his earlier work in Girls is the best, because he seems at ease in this role, and the outcome of that ease is a performance which is naturalistic and, quite frankly, funny. In general, Driver’s performances can be characterized as tensile: while calm, he always seems on the edge of a fit of rage. This was true long before Kylo Ren. And in Girls it was much more nuanced. For me, Driver is at his best in season 3 of Girls, where his story arc takes him into the theatre; “Adam Sackler” plays Bronterre O’Brien Price in a Broadway production of George Bernard Shaw’s “Major Barbara.” Driver has a theatre company and stage background, which explains why he gravitates towards narratives about the theatre, and thrives (imho) in those roles (see: Marriage Story). Writers like to write about writing, actors like to act about acting. Television allows actors to really inhabit a role: character can be developed more slowly and deliberately; there’s time – and space – for more emotional depth. So even though there are so many great Adam Driver films, I still think of Girls as some of his best work.

    “Don’t only text me ‘CAR CRASH!’” (Girls s3 e09)
  • Marriage Story (2019). First of all, I reject this film’s thesis, namely that Los Angeles is a vapid cultural wasteland (as pointed out, and also rejected, by Ira Madison III this week on Keep It). I loved LA. Anyway, Marriage Story might well be Adam Driver’s best performance to date. Driver does best, I think, when he can be reactive — his trademark intensity and tensility structure his performance most clearly in the silences. Highlights of this film: the claustrophobia of the legal proceedings, particularly that one scene with Alan Alda; the knife scene (!); and, of course, Driver singing “Being Alive” from Stephen Sondheim’s Company (1970). Adam Driver actually sings more often than you might think. In Hungry Hearts (2014), he sings in Italian; he infamously contributes to a song in Inside Llewyn Davis (2013), alongside Justin Timberlake and Oscar Isaac; and he sings as the character “Art the Artist” in two very remarkable episodes of Bob’s Burgers (“The Bleakening,” Parts 1 + 2, in season 8).


  • Paterson (2016). This quiet, contemplative film is a good watch if you feel like indulging in long, lingering close-ups of Adam Driver’s interesting face. Adam Driver plays a bus driver (get it?) whose name is Paterson and who lives in Paterson, NJ. He’s married to the most beautiful and cutest woman alive, a character played by Iranian actress Golshifteh Farahani, who paints everything black and white, and has a precocious dog named Marvin. The film is an imaginative reception of William Carlos Williams’ Paterson, and dramatizes Williams’ conception of a poetic continuum of mind and matter, extending between human consciousness and the mundanities of daily life. Driver plays his role tranquilly, passively. (Again, that trademark tensility works best in profound, deliberate silences.) The interplay between this character’s poesis and his observations of daily life plays out through a series of coincidences and minor dramas which break through the cyclical rhythm of his bus route. Some really wonderful and memorable performances by Barry Shabaka Henley, William Jackson Harper, and Chasten Harmon. Honestly, I love this one.


  • The Man Who Killed Don Quixote (2019). OH BOY, what a hot mess. This Terry Gilliam film had an extremely troubled production, and it shows. The plot is bonkers, a lot of the dialogue is nonsense, the characters are underdeveloped (the female characters especially so), and, as A. O. Scott wrote in the NY Times: “the romanticism has a creepy side.” Plus, it’s long! This film attempts in a rather ham-fisted way to entangle itself with and enact the themes of Cervantes; Jim Jarmusch’s Paterson, and its entanglement with William Carlos Williams, is a much more subtle and successful version of this. But if you can make it through this weird (and, overall, basically bad) thing, there is some really lush cinematography and glamour…

    Adam Driver as Toby and Olga Kurylenko as Jacqui in “The Man Who Killed Don Quixote” (2019) https://twitter.com/quixotemovie/status/997193036747673602
  • Star Wars (2015, 2017; 2019). It’s interesting. I think Adam Driver plays the role of Kylo Ren well, but Driver’s overexposed association with this character has the effect of flattening out his perceived range. Put another way, those who know Driver principally as the villain in this Disney merch fest will (with some cause) think that all he can do is stomp around and have “hissy fits.” But, listen, I think he does good things with the role. Yes, I audibly gasped when he took off that helmet in The Force Awakens (2015). That “You need a teacher” line is unfortunate. And I don’t know why he had to be wearing such high waisted trousers in The Last Jedi (2017). He has a uniquely hunched physicality that makes his fight choreography very interesting to watch. What troubles me most, I suppose, is the oversimplicity of Kylo Ren’s character. (Honestly, there is no “there” there for much of Star Wars.) While his struggle with “dark” and “light” is asserted by the films, the received image of Kylo Ren in pop culture is of an angry young man who violently rejects the limits placed upon him by his social context. In 2019, we have a lot of angry young men who use violence to reject limits. In the context of his broader filmography, Adam Driver’s portrayal of anger is actually quite nuanced; elsewhere his performances of rage contain a self-awareness which acknowledges and indeed urges that anger must be resolved and exorcised somehow. There’s an interesting scene of rather violent rage between Jemima Kirke (who plays “Jessa”) and Adam Driver in season 5 of Girls: it’s intense, it’s absurd. But its intense absurdity is precisely the frame of critique which is needed when depicting acts of anger — there must be some mechanism of judgement that presents a means of resolution. In Kylo Ren’s case, we’ll find out soon how his anger is resolved. But in the meantime, I wonder, as others have, about presenting young male anger as a piece of merchandise. Star Wars reified Driver’s place in the modern canon, but he’s been busy doing other work in order to insist that there’s more to him than this. 


  • There are also some nice small roles and appearances. As a dog lover, I find this short W Magazine interview where Adam Driver talks about his dog, Moose, extremely charming; in the background to that video, there is a woman extremely losing her shit, which I also find very sweet. Driver plays a small but noticeable role as a fuckboi in Noah Baumbach’s Frances Ha (2012), starring the extremely charismatic Greta Gerwig. The next year, Driver sweetly played “love interest” to the force of nature, Mia Wasikowska, in Tracks (2013); there’s a very sad scene in this film involving dogs, btw, so watch out. While he earned his Oscar nomination for BlacKkKlansman (2018), and he does give a very subtle performance in it, I wouldn’t actually say it’s his best (that is often the way with Oscars, isn’t it?). I reject the thesis regarding millennials in Noah Baumbach’s While We’re Young (2014), but he plays this odious part fairly well (the stand-out scene to me in that film is when Naomi Watts, high on Ayahuasca, mistakes Adam Driver for Ben Stiller — that moment is so poignant and intimate).


  • With the good, there’s the not so good. We’ve already talked about The Man Who Killed Don Quixote (2019). Logan Lucky (2017), which put Adam Driver alongside Daniel Craig (doing an accent even before Knives Out) and Channing Tatum, should have been more of a romp, but the whole thing fell flat, and he has a rather underwhelming presence. I wanted to like Jim Jarmusch’s The Dead Don’t Die (2019), but I didn’t — it lacked subtlety and the sharpness of parody; plus, I know that Tilda Swinton basically is a real life alien (spoiler alert) but she has apparently learned nothing about appropriating Asian culture. I watched The Report (2019) a few days ago and I’ve already forgotten about it: not only is it forgettable, it lacks the high drama of its genre, and, more worryingly, presents certain politicians in the guise of heroes (again: lacking subtlety). I think I did watch This Is Where I Leave You (2014) but, again, I remember nothing about it, and it’s just about what you would expect given the genre — although I think there are broad comedies of this type which have more heart.

Roman time, “Mrs. Maisel”, Ursula K. Le Guin; tempora cum causis (8)

Ancient. This week, BU hosted the annual Classics day for high school and middle school students from the Boston area, with workshops on different aspects of the ancient world. The theme this year was ancient time, and so I did a workshop on timekeeping devices in Rome. We talked about the ham sundial from Herculaneum, the so-called ‘Horologium’ of Augustus, and a 4th c. lunar calendar. I had the students recreate these devices in clay and paper to get a sense of how they worked. Afterwards I posted a thread detailing my workshop on twitter, including pdfs of the materials in case anyone wants to use them for a workshop of their own (pdf of the handout | pdf of the printout). Some twitter users tagged it with the unrolling Thread Reader App, so you can read the thread in the resulting blog format if you wish. I wore my “petrify the patriarchy” shirt from wire and honey for Classics day and received lots of compliments!
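In case it helps anyone adapting the workshop, the geometry behind a flat (horizontal) sundial is compact enough to fit on one line. This is standard gnomonics, not anything from my handout: the angle θ of each hour line from the noon line depends on the latitude φ and on the sun’s hour angle, which grows by 15° for every hour t away from solar noon:

$$\tan\theta = \sin\varphi \cdot \tan(15^{\circ} \times t)$$

For Boston (φ ≈ 42.4°), the 2 p.m. line (t = 2) sits at θ = arctan(sin 42.4° · tan 30°) ≈ 21°, which is why the hour lines of a horizontal dial bunch up around noon rather than being evenly spaced like a clock face.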

Modern. After sleeping on it for way too long, I’m finally watching The Marvelous Mrs. Maisel. Although I came to Amy Sherman-Palladino’s earlier work, Gilmore Girls, late in life, when I did discover it, I fell deeply in love with it (one time, when we were still living in LA, we saw Keiko Agena outside of iO West, which is closed now). As much as I’m enjoying Mrs. Maisel, I find myself bothered by one of the characters. Midge Maisel’s father, played by Tony Shalhoub, is a professor of mathematics at Columbia. He’s an older man, and he’s “curmudgeonly.” There’s only one student in his maths class that he thinks is any good, and he says so. His students are DESPERATE for his approval. They try out new references to impress him. They follow him around in a pack. When things start to go wrong at the university, the dean tells him: “You’re a brilliant mathematician, but an uncooperative colleague and a very poor teacher.” There are a lot of interesting touches of modernity and anachronism in Mrs. Maisel, set in the late 1950s. The fact that the state of his teaching would be a concern to the scholarly community may number among them.

I found myself being bothered so much by this character, despite Tony Shalhoub’s deep charm (let’s face it, Shalhoub is a national treasure), that I had to take a moment to think about why and excavate my emotional response. It’s not the character, really, that I have a problem with, but the trope that it draws upon. Shalhoub’s character, the proud patriarch in crisis, is supposed to be flawed, supposed to be fragile. Depicting professorial grumpiness is a vehicle for this character’s essential nature. But, evidently, I’m bothered by the “professor” stereotype. Sometimes when I’m at academic conferences, I see younger men wearing tweed, bowties, thick-framed or horn-rimmed glasses, as though this were the uniform of the intellectual. This was the contemporary style of dress for the older generation of gentlemen who have now become the senior scholars in our field, but for those men these clothes were just clothes, not a costume. (Well, maybe the elbow patches were an intentional display of identity then too.) The idea that there is a specific scholarly aesthetic implies that there is also a specific scholarly behaviour. Say, curmudgeonliness. Or torturing your students.

Education has changed. What we think education is for, who can receive an education, who can do the educating — all those things have changed. We do so many things today that a professor of the 1950s would never think of doing, may even have been incapable of doing. Scholarship over time has opened itself to new ways of thinking. Scholarly personnel are more varied. We need even more new ways of thinking and we need to open our doors to even more people. The potential to do intellectual work was never limited to one kind of person, but for decades the scholar was basically one kind of person. That’s not the case anymore. Yet the stereotype remains. Scholarship and intellectual life is a practice, not a costume.


Internet.

Excerpt. Ursula K. Le Guin 2019: 5: “All of us have to learn to invent our lives, make them up, imagine them. We need to be taught these skills; we need guides to show us how. Without them, our lives get made up for us by other people.” 

Daily Life. Snow came to Boston. 


Plato, the shadow book, Samantha Irby; tempora cum causis (7)

Ancient. I wouldn’t say that I am the biggest Plato* fan in the world, but there is one passage of the Protagoras that I do actually find myself coming back to often. Here comes the Loeb of Protagoras 314a–314b:

καὶ γὰρ δὴ καὶ πολὺ μείζων κίνδυνος ἐν τῇ τῶν μαθημάτων ὠνῇ ἢ ἐν τῇ τῶν σιτίων. σιτία μὲν γὰρ καὶ ποτὰ πριάμενον παρὰ τοῦ καπήλου καὶ ἐμπόρου ἔξεστιν ἐν ἄλλοις ἀγγείοις ἀποφέρειν, καὶ πρὶν δέξασθαι αὐτὰ εἰς τὸ σῶμα πιόντα ἢ φαγόντα, καταθέμενον οἴκαδε ἔξεστι συμβουλεύσασθαι, παρακαλέσαντα τὸν ἐπαΐοντα, ὅ τι τε ἐδεστέον ἢ ποτέον καὶ ὅ τι μή, καὶ ὁπόσον καὶ ὁπότε· ὥστε ἐν τῇ ὠνῇ οὐ μέγας ὁ κίνδυνος. μαθήματα δὲ οὐκ ἔστιν ἐν ἄλλῳ ἀγγείῳ ἀπενεγκεῖν, ἀλλ᾿ ἀνάγκη, καταθέντα τὴν τιμήν, τὸ μάθημα ἐν αὐτῇ τῇ ψυχῇ λαβόντα καὶ μαθόντα ἀπιέναι ἢ βεβλαμμένον ἢ ὠφελημένον. 

“For I tell you there is far more serious risk in the purchase of doctrines than in that of eatables. When you buy victuals and liquors you can carry them off from the dealer or merchant in separate vessels, and before you take them into your body by drinking or eating you can lay them by in your house and take the advice of an expert whom you can call in, as to what is fit to eat or drink and what is not, and how much you should take and when; so that in this purchase the risk is not serious. But you cannot carry away doctrines in a separate vessel: you are compelled, when you have handed over the price, to take the doctrine in your very soul by learning it, and so to depart either an injured or a benefited man.”

Food and drink, the things we consume, can be good or bad for us. But we’re not immediately exposed to this benefit or harm. We have a chance to consider whether or not to ingest them. We can consult someone whose opinion is worth knowing. Ideas are different, according to Socrates. Once you hear something, you can’t unhear it. There is no mechanism to mediate ideas – we become infected, by good things and bad alike, via an organic movement of thought which no vessel can contain. Contagion of this kind is discussed in the Protagoras as part of a warning against accepting the teachings of the sophists – individuals who, from Plato’s perspective, can teach you intellectual parlour tricks, but not true wisdom. As James Collins (2015: 158) writes, “In this scenario, there is no gap between things taught and things learned; both are μαθήματα and instantly transmitted…To hear is to learn. Exposure means ingestion.”

As Collins notes, it’s surprising to hear Socrates speak this way. The absorption of ideas is figured as instantaneous – stripped of the possibility of a failure, or rejection, of understanding (ib.): “Following his metaphor of ingestion (δέξασθαι αὐτὰ εἰς τὸ σῶμα) to the end, also missing are the vital processes of chewing, swallowing, and digestion, not to mention the possibilities of indigestion and regurgitation.” Yet everyone who has tried to learn something knows that it’s not always easy to internalize new ideas. At the same time, the kind of unwitting contagion which this passage describes is a real phenomenon. In 1994, Elaine Hatfield and her colleagues, John Cacioppo and Richard Rapson, produced a text entitled Emotional Contagion, outlining the impact of one individual’s emotions on another’s. In the introduction to the book, Hatfield describes a scenario in which she and Rapson, working together as therapists, left a session in a state of high-wired anxiety. After some reflection they understood that they were both feeling the emotions of their patient, even though they did not realize this was the case, and indeed, had initially missed the signs of her deliberately cloaked distress.

While we’ve historically been discouraged from thinking so, we learn with our emotions. Plato’s suggestion that we “catch” ideas reflects the mechanism of emotional contagion, which in turn suggests that knowledge is generated and conveyed relationally, socially. Despite the fact that we (academics especially) flagellate and exert ourselves into knowing more, there is something to be said for the fact that a student (and, in my case, a professor, i.e. the eternal student) learns passively from their environment and from their social surroundings. The internet’s role in this epistemology of contagion is an interesting one. On the one hand, exposure to so much social information means that we are exposed to (and ingest) ideas at a higher rate than ever before. This does have benefits. The confessional nature of social media has given me a chance to see into the lives of those whose experiences differ from mine. Getting to know the voices of the marginalized prepares me better to advocate for them in my own positions of power (such as they are). On the other hand, the difficulty of resisting this absorption means that malicious ideas also spread quickly. Ultimately, it is of interest to me that, as original as I may think I am, some of my ideas are not coming directly from my internal processes but are developing passively from my interactions with others.

*My problem is not really with Plato, but with the reception of Plato. It bothers me that Plato is so often invoked without placing his ideas in their cultural and intellectual context.

Modern. After hearing Sarah Derbew discuss Kevin Young’s The Grey Album (2012) during her talk at BU last week, I wanted to read it too. The first chapter of this work, “The Shadow Book”, presents a taxonomy of books which fail to be written. Given the fact that what we research and write about must be reflective of our identities, I’m not exactly sure why I am so interested in fragmentation and lack. At a certain point I moved away from the fullness (some might say over-fullness) of Cicero towards the other voices which his works contain – or at least echo – and from that point on I became attracted to the world of the fragmented, forgotten, or lost. Young’s work confirms something which can be readily felt: no writing contains everything that the writer might wish to say, and all writing reveals a negative imprint of the world which shaped it. There is no fullness. In the case of black culture, fullness is negated not just by the natural negation of existence, but by a deep and long history of violence – slavery, social death, and the social inheritance of their effects. Books, if they are written in the first place, are left unfinished, lost, burned. In the context of violence, black authors speak in code; their words say one thing, but there is also another meaning, a shadow. Donna Zuckerberg used the image of the shadow library to describe how harassment in the academy has cost us brilliant scholarship that was never produced. Young’s writing invites a reflection on what we really think text can do — certainly, literature and writing give an index of reality, but they aren’t the totality of what is real. I keep finding myself saying to our students, “The ancient sources don’t want to tell you what you want to know.” There is an inherent conservatism to most ancient writing – ancient authors are not like William Carlos Williams, whose poetry attempts to include, as Young (2012: 16) writes, “not everything but anything.” Cicero, whose letters often seem so confessional, wasn’t making a documentary for us. He took things for granted in his writing (as we all do); aspects of his life which were so familiar to him that he didn’t write about them are the kind of things which we could now never prove existed.

Internet.

Excerpt. Samantha Irby 2017: 218: “People are boring and terrible. I am boring and terrible. My funny runs out, my cute runs out, my smart sometimes hiccups,* my sexy wakes up with uncontrollable diarrhea. I have an attitude. And a sharp edge! I’m impatient. I like the whole bed.”

*I don’t usually include an editorial note on these excerpts (I like them to speak for themselves), but “my smart sometimes hiccups” is my new mantra.

Daily Life. Last week, thanks to Rhiannon Knol, I got to get up close and personal with some early printed classical texts at the Boston Antiquarian Book Fair. 

Temple of Dendur, Lindy West, Josephine Balmer; tempora cum causis (6)

Ancient. One of the interesting features of studying the past is that you can see patterns of change even in objects which claim to be unchanging. A famous passage of the Greek historian, Thucydides, expresses horror at the fact that political crisis changed the very meaning of words themselves: καὶ τὴν εἰωθυῖαν ἀξίωσιν τῶν ὀνομάτων ἐς τὰ ἔργα ἀντήλλαξαν τῇ δικαιώσει, “Words had to change their ordinary meaning and to take that which was now given them” (Histories 3.82). During the reign of Augustus (27 BCE – 14 CE), this phenomenon is perceptible in a number of ways. Augustus, who claimed to restore the republic, changed the course of Roman political life forever, inaugurating a monarchy which did not name itself so. In the Metropolitan Museum in New York there is a famous Egyptian temple honouring the goddess, Isis, as well as Pedesi and Pihor, deified sons of the local Nubian chieftain. This temple is noteworthy in a number of respects, one of which is the fact that the ruler of Egypt depicted making sacrifices to divinities in its reliefs is the Roman princeps, Augustus himself. Augustus is there represented in the traditional regalia of the Egyptian pharaoh: an example of a deep difference consciously hidden underneath traditional forms.

The Temple of Dendur, completed by 10 BCE. Metropolitan Museum.

The Roman princeps, Augustus (left), as Egyptian pharaoh, burning incense before deified figures of Pedesi and Pihor. Metropolitan Museum.

Incidentally, I can’t think about the Temple of Dendur without also thinking about this scene from When Harry Met Sally (1989), filmed right in front of it.


Modern. Last Friday, we made the trek to Cambridge (crossing the Charles is no joke) to hear Lindy West speak at an event organized by the Harvard Book Store as part of the book tour for her newly released The Witches are Coming. I heard West at the MFA last summer, giving a version of what would be the book’s first chapter; that lecture was deliberately arranged as a counterpoint to the narratives of sexual conquest on display in the MFA’s special exhibit at the time: “Casanova’s Europe.” West praised the curators of the exhibit for their explicit acknowledgement that, while Casanova’s memoirs provide a rich document of the 18th c., they also describe “behaviors toward women that today would be criminal.” A praiseworthy effort to make visitors to the museum consider that the objects on display do not uncomplicatedly manifest the beauty and wisdom of the past (there are other indications of this throughout the MFA); that our interaction with these artefacts is not simply one of aesthetic appreciation, but one which creates meaning by contextualizing the object within an understanding of the culture which produced it.

The reading from The Witches are Coming took place in a church, which I thought was somewhat fitting. The audience was 99% women. After her reading, audience members asked questions which were heartbreakingly moving. Over the week that followed, one which was unusually busy, I found snatches of time here and there to read the book itself. Day-to-day life in modern America is filled with noise and fury, rhetoric and bad-faith arguments. West, with characteristic wit, manages to cut through that noise.

Internet.

Excerpt. Josephine Balmer, Papyrus Trace (Papyrological Institute, Florence, 1953) in The Paths of Survival (2017): 

“trapped in the scent of lavender, musk;
letters from a lost world, seeping back

to black, etched in breath-blown dust:
speak out… …dissent… …enough…:

a few precious words of Aeschylus
we’d all believed had gone forever —

the fragment found at Oxyrhynchus
then lost again in an Allied raid

by this second miracle returned to us,
late violets trembling above a grave.”

Daily Life. This week, we had a Greece vs. Rome debate at BU. My colleague, Sasha Nikolaev, made us some ostraka for the vote — I heard him hammering the pot through the wall. It ended in a tie! 


Vergil enamels, liking what you like, Adrienne Maree Brown; tempora cum causis (5)

Ancient. Vergil’s Aeneid has inspired no shortage of visual representations in antiquity and modernity. In the 16th century, an unknown enameler made a series of plaques (82 are recorded) illustrating episodes from the Aeneid. These images are based on woodcut illustrations in the complete works of Vergil, edited by Sebastian Brant and published by Johann Grüninger in Strasbourg in 1502. Here’s a small selection: Aeneas leaves Dido in Book 4 (Met Museum); Aeneas and the Sibyl in the Underworld in Book 6 (Fitzwilliam Museum); Nisus and Euryalus in the enemy camp in Book 9 (Met Museum).


Modern. This week I’ve been thinking about how difficult it can be to be open about what you really like. I was listening to Monday’s episode of What a Cartoon, which is done by the Talking Simpsons hosts, Henry Gilbert and Bob Mackey (I’ve written about them before, and surely will again). This week they were talking about the Pokémon anime (Japan 1997; US 1998). Both hosts spoke with their guest, Kat Bailey, about the pressure they felt to hide their interest in it, even though it was deeply attractive and deeply resonant to them. There are a number of reasons why you might feel the need to hide your interest in something benign. We do want to connect, of course, but openness of this kind is a vulnerable thing. And it’s not just about the popularity contest. When I think back to times when I kept my interests to myself, I can pinpoint a dread which stems from middle-class anxiety. For a long time, part of me truly could not embrace pop culture publicly, as much as I wanted to, because I felt that I was supposed to be interested, or appear to be interested, in something else. It’s certainly connected to my profession; in some of the academic environments I’ve found myself in, there has been a performative preference for the highbrow. But it didn’t originate there for me. Education more broadly has historically expressed itself as the individual’s ability to make the correct series of references to the correct audiences. The peer pressure which arises from a common consciousness can be enough to make you want to hide the parts of your interest that do not fit into the contemporary cultural lexicon.

In this context, I find myself, again, having some praise for the internet, despite its current and growing toxicity. Early on, it gave me an outlet and a community for my (probably bad) creative writing on message boards. The mainstream acceptance of quote unquote nerd culture could be explained by the fact that those who developed secret ways to find their people in the internet’s nascent years set the stage, over time, for the general public’s embrace of once-niche interests. (The cynical commodification of our desires and childhood nostalgia also has a role to play here.) While my twitter account is primarily geared towards an audience expecting Classics content, these days I tweet almost as much about the “bad” tv I watch, Adam Driver, or video games.

I’m beginning very slowly to operate by the principle that even if I know the thing I like is not actually that good, it’s okay for me to like it, and to admit that that’s the case. I’m beginning to regret not giving some things a chance just because I thought I was supposed not to like them. If some cultural artefact resonates with you, the draw is magnetic. The feeling of being pulled in a certain direction without knowing exactly why is the same divining rod that I use for my scholarly life. I’m drawn to certain texts for my research because something about them resonates with me; I read some ancient authors instead of others because I’m more interested in how they do what they do. Resisting that magnetic pull based on the expectation of imagined rejection is an extra mental block which none of us needs. 

Internet.

Excerpt. Adrienne Maree Brown 2019: 11: “I believe in transformative justice — that rather than punishing people for surface-level behaviour, or restoring conditions to where they were before the harm happened, we need to find the roots of the harm, together, and make the harm impossible in the future. I believe that the roots of most harm are systemic, and we must be willing to disrupt vicious systems that have been normalized. I believe that we are at the beginning of learning how to really practice transformative justice in this iteration of species and society. There is ancient practice, and there will need to be future practices we can’t yet foresee. But I believe that with time it must be an incredible pleasure to be able to be honest, expect to be whole, and to know that we are in a community that will hold us accountable and change with us.”

Daily Life. I received two sets of flowers for the first time in my life!