Nausicaa pyxis; climate change and fragments; N. K. Jemisin; tempora cum causis (15)

Ancient. In the Boston MFA, there is a late 5th c. BCE pyxis which depicts the scene from Odyssey Book 6 where a naked Odysseus encounters Nausicaa. Given that the pyxis was an object used by women (as a makeup or jewelry box), it is really interesting to see what kind of scenes are depicted on them; i.e. what kind of media did the Greeks of this period think was fitting for women? I look at this object and imagine a young (affluent) woman holding it in her hands, seeing reflections of her own life in the life of Nausicaa in Phaeacia (as well as the “calculated flirtation”, as Emily Wilson calls it, between herself and Odysseus). Note the care given by the painter to distinguish each of the figures on the vase according to their narrative role and class: Odysseus, as in Homer’s depiction, is embarrassed by his nakedness; Athena is present as his guide; the women attending Nausicaa run wildly away when Odysseus appears (as in Homer), except for the one still engaged in the washing; Nausicaa stands tall, and is elaborately dressed. This graphic representation is remarkably faithful to the verses of the Odyssey. Compare the 20th century version by the American painter William McGregor Paxton, in which everyone is naked, not just Odysseus.


Here is Emily Wilson’s translation of the scene (Odyssey 6.119-146):

“What is this country I have come to now?
Are all the people wild and violent,
or good, hospitable, and god-fearing?
I heard the sound of female voices. Is it
nymphs, who frequent the craggy mountaintops,
and river streams and meadows lush with grass?
Or could this noise I hear be human voices?
I have to try to find out who they are.”

Odysseus jumped up from the bushes.
Grasping a leafy branch he broke it off
to cover up his manly private parts.
Just as a mountain lion trusts its strength,
and beaten by the rain and wind, its eyes
burn bright as it attacks the cows or sheep,
or wild deer, and hunger drives it on
to try the sturdy pens of sheep — so need
impelled Odysseus to come upon
the girls with pretty hair, though he was naked.
All caked with salt, he looked a dreadful sight.
They ran along the shore quite terrified,
some here, some there. But Nausicaa stayed still.
Athena made her legs stop trembling
and gave her courage in her heart. She stood there.
He wondered, should he touch her knees, or keep
some distance and use charming words, to beg
the pretty girl to show him to the town,
and give him clothes. At last he thought it best
to keep some distance and use words to beg her.

Modern. Earlier this week I was texting with a friend when they mentioned that coffee production may be at risk by the year 2050. The dread of climate change is something that is always with me (I stopped eating meat a few years ago for this reason), but I find myself always pushing it to the back of my mind. The coffee thing brought it forward in an instant. I found myself googling flood projections for Boston, where I live now, as well as my childhood home, Glasgow. I thought about the fear of things to come which washed over me when I saw the flood sequence in Parasite. I asked my husband whether, in light of all of this, we were really doing enough. I asked myself why, given our knowledge of the climate emergency, I don’t give up everything and try to grow vegetables and live off the grid in some sustainable way. Then the next day, I got up and continued to write and worry about my book about Latin fragments.

Last week Parul Sehgal’s review of Jenny Offill’s novel, Weather (2020), appeared in The New York Times: “How to Write Fiction When the Planet Is Falling Apart.” (Thanks, Christian and Michele, who each sent me this because they knew I would like it!) Sehgal writes:

‘In her new novel, “Weather,” Offill applies her instruments — the fragment, the odd fact, her deep banks of knowledge on mysticism and natural history — to a broader canvas. The stakes are the survival not of a marriage but of the planet itself. “The question I was thinking about in this book,” she told me, “was, Can you still just tend your own garden once you know about the fire outside its walls?”’

And, further in:

‘These might be familiar stories of family life, but now imagine them told in shards, the plot edging forward in jokes, quotes, Zen koans. The fragment is an old form, perhaps even our native form — don’t we speak to ourselves in curt directives, experience memory as clusters of language? In Offill’s hands, however, the form becomes something new, not a way of communicating estrangement or the scroll of a social media feed but a method of distilling experience into its brightest, most blazing forms — atoms of intense feeling. I read somewhere that clouds could be called floating lakes. That is what these fragments feel like: teeming worlds suspended in white space, entire novels condensed into paragraphs… The domestic and intellectual meet on the same plane in her work; the swirl of hair on the back of a baby’s head is as worthy a subject of contemplation as one of Wittgenstein’s aphorisms.’

The fragment has an essential duality. Whatever it is, it exists in its original context, connected to its surroundings (a baby’s hair swirl to the baby; Wittgenstein’s aphorism to the rest of his work); but it also exists on its own, as an image in a frame. It is broken off (fragment is from frangere, Latin “to break” — it means “broken thing”) yet simultaneously resistant to breakage. Hence Sehgal’s invocation of the atom, which is, ostensibly, the thing which cannot be broken down any further (atom is from ἄτομος, Greek for “a thing that cannot be cut”). The atom as an idea (leaving aside the physical phenomenon which has been given this name) is a hard core, resistant to the processes of damage and loss which eat away at all the things which once surrounded it.

Sehgal’s analysis of Offill’s work highlights how dread in the face of climate change works upon a human observer of reality. Dread eats away at perceptive connectivities, and leaves behind the most intense fragments of experience. Fragmentation plays a role in the micro but also the macro. Contemplation of its process brings up the inevitable question: what will survive? Scholars of antiquity look at this process of fragmentation after it has already occurred: shards of Sappho papyri, tortuous manuscript traditions, small parts of once colossal statues. And while destruction (accidental or deliberate) has a role to play in how things do or don’t survive from antiquity, the kind of destruction we observe does not compare to what climate change can do to the face of the earth in the near future.

In the urgency of the present moment the question shifts from what will survive, to who will survive. The idea that coffee, such a mundane but profound part of my life now, may disappear in thirty years jolts me. But there is so much more at stake than this comfort, this little fragment of my life. The richest countries are the most responsible for climate change, but it is the poorest who will be most affected. How we treat each other is reflected in how we treat the earth. A renewed focus on what is lost, what we’re in the process of losing, and what we stand to lose soon is frightening, but it is what we need. In the context of all of this, the fragment gains a new significance as a symbol both of our perception of reality and of our capacity for action. Reconnecting what is fragmented, contextualizing the atomized, reframes the discrete, isolated parts of our lives as part of an urgent, global narrative.


Internet. 

Excerpt. N. K. Jemisin (The Fifth Season, p150): “A break in the pattern. A snarl in the weft. There are things you should be noticing, here. Things that are missing, and conspicuous by their absence.”

Daily life. We’ve been seeing a lot of films at Coolidge Corner, and it has been wonderful.


How to Write

This week at BU I gave a proseminar for our PhD students, “How to Write.” Here’s what I told them:

How to Write. 1) Establish a practice. 2) Contextualize your work. 3) Create writing community. 4) Use good tools. 5) Read about writing. 6) Stop thinking.

1) Establish a practice. Haruki Murakami is a writer and an ultra marathoner. In his memoir, What I Talk about When I Talk about Running (2008), Murakami makes an alignment between writing and running — in that each is a practice: 

“Most of what I know about writing I’ve learned through running every day. These are practical, physical lessons. How much can I push myself? How much rest is appropriate — and how much is too much? How far can I take something and still keep it decent and consistent? When does it become narrow-minded and inflexible? How much should I be aware of the world outside, and how much should I focus on my inner world? To what extent should I be confident in my abilities, and when should I start doubting myself? I know that if I hadn’t become a long-distance runner when I became a novelist, my work would have been vastly different.”

When you run, or do any kind of physical activity, it’s about putting your body in the position to do the practice. You don’t say to your muscles, “get stronger!” But rather, you run, you put yourself in the right environment every day, and over time your muscles do get stronger. It’s the same way with writing. As with running, when you sit down to write you don’t know how well it will go (and, indeed, how well it will go is governed by somewhat mysterious forces). But you put yourself in that position nonetheless; and some days you do well, other days you don’t. But you do it consistently, and, by trial and error, you figure out what kind of practice works best for you. I’ve known scholars who like to write for an hour every day first thing in the morning, and that’s it. I prefer to set aside an entire day so that I can devote several hours in a row to write. We all have slightly different ways of doing it and that’s fine — but whatever you do, do it consistently.


1a) Focus on time writing rather than words written. Given that you don’t know how many words you’ll be able to write in any given session, using words written as a measure of progress will ultimately become frustrating. Instead, focus on time spent writing. Doing so will allow you to have space to think (and thinking is of course part of writing!) and to develop your ideas. When I was writing my dissertation, I spent 3 to 5 hours a day working on it.

1b) Do it a little every day.
Establishing a practice means writing consistently. You may be a morning person, you may be a night owl, but whatever you do — do it every day. (Every day that you’re working, that is. It’s important to take days off, to rest. To extend the running metaphor, you have to rest your muscles for them to grow. And in addition to that, having a robust and dependable writing practice means that you can have a life. Which is important!)

1c) Keep a record of your hours. Keep a journal to note down the hours you write every day. I show an example of mine from summer 2019 below. On Wednesday 5th June, I wrote for 6 hours “with a lot of faffing” (=British slang for “screwing around”), i.e. I put in the hours but there were many distractions. I also noted what I was working on, so that I could pick up from there the next time I wrote. On Wednesday 10th [July??], I struggled. I wrote from 10.49am to 2pm, and then noted: “I need a break!” When I returned, I annotated this with “didn’t **really** take a break but worried from 2-3.15 :).” A nice example of how writing can go well or it can go badly. Nonetheless, you continue your practice!


I take inspiration here from another writer, Ursula K. Le Guin. In a 1988 interview, she described her writing practice. Worth noting what happened in the evenings: “After 8.00pm — I tend to be very stupid and we won’t talk about this.” No one can write all day. There is always a point of fatigue, and once you reach this, you shouldn’t work against yourself. Go and rest.


2) Contextualize your work. When you begin your research project as a PhD student, you feel an enormous pressure to be original. Indeed, that’s one of the metrics by which we judge successful research — whether it is a new and original contribution to the field. And it’s easy to think that originality means removing yourself from what has been said before. “No one has ever looked at this issue the way I am.” Yet, truly, your writing will be at its best if you go into it acknowledging the fact that you are not alone. In the slides below I show two different perspectives on this. Is your work a lone tree? Or is it a tree that is in a contextual forest, standing alongside other work in an intellectual ecosystem? Create a dialogue between yourself, the ancient evidence, and prior scholarship. Interweave, entangle. Be synthetic.


In a talk from 2004, “Genre: A Word Only a Frenchman Could Love” (reprinted in Words are My Matter), Ursula K. Le Guin describes what it’s like when a writer does not acknowledge the tradition in which they are working:

“A genre is a genre by having a field and focus of its own, its appropriate and particular tools and rules and techniques for handling the material, its traditions, and its experienced, appreciative readers. Ignorant of all this, our novice is about to reinvent the wheel, the space ship, the space alien, and the mad scientist, with cries of innocent wonder. The cries will not be echoed by the readers. Readers familiar with the genre have met the space ship, the alien, and the mad scientist before. They know much more about them than the writer does.”


It’s weirdly easier to do this in a research area that you don’t feel as personally invested in. You can practice this deliberate, scholarly interweaving by going through the following steps:

2a) Pick an ancient piece of evidence. It could be anything.
2b) Analyze it. Spend some time just you and it. Tease out points of significance. Take notes. Think about it in the context of other things you know about its period/genre/whatever.
2c) See what other scholars have said; read 3-5 pieces of scholarship. You’ll see that some of the things you noticed have already been published. But, by reading multiple scholars on the same object/text/problem, you’ll ALSO see that the issue is not a closed one — there are multiple interpretations, and they get more interesting if they take into account what has previously been discussed.
2d) Combine b) and c). Synthesize what you have read from various scholars and then add your own analysis in light of what they have said. Now you have written an interesting and rich piece of research!

DON’T BE AFRAID of finding that your ideas have already been published by someone else. A new observer of a problem will always shed new light on the issue. It is ignorance of prior scholarship that will lead you to make unoriginal work. 


3) Create writing community. Writing is lonely. You have to spend a lot of time on your own, and because you feel vulnerable about the quality of what you’re producing (especially in the beginning), you can feel wary of those around you. But a community of writing is what you need, and it can be very rewarding. There are a number of ways to do this.

3a) Ask a trusted friend or classmate to read your work. The “trusted” part is quite important. Not everyone in the intellectual environments you find yourself in will be a good interlocutor for you. I have a close friend from grad school with whom I still share work, but it had to be this person and not anyone else. It’s personal! Creating relationships where constructive critique can happen takes a lot of work, but it is extremely rewarding.

3b) Agree to swap and critique. Talk to a friend who is perhaps in a similar stage of writing as you (for example, you’re both working on the second chapter of your dissertation), and agree to swap work and meet to discuss it. This can be helpful for a number of reasons. It can help you feel like your work has an audience. And reading what someone at the same or a similar stage as you is writing can help you see your own growth.

3c) Arrange or attend writing meet-ups. It’s a well-known thing that dissertation writing is hard. With that in mind, a number of institutions hold regular meet-ups for PhD students where you turn up, write for a number of hours in a room full of other people who are writing, and then have coffee, socialize, etc. This is a great thing for keeping motivated, creating community, and meeting other grad students outside of your field. BU has a Dissertation Writing Group. You may be like me, however, and need to be at home to write in peace. But if this is something that you think would be useful, it’s a great thing to try.

4) Use good tools. Part of your writing practice will entail finding the right writing tools for you. I suggest the following:

4a) Scrivener. One of the best word processors out there; one which allows for flexibility and non-linear writing. You can watch their videos and see some examples. It isn’t free — although there is a discount for students. However, it may well be a good investment for you. I bought my copy when I started my dissertation and I’m still using it.

4b) Evernote. I don’t recommend this for writing large projects, but it is good for keeping track of various notes, or pdfs. It’s another place to store your ideas which is not on your computer. There is a free version. No matter what system you use, make sure you are REGULARLY BACKING UP YOUR WORK. Back up your files in more than one place!

4c) Forest app. Set a timer and Forest will block websites of your choice for that time. I am addicted to twitter (no surprise there), so I use the Chrome extension version. This is a great way to a) keep track of your writing hours; b) be strict about minimizing distractions. I often set the timer for a short amount of time (c. 20 mins), but go well beyond it. It helps to get into the mindset you need for writing.


5) Read about writing. In addition to reading books by writers on writing (e.g. by Murakami or Le Guin), there are a number of useful resources out there which specifically give advice about academic writing:

6) Stop thinking. Writing and research requires a lot of thinking (naturally). But it also requires a lot of time not thinking. In addition to the fact that you should have a life (i.e. don’t let writing eat up everything), your writing will be better if you spend time not thinking. I end with the immortal words of Don Draper:


Visit to Bates; digital humanities and the human body; Elizabeth Marlowe; tempora cum causis (14)

Ancient. This week I was up in Maine visiting the Classical and Medieval Studies department at Bates College. On Thursday (30th Jan. 2020), I talked about my primary research interest right now, Cicero and the Latin poets (I’m finishing up a book on this); on Friday (31st Jan. 2020), I talked about digital approaches to teaching. Handouts below:

Modern. I can understand why “the digital” as category sometimes seems so distinct from the world of humanism or humanistic inquiry. But investigating the digital within the framework of the extensibility of human embodiment immediately complicates this view. Digital humanists (this term, to some, is tautological; to others, self-negating) often emphasize the essential continuity between established forms of intellectual work and the capacities of contemporary digital techniques; as Eileen Gardiner and Ronald Musto (The Digital Humanities; 2015: 2) write:

…everything from the scholar’s desk and shelves, study, studio, rehearsal and performance space, lecture halls, campuses, research institutes and convention halls can also legitimately be considered environments. Yet in many ways these new digital tools carry on, in analogous ways, the same functions of traditional humanities. Is the very computer upon which humanists rely so heavily still a tool, something akin to their medieval writing tablets?

Digital techniques build upon traditional humanistic practices but also develop them; Sarah E. Bond, Hoyt Long, and Ted Underwood (“‘Digital’ Is Not the Opposite of ‘Humanities’”; 2017):

Much of what is now happening under the aegis of digital humanities continues and expands those projects. Scholars are still grappling with familiar human questions; it is just that technology helps them address the questions more effectively and often on a larger scale.

“Digital humanists”, who spend so much time theorizing their own relationship to classical traditions and contemporary technology, are often met with knee-jerk reactions by those who have not taken the time to situate their own intellectual complaint. It all brings to mind Ursula K. Le Guin (I keep coming back to her), who regularly drew attention to the fact that her critics could not get past the genre of her writing to grasp the meaning of its content. At face value digital projects can have an alienating effect on traditional sensibilities, but when we dig deeper we quickly see that the intellectual processes required for such work are just as complex and interesting as the standard products of scholarship. I have written elsewhere about how teaching with digital techniques encourages students to sharpen analytic skills and deepen their intellectual commitment to research.

Anyway, returning to the embodiment part in all this. Technology is absolutely bound to the human body; formed for human use, imagined as an extension of human manipulation (in a literal sense of manus, i.e. ‘hand’) of reality. While contemporary technology sometimes feels so seamless as to be invisible to our own theorization, looking at older artefacts in digital history makes this incredibly clear. Take, e.g., Philippe Henri’s (1984) “Cadavres Exquis / Exquisite Corpses.” This is a program for a computer-generated poem: i.e. Henri wrote the code, but the actual poem was “written” when the program was run on a computer; and indeed rewritten anew each time the program was run. The code was circulated on paper (see the slide below: from Nick Montfort’s [@nickmofo] lecture at BU last year, “Translating Computational Poetry” — watch a video recording of the lecture here); and in order to run the program, a human being had to type it by hand into a specific computer, the TRS 80.
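Henri’s actual code isn’t reproduced here, but the mechanics of such a program are easy to sketch. Below is a minimal Python analogue (the word lists and the function name `exquisite_corpse` are my own inventions, not Henri’s): each run of the program recombines fragments into a “new” poem, which is exactly the property that made the circulating source code a kind of latent text.

```python
import random

# A "cadavre exquis" assembles a poem from independently chosen parts,
# so each run of the program "writes" a new text. The word lists below
# are invented for illustration only; Henri's original was distributed
# as source code to be typed into a TRS-80 by hand.
SUBJECTS = ["the drowned sailor", "a nameless girl", "the last archive"]
VERBS = ["remembers", "unravels", "refuses"]
OBJECTS = ["the salt of the sea", "its own shadow", "a broken vase"]

def exquisite_corpse(lines=3, seed=None):
    """Generate a poem, one subject-verb-object line per row."""
    rng = random.Random(seed)  # seed=None means a new poem each run
    return "\n".join(
        f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(OBJECTS)}"
        for _ in range(lines)
    )

print(exquisite_corpse(lines=3))
```

Run it twice without a seed and you get two different poems; fixing `seed` makes a run reproducible, a neat little figure for the difference between a performance and a text.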


“Cadavres Exquis”, which is only one example of a whole genre of computational poetry, very clearly demonstrates the entanglement of technology with the essences of humanity, not just the body, but indeed the “soul” (if such a dichotomy is even truly real). The human spark which invents the poetry; the human body which materializes it; the technological body (i.e. the computer) which extends that invention and materialization.

When I found out about this example of entangled text and technology from Nick Montfort’s talk at BU, it immediately made me think of the contemporary emulators used to play old video games on modern computers; i.e. programs which simulate the hardware of the N64 so that you can play Ocarina of Time without having to use the physical tools required in 1998. Such digital reconstructions (if that’s even the right word) have a preservative effect, but they also make me think about the relationship between my own body and the console at the time when the game was originally released. Sitting cross-legged on the floor of the attic, holding a controller (that was physically attached to the console – lol!), blowing the dust out of a Goldeneye cartridge. There are so many structural similarities between our relationship with these modern artefacts, and the historical processes which we study; the reception and reconstruction of ideas from antiquity to modernity. The relationship between text and context. The social and embodied nature of textual production.

Internet.

Excerpt. Elizabeth Marlowe (@ElizMarlowe) Shaky Ground (2013: 9): “Many archaeologists follow the thinking of Paul Kristeller, who suggested that ‘art’ as we know it wasn’t invented until the eighteenth century. According to this view, notions of pure, historically transcendent form slide perilously close to deeply suspect ones of ahistorical universal beauty. Ancient objects should instead be understood as manifestations of ‘visual culture’ or ‘material culture’ — the understanding of which depends heavily on context. In this and in much of the recent literature, the binaries are conspicuous: archaeology vs. art history, academia vs. museums, context vs. form, artifact vs. art, history vs. beauty, resonance vs. wonder.” 

Daily Life. Morning light in Maine. 


Emotions in intellectual networks (Randall Collins & Fire Emblem); tempora cum causis (13)

Ancient. Classes for the Spring 2020 semester began this week. I’m teaching Women in Antiquity (#womenancient) again, and a grad seminar on Roman Intellectual Life (#romanintellect). Here are the syllabuses for each of them: 

Modern. Intellectual and artistic life is often figured as solitary. And, yes, parts of it often are. There has to be time where you’re working on your craft, reading and studying and developing. There has to be practice. But the idea that there is one genius at work, a unitary soul, one which doesn’t rely on or need the presence of others feels fundamentally wrong. There is a social aspect to all of this. And an emotional one. In The Sociology of Philosophies (1979), Randall Collins consistently pairs thought with emotion in his analysis of how socio-intellectual networks form. There are at least three things which Collins suggests are needed for an intellectual interaction: 1) present individuals physically assembled; 2) shared focus and awareness of that shared focus; 3) shared emotional state, or mood.

We could really push back against the insistence that individuals need to be physically assembled (“face-to-face”); interaction rituals take place on the internet every day, where the embodied nature of human interaction is tested and extended. But an important part to linger on here is the function of emotion in creating intellectual as much as social structure. Awareness and self-awareness play a role, and emotions are a processing tool that allows mutual understanding and self-reflexivity.

Collins suggests that focus and emotion have a role to play in the empowering of actions (and texts/objects used in such action) and individuals alike. Ritual actions are “charged up” via repetitions which are needed to give them meaning; if they aren’t replenished in a timely manner, then they lose their significance (consider: going regularly to a therapist; to yoga; to a place of worship; to Latin class). Engaging in social actions regularly insists on their significance, and on the very significance of social interaction regardless of the activity; i.e. going to yoga is as much about communing with your social network and mutually affirming the significance of that shared focus as it is about the technical or spiritual actions involved.

Just as social actions need to be regularly “charged up” in this way (via repetition), so too, Collins suggests, do individuals need to be “charged up” emotionally. Collins writes (p23) that intellectual “encounters have an emotional aftermath”; an individual whose emotional energy is replenished is thereby empowered with charm and leadership capability, but an individual who is not emotionally “charged up” will become demoralized — passive, depressed. The sliding scale of emotionality, Collins suggests, strongly impacts the individual’s ability to engage in the socio-intellectual structure which creates these emotions in the first place; consider: a beginning yoga student who feels alienated in their practice and does not feel adequately supported by their teacher; or a Latin student who is not given the emotional space to make errors by their instructor. Motivation is an extension of emotional state, and emotional state in intellectual networks consists in a relationship to the social structure (and hierarchy): the yoga studio; the Latin classroom.

Interestingly, the theory of motivation and emotional energy in relation to the creation and development of community is a significant feature in Fire Emblem: Three Houses. In this game, you are a “professor” (of war), tasked with developing the skills of a group of students, each with their own personalities, desires, and talents. While you train your students, you must pay attention to their “motivation level” – the students individually have a bar with four “charges” of motivation, which you spend in distributing skill points. When student motivation is low, you can replenish their energy with a range of activities: share a meal with them, have tea with them, return lost items, answer their questions after class, etc. etc. This is actually a rather sophisticated reflection of a solid pedagogical theory: students will not progress or level up unless you create an environment in which they become emotionally replenished.
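The game’s mechanic can be sketched as a toy model. Everything here except the four motivation charges described above (the class, method names, and numbers) is my own illustrative invention, not the game’s actual implementation:

```python
class Student:
    """Toy model of the motivation mechanic: each student holds up to
    four 'charges' of motivation, spent when training and replenished
    by shared activities (a meal, tea, a returned lost item)."""
    MAX_CHARGES = 4

    def __init__(self, name):
        self.name = name
        self.motivation = self.MAX_CHARGES
        self.skill_points = 0

    def train(self):
        # Spending a charge converts motivation into skill progress;
        # an unmotivated student cannot be trained at all.
        if self.motivation == 0:
            return False
        self.motivation -= 1
        self.skill_points += 1
        return True

    def replenish(self, charges=1):
        # Emotional energy restored, capped at the maximum bar.
        self.motivation = min(self.MAX_CHARGES, self.motivation + charges)

student = Student("Bernadetta")
while student.train():
    pass                       # spends all four charges on skill points
student.replenish(charges=2)   # a shared meal and a cup of tea
```

The pedagogical point survives the simplification: `train` simply fails once motivation hits zero, so progress depends on replenishment, not just instruction.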

Internet.

Excerpt. Randall Collins (1979/2002: 21): “Intensely focused situations penetrate the individual, forming symbols and emotions which are both the medium and the energy of individual thought and the capital which makes it possible to construct yet further situations in an ongoing chain.” 

Daily Life. Over the last two weeks I’ve been cycling as much as possible. I remembered that I had a copy of Eleanor Davis’ You & a Bike & a Road (Koyama Press 2017) and read it again the other evening. It’s a moving memoir of a solo cross-country bike tour. I remember following along on twitter back in 2016 when Davis posted updates as it was happening.



On ‘self-care’ and ‘mindfulness’; Hayden White; tempora cum causis (12)

Modern. When new terms enter the contemporary lexicon, it’s natural to find them kind of annoying at first. (I remember when ‘selfie’ was new and now I have no shame in using it, or taking them.) I’ve been annoyed with two in particular: 1) ‘self-care’; 2) ‘mindfulness.’* Yes, part of the annoyance is that when new catchwords arise they seem to be everywhere; the combination of their novelty and their ubiquity is probably what really rubs me the wrong way. But also: a new term needs time and repeated use to develop meaning, and that process of negotiation can reveal subtleties which complicate the word’s original intent. And there are some reasons to be distrustful of ‘self-care.’ The whole point, as I understand it, of self-care as an idea is that you disrupt your work habits (which you’ve feverishly developed in order to become, or remain, gainfully employed in the unstable economic landscape of 2020) in order to spend some time doing things that you actually like, which make you feel rejuvenated, and return to you some of your innate creative abilities. Okay! Well, obviously, the first issue is that time to yourself – the time to contemplate, to ‘do nothing’ (see: Jenny Odell), should, by all rights, be a bigger part of our lives in the first place. Secondly, self-care is often pitched not as a rebellion against the commodification and infestation of our private lives, but rather as its tool; i.e. self-care is supposed to rejuvenate us so that we can get back to work. At the “Facing Race” conference (Nov. 2016), Roxane Gay put it well (paraphrase via live-tweet): “I can’t stand the phrase self-care. It’s what women can do, but don’t do. We do it so we can work, but what comes next?”  

Lastly, self-care is regularly figured as a consumerist activity; you should try searching “self-care face mask” in twitter. Self-care as the deliberate derailing of learned habits of overwork is itself a good thing, I think. But it’s hard to practice. And as a result, self-care has entered the zeitgeist as something quite frivolous, a superficial manifestation of something that is mostly invisible; a negotiation with yourself, and your self-perception. Likewise, ‘mindfulness.’ The point of this, again, as I understand it, is to consciously pay attention to what is happening in the very moment; including, if not particularly, your own internal, emotional landscape. To put it oversimplistically, we only really have what we are experiencing right now. Sure: we have indications of the future; and we have records of the past. But we are experiencing the present. Mindfulness as a practice is intended to remind us of this, and to encourage us to engage in the present fully, and to perceive its granularities. And to give us the ability to understand when we are being drawn into behaviours which are not totally within our control.

When it comes down to it, I love twitter. Over the years it has brought me community and a sense of belonging in a field that is often quite severe towards its members. I like its pluralism; I thank it for giving me more perspectives on certain issues. I think it can be empowering. In Classics, it’s where a lot of the social justice work starts. And because my personal life is deeply intertwined with my professional life, it has also been good for my work. I never want to write a screed against its use, and indeed, despite its documented toxicities, I still find myself encouraging people to use it so that they can get their work out into the world. But for all its functionalities, I don’t always like how I feel when I use it. I don’t like mindlessly scrolling; and I don’t like the possibility that at any given moment of casual scrolling, I can be made to feel all sorts of negative emotions that were not there seconds ago (and twitter privileges emotionally volatile content). It’s a turbulence which I volunteer for, but I don’t have to. I don’t have to participate in the parts that are engineering me.

I don’t want to leave twitter. I took a hiatus last summer to work on my book, and I hated it. As much as I want to have time that is my own, I also want to engage with the internet’s soul. So, here’s what I’ve been thinking. Snail mail (physical letters! some things, they tell me, still exist in “material reality”, whatever that means) only arrives once a day. You check it, and then you know what you’ve got, and there won’t be another thing to check till tomorrow. You get on with your day. But twitter (and email, don’t get me started) can come for you whenever you open that app. Sometimes, I think about social media in terms of the functionality of Stardew Valley. Long story short, this is a very charming, and calming, farm simulator, which operates on a calendar with days and seasons. Every morning when you wake up in game, the fruits and vegetables whose seeds you had planted previously have produced new growth, which you can harvest. But this harvesting should only take up a little part of the day. After which, you can explore the world, talk to the characters, maybe go fishing or mining.

Yes, it’s a farming simulator, but even this game understands there’s more to life than your occupation! I want to treat social media and work emails like this. Harvest (i.e. open, and deal with?) once or twice a day. What I’m doing right now is letting every twitter or email notification take my attention whenever it sends me something, and this is the equivalent of virtually sitting in my field and staring at my crops until they tell me I can harvest them. Actually, the more I think about it, video games in general have a built-in mindfulness which reality sometimes does not. You, the protagonist, receive missions, but you choose in which order, when, or even if you want to do them. You can dissent from tasks given to you, you can (usually) take your sweet time and indulge in as many side quests as you want. We can learn something from this. There’s an intentionality which we often (or at least I do, I’ll speak for myself) willingly give up. But you can always get it back.

* ‘Self-care’ as a term actually appears with the meaning ‘self-interest’ as early as the 16th c., when it was used in the English poet George Turberville’s translation of Ovid’s Heroides (specifically: 19.205). ‘Mindfulness’ too has a long history, appearing in English as “the quality or state of being conscious or aware of something; attention” in the 16th c. (see Oxford English Dictionary). These terms are ‘new’ to the extent that they have reappeared in the context of a specific socio-cultural moment, in which modern human life is structured according to 21st c. philosophies of productivity.

Internet.

Excerpt. Hayden White 2010*: 114: “The kind of understanding we get from following his story is different from the kind of understanding we might get from following his arguments. We can dissent from the argument while assenting, in such a way as to increase our comprehension of the facts, to the story itself.” 

*repr. of “The Structure of Historical Narrative” (1972)

Daily Life. I recently fell in love with cycling again because of Boston’s city bikes. It’s good stuff. 

Tom Habinek, realism vs. the ‘glob’, Kurt Vonnegut Jr.; tempora cum causis (11)

Ancient. Last weekend was the annual meeting of the Society for Classical Studies. Since I was still back in the UK with my family over the New Year, I missed most of it, but I was there for the last day to take part in the panel commemorating Prof. Tom Habinek, who sadly died last year. Tom was my PhD advisor, and a major influence in the field of Roman Studies. The event was very poignant, but fitting. On Sunday evening I posted my part of the panel, which you can read here: “Tom Habinek on ‘generativity’.” 

Modern. In an essay originally published in 1971, “The Culture of Criticism”, Hayden White describes the frustrations of Ernst Gombrich, Erich Auerbach, and Karl Popper (respectively: art historian, philologist and literary critic, philosopher of science) with the avant-gardists as typified by, for example, the abstract expressionist, Jackson Pollock. Each of these scholars held an attachment to realism; in some cases considering realism, in historiography and art alike, to be a means of resisting authoritarianism, with its power to overwrite the experience of reality by means of ideology. White (2010*: 105) writes that for these critics, historical, literary, or artistic realism, i.e. an attempt to represent reality as it actually is or was “results from the controlled interplay of human consciousness with a shifting social and natural milieu.” In the face of the fact that realism is supposed to reflect the human perception of reality, the avant-garde is taken by these critics to be a frustration of perception rather than a refinement of it. More than this, this break with tradition is a challenge to perception. White writes (2010: 107): 

“The surfaces of the external world, so laboriously charted over the last three thousand years, suddenly explode; perception loses its power as a restraint on imagination; the fictive sense dissolves — and modern man teeters on the verge of the abyss of subjective longing, which, Auerbach implies, must lead him finally to an enslavement once more by myth.”

(The fear of “myth” — figured as an antitype to so-called “rationality” in tandem with “realism” — has probably produced a number of negative results itself.) By the end of this essay, White (2010: 108-110) points to one of the real comforts of realism, one which lies in its hierarchical nature. Realistic art or narrative reflects a grammatically syntactical worldview, i.e. a mode of composition which privileges certain ideas over others, and arranges information around that privilege; whereas artefacts of the avant-garde might be interpreted as paratactical — presenting discrete elements “side-by side” (= παρά) in a “democracy of lateral coexistence” (2010: 109).

In Washington DC last weekend, I found myself face-to-face with Hans Hofmann’s Oceanic (1958) in the Hirshhorn Museum. I was really struck by the large heaps of paint in certain parts of the work, which I have now affectionately come to call “globs.” It feels appropriate!

Inspired by that visit, when I returned to Boston I wanted to go and look closely at more oil paintings in the MFA. Last night we got up close with some more excellent globs from Lee Krasner (Sunspots, 1963) and Joan Mitchell (Chamonix, c. 1962):

Digitization is vital, and I depend on it for my teaching and my scholarship, and I would never want digital resources to be taken away from me. But there is pretty much nothing like looking a glob straight in the eye, if you get the chance to. You can get a general sense of texture from a photograph. But the glob is just so noticeable IRL. Krasner applied oils straight from the tube onto the canvas for Sunspots, and you can tell. Looking at that painting tells the story of its making. As for Mitchell’s Chamonix, you can see the movement of her body in its wide, energetic strokes. Each is a record of embodiment, one which figurative, narrative, and supposedly veristic accounts tend to leave invisible. Back to Hayden White (2010: 110) one last time:

“The avant-garde insists on a transformation of social and cultural practice that will not end in the substitution of a new elite for an old one, a new protocol of domination for the earlier ones, nor the institution of new privileged positions for old ones — whether of privileged positions in space (as in the old perspectival painting and sculpture), of privileged moments in time (as one finds in the older narrative art of fiction and conventional historiography), of privileged places in society, in privileged areas in the consciousness (as in the conservative, that is to say, orthodox Freudian psychoanalytic theory), of privileged parts of the body (as the genitally organized sexual lore insists is ‘natural’), or of privileged positions in culture (on the basis of presumed superior ‘taste’) or in politics (on the basis of a presumed superior ‘wisdom’).”

* “The Culture of Criticism” (1971) is reprinted in The Fiction of Narrative: Essays on History, Literature, and Theory (2010), edited by Robert Doran.

Internet.

Excerpt. Kurt Vonnegut Jr. 1987: 44: “I thought about myself and art: that I could catch the likeness of anything I could see — with patience and the best instruments and materials. I had, after all, been an able apprentice under the most meticulous illustrator of this century, Dan Gregory. But cameras could do what he had done and what I could do. And I knew that it was this same thought which had sent Impressionists and Cubists and the Dadaists and the Surrealists and so on in their quite successful efforts to make good pictures which cameras and people like Dan Gregory could not duplicate.” 

Daily Life. We spent New Year’s Eve walking along the shore at Troon. 


Tom Habinek on “generativity” (SCS 2020)

On 5th January 2020 I took part in a commemoration of Tom Habinek at the SCS organized by James Ker, Andrew Feldherr, and Enrica Sciarrino; with Basil Dufallo, Zsuzsanna Várhelyi, Scott Lepisto, Enrica Sciarrino, and myself as panelists. With the generosity of Hector Reyes, we were able to read Tom’s (incomplete) book manuscript on the topic of personhood and authorship. Here’s the text of my contribution to the workshop, in case of interest. Enormous thanks to everyone involved and everyone who came to the panel.


It is my task today to speak on the concept of generativity as discussed in Tom’s manuscript. When I think of Tom’s work and the influence he had on students like me, it is, indeed, particularly his theorization of generativity which I feel to have been the most impactful. In earlier works, Tom’s interest in generativity manifested in his study of social generation via cultural production and reproduction, with a focus on how ritual acts instantiated Roman community. In a key passage of The World of Roman Song (2005, p129), Tom cited the work of the anthropologist, Paul Connerton, who, in How Societies Remember (1989, p62) discussed Thomas Mann’s understanding of the Freudian ego: 

“We are to envisage the ego, less sharply defined and less exclusive than we commonly conceive of it, as being so to speak ‘open behind’: open to the resources of myth which are to be understood as existing for the individual not just as a grid of categories, but as a set of possibilities which can become subjective, which can be lived consciously. In this archaising attitude the life of the individual is consciously lived as a ‘sacred repetition’, as the explicit reanimation of prototypes.”

The “explicit reanimation of prototypes” is how Tom understood Roman self-construction: the invocation of ancient exemplars; the continuous citation and reinscription of Roman ancestral memory; rituals which resubstantiated the dead in the bodies of the living. Roman literary and political history demonstrates clearly that the Romans were interested in how their culture generated and regenerated itself; how the present day related to the past and preserved a tensile balance between new iterations of Roman youth, and their ancestral blueprints. All we need think of is the late Republican Brutus contemplating his ancestor, the expeller of kings; or perhaps Cicero in the Pro Caelio raising the ancient, blind Appius Claudius from the dead to speak with Cicero’s own lips and chastise Cicero’s own enemies.

In his latest work Tom approached the question of Roman generativity from some new perspectives. In his search for an understanding of Roman personhood, he figured the Roman persona as an active process, not a passive state; I think that for Tom persona was not a noun, but a verb. “Personifying” is a practice — it is an action, it is alive. Starting from this position, Tom was able to see different kinds of ancient evidence not as discrete, disconnected elements of Roman intellectual systems, but rather as mutually supportive organs of an organic, synthetic whole. Tom’s work instantiates a theorization of human culture which does not merely render literature, or law, or art objects into cynical, insensate records of elites and auteurs. His organic approach reveals that the ancient artefact is an expression and mirror of biological as well as cultural forms. To put it another way, without really knowing that they are doing so, humans make things which reflect their insides. Tom’s work makes you realize that when you read a Latin text, that text is actively trying to constitute you into a Roman reader — like a 3D printer with instructions to produce a piece of plastic in a specific way, the scientific, ethical, political scripts of the Roman text try to make us.

With this, or something like this, in his mind, Tom in this latest work proceeded to examine generativity in a number of different types of ancient evidence, ranging from the practices of Roman bride dowries to the emergence of birthday celebration as a theme in Latin love elegy. Underneath each artefact, Tom found a consistent preoccupation in the Roman attitude to cultural and biological reproduction which expressed a profound anxiety, one which can be conveyed in the form of a simple question. Will we continue to survive?

Romans expressing this anxiety in different ways figured reproduction, with its insistence upon a continuity of resources, as relating to survival in the long term. The fact that Roman bride dowries are reabsorbed into the natal family to allow women to marry again and to have children is, Tom suggests, an intentional defense mechanism against the failure to reproduce. As a result, generativity in Roman thought relates not only to explicit, biological reproduction (i.e. producing children), but making provision for a self-sustaining reabsorption of assets as part of a framework which allows such reproduction to take place. At its core, this legal provision expresses a care to conserve not just culture or biology but energy; like keeping a little something left in the storeroom in case of an unexpected hunger. Cast in this light, Roman conservatism, which is so frustratingly obvious and, frankly, obtuse sometimes (just think of Cato the Elder) seems to be not simply fanatical traditionalism, but indeed a form of conservationism.

It is an impulse to conserve that Tom saw in the Roman discourse around luxuria. The chastising of luxuria is not simply, Tom suggests, a knee-jerk political reaction against perceived excesses and hedonism, but rather a criticism of “pointless growth” — i.e. the expenditure of energy which will not return, will not be reabsorbed and thereby conserved for future use. Tom notes that criticisms of luxuria in Roman texts so often employ agricultural and botanical metaphors because luxuria was a metaphysical outgrowth which defied the boundaries of the carefully proportioned Catonian fields, designed and tended to produce year after year. Incidentally, Tom made a point to note that luxuriant excess — a squandering of resources, the refusal to regenerate, to conserve, to recycle — expressed itself in many different ways: the fact that furniture, fine art, construction, urban development, and non-reproductive sex were each as bad as each other speaks to the intersection of conservatism with conservationism in the Roman attitude; i.e. having fancy pedestal tables and sideboards (Livy 39.6) is just as bad as fucking your boyfriend because you should, good Roman, be conserving your attention and energies for generative activities. Here, Tom seems to have revealed a kind of biological essentialism in Roman thought which is not usually, I think, made explicit. Tom notes that while the elegists and other figures from the Roman counter-culture were “ambivalent” about such a formulation of luxuria, they nonetheless accepted its definition; that is, while they did not play by these rules, they accepted that these indeed were the rules. Even if you are walking away from Rome rather than towards it, you are still on the road to Rome.*

Tom translates the Latin luxuria as “pointless growth”, “withering growth”, “wild growth.” An agricultural, biological symptom of “bad” growth is itself a helpful tool to reveal the nature of “good” growth, and Tom realized that, in Roman thought, “good” growth often related to an inseparable dualism: life and death. An insistence that growth (that is “good” growth, not luxuria) is actually related to death appears, Tom says, in the Pro Marcello (23): Cicero’s exhortation of Caesar to propagate new growth includes the impossible wish that Caesar could bring the dead back to life, if only that were possible. Indeed, the relationship between the living and the dead at Rome was one of Tom’s deepest preoccupations; in the book proposal for the project, Tom had focused in on a passage from Rudolph Sohm which I believe was, for him, programmatic: “the heir is treated as though he were deceased…the deceased continues to live in the person of the heir” (1907, p504). Indeed, the idea that the dead live in the face, the name, and the actions of the living is one of the vital aspects of Roman generation, regeneration, generativity. Tom’s discussion of generativity in this manuscript reveals a living organism, a beating heart underneath the details of textuality. According to his understanding, the Romans formulated their generative function as a life pulse which conserved itself, returned to itself, and, being limited, precious, did not waste itself.

*Ursula Le Guin, The Left Hand of Darkness (1969/1999, p151): “To oppose something is to maintain it. They say here ‘all roads lead to Mishnory.’ To be sure, if you turn your back on Mishnory and walk away from it, you are still on the Mishnory road.”

The Witcher and Star Wars IX; tempora cum causis (10)


With the release on the same day (Dec. 20th 2019) of both the Netflix adaptation of The Witcher and the final installment of the new Star Wars trilogy, The Rise of Skywalker, this week we got an object lesson in how cultural criticism works on a mass scale. Before we dive in to either of these, I want again to invoke Jia Tolentino’s analysis of social media as a commercially driven organ, designed to privilege negative or otherwise emotionally provocative content. In Trick Mirror, Tolentino writes that over time, personal lives transforming into public assets via social media meant that “social incentives — to be liked, to be seen — were becoming economic ones” (2019: 6). She goes on: “Twitter, for all its discursive promise, was where everyone tweeted complaints at airlines and bitched about articles that had been commissioned to make people bitch” (2019: 7-8). Looking at the internet as an exercise of performativity (one that extends and magnifies the natural human performativity of the offline world), Tolentino writes that “the internet is defined by a built-in performance incentive” (2019: 8). In How to Do Nothing, Jenny Odell (2019: 18) discusses social media too, drawing in the remarks of Franco Berardi:

“Berardi, contrasting modern-day Italy with the political agitations of the 1970s, says the regime he inhabits ‘is not founded on the repression of dissent; nor does it rest on the enforcement of silence. On the contrary, it relies on the proliferation of chatter, the irrelevance of opinion and discourse, and on making thought, dissent, and critique banal and ridiculous.’ Instances of censorship, he says, ‘are rather marginal when compared to what is essentially an immense informational overload and an actual siege of attention, combined with the occupation of the sources of information by the head of the company.’ [Berardi 2011: 35] It is this financially incentivized proliferation of chatter, and the utter speed at which waves of hysteria now happen online, that has so deeply horrified me and offended my senses and cognition as a human who dwells in human, bodily time.”

The commercial incentive of online interaction is what particularly disturbs Odell; the communities and networks of social media are one thing, the design of such platforms to fulfill a capitalist purpose is another. Odell continues (2019: 60):

“Our aimless and desperate expressions of these platforms don’t do much for us, but they are hugely lucrative for advertisers and social media companies, since what drives the machine is not the content of information but the rate of engagement. Meanwhile, media companies continue churning out deliberately incendiary takes, and we’re so quickly outraged by their headlines that we can’t even consider the option of not reading and sharing them.”

All of this has a bearing on what happened this week. When Netflix dropped The Witcher last Friday, it was met with some noteworthy and negative reviews. Darren Franich and Kristen Baldwin’s “Netflix’s The Witcher is nakedly terrible: Review” (Entertainment Weekly) gave the series an F grade, with a 0/100 on Metacritic. These reviewers immediately, and justifiably, came under fire themselves given that they admitted that they did not watch the series in its entirety. Response to The Witcher has been divided: critics hate it, the public loves it. So is The Witcher any good? One of the barriers here is the general distaste for “genre” pieces. Some might avoid science fiction, fantasy, or romance just because it is labeled so. Ursula K. Le Guin took on this problem in her essay, “Genre: a word only a Frenchman could love” (reprinted in Words are My Matter, 2019: 10):

“So we have accepted a hierarchy of fictional types, with ‘literary fiction,’ not defined, but consisting almost exclusively of realism, at the top. All other kinds of fiction, the ‘genres,’ are either listed in rapidly descending order of inferiority or simply tossed into a garbage heap at the bottom. This judgemental system, like all arbitrary hierarchies, promotes ignorance and arrogance. It has seriously deranged the teaching and criticism of fiction for decades, by short-circuiting useful critical description, comparison, and assessment. It condones imbecilities on the order of ‘If it’s science fiction it can’t be good, if it’s good it can’t be science fiction.'” 

In the preface to her (critically acclaimed) The Left Hand of Darkness, Le Guin had already drawn attention to the fact that science fiction, like any literature, is about its present, not the future (1969/1999: xvi):

“All fiction is metaphor. Science fiction is metaphor. What sets it apart from older forms of fiction seems to be its use of new metaphors, drawn from certain great dominants of our contemporary life — science, all the sciences, and technology, and the relativistic and the historical outlook, among them. Space travel is one of those metaphors; so is an alternative society, an alternative biology; the future is another. The future, in fiction, is a metaphor.”

The Witcher is not actually “about” magic and monsters; it’s about the relationship between storytelling and reality (Jaskier’s song vs. Geralt’s action), about the pain of isolation (Yennefer), about trying to live your life despite tempestuous circumstances (Geralt); it’s about assembling strange families, when biological ones fail (Geralt, Yennefer, Ciri). Assigning an F to The Witcher because it successfully engages with its own genre, one which you, the reviewer, do not know or care enough about to situate the object of your critique within, removes the rich layers of cultural entanglement which may make such a show worthwhile to a viewer like me. Le Guin continues (2019: 10): “If you don’t know what kind of book you’re reading and it’s not the kind you’re used to, you probably need to learn how to read it. You need to learn the genre.”

I’m not coming at this from a neutral perspective, since I voraciously played and replayed, and loved Witcher 3. But is Netflix’s The Witcher “objectively bad”? No, it’s not. It has haunting performances from Anya Chalotra (Yennefer), and Henry Cavill (Geralt) is perfection. The fight scenes are incredible. And it’s beautiful to look at. Yes, they say “destiny” too many times. But, look, it’s a romp!

On to Star Wars, then. Since we kept up our tradition of seeing the newest Star Wars on Christmas eve, I was aware of an enormous amount of critical disappointment and fan anger regarding the latest installment before I saw the film itself. You know what? It was fine. Yes, it had a very fast pace, and it wasn’t seamless with the trilogy’s own self-mythologizing. The Star Wars universe is full of holes because of the method of its composition; to some extent the writing, and overwriting (if you think that’s what J.J. is doing) resembles the process of story development in the oral tradition of the Greek epic canon, and in its reception. Consider Odysseus in the Iliad vs. Odysseus in the Odyssey vs. Odysseus in Sophocles’ Ajax. Indeed, the empty spaces projected by Star Wars are part of its charm: it’s a perfect landscape for imaginative rethinking, whether in the form of fan fiction, fan art, or roleplaying games like Edge of The Empire. That Star Wars captures the modern imagination so strongly is somewhat ironically reflected in the strength of the vitriol against it (and in the fan art. Peruse #reylo only if you dare).

All of this might be fine if it really were so simple. The emotional economy of the internet has a role to play here, but in this case we end up in a different place than we did with The Witcher. Anthony Breznican of Vanity Fair recorded J.J. Abrams’ public response to the backlash against TROS:

“After a screening at the Academy of Motion Picture Arts and Sciences on Friday, I [=Breznican] asked Abrams what he would say to those who are unhappy. Are they not getting something? Is there a problem in the fandom? ‘No, I would say that they’re right,’ he answered quickly. ‘The people who love it more than anything are also right.’ The director had just returned from a global tour with the film, where he also fielded questions about that mixed reaction. ‘I was asked just seven hours ago in another country, “So how do you go about pleasing everyone?” I was like, “What…?” Not to say that that’s what anyone should try to do anyway, but how would one go about it? Especially with Star Wars.’ With a series like this, spanning more than four decades, nearly a dozen films, several TV shows, and countless novels, comics, and video games, the fanbase is so far-reaching that discord may be inevitable. ‘We knew starting this that any decision we made — a design decision, a musical decision, a narrative decision — would please someone and infuriate someone else,’ Abrams said. ‘And they’re all right.’”

You can see how the viewers’ response to Star Wars might be taken as a reflection of contemporary political and cultural life in the US. In the New York Times, Annalee Newitz affirmed Le Guin’s view that cultural artefacts, sci-fi or not, are reflective of the society which produces and consumes them:

Star Wars became a new national mythos; it rebooted America’s revolutionary origin story and liberty-or-death values using the tropes of science fiction. Now, however, the movies no longer strike the same chord. Just as America’s political system is falling into disarray again, our cultural mythmaking machine is faltering as well.”

How and why we critique Star Wars may well reflect some deeper truth about the times we live in, but there’s another dark side to all this (get it?). To some extent the divided criticism is irrelevant, given that TROS earned an enormous amount of money. Indeed, the controversy only helped bring in the dollars (not to mention all the baby yodas hiding under the xmas trees this year). We entrusted our storytelling to a capitalist behemoth, and it’s disconcerting that cultural criticism has no impact on its forward march. Some have suggested that the F rating which Entertainment Weekly gave The Witcher was motivated by a desire to get more eyeballs (and more $) by artificially stirring up controversy. Given that the internet runs on divisiveness and ire (these are our social currencies), that might have been an economically shrewd move. But was it good cultural criticism?

Jenny Odell on Cicero, Suzanne McConnell on Kurt Vonnegut Jr.; tempora cum causis (9)

Ancient and Modern. In the De Fato (10-11), Cicero discusses whether it is possible for the individual to overcome their nature. Here comes the Loeb:

Stilponem, Megaricum philosophum, acutum sane hominem et probatum temporibus illis accepimus. Hunc scribunt ipsius familiares et ebriosum et mulierosum fuisse, neque haec scribunt vituperantes sed potius ad laudem, vitiosam enim naturam ab eo sic edomitam et compressam esse doctrina ut nemo umquam vinolentum illum, nemo in eo libidinis vestigium viderit. Quid? Socratem nonne legimus quemadmodum notarit Zopyrus physiognomon, qui se profitebatur hominum mores naturasque ex corpore oculis vultu fronte pernoscere? stupidum esse Socratem dixit et bardum quod iugula concava non haberet—obstructas eas partes et obturatas esse dicebat; addidit etiam mulierosum, in quo Alcibiades cachinnum dicitur sustulisse. [11] Sed haec ex naturalibus causis vitia nasci possunt, exstirpari autem et funditus tolli, ut is ipse qui ad ea propensus fuerit a tantis vitiis avocetur, non est positum in naturalibus causis, sed in voluntate studio disciplina; quae tollentur omnia si vis et natura fati…firmabitur.

“The Megarian philosopher Stilpo, we are informed, was undoubtedly a clever person and highly esteemed in his day. Stilpo is described in the writings of his own associates as having been fond of liquor and of women, and they do not record this as a reproach but rather to add to his reputation, for they say that he had so completely mastered and suppressed his vicious nature by study that no one ever saw him the worse for liquor or observed in him a single trace of licentiousness. Again, do we not read how Socrates was stigmatized by the ‘physiognomist’ Zopyrus, who professed to discover men’s entire characters and natures from their body, eyes, face and brow? He said that Socrates was stupid and thick-witted because he had not got hollows in the neck above the collarbone—he used to say that these portions of his anatomy were blocked and stopped up. He also added that he was addicted to women—at which Alcibiades is said to have given a loud guffaw! [11] But it is possible that these defects may be due to natural causes; but their eradication and entire removal, recalling the man himself from the serious vices to which he was inclined, does not rest with natural causes, but with will, effort, training; and if the potency and the existence of fate is proved…all of these will be done away with.”

In this passage, Cicero describes some of the quote-unquote defects which naturally arise in humans. Stilpo (4th c. BCE) reportedly had a natural proclivity for alcohol and sex with women; he was, according to friends, ebriosus (“addicted to drink”) and mulierosus* (“addicted to women”). But, Cicero says, Stilpo was able to master his nature with philosophical training (doctrina): no one ever saw him drunk, and he showed no outward sign of lust. Zopyrus (5th c. BCE) applied physiognomy, i.e. the theory that human character can be read in the condition of the body, to Socrates, and concluded from the philosopher’s body that he could only be an idiot. Oh, and that he must also be “addicted to women” (mulierosus again). Cicero writes that nature may be responsible for giving us certain tendencies. But, he says, it is human agency that can overcome them: “will” (voluntas), “effort” (studium), and “training” (disciplina). This passage, of course, contains an oversimplistic attitude to addiction, as well as an ableist assumption that bodily imperfection is a mirror of morality or intellect. It’s also quite clear that these anecdotes are designed to reflect male power in the context of elite competition: the detail that the notorious party animal, Alcibiades, laughed at Zopyrus calling Socrates names suggests a symposiastic setting (Phaedo’s dialogue, Zopyrus, dramatized a debate between the physiognomist and Socrates). Putting those things aside, what do we make of Cicero’s claim that we can overcome our nature?

In the recent (and superb) How to Do Nothing (2019), Jenny Odell cites this passage of Cicero’s De Fato (pp. 71-72) in the context of arguing for the creation of a “third space” of attention — one which reframes human interaction with reality as a kind of rejection of market forces and commercially-run social media. The book as a whole is a meditation on and a protreptic towards a modern kind of recusatio, i.e. the technique of saying “I would prefer not to.” Odell asks her reader to refuse to internalize the contemporary narrative of productivity, and to reclaim time and space to “do nothing.” (There are a lot of classical references throughout — Seneca, Epicurus, Diogenes. And Cicero’s cum dignitate otium is clearly a spiritual forebear.) Here’s what Odell says about this passage of Cicero (p. 72):

“If we believed that everything were merely a product of fate, or disposition, Cicero reasons, no one would be accountable for anything and therefore there could be no justice. In today’s terms, we’d all just be algorithms. Furthermore, we’d have no reason to try to make ourselves better or different from our natural inclinations. VOLUNTATE, STUDIO, DISCIPLINA — it is through these things that we find and inhabit the third space, and more important, how we stay there. In a situation that would have us answer yes or no (on its terms), it takes work, and will, to keep answering something else.”

The possibility that Cicero suggests, of escaping (or mitigating) the frailties of human psychology and embodiment, relies on the intentional application of the mind (or soul). Odell would have us apply ourselves in this way as an act of resistance against cynical structures of social influence. The concept of “will” (voluntas) invokes the notion of presence — or attention — the ability to be here in the moment, to have an appreciation for the moment in all its granularities. To “focus” (studium). As for the “training” (disciplina), this obviously could take a number of forms. But evidently self-awareness, and awareness of the churning forces around you, is at the core of this idea.

*Mulierosus is quite an unusual Latin word! It only appears in extant classical Latin three times. According to Aulus Gellius (4.9.2), who quotes a discussion of mulierosus by the Pythagorean magician Nigidius Figulus, the Latin suffix –osus indicates an excess of the characteristic in question.


Excerpt. Suzanne McConnell on Kurt Vonnegut Jr., 2019: 134-135: “By its nature, literary fiction ‘teaches’: it shows how people feel, think, respond, vary; how circumstances affect them; how their brains, personalities, surroundings and culture make them tick. How an experience strikes a particular person a certain way, and another differently. How a person feels inside as opposed to how they act or are perceived. And so on. All writing teaches — communicates something about something. [p135] Even bad writing. So if you’re writing, you’re teaching. You can’t help it. But then there’s intentional teaching through writing.”*

*Austin came into the room to point to a passage written by Vonnegut (on teachers and teaching) which was quoted on this page. So I thank him for the excerpt this week.

Daily Life. Max helped me grade.
