How to Write

This week at BU I gave a proseminar for our PhD students, “How to Write.” Here’s what I told them:

How to Write. 1) Establish a practice. 2) Contextualize your work. 3) Create writing community. 4) Use good tools. 5) Read about writing. 6) Stop thinking.

1) Establish a practice. Haruki Murakami is a writer and an ultramarathoner. In his memoir, What I Talk About When I Talk About Running (2008), Murakami draws an alignment between writing and running, in that each is a practice:

“Most of what I know about writing I’ve learned through running every day. These are practical, physical lessons. How much can I push myself? How much rest is appropriate — and how much is too much? How far can I take something and still keep it decent and consistent? When does it become narrow-minded and inflexible? How much should I be aware of the world outside, and how much should I focus on my inner world? To what extent should I be confident in my abilities, and when should I start doubting myself? I know that if I hadn’t become a long-distance runner when I became a novelist, my work would have been vastly different.”

When you run, or do any kind of physical activity, it’s about putting your body in the position to do the practice. You don’t say to your muscles, “get stronger!” But rather, you run, you put yourself in the right environment every day, and over time your muscles do get stronger. It’s the same way with writing. As with running, when you sit down to write you don’t know how well it will go (and, indeed, how well it will go is governed by somewhat mysterious forces). But you put yourself in that position nonetheless; and some days you do well, other days you don’t. But you do it consistently, and, by trial and error, you figure out what kind of practice works best for you. I’ve known scholars who like to write for an hour every day first thing in the morning, and that’s it. I prefer to set aside an entire day so that I can devote several hours in a row to write. We all have slightly different ways of doing it and that’s fine — but whatever you do, do it consistently.


1a) Focus on time writing rather than words written. Since you don’t know how many words you’ll be able to write in any given session, using words written as a measure of progress will ultimately become frustrating. Instead, focus on time spent writing. Doing so gives you space to think (and thinking is of course part of writing!) and to develop your ideas. When I was writing my dissertation, I spent 3 to 5 hours a day working on it.

1b) Do it a little every day.
Establishing a practice means writing consistently. You may be a morning person, you may be a night owl, but whatever you do — do it every day. (Every day that you’re working, that is. It’s important to take days off, to rest. To extend the running metaphor, you have to rest your muscles for them to grow. And in addition to that, having a robust and dependable writing practice means that you can have a life. Which is important!)

1c) Keep a record of your hours. Keep a journal to note down the hours you write every day. I show an example of mine from summer 2019 below. On Wednesday 5th June, I wrote for 6 hours “with a lot of faffing” (=British slang for “screwing around”), i.e. I put in the hours but there were many distractions. I also noted what I was working on, so that I could pick up from there the next time I wrote. On Wednesday 10th [July??], I struggled. I wrote from 10.49am to 2pm, and then noted: “I need a break!” When I returned, I annotated this with “didn’t **really** take a break but worried from 2-3.15 :).” A nice example of how writing can go well or it can go badly. Nonetheless, you continue your practice!


I take inspiration here from another writer, Ursula K. Le Guin. In a 1988 interview, she described her writing practice. It’s worth noting what happened in the evenings: “After 8.00pm — I tend to be very stupid and we won’t talk about this.” No one can write all day. There is always a point of fatigue, and once you reach it, you shouldn’t work against yourself. Go and rest.


2) Contextualize your work. When you begin your research project as a PhD student, you feel an enormous pressure to be original. Indeed, that’s one of the metrics by which we judge successful research — whether it is a new and original contribution to the field. And it’s easy to think that originality means removing yourself from what has been said before. “No one has ever looked at this issue the way I am.” Yet, truly, your writing will be at its best if you go into it acknowledging the fact that you are not alone. In the slides below I show two different perspectives on this. Is your work a lone tree? Or is it a tree that is in a contextual forest, standing alongside other work in an intellectual ecosystem? Create a dialogue between yourself, the ancient evidence, and prior scholarship. Interweave, entangle. Be synthetic.


In a talk from 2004, “Genre: A Word Only a Frenchman Could Love” (reprinted in Words are My Matter), Ursula K. Le Guin describes what it’s like when a writer does not acknowledge the tradition in which they are working:

“A genre is a genre by having a field and focus of its own, its appropriate and particular tools and rules and techniques for handling the material, its traditions, and its experienced, appreciative readers. Ignorant of all this, our novice is about to reinvent the wheel, the space ship, the space alien, and the mad scientist, with cries of innocent wonder. The cries will not be echoed by the readers. Readers familiar with the genre have met the space ship, the alien, and the mad scientist before. They know much more about them than the writer does.”


It’s weirdly easier to do this in a research area that you don’t feel as personally invested in. You can practice this deliberate, scholarly interweaving by going through the following steps:

2a) Pick an ancient piece of evidence. It could be anything.
2b) Analyze it. Spend some time just you and it. Tease out points of significance. Take notes. Think about it in the context of other things you know about its period/genre/whatever.
2c) See what other scholars have said; read 3-5 pieces of scholarship. You’ll see that some of the things you noticed have already been published. But, by reading multiple scholars on the same object/text/problem, you’ll ALSO see that the issue is not a closed one — there are multiple interpretations, and they get more interesting if they take into account what has previously been discussed.
2d) Combine b) and c). Synthesize what you have read from various scholars and then add your own analysis in light of what they have said. Now you have written an interesting and rich piece of research!

DON’T BE AFRAID of finding that your ideas have already been published by someone else. A new observer of a problem will always shed new light on the issue. It is ignorance of prior scholarship that will lead you to make unoriginal work. 


3) Create writing community. Writing is lonely. You have to spend a lot of time on your own, and because you feel vulnerable about the quality of what you’re producing (especially in the beginning), you can feel wary of those around you. But a community of writing is what you need, and it can be very rewarding. There are a number of ways to do this.

3a) Ask a trusted friend or classmate to read your work. The “trusted” part is quite important. Not everyone in the intellectual environments you find yourself in will be a good interlocutor for you. I have a close friend from grad school with whom I still share work, but it had to be this person and not anyone else. It’s personal! Creating relationships where constructive critique can happen takes a lot of work, but it is extremely rewarding.

3b) Agree to swap and critique. Talk to a friend who is perhaps in a similar stage of writing as you (for example, you’re both working on the second chapter of your dissertation), and agree to swap work and meet to discuss it. This can be helpful for a number of reasons. It can help you feel like your work has an audience. And reading what someone at the same or a similar stage as you is writing can help you see your own growth.

3c) Arrange or attend writing meet-ups. It’s a well-known thing that dissertation writing is hard. With that in mind, a number of institutions have regular meet-ups for PhD students where you turn up, write for a number of hours in a room full of other people who are writing, and then have coffee, socialize, etc. This is a great thing for keeping motivated, creating community, and meeting other grad students outside of your field. BU has a Dissertation Writing Group. You may be like me and need to be at home to write in peace; but if this is something you think would be useful, it’s a great thing to try.

4) Use good tools. Part of your writing practice will entail finding the right writing tools for you. I suggest the following:

4a) Scrivener. One of the best word processors out there, and one which allows for flexibility and non-linear writing. You can watch their videos and see some examples. It isn’t free, although there is a discount for students; it may well be a good investment for you. I bought my copy when I started my dissertation and I’m still using it.

4b) Evernote. I don’t recommend this for writing large projects, but it is good for keeping track of various notes or pdfs. It’s another place to store your ideas which is not on your computer. There is a free version. No matter what system you use, make sure you are REGULARLY BACKING UP YOUR WORK, and in more than one place!

4c) Forest app. Set a timer and Forest will block websites of your choice for that time. I am addicted to Twitter (no surprise there), so I use the Chrome extension version. This is a great way to a) keep track of your writing hours; b) be strict about minimizing distractions. I often set the timer for a short amount of time (c. 20 mins) but go well beyond it. It helps to get into the mindset you need for writing.


5) Read about writing. In addition to reading books by writers on writing (e.g. by Murakami or Le Guin), there are a number of useful resources out there which specifically give advice about academic writing.

6) Stop thinking. Writing and research require a lot of thinking (naturally). But they also require a lot of time not thinking. In addition to the fact that you should have a life (i.e. don’t let writing eat up everything), your writing will be better if you spend time not thinking. I end with the immortal words of Don Draper:


Visit to Bates; digital humanities and the human body; Elizabeth Marlowe; tempora cum causis (14)

Ancient. This week I was up in Maine visiting the Classical and Medieval Studies department at Bates College. On Thursday (30th Jan. 2020), I talked about my primary research interest right now, Cicero and the Latin poets (I’m finishing up a book on this); on Friday (31st Jan. 2020), I talked about digital approaches to teaching. Handouts below:

Modern. I can understand why “the digital” as a category sometimes seems so distinct from the world of humanism or humanistic inquiry. But investigating the digital within the framework of the extensibility of human embodiment immediately complicates this view. Digital humanists (this term, to some, is tautological; to others, self-negating) often emphasize the essential continuity between established forms of intellectual work and the capacities of contemporary digital techniques; as Eileen Gardiner and Ronald Musto (The Digital Humanities; 2015: 2) write:

…everything from the scholar’s desk and shelves, study, studio, rehearsal and performance space, lecture halls, campuses, research institutes and convention halls can also legitimately be considered environments. Yet in many ways these new digital tools carry on, in analogous ways, the same functions of traditional humanities. Is the very computer upon which humanists rely so heavily still a tool, something akin to their medieval writing tablets?

Digital techniques build upon traditional humanistic practices but also develop them; as Sarah E. Bond, Hoyt Long, and Ted Underwood put it (“‘Digital’ Is Not the Opposite of ‘Humanities’”; 2017):

Much of what is now happening under the aegis of digital humanities continues and expands those projects. Scholars are still grappling with familiar human questions; it is just that technology helps them address the questions more effectively and often on a larger scale.

“Digital humanists”, who spend so much time theorizing their own relationship to classical traditions and contemporary technology, are often met with knee-jerk reactions by those who have not taken the time to situate their own intellectual complaint. It all brings to mind Ursula K. Le Guin (I keep coming back to her), who regularly drew attention to the fact that her critics could not get past the genre of her writing to grasp the meaning of its content. At face value digital projects can have an alienating effect on traditional sensibilities, but when we dig deeper we quickly see that the intellectual processes required for such work are just as complex and interesting as the standard products of scholarship. I have written elsewhere about how teaching with digital techniques encourages students to sharpen analytic skills and deepen their intellectual commitment to research.

Anyway, returning to the embodiment part in all this. Technology is absolutely bound to the human body; formed for human use, imagined as an extension of human manipulation (in a literal sense of manus, i.e. ‘hand’) of reality. While contemporary technology sometimes feels so seamless as to be invisible to our own theorization, looking at older artefacts in digital history makes this incredibly clear. Take, e.g., Philippe Henri’s (1984) “Cadavres Exquis / Exquisite Corpses.” This is a program for a computer-generated poem: i.e. Henri wrote the code, but the actual poem was “written” when the program was run on a computer, and indeed rewritten anew each time the program was run. The code was circulated on paper (from Nick Montfort’s [@nickmofo] lecture at BU last year, “Translating Computational Poetry” — watch a video recording of the lecture here); and in order to run the program, a human being had to type it by hand into a specific computer, the TRS-80.
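The generative idea behind a poem like this is simple enough to sketch. What follows is a minimal, hypothetical Python reconstruction of the *principle* only — the phrase banks are my own placeholders, not Henri’s text, and his original was written for the TRS-80, not in Python. The point it illustrates is the one above: the code is fixed, but each run of the program “writes” a fresh poem.

```python
import random

# Placeholder phrase banks -- invented for illustration, not Henri's words.
SUBJECTS = ["the glass river", "a sleeping engine", "the last archivist"]
VERBS = ["remembers", "dissolves", "rewrites"]
OBJECTS = ["its own shadow", "the alphabet", "a map of the sea"]

def exquisite_corpse(lines=3, seed=None):
    """Assemble a new poem on each run by recombining fixed phrase banks.

    The program is the constant; the poem is an event that happens
    every time a human runs it.
    """
    rng = random.Random(seed)  # seed only to make a run repeatable
    return "\n".join(
        f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(OBJECTS)}"
        for _ in range(lines)
    )

if __name__ == "__main__":
    print(exquisite_corpse())
```

Running the script twice gives two different poems (unless you fix the seed), which is exactly the sense in which the text was “rewritten anew each time the program was run.”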


“Cadavres Exquis”, which is only one example of a whole genre of computational poetry, very clearly demonstrates the entanglement of technology with the essences of humanity: not just the body, but indeed the “soul” (if such a dichotomy is even truly real). The human spark which invents the poetry; the human body which materializes it; the technological body (i.e. the computer) which extends that invention and materialization.

When I found out about this example of entangled text and technology from Nick Montfort’s talk at BU, it immediately made me think of the contemporary emulators used to play old video games on modern computers; i.e. programs which simulate the hardware of the N64 so that you can play Ocarina of Time without having to use the physical tools required in 1998. Such digital reconstructions (if that’s even the right word) have a preservative effect, but they also make me think about the relationship between my own body and the console at the time when the game was originally released. Sitting cross-legged on the floor of the attic, holding a controller (that was physically attached to the console – lol!), blowing the dust out of a GoldenEye cartridge. There are so many structural similarities between our relationship with these modern artefacts and the historical processes which we study: the reception and reconstruction of ideas from antiquity to modernity. The relationship between text and context. The social and embodied nature of textual production.
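At its core, an emulator is just a loop that reads the old machine’s instructions and reproduces their effects in software: a new body standing in for the vanished hardware. Here is a toy sketch of that fetch-decode-execute loop, with an instruction set invented purely for illustration (a real N64 emulator interprets MIPS machine code and mimics far more hardware state than this):

```python
# A toy fetch-decode-execute loop: the skeleton of any emulator.
# The three opcodes are made up for this sketch, not any real console's.

def run(program):
    """Interpret a list of (opcode, operand) pairs on a one-register machine."""
    acc = 0  # simulated register (the "hardware" lives in these variables)
    pc = 0   # simulated program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "JMPZ":  # jump to address arg if the accumulator is zero
            if acc == 0:
                pc = arg
                continue
        else:
            raise ValueError(f"unknown opcode: {op}")
        pc += 1
    return acc

# Same "cartridge" (program), same result -- just on a software body.
result = run([("LOAD", 2), ("ADD", 3)])  # acc ends at 5
```

The old program is preserved exactly; what gets rebuilt is the body that runs it, which is why emulation feels so much like the reception and reconstruction of ancient texts.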


Excerpt. Elizabeth Marlowe (@ElizMarlowe) Shaky Ground (2013: 9): “Many archaeologists follow the thinking of Paul Kristeller, who suggested that ‘art’ as we know it wasn’t invented until the eighteenth century. According to this view, notions of pure, historically transcendent form slide perilously close to deeply suspect ones of ahistorical universal beauty. Ancient objects should instead be understood as manifestations of ‘visual culture’ or ‘material culture’ — the understanding of which depends heavily on context. In this and in much of the recent literature, the binaries are conspicuous: archaeology vs. art history, academia vs. museums, context vs. form, artifact vs. art, history vs. beauty, resonance vs. wonder.” 

Daily Life. Morning light in Maine. 


The Witcher and Star Wars IX; tempora cum causis (10)


With the release on the same day (Dec. 20th 2019) of both the Netflix adaptation of The Witcher and the final installment of the new Star Wars trilogy, The Rise of Skywalker, this week we got an object lesson in how cultural criticism works on a mass scale. Before we dive into either of these, I want again to invoke Jia Tolentino’s analysis of social media as a commercially driven organ, designed to privilege negative or otherwise emotionally provocative content. In Trick Mirror, Tolentino writes that over time, personal lives transforming into public assets via social media meant that “social incentives — to be liked, to be seen — were becoming economic ones” (2019: 6). She goes on: “Twitter, for all its discursive promise, was where everyone tweeted complaints at airlines and bitched about articles that had been commissioned to make people bitch” (2019: 7-8). Looking at the internet as an exercise in performativity (one that extends and magnifies the natural human performativity of the offline world), Tolentino writes that “the internet is defined by a built-in performance incentive” (2019: 8). In How to Do Nothing, Jenny Odell (2019: 18) discusses social media too, drawing on the remarks of Franco Berardi:

“Berardi, contrasting modern-day Italy with the political agitations of the 1970s, says the regime he inhabits ‘is not founded on the repression of dissent; nor does it rest on the enforcement of silence. On the contrary, it relies on the proliferation of chatter, the irrelevance of opinion and discourse, and on making thought, dissent, and critique banal and ridiculous.’ Instances of censorship, he says, ‘are rather marginal when compared to what is essentially an immense informational overload and an actual siege of attention, combined with the occupation of the sources of information by the head of the company.’ [Berardi 2011: 35] It is this financially incentivized proliferation of chatter, and the utter speed at which waves of hysteria now happen online, that has so deeply horrified me and offended my senses and cognition as a human who dwells in human, bodily time.”

The commercial incentive of online interaction is what particularly disturbs Odell; the communities and networks of social media are one thing, the design of such platforms to fulfill a capitalist purpose is another. Odell continues (2019: 60):

“Our aimless and desperate expressions on these platforms don’t do much for us, but they are hugely lucrative for advertisers and social media companies, since what drives the machine is not the content of information but the rate of engagement. Meanwhile, media companies continue churning out deliberately incendiary takes, and we’re so quickly outraged by their headlines that we can’t even consider the option of not reading and sharing them.”

All of this has a bearing on what happened this week. When Netflix dropped The Witcher last Friday, it was met with some noteworthy and negative reviews. Darren Franich and Kristen Baldwin’s “Netflix’s The Witcher is nakedly terrible: Review” (Entertainment Weekly) gave the series an F grade, which registered as a 0/100 on Metacritic. These reviewers immediately, and justifiably, came under fire themselves, given that they admitted they did not watch the series in its entirety. Response to The Witcher has been divided: critics hate it, the public loves it. So is The Witcher any good? One of the barriers here is the general distaste for “genre” pieces. Some might avoid science fiction, fantasy, or romance just because it is labeled so. Ursula K. Le Guin took on this problem in her essay, “Genre: A Word Only a Frenchman Could Love” (reprinted in Words are My Matter, 2019: 10):

“So we have accepted a hierarchy of fictional types, with ‘literary fiction,’ not defined, but consisting almost exclusively of realism, at the top. All other kinds of fiction, the ‘genres,’ are either listed in rapidly descending order of inferiority or simply tossed into a garbage heap at the bottom. This judgemental system, like all arbitrary hierarchies, promotes ignorance and arrogance. It has seriously deranged the teaching and criticism of fiction for decades, by short-circuiting useful critical description, comparison, and assessment. It condones imbecilities on the order of ‘If it’s science fiction it can’t be good, if it’s good it can’t be science fiction.'” 

In the preface to her critically acclaimed The Left Hand of Darkness, Le Guin had already drawn attention to the fact that science fiction, like any literature, is about its present, not the future (1969/1999: xvi):

“All fiction is metaphor. Science fiction is metaphor. What sets it apart from older forms of fiction seems to be its use of new metaphors, drawn from certain great dominants of our contemporary life — science, all the sciences, and technology, and the relativistic and the historical outlook, among them. Space travel is one of those metaphors; so is an alternative society, an alternative biology; the future is another. The future, in fiction, is a metaphor.”

The Witcher is not actually “about” magic and monsters; it’s about the relationship between storytelling and reality (Jaskier’s song vs. Geralt’s action), about the pain of isolation (Yennefer), about trying to live your life despite tempestuous circumstances (Geralt); it’s about assembling strange families, when biological ones fail (Geralt, Yennefer, Ciri). Assigning an F to The Witcher because it successfully engages with its own genre, one which you, the reviewer, do not know or care enough about to situate the object of your critique within, removes the rich layers of cultural entanglement which may make such a show worthwhile to a viewer like me. Le Guin continues (2019: 10): “If you don’t know what kind of book you’re reading and it’s not the kind you’re used to, you probably need to learn how to read it. You need to learn the genre.”

I’m not coming at this from a neutral perspective, since I voraciously played, replayed, and loved The Witcher 3. But is Netflix’s The Witcher “objectively bad”? No, it’s not. It has haunting performances from Anya Chalotra (Yennefer), and Henry Cavill (Geralt) is perfection. The fight scenes are incredible. And it’s beautiful to look at. Yes, they say “destiny” too many times. But, look, it’s a romp!

On to Star Wars, then. Since we kept up our tradition of seeing the newest Star Wars on Christmas Eve, I was aware of an enormous amount of critical disappointment and fan anger regarding the latest installment before I saw the film itself. You know what? It was fine. Yes, it had a very fast pace, and it wasn’t seamless with the trilogy’s own self-mythologizing. The Star Wars universe is full of holes because of the method of its composition; to some extent the writing, and overwriting (if you think that’s what J.J. is doing), resembles the process of story development in the oral tradition of the Greek epic canon, and in its reception. Consider Odysseus in the Iliad vs. Odysseus in the Odyssey vs. Odysseus in Sophocles’ Ajax. Indeed, the empty spaces projected by Star Wars are part of its charm: it’s a perfect landscape for imaginative rethinking, whether in the form of fan fiction, fan art, or roleplaying games like Edge of the Empire. That Star Wars captures the modern imagination so strongly is somewhat ironically reflected in the strength of the vitriol against it (and in the fan art. Peruse #reylo only if you dare).

All of this might be fine if it really were so simple. The emotional economy of the internet has a role to play here, but in this case we end up in a different place than we did with The Witcher. Anthony Breznican of Vanity Fair recorded J.J. Abrams’ public response to the backlash against TROS:

“After a screening at the Academy of Motion Picture Arts and Sciences on Friday, I [=Breznican] asked Abrams what he would say to those who are unhappy. Are they not getting something? Is there a problem in the fandom? ‘No, I would say that they’re right,’ he answered quickly. ‘The people who love it more than anything are also right.’ The director had just returned from a global tour with the film, where he also fielded questions about that mixed reaction. ‘I was asked just seven hours ago in another country, “So how do you go about pleasing everyone?” I was like, “What…?” Not to say that that’s what anyone should try to do anyway, but how would one go about it? Especially with Star Wars.’ With a series like this, spanning more than four decades, nearly a dozen films, several TV shows, and countless novels, comics, and video games, the fanbase is so far-reaching that discord may be inevitable. ‘We knew starting this that any decision we made — a design decision, a musical decision, a narrative decision — would please someone and infuriate someone else,’ Abrams said. ‘And they’re all right.’”

You can see how the viewers’ response to Star Wars might be taken as a reflection of contemporary political and cultural life in the US. In the New York Times, Annalee Newitz affirmed Le Guin’s view that cultural artefacts, sci-fi or not, are reflective of the society which produces and consumes them:

Star Wars became a new national mythos; it rebooted America’s revolutionary origin story and liberty-or-death values using the tropes of science fiction. Now, however, the movies no longer strike the same chord. Just as America’s political system is falling into disarray again, our cultural mythmaking machine is faltering as well.”

How and why we critique Star Wars may well reflect some deeper truth about the times we live in, but there’s another dark side to all this (get it?). To some extent the divided criticism is irrelevant, given that TROS earned an enormous amount of money. Indeed, the controversy only helped bring in the dollars (not to mention all the baby yodas hiding under the xmas trees this year). We entrusted our storytelling to a capitalist behemoth, and it’s disconcerting that cultural criticism has no impact on its forward march. Some have suggested that the F rating which Entertainment Weekly gave The Witcher was motivated by a desire to get more eyeballs (and more $) by artificially stirring up controversy. Given that the internet runs on divisiveness and ire (these are our social currencies), that might have been an economically shrewd move. But was it good cultural criticism?

Roman time, “Mrs. Maisel”, Ursula K. Le Guin; tempora cum causis (8)

Ancient. This week, BU hosted the annual Classics Day for high school and middle school students from the Boston area, with workshops on different aspects of the ancient world. The theme this year was ancient time, and so I did a workshop on timekeeping devices in Rome. We talked about the ham sundial from Herculaneum, the so-called ‘Horologium’ of Augustus, and a 4th c. lunar calendar. I had the students recreate these devices in clay and paper to get a sense of how they worked. Afterwards I posted a thread detailing my workshop on Twitter, including pdfs of the materials in case anyone wants to use them for a workshop of their own (pdf of the handout | pdf of the printout). Some Twitter users tagged it with the Thread Reader App, which unrolls threads, so you can read the thread in the resulting blog format if you wish. I wore my “petrify the patriarchy” shirt from wire and honey for Classics Day and received lots of compliments!

Modern. After sleeping on it for way too long, I’m finally watching The Marvelous Mrs. Maisel. Although I came to Amy Sherman-Palladino’s earlier work, Gilmore Girls, late in life, when I did discover it, I fell deeply in love with it (one time, when we were still living in LA, we saw Keiko Agena outside of iO West, which is closed now). As much as I’m enjoying Mrs. Maisel, I find myself bothered by one of the characters. Midge Maisel’s father, played by Tony Shalhoub, is a professor of mathematics at Columbia. He’s an older man, and he’s “curmudgeonly.” There’s only one student in his maths class that he thinks is any good, and he says so. His students are DESPERATE for his approval. They try out new references to impress him. They follow him around in a pack. When things start to go wrong at the university, the dean tells him: “You’re a brilliant mathematician, but an uncooperative colleague and a very poor teacher.” There are a lot of interesting touches of modernity and anachronism in Mrs. Maisel, set in the late 1950s. The fact that the state of his teaching would be a concern to the scholarly community may number among them.

I found myself being bothered so much by this character, despite Tony Shalhoub’s deep charm (let’s face it, Shalhoub is a national treasure), that I had to take a moment to think about why, and excavate my emotional response. It’s not the character, really, that I have a problem with, but the trope that it draws upon. Shalhoub’s character, the proud patriarch in crisis, is supposed to be flawed, supposed to be fragile. Depicting professorial grumpiness is a vehicle for this character’s essential nature. But, evidently, I’m bothered by the “professor” stereotype. Sometimes when I’m at academic conferences, I see younger men wearing tweed, bowties, thick-framed or horn-rimmed glasses, as though this were the uniform of the intellectual. This was the contemporary style of dress for the older generation of gentlemen who have now become the senior scholars in our field, but for those men these clothes were just clothes, not a costume. (Well, maybe the elbow patches were an intentional display of identity then too.) The idea that there is a specific scholarly aesthetic implies that there is also a specific scholarly behaviour. Say, curmudgeonliness. Or torturing your students.

Education has changed. What we think education is for, who can receive an education, who can do the educating — all those things have changed. We do so many things today that a professor of the 1950s would never think of doing, and may even have been incapable of doing. Scholarship over time has opened itself to new ways of thinking. Scholarly personnel are more varied. We need even more new ways of thinking and we need to open our doors to even more people. The potential to do intellectual work was never limited to one kind of person, but for decades the scholar was basically one kind of person. That’s not the case anymore. Yet the stereotype remains. Scholarship and intellectual life are a practice, not a costume.



Excerpt. Ursula K. Le Guin 2019: 5: “All of us have to learn to invent our lives, make them up, imagine them. We need to be taught these skills; we need guides to show us how. Without them, our lives get made up for us by other people.” 

Daily Life. Snow came to Boston.