Nobody Wants to Read Your Sh*t by Steven Pressfield is my favorite book of writing advice. Its core insight is expressed in the title. The best thing you can do for your writing is to internalize this deep truth. 

Pressfield did it by writing ad copy. You can’t avoid internalizing that nobody wants to read your shit when you’re writing ads, which everybody hates and nobody ever wants to read. Maybe you don’t have to go write ad copy to understand this; maybe you can just read the book, or just this post.

When you understand that nobody wants to read your shit, your mind becomes powerfully concentrated. You begin to understand that writing/reading is, above all, a transaction. The reader donates his time and attention, which are supremely valuable commodities. In return, you the writer must give him something worthy of his gift to you.

When you understand that nobody wants to read your shit, you develop empathy. [...] You learn to ask yourself with every sentence and every phrase: Is this interesting? Is it fun or challenging or inventive? Am I giving the reader enough? Is she bored? Is she following where I want to lead her?

What should you do about the fact that nobody wants to read your shit?

  1. Streamline your message. Be as clear, simple, and easy to understand as you possibly can.
  2. Make it fun. Or sexy or interesting or scary or informative. Fun writing saves lives.
  3. Apply this insight to all forms of communication.

Pressfield wrote this book primarily for fiction writers, who are at the most serious risk of forgetting that nobody wants to read their shit (source: am fiction writer). But the art of empathy applies to all communication, and so do many other elements of fiction:

Nonfiction is fiction. If you want your factual history or memoir, your grant proposal or dissertation or TED talk to be powerful and engaging and to hold the reader and audience's attention, you must organize your material as if it were a story and as if it were fiction. [...]

What are the universal structural elements of all stories? Hook. Build. Payoff. This is the shape any story must take. A beginning that grabs the listener. A middle that escalates in tension, suspense, and excitement. And an ending that brings it all home with a bang. That's a novel, that's a play, that's a movie. That's a joke, that's a seduction, that's a military campaign. It's also your TED talk, your sales pitch, your Master's thesis, and the 890-page true saga of your great-great-grandmother's life.

And your whitepaper, and your grant proposal, and your EA forum post. For this reason, I do recommend going out and grabbing this book, even though much of it concerns fiction. It only takes about an hour to read, because Pressfield knows we don’t want to read his shit. Finally:

All clients have one thing in common. They're in love with their product/company/service. In the ad biz, this is called Client's Disease. [...]

What the ad person understands that the client does not is that nobody gives a damn about the client or his product. [...] 

The pros understand that nobody wants to read their shit. They will start from that premise and employ all their arts and all their skills to come up with some brilliant stroke that will cut through that indifference.

The relevance of this quote to EA writing is left as an exercise to the reader. 

Comments (11)



jvb -- good post; I agree with the central message: if you're a writer, no reader owes you anything - not their attention, not their sympathy, not their cognitive effort, not their patience, not their money, not their world-view. You have to earn all of it, word by word, line by line, chapter by chapter.

More general point: there's a huge ecosystem of advice out there on writing, outreach, marketing, advertising, public relations, etc. Much of it has dubious content, but a large proportion of it exemplifies good, direct, actionable writing. 

In other words, popular books on topics like advertising (e.g. by marketing guru Seth Godin) don't always deliver useful 'object-level' advice, but they are usually written in an extremely effective and compelling way that's worth studying at a meta-level.

Students who write essays are trained in a very strange environment: they have someone who will read their work no matter what. Try writing a blog instead - if it's bad, you don't get an F, you get 10 views on something that took you a day to write.

The forum is honestly pretty good training for this. I've written several posts that took me ~10 hours each that got like 40 karma. Pretty galling, but that's the real world. 

It's actually quite remarkable--the way we teach writing to students is anti-useful. You could possibly do a worse job than we're currently doing, but I don't immediately see how.

Well Nathan, you know that I want to say "why are you paying attention to karma, you know karma is a bad proxy for what you care about", and believe me I've done way worse than 10 hours for 40 points multiple times. But maybe the point is that karma is awarded on writing style more than anything else? 

Does EA have a crisis on its hands, in that researchers who need engagement to thrive aren't getting the comments they need? Number of comments and karma are sorta correlated, right?

I think karma is awarded more by generality of subject matter than by writing style per se? This post, which I spent a few hours on (including the hour I spent rereading the book), has 3x the karma of another post I put up around the same time that represents the outcome of about a year of part-time research.

And this is perfectly natural! Everyone on the EA forum has some reason to care about good writing; only a small subset of people on the EA forum have some reason to care about genetic engineering detection.

Yeah. I'll add:

  • Single-sourcing: Building Modular Documentation by Kurt Ament
  • Dictionary of Concise Writing by Robert Hartwell Fiske
  • Elements of Style by William Strunk Jr
  • A Rulebook for Arguments by Anthony Weston

There are more but I'm not finished reading them. I can't say that I've learned what I should from all those books, but I got the right idea, more than once, from them.

I'd also add 'The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century' (2015) by Steven Pinker (the Harvard psychologist who focuses on language) -- an excellent book.

Thank you for the recommendation. I also found Clear and Simple as the Truth by Thomas and Turner a great philosophical investigation into producing streamlined, simple writing. 

I've come across advice similar to this before (from David Perell, for instance), and something that comes to mind is: what if Proust had followed this advice? Or William James, Erik Davis, or hell, even Haruki Murakami? There are a whole bunch of authors whose writing I love who go pretty hard against "Streamline your message. Be as clear, simple, and easy to understand as you possibly can."

Don't get me wrong, I appreciate the general thrust of this post, and clear writing is usually great and something to aspire to. I also like the quote on nonfiction as fiction. I think the main problem I have with any 'rules' on how to write is that, to me, the best authors break the language.

There's something of a pepperoni airplane effect here. Everyone wants to think they're Proust. I analogize it to the Picasso thing--you need to "learn the rules" before you can usefully take the training wheels off and start "breaking" them. Scare quotes intentional.

I disagree re: Murakami (haven't read the others). I find him to communicate extremely clearly. The actual book is full of specific examples of things we think of as artful and indirect but that are actually bending their full force into conveying a very bright and specific concept.


Thanks for this post, a ton of good insights! I think this has quite a bit of relevance to internal EA community communication, as well as to pitching EA concepts to the general public.

I've wondered quite a bit about how we can improve the broadcasting of EA cause areas to reach more potential researchers. I think the people who are pretty killer at this are science communicators on YouTube (Veritasium, Tom Scott, Physics Girl, etc.). I wonder if it'd be worth getting a few people to reach out to the top 50 science channels on YouTube to pitch different EA cause areas as video concepts (alignment research, novel approaches to animal welfare, biorisk tech, etc.)?
