All of Sam Glover's Comments + Replies

I am very sceptical about the numbers presented in this article. 22% of US citizens have heard of Effective Altruism? That seems very high. RP did a survey in May 2022 and found that somewhere between 2.6% and 6.7% of the US population had heard of EA. Even then, my intuition was that this seemed high. Even with the FTX stuff it seems extremely unlikely that 22% of Americans have actually heard of EA.

Thanks - I just saw RP put out this post, which makes much the same point. Good to be cautious about interpreting these results!

This seems very likely to be a coincidence. This appears to be a freshers' fair, where different university societies try to recruit new students. Freshers' fairs in the UK (and I imagine it's similar in New Zealand) generally have lots of different societies trying to appeal to new students, and societies that have nothing to do with each other often end up with stalls next to each other. In this case, I think it's very likely that the Palestinian society and the Effective Altruism society stalls were coincidentally next to each other.

I wrote an FAQ that may be helpful in your application, you can find it here. If you'd like to know anything else about the application process (or want help with anything else, such as having a fresh pair of eyes on your application), feel free to email me at samueljnglover@gmail.com.

1
meugen
9mo
Oh that's very helpful, thank you very much for offering your support :)

Thanks for writing this, I found it really interesting.

Thanks so much, awesome that you decided to apply and I’m so happy you enjoy the blog!

If you happened to have read this piece, do you think it's substantive/useful enough for me to post on EA Forum proper (with a few edits) or is it more suitable as a shortform? 

2
Saul Munn
1y
I read it, and I think it's worth it to post on the EA forum. FWIW, it convinced me to apply. (also, I'm a big fan of your blog! highlight of the day is when i get the notification from substack that says you posted!)

I wrote up a quick FAQ on what the application process is like for Tyler Cowen's Emergent Ventures program. Generally I think more EAs should apply to EV, it's low-cost and a good way to get some money to get a project off the ground, and also a good signal of ability for young EAs. 

https://www.samstack.io/p/emergent-ventures-faq

I think it would be useful to know the percentage of women with depression who we would expect to be depression-free after a six month period without any intervention. 

0
Sean Mayberry
1y
Sam Glover - We looked at this with a controlled study and found that a statistically significant number of women were depression-free after completing StrongMinds therapy versus those who received no treatment. Additionally, using the PHQ-9 (an internationally standardized tool to assess depression), we saw an average difference of 12 points after the conclusion of therapy. To contextualize that, in Western countries, a change of 4 points is considered significant in terms of depression recovery.

Thanks for giving the details; I couldn't quite remember the full story and should've looked it up and quoted it directly. I don't quite know what to make of him doing this. On the one hand, a small lie about being vegetarian doesn't seem particularly pernicious or noteworthy, especially given that he actually went vegetarian after telling it. On the other hand, it does strike me as somewhat odd to say this if he had eaten a cheeseburger just a few hours earlier. It does update me ever so slightly towards thinking that he's liable to lie when it makes him look good - it might not just be a 'lie without intending to' situation.

It’s possibly worth noting that in his conversation with Tyler Cowen he did mention that he had previously lied (briefly) about being vegetarian.

I dug up that conversation, and the point you're referring to is presumably here. The story he tells is: he's trying but failing to be a vegetarian, gets asked by another vegetarian in a social setting whether he is one, says he is, and then never eats meat again.

I know I'm contradicting what I just said, since it is technically a lie, but honestly this doesn't seem like a big deal to me. In my experience, you can genuinely lie "without wanting to" in social situations. Someone asks you a question, and some unconscious process in your brain produces an answer within ... (read more)

You might be interested in some pieces I wrote on this recently, which don't explicitly show factual errors but do offer a criticism of the book. See here and here.

2
ellie
1y
Thanks, I enjoyed reading these. I appreciate that you're careful not to be too strong in your criticism, but I do think that Caplan's dismissal of quasi-experiments is more or less a factual error.

This feels like a weird interpretation of Will's comment, which doesn't (in my view) imply that for-profit companies can't do a lot of harm, but rather that if you start a company with the sole goal of making a profit, usually the worst outcome (with regards to your goal of making a profit) is that you go bankrupt. 

6
Jonathan Paulson
1y
As FTX just spectacularly demonstrated, Will was wrong. Even though FTX was ostensibly started with the sole goal of making a profit, it turns out there were other important implicit goals, like "don't steal billions of dollars from thousands of people". Implicit goals like that always exist, and failure to meet them is very bad.

Presumably he means because x-risk is short for 'existential risk' and can refer to things other than extinction. 

Strangely, it was Superforecasting by Phil Tetlock that made me start forecasting on Good Judgment Open. I started interacting with forecasters there, and a lot of them were into EA; that's how I got into it. I think a decent number of people have gone from EA (or rationalism) into forecasting, but for me it was the other way around.

This isn't exactly a comprehensive answer to your question about what's morally permissible and what isn't, but my view is that if avoiding flying is going to be a huge hassle and expense for you, you shouldn't make yourself feel awful about doing something you regard as less than ideal. I would just donate to the Clean Air Task Force (probably an amount that will more than cover the expected impact of the flights) and continue trying to avoid flights in the future when feasible, if that's something you want to do.

Thanks, good point! I agree that it's possible that a backlash could occur a while after the disruptive protests actually took place.  That being said, it seems likely (at least to me) that if it were the case that these protests were going to lead to people becoming less supportive of climate policies, there would have been at least some evidence of the backlash in the survey data at the height of, or in the immediate aftermath of, the disruption. 

Initially, we had planned only to do two surveys, but decided on commissioning a third when it beca... (read more)

Thanks, yeah, I think this was an error on my part rather than anything to do with you. I should have looked more carefully; I thought I had skipped past the recommendations, but the default option of subscribing caught me off guard. This is a shame, because it makes me more hesitant about recommending other Substacks on my own site.

Looks good - but how come when I subscribed to this I also seem to have been signed up to another newsletter called 'Bentham's Bulldog'? Not that I'm necessarily opposed to being signed up to this other substack, but I didn't opt into it so it seems slightly weird that I've been (seemingly) added to the email list.

5
Richard Y Chappell
2y
Having looked into it more, I gather that after subscribing, you're presented with the list of other Substacks that I recommend, with a highlighted option to subscribe to them (selected by default). It's bad form on Substack's part that the button to decline ("maybe later") is not so prominent, so you may be led to accidentally over-subscribe to other newsletters. Sorry about that! (Though it's easy enough to unsubscribe at any time, at least.)
1
Richard Y Chappell
2y
That's odd!  Is it possible that you clicked this commenter's profile (or a blogroll link) by mistake?  If anyone else is having this issue, let me know and I'll try looking into it further.