Possibly it would be enough just to have a disclaimer like "by submitting, you agree to have this turned into an audio format" to satisfy copyright law?
It seems like an extremely good use of time for EA Forum designers (and others) to implement this. I think this is a fantastic project and I really hope it goes ahead. I also think it's valuable for the EA community to take fewer start-up-style shortcuts as the movement gets older and more well-known. The risk of giving the impression of being low-integrity or messy is high enough that, regardless of whether copyright laws are good or not, it's worth following them.
I upvoted your comment because what you said was interesting even though I disagree with the overall sentiment. I agree that fiction is good for raising difficult questions and exploring nuance, and that didactic fiction can be a bit off-putting.

However, most fiction expresses a writer's beliefs about the world and what they value (most writers draw on what they already know and what they think). When the writing is good, this tends not to be off-putting because the writer handles it with enough nuance. An author's best guess about what is good or bad (or true or false) can still come through pretty clearly without the reader feeling like they have to agree: if there is enough going on in the story, enough complexity and nuance, readers can put different emphasis on different elements and come to their own conclusions, so the author's point of view doesn't grate. Simple messages, handled with no nuance, make for bad writing. Fiction that makes readers think and confront difficult questions can make for much better fiction.

It doesn't follow that fiction can't change people's impressions of what is good or bad (or true or false). And if it can, that seems worth taking into account. Is it a good goal for fiction to confront a reader with just any difficult question, without considering whether that question is worth thinking about over other questions? My guess is, in full EA spirit, that it is better to choose which difficult questions to ask (especially if some difficult questions will lead people to miss questions or considerations that you think are far more important).
I think a good goal for fiction could be to leave the reader with a better understanding of your point of view and the way you see the world than if they hadn't read your piece, while also having them see the nuances and complexities involved (because there generally aren't simple answers, and fiction can be a great way to explore that).
I love this post! Thank you for sharing it :)
Yay! I'm glad :)
It is probably totally inappropriate to respond to questions on someone else's AMA, but I thought I'd mention anyway that I loved a talk (linked below) by Hayden Wilkinson that is very relevant to this.
Hayden pointed out that even if, theoretically, your only goal* were to help others as much as you can over your lifetime, you would still need to take into account that you are human, and that what you do now changes what your future self is likely to want to do. If you try to do an extreme amount now, with no plan to give yourself a break when you need one, your lifetime impact will probably be lower than if you set yourself much less demanding targets. If you then find that the less demanding targets are easy to maintain and you think you really could do more, you can ramp up at that point. Likewise, when what you are doing feels like too much (even if, theoretically, you think you should be doing even more), giving yourself permission to properly take care of yourself in the short term might be the best way to increase your impact over your lifetime.

*For the record, I'd guess that for almost everyone in the EA community, doing as much as they can to help others isn't their only goal in life, even if it is still a very high priority (and for almost any goal a person might have, self-care for long-term wellbeing seems really important). I have other goals (like having an enjoyable life) because I am not perfectly selfless, but I think it is plausible that letting myself have other goals increases the chances that helping others as much as I can, with a significant proportion of my time and money, will remain a pretty high priority for me for the rest of my life.
That makes sense! My mistake.
I downvoted your comment despite agreeing with many of your critiques, because I very, very strongly disagree that posts like this aren't a good fit for the forum (and my best guess is that discouraging this sort of post does significantly more harm than good). If someone with a good understanding of what effective altruism is has an idea they think is plausibly a high-impact use of time (or other resources), the forum is exactly where that sort of idea belongs, and this post clearly meets that standard. Once the idea is on the forum, open discussion can happen about whether it is high impact, or even net positive. If people only ever post ideas they are already quite sure the effective altruism community will agree are high impact, it will be much harder for the community to avoid becoming an echo chamber of "approved" ideas.

I think the author has improved the forum with this post for two reasons. First, the post created an interesting discussion on whether this idea is a good one and how it could be improved (the critiques in your comment were an important contribution to this!). Second, and more importantly, the post nudged the culture of the forum in a direction I like: making it more normal to post ideas for plausibly* high-impact projects that aren't obviously connected to one of the standard EA ideas that come up in every EA intro talk. Even though I'm not sure this idea is net positive, it still seems almost absurd to me that this post isn't a good fit for the EA Forum (especially when people like you make compelling critiques and suggestions in the comments, keeping the discussion from being too one-sided and maybe allowing plausibly good ideas to iterate into better ones)!

*To me, as I said above, "sufficiently plausible" to be a good fit for a forum post means an author who understands what EA is and thinks the idea might be high impact.
I actually think this author went well above and beyond what I'd consider a good minimum bar for such ideas: it sounds like they put a great deal of thought into this project, have already put quite a bit of work into getting the idea off the ground, and got feedback from multiple people in the EA community!
I am enjoying all this recent discussion on what we should be calling "effective altruism". As EA ideas become more common and get applied in a larger variety of contexts, it might be good to have different names that are context- and audience-specific. For example, "global priorities" seems like a great name for the academic field, and it can be acknowledged as related to "effective altruism" the social movement, which is itself clearly distinct from but still related to the LessWrong/rationality community. Maybe policy-oriented effective altruism needs its own name (clearly related to the academic field and the social movement but distinct from both?). Similarly, maybe it is also okay for a broader-appeal version of effective altruism to have a different name (this is maybe what the GWWC brand is moving towards?).
The effective altruism project is pretty broad, and even if far more thought had been put into the name, it still seems unlikely to me that one name could appeal to policy-makers, academics, the broader population, and the students/people on the internet who like to philosophise deeply about morality and base their lives around the conclusions of that philosophising.