GWWC board member, software engineer in Boston, parent, musician. Switched from earning to give to direct work in pandemic mitigation. Married to Julia Wise. Speaking for myself unless I say otherwise. Full list of EA posts: jefftk.com/news/ea
Looking at the two comments, I see:
- Your comment on a comment on a quick take, suggesting suing OpenAI for violating their charter, and including an argument for why. Voted to +4.
- Aaron's quick take, suggesting suing OpenAI over their for-profit conversion, with no argument included. Voted to +173.
I don't see anything weird here. With the design of the site, a quick take is likely to get much more attention than a nested comment on a quick take, and once people start upvoting something this snowballs, because the site makes more highly voted content more visible.
But even if you'd posted your comment as your own quick take I think it probably wouldn't have taken off: it doesn't give enough context for someone seeing it out of nowhere to figure out if they think it's worth paying attention to, or enough of an explanation for what a suit would look like. You can gloss this as packaging/rigor, I guess, but I think it's serving a useful purpose.
(I think neither posting is amazing: a few minutes with an LLM asking what the rules are for converting 501(c)(3)s into for-profits would have helped both a lot. I'd hold that against them if they were regular posts, but that's not a standard we do, or should, hold quick takes or comments to.)
I post a fair number of offbeat ideas like this, and they don't generally receive much attention, which leaves me feeling demoralized
In general, if you want ideas to receive attention you should expect to put in some work preparing them for other people's attention: gather the information that will help others evaluate them, make an argument for why these ideas are important. If you do that work, and then post as a quick take or (better, but requires more investment) top-level post, I do think you'll get attention. This is no guarantee of a positive reaction (people may disagree that you've sufficiently made your case) but I don't think it's a process that selects against weird ideas.
There's a reason people use "low-effort" as a negative term: you pay with your own effort in a bid on other people's attention.
I got downvoted/disagreevoted for asking if there's a better place to post offbeat ideas
Your comment starts with claims about what people want on the forum and a thesis about how to gain karma, and only gets to asking about where to post weird ideas in the last paragraph. I interpret the downvoting and disagree voting as being primarily about the first two paragraphs.
basically acknowledges that this is a hypothetical, and new ideas mostly don't get posted here
I wasn't trying to make a claim either way on this in my comment. Instead, I was adding a caveat that I was going by my impression of the site rather than taking the time to look for specific examples that would support or counter it, and so people should put less weight on my claim.
Thinking now, some example ideas that were new/weird in the sense that they were pretty different from the lines of thought I'd seen here before but that still got attention (or at least comments / votes):
- Top-level post: Let's think about slowing down AI
- Quick take: EA Awards
Copying Chandler's response from the comments of the open thread:
Hi Arnold,
Thanks for your question! You are correct that our funds raised for metrics year 2023, $355 million, were below our 10th-percentile estimate from our April 2023 blog post. We knew our forecasts were quite uncertain (80% confidence interval), and, looking back, we see two primary reasons that our forecasts were off.

First, we were optimistic about the growth of non-Open Philanthropy funding. Our funds raised in 2023 from sources other than Open Philanthropy were $255 million, which is about at our 10th-percentile estimate and similar to the $253 million we raised from such sources in 2022 (see the bottom chart in the blog post). We've continued to expand our outreach team, with a focus on retaining existing donors and bringing in new ones, and we believe these investments will produce results over the longer term.

Second, Open Philanthropy committed $300 million in October 2023 and gave us flexibility to spend it over three years. We chose to allocate $100 million to each of 2023, 2024, and 2025, which is less than the $250 million we had forecast for 2023.
We discuss our current funding situation in a recent blog post about our approach to grant deployment timelines. We remain funding constrained at our current cost-effectiveness bar. Raising more money remains our single most important lever for maximizing impact: if we have more funding, we'll be able to make more grants to cost-effective programs that save and improve lives.
I do list this on my donations page, but I'm trying to be pretty conservative in what I count as my donations: only money I actually donate. So I don't count it towards my 50%, and I put it in grey italics like my employer donation matches, donations in exchange for work, the PayPal 1% match, and other counterfactual money moved that I don't fully include.
I think it's fine (and probably good) if others are less strict about this, though!
Sorry! I've edited my comment to make it clearer that I'm trying to say that suffering caused by eating meat is not the only factor you should weigh in estimating expected utility.
(For what it's worth I do still think it's likely that, taking these other benefits into account and assuming you think society is seriously undervaluing the moral worth of animals, veganism still doesn't make sense as a matter of maximizing utility.)
I would expect most donations to be in giving season, though, which in 2022 would be after FTX collapsed.