Strongly disagree.
Edit to add: Good to know that when someone expresses discomfort at jokes about the mass murder of their family, people react with downvotes.
That's a really good point, thanks! Though if they don't have short timelines, it seems they are being quite irresponsible as board members in not preventing Sam from making increasingly large bets on scaling. Of course, they might not be willing to cross him; the current board presumably learned that lesson from Ilya's ill-fated decision.
Also, you would need what are currently considered almost implausibly long timelines to think that their spending more quickly doesn't make sense.
No, it would not. Per the frame that makes the argument more compelling, as I said: "Secondly, they may be even more successful in building significantly more powerful AI, transforming the world. Obviously, the nonprofit would become far wealthier, but given OpenAI’s mandate, it also becomes irrelevant."
But within the first option, if they are actually more than doubling their value yearly (as implied by 100x growth over 6 years, which matches their current revenue growth continuing at its current rate), then giving away $20 billion per year, starting from their current valuation of $150 billion, means they give away only a small fraction of their eventual endowment - about 13%. And in that case, given that it's hard to spend even 13% of $150B effectively, it's going to be far harder to spend any large percentage of their $15 trillion endowment in later years!
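To spell out the arithmetic behind the ~13% figure - a rough sketch, with assumptions of my own for illustration: growth is the constant yearly rate implied by 100x over 6 years, and each $20B donation happens at year end:

```python
# Rough check of the ~13% claim. Assumptions (mine, for illustration):
# constant growth implied by 100x over 6 years, and a $20B donation
# made at the end of each of the 6 years.
growth = 100 ** (1 / 6)      # ~2.15x per year
endowment = 150.0            # current valuation, in $ billions
forgone = 0.0                # what the donated money would have grown into

for year in range(6):
    endowment *= growth      # the endowment compounds for a year
    endowment -= 20          # then $20B is given away
    forgone = forgone * growth + 20  # donations also "compound" as forgone value

print(f"endowment after 6 years: ${endowment:,.0f}B")          # ~$13,285B
print(f"forgone value of donations: ${forgone:,.0f}B")         # ~$1,715B
print(f"share of the final endowment: {forgone / endowment:.0%}")  # 13%
```

The exact percentage shifts a bit depending on when within each year the donations are made, but under these assumptions the cumulative giving comes to roughly 13% of what the endowment ends up worth.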
To be clear about my views, I do support spending on local community orgs - as I wrote, "local organizations or those where I have personal affiliations or feel responsibilities towards are also important to me - but... this is conceptually separate from giving charity effectively, and as I mentioned, I donate separately from the 10% dedicated to charity."
I am not saying everyone is malicious, nor that no one cares. But belief fixation can occur when even a moderate, non-majority proportion of a population is incentivized to believe something is true; such a dynamic isn't incompatible with good motivations, and once people claim something is true, it is about as hard to refute as it would have been to establish in the first place.
I'm not trying to imply it, I'm trying to state it clearly. You dismiss the arguments made in the book as not being empirical. In case you haven't reread your post, here are some quotes where you do this explicitly:
"the chapter presents effectively zero empirical research"
"one might expect Y&S to substantiate their case with empirical evidence"
"lack of empirical evidence"