blonergan

122 · Joined Mar 2020

Comments (21)

Thank you for this post. I missed it when it was originally posted, and only came across it via the recent "Friendship Forever" post.  An organization doing work in this area that might be of interest is the Foundation for Social Connection. They have an "Innovation Accelerator" that could potentially provide funding for projects addressing loneliness. It looks like they are funded in part by two large health insurance companies (Humana and United Healthcare), based on this

Thanks for your reply. I do think it would be unusual to see such promises, particularly from a firm looking for large investments. And I would expect to see a bunch of disclaimers, as you suggest. There might have been such language in the actual investment documents, but still. The excerpt shared on Twitter would have set off red flags for me because it seems sloppy and unprofessional, and it would have made me particularly concerned about their risk management, but I wouldn't have concluded it was a Ponzi scheme or that there was something fraudulent going on with the reported returns. 

It will be interesting to see if all of the FTX/Alameda fraud (if there was fraud, which seems very likely) took place after the most recent investment round. Investors may have failed not in financial diligence but in ensuring appropriate governance and controls (and, apparently, in assessing the character of FTX's leadership).

The returns shown in the document are not indicative of fraud -- those sorts of returns are very possible when skilled traders deploy short-term trading strategies in inefficient markets, which crypto markets surely were at the time. The default risk when borrowing at 15% might have been very low, but, contrary to what they suggested, it was not zero. The "no downside" characterization was misleading and should have been caught by a lawyer.

Nobody with an understanding of trading would have [EDIT: I would not have] concluded they were engaged in Ponzi schemes or were misrepresenting their returns based on the document. There are plenty of sloppy, overoptimistic startup pitch decks out there, but most of the authors of those decks are not future Theranoses.

Thank you for your response, and I apologize for being defensive in my comment and for not noticing your edits when they happened.

I took your post seriously and had an extended exchange with you in the comments section. I indicated that I shared some of your concerns. I also expressed that I thought you had mischaracterized some of SBF's views about bitcoin and other cryptocurrencies. It appears that you have since edited the post to correct some of those mischaracterizations, but you did not acknowledge having done so, as best I can tell.

I also disagreed with your view that many good projects would lose funding if there were a crypto downturn. Unfortunately, with FTX collapsing so abruptly, there is a risk of that happening. I am hopeful that other donors will step up to fund the highest value projects funded by FTX, but this is a real challenge we face as a community. 

I'm puzzled by your statement in this new post that "It was quite obvious that this would happen..." There was certainly a risk things could go badly, and I think I personally underestimated the risk, but I don't think it is credible to say that it was obvious.

 

I interpreted that as meaning that a $1,000 cash transfer costs a bit more than $1,000, including the direct cost of the cash transfer itself.  So, something like $100 of delivery costs would mean that a $1,000 cash transfer would have a total cost of around $1,100. 

Here HLI comes up with $1,170 as the total cost of a $1,000 cash transfer, which seems reasonably close to your numbers.
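
For concreteness, here is a minimal sketch of the arithmetic I have in mind (just an illustration; the function name is mine, and the ~$170 overhead figure is simply what HLI's $1,170 total implies for a $1,000 transfer):

```python
def total_cost(transfer_amount, delivery_cost):
    # Total program cost = amount received by the recipient + delivery overhead
    return transfer_amount + delivery_cost

# My reading of the claim: a $1,000 transfer with roughly $100 of delivery costs
print(total_cost(1_000, 100))  # -> 1100

# HLI's $1,170 total for a $1,000 transfer implies roughly $170 of overhead
print(total_cost(1_000, 170))  # -> 1170
```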

This is wonderful news! 

A couple of comments on the new intro to EA article:

  1. The graph in the “Helping create the field of AI alignment research” section is interesting, but it takes up a lot of space given that it isn’t about the main point of the section. The section seems to be about “AI will probably be a big deal, and the EA community has helped create and populate the AI alignment field, which is trying to increase the likelihood that AI is beneficial,” whereas the graph says “the Industrial Revolution was a big deal,” which is somewhat relevant but doesn’t seem to warrant a giant graph, in my opinion. Also, some readers might wonder whether the graph merely reflects constant exponential growth (my understanding is that it doesn't, but that isn't obvious to me from looking at it).
  2. Under “Improving decision-making,” I don’t find the Metaculus example very compelling. The text suggests, but does not establish, that the forecasting community was ahead of consensus public or expert opinion. It is also not clear to me which people or entities changed, or could have changed, their decisions in a way that benefited humanity by using the Metaculus forecast. Maybe that's obvious to other people, though!

On future funding flows, I specifically said "[i]n the event of a crypto crash, fewer new projects would be funded, and the bar for continuing to fund existing projects would be higher," so I don't think we disagree about that. But I disagree with the "lots of good projects (would) have to be ended" statement in your original post.
