Bio

Non-EA interests include chess and TikTok (@benthamite). We are probably hiring: https://metr.org/hiring 

How others can help me

Feedback always appreciated; feel free to email/DM me or use this link if you prefer to be anonymous.

Sequences

AI Pause Debate Week
EA Hiring
EA Retention

Comments


Thanks for finding this and writing it up! And thanks to FRI for updating their report. 

I thought this was an interesting point, thanks for writing it.

I feel confused about this response. You're asking people to give you examples of a thing occurring; I'm asking what date range you'd like those examples to fall in.

What time frame are you interested in? E.g. if someone says that they have <30y timelines today, would that meet your criteria?

Thanks! Perhaps I phrased this poorly: whether a person is a patient isn't the relevant factor; what matters is whether they are licensed. E.g. if you look at the FDA authorization for the first product, it says:

The ContaCT mobile application is intended to be used by neurovascular specialists, such as vascular neurologists, neuro-interventional specialists, or users with similar training who have been pre-authorized by their Healthcare Organization or Facility.

I'm actually not sure whether one could generously interpret "similar training" to include e.g. radiology technicians. They wouldn't be allowed to make diagnoses, and my guess is that the government would not look kindly on a rad tech saying something like "I'm not diagnosing you with a stroke, but the AI thinks you've had one, wink, wink," but I'm not sure. Perhaps someone with more legal experience could chime in.

In any case, I'm skeptical that a business would want to run that malpractice risk (particularly since, as mentioned above, insurance wouldn't reimburse them for doing so).

And yes, I agree that this probably means these products aren't more clearly safe and effective than e.g. eyeglasses (where businesses are likewise legally prohibited from giving glasses to patients without a licensed human optometrist first performing an exam). It's just worth considering that this is a very high bar![1]

  [1]

    Although I think maybe it's more accurate to just say that medical device authorization is based on a bunch of factors that are largely unrelated to the safety and efficacy of the product. E.g. I think there's no one who believes that cigarettes are safer than eyeglasses, despite them being available OTC.

I doubt that there are surveys of when people stayed home. You could maybe try to look at prediction markets, but I'm not sure what you would compare them against to see whether the prediction market was more accurate than some other reference group.

Thanks for collecting this timeline! 

The version of the claim I have heard is not that LW was early to suggest that there might be a pandemic, but rather that they were unusually willing to do something about it because they take small-probability, high-impact events seriously. E.g. I suspect you would say that Wei Dai was "late" because their comment came after the NYT article etc., but they nonetheless made a 700% return betting that covid would be a big deal.

I think it can be hard to remember just how much controversy there was at the time. E.g. you say of March 13, "By now, everyone knows it's a crisis," but sadly "everyone" did not include the California Department of Public Health, which didn't issue stay-at-home orders for another week.

[I have a distinct memory of this because I told my girlfriend I couldn't see her anymore: she worked at the Department of Public Health (!!) and was still getting a ton of exposure, since the department didn't think covid was that big of a deal.]

Congrats Samantha and the AIM team!

Your answer is the best that I know of, sadly.

A thing you could consider is that there are a bunch of EAGx's in warm/sunny places (Ho Chi Minh City, Singapore, etc.). These cities maybe don't meet the definition of "hub," but they have enough people for a conference, which may meet your needs.
