Would you be interested in a Cause Prioritization Newsletter? What would you want to read on it?
I'll sign up and read if it'd be good 😊
What I'd be most interested in is the curation of
Add to (3) new explanations of or additions to methodologies - e.g., I still haven't found anything substantial about the idea of adding something like 'urgency' to the ITN framework.
Definitely! And I'll raise with my general interest in thoughtful analyses of existing frameworks
Is there some sort of a followup?
What does it mean for a human to properly orient their lives around the Singularity, to update on upcoming accelerating technological changes?
This is a hard problem I've grappled with for years.
It's similar to another question I think about, but with regards to downsides: if you in fact knew Doom was coming, in the form of World War 3 or whatever GCR is strong enough to upset civilization, then what in fact should you do? Drastic action is required. For this, I think the solution is on the order of building an off-grid colony that can survive, assuming one can't prevent the Doom. It's still hard to act on that, though. What is it like to go against the grain in order to do that?
Would you be interested in a video coworking group for EAs? Like a dedicated place where you can go to work for 4-8 hours/day and see familiar faces (vs Focusmate which is 1 hour, one-on-one with different people). EAWork instead of WeWork.
Someday, someone is going to eviscerate me on this forum, and I'm not sure how to feel about that. The prospect feels bad. I tentatively think I should just continue diving into not giving a fuck and inspire others similarly since one of my comparative advantages is that my social capital is not primarily tied in with fragile appearance-keeping for employment purposes. But it does mean I should not rely on my social capital with Ra-infested EA orgs.
I'm registering now that if you snipe me on here, I'm not gonna defensively respond. I'm not going to provide 20 citations on why I think I'm right. In fact, I'm going to double down on whatever it is I'm doing, because I anticipate in advance that the expected disvalue of discouraging myself due to really poor feedback on here is greater than the expected disvalue of unilaterally continuing something the people with Oxford PhDs think is bad.
This sounds very worrying, can you expand a bit more?
I don't have much slack to respond given I don't enjoy internet arguments, but if you think about the associated reference class of situations, you might note that a common problem is a lack of self-awareness of there being a problem. This is not the case with this dialogue, which should allay your worry somewhat.
The main point here, which this is vagueposting about, is that people on here will dismiss things rather quickly, especially when the dismissal comes from someone with a lot of status, and it happens in a pile-on way without much overt reflection from the people who upvote such comments. I concluded from seeing this several times that at some point this will happen with a project of mine, and that I should be ok with this world, because this is not a place to get good project feedback as far as I can tell. The real risk I am facing is being dissuaded from the highest-impact projects by people who only believe in things vetted by a lot of academic-style reasoning and evidence that makes legible sense, at the cost of not being able to exploit secrets in the Thielian sense.
It's interesting that the Oxford PhDs are the ones you worry about! Me, I worry about the Bay Area Rat Pack.
This is also valid! :)
Omg I can't believe that someone downvoted you for admitting your insecurities on your own shortform!! That's absolutely savage, I'm so sorry.
I am seeking funding so I can work on my collective action project over the next year without worrying about money so much. If this interests you, you can book a call with me here. If you know nothing about me, one legible accomplishment of mine is creating the EA Focusmate group, which has 395 members as of this writing.
What are ways we could get rid of the FDA?
(Flippant question inspired by the FDA waiting a month to discuss approval for coronavirus vaccines, and more generally its dragging its feet during the pandemic, killing many people, in addition to its other prohibitions being net-negative for humanity. IMO.)
So, I take issue with the implication that the FDA's process for approving the covid vaccine actually delays rollout or causes a significant number of deaths. From my understanding, pharma companies have been ramping up production since they determined their vaccines probably work. They aren't sitting around waiting for FDA approval. Furthermore, I think the approval process is important for ensuring that the public has faith in the vaccine, and that it's actually safe and effective.
This post claims the financial system could collapse due to Reasons. I am pretty skeptical but haven't looked at it closely. Signal-boosting due to the low chance it's right. Can someone else who knows more finance analyze its claims?