Topic Contributions


Flimsy Pet Theories, Enormous Initiatives

It's pretty clear that being multiplanetary is more anti-fragile: it provides more optionality, allows for more differentiation and evolution, and provides stronger challenges.

Flimsy Pet Theories, Enormous Initiatives

I recently gave a talk on one of my own ambitious projects at my organization, and gave the following outside view outcomes in order of likelihood.

  1. The project fails to gain any traction or have any meaningful impact on the world.
  2. The project has an impact on the world, but despite good intentions the impact is negative, neutral, or too small to matter.
  3. The project has enough of a positive outcome to matter.

In general, I'd say that, on an outside view, this is the most likely order of outcomes for any ambitious/world-saving project. And I was saying it specifically to elicit feedback and make sure people were red-teaming me morally.

However, it's not clear to me that putting more money into research/thinking improves things much.

For one thing, the most likely outcome is, again, that the project fails to gain any traction or have any impact at all, so you need to be de-risking that through classic lean-startup MVP-style work anyway. You shouldn't wait on that and spend a bunch of money figuring out the positive or negative effects at scale of an intervention that won't actually be able to scale (most things won't scale).

For another, I think a lot of the benefit of potentially world-changing projects comes through hard-to-reason-about flow-through effects. For instance, in your example about Andrew Carnegie and libraries, a lot of the benefits would be hard-to-gesture-at effects of having a more educated populace and how that affects various aspects of society and culture. You can certainly create Fermi estimates and systems models, but ultimately people's models will be very different, and missing one variable or relationship in a complex systems model of society can completely reverse the outcome.
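To make that last point concrete, here's a minimal sketch (every number and relationship is made up purely for illustration, not an actual model of the libraries case): a simple flow-through model of libraries → education → societal benefit looks clearly positive, until adding one easily-missed relationship flips the sign of the estimate.

```python
def net_impact(include_missing_relationship: bool) -> float:
    """Toy flow-through model; all numbers are illustrative assumptions."""
    libraries_built = 100
    education_gain_per_library = 2.0   # assumed "education units" per library
    benefit_per_education_unit = 1.5   # assumed societal benefit per unit
    benefit = (libraries_built
               * education_gain_per_library
               * benefit_per_education_unit)

    cost = 0.0
    if include_missing_relationship:
        # The easily-missed variable: e.g. philanthropic libraries crowding
        # out public spending on higher-value uses (hypothetical effect size).
        crowded_out_value_per_library = 3.5
        cost = libraries_built * crowded_out_value_per_library
    return benefit - cost

# net_impact(False) == 300.0  -> looks robustly positive
# net_impact(True) == -50.0   -> one omitted relationship reverses the conclusion
```

The point isn't the specific numbers; it's that with models this sensitive to omitted relationships, two careful modelers can reach opposite conclusions.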

Ultimately, it might be better to use the types of reasoning/systems analysis that work under Knightian uncertainty, asking things like: "Is this making us more anti-fragile? Is this effectual, allowing us to continually build towards more impact? Is this increasing our capabilities in an asymmetric way?"

This is the exact type of reasoning that would lead someone to intuitively think space settlements are important: it's clearly something that increases the anti-fragility of humanity, even if you don't have exact models of the threats it may help against. By increasing anti-fragility, you're increasing the ability to face unknown threats. Certainly, you can get into specifics and realize it doesn't make you as anti-fragile as you thought, but again, it's very easy to miss some other specifics that are unknown unknowns and totally reverse your conclusion.

I ultimately think what makes sense is a culture of continuous oversight and thinking about your impact, rather than specific up-front research or a budget. Maybe you could have "impact-analysis-athons" once a quarter where you discuss these questions. I'm not sure exactly what it would look like, but I notice I'm pretty skeptical of the idea of putting a budget here or creating a team for this purpose. I think such teams end up doing lots of legible impact analysis, which ultimately isn't that useful for the real questions you care about.

FTX EA Fellowships

Sure, but "already working on an EA project" doesn't mean you have an employer.

FTX EA Fellowships

Assuming you have an employer

Effective Altruism Coaching 2020 Annual Review

This is great! Curious what (if anything) you're doing to measure counterfactual impact. Any sort of randomized trial, e.g. following up with clients you didn't have time to take on and comparing their change in productive hours to that of your clients?

Halffull's Shortform

Yeah, I'd expect it to be a global catastrophic risk rather than existential risk.

Halffull's Shortform

Is there much EA work into tail risk from GMOs ruining crops or ecosystems?

If not, why not?

Delegate a forecast

Yeah, I mostly focused on the Q1 question, so I didn't have time to do a proper growth analysis across 2021.

Yeah, I was talking about the Q1 model when I was trying to puzzle out what your growth model was.

There isn't a way to get the expected value, just the median currently (I had a bin in my snapshot indicating a median of $25,000). I'm curious what makes the expected value more useful than the median for you?

A lot of the value of a business's potential growth vectors comes in the tails. For this particular forecast it doesn't really matter, because the distribution is roughly bell-shaped, but if I were using this as, for instance, a decision-making tool to decide what actions to take, I'd really want to look at which ideas had a small chance of being runaway successes, and how valuable that makes them compared to options which are surefire but lack that chance of tail success. Choosing those ideas isn't likely to pay off on any single idea, but it is likely to pay off over the course of a business's lifetime.
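A toy distribution makes the median-vs-mean gap concrete (these numbers are invented for illustration, not from the actual forecast): when most outcomes are modest but one tail outcome is huge, the median ignores the tail entirely while the expected value is dominated by it.

```python
import statistics

# 100 hypothetical revenue outcomes: mostly modest, one runaway success.
outcomes = [10_000] * 90 + [25_000] * 9 + [5_000_000]

median_revenue = statistics.median(outcomes)  # what a snapshot's median shows
mean_revenue = statistics.fmean(outcomes)     # the expected value

# median_revenue == 10_000, mean_revenue == 61_250.0
# Ranking ideas by the median would never favor the tail-heavy option;
# ranking by expected value would.
```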

Delegate a forecast

Thanks, this was great!

The estimates seem fair. Honestly, they're much better than I would expect given the limited info you had and the assumptions you made (the biggest one that's off is that I don't have any plans to market only to EAs).

Since I know our market is much larger, I use a different forecasting methodology internally which looks at potential marketing channels and growth rates.

I didn't really understand how you were working the growth rate into your calculations in the spreadsheet. Maybe you were just eyeballing what made sense based on the current numbers and the total addressable market?

One other question about your platform: I don't see any way to get the expected value of the density function, which is honestly the number I care most about. Am I missing something obvious?

Delegate a forecast

Hey, I run a business teaching people how to overcome procrastination (procrastinationplaybook.net is our not yet fully fleshed out web presence).

I ran a pilot program that made roughly $8,000 in revenue by charging 10 people for a premium interactive course. Most of these users came from a couple of webinars that my friends hosted; a couple came from finding my website through the CFAR mailing list and from webinars I hosted for my Twitter friends.

The course is ending soon, and I'll spend a couple of months working on marketing and updating the course before the next launch, as well as:

1. Launching a podcast breaking down skills and models, and selling short $10 lessons teaching how to acquire each skill.

2. Creating a sales funnel for my pre-course, a do-it-yourself planning course for creating the "perfect procrastination plan," selling for probably $197.

3. Creating the "post-graduate" continuity program after people have gone through the course, allowing people to have a community interested in growth and development, priced from $17/month for basic access to $197 with coaching.

Given those plans for launch in early 2021:

1. What will be my company's revenue in Q1 2021?

2. What will be the total revenue for this company in 2021?
