Here's a place to discuss projects, ideas, events and miscellanea relevant to the world of effective altruism that don't need a whole post of their own!
The most interesting current news for effective altruists is that EA Global in San Francisco has just started!
- Video from the first day, interspersed with occasional breaks, is available here.
- Streamed video from the subsequent two days will be available here.
- The program is here.
Note that MIRI is also a few weeks into its fundraiser.
Hope you have lots to discuss amidst these fresh EA talks!
This sounds like a really great idea. As a community we make loads of predictions; it seems likely we do this far more than most other communities. We do it for fun, as thought experiments, and often as a key area of focus, such as x-risk. Tracking our individual abilities at this sort of prediction seems valuable for several reasons: identifying who is particularly good at it, improving our own calibration, and so on. My one concern is that we could become hyper-focused on prediction and neglect current causes; getting too caught up in planning and looking forward, and forgetting to actually do the things we say we prioritize.
I also wonder how well near-future prediction ability translates to far-future predictions. To test how well you predict, you forecast near-future events or changes, improve your accuracy on those, and assume the skill carries over to the far future. People then make decisions based on your far-future predictions because of your track record as an accurate predictor. Perhaps, however, your forecasting model is wildly inaccurate when it comes to long-term predictions. I'm not sure how we could account for this. Thoughts?
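As a concrete illustration of what "tracking our individual abilities" might look like in practice, here is a minimal sketch using the Brier score, a standard measure of probabilistic forecast accuracy (lower is better). The data and function names are purely hypothetical, not any existing forecasting platform's tooling:

```python
# Minimal sketch: scoring a personal track record of resolved predictions
# with the Brier score. Lower scores mean better-calibrated forecasts.

def brier_score(forecasts):
    """forecasts: list of (probability_assigned, outcome) pairs,
    where outcome is 1 if the event happened and 0 otherwise."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical track record: three resolved predictions.
record = [
    (0.9, 1),  # said 90% likely; it happened
    (0.7, 0),  # said 70% likely; it didn't
    (0.2, 0),  # said 20% likely; it didn't
]
print(round(brier_score(record), 3))  # 0.18
```

Note that this only scores near-term, resolvable predictions, which is exactly the limitation raised above: a good Brier score on short-horizon events doesn't by itself validate long-horizon forecasts.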
To clarify, there is a class of people known as "superforecasters". I don't know the details of the science behind it, except that their efficacy has indeed been validly measured, so you'll have to look it up yourself to learn how it happens. What happens, though, is that superforecasters are people who, even though they don't usually have domain expertise in a particular subject, predict outcomes in a particular domain with more success than experts in that domain, e.g., economics. I think that might be one layperson forecaster versus one expert, ra...