DonyChristie


Comments

EA and the current funding situation

As someone who knows Anthony in person and has engaged in more high-bandwidth communication with him than anyone else on this thread, I am happy to stake social capital on the claim that his insights are, broadly speaking, very much worth listening to, and that he is worth connecting with anyone who could give his ideas legs.

I have downvoted at least one comment in this thread that I felt was not conducive to more of his ideas being externalized into the world, due to what I believe to be an unnecessary focus on social norms/tone policing over tracking object-level ideas. I am not responding further, nor am I responding to particular comments, as I don't want to feed the demon thread. But I do want to provide clarity on my judgement of what is in the right, and also to state that I think Anthony could very possibly provide us Cause X as much as anyone I've seen.

To that end, I believe his interest in new or different infrastructure for how we communicate and internalize ideas is reasonable, and that in an ideal impact marketplace it's unreasonable to expect idea providers to also be the idea executors, especially to the extent of expecting them to engage in implicit politics more than is strictly necessary to get the ball rolling.

When did the EA Forum get so good?!

I personally think the average quality of posts has gone down, but that this is probably okay. The total number of good posts has gone up; it's harder, but not that much harder, to sift through and find them. It would be nice if good recent posts stayed visible longer, or if there were a feature to save posts (maybe I could use Thought Saver for this). Karma inflation seems quite high, and karma is less informative than it used to be, boosted by social desirability biases, though it makes sense to take it as good news that more people are reading, evaluating, and contributing. I want more object-level stuff over community-building stuff (maybe these can be separated?).

The AI Messiah

Moynihan's book X-Risk makes a key, subtle distinction between religious apocalypticism, in which God is in control, and the secular notion of a risk that destroys all value and that only humanity can stop. I'm uncertain how much the distinction is maintained in practice, but I recommend the book.

A retroactive grant for creating the HPMoR audiobook (Eneasz Brodski)?

We are building a marketplace (newly published site, very rough-looking) for impact markets. We think it's important to set it up right, as there are short-term and long-term risks involved that we want to mitigate. We would be happy to facilitate this sort of funding between Eneasz and retro funders, whether down the line or soon as part of an experiment. (I would personally be quite excited to see retro funding of his video, Shit Rationalists Say.)

Targeting Celebrities to Spread Effective Altruism

I think this is both 'obviously' good in an expected-value sense that will be undervalued by most people, and likely to be cringe if done by many people, or at least by the fraction who would try to cram propaganda down someone's throat.

"Have you heard the good word of effective altruism?" click

Maybe I'd point at an ethos more like 'find scenes you vibe in, befriend and converse with powerful people in those scenes, and they will naturally receive your values through osmosis'. 

Maybe most people here should just focus on engaging in forthright exchange with each other and with others they find in the world; this will naturally tend to exert a memetic gravity that pulls people in, rather than trying to push on others.

What should I ask Lewis Dartnell (author of 'The Knowledge' and 'Origins')?

Something like: What does he think about the utility of bunkers and/or Faraday Cages for resilience against GCRs/nukes?

An uncomfortable thought experiment for anti-speciesist non-vegans

"Considering your analogy, it is easy to buy clothes that didn't require slave labor"

Is this true? I have heard the claim 'there is more slavery going on than at any point in history', but I know very little about this or how it's defined. I would guess it's hard for me to avoid this if I'm going to a normal clothing store.

Help us make civilizational refuges happen

I volunteer my amateur enthusiasm to whoever works on this.

One thing I'm curious about is what percentage of worlds would require the bunker to be well-hidden in order to be useful, e.g. due to an all-out WW3 in which automated weapons seek out targets that would include bunkers. I am less sure about the size of the risk of Local Warlords In Your Area, though I will note that if the bunker is near a local population, it should be cooperative with nearby inhabitants rather than engage in the false individualist bias that is rampant in survivalist thought.

I think it would make sense to have multiple bunkers distributed across different geographies and suited to different GCRs, with some fraction of these bunkers kept very, very secret. But I strongly think a v1 (Version 1 / Vault 1) should not have that feature.

DonyChristie's Shortform

When I say this (in a highly compressed form, on the shortform where that's okay), it gets a bit downvoted; when Scott says it, or at least, says something highly similar to my intent, it gets highly upvoted.

Open Thread: Spring 2022

This looks like a great idea!
