DonyChristie

Comments

How might better collective decision-making backfire?

Malevolent Empowerment

Better collective decision-making could lead a group to cause more harm to the world than good by entering a valley of bad decision-making. This presumes that humans tend to have many bad effects on the world, and that naively empowering humans can make those effects worse.

E.g., a group with better epistemics and decision-making could decide to take more effective action against a hated outgroup. Or it could drive faster economic and technological growth, leading to more meat eating or more CO2 production.

Humans tend to commit their worst acts of violence when mobilized as a group that thinks it's doing something good. Helping a single group improve its epistemics and decision-making could therefore enable that group to commit greater atrocities, or amplify negative unintended side effects.

What are some potential coordination failures in our community?

Yeah, I tried contacting people about it, and it was pretty hard.

Progress Open Thread: December 2020

I'm jumping back into the assurance contract project (see here for previous discussion of a "Kickstarter for Inadequate Equilibria"). At this point, though, I feel the contributors to those threads missed a bunch of relevant detail on this topic; I should do a writeup, though I'm not sure yet what it would cover.

The long-term mission: Supply global public goods via dominant assurance contracts.
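For readers unfamiliar with the mechanism, here's a minimal sketch of the core idea, in the Tabarrok-style formulation: a dominant assurance contract is an assurance contract where, if the funding threshold isn't met, backers get their pledge back plus a bonus, which makes pledging a (weakly) dominant strategy. This is my own illustration, not the project's code; all names and numbers are hypothetical.

```python
# Minimal sketch of a dominant assurance contract (Tabarrok-style).
# Illustrative only; class/method names and figures are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DominantAssuranceContract:
    threshold: float          # total pledges needed to fund the public good
    refund_bonus: float       # per-backer bonus the entrepreneur pays on failure
    pledges: dict[str, float] = field(default_factory=dict)

    def pledge(self, backer: str, amount: float) -> None:
        self.pledges[backer] = self.pledges.get(backer, 0.0) + amount

    def settle(self) -> dict[str, float]:
        """Return each backer's cash payout when the contract closes.

        Success: pledges are spent on the good (backers get $0 back, plus
        the good itself). Failure: full refund plus a bonus, which is what
        makes pledging weakly dominate not pledging.
        """
        total = sum(self.pledges.values())
        if total >= self.threshold:
            return {backer: 0.0 for backer in self.pledges}
        return {backer: amount + self.refund_bonus
                for backer, amount in self.pledges.items()}

# Example: threshold not reached, so everyone is refunded with a bonus.
contract = DominantAssuranceContract(threshold=1000.0, refund_bonus=5.0)
contract.pledge("alice", 300.0)
contract.pledge("bob", 200.0)
print(contract.settle())  # {'alice': 305.0, 'bob': 205.0}
```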

I intend to post updates here regularly, probably monthly, though I can do so more frequently if people are interested.

A Case Study in Newtonian Ethics--Kindly Advise

I'm not as high on the social ladder as you'd think, though some of my perspective is probably colored by class views rubbing off on me from people around me. Technically, I have been sort of homeless for much of the past couple of years, and I actually think EAs should live in tent cities or off-grid villages. I've also briefly researched becoming a professional beggar in a very wealthy place such as Switzerland; this shifted into the idea of becoming a street performer, which didn't work out.

My perspective was heavily informed by a couple of experiences with people using body language to get very close to me and ask for a much larger amount of cash than I would otherwise give the median person. I also had someone yell at me angrily when I didn't say anything to them as they approached me at night. My experience of the reciprocity trick wasn't with someone homeless, but with people hawking their mixtape and "giving it away", even signing my name on it, only to take it back when I wouldn't give them "a donation". So I'm not lambasting the average beggar; it's just that we need to not let the most dark-triad people shake people's pockets and make it harder for more down-on-their-luck, earnest people to ask for help.

It was raining yesterday, and I offered $4 to someone huddling in a tunnel, but they didn't take it. I sometimes feel spontaneously generous when the spirit moves me.

Yesterday was novel; I imagine it gets old.

Inurement is the strongest factor here, I believe. Once you see an endless sea of people, it becomes overwhelming and demotivating, and you stop being as empathetic.

(Meta: I'm annoyed as heck by the upvotes/downvotes here coloring our discussion; is there a mod for hiding karma on the EA Forum, like there was for old LessWrong?)

DonyChristie's Shortform

What does it mean for a human to properly orient their life around the Singularity, to update on upcoming accelerating technological change?

This is a hard problem I've grappled with for years.

It's similar to another question I think about, but with regard to downsides: if you knew for a fact that Doom was coming, in the form of World War 3 or whatever global catastrophic risk (GCR) is strong enough to upset civilization, then what should you actually do? Drastic action would be required. Here, I think the solution is on the order of building an off-grid colony that can survive, assuming one can't prevent the Doom. It's still hard to act on that, though. What is it like to go against the grain in order to do that?

Linch's Shortform

I'm curious what it looks like to backchain from something so complex. I've tried it repeatedly in the past and feel like I failed.

vaidehi_agarwalla's Shortform

+1 to the math there. How does building an app compare to throwing more resources at finding better pre-existing apps?

I'll just add that I find it kind of annoying how the event app keeps getting switched up. I thought Grip was better than whatever was used recently for EAGxAsia_Pacific (Catalyst?).

Linch's Shortform

The biggest risk here, I believe, is anthropogenic: supervolcanoes could theoretically be weaponized.

DonyChristie's Shortform

Would you be interested in a video coworking group for EAs? A dedicated place where you can go to work for 4-8 hours a day and see familiar faces (vs. Focusmate, which is one hour, one-on-one, with different people). EAWork instead of WeWork.
