
justsaying

202 karma · Joined Dec 2022

Comments (25)

I too get extremely annoyed by some of the anti-power-differential rhetoric I see on the forum a lot, and I think any attempt to police that through norms or other means is an unacceptable violation of privacy. I agree this post has some elements of that, or at least is not clear about its motivations.

However, I think many of the specific things proposed here are justified on the basis of conflicts of interest rather than power differentials. I see no issue with dating co-workers, including ones more or less senior than you, but I think there is a real problem with dating someone in your chain of command, because then there are conflicts of interest at play. For the same reason, I have no issue if one partner funds another with their own money, but it seems unethical to date grantees if you are awarding grants on behalf of another person or organization without disclosing the relationship.

Would you agree that such a conflict of interest is a legitimate concern rather than an element of safety culture?

Yeah, I am also confused about this. I know a bunch of EA-affiliated people in group houses, and I've never heard any of them imply that the house is somehow endorsed or sponsored by CEA or anything of the sort. If they ever do call it "an EA house," everyone is clear that they just mean it's a house with people who are interested in EA, and I don't think there has been any misunderstanding around that. So what's the issue?

This "no dating funders/grantees" rule is a little strange to me as phrased, although I certainly strongly agree in the cases most people are imagining.

As phrased, it sounds like there is a problem with (for example) paying a girlfriend or boyfriend with your own funds to do an extended project. That is sort of weird and unusual, but what exactly is the problem with it? I think what this is getting at is that you shouldn't date a grantee you are deciding to pay with someone else's money or on behalf of a larger organization. Correct?

People have been arguing about religion for hundreds if not thousands of years. Maybe there has been progress and maybe there hasn't, but I'm not sure why you would think EA is particularly well positioned to make progress on either truth-finding or on convincing anyone of the truth. The sort of "fair trial" you propose sounds extremely alienating to religious people. Many religious people do not believe, for example, that religion should be subjected to rational debate and scientific inquiry. To them, it would be a little like a parent making a pro-and-con list about whether their particular baby is worthy of love based on the baby's particular characteristics. It wouldn't come across as giving the baby a "fair chance"; it would just come across as gross. Most adults have given the matter significant thought and come to a conclusion that works for them. I'm not religious myself, but I'm glad that EA is working on building common ground with people across different religions (EA for Christians/Jews/Muslims/etc.). This seems like it would burn those bridges to no good end.

What percentage chance would you estimate of a large-scale nuclear war conditional on the U.S. bombing a Chinese data center? What percentage of the risk from AGI do you think this strategy reduces?

I hope they do wake up to the danger, and I am all for trying to negotiate treaties! 
It's possible I am misinterpreting what EY means by "rogue data centers." To clarify, the specific thing I am calling insane is the idea that the U.S. or NATO should, under (almost) any circumstances, bomb data centers inside other nuclear powers.

I appreciate Eliezer's honesty and consistency in what he is calling for. This approach makes sense if you believe, as Eliezer does, that p(doom | business as usual) > 99%. Then it is worth massively increasing the risk of a nuclear war. If you believe, as I do and as most AI experts do, that p(doom | business as usual) < 20%, this plan is absolutely insane.

This line of thinking is becoming more and more common in EA. It is going to get us all killed if it gains any traction. No, the U.S. should not be willing to bomb Chinese data centers and risk a global nuclear war. No, repeatedly bombing China for pursuing something that is a central goal of the CCP, with dangers that are completely illegible to 90% of the population, is not a small, incremental risk of nuclear war on the scale of aiding Ukraine, as some other commenters are suggesting. This is insane.

By all means, I support efforts for international treaties. Bombing Chinese data centers is suicidal and we all know it. 

I say this all as someone who is genuinely frightened of AGI. It might well kill us, but not as quickly or surely as implementing this strategy will.


Edited to reflect that upon further thought, I probably do not support bombing the data centers of less powerful countries either.

Thank you, Peter. These are the things that initially attracted me to effective altruism, and I appreciate you articulating them so effectively. I will also say that these are ideas I admire you for clearly fostering, both through Rethink Priorities and your forecasting work.

Unfortunately, it seems to me that the first and third ideas are far less prominent features of EA than they used to be.

The first idea seems to me to be less prominent as a result of so many people believing in extremely high short-term catastrophic AI risk. This seems to have encouraged an attitude that animal welfare is trivial by comparison and that the welfare of humans in the far future is irrelevant (because if we don't solve AI risk, humans will go extinct within decades). Attitudes about animal welfare seem, in my opinion, to be compounded by the increasing influence of Eliezer, who does not believe that non-human animals (with the possible exception of chimps) are sentient.

The third idea also seems to be declining as a result of hard feelings related to internal culture warring. In my view, bickering about the integrity of various prominent figures, about the appropriate reaction to SBF, about whose fault SBF was, about how prevalent sexual assault is in EA, about how to respond to sexual assault in EA, about whether those responses are cultish or at least bigoted, etc., has just made the general epistemics a lot worse. I see these internal culture wars bleeding into cause areas and other ostensibly unrelated topics. People are frustrated with the community, and regardless of which side of these culture wars they are on, they are annoyed by the existence of the other side and frustrated that these seemingly fundamental issues of common decency are even up for discussion. It puts them in no mood to discuss malaria vaccines with curiosity.

I personally deactivated my real-name forum account and stopped participating in the in-person community and talking to people about EA. I still really, really value these three ideas and see pockets of the community that still embody them. I really hope the community once again embodies them the way I think it used to.

Yes, but I still think the vast majority have been in the Bay Area.

It probably has something to do with it, but lots of cities have very active in-person EA activity, and I have not heard anywhere near as many complaints about anywhere else as I have about the Bay Area.
