Heramb Podar

1 karma · Joined Jun 2022


I definitely agree that the current silence makes a runaway, fast, dirty AI development scenario much more likely and the space much tenser.

Additionally, these labs may have thought of concerns from a business or at-scale research point of view that we haven't (this would really help an already strained, resource-scarce alignment field figure out what to prioritize!).


Ultimately, I think what is stopping these labs is PR risk and a fear of "tainting their own field."

Fantastic post! I am delighted to see someone take a crack at an impact analysis of something abstract and reliant on second-order ripple effects; this has been bothering me for a while.

For EAs starting out, there should be some focus on just doing good, not necessarily on aggressively optimizing for doing good better, especially if you don't yet have much credibility in the space.

Also, at the end of the day, EA is just a principle/value system that you can rely on in pretty much any career you end up in. The part about EA being a support system and a place to develop your values is often left out, and as a result a lot of excited early-stage EAs just want to "get into" EA or "get stuff" out of it.