Developing my worldview. Interested in meta-ethics, epistemics, psychology, AI safety, and AI strategy.
To me it seems there is, yes. For instance, see this Harvard professor and this Stanford professor talk about aliens.
Conditioning on an alien intelligence being responsible for the recent UFO/UAP discussions/evidence, they are more advanced than us. And if they are more advanced than us, they are most likely much more advanced than us (e.g. the difference between now and 1 AD on Earth is cosmologically very small, but technologically pretty big).
Would you say that incorporating the "advice" of one of my common sense moral intuitions (e.g. "you should be nice") can be considered part of a process called "better evaluating the EV"?
Yeah that makes sense
Re: 19, part of why I don't think about this much is because I assume that any alien intelligence is going to be much more technologically advanced than us, so there probably isn't much we could do if we don't like their motives.
7. Reminded me of Mark Xu's similar criticism.
Yeah, that's pretty much what I was imagining. Though I think the best insights will come from a more deliberate effort to seek out different worldviews, since the people with the most different worldviews probably aren't going to be in EA (far from it).
For instance political ideologies
How various prominent ideologies view the world, e.g. based on in-depth conversations