
Tom Henry

29 karma · Joined Dec 2022

Comments (9)

"evidence so far suggests that China might actually be a better steward of our global safety than the US is being"

Here's a thought experiment: if we lived in China, could we suggest on a Chinese forum that 'the US might actually be a better steward of our global safety than China is being, at least in the domain of AI development'?

Could we have a discussion that was honest, free, and open with no fear of censorship or consequences?

Where are all the public discussions in China about how the CCP needs to be more responsible in how it uses AI, how perhaps it should stop collecting Uyghurs' biometric data, etc?

I'm not arguing for the reckless pursuit of AGI here, but please let's not have a warm and fuzzy view of the CCP as a steward of global safety.

I have a few questions and a lot of things that give me pause:

  1. Even assuming that the pursuers come to understand the risks - for example, that AGI may ultimately betray its users - why would that diminish its appeal? Some percentage of people have always been drawn to the pursuit of power without much concern for the potential risks.
  2. Why would leaders in China view AGI they controlled as a threat to their power? It seems that artificial intelligence is a key part of how the Chinese government currently preserves its power internally, and it's not a stretch to see how artificial intelligence could help massively in external power projection, as well as in economic growth.
  3. Why assume Chinese incompetence in the area of AI? China invests a lot of money into AI, uses it in almost all areas of society, and aims for global leadership in this area by 2030. China also has a large pool of AI researchers and engineers, a lot of data, and few data protections for individuals. Assuming incompetence is not only unwise, it disregards genuine Chinese achievements, and in some cases it's prejudiced. Do you really want to say that China does not perform innovative technology research?
  4. If China is genuinely struggling (economically, technologically, etc.), why would leaders abandon the pursuit of AGI? I would have thought the opposite. History suggests that countries which see themselves as having a narrow window of opportunity to achieve victory are the most dangerous. And fuzzy assumptions of benevolence are unwise: Xi Jinping has told the Chinese military to prepare for war, while overseeing one of the fastest military expansions in history, and he has consolidated authority around himself.
  5. Given the potential risks associated with the development of AGI, what approach do you recommend for slowing down its pursuit: a unilateral approach where countries like the US and UK take the initiative, or a multilateral approach where countries like China are included and formal agreements (including verification arrangements) are established? How would you establish trust while also preventing authoritarian regimes from gaining AGI supremacy? The article you linked mentions a lot of "maybes" - maybe China would not gain supremacy - but to be honest, given the high stakes, Western policymakers would want much higher confidence.

Reasons I'm suspicious of this:

  1. No clarity about:
    1. identity of the author
    2. identities of the team
  2. No evidence that the author/team have the expertise to:
    1. set up this community
    2. responsibly manage the funds
  3. In general, where large amounts of money are involved, it is better to work with well-established systems and people, as this carries less risk from "unknown unknowns".

I'm not saying this is a scam. The problem is that even well-intentioned ideas can go wrong if the people leading them lack the expertise to execute them.

I have personally influenced decisions involving billions of dollars before, and despite this, I don't think I would have anywhere near the expertise to lead this kind of organization. I certainly would not want to call it "my idea" - I would want to get a team of financial experts involved, so that it would truly be "our idea."

PS - you mentioned you wanted to raise funds of your own as well - was that for this idea?

https://forum.effectivealtruism.org/posts/jczTo4zp7voarkRHC/seeking-funding-for-a-question-and-answer-website-for-ea

If so - I think the idea is interesting, but I agree with some of the commenters there who pointed out that:

  1. existing Q&A setups can be adapted for new websites, so it is not necessary to build something from scratch
  2. existing websites and the current forum can already be used to address these issues
  3. it is difficult to justify a new platform given the community costs of setting one up
  4. it is important to try simple prototypes first to gauge interest.

The idea you are proposing in this post is different, but you would need to explain why the funding model should change. Just because getting funding is hard does not mean it should be easy! For example, for the project above, there seem to be good reasons why it would not be funded, even though it is an interesting idea.

Decentralization is one thing, and transparency is another. You mentioned:

I happen to be an extremely and obsessively privacy-oriented person (the type that always uses pseudonyms online) so I'm not really keen on putting my name and private details out here in public just yet - but this is not a burner account, it's my main and only EA forum account, just bearing a pseudonym. My intention is to fully "dox" myself to people who join the DAO. To them I will reveal my face and all my important personal details and even go through transparent KYC procedures if necessary.

It's fine to be concerned about privacy. But if someone wants to start or lead anything that involves $1 billion, it's reasonable to expect that people will know who they are, what their experience level is, what their track record is, etc. – and be able to verify these things for themselves. If you don't want to do that, fair enough; but in that case, don't start this yourself. Instead, persuade someone else who is prepared to be public to lead it.

Could it be beneficial to work with lawyers who are independent of the Effective Altruism movement, if that would give them more objectivity and insulate them from conflicts of interest or from attachment to particular ideas?

I'm concerned that this trend could dilute the significance of human rights.

If everything is considered a human rights issue, it may become challenging to focus on the most important issues. We need to carefully consider what falls under the umbrella of human rights and prioritise the protection of those rights.

First, it's important to consider what you really want to do. How much time do you have to explore different options and how much risk are you willing to take? It's also important to think about whether you need a stable income.

One resource that may be helpful is the 80,000 Hours career guide. It covers key ideas such as problem selection, contribution, personal fit, and career capital. You can find more information here: https://80000hours.org/key-ideas/ and https://80000hours.org/career-guide/job-satisfaction/.

Another thing to think about is Cal Newport's idea of "Lifestyle-Centric Career Planning." This involves determining the lifestyle you want and then working backwards to see how you can get there. Consider factors like your schedule, job intensity and prestige, social life, and work/leisure balance. And when you're looking at career opportunities, choose ones that align with your desired lifestyle instead of just going for the most prestigious or financially lucrative options. You can read more about this approach here: https://www.calnewport.com/blog/2008/05/21/the-most-important-piece-of-career-advice-you-probably-never-heard/.

(For example: If social interaction is important for you, don't pursue a job that involves working in isolation unless you have to. [This also goes for graduate studies.] If your idea of an ideal week is one with lots of meetings with people, coordinating and managing people and events, etc., then pursue a job with that instead. You might be interested in operations, e.g. https://80000hours.org/articles/operations-management/  )

It's also important to remember that your past academic history and current career situation don't define you. It sounds like you might be telling yourself some negative things, like "I barely escaped college with a 2.95 GPA" or "I probably should not have done engineering." But there are plenty of things you can do. It's usually not helpful to try to prove to yourself (or other people) that you're smart or capable, because it can lead to a focus on external validation rather than on impact and personal growth. This creates unnecessary stress and makes it harder to make good decisions. Sometimes people pursue graduate studies to do that, and it rarely turns out well.

If you are interested in a career direction, talk to people who are already working in that area. Set up brief calls with them and ask them about their day-to-day work. If the work and the people are compelling to you, then consider doing graduate studies if they are a prerequisite for that career direction.

How are you feeling about your research and life more generally? It's important to consider this, because it's easy for us to find all sorts of clever ways to procrastinate ;)

An idea: track your time. My system is just to make a note on a piece of paper whenever I switch to doing or thinking about something different. (For a simple digital version, see the sketch at the end of this comment.)

I found this both confronting and comforting:

(1) we have more time than we think

(2) time perception is shaped by the novelty of our experiences

(3) inefficiency can go on for a long time when we are on autopilot

(4) often time is not going where we think it is going*

(5) often it's possible to batch stuff and spend time in refreshing ways

(6) it makes us mindful of time

(7) it creates micro-friction for distractions.

*For example, I thought I was spending more than 30 minutes a day on particular tasks around the house. It turned out I was spending about 5-10 minutes on those tasks.
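
If a paper note feels like too much friction, here is a minimal digital version of the same idea, sketched in Python. To be clear, this is just an illustrative sketch, not something from any of the resources above; the file name `timelog.txt` and the tab-separated format are my own arbitrary choices. You run it with a short note each time you switch activities, and the gaps between consecutive timestamps tell you how long each activity lasted.

```python
#!/usr/bin/env python3
"""Minimal time log: run this with a short note each time you switch activities."""
import sys
from datetime import datetime
from pathlib import Path

LOG = Path("timelog.txt")  # arbitrary file name - change it to whatever you like


def log_switch(note: str) -> None:
    """Append an ISO-timestamped line for the activity you are switching to."""
    stamp = datetime.now().isoformat(timespec="seconds")
    with LOG.open("a", encoding="utf-8") as f:
        f.write(f"{stamp}\t{note}\n")


if __name__ == "__main__":
    # Everything after the script name is treated as the note.
    note = " ".join(sys.argv[1:]) or "(unlabelled)"
    log_switch(note)
    print(f"Logged: {note}")
```

For example, `python timelog.py "email"` followed an hour later by `python timelog.py "writing"` records that the email block lasted about an hour. (Point 7 above is worth keeping in mind, though: part of what makes paper work is the deliberate friction.)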