Link: Technology Favours Tyranny – Yuval Noah Harari
Summary:
- Economic incentives favour making powerful AI.
- Making powerful AI is easier where you don’t respect liberty / privacy.
- We're developing a new "useless class": people who can't do anything better than technology can, and we don't know what to do with them.
Raw Notes:
- Democracy has been the exception, not the norm throughout history.
- The author thinks that there are certain technological conditions that promote democracy, and that the technology we're developing is moving us further away from those conditions.
- I'm not entirely sure what he means, but he gives the example of the "ordinary man" being praised in the 1940s: everyday steel, coal, farm, and military workers appeared on propaganda posters in the US, the USSR, and Europe. Today, "ordinary workers" are no longer celebrated; in his telling, they no longer have a future.
- He says ordinary people have political power (in democracies) but are less and less relevant economically. "Perhaps in the 21st century, populist revolts will be staged not against an economic elite that exploits people but against an economic elite that does not need them anymore."
- We can't keep people happy just by getting more and more economic growth, not if that growth is based on inventing more and more disruptive technologies.
- As technology continues to develop, new jobs will emerge quickly, but they will also disappear quickly. Ex: human annotators for AI models. People will need to retrain many, many times.
- How will this affect people's stress given how many people are already stressed?
- He thinks we'll have technology that can roughly tell what people are thinking and feeling. Ex: he says if you look at a picture and your blood pressure rises and your amygdala becomes more active, then the contents of that picture are making you angry. He says this type of technology could be used in autocratic states to suppress dissent.
- He also says that we already have technology that can manipulate our emotions better than our own families can. He says this might be used to convince people to buy into X product, Y politician, Z movement, etc.
- He says that in the past, democracies were better at using information than autocracies, because information was available to everyone and more minds could process it. But with AI, the group with the most powerful AI is the best at processing information.
- Ex: if an autocratic government orders all its citizens to take health tests and share their medical data with the government, that government would be much better at biomedical research than governments that respect individual privacy.
- He uses this to point out how the centralised way that autocracies gather data and make decisions lends itself to autocracies being the best at processing information in an age of AI.
- He mentions how reliance on technology has already started to erode skills. Ex: many people don't know how to find information without search engines. Ex: many people don't know how to navigate long distances without Google Maps.
- For next steps, he suggests that to control the development of technology like AI, we need to control the 'means of production.' In agrarian societies, the means of production was land, so we created property rights. In the 1900s, the means of production were factories and machines, so we protected those most heavily in wartime. In the 2000s, the means of production might be computer chips, data, etc., and we currently aren't good at controlling or regulating these.
Thanks for this - I appreciate summaries of relevant topics.
Some context for other readers: this is a summary of an article from October 2018 (part of a series that attempts to answer the question 'Is democracy dying?').