Depending on their attitudes towards questions like take-off speed, people argue that with the development of AGI we will face world GDP doubling within days/weeks/a few years (with the doubling time shrinking with each further doubling). Many people's timelines here seem to be quite broad, though expectations like "AGI within the next 2-3 decades is very likely" are quite common.
How the global order will change politically as well as economically over the next decades is an extensively discussed topic in both public discourse and academia, with many goals and forecasts set for years like 2050 or 2070 ("climate neutral by 2050", "China's economy in 30 years"). Yet AGI is barely mentioned in economics classes, political research papers and the like, despite its apparent potential to render such politics redundant and overturn any economic forecast. Even if AGI were significantly less powerful than we think, and there were just a 20% chance of it occurring in the next 3 decades, it should be the number one factor debated in every single argument on any economic/political topic with a medium-length scope. Why, do you think, is AGI comparatively so rarely a topic there?
My motivated reasoning would immediately come up with explanations along the lines of
- people in these disciplines are just not so much aware of AI developments
- any forecasts/plans made assuming short timelines and fast takeoff speeds are useless anyway, so it makes sense to just assume longer timelines
- Maybe I am just not noticing the omnipresence of AGI debate in economic/political long-term discourse
@1 seems unreasonable, because as soon as the first AI-economics people came up with these arguments, if the arguments were reasonable, they would become mainstream
@2 if that assumption were consciously made, I'd expect to hear it mentioned more often as a side note
@3 is hard to argue against, given it assumes I don't see the discourse. But I regularly engage with media/content from the UN on their SDGs, have taken some Economics/IR/Politics electives, try to be a somewhat informed citizen, and have friends studying these things, and I barely see AI suddenly speeding things up in any forecasts or discussions
Why might this be the case?
To me it seems like either mainstream academia, global institutions and public discourse are heavily missing something, or we tech/EA/AI people are overly biased about the actual relevance of our own field (I'm a CS student)?
1 seems the most plausible to me. Reasonable arguments might eventually become mainstream, but that doesn't mean they would do so immediately.
In particular (a) there may not be many AI-economics people, so the signal could get lost in the noise and (b) economics journals may tend to favour research that focuses on established topics or that uses clever methodology, rather than topics that are important/valuable.
Hi Charles, you seem to be putting a lot of weight on a short, quick note that I made as a comment on a comment on an EA Forum post, based on my personal experiences in an Econ department (I wasn't 'mentioning credentials', I was offering observations based on experience).
(You also included some fairly vague criticisms of my previous posts and comments that could be construed as rather ad hominem.)
You are correct that there are many subfields within Econ, some of which challenge standard models, and that Econ has some virtues that other social scienc...