
What percentage of the ability required for human-like intelligence does current AI have? I've heard that some people in EA think it's already more than 50%, which shocked me a lot. I'm definitely an AI outsider, and I can't imagine how AI could carry out hard tasks (such as doing scientific research or social interaction). Most EAs claim that since AI has been developing at an exponential speed, it might reach superintelligence, but that isn't very persuasive to me, because AI now seems really far from human-like intelligence: though GPT beats us on some tests, it is very weak at certain types of tasks. I'd like to see more articles explaining why the best AI system today (maybe GPT) has the potential to become human-level AGI, and why fast computing plus algorithms would be able to do most things on Earth. Preferably with concrete examples, not just abstract algorithms.


2 Answers

It's looking highly likely that the current AI architecture paradigm (foundation models) basically just scales all the way to AGI. These things are "General Cognition Engines" (watching that linked video helped it click for me). Also consider multimodality: the same architecture can handle text, images, audio, video, sensor data, and robotics. Add in planners, plugins, and memory (the "System 2" to the foundation model's "System 1") and you have AGI. This will be much more evident with Google Gemini (currently in training).
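
To make the "System 1 / System 2" framing concrete, here is a minimal toy sketch of the kind of loop people mean when they talk about wrapping a planner, tools, and memory around a foundation model. Everything in it (`call_model`, `tool_lookup`, the canned plan) is a hypothetical stand-in, not any real API; it only illustrates the shape of the architecture.

```python
# Hedged sketch: a toy "System 2" loop wrapped around a "System 1" model.
# call_model, tool_lookup, and the memory list are placeholders standing in
# for a foundation model, plugins/tools, and memory in a real agent.

def call_model(prompt: str) -> str:
    """Stand-in for a foundation model (System 1): fast, single-pass output."""
    # A real system would call an LLM here; we return canned text instead.
    if prompt.startswith("PLAN:"):
        return "1. look up fact; 2. compute result; 3. answer"
    return f"answer based on: {prompt[:80]}"

def tool_lookup(query: str) -> str:
    """Stand-in for a plugin/tool call (e.g. search or a calculator)."""
    return f"[result for '{query}']"

def agent(task: str) -> str:
    """System 2: plan, call tools, accumulate memory, then answer."""
    memory: list[str] = []                       # scratchpad / long-term memory
    plan = call_model("PLAN: " + task)           # ask the model to decompose the task
    for step in plan.split(";"):                 # execute the plan one step at a time
        observation = tool_lookup(step.strip())  # route each step through a tool
        memory.append(observation)               # store intermediate results
    context = task + " | " + " ".join(memory)    # feed memory back to the model
    return call_model(context)                   # final single-pass answer

if __name__ == "__main__":
    print(agent("How far away is the Moon, in light-seconds?"))
```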

It seems like there is no "secret sauce" left: all that's needed is more compute and data (for which there aren't significant bottlenecks). More here.
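
As one concrete illustration of the "more compute and data" claim, here is a small sketch using the scaling-law fit from Hoffmann et al. (2022, "Chinchilla"), which predicts pre-training loss from parameter count N and token count D. The constants are the approximate published fits as I recall them, and the example (N, D) pairs are arbitrary; the point is only that predicted loss keeps falling smoothly as both are scaled up.

```python
# Hedged sketch: the Chinchilla-style scaling-law fit
#   L(N, D) = E + A / N**alpha + B / D**beta   (Hoffmann et al., 2022)
# Constants below are the approximate published values; the example
# (N, D) pairs are arbitrary and only meant to show the trend.

E, A, B = 1.69, 406.4, 410.7      # irreducible loss and fitted coefficients
ALPHA, BETA = 0.34, 0.28          # fitted exponents for parameters and data

def predicted_loss(n_params: float, n_tokens: float) -> float:
    """Predicted pre-training loss for a model of n_params trained on n_tokens."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

for n, d in [(1e9, 20e9), (70e9, 1.4e12), (1e12, 20e12)]:
    print(f"N={n:.0e} params, D={d:.0e} tokens -> predicted loss ~ {predicted_loss(n, d):.3f}")
```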

This is a big question, for which there are many benchmarks and prediction markets that hint at a potential answer. Here are two prediction markets worth looking at:

https://www.metaculus.com/questions/5121/date-of-first-agi-strong/

https://www.metaculus.com/questions/3479/date-weakly-general-ai-is-publicly-known/

Yes, I have read those, and I accept that lots of people believe human-level AGI will come within 20 years and that it's just a matter of time. But I don't know why people are so confident about this. Do people think the AI algorithms we have now are already good enough, in theory, to do most of the tasks on Earth, and that all we still need is faster computing?

jackchang110 · 1y
Or do people think the GPT systems we have now are already very close to AGI? If so, what are the supporting arguments? (I've read the Sparks of AGI paper by Microsoft Research.)