Open Philanthropy has a nice write-up with a range of estimates for the (upper bound of the) compute required to train a human brain-level AGI. The author gives a 90% confidence rating to the conjecture that AGI will require no more than 10^21 flops (or 1,000 exaflops) to train.

My question for the EA Forum: how big do folks guess the training runs for the largest AIs are today? Extrapolating past trends, when would training runs reach 1,000 exaflops?
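For anyone who wants to plug in their own trend assumptions, here is a minimal sketch of the extrapolation arithmetic. The starting size, target threshold, and doubling time below are placeholders, not claims about actual trends.

```python
import math

# Minimal trend-extrapolation sketch. All numbers are placeholders:
# substitute your own estimate of today's largest training run, a target
# threshold, and an assumed doubling time for frontier training compute.
current_flop = 2e25        # placeholder for today's largest training run
target_flop = 1e27         # hypothetical target threshold
doubling_time_years = 0.5  # assumed doubling time (placeholder)

doublings_needed = math.log2(target_flop / current_flop)
years_until_target = doublings_needed * doubling_time_years
print(f"{doublings_needed:.1f} doublings ≈ {years_until_target:.1f} years")
# With these placeholder numbers: ~5.6 doublings ≈ 2.8 years
```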

Comments

how big do folks guess the training runs for the largest AIs are today?

Epoch AI estimates that the compute used in the final training run of GPT-4, the most compute-intensive model to date, was 2e25 FLOP (source).

2e25 FLOP is 2 * 10^25 FLOP. So, if this estimate is correct, then GPT-4 is already beyond Open Phil’s threshold of 10^21 FLOP. Am I wrong?

You appear to be comparing two different things here. GPT-4 was trained using around 2*10^25 FLOP (floating point operations, a total amount of computation). The Open Phil report estimates that no more than 10^21 FLOP/s (floating point operations per second, a rate) would be needed to run, not train, a model that matches the human brain.
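To make the unit distinction concrete, here is a small illustrative calculation. Pairing these two particular numbers is my own illustration, not something in the report: a rate in FLOP/s multiplied by a duration gives a total in FLOP, so the two figures are not directly comparable.

```python
# FLOP is a cumulative amount of computation; FLOP/s is a rate.
gpt4_training_flop = 2e25  # total compute of GPT-4's training run (Epoch AI estimate)
brain_flop_per_s = 1e21    # Open Phil upper-bound *rate* for matching the brain

# Hardware sustaining 1e21 FLOP/s would take this long to perform 2e25 FLOP:
seconds = gpt4_training_flop / brain_flop_per_s
print(f"{seconds:,.0f} s ≈ {seconds / 3600:.1f} hours")
# => 20,000 s ≈ 5.6 hours
```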

It’s possible I could have completely misread the Open Phil post.

Your question is using "flops" to mean FLOP/s in some places and FLOP in other places.

https://www.lesswrong.com/posts/XiKidK9kNvJHX9Yte/avoid-the-abbreviation-flops-use-flop-or-flop-s-instead

Thank you.