We estimate that, as of June 12, 2024, OpenAI has an annualized revenue (ARR) of:

  • $1.9B from ChatGPT Plus (7.7M global subscribers),
  • $714M from ChatGPT Enterprise (1.2M seats),
  • $510M from the API, and
  • $290M from ChatGPT Team (980k seats).
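As a quick sanity check, the four components above sum to roughly the $3.4B total that Sam Altman announced (all figures are the post's own estimates):

```python
# Sanity check: the four component ARR estimates (in $M) should sum to
# roughly the announced $3.4B total. Figures are the post's own.
components = {
    "ChatGPT Plus": 1_900,
    "ChatGPT Enterprise": 714,
    "API": 510,
    "ChatGPT Team": 290,
}
total = sum(components.values())
print(f"Total ARR ≈ ${total / 1000:.2f}B")  # → Total ARR ≈ $3.41B
```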

(Full report at app.futuresearch.ai/reports/3Li1; methods described at futuresearch.ai/openai-revenue-report.)

We looked into OpenAI's revenue because financial information should be a strong indicator of the business decisions they make in the coming months, and hence an indicator of their research priorities.

Our methods in brief: we searched exhaustively for public information on OpenAI's finances and filtered it down to reliable data points. From these, we selected the method of calculation that required the least inference of missing information.

To infer the missing information, we used standard forecasting techniques: Fermi estimates, and base rates / analogies.
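A Fermi estimate of this kind can be sketched in a few lines. The seat counts below are the post's own figures; the per-seat prices ($25/seat/month for Team on the annual plan, $20/month for Plus) are OpenAI's published list prices, which this sketch assumes the typical subscriber actually pays:

```python
def fermi_arr(seats: int, monthly_price: int) -> int:
    """Annualized revenue run rate: seats x monthly price x 12 months."""
    return seats * monthly_price * 12

# ChatGPT Team: 980k seats at the ~$25/seat/month annual-plan list price
team = fermi_arr(980_000, 25)     # ≈ $294M, close to the $290M estimate
# ChatGPT Plus: 7.7M subscribers at the $20/month list price
plus = fermi_arr(7_700_000, 20)   # ≈ $1.85B vs. the $1.9B estimate
print(f"Team ≈ ${team / 1e6:.0f}M, Plus ≈ ${plus / 1e9:.2f}B")
```

The small gaps between these naive products and the reported estimates reflect the corrections the full report makes (e.g. monthly vs. annual billing, regional pricing).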

We're fairly confident that the true values are close to what we report. We're still working on methods for assigning confidence intervals to the final answers given the confidence intervals of all the intermediate variables.
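One standard way to do this (not necessarily the method we'll settle on) is Monte Carlo propagation: sample each intermediate variable from a distribution reflecting its uncertainty, recompute the final answer per sample, and read the interval off the empirical distribution. The seat-count and price distributions below are hypothetical placeholders, not the report's actual inputs:

```python
import math
import random
import statistics

random.seed(0)  # reproducible sketch

def sample_arr() -> float:
    # Hypothetical uncertain intermediates; lognormal keeps them positive.
    seats = random.lognormvariate(math.log(1_000_000), 0.15)  # ~1M seats, ~15% spread
    price = random.lognormvariate(math.log(50), 0.10)         # ~$50/seat/mo, ~10% spread
    return seats * price * 12

draws = sorted(sample_arr() for _ in range(10_000))
low, high = draws[500], draws[9499]  # empirical 90% interval
median = statistics.median(draws)
print(f"ARR ≈ ${median / 1e6:.0f}M (90% interval ${low / 1e6:.0f}M to ${high / 1e6:.0f}M)")
```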

Inside the full report, you can see which of our estimates are most speculative: using the ratio of Enterprise seats to Team seats from comparable apps, inferring the US vs. non-US subscriber split across platforms from mobile-subscriber numbers, or inferring growth rates from just a few data points.

Overall, these numbers imply to us that:

  • Sam Altman's surprising claim of $3.4B ARR on June 12 seems quite plausible, despite skepticism people raised at the time.
  • Apps (consumer and enterprise) are much more important to OpenAI than the API.
  • Consumers are much more important to OpenAI than enterprises, as reflected in all their recent demos, but the enterprise growth rate is so high that this may change abruptly.
     

Comments

We looked into OpenAI's revenue because financial information should be a strong indicator of the business decisions they make in the coming months, and hence an indicator of their research priorities
 


Is this really true? I am quite surprised by this, given how much of the expected financial value of OpenAI (and the valuation of AI companies more generally) rests not on the next couple of months but on being at the frontier of a technology with enormous future potential.

 

Definitely. I think all of these contribute to their thinking: their current finances, the growth rates, and the expected value of future plans that don't generate any revenue today.

We estimate that

Point of clarification: it seems like FutureSearch is largely powered by calls to AI models. When you say "we", what do you mean? Has a human checked the entire reasoning process that led to the results you present here?

There were humans in the loop, yes.

Hi, thanks for this! Any idea how this compares to total costs?

Hi! We currently don't have a reliable estimate of the cost, but we might include it in the future.

I didn't check whether you addressed this, but an article from The Information claims that OpenAI's API ARR reached $1B as of March: https://www.theinformation.com/articles/a-peek-behind-openais-financials-dont-underestimate-china?rc=qcqkcj

A separate article from The Information claims that OpenAI receives $200M ARR as a cut of Microsoft's revenue from cloud services based on OpenAI models, which I'm not sure is included in your breakdown: https://www.theinformation.com/articles/openais-annualized-revenue-doubles-to-3-4-billion-since-late-2023?rc=qcqkcj

These articles are not public, though; they are behind a paywall.

The source for the $1B API revenue claim is given as "someone who viewed internal figures related to the business".

It's not completely implausible, but the implications for OpenAI's revenue growth curve would be a little surprising. 

We have fairly reliable numbers for ChatGPT Enterprise revenue (based on an official announcement of seats sold together with the price per seat quoted to someone who inquired) and ChatGPT Plus revenue (from e-receipt data) from the start of April; these sum to about $1.9B. It's reasonable to add another $300M to this to account for other smaller sources – early ChatGPT Team revenue, Azure (which we did indeed ignore), custom models.

So, with an extra $1B from the API on top of all that, we'd see only $200M revenue growth between the start of April and the middle of June, when it was announced as $3.4B – contrast with $1.2B between the start of January (December's ARR was $2B) and March (estimated $3.2B).
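Written out as a quick back-of-envelope check (all figures in $B, taken from this thread):

```python
# If The Information's $1B API figure were right, total ARR at the
# start of April would be the sum of the figures discussed above:
april_total = 1.9 + 0.3 + 1.0  # Plus+Enterprise, other smaller sources, claimed API
june_total = 3.4               # ARR announced June 12

growth_apr_jun = june_total - april_total  # only ~$0.2B over ~2.5 months
growth_jan_mar = 3.2 - 2.0                 # vs. $1.2B from January through March
print(f"Apr-Jun growth ≈ ${growth_apr_jun:.1f}B vs Jan-Mar ${growth_jan_mar:.1f}B")
```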
