
The Unsustainable Economics of Generative AI: Finding a Path Forward


Let's get down to brass tacks: many companies are demoing and raising capital on capabilities they fundamentally cannot afford to operate. Why? Because we're in the early adoption stage of LLMs, and organizations are using these models at massive discounts from what they actually cost to run.

The Sobering Reality of AI Economics

What does it actually cost to operate LLMs? The numbers tell a stark story:

Putting aside the hype and bluster, OpenAI — as with all generative AI model developers — loses money on every single prompt and output. Its products do not scale like traditional software, in that the more users it gets, the more expensive its services are to run because its models are so compute-intensive.

The Financial Reality of Generative AI in 2024-2025

OpenAI
  • 2024 revenue: $4 billion
  • 2024 losses: $5 billion (after revenue), on $9 billion in total operating costs
  • Compute for training models alone: $3 billion
  • Compute for running models: $2 billion
  • Salary expenses: over $700 million (before stock compensation)

Anthropic
  • 2024 revenue: $918 million
  • 2024 losses: $5.6 billion
  • 60-75% of revenue came from API calls
  • Currently raising $2 billion at a $60 billion valuation

Perplexity
  • 2024 revenue: just over $56 million
  • 2024 losses: not profitable
  • Valued at $9 billion (late 2024)
  • Projecting $127 million revenue in 2025

Source: Ed Zitron's "Where's The Money?", with data compiled from reporting by The Information, The New York Times, and CNBC.

The Fundamental Question

IF YOUR BUSINESS IS NOT USING PROFITABLE TECH, HOW CAN YOUR BUSINESS BE PROFITABLE?

This question cuts to the heart of the current AI landscape. The rush to implement generative AI features has outpaced reasonable economic considerations. Companies are building business models on technology that fundamentally loses money with every transaction.

A Different Path Forward

If we want AI to be sustainable and profitable, we need to build it differently. This means developing high-performance, low-cost-of-operations technology that addresses:

  • How models access context and on-prem data: Context windows and retrieval mechanisms need fundamental redesign (a minimal sketch follows this list)
  • Systems design from first principles: Rather than scaling expensive architectures, rethinking how inference operates
  • Economic realities: Understanding the true cost structure of AI operations
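
To make the first of these points concrete, here is a minimal sketch of what a leaner context path can look like: embed on-prem documents once, retrieve only the few that are relevant to a query, and hand that small slice to a locally hosted small model instead of shipping an enormous context window to a metered API. This is an illustration under stated assumptions, not Corewood's implementation; embed_text and generate are hypothetical placeholders for whatever local embedding and small-model stack you actually run.

```python
# Minimal retrieval sketch over on-prem documents for a locally hosted
# small model. embed_text() and generate() are hypothetical placeholders;
# wire them to whatever local models you actually operate.
import numpy as np


def embed_text(text: str) -> np.ndarray:
    """Hypothetical local embedding call; assumed to return a unit-length vector."""
    raise NotImplementedError("plug in your local embedding model")


def generate(prompt: str) -> str:
    """Hypothetical local small-language-model inference call."""
    raise NotImplementedError("plug in your local SLM")


def build_index(docs: list[str]) -> np.ndarray:
    # Embed every document once, up front; this cost is paid a single time
    # and amortized across all future queries.
    return np.stack([embed_text(d) for d in docs])


def answer(query: str, docs: list[str], index: np.ndarray, k: int = 3) -> str:
    # Cosine similarity (dot product of unit vectors) against the index,
    # keeping only the k most relevant documents.
    scores = index @ embed_text(query)
    top = np.argsort(scores)[::-1][:k]
    context = "\n\n".join(docs[i] for i in top)
    # The model only ever sees a small, relevant slice of the data, which is
    # what keeps per-request compute, and therefore cost, bounded.
    return generate(f"Context:\n{context}\n\nQuestion: {query}\nAnswer:")
```

The design choice that matters here is that the expensive step (embedding the corpus) happens once, while the per-request work stays small and runs on hardware you control.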

The current usage pattern of AI fits the textbook definition of a bubble. It will pop. The only question is when.

Practical Steps Forward

What can forward-thinking organizations do now?

Start building AI systems you can afford to operate. "But we need to validate our approach..." I hear you say. If you can't rebuild your prototype into something economically viable, you're not validating; you're gambling.

There's a dangerous assumption that systems can be "fixed later" if the business succeeds. With generative AI, that assumption often fails: these models can do things you simply can't hire people to program efficiently, so there is no cheaper implementation to swap in later, and the cost structure that made the prototype possible is one few businesses can sustain.

The Outsourced Foundation Problem

Using vendor-provided generative AI for core features means your business model rests in someone else's hands. What happens when they inevitably need to become profitable? Your margins become their target.
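
The squeeze is easy to sketch. The figures below are purely illustrative assumptions, not any vendor's actual pricing; the point is how directly a provider's price change flows into your gross margin when you buy inference per request instead of owning it.

```python
# Illustrative margin-squeeze arithmetic; every figure here is an assumption.
def gross_margin(price: float, vendor_cost: float, other_cost: float) -> float:
    """Gross margin per request, as a fraction of what you charge."""
    return (price - vendor_cost - other_cost) / price


price = 0.010        # what you charge per request (assumed)
vendor_cost = 0.006  # what the model vendor charges you per request (assumed)
other_cost = 0.002   # your own serving and overhead per request (assumed)

print(f"today:             {gross_margin(price, vendor_cost, other_cost):.0%}")
# If the vendor raises prices 50% on its way to profitability, your margin
# does not merely shrink; in this toy example it goes negative.
print(f"after +50% vendor: {gross_margin(price, vendor_cost * 1.5, other_cost):.0%}")
```

With illustrative numbers like these, a 20% margin flips to a loss on every request the moment the vendor reprices.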

Think of it as hiring a Harvard-educated PhD to staff your support desk. They might do excellent work, but it's a fundamentally inefficient allocation of resources - unless perhaps your support desk is managing a nuclear power plant.

A Better Path: Custom, Small Language Model Training

I'm not here to burst bubbles or ruin parades. I'm offering a more sustainable path: custom, small language model training and operation with reimagined inference frameworks.

At Corewood, we've assembled a team with deep expertise across:

  • System design, cloud infrastructure, security, and identity
  • Engineering management and front-end/last-mile delivery
  • Backend engineering and framework optimization
  • Academic and applied AI, spanning LLMs, SLMs, and purpose-built models

We're not a one-trick pony: we're a team with the breadth of industry knowledge needed to help your organization navigate this bubble without getting caught in its inevitable burst.


Ready to build AI systems you can actually afford to operate? Let's talk about how Corewood can help.

Contact us today