Tuesday, April 14, 2026
CEO North America


AI open models have benefits. So why aren’t they more widely used?

in Technology

When grocery shoppers find a generic product that’s 90% as good as the brand name version but costs 87% less, they usually put it in their carts. 

But when it comes to large language models, most artificial intelligence users pick the more expensive option.

A new paper co-authored by Frank Nagle, a research scientist at the MIT Initiative on the Digital Economy, found that users largely opt for closed, proprietary AI inference models, namely those from OpenAI, Anthropic, and Google. Those models account for nearly 80% of all AI tokens that are processed on OpenRouter, the leading AI inference platform. In comparison, less-expensive open models from the likes of Meta, DeepSeek, and Mistral account for only 20% of AI tokens processed. (A token is a unit of input or output to an AI model, roughly equivalent to one word in a prompt to an AI chatbot.)

Open models achieve about 90% of the performance of closed models when they are released, but they can quickly close that gap — and the price of running inference is 87% less on open models. Nagle and co-author Daniel Yue at the Georgia Institute of Technology found that optimal reallocation of demand from closed to open models could cut average overall spending by more than 70%, saving the global AI economy about $25 billion annually. 

“The difference between benchmarks is small enough that most organizations don’t need to be paying six times as much just to get that little bit of performance improvement,” Nagle said. “They need to think about how to use the right tool for the right job instead of defaulting to what’s popular.”

Closed models versus open models

Much time and effort goes into training frontier AI models, the base models on which inference services are built. Architectures must be designed, trillion-token datasets curated, and processing units powerful enough to run continuously for months acquired.

To defray those costs, some companies keep their inference models proprietary, or closed. That means users must pay for the model and the underlying computing resources to access AI inference services. With a limited number of players in the market, high markups are all but inevitable.

Open models make one or more model components public, such as the model weights, source code, training data, or architecture. That openness means users can host and run models locally — even on a laptop, provided it’s powerful enough.

“This openness enables inference at only the cost of compute power,” Nagle and Yue write, “essentially making the software free and creating competitive pressure akin to commodity markets.”

Nagle likened the market of third parties building open AI inference models to the emergence of companies such as Red Hat, which offers software, training, and customer support atop the open-source Linux operating system. Users are able to reap the advantages of open source while vendors, with the advantage of economies of scale, assume the risk for issues like uptime and security.

Savings of 70% left on the table

The market for open models in AI inference isn’t quite as mature. 

From May to September 2025, Nagle and Yue observed daily token usage on the OpenRouter site, which captures approximately 1% of all global spending on AI model inference. The researchers noted that the site is popular because users can engage with multiple providers of model inference through a single interface.

“OpenRouter attracts the type of user who’s more likely to be willing to use open models, as the point is to be able to swap between models,” Nagle said. Even so, closed models still accounted for close to 80% of AI token usage over the five-month study period, as well as nearly 96% of the revenue that passed through OpenRouter. Yet closed models cost far more to run: $1.86 per million tokens, on average, compared with 23 cents per million tokens for open models — about 87% less for open models.
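The cited 87% figure follows directly from those two average prices. A quick back-of-the-envelope check (a sketch using the article's numbers, not the study's exact methodology):

```python
# Average inference prices cited in the article, dollars per million tokens.
closed_price = 1.86  # closed models (OpenAI, Anthropic, Google)
open_price = 0.23    # open models (Meta, DeepSeek, Mistral)

# Relative savings from choosing open models.
savings = 1 - open_price / closed_price
print(f"Open models cost {savings:.1%} less per token")  # → 87.6% less

# Equivalently, the closed-model markup over open models:
print(f"Closed/open price ratio: {closed_price / open_price:.1f}x")  # → 8.1x
```

The 87.6% result matches the article's rounded 87% figure.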

Next, Nagle and Yue combined OpenRouter usage data with model performance benchmark data from Artificial Analysis and LM Arena. The former compares models against multiple performance metrics; the latter crowdsources user preferences. Their analysis showed that open models averaged 89.6% of closed-model performance but were usually able to close the gap within 13 weeks of a closed model’s initial release. That figure had dropped from 27 weeks just one year prior.

Using that information, the researchers calculated that, by switching to open alternatives superior to the closed models they are currently using, OpenRouter users could reduce costs more than 70% while improving benchmark performance by more than 14%. They extrapolated the impact for all AI inference models, using a market size estimate of more than $35 billion from Menlo Ventures, determining that “optimal substitution to open models could save the AI industry approximately $25 billion annually.”
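The industry-wide extrapolation above is simple arithmetic; a minimal sketch using the article's figures (the $35 billion market size is Menlo Ventures' estimate, and 70% is the potential cost reduction from substitution):

```python
# Figures cited in the article; both are approximate lower bounds.
market_size = 35e9     # annual global spend on AI inference, dollars
cost_reduction = 0.70  # potential savings from optimal substitution

annual_savings = market_size * cost_reduction
print(f"Estimated annual savings: ${annual_savings / 1e9:.1f}B")  # → $24.5B
```

Since both inputs are described as "more than" their stated values, $24.5 billion is consistent with the paper's "approximately $25 billion annually."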

Open model benefits, both local and global

If users could spend less on AI inference models and see better performance, why aren’t they switching? Nagle said companies often have two sets of concerns: one valid, the other based on misconceptions.

Among the valid concerns is that there may be significant costs associated with switching to a new model. These costs are not accounted for in the estimated $25 billion in annual savings and differ greatly among users.

“People build a whole ecosystem around an existing closed model, refine it, and build more on top of it. It’s not going to be trivial to switch to an open model,” Nagle said. “In the long run it may be cheaper, but in the short run it will be costly.”

Further, there may be reliability, regulatory, or security concerns that are easier to assuage with closed models than open models. For example, DeepSeek (an organization that produces open models) is based in China, and as a research organization, it’s not subject to Chinese AI regulations for commercial products.

The misconceptions are based on perceptions of inferior performance, which the researchers demonstrate are inaccurate, and a fear that using open models means private data suddenly becomes public. “That’s not correct,” Nagle said. “Open models can be built and run within your own infrastructure. Your data’s never leaving your servers.” 

Nagle encouraged companies to periodically review their use of AI inference models the same way they reevaluate software and infrastructure investments. If closed models are in place, it’s worth considering whether there’s a more cost-effective way to meet the company’s needs. 

Open AI models matter for global economics and politics, not just corporate balance sheets, Nagle added. Nations without much to spend will turn to open frontier and inference models. If those models come from China alone, then America’s market influence may wane, as it has in Africa and Asia amid China’s infrastructure investments.

“The United States is investing in data centers but not data models,” Nagle said. “It may benefit the U.S. to invest in creating frontier models to compete with China.”

Read the full article by Brian Eastwood / MIT Sloan



