AI search’s high costs could be a vicious cycle as Big Tech eyes profitability

The data: The generative AI-powered search rivalry comes at a steep cost.

  • Training GPT-3, the AI model underlying ChatGPT, required 1,287 MWh of energy and contributed over 550 tons of CO2 emissions to the environment, per Wired.
  • For context, a typical car emits 4.6 tons of CO2 annually, so it would take one car nearly 120 years to match the emissions from training the model.
  • Powering search with generative AI uses at least four to five times more computing power than standard search, according to QScale cofounder Martin Bouchard. He says current data center infrastructure won’t be able to cope with the demand.
  • Integrating the technology into search has significant energy and emissions implications—ChatGPT has about 13 million users per day, according to UBS data. Microsoft Bing crunches half a billion searches daily and Google 8.5 billion.
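The figures in the bullets above can be sanity-checked with quick arithmetic. The sketch below uses only the numbers cited in the article; the variable names are my own.

```python
# All figures are from the article (Wired / UBS); names are illustrative.
training_emissions_tons = 550      # CO2 from training GPT-3, per Wired
car_emissions_tons_per_year = 4.6  # typical car's annual CO2 emissions

# How long one car would take to match the training run's emissions
years_to_match = training_emissions_tons / car_emissions_tons_per_year
print(f"{years_to_match:.0f} years")  # ≈ 120 years

# Relative scale of daily query volume, per the article's figures
chatgpt_users_per_day = 13_000_000
bing_searches_per_day = 500_000_000
google_searches_per_day = 8_500_000_000
ratio = google_searches_per_day / chatgpt_users_per_day
print(f"Google's daily searches ≈ {ratio:.0f}x ChatGPT's daily users")
```

The second ratio is the crux of the scaling concern: if generative answers cost four to five times as much compute per query, serving them at Google's volume multiplies an already large energy bill.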

Why it could backfire: Microsoft, Google, Baidu, and Opera are making AI-powered search available to consumers. The trouble is that the associated energy costs and carbon emissions add to the litany of generative AI's problems.

  • Widespread reports of AI chatbot errors and limitations mean companies will have to keep training new models and retraining existing ones.
  • With data centers already contributing 1% of the world's greenhouse gas emissions, according to the IEA, generative AI can be expected to intensify the political controversy around tech infrastructure expansion in Europe and elsewhere.
  • The technology could find itself in the crosshairs of a global energy crisis exacerbated by war and natural disasters and could contribute to cloud outages during heatwaves.

A rushed job: The steep environmental costs aren't inevitable. Making data centers and neural networks run more efficiently could reduce the fallout. The problem is that tech companies are rushing to deploy technology built on a weak foundation.

  • To ease the computational workload of Bard, Google is initially using a scaled-back version of its LaMDA AI model, which might have contributed to an error that cost the tech giant $100 billion in market value.
  • Constantly retraining models is expensive, which is likely the reason OpenAI has been operating a version of ChatGPT that uses data from 2021 and earlier.

The high compute and energy costs of the technology make profitability uncertain and could contribute to a vicious cycle for tech companies. Launching scaled-back systems to cut costs means the tech might not live up to the hype, undermining the consumer confidence these companies need to make it viable.

This article originally appeared in Insider Intelligence's Connectivity & Tech Briefing, a daily recap of top stories reshaping the technology industry.
