Microsoft Reportedly Runs GitHub's AI Copilot At A Loss

Analysis Microsoft is reportedly losing up to $80 a month per user on its GitHub Copilot services.

According to a Wall Street Journal report citing a "person familiar with the figures," Microsoft charges $10 a month for the service but loses $20 a month per user on average, with heavier users costing the company as much as $80 every 30 days.
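Taking those figures at face value, and assuming the reported losses are net of the $10 subscription fee (the report doesn't spell that out), the implied cost of actually serving each user works out like this:

```python
# Back-of-envelope reading of the WSJ figures. Assumes the reported losses are
# net of the $10/month subscription; that is an assumption, not something the
# report confirms.
subscription = 10   # $ per user per month charged for Copilot
avg_loss = 20       # $ per user per month reportedly lost on the average user
heavy_loss = 80     # $ per user per month reportedly lost on the heaviest users

print(f"Implied average cost to serve: ${subscription + avg_loss} per user per month")      # $30
print(f"Implied heavy-user cost to serve: ${subscription + heavy_loss} per user per month") # $90
```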

Made generally available in 2022, GitHub's Copilot employs OpenAI's large language models (LLMs) to assist programmers as they write and debug code in IDEs including Microsoft's own Visual Studio Code. Copilot essentially suggests source to drop into your projects as you type out comments, function definitions, and other lines of code. In March 2023 the platform got an upgrade to OpenAI's GPT-3.5 and GPT-4 models.
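To make that concrete, here is the sort of interaction Copilot is built around: the developer types a comment and a function signature, and the tool proposes a body. The completion shown below is hypothetical, written here purely for illustration; real suggestions vary with context and model version.

```python
from datetime import datetime

# The developer types the comment and signature below; the function body is the
# kind of completion Copilot might offer (hypothetical, for illustration only).

# Parse an ISO-8601 date string and return the day of the week.
def day_of_week(date_string: str) -> str:
    return datetime.fromisoformat(date_string).strftime("%A")

print(day_of_week("2023-10-11"))  # Wednesday
```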

We've asked Microsoft for comment on the cost of running these AI models; we'll let you know if we hear back.

Running products at a loss is a common tactic across the technology industry, with the aim of building a dedicated user base and increasing prices once users are hooked. Microsoft sells its Xbox games console line below cost and recoups that loss as players spend on software and other content.

The same logic could apply to AI — a market Microsoft is investing heavily in to secure a first-mover advantage.

It's no secret that the hardware used to train and run most LLMs is expensive. Nvidia’s H100 accelerators sell for around $30,000 apiece, and we’ve seen them priced at $40,000 on eBay.

Microsoft employs tens of thousands of Nvidia A100s and H100s. This AI hardware and the servers it lives in guzzle electricity, too.

It’s hard to calculate the cost of Copilot’s ongoing operations, though OpenAI CEO Sam Altman has stated GPT-4 — the most advanced version of the company's LLM — cost more than $100 million to train.

One way that Big Tech has tried to control the cost of AI is with custom accelerators, such as Google’s Tensor Processing Unit and Amazon’s Trainium and Inferentia silicon. Now, if a report last week is to be believed, Microsoft may be about to reveal its own custom AI accelerator.

OpenAI is rumored to be considering working on its own custom processor for its ML workloads, too.

Microsoft’s current generative AI workloads are running on GPUs, largely down to the latency and bandwidth requirements of these models, which make running them on CPUs impractical, Cambrian AI analyst Karl Freund told The Register.

As a result, these models benefit most from large quantities of high-bandwidth memory to hold all the model's parameters. For really large systems, such as OpenAI's 175-billion-parameter GPT-3 model, multiple GPUs may be required per instance.

But it's worth noting GitHub Copilot isn't a general-purpose chatbot like ChatGPT or Bing Chat: it does code. As we understand it, more specialized models can usually get away with fewer parameters, which means less memory and therefore fewer GPUs.
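Some rough numbers make the point. Assuming 16-bit weights (two bytes per parameter) and 80 GB of memory per card, roughly what an A100 or H100 offers, the GPU count for the weights alone shakes out as follows; the 12-billion-parameter comparison model is a hypothetical stand-in for a smaller, code-focused LLM:

```python
import math

# Rough sketch of the memory arithmetic described above. Assumes 16-bit weights
# (2 bytes per parameter) and 80 GB of usable memory per GPU; activations and
# KV cache are ignored, which only pushes the real requirement higher.
def gpus_needed(params_billions: float, gpu_memory_gb: float = 80.0) -> int:
    weights_gb = params_billions * 2.0   # ~2 GB per billion parameters at fp16
    return math.ceil(weights_gb / gpu_memory_gb)

print(gpus_needed(175))  # GPT-3 class: 350 GB of weights -> 5 GPUs per instance
print(gpus_needed(12))   # hypothetical smaller code model: 24 GB -> fits on one GPU
```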

As AI providers get a better grasp on the economies of scale involved, we could see higher prices for these features. Otter.ai, an AI-powered audio transcription service beloved by journalists and others, has raised prices and implemented new consumption limits on several occasions over the past few years.

Meanwhile, Microsoft and Google plan to charge a $30 per user per month premium on top of regular Microsoft 365 and Google Workspace subscriptions to unlock generative AI functionality.

It would not be surprising if Microsoft raised the price of GitHub Copilot once it has demonstrated its value to enough customers. After all, it's not a charity. ®
