When OpenAI announced its ChatGPT desktop app for macOS at its Spring Update event, there were questions about why it did not start with Windows. After all, Microsoft is OpenAI's main financial backer, and it would seem only natural for Windows to get earlier access to new ChatGPT features.

Now we know why. Microsoft's new Copilot+ PCs, announced at the Build conference, show a rich set of features deeply integrated into the operating system that seamlessly use AI models both on the device and in the cloud. OpenAI's ChatGPT desktop app, presently Mac-only, looks trivial in comparison.

Both Apple and Microsoft have been trying to integrate AI features into their applications but with different strategies. It is still too early to tell which roadmap will win. But for the time being, Microsoft seems to be way ahead.

On-device AI

Apple’s strategy has always been to create a very polished product with cutting-edge technology and lots of headroom for future applications and software features.

You can see this in the Vision Pro, the new iPad Pro M4, and even the MacBooks and iPhones it releases every year. New devices usually have much more memory and compute power than most people need for their daily use.

Gradually, Apple releases operating system updates and new features and software that take advantage of the accelerators and specialized hardware.

For example, when the M1 chip was announced, nobody was thinking about running large language models (LLMs) on Macs. But now, there is an entire library and toolset for optimizing models to run on Apple silicon, and the Apple research team regularly releases models that can run on its devices.
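
To make that concrete, here is a minimal sketch of on-device inference on Apple silicon using Apple's MLX framework via the mlx-lm toolkit, one of the libraries the above likely refers to. The model repository name is illustrative, not an endorsement of a specific model, and the exact API may differ slightly across mlx-lm versions.

```python
# Minimal sketch: running a quantized LLM locally on Apple silicon with the
# mlx-lm toolkit (pip install mlx-lm). The model repo name is illustrative;
# any MLX-converted model from the Hugging Face Hub should work similarly.
from mlx_lm import load, generate

# Download (or load from cache) a 4-bit quantized model converted for MLX.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

# Inference runs entirely on-device, using the Mac's unified memory and GPU.
prompt = "Summarize why on-device inference matters, in one sentence."
print(generate(model, tokenizer, prompt, max_tokens=128))
```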

However, the problem with Apple's approach is that on-device generative AI has not matured yet. The past year has seen impressive progress in the field, but on-device models are still not reliable enough to handle a wide range of tasks on their own. This is the gap the ChatGPT app for macOS will fill, until Apple either figures out how to run its own cloud-based models or on-device AI advances to the point where Apple won't need cloud-based models at all.

The result is a patchwork of different AI tools that don’t integrate seamlessly into the operating system.

Seamless AI

On the other hand, Microsoft focuses on delivering cutting-edge AI technologies and then figures out how to bring them as close to the user as possible.

First, its investment in and partnership with OpenAI put Microsoft in a unique position to integrate frontier models into its products.

But Microsoft has also cast a wider net by supporting open models such as Llama and Mistral. At the same time, it has started releasing its own small language models (SLMs), such as Phi and Orca.

At the World Economic Forum, Microsoft CEO Satya Nadella was asked whether his company had become too reliant on OpenAI. Here is how he responded:

“I feel we are very capable of controlling our own destiny… Our products are not about one model. We care about having the best frontier model, which happens to be GPT-4 today. But we also have Mixtral in Azure as a model as a service. We use Llama in places. We have Phi, which is the best SLM from Microsoft. So there is going to be diversity in capability and models that we will have, that we will invest in. But we will partner very very deeply with OpenAI.”

Finally, Microsoft reduced its dependence on OpenAI by creating a layer of abstraction through its Copilot brand. Users will interact with Copilot across Windows; behind the scenes, the assistant will route each task to the most suitable model.
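
To illustrate what such an abstraction layer amounts to, here is a deliberately simplified, hypothetical sketch of a router that picks a backend model per request. Every name and rule in it is invented for illustration; Microsoft's actual Copilot routing logic is not public.

```python
# Hypothetical sketch of a Copilot-style model router. All names and routing
# rules are invented for illustration; this is not Microsoft's implementation.
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    needs_frontier_reasoning: bool = False  # e.g. complex multi-step tasks
    offline: bool = False                   # no network, must stay on-device

def pick_backend(req: Request) -> str:
    """Choose a model tier for the request; callers never see the model name."""
    if req.offline:
        return "on-device-slm"    # a Phi-class small model running on the NPU
    if req.needs_frontier_reasoning:
        return "cloud-frontier"   # a GPT-4-class model hosted on Azure
    return "cloud-efficient"      # a cheaper hosted model (e.g. Mistral, Llama)

# The user-facing experience stays the same even if the models behind these
# tiers are swapped out later.
print(pick_backend(Request("Caption this audio", offline=True)))
```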

These capabilities were on full display at Microsoft Build, where the new Copilot+ PCs demonstrated a wide range of AI features, including image creation, live captions, productivity tools and, yes, the creepy Recall feature. Some of these features run on-device, some in the cloud, and some are split between the two.

The field will continue to evolve, especially as Microsoft doubles down on bringing advanced Arm-based chips to its laptops. The hardware will keep getting more powerful, on-device LLMs will keep getting more efficient, and Copilot's backend models will change without disrupting the user experience.

The demo was impressive enough that Stratechery’s Ben Thompson described it as “MacBook Air-esque” and “unlike Apple’s offering, actually meaningfully integrated with AI in a way that not only seems useful today, but also creates the foundation to be dramatically more useful as developers leverage Microsoft’s AI capabilities going forward.”

Microsoft’s Trojan horse?

After the announcement of the ChatGPT app for macOS, some users took to social media to mock Microsoft for investing $10 billion in OpenAI, only to see the first ChatGPT desktop app land on a Mac.

But in light of the Microsoft Build announcements, ChatGPT looks more like Microsoft's Trojan horse in the Apple ecosystem. ChatGPT runs on the Azure cloud, and deeper integration of ChatGPT into macOS and iOS will give Microsoft a stronger foothold in the Apple user experience. Satya Nadella seems to have won this round, but the race is not over. We'll have to see what Apple reveals at WWDC in June.
