“We want you to be able to throw out your user manual,” was one of the first things Dean Miles, Amazon’s director of product for smart vehicles, told me as we sat down to chat about Amazon’s big push to stick AI in cars at CES 2024. The idea was first billed to me as, essentially, “yo dawg, we put a chatbot in your car…” But this is less a means for your car to hallucinate bad poetry at you than Amazon’s attempt, alongside BMW, to make your car the most authoritative source about your car. I would be sold if it weren’t for the major hurdle of requiring always-on connectivity.
So yes, Amazon is helping BMW stick a large language model inside its cars. It’s being billed as a “voice assistant LLM,” which works with the existing BMW Personal Assistant to provide far more detailed and sophisticated responses to users’ questions. Amazon took a large language model and fed it the carmaker’s 300-plus-page instruction tomes.
Say you wanted to switch your car to power mode. If you never read the instructions, you could simply tell the car to make the change for you. In my demo inside a BMW X1 with an early version of the voice assistant active, you could also ask what the AI recommends for driving on Las Vegas Boulevard. In my case, the AI apparently wanted other cars on the Strip to hear my engine roar.
Miles pitched this feature as a way for drivers who have no real conception of their car’s features to become “power users,” AKA the kind of vehicle owner who can tell you the difference between when Dynamic Traction Control is on and when it’s not.
Amazon started trying to shove its Alexa assistant into BMWs back in 2022, though that partnership came at the very advent of the generative AI boom that would utterly consume Silicon Valley in 2023. The new feature is essentially billed as an extension of the BMW assistant, just powered by an LLM, and Amazon seems to think the assistant could eventually diagnose what may be going wrong with your car. That capability is still in development, but Miles said it may be able to interpret your car’s diagnostic details to answer more technical queries.
If you’ve owned a car, you’ve already had to flip through an esoteric owner’s manual to decipher your car’s pop-up diagnostics. My 2024 Hyundai Elantra might be sophisticated enough to tell me my tires are slightly deflated, but it’s not sophisticated enough to tell me what pressure I should be aiming for (it’s 33 PSI, in case you were curious). Unless you’ve read the user manual like you were cramming for the most important test of your life, you probably don’t know off the bat what all those warning lights lighting up your dashboard like a Christmas tree actually mean.
Beyond replacing your owner’s manual, the AI still has all the more inane qualities of generative AI. In my demo, there was a lag of a few seconds between asking a question and getting a response, and I’d imagine it’s longer if the car doesn’t hear you correctly. Miles said Amazon isn’t looking to put an easy-to-use distraction machine in cars (to add to the multitude of obvious distraction machines that are in-vehicle touchscreens). However, the company is still trying to nail down precise guardrails for what the AI will or won’t do.
But the AI won’t work off the car’s onboard computer. Currently, the AI voice assistant runs mostly in the cloud, and it requires users to have an internet connection in their vehicles. Some parts of the AI-enabled voice assistant will work without internet, but the most advanced features won’t be there unless you pay extra for a 5G connection in your car.
Amazon was also extremely reticent about what kind of AI it’s deploying in BMW’s cars. It’s Amazon’s own language model, according to Miles, but the company wouldn’t discuss the AI’s capabilities or how it compares with any other models. All the secrecy, plus the inherent limitations of cloud-connected AI, puts a damper on Amazon’s grand car concept, as much as I want a chatbot to finally tell me how to use my car’s wireless charging.