MAUI, HI – It’s time to turn the page on the software-defined vehicle – rapidly becoming basic table stakes in next-gen vehicle development – toward a new chapter: the AI-defined vehicle.
That’s probably the biggest takeaway from Qualcomm’s recent Snapdragon Symposium here, where the San Diego-based chipmaker unveiled its new Snapdragon Elite line of digital-cockpit and automated-driving processors, powerful enough to take advantage of what fast-developing artificial-intelligence technology can bring to the automobile.
Of course, it’s hard to say how quickly the industry will move toward SDVs now that many automakers are pulling back on planned battery-electric vehicles based on new platforms that were expected to lead the transition.
However, at least some suppliers say the shift will continue with or without a broad migration to BEVs, and for its part, Qualcomm predicts the movement remains on pace to create a market worth nearly $660 billion annually by 2030. The forecast includes automakers’ hardware and software demand rising to $248 billion per year from $87 billion today and a doubling in demand from suppliers to $411 billion.
The vision laid out here by Qualcomm and executives of other tech developers and enablers calls for the AI-driven SDV to lead a critical migration of computing away from the cloud and directly onboard the vehicle, as the industry moves toward its next stage – what some are calling the Intelligent SDV.
“The shift of AI processing toward the edge is happening,” declares Durga Malladi, senior vice president and general manager of tech planning for edge solutions at Qualcomm. “It is inevitable.”
From Cloud To Car
Two things are making this movement away from the cloud and toward so-called edge processing within the vehicle possible: rapid, seismic leaps in the efficiency and capability of AI models, and the increasingly powerful, energy-efficient computers needed to process the information.
Answering the call for greater computing capability is Qualcomm’s new Elite line of Snapdragon Cockpit and Snapdragon Drive platforms, set to launch on production vehicles in 2026, including upcoming models featuring Mercedes-Benz’s next-gen MB-OS centralized electronics architecture and operating system and vehicles from China’s Li Auto.
Current-generation Snapdragon chips also will underpin new Level 3 ADAS technology and central-compute architecture Qualcomm is developing with BMW that will be deployed in BMW’s Neue Klasse BEVs expected to roll out in 2026. That ADAS technology will be offered to other OEMs through Qualcomm as well.
These new top-of-the-line Elite processors are many times more powerful than current-generation Snapdragon platforms, already with a strong foothold in automotive. Qualcomm says that as of second-quarter 2024, it had a new business pipeline for current products totaling more than $45 billion.
Moving computing out of the cloud and onboard the vehicle as the AI-driven architecture takes hold will bring several advantages, experts here say, including the development of safe and reliable self-driving technology.
For automated driving, “you need more serious edge intelligence to map the environment in real time, predict the trajectory of every vehicle on the road and decide what action to take,” notes Nakul Duggal, group manager, automotive, industrial and cloud for Qualcomm Technologies.
And for cockpit operation, relying less on the cloud and more on the vehicle’s computers provides faster response and greater security and privacy. It also ratchets up the ability of the vehicle’s AI-based systems to adapt to the occupant’s preferences.
Qualcomm Snapdragon Elite Brings New Capabilities
In promoting the new Elite line of Snapdragon systems-on-a-chip for automotive, Qualcomm presents a future in which the onboard AI assistant better recognizes natural-speech commands, anticipates occupants’ needs, provides predictive maintenance alerts and does things like buy tickets to events and make reservations. The vehicle will be able to drop occupants off at their destination and then locate and drive to an open parking spot – and pay if necessary – all on its own. Not sure what that road sign you just passed said? The AI assistant will be able to fill you in. If the scene ahead would make a good picture, the virtual assistant can use the vehicle’s onboard cameras to take a digital photo.
The AI assistant also will have contextual awareness, meaning it might decide it’s better not to play sensitive messages if other occupants are in the vehicle.
“Automakers are looking for new ways to personalize the driving experience, improve automated driving features and deliver predictive maintenance notifications,” says Robert Boetticher, automotive and manufacturing global technology leader for Amazon Web Services. “Our customers want to use AI at the edge now, to enhance these experiences with custom solutions built on top of powerful models.”
Beyond virtual assistants, these new onboard computers will be there to support high-resolution 3D mapping, multiple infotainment screens, personalized audio zones that don’t interfere with what other passengers are listening to and sophisticated cabin-monitoring technology. They will be capable of fusing data from both the ADAS and infotainment systems to provide vehicle occupants with more granular information and intuitive driving assistance that acts more like a human would – guiding the vehicle around a known pothole on your daily commute, for instance.
And the human-machine interface promises to evolve as a result. With AI, the world is edging away from a tactile experience – such as pushing buttons on a screen to access data – to one where infotainment is voice-, video- and sensor- (lidar, radar, camera) driven, Malladi says.
“The bottom line is, the AI agent becomes the one starting point that puts it all together for you,” he says.
AI Landscape Evolving, Rapidly
AI interest was somewhat dormant in automotive until two years ago, when the release of ChatGPT caused a stir in the tech world.
“For a lot of us in the (chip) industry, we were working on AI for a long time,” Malladi says. “But for the rest of the world, it was an eye-opener. Everyone was talking about it.”
Within a year, he says, large-language-model technology took giant leaps in efficiency. While the model used to create ChatGPT had about 175 billion parameters (a measure of its complexity), model sizes have since shrunk considerably, Malladi says.
Large-language-model technology “went from 175 billion parameters to 8 billion in two years, and the quality has only increased,” he says. That translates into less required storage capacity, faster compute times, greater accuracy and fewer chances to introduce bugs into the code.
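As a rough illustration of what that parameter reduction means for onboard hardware, the weight storage a model needs scales directly with its parameter count. The sketch below assumes 16-bit weights (2 bytes per parameter); the figures are illustrative back-of-the-envelope numbers, not Qualcomm’s, and real in-vehicle deployments often quantize weights to 4 or 8 bits, shrinking the footprint further.

```python
# Back-of-the-envelope estimate of LLM weight-storage size.
# Assumes 2 bytes per parameter (16-bit weights); quantized
# deployments (8-bit or 4-bit) would be smaller still.

def model_size_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate weight storage in gigabytes for a model of the given size."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for n in (175, 8):
    print(f"{n}B parameters -> ~{model_size_gb(n):.0f} GB of weights")
# 175B parameters -> ~350 GB of weights
# 8B parameters  -> ~16 GB of weights
```

The gap between roughly 350 GB and 16 GB is the difference between a model that must live in a data center and one that can plausibly fit in a vehicle’s onboard memory.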
A new AI law is emerging as a result, Malladi adds, in that “the quality of AI per parameter is constantly increasing. It means that the same experience you could get from a (large, cloud-based) data center yesterday you (now) can bring into devices that you and I have.”
Making it all possible inside the vehicle are the new-generation chips now emerging that can easily handle the AI workload.
“We can run with the next generation up to 20 billion parameter models at the edge,” Duggal says, adding that compares to about 7 billion with the current-generation processors. “Everything is in your environment (and processed) locally. This is the big advancement that has happened with the latest AI.”
The chips are becoming more power-efficient as well, a key factor in the evolution toward BEVs, where automakers are looking to squeeze as much driving range as possible from their lithium-ion batteries.
The new Snapdragon platform is said to be 20 times more power-efficient than generating the same AI output from a data center today.
“The power draw of the devices we now use – that have the power of a mini-supercomputer of 25 years ago – is down to less than an LED lightbulb,” Malladi says.
Moving from the cloud to the edge onboard the vehicle also will save money – accessing data onboard the vehicle costs close to nothing – and unlike operating huge cloud servers, it puts no strain on the electrical grid, making it more environmentally favorable, Malladi points out.
Minimal reliance on vehicle-to-cloud computing also reduces the risk a cloud server won’t be available at a critical juncture, says Andrew Ng, founder and CEO of AI visual solutions provider Landing AI. “AI brings low latency, real-time processing, reduced bandwidth requirements and potentially advanced privacy and security,” he says.
It will take time for AI-based SDVs to begin penetrating the market in big numbers, and automakers will have to carefully determine what features customers will want and avoid packing vehicles with capabilities they won’t appreciate. But the general direction seems clear.
“Bringing AI in is no small task,” Duggal admits, but he says it offers limitless potential and notes nearly every major automaker has shown an interest in the new Snapdragon Elite platform that can help unleash the technology.
Sums up Qualcomm’s Anshuman Saxena, product management lead and business manager for automotive software and systems: “AI is definitely becoming a focal point for the whole industry – and for us too.”