Challenges Mount In New Autos

Electronics are becoming the primary differentiator for carmakers, adding an array of options that can alter everything from how a vehicle’s occupants interact with their surroundings to how the vehicle drives. But the infrastructure needed to support these features also raises a slew of technology and business questions for which there are no simple answers today.

For example, how will new technology function over long lifetimes and under extreme conditions? How will car makers achieve economies of scale if they have to design everything from scratch for a limited number of vehicles? And how will terabytes of data from sensors be managed and stored? Not least, will wariness of tech affect new car sales?

AI as driver
More intelligence inside a vehicle requires more algorithms that need updating to stay current with rapidly evolving regulations and protocols. Many AI/ML chips are designed at advanced process nodes for maximum data throughput, making them more susceptible to heat-, vibration-, and aging-related failures.

“We’ve known for several years that AI silicon, or AI-related silicon, is going to be where the vast majority of profits are going to be made across the entire vehicle within the next few years, and it might not be for the reason that one would expect,” said David Fritz, vice president of hybrid-physical and virtual systems at Siemens Digital Industries Software. “What we’re seeing now is AI in engine controllers and in mission controllers. We’re seeing multiple levels of AI doing some relatively simple things, like looking for somebody who left their backpack behind on public transportation, or somebody who didn’t put their seatbelt on. AI is popping up all over in transportation.

“This means it’s not just getting you from point A to point B, which remains a challenge,” said Fritz. “AI also is taking more and more steps toward intelligence in the vehicle – not necessarily for piloting the vehicle, but rather for knowing what’s happening in and around the vehicle and taking action accordingly. This is how people start getting used to the car being smarter. Losing the steering wheel is only a minor step in that direction. Instead of the consumer saying, ‘I don’t have a steering wheel anymore,’ they’re asking, ‘How do I know this car isn’t going to stop me in the middle of a field and say I’ve arrived?’”

This hasn’t deterred OEMs. Mercedes-Benz, BMW, VW, Audi, GM, Ford, and others are embracing the AI-powered cockpit. AI expands infotainment access, replaces hard keys with voice, touchscreen, and hand-gesture input, and improves navigation. Years ago, for example, Mercedes-Benz selected NVIDIA’s AI chip for its cockpit design, code-named MBUX (Mercedes-Benz User Experience). It included 3D touchscreen displays along with a voice-activated assistant triggered by the phrase, “Hey, Mercedes.” MBUX is still used in 20 models today.

The second-generation product provides even more functionality. The MBUX in the S-Class supports up to five displays, including three in the rear. (In China, executives typically ride in the back seat.) The 3D displays can be viewed without 3D glasses, and on-screen buttons appear to float in a spatial, mid-air view, making them easier to see. The “Hey Mercedes” voice assistant understands 27 languages with natural language understanding (NLU) capability. With multiple microphones, it also recognizes where in the cabin a voice originates. The MBUX Interior Assist responds to gestures. By using AI, 27 mechanical switches (hard keys) have been replaced with natural human input, such as touching, swiping, and hand gestures. BMW, GM, Ford, and others are all doing something similar.


Fig. 1: New digital automotive cockpit display is capable of 3D view without the need for 3D glasses. Source: Mercedes-Benz

Automakers also are starting to implement in-cabin monitoring capabilities, which will improve overall safety. By using data from multiple cameras, the AI cockpit can respond by sounding an alarm, for instance, when the driver is nodding off.

But at this early stage of using AI, carmakers are still experimenting. They are exploring the use of software to enhance vehicle functions and user experiences.

Less noise, fewer distractions
With the move to electric vehicles and AI tools, the environment in cars is changing. “As vehicles become quieter, road noise and the noise from other vehicles become much more pronounced to both the driver and the vehicle’s passengers,” said Dave Bell, product marketing director, Tensilica IP at Cadence. “AI is being used to cancel this external sound, using the infotainment system to keep the cabin area quiet for a more relaxed driving environment. Additionally, in-cabin noise can be locally suppressed to create ‘sound bubbles’ for the driver and select passengers through AI. So AI can be used to intelligently cancel and suppress noise, while still allowing and even enhancing or alerting the driver in the case of emergency vehicles or notifications that require the driver’s attention.”
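The external-noise cancellation Bell describes is typically built on adaptive filtering: a reference microphone picks up the noise, an adaptive filter learns the acoustic path into the cabin, and the predicted noise is subtracted. The sketch below is a minimal least-mean-squares (LMS) canceller on synthetic signals; the echo path, filter order, and step size are illustrative assumptions, not details of any production system.

```python
import numpy as np

def lms_noise_cancel(reference, primary, order=16, mu=0.01):
    """Adaptive LMS filter: predict the noise component of `primary`
    from the correlated `reference` signal and subtract it."""
    w = np.zeros(order)           # adaptive filter taps
    out = np.zeros(len(primary))  # residual (cleaned) signal
    for n in range(order, len(primary)):
        x = reference[n - order + 1 : n + 1][::-1]  # newest sample first
        y = w @ x                                   # predicted noise
        e = primary[n] - y                          # residual after cancellation
        w += 2 * mu * e * x                         # LMS tap update
        out[n] = e
    return out

# Synthetic demo: "road noise" leaks into the cabin mic via a short echo path.
rng = np.random.default_rng(0)
noise = rng.standard_normal(20_000)              # reference mic (outside)
path = np.array([0.6, 0.3, 0.1])                 # assumed acoustic path
cabin_noise = np.convolve(noise, path, mode="full")[: len(noise)]
residual = lms_noise_cancel(noise, cabin_noise)

print(np.mean(cabin_noise[1000:] ** 2))  # noise power before cancellation
print(np.mean(residual[1000:] ** 2))     # substantially lower after convergence
```

A production system adds secondary-path modeling and runs on a DSP, but the core loop, predict, subtract, update, is the same.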

AI also allows for minimizing the use of touch controls, which can distract drivers and wear out with age. “Gesture control is being integrated with the infotainment system based on mmWave radar sensors or camera input to replace or supplement traditional button or knob controls for things such as in-cabin climate and radio,” Bell noted. “Gesture recognition with AI is also useful for new and enhanced features for screen view manipulation, such as zoom and focus of a navigation map, or scrolling through a directory.”

Voice commands are big as well. “Voice commands are supported in a small percentage of vehicles already,” he said. “The trend for voice command support is to enhance it with additional AI-based processing for always-listening keyword detection (no push-buttons), and to include a richer on-device dictionary to allow local operation when connectivity is not present. Additional AI-based enhancements include noise suppression for the microphone input to pre-process the incoming speech, and beamforming to enhance speech capture of the speaker — which may include passengers as well as the driver.”
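The “always-listening” design Bell mentions usually works in stages: a cheap, always-on detector gates the audio so the expensive keyword model only runs when something is worth hearing. The sketch below shows that first stage as a simple per-frame energy gate; the frame sizes and threshold are illustrative assumptions.

```python
import numpy as np

def frames(signal, frame_len=400, hop=160):
    """Split audio into overlapping frames (25 ms windows, 10 ms hop at 16 kHz)."""
    n = 1 + max(0, (len(signal) - frame_len) // hop)
    return np.stack([signal[i * hop : i * hop + frame_len] for i in range(n)])

def voice_activity(signal, threshold=0.01):
    """Always-on stage: flag frames whose energy exceeds the threshold.
    Only flagged frames would be passed to the heavier keyword model."""
    energy = np.mean(frames(signal) ** 2, axis=1)
    return energy > threshold

# 1 s of near-silence followed by 1 s of loud "speech" at 16 kHz.
rng = np.random.default_rng(1)
audio = np.concatenate([0.001 * rng.standard_normal(16_000),
                        0.3 * rng.standard_normal(16_000)])
active = voice_activity(audio)
print(active[0], active[-1])   # quiet frame gated out, loud frame passed on
```

Keeping this gate tiny is what makes no-push-button operation affordable in a car's power budget.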

Such changes require more software, hardware, and sensors, while raising new challenges for the electronic supply chain.

Software manages everything from the entertainment systems to engine performance. It is not uncommon for users to take brand-new luxury vehicles back to the dealer to fix problems such as windshield wipers that refuse to operate because of a software glitch. Additionally, multiple electronic control units (ECUs) have to play well together.

Chip architectures
Automotive design complexity increases as vehicles become increasingly digitalized. It is not uncommon for luxury vehicles to have more than 150 million lines of code supporting 100+ ECUs. These ECUs, in turn, interface with many different types of sensors (cameras, radar, lidar, microphones, tire pressure) and control many engineering functions. Traditionally, China manufactured many of these automotive electronics. But with digitalization and the introduction of AI, many mobile and consumer chip manufacturers are now joining the competitive market. Current intelligent cockpit SoC providers include Qualcomm, Intel, NVIDIA, Huawei, AMD, NXP, STMicroelectronics, MediaTek, SemiDrive, HiSilicon Technologies, and Rockchip, with more to come.

For a long time, these SoCs operated in the range of 1 to 2 tera-operations per second (TOPS). Automotive chip architectures, however, are about to change, and those compute rates are likely to increase by at least a couple of orders of magnitude. Instead of using hundreds of ECUs, OEMs are consolidating around a few high-performance AI cockpit chips.
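A back-of-envelope calculation shows why the consolidated chips need so much headroom. The figures below are illustrative assumptions (camera count, resolution, and bit depth vary widely by vehicle), not the specification of any particular platform:

```python
# Rough pixel rate into a consolidated cockpit/ADAS SoC.
cameras = 8            # assumed: surround-view + driver-monitor cameras
width, height = 1920, 1080
fps = 30
bytes_per_pixel = 2    # assumed: 16-bit raw sensor output

bytes_per_sec = cameras * width * height * fps * bytes_per_pixel
print(f"{bytes_per_sec / 1e9:.1f} GB/s of raw camera data")  # → 1.0 GB/s
```

Even before radar, lidar, and audio are added, a gigabyte per second of raw pixels has to be moved, processed, and triaged continuously, which is what pushes requirements into the hundreds of TOPS.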

Intelligent cockpit SoCs process a massive amount of cockpit data. To handle video from multiple cameras, audio/image processing, and voice input/output to multiple HD and possibly 3D displays, these SoCs squeeze as many functions as possible into a single package. Additionally, the package has to handle wireless data connections. Typically, these SoCs include a GPU or data processing unit (DPU), a neural processing unit (NPU), a DSP, and I/O support. Here are a few examples of AI-based cockpit SoCs.

The MBUX design is based on NVIDIA’s DRIVE Orin SoC and the DRIVE IX software stack. Although NVIDIA has not disclosed additional product specifications, what’s known is that the Orin SoC can deliver 254 TOPS of compute performance. It supports autonomous driving capabilities at various levels, digital instrument clusters, and AI cockpits.

Qualcomm, traditionally a chip supplier in the mobile phone market, is now offering AI cockpit chips to many OEMs. Its SoC in development can deliver compute power on the order of 30 TOPS, a threefold increase over its previous generation, and will be available in 2023. This new SoC will be based on a 5nm process, while most AI cockpit chips still use 7nm and 8nm processes. The previous generation (SA8155P) consumed 7W. Even though the specification of the new SoC has not been finalized, it is safe to assume the power consumption will be higher.

NXP, a longtime supplier of automotive chips, is coming out with the i.MX 93, consisting of multicore Arm-based processors with integrated display, camera, and audio capabilities. It also supports multiple OSes (Linux, FreeRTOS, Green Hills, QNX, Microsoft Azure Sphere, and VxWorks). To move and process massive amounts of data in and out of cockpit chips, it is important to support a variety of OSes and protocols.

SoCs from NVIDIA deliver higher compute throughput (TOPS). But OEMs will make selection decisions based on functional requirements, power, performance, software support, and cost tradeoffs, not just TOPS.

Not all OEMs that want to design their own system with AI SoCs are well versed in automotive software. These vendors likely will rely on Tier 1 or Tier 2 suppliers for that expertise. For example, Lantronix provides a system-in-a-package (SiP) solution based on the Qualcomm AI chips. Additionally, these suppliers can provide OS support, including QNX, Green Hills Software, Android, and Linux.

With the availability of highly integrated AI-based cockpit chips from chipmakers, OEMs will be replacing multiple ECUs with a single high-performance SoC. There are many benefits to this approach. It will cut down the number of cable harnesses, simplify the manufacturing process, and reduce errors. In fact, some OEMs, such as Volkswagen, have already begun consolidating and centralizing their cockpit designs in anticipation of new cars to be introduced within the next six to eight years.

Another development in automotive design is the increasing role software plays. Volkswagen’s software group, CARIAD, is partnering with STMicroelectronics to define the future technology stack and SoCs. The company is also developing digital functions for the vehicle, including driver assistance systems and a standardized infotainment platform. Additionally, its future software functions will focus on linking powertrains, chassis, and charging technology. Volkswagen’s roadmap is a good indication of where the auto industry is heading.

Data management, privacy, security
However, the magnitude of all these changes is causing massive disruption among automakers, which several years ago were focused on improving gas mileage or performance in internal combustion engines. The fast-changing electronic industry is putting constant pressure on the relatively slow-changing automotive industry. Succeeding in the digitization era requires a change of mindset and adaptation in automotive design. Overcoming these challenges affects the entire supply chain.

Consider the growing problem of data management. Massive amounts of data need to be processed throughout vehicles, prioritized, and either stored, trashed, or sent to edge or cloud servers for further processing. But in the automotive world, this is compounded by the fact that different protocols may be required to move the data, and those protocols may depend on multiple operating systems. That significantly increases the complexity of the design, especially if OEMs are not familiar with automotive software. And then there is the question of who really owns that data.
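The triage described above, deciding per record whether data is stored, discarded, or sent upstream, can be sketched as a simple routing policy. The severity levels, sources, and thresholds below are hypothetical, chosen only to make the idea concrete:

```python
from dataclasses import dataclass

@dataclass
class SensorRecord:
    source: str      # e.g. "camera", "radar", "tpms"
    severity: int    # assumed scale: 0 = routine telemetry .. 3 = safety event
    size_bytes: int

def route(record: SensorRecord) -> str:
    """Toy triage policy: safety events go to the cloud, mid-priority
    data is kept in the on-board log, routine bulk data is dropped
    after local processing. All thresholds are illustrative."""
    if record.severity >= 2:
        return "upload"    # send to edge/cloud for further analysis
    if record.severity == 1 or record.size_bytes < 4096:
        return "store"     # small or mildly interesting: keep locally
    return "discard"       # already processed on-board, not retained

print(route(SensorRecord("camera", 3, 2_000_000)))  # upload
print(route(SensorRecord("tpms", 0, 128)))          # store
print(route(SensorRecord("camera", 0, 5_000_000)))  # discard
```

In a real vehicle this policy would be spread across zonal gateways and shaped by the ownership and privacy questions discussed next.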

“Data from the vehicle is going to be a key inflection point in how car manufacturers, service providers, and fleet owners use and potentially monetize the data that is collected from both inside and outside the vehicle,” said Robert Day, director of automotive partnerships, Automotive and IoT Line of Business, Arm. “This will affect how and where the data is stored, who owns the data, and ultimately who benefits from using the data (and how much they are willing to pay for it). This leads to discussions around data privacy, anonymization of the data, and how much we, the consumers or users of the vehicle, are prepared to have our vehicle journey data shared. This does lead to some design considerations for the OEMs, in that if they use a service provider to manage the user experience, does the user data naturally belong to the service provider rather than the OEM, and has the OEM then lost an opportunity to keep and potentially monetize their relationship with the consumer?”

Related to that are privacy and security issues. With wireless connectivity and potential software vulnerabilities, vehicles will be a target for hackers, and a successful attack could have very serious consequences. Therefore, automotive engineers not only have to manage software well, but also stay on top of security and constantly test for vulnerabilities.

“As the automobile becomes more intelligent, privacy and security concerns rise significantly,” said Bart Stevens, senior director of product marketing at Rambus. “The most important aspect is that the intelligent systems need to be able to execute their intended functions at all times. This needs to be safeguarded by security and safety mechanisms that are present in central gateways, as well as zonal gateways in the various domains of the vehicle.”

Implemented safety mechanisms must be certified against ISO 26262 ASIL safety integrity levels. Security mechanisms need to comply with the ISO 21434 cybersecurity standard and support industry-standard security functionality, such as AUTOSAR, SHE+, and EVITA, to enable security use cases such as secure boot, execution, debug, firmware updates, and communications.
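Secure boot, one of the use cases listed above, reduces to one check: refuse to run any firmware image whose signature does not verify. The sketch below is a deliberately simplified stand-in; production systems use asymmetric signatures anchored in a hardware root of trust, whereas this toy uses an HMAC with a hypothetical device-provisioned key.

```python
import hashlib
import hmac

DEVICE_KEY = b"provisioned-at-factory"  # hypothetical key material

def sign_firmware(image: bytes) -> bytes:
    """Stand-in for the signing step done by the OEM's release process."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def secure_boot(image: bytes, signature: bytes) -> bool:
    """Boot-ROM check: proceed only if the image matches its signature."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)  # constant-time compare

fw = b"\x7fELF...firmware-blob"
sig = sign_firmware(fw)
print(secure_boot(fw, sig))            # True: untampered image boots
print(secure_boot(fw + b"\x00", sig))  # False: modified image is rejected
```

The same verify-before-trust step repeats up the stack, from boot ROM to bootloader to application, forming the chain of trust the standards require.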

Cars also will rely increasingly on smart sensor systems as more autonomy is added into vehicles. These systems include, for example, proximity sensors, radar, lidar, and cameras that give the vehicle situational awareness of its surroundings. Often, the sensor systems are capable of local AI edge preprocessing.

“Reducing the heavy traffic flowing from AI sensors to centralized processors offloads the in-car networks,” said Stevens. “Yet these in-car networks carry high amounts of traffic. Handling this heavy traffic is causing reliable Automotive Ethernet to become mainstream. Having the ability to secure in-car communication calls for the support of a proven link security protocol such as IEEE 802.1AE-2018 MACsec.”

Security remains a big concern, and one that will continue to grow as cars become more connected. “While there are rarely easy answers to security problems, several security techniques can help a lot here,” said Steve Hanna, distinguished engineer at Infineon Technologies. “Isolating sensitive systems from external attacks with a security gateway is a first step toward reducing the risk of infection. Placing critical functions into dedicated security chips or security cores can protect them from vulnerabilities in the main code. And employing a secure development methodology throughout can make the whole system more robust.”
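The gateway isolation Hanna describes often amounts to default-deny filtering between network domains: only an allowlist of message IDs may cross from the connected/infotainment side into the safety side. The sketch below illustrates the idea; the IDs, domains, and rules are invented for the example.

```python
# Hypothetical allowlist of CAN message IDs permitted to cross from the
# infotainment domain into the safety domain (e.g. HVAC setpoint, chime).
ALLOWED_TO_SAFETY = {0x1A0, 0x1A1}

def gateway_forward(msg_id: int, payload: bytes) -> bool:
    """Return True if the frame is forwarded, False if dropped."""
    if msg_id not in ALLOWED_TO_SAFETY:
        return False        # default deny: unknown IDs never cross domains
    if len(payload) > 8:
        return False        # classic CAN frames carry at most 8 data bytes
    return True

print(gateway_forward(0x1A0, b"\x16"))      # True: allow-listed frame passes
print(gateway_forward(0x0F0, b"\x00" * 8))  # False: non-listed ID is blocked
```

Real gateways add rate limiting, plausibility checks, and logging, but the default-deny boundary is the first line of defense against a compromised infotainment stack reaching the brakes.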

Conclusion
The automotive industry’s future is full of opportunities and challenges, and no doubt AI will be front and center.

“On the way to full automation, the automotive industry will take many turns,” said Ron DiGiuseppe, automotive IP and subsystem segment manager in Synopsys’ Automotive Group. “AI will play a major role in future automotive designs and SoCs. AI is used in driver monitoring, including driver drowsiness detection, as well as voice input using natural language processing. Additionally, AI is widely used in Level 2 autonomy ADAS applications, such as automatic emergency braking (AEB), lane-keep aid (LKA), and lane centering.

“As the industry targets higher levels of autonomy such as Level 3 and Level 4 vehicles, the complexity, including integration of new sensors, will continue. Beyond ADAS and cockpit applications, AI will be used in EV powertrain and battery management/control, and in preventive maintenance.”

Designing chips used in those systems will require a combination of both extreme performance and low power. But they likely will lead to more consolidation of functionality, which will help. “SoCs will be used to process a massive amount of data received from sensors and other ECUs, plus the intensive AI computation,” DiGiuseppe said. “OEMs need high-performance SoCs with 1,000 TOPS to replace multiple ECUs. There will be a consolidation of ECUs in the next few years. This will simplify design with fewer ECU suppliers to manage. While it is a challenge to implement such high performance in SoCs, leading-edge customers are targeting a higher goal for systems in planning with up to 2,000 TOPS performance.”
