Last week’s CES electronics trade show in Las Vegas saw many announcements from autonomous vehicle (AV) technology developers. Along with car manufacturers presenting vehicle prototypes (e.g. Toyota) and digital moguls presenting their software platforms (e.g. Baidu), perhaps the most interesting announcements concerned dedicated processors, or more precisely systems-on-a-chip (SoCs), that can process all the data coming from cameras and other sensors. Manufacturers of such SoCs include automotive silicon incumbents such as Renesas, Dutch NXP and Texas Instruments, but also several chip developers that are relatively new to the automotive game: NVIDIA, Intel-owned Mobileye and Samsung.
What does this mean?
An autonomous vehicle needs eyes and ears (i.e. cameras and other sensors) and a brain to process and combine all the real-time and stored data. Chipmakers are now developing integrated systems-on-a-chip purpose-designed to process sensor data (e.g. image processing), as well as SoCs able to run the AI that actually drives the car. Interestingly, the newcomers to the automotive value chain (NVIDIA and Intel) are seeking to leapfrog to high levels of automation (e.g. NVIDIA says Level 2 automation will be ready next year), while the incumbents opt for a stepwise approach, gradually enabling assisted-driving features (e.g. lane keeping). This dynamic resembles the wider electrification trend, in which Tesla, as a new entrant, immediately went all-in on electrification and automation (with its Autopilot feature), while traditional car makers have taken a much more careful, stepwise approach.
As a result of new assisted-driving features and the broader trend of electrification, the market for automotive chips is expected to grow by almost 10% per year through 2024, probably offering more growth potential than other computing markets (e.g. smartphones). However, the automotive industry’s focus on safety and reliability requires highly integrated systems of hardware and software, making cooperation between developers at the various levels of the AV stack paramount. Because of this, openness seems to be the keyword in the development of SoCs and AIs for self-driving cars: other developers (including car manufacturers) are invited to build their own hardware and software systems on top of an SoC. At the same time, several (chip) developers are trying to develop a full-stack product on their own: Tesla is making its own SoC and Mobileye is working on a full suite of processors, AI and sensors.