The car was very much the star of the annual Consumer Electronics Show (CES) this year, with car makers filling the halls of the Las Vegas Convention Center with cutting-edge automotive technology. 2019 is shaping up to be the year of the tangible. Complementing the proof-of-concept autonomous driving demos was a focus on in-car tech that will improve the user experience for drivers by the end of the year, whether that’s in-vehicle infotainment (IVI), advanced driver-assistance systems (ADAS) or simply intelligent climate control and a more comfortable interior enabled by technology.
Each of these technologies requires varying degrees of compute power while maintaining stringent power-efficiency and security standards, so having the right hardware under the hood is universally important. And with more than 85 per cent of infotainment systems and many under-the-hood applications powered by Arm-based chips, it’s little wonder that so many of the automotive innovations on display at CES 2019 rely on the Arm ecosystem.
We scoured the show floor in order to bring you the latest and greatest automotive technology with Arm at its core…
This restored 1969 Mustang has been modified with extra horsepower and modern technology powered by Cypress Semiconductor
Silicon designer Cypress Semiconductor reimagined the car of the future as a stunningly restored 1969 Ford Mustang – albeit one fully loaded with cutting-edge tech. The Manticore muscle car has been a labor of love for CEO Hassane El-Khoury for over a decade; get behind the wheel and you’ll find a fully digital instrument cluster and IVI, powered by the Arm Cortex-R5 based Cypress Traveo MCU. We particularly like the fingerprint-based ignition, built around the Cypress TrueTouch fingerprint reader, another Arm-powered component. Sadly, we can’t lay claim to our favorite component: the 435bhp 5-liter supercharged Coyote engine.
Automotive computer vision technology may be evolving at lightning speed, but that matters little to those of us who can’t afford to replace our cars on a regular basis. Arm partner Mapbox seeks to solve this problem by turning the world’s most ubiquitous device, the smartphone, into a second pair of eyes for your car. The Mapbox Vision SDK offers developers the ability to marry augmented reality with object detection and semantic road scene analysis, adding crucial context to the heads-up navigation experience. Mapbox employs Arm’s Project Trillium AI platform, making use of onboard CPUs, GPUs and AI chips (if available) to perform highly efficient neural processing on Arm devices without the need to upload and process video in the cloud.
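Running the neural networks locally, on whatever silicon the phone happens to have, is the key to keeping latency low and video off the cloud. A minimal sketch of that idea, runtime back-end selection with a CPU fallback, might look something like the following; the probe functions and back-end names are illustrative assumptions rather than the Mapbox or Project Trillium API:

```cpp
#include <iostream>

// Hypothetical compute back-ends an on-device inference runtime might target.
enum class Backend { NpuInt8, GpuFp16, CpuNeon };

// Placeholder capability probes -- on a real device these would query the
// platform for a dedicated AI accelerator or a usable GPU driver.
bool hasNpu() { return false; }   // assume no dedicated AI chip on this phone
bool hasGpu() { return true; }    // assume a capable mobile GPU is present

// Pick the most efficient back-end available, falling back to the CPU so the
// pipeline never needs to ship video frames to the cloud.
Backend selectBackend() {
    if (hasNpu()) return Backend::NpuInt8;
    if (hasGpu()) return Backend::GpuFp16;
    return Backend::CpuNeon;
}

int main() {
    switch (selectBackend()) {
        case Backend::NpuInt8: std::cout << "Running detection on the NPU\n"; break;
        case Backend::GpuFp16: std::cout << "Running detection on the GPU\n"; break;
        case Backend::CpuNeon: std::cout << "Running detection on the CPU\n"; break;
    }
}
```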
Computer vision company DeepScale unveiled Carver21, its first publicly available ADAS product. This artificial intelligence (AI)-driven perception software enables complex deep neural networks to run on automotive-grade processors by ‘squeezing’ their compute resource requirements down. The software brings together data from multiple sensors to help ADAS-equipped and autonomous vehicles at all levels of autonomy better understand the world around them. DeepScale technology is optimized for Arm-based vision processors, ensuring the smallest and most power-efficient physical implementation possible. Read the DeepScale case study for more information.
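DeepScale hasn’t published the details of its ‘squeezing’ here, but one widely used way to shrink a network’s compute and memory footprint for an embedded processor is post-training quantization: storing weights as 8-bit integers plus a scale factor instead of 32-bit floats. The sketch below illustrates that generic technique, not DeepScale’s implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

// Symmetric post-training quantization of one weight tensor: each float
// weight w is stored as round(w / scale) in an int8_t, where
// scale = max(|w|) / 127. This quarters the memory footprint and lets the
// hot loops run on fast integer arithmetic.
struct QuantizedTensor {
    std::vector<int8_t> data;
    float scale;
};

QuantizedTensor quantize(const std::vector<float>& weights) {
    float maxAbs = 0.0f;
    for (float w : weights) maxAbs = std::max(maxAbs, std::fabs(w));
    QuantizedTensor q{{}, maxAbs > 0.0f ? maxAbs / 127.0f : 1.0f};
    q.data.reserve(weights.size());
    for (float w : weights)
        q.data.push_back(static_cast<int8_t>(std::lround(w / q.scale)));
    return q;
}

int main() {
    QuantizedTensor q = quantize({0.42f, -1.3f, 0.07f, 0.9f});
    for (int8_t v : q.data)
        std::cout << int(v) * q.scale << " ";   // dequantized approximations
    std::cout << "\n";
}
```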
While DeepScale kept its eyes firmly on the road, Wind River turned the camera on the driver, demonstrating a real-time driver-monitoring solution at the show based on its Hypervisor and VxWorks® RTOS and Human Perception AI from Affectiva. The software, running on an Arm Cortex-powered Renesas R-Car H3 system-on-chip (SoC), can personalize the in-car experience based on the emotions and alertness of the driver. As well as improving the driver experience, the built-in Affectiva Human Perception AI also performs an important safety role, looking out for signs of driver anger, inattentiveness or drowsiness.
Embedded software provider Green Hills Software brought a variety of Arm-powered automotive demonstrations to the show. Its BlueBox Autonomous Racer pits a human driver against a computer-controlled opponent based on the company’s INTEGRITY® real-time operating system (RTOS) and running on the Arm-powered NXP BlueBox autonomous development platform. But while it may look like a video game, this goes far beyond the level of AI you’d find in Gran Turismo: powerful machine learning (ML) algorithms calculate tens of thousands of paths per second in order to choose the most efficient route with impressive accuracy, meaning this AI is fully capable of controlling an autonomous vehicle in the real world.
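At its core, that kind of planner repeats a simple loop every control cycle: generate many candidate paths, score each against a cost function, and commit to the cheapest. The sketch below shows the pattern in miniature; the two-parameter candidates and the cost terms are illustrative assumptions, not Green Hills’ algorithm:

```cpp
#include <cmath>
#include <iostream>
#include <limits>

// A candidate path, reduced here to two knobs a racing planner might vary.
struct Candidate {
    double lateralOffset;   // metres from the racing line
    double targetSpeed;     // m/s through the next segment
};

// Illustrative cost: stay near the racing line, go as fast as grip allows.
// Real planners add many more terms (collision risk, curvature, comfort...).
double cost(const Candidate& c, double gripLimitSpeed) {
    double offCentre = std::fabs(c.lateralOffset);
    double speedLoss = gripLimitSpeed - c.targetSpeed;
    double slipPenalty = c.targetSpeed > gripLimitSpeed ? 1e6 : 0.0;  // infeasible
    return 2.0 * offCentre + speedLoss + slipPenalty;
}

// Sample a grid of candidates and keep the cheapest -- the same select-the-best
// loop, run tens of thousands of times per second, is what steers the car.
Candidate plan(double gripLimitSpeed) {
    Candidate best{};
    double bestCost = std::numeric_limits<double>::infinity();
    for (double offset = -3.0; offset <= 3.0; offset += 0.25) {
        for (double speed = 10.0; speed <= 60.0; speed += 1.0) {
            Candidate c{offset, speed};
            double k = cost(c, gripLimitSpeed);
            if (k < bestCost) { bestCost = k; best = c; }
        }
    }
    return best;
}

int main() {
    Candidate c = plan(42.0);   // grip-limited to 42 m/s in this corner
    std::cout << "offset " << c.lateralOffset << " m, speed " << c.targetSpeed << " m/s\n";
}
```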
BlackBerry-owned RTOS developer QNX chose to showcase its autonomous and ADAS development platform, which is being developed using a Karma Revero luxury sedan. Powering it all is a Renesas R-Car V3H SoC, featuring six Arm Cortex cores. QNX told us it’s currently using only 30 per cent of the system’s maximum compute performance to provide the ADAS functionality.
Itself a cutting-edge marvel, the 2019 Revero on display features both current and proof-of-concept QNX tech such as the IVI system, digital cluster and in-vehicle acoustics.
For years, a car’s dashboard has stuck to a familiar, detached format — the instrument cluster behind the steering wheel, infotainment in the center console and so on. This week, embedded automotive software developer OpenSynergy revealed a far more fluid future, demonstrating how its COQOS Hypervisor platform can be used to design and power flexible, fully digital cockpits that move and evolve based on the applications in use at the time. The hypervisor allows the instrument cluster to be displayed safely, isolated from any issues in other systems and rebooted quickly in the case of a problem.
One demonstration showed the instrument cluster tachometer sliding out of the way to reveal a 3D-rendered map supplied by the navigation system running in a different operating system. Another showed content from a 3D game running on a separate development board being rendered on the IVI system. The future of in-car experiences seems a lot more exciting.
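The safety story rests on a familiar hypervisor pattern: each cockpit function runs in its own isolated guest, a supervisor watches for heartbeats, and only a misbehaving guest gets restarted while the safety-critical cluster keeps drawing. The sketch below illustrates that generic watchdog pattern; the guest handles and restart call are hypothetical stand-ins, not the COQOS API:

```cpp
#include <chrono>
#include <iostream>
#include <map>
#include <string>
#include <thread>

using Clock = std::chrono::steady_clock;

// Hypothetical guest handles -- a real hypervisor exposes its own control API.
struct Guest {
    std::string name;
    bool safetyCritical;
    Clock::time_point lastHeartbeat;
};

// Restart only the misbehaving guest; the isolated, safety-critical cluster VM
// keeps drawing the speedometer while the IVI guest reboots.
void restartGuest(Guest& g) {
    std::cout << "restarting guest: " << g.name << "\n";
    g.lastHeartbeat = Clock::now();
}

int main() {
    std::map<std::string, Guest> guests = {
        {"cluster", {"cluster", true,  Clock::now()}},
        {"ivi",     {"ivi",     false, Clock::now()}},
    };
    const auto deadline = std::chrono::milliseconds(500);

    for (int tick = 0; tick < 3; ++tick) {              // a few watchdog cycles
        std::this_thread::sleep_for(std::chrono::milliseconds(200));
        guests["cluster"].lastHeartbeat = Clock::now(); // cluster keeps checking in
        // the IVI guest has "hung" in this scenario: no heartbeat update
        for (auto& kv : guests) {
            Guest& g = kv.second;
            if (Clock::now() - g.lastHeartbeat > deadline) restartGuest(g);
        }
    }
}
```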
The way to a car’s heart is through its OBD-II port, and Intrepid Control Systems’ neoOBD 2 PRO is all you need to build automotive IoT applications. Combining an Arm Cortex-M4 processor with Wi-Fi, BLE and SPP, the device enables prototyping and development of IoT-based functions such as vehicle tracking, IoT management and real-time driver analysis. The neoOBD 2 PRO also runs the Amazon FreeRTOS operating system, and Intrepid even demonstrated the system responding to Alexa skills from an Amazon Echo.
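To give a flavor of what such a dongle makes possible, the sketch below formats the standard OBD-II request for engine speed and decodes the reply. The mode, PID and scaling come from the SAE J1979 standard; the CAN send/receive helpers are hypothetical placeholders for the device’s own driver, with the reply simulated so the example runs on its own:

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical CAN helpers -- stand-ins for the OBD dongle's own driver API.
// canReceive simply fabricates an ECU reply so this sketch runs standalone.
struct CanFrame { uint32_t id; uint8_t data[8]; };

void canSend(const CanFrame& f) { (void)f; /* would transmit on the bus */ }

CanFrame canReceive() {
    // Simulated reply to "mode 01, PID 0x0C": 4 data bytes, A=0x1A, B=0xF8.
    return CanFrame{0x7E8, {0x04, 0x41, 0x0C, 0x1A, 0xF8, 0, 0, 0}};
}

// Standard OBD-II query for engine RPM (SAE J1979 mode 01, PID 0x0C),
// sent to the functional request ID 0x7DF.
double readEngineRpm() {
    CanFrame request{0x7DF, {0x02, 0x01, 0x0C, 0, 0, 0, 0, 0}};
    canSend(request);

    CanFrame reply = canReceive();
    uint8_t a = reply.data[3], b = reply.data[4];
    return (256.0 * a + b) / 4.0;   // J1979 scaling for PID 0x0C
}

int main() {
    std::cout << "engine speed: " << readEngineRpm() << " rpm\n";  // 1726 rpm
}
```

On the actual device the same decode logic would sit inside a FreeRTOS task polling the vehicle bus, which is what makes functions like real-time driver analysis possible.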
For more information on the Arm solutions driving these innovations, please visit our website.
[CTAToken URL="https://www.arm.com/solutions/automotive" target="_blank" text="Arm Automotive solutions" class="green"]