Automobile manufacturers are constantly improving the design of cars, creating an experience that is safer and more comfortable with each new model. In recent years we have seen a definite move towards more technology in our cars, designed to make them easier to drive. This year’s CES was notable not only for the number of automobile manufacturers in attendance, but also for the fact that nearly all of them had a version of a self-driving car, prompting speculation that such cars will appear on our roads within the next five years. My first experience of a driverless car came while watching the movie Demolition Man back in 1993 (although I’m sure there are prior examples in sci-fi movies). The movie, set in 2032, also had surprisingly accurate depictions of an iPad and Skype, which can be seen in this video compilation. The only surprise is that the director was too conservative in his estimate of when these technologies would be developed!
Science fiction is always an interesting barometer for predicting the future, introducing concepts ahead of their time. It’s exciting to see these visions turn into reality, but there is also a more pressing need for their development. In August 2012, KPMG and the Center for Automotive Research published a comprehensive report, Self-Driving Cars: The Next Revolution. It included some powerful statistics on the dangers involved in driving a car:
“In 2010 there were approximately six million vehicle crashes, of which 93 percent were attributable to human error.”
There are therefore huge social benefits to making cars safer through the application of new technologies. As technology enthusiasts and engineers, we are generally more concerned with how to make the automotive experience safer. Much has been publicised about the safety benefits of autonomous vehicles, and the path to achieving them is via advanced driver-assistance systems (ADAS). ADAS is a combination of in-vehicle technologies designed to enhance vehicle systems for safety and better driving, as soshunarai explains. Features like cruise control, rear-view cameras and automated lighting have been standard options for many years now. Newer features include traffic warnings, lane keeping, blind-spot monitoring and automated braking. Consumer acceptance is generally slower than the pace of development, which may be why safety features take time to appear in consumer automobiles.
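To give a flavour of what a feature like lane keeping involves, here is a deliberately minimal sketch of a lane-departure check. Everything in it (the lane and vehicle widths, the warning margin, the idea of a single signed lateral offset from a camera pipeline) is an illustrative assumption, not how any production ADAS actually works:

```python
# Illustrative lane-departure check (hypothetical and simplified; real ADAS
# implementations fuse camera, radar and vehicle data with far more care).

LANE_HALF_WIDTH_M = 1.75    # assumed half-width of the lane
VEHICLE_HALF_WIDTH_M = 0.9  # assumed half-width of the vehicle
WARNING_MARGIN_M = 0.2      # warn before the wheel actually crosses the line

def lane_departure_warning(lateral_offset_m: float) -> bool:
    """Return True if the vehicle has drifted close enough to a lane
    boundary that the driver should be warned.

    lateral_offset_m: signed offset of the vehicle centre from the lane
    centre, as a camera-based lane-detection pipeline might estimate it.
    """
    clearance = LANE_HALF_WIDTH_M - VEHICLE_HALF_WIDTH_M - abs(lateral_offset_m)
    return clearance < WARNING_MARGIN_M

print(lane_departure_warning(0.0))  # centred in the lane -> False
print(lane_departure_warning(0.7))  # drifting towards the line -> True
```

The real engineering challenge, of course, is not this comparison but producing a trustworthy `lateral_offset_m` from camera data in real time, which is exactly the kind of workload the vision-processing hardware discussed below is built for.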
ADAS can be based upon different systems, including vision/camera, sensor technology, car data networks, vehicle-to-vehicle, or vehicle-to-infrastructure systems. The Connected Car of the future will increasingly utilise wireless networks to interact with other vehicles and the highway, providing extra safety and more up-to-date information to passengers.
One of the fascinating things about working at ARM is seeing how our partners develop the SoCs that make the quantum leap from concept to reality. The bottom line is that self-driving cars will be controlled by an SoC instead of a person. At Embedded World 2015, Xilinx announced the new UltraScale+ family of FPGAs, 3D ICs and MPSoCs, which includes the Zynq UltraScale+ MPSoC (multi-processor system-on-chip); you can find the full press release here. The Zynq UltraScale+ leads in the area of heterogeneous MPSoCs: its All Programmable UltraScale SoC architecture provides processor scalability from 32 to 64 bits with support for virtualization, a combination of soft and hard engines for real-time control, graphics/video processing, advanced power management, and technology enhancements that deliver multi-level security, safety and reliability. All of these improvements have powerful implications for next-generation driver assistance systems. You can see ARM’s philburr speaking to Larry Getman of Xilinx at Embedded World about the new release.
The new Zynq MPSoC has gone through a rigorous planning and validation process to ensure no compromises have been made on security, safety and reliability. The processing sub-system includes a dual-core ARM® Cortex®-R5 real-time processor for deterministic operation, providing the responsiveness, high throughput and low latency required for the highest levels of safety and reliability. A separate security unit enables military-class security solutions such as secure boot, key and vault management, and anti-tamper capabilities, which are standard requirements for machine-to-machine communication and industrial IoT applications. Every millisecond counts on the road, and ARM’s 16FF memory compilers generate fast cache instances that speed up access to the most critical data and allow the system to perform even faster. The CoreLink™ CCI-400 Cache Coherent Interconnect is also implemented in the SoC to provide fast communication across the chip and full cache coherency between the processor clusters. In addition, ARM CoreSight™ debug and trace technology was built into the chip to provide the on-chip visibility that enables fast diagnosis of bugs and performance analysis. Amongst other things, CoreSight helps the device meet the high quality standards required by ISO 26262.
There are significant enhancements over the previous-generation Zynq-7000, with performance increases and power savings. Along with the advantages of multiple processors, it also integrates the host controller, allowing it to become the primary computing system for driver-safety ECUs. The system intelligence of a more integrated vehicle ECU expands the functional safety capabilities that next-generation automobiles will be able to provide.
The new Zynq UltraScale+ MPSoCs deploy new UltraScale+ FPGA technologies that include enhancements to the DSP blocks and transceivers, plus new features such as UltraRAM memory and interconnect optimisation. This is in addition to providing an unprecedented level of heterogeneous multi-processing, deploying ‘the right engines for the right tasks’. At the centre of the processing subsystem is the 64-bit quad-core ARM Cortex-A53 processor, capable of hardware virtualization, asymmetric processing and full ARM TrustZone® support. The new MPSoC delivers approximately 4X system-level performance per watt relative to previous alternatives.
One of the key aspects of increasing driver safety is showing what is happening around the vehicle in real time using cameras. Displaying a 3D surround view with a ‘flying camera’ perspective requires efficient 3D graphics rendering. For complete graphics acceleration and video compression/decompression, the Zynq incorporates a dedicated ARM Mali™-400 MP graphics processor as well as an H.265 video codec unit, combined with support for DisplayPort, MIPI D-PHY and HDMI.
The combined power of the Xilinx FPGA with the Cortex-R5 and Cortex-A53 processors, along with optimized HW/SW partitioning, allows the Zynq to support features such as adaptive cruise control, forward collision warnings and autonomous braking for cyclists or pedestrians. Cyclists will be delighted to hear that car manufacturers keep them in mind when designing safety features! Finally, a dedicated platform and power management unit (PMU) has been added that supports system monitoring, system management and dynamic power gating of each of the processing engines.
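The decision logic behind a forward collision warning is often described in terms of time-to-collision (TTC): the gap to the object ahead divided by the closing speed. The following sketch is purely illustrative; the thresholds and the neat separation into "warn" and "brake" states are assumptions for the example, not values from any real ECU:

```python
# Hypothetical, simplified time-to-collision (TTC) check of the kind a
# forward-collision-warning / autonomous-braking feature might build on.
# Thresholds are illustrative assumptions, not real calibration values.

WARN_TTC_S = 2.5   # warn the driver below this many seconds to impact
BRAKE_TTC_S = 1.0  # trigger autonomous braking below this

def collision_response(gap_m: float, closing_speed_mps: float) -> str:
    """Classify the situation from the gap to the object ahead (e.g. a
    cyclist or pedestrian detected by radar/camera) and the closing speed."""
    if closing_speed_mps <= 0:        # not closing in: no risk
        return "none"
    ttc = gap_m / closing_speed_mps   # seconds until impact at current speed
    if ttc < BRAKE_TTC_S:
        return "brake"
    if ttc < WARN_TTC_S:
        return "warn"
    return "none"

print(collision_response(30.0, 5.0))  # TTC 6.0 s -> "none"
print(collision_response(10.0, 5.0))  # TTC 2.0 s -> "warn"
print(collision_response(4.0, 5.0))   # TTC 0.8 s -> "brake"
```

The hard real-time constraint is why this kind of decision belongs on a deterministic core like the Cortex-R5: the check itself is trivial, but it must run reliably within a tight deadline on every sensor frame.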
I will conclude with another statistic from the KPMG report mentioned above: “The economic impact of crashes is also significant. According to research from the American Automobile Association (AAA), traffic crashes cost Americans $299.5 billion annually”. Viewed from this perspective, the arrival of ADAS and autonomous cars can’t come quickly enough. Thankfully, the current rate of technology development makes it likely that they will appear much earlier than the year 2032 imagined in Demolition Man by the sci-fi directors of the 90s. In fact, the Xilinx Zynq is a major step in that direction, as it greatly increases the features that next-generation ADAS will provide.
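As a back-of-envelope illustration only (the two figures quoted from the report come from different underlying studies, so this is a rough order-of-magnitude estimate rather than a real statistic), combining them suggests an average cost of roughly $50,000 per crash:

```python
# Back-of-envelope arithmetic combining the two KPMG-cited figures.
annual_cost_usd = 299.5e9  # AAA estimate of annual US crash cost
annual_crashes = 6e6       # approximate US vehicle crashes in 2010

cost_per_crash = annual_cost_usd / annual_crashes
print(f"~${cost_per_crash:,.0f} per crash")  # roughly $50,000
```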
Read more about ARM's commitment to ADAS
For more information on the Zynq UltraScale+ MPSoC please visit the Xilinx website
I can put off a lot of new technology solutions using a variety of excuses (it's too expensive right now; it's too early and unproven, etc.) but the self-driving car I would buy in a heartbeat. Of course with my luck, mass adoption will come at a point where I no longer have a long Silicon Valley commute. In any case, innovate away, folks!!