The automotive supplier ZF Friedrichshafen has significantly expanded its technology portfolio for autonomous driving and demonstrated its current capabilities at CES 2019 in Las Vegas. The ZF ProAI RoboThink, the most powerful AI-capable supercomputer designed to meet all requirements for automotive use, celebrated its world premiere. In combination with a comprehensive set of sensors, the ProAI platform can analyze and react to complex traffic situations in real time, enabling autonomous mobility-as-a-service applications, as the company demonstrated live in a ride-hailing application with a Robo-Shuttle. Fully networked system solutions based on the ZF cloud connect vehicles with the Internet of Things - and thus also with customer applications such as payment systems or ride-hailing offers.
An autonomous Robo-Taxi with ZF technology offered a preview of new forms of mobility. The cooperation with the French mobility provider Transdev, announced at CES, underscores the market readiness of the company's solutions and its willingness to shape the next generation of mobility.
"Today, we present the ZF ProAI RoboThink, the most powerful AI-enabled supercomputer currently available in the mobility industry," said Wolf-Henning Scheider, CEO of ZF Friedrichshafen. "With its unique concept in terms of flexibility, modularity, and scalability, our platform accelerates the development of new mobility concepts to move people and goods autonomously."
AI Platform RoboThink with up to 600 TOPS
A performance of up to 600 trillion operations per second (600 TOPS) puts the ZF ProAI RoboThink at the forefront of supercomputers that meet all automotive requirements. However, reaching 600 TOPS requires four interconnected units of the scalable, modular RoboThink platform, which can drive power consumption into the 800-1000 W region. The significant increase in computing power comes from the use of Nvidia's "Turing" dGPU, which supports Nvidia's "Xavier" SoC. Two Turing GPUs can also be found on the graphics chip manufacturer's Pegasus boards.
Thanks to this computing power, the control box can process the data stream from internal and external sensors, cloud-based input, and car-to-X communication in real time. This is enough to safely operate autonomous vehicles from level 4 upwards in public transport - a prerequisite for future application scenarios such as ride-hailing services, from designated areas with fixed routes such as company or campus grounds to significantly more complex public traffic environments.
As part of this development, ZF also presented its own software stack for new mobility concepts at CES. Together with the ProAI RoboThink and a comprehensive set of sensors, ZF can thus offer a fully integrated system for autonomous vehicles that can be adopted and used by new mobility providers.
The computing power of the ZF ProAI RoboThink, which is decisive for mobility-as-a-service solutions, and the open, flexible, modular, and scalable configurability of the platform as a whole free customers from predefined or closed hardware and software combinations. Other advantages include not only the freedom to use chipsets from different manufacturers, but also the ability to adapt the software to customers' individual needs. At CES, Scheider announced a cooperation with Xilinx, a manufacturer of programmable logic components and devices, regarding the ProAI platform. Initially, ZF will use Xilinx's Zynq UltraScale+ MPSoC platform for real-time aggregation, pre-processing, and distribution of data, as well as for compute acceleration. In addition, the Versal product family recently introduced by Xilinx is expected to replace the Zynq UltraScale+ MPSoCs in the medium term.
Nvidia Drive Autopilot First on ZF ProAI
The performance and flexibility of the ZF ProAI convinced Nvidia to choose ZF as one of its preferred partners for the introduction of the new Level 2+ Nvidia Drive AutoPilot. As ZF ProAI series production will begin within the next twelve months, it is the only automotive AI-enabled supercomputer capable of meeting Nvidia's ambitious schedule for the launch of the Drive AutoPilot from the outset. Scheider commented: "We currently have the advantage of offering an automotive supercomputer that will soon go into series production. Our open, flexible, modular, and scalable ZF ProAI product family enables individually tailored configurations for a variety of applications across all levels of automated driving."
e.GO People Mover Comes in 2019 and First Customer Announced
Another CES highlight from ZF is the e.GO People Mover, which is being developed together with the German start-up e.GO Mobile and distributed via the joint venture e.GO Moove. Production capacity in Germany is now being expanded with the goal of producing five-digit quantities as early as 2019.
The concept and availability of the e.GO People Mover have aroused great interest among mobility providers and cities worldwide. At CES, ZF and e.GO Moove also announced their first customer: Transdev, one of the leading international mobility providers with 11 million customers daily. ZF will work with Transdev to further expand its mobility-as-a-service business based on the e.GO People Mover.
Yann Leriche, CEO of Transdev North America and Head of Autonomous Transportation Systems, commented: "We believe that public transportation will be the first segment to develop truly autonomous services for the public. The cooperation with ZF and e.GO is an excellent opportunity to complement our existing mobility solutions with new autonomous vehicles in order to offer our customers even better solutions".
The current state of the art is demonstrated by the innovative ride-hailing vehicle presented at CES: a fully functional technology carrier that covers ZF's complete system approach for "Next Generation Mobility" solutions.
Without steering wheel or pedals, the vehicle combines sensor technology, computing power, networking, mechatronic actuators, and safety systems under one roof and thus provides all essential elements for autonomous ride-hailing. In addition to the ZF ProAI platform, the test vehicle integrates six radar and six lidar sensors as well as nine cameras.
In this way, ZF is lowering the entry threshold for new automotive customers who want to develop Robo-Taxis. "Already today, our system solutions for the ride-hailing van or the e.GO People Mover show how we are shaping the mobility of the next generation," Scheider explained. "Even more important is that the technologies used in the vehicles take the automotive industry a big step further on the road to clean, safe, and affordable individual and public mobility for all."
From 3D Full-Range Radar to Solid-State Lidar
ZF has developed its own set of environment sensors that enable automation. "As system architects for autonomous driving, we have developed a sensor set that equips cars with all the senses necessary to digitally perceive their environment," explained Torsten Gollewski, Head of Predevelopment at ZF and Managing Director of Zukunft Ventures GmbH. "It can determine and process the environmental data very accurately and redundantly in real time, which is essential for safe automated driving functions."
The ZF sensor set includes the latest generation of radar, cameras, lidar, and acoustic sensors, as well as - in the software area - tools and algorithms for detecting and classifying objects and controlling the vehicle, which are managed by the central computer ProAI. The entire architecture is designed for demanding automotive requirements, such as extreme temperatures and vibrations. The sensor systems are especially important for meeting future safety requirements (e.g., according to NCAP).
ZF's high-resolution full-range radar, installed at the front of the vehicle, offers outstanding detection performance in four dimensions: speed, distance, azimuth, and elevation. This powerful 77 GHz system is designed for the most advanced ADAS applications as well as automated and autonomous driving. Like other radar systems, it emits electromagnetic waves and determines the distance, angle, and speed of objects from the waves reflected off them. In addition, the high-resolution sensor can make precise height measurements and capture a three-dimensional image of the surroundings. This also works in bad weather, poor light, and poor visibility conditions.
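The basic physics behind this measurement can be sketched in a few lines. The following is a generic illustration of how any 77 GHz radar derives distance from round-trip time and relative speed from the Doppler shift of the reflected wave; it is not ZF's implementation, and the sample values are made up.

```python
# Generic radar equations, not ZF-specific: distance from round-trip
# time of the reflected wave, relative speed from its Doppler shift.

C = 299_792_458.0   # speed of light in m/s
F_CARRIER = 77e9    # 77 GHz automotive radar carrier frequency

def target_range(round_trip_time_s: float) -> float:
    """Distance to the target: the wave travels out and back,
    so the one-way distance is half the total path."""
    return C * round_trip_time_s / 2.0

def radial_velocity(doppler_shift_hz: float) -> float:
    """Relative (closing) speed from the Doppler shift of the echo."""
    return doppler_shift_hz * C / (2.0 * F_CARRIER)

# An echo arriving after 1 microsecond corresponds to ~150 m:
print(round(target_range(1e-6), 1))      # 149.9
# A +5 kHz Doppler shift at 77 GHz is roughly 9.7 m/s closing speed:
print(round(radial_velocity(5e3), 2))    # 9.73
```

In practice the round-trip time and Doppler shift are not measured directly but extracted from the frequency mix of transmitted and received chirps; the relationships above are what the signal processing ultimately solves for.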
Lidar sensors, based on lasers, together with software tools can also create an accurate 3D model of the vehicle environment. They can thus detect objects and free space - and with it complex traffic situations - very precisely under almost all lighting conditions. The new, high-resolution solid-state lidar sensors, which ZF is developing together with its subsidiary Ibeo, can also capture pedestrians and small obstacles three-dimensionally. This plays an important role for highly automated driving from level 3 upwards. The absence of moving components (solid state) makes this innovation much more robust than previous solutions. Thanks to their modular design with selectable field-of-view options, these lidar systems are suitable for a wide range of applications.
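How a lidar return turns into a point in a 3D model can be sketched with standard geometry: each laser pulse's time of flight gives a range, and the beam's azimuth and elevation angles place that range in Cartesian space. This is textbook lidar math, not the ZF/Ibeo solid-state implementation.

```python
# Generic lidar geometry: one laser return -> one (x, y, z) point.
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_point(time_of_flight_s, azimuth_deg, elevation_deg):
    """Convert a single lidar return into an (x, y, z) point in meters,
    with x pointing forward, y sideways, and z up."""
    r = C * time_of_flight_s / 2.0          # out-and-back distance
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A return after ~200 ns, straight ahead at road level: ~30 m in front.
x, y, z = lidar_point(200e-9, 0.0, 0.0)
print(round(x, 1), round(y, 1), round(z, 1))  # 30.0 0.0 0.0
```

Repeating this over thousands of beam directions per scan yields the point cloud from which objects and free space are segmented.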
The S-Cam4 from ZF represents the further development and expansion of the S-Cam portfolio. With a 100-degree viewing angle and a 1.7-megapixel image sensor with high dynamic range (HDR), the camera recognizes pedestrians and cyclists in urban environments. At the same time, it can be combined with longitudinal and lateral control algorithms for adaptive cruise control (ACC), automatic emergency braking (AEB), lane keeping assist (LKA), and other functions.
In addition, remote camera heads, which fit into very small housings, help to capture the entire immediate environment around a vehicle and stream it to the driver via video, or to categorize the existing objects. Up to twelve cameras can be combined to create a 360-degree image of the vehicle environment. Each remote camera has a sensor resolution of between 1.2 and 8 megapixels and a field of view of between 28 and 195 degrees. This allows a multi-camera system to be adapted to the customer's specific requirements.
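Whether a given combination of camera heads actually closes the 360-degree ring depends on each head's mounting direction and field of view. The sketch below is a hypothetical, simplified coverage check; the headings, counts, and FOV values are illustrative, not ZF specifications.

```python
# Hypothetical sketch: does a set of camera heads, each given as
# (mounting heading in degrees, field of view in degrees), cover the
# full 360-degree surround view? Angles wrap around at 360.

def covers_full_circle(cameras, step=1):
    """True if every sampled heading lies inside at least one FOV."""
    covered = set()
    for heading, fov in cameras:
        half = fov / 2
        for deg in range(int(heading - half), int(heading + half) + 1):
            covered.add(deg % 360)
    return all(d in covered for d in range(0, 360, step))

# Four wide-angle heads at 90-degree spacing, 100-degree FOV each:
rig = [(0, 100), (90, 100), (180, 100), (270, 100)]
print(covers_full_circle(rig))  # True: adjacent fields overlap

# Two heads alone leave gaps at the sides:
print(covers_full_circle([(0, 80), (180, 80)]))  # False
```

A real surround-view system additionally needs enough overlap between adjacent fields for image stitching, which this simple angular check ignores.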
Highly automated driving will expand passengers' freedom of movement in the vehicle interior. A 3D interior camera from ZF can offer new comfort and safety advantages. As part of the ZF Interior Observation System (IOS), it captures real-time information on the size, position, and posture of passengers. This helps adjust the vehicle's various safety components in an emergency so that the consequences of a collision are mitigated as far as possible.
Driver monitoring systems will also play a very important role in handover scenarios between human and autopilot: the IOS can also be used to determine whether the driver has his hands on the steering wheel, whether he is actively steering the vehicle, and whether his eyes are on the road.
ZF uses Sound.AI to help make it possible for cars to hear: among other things, this system analyzes the sound of emergency vehicles' sirens and recognizes from which direction they are approaching (siren detection). The system can also provide the driver with important information via the display - including recommendations for action such as "drive to the right" or "form a rescue lane". Fully automated vehicles from level 4 can perform these maneuvers themselves.
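One textbook way to estimate the direction of a siren is the time difference of arrival (TDOA) between two microphones a known distance apart. The sketch below illustrates that idea only; it is not ZF's Sound.AI implementation, and the microphone spacing is an assumed value.

```python
# Simplified TDOA bearing estimate with two microphones.
# This is the generic acoustics-textbook method, not Sound.AI itself.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C
MIC_SPACING = 0.5       # assumed distance between microphones, in m

def bearing_from_tdoa(tdoa_s: float) -> float:
    """Angle of the sound source relative to the microphone axis,
    in degrees. tdoa_s > 0 means the sound reached the first
    microphone earlier than the second."""
    # The path-length difference cannot exceed the mic spacing,
    # so clamp against measurement noise before taking the arccosine.
    ratio = max(-1.0, min(1.0, tdoa_s * SPEED_OF_SOUND / MIC_SPACING))
    return math.degrees(math.acos(ratio))

# Sound arriving at both mics simultaneously comes from directly abeam:
print(round(bearing_from_tdoa(0.0)))  # 90
# A large positive lead (~1.45 ms) places the source near the mic axis:
print(round(bearing_from_tdoa(0.00145), 1))
```

A production system would combine several microphone pairs and classify the siren sound itself before trusting any bearing estimate.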