At CES this year, ZF is highlighting its powerful sensor portfolio, which can detect vehicle surroundings more accurately – both inside and outside the vehicle – and thereby help to enhance the safety of conventional and automated vehicles. The resulting architecture – including a new full-range radar, solid-state LiDAR, innovative cameras and acoustic sensors – is combined with supercomputers from the ZF ProAI product family to create a powerful overall sensor system.
“As system architects for autonomous driving, we’ve developed a sensor set which equips cars with all of the necessary senses to digitally detect their surroundings,” said Torsten Gollewski, head of Advanced Engineering at ZF and general manager of Zukunft Ventures GmbH. “Our systems can more precisely and redundantly enable real-time sensing and signal processing of the surrounding environment, which is essential to help enable safe, automated driving functions.”
The sensor set comprises ZF’s latest-generation cameras, radars, LiDAR and acoustic sensors and, on the software side, tools and algorithms for detection, classification and vehicle control that are hosted on the ZF ProAI central control unit. The entire architecture is designed to address demanding automotive requirements, including extreme temperatures and vibrations. These highly advanced sensor systems are also important in helping to comply with future safety regulations and consumer safety ratings (e.g., NCAP).
Superior vision
Fitted to the front of the vehicle, ZF’s high-resolution Full-Range Radar offers superior detection performance in the four dimensions of speed, distance, angular resolution and height. This high-performance 77-GHz sensor is designed for premium ADAS applications and for highly automated and autonomous driving (Level 3 and higher). Like other radar systems, it transmits electromagnetic (radio) waves and determines the range, angle and velocity of objects from the reflected signal (echo principle). The high-resolution sensor, however, can also measure height more accurately, creating a three-dimensional view of the environment. Like ZF’s Medium-Range Radars, which support a range of ADAS functions, it keeps working in most poor-weather, low-light and bad-visibility conditions.
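The echo principle boils down to two relationships: distance follows from the signal’s round-trip time, and relative speed from the Doppler shift of the returned wave. The short Python sketch below illustrates this; only the 77-GHz carrier frequency comes from the text, while the function names and example values are purely illustrative and not ZF code.

    # Illustrative only: range from round-trip time and radial velocity from
    # Doppler shift, the basic relationships behind the "echo principle".
    C = 299_792_458.0        # speed of light in m/s
    CARRIER_HZ = 77e9        # 77-GHz radar carrier frequency (from the text)

    def range_from_round_trip(t_seconds: float) -> float:
        """Distance to a target from the echo's round-trip time."""
        return C * t_seconds / 2.0

    def radial_velocity_from_doppler(doppler_shift_hz: float) -> float:
        """Relative (radial) speed of a target from the measured Doppler shift."""
        return doppler_shift_hz * C / (2.0 * CARRIER_HZ)

    # A 1-microsecond round trip corresponds to roughly 150 m of range;
    # a 10 kHz Doppler shift to roughly 19.5 m/s of closing speed.
    print(range_from_round_trip(1e-6), radial_velocity_from_doppler(10e3))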
Shedding light on automation
Combined with software tools, LiDAR sensors based on laser technology can also create a more accurate 3D model of the vehicle’s environment. They help to recognize objects and free space – including in complex traffic situations and in virtually all lighting conditions. The new high-resolution Solid-State LiDAR, which ZF is developing together with IBEO, can also better detect pedestrians and small objects in 3D. This plays an important role for highly automated driving at Level 3 and above. The solid-state technology makes this innovation much more robust than previous solutions, and thanks to its modular design and field-of-view options, the sensor is suitable for a wide range of applications.
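ZF does not disclose how its software derives free space from the 3D model, but a common textbook approach is to project the LiDAR point cloud into an occupancy grid and treat empty cells as candidate free space. The sketch below is a simplified illustration of that idea, assuming a NumPy point cloud in the sensor frame; it is not ZF’s or IBEO’s implementation.

    import numpy as np

    def occupancy_grid(points_xyz: np.ndarray,
                       cell_size: float = 0.2,
                       extent: float = 40.0,
                       min_height: float = 0.3) -> np.ndarray:
        """Mark grid cells that contain LiDAR returns above a height threshold.

        points_xyz: (N, 3) array of x, y, z coordinates in metres, sensor frame.
        Returns a boolean grid; cells left False are candidates for free space.
        """
        n = int(2 * extent / cell_size)
        grid = np.zeros((n, n), dtype=bool)
        # Keep points above the assumed ground level and inside the grid extent.
        pts = points_xyz[points_xyz[:, 2] > min_height]
        ix = ((pts[:, 0] + extent) / cell_size).astype(int)
        iy = ((pts[:, 1] + extent) / cell_size).astype(int)
        valid = (ix >= 0) & (ix < n) & (iy >= 0) & (iy < n)
        grid[ix[valid], iy[valid]] = True
        return grid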
Vehicle cameras
ZF’s S-Cam4 highlights the further development and expansion of the S-Cam portfolio. With a 100-degree field-of-view and a 1.7-megapixel High Dynamic Range (HDR) image sensor, the technology offers high performance when it comes to detecting pedestrians and cyclists in a city environment. The cameras can also include ZF’s advanced longitudinal and transverse control algorithms for Adaptive Cruise Control (ACC), Automatic Emergency Braking (AEB) and Lane Keeping Assist (LKA), as well as other functions.
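ZF does not publish these control algorithms; as a rough indication of what longitudinal control for ACC involves, a textbook constant-time-gap control law is sketched below. All gains, limits and the time gap are illustrative assumptions, not ZF parameters.

    def acc_acceleration(gap_m: float, rel_speed_mps: float, ego_speed_mps: float,
                         time_gap_s: float = 1.8, kp: float = 0.3, kv: float = 0.5,
                         max_accel: float = 2.0, max_decel: float = -3.5) -> float:
        """Very simplified constant-time-gap ACC law (illustration, not ZF's algorithm).

        gap_m: distance to the lead vehicle; rel_speed_mps: lead speed minus ego speed.
        Returns a commanded acceleration bounded to comfortable limits.
        """
        desired_gap = time_gap_s * ego_speed_mps          # keep a constant time gap
        accel = kp * (gap_m - desired_gap) + kv * rel_speed_mps
        return max(max_decel, min(max_accel, accel))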
Remote Camera Heads, which can be installed in very small housings, can help to detect the surrounding vehicle environment and stream video to the driver, or classify objects. Up to 12 cameras can be combined to build a 360-degree view of the vehicle’s surroundings. For each remote camera, manufacturers can choose sensor resolutions between 1.2 and 8 megapixels and fields of view between 28 and 195 degrees, so a multi-camera system can be tailored to meet the customer’s specific requirements.
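Those resolution and field-of-view ranges suggest a simple way to reason about such a configuration. The sketch below models a set of camera heads and checks that their combined horizontal fields of view leave no blind sector; the class and function names are hypothetical and not part of any ZF API.

    from dataclasses import dataclass

    @dataclass
    class RemoteCameraHead:
        name: str
        resolution_mp: float      # 1.2 to 8 megapixels, per the text
        fov_deg: float            # 28 to 195 degrees, per the text
        yaw_deg: float            # mounting direction relative to vehicle heading

    def covers_360(cameras: list[RemoteCameraHead]) -> bool:
        """Check that the combined horizontal fields of view leave no blind sector."""
        for heading in range(360):
            seen = any(
                abs((heading - cam.yaw_deg + 180) % 360 - 180) <= cam.fov_deg / 2
                for cam in cameras
            )
            if not seen:
                return False
        return True

    # Hypothetical four-camera surround setup with wide-angle 195-degree lenses.
    surround = [RemoteCameraHead(f"cam{i}", 1.2, 195.0, yaw)
                for i, yaw in enumerate((0, 90, 180, 270))]
    print(covers_360(surround))  # True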
An eye on the interior: safe and comfortable
Highly automated driving will give vehicle occupants more freedom of movement inside the vehicle. A 3D interior camera from ZF can enable new comfort and safety benefits. As part of the ZF Interior Observation System (IOS), it collects real-time information about the size, position and posture of passengers. The performance of the vehicle’s occupant safety systems can then be adapted so that, in an emergency, the impact of a collision is better mitigated. Driver monitoring will also play a key role in transfer scenarios between the human driver and the autopilot; the IOS can determine whether the driver has their hands on the steering wheel, is actively steering the vehicle and has their head facing the road.
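ZF has not published the IOS interface, but the transfer logic described above can be pictured as a simple gate over the monitored cues. The sketch below uses hypothetical field names to make that decision explicit; it is an illustration, not ZF software.

    from dataclasses import dataclass

    @dataclass
    class InteriorObservation:
        """Hypothetical per-frame output of an interior-observation camera."""
        hands_on_wheel: bool
        steering_active: bool
        head_facing_road: bool
        occupant_out_of_position: bool

    def ready_for_handover(obs: InteriorObservation) -> bool:
        """Only hand control back to the driver when all attention cues are present."""
        return obs.hands_on_wheel and obs.steering_active and obs.head_facing_road

    def adapt_restraints(obs: InteriorObservation) -> bool:
        """Flag when occupant posture suggests adapting restraint deployment."""
        return obs.occupant_out_of_position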
Listening for danger
With Sound.AI, ZF also helps enable cars to hear. Among other things, the system analyzes siren signals to determine what kind of emergency vehicle is approaching and from which direction (siren detection). The system display can also provide the driver with important information, including instructions such as “pull over to the right” or “move to an emergency lane”. Fully automated vehicles from Level 4 upwards can perform maneuvers like this independently.
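ZF does not detail how Sound.AI localizes sirens; a standard building block for estimating the direction of a sound source is the time difference of arrival between microphones. The sketch below shows that relationship with illustrative values and is not Sound.AI code.

    import math

    SPEED_OF_SOUND = 343.0  # m/s at roughly 20 °C

    def arrival_angle(delay_s: float, mic_spacing_m: float) -> float:
        """Direction of a sound source from the time difference between two microphones.

        delay_s: time by which the signal reaches the second microphone after the first.
        Returns the bearing in degrees relative to the array's broadside direction.
        """
        # Clamp to the physically valid range before taking the arcsine.
        ratio = max(-1.0, min(1.0, SPEED_OF_SOUND * delay_s / mic_spacing_m))
        return math.degrees(math.asin(ratio))

    # A 0.3 ms delay across microphones 0.2 m apart points about 31 degrees off-axis.
    print(round(arrival_angle(0.0003, 0.2), 1))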
“Our combined sensor power can help to address future requirements from one single supplier. Whatever the weather or lighting conditions are, our environmental recognition systems are designed to work with the level of precision and redundancy required for safe highly automated and autonomous driving,” said Torsten Gollewski. The massive volume of data generated by the radar, camera, LiDAR and Sound.AI systems must be translated into a clear digital environment model. For this purpose, ZF has developed the ZF ProAI product family – the most powerful central computers currently available in the automotive industry.
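At its simplest, translating the raw sensor streams into a digital environment model means associating detections from different sensors that refer to the same physical object. The deliberately naive sketch below illustrates that association step; production fusion stacks add tracking, uncertainty handling and time synchronization, and this is not ZF ProAI code.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        sensor: str          # e.g. "radar", "camera", "lidar", "sound_ai"
        x: float             # position in the vehicle frame, metres
        y: float
        label: str           # object class reported by the sensor

    def fuse(detections: list[Detection], merge_radius_m: float = 1.5) -> list[list[Detection]]:
        """Group detections from different sensors that refer to the same object.

        A naive nearest-neighbour association used only for illustration.
        """
        clusters: list[list[Detection]] = []
        for det in detections:
            for cluster in clusters:
                ref = cluster[0]
                if (det.x - ref.x) ** 2 + (det.y - ref.y) ** 2 <= merge_radius_m ** 2:
                    cluster.append(det)
                    break
            else:
                clusters.append([det])
        return clusters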