Safety is the paramount criterion for autonomous vehicles, and reliable perception of the surrounding environment is the essential first step toward ensuring it.

Camera and radar systems are widely used as the conventional active-safety sensors for advanced driver-assistance systems (ADAS); however, these sensors have limitations. Cameras, for example, perform poorly in bad ambient-light conditions, and radar struggles to detect stationary obstacles.

To achieve fully safe perception for autonomous driving, additional sensor types—from underground mapping and thermal imaging to LiDAR (light detection and ranging)—combined with improved cameras and radar, are going to play a role in ensuring the highest safety standards possible.

For example, both Audi's A8, with its very limited Level 3 automated-driving capability, and the vehicles of Waymo One, an automated ride-hailing service, use LiDAR. Level 3 autonomous passenger vehicles using the technology are expected to gradually become the industry standard.

That’s the view of Dr. Leilei Shinohara, Co-partner and Vice President of LiDAR developer RoboSense, who says that to achieve the required level of safety perception, the company is developing an automotive-grade multi-sensor data-fusion solution combining LiDAR, camera, and radar.

“RoboSense’s components have been reduced from hundreds to dozens in comparison to traditional mechanical LiDARs, which helps to reduce cost and shorten production time, achieving a breakthrough in manufacturability,” Shinohara said.

He explained that a coin-sized optical module processes the results of the optical-mechanical system to meet autonomous-driving performance and mass-production requirements, and that it includes artificial intelligence (AI) capabilities.

The RoboSense RS-LiDAR-M1 sensor has its own embedded AI algorithm and SoC (system-on-chip) that transforms traditional LiDAR hardware, solely an information collector, into a full data-analysis-and-comprehension smart-perception system.

“The AI perception algorithm can semantically segment and detect the obstacles from the point clouds, then further recognize the type of obstacles,” he said. “With the integrated AI perception algorithm, we can reduce the performance and cost requirements for the domain controller within the vehicle.”
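RoboSense has not disclosed the internals of that algorithm, but the general pattern the quote describes, segmenting a point cloud into obstacle clusters and then classifying each cluster, can be sketched in a few lines. The sketch below is purely illustrative: the function names and thresholds are invented, and a simple geometric rule stands in for the trained classifier a production system would use.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def segment_obstacles(points, max_gap=0.5):
    """Group LiDAR returns into obstacle clusters by Euclidean proximity.
    points: (N, 3) array of x, y, z returns, ground points already removed.
    Points closer than max_gap meters end up in the same cluster.
    Single-linkage is O(N^2); fine for a sketch, not for a full scan."""
    Z = linkage(points, method="single")
    labels = fcluster(Z, t=max_gap, criterion="distance")
    return [points[labels == k] for k in np.unique(labels)]

def classify_obstacle(cluster):
    """Toy rule-based classifier; a real system would run a trained
    network on the cluster (or on the raw point cloud)."""
    extent = cluster.max(axis=0) - cluster.min(axis=0)
    footprint = max(extent[0], extent[1])  # horizontal size, meters
    height = extent[2]                     # vertical size, meters
    if height > 1.0 and footprint < 1.2:
        return "pedestrian"
    if footprint > 3.0:
        return "vehicle"
    return "unknown"

# Usage: obstacles = segment_obstacles(scan_xyz)
#        labels = [classify_obstacle(c) for c in obstacles]
```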

 

The ground beneath our wheels

The surface world is an ever-changing landscape of shifting scenery, which has led one sensor specialist to a part of the environment that is less likely to change over time—the world beneath the surface.

WaveSense has designed a ground-penetrating radar that “reads” the geological features under the vehicle like the scan of a subterranean landscape, adding a layer of redundancy and reliability to the sensor suite.

“The underground is very well suited for map-based localization,” the company’s CEO and Co-founder, Tarik Bolat, explained. “It’s rich in differentiated features, and stable over time—that’s what you’re looking for in any kind of map.”

Measuring about 2 ft (61 cm) wide, 1 ft (30 cm) long, and 1 in (2.5 cm) deep, the plate-like module sits under the vehicle and is built to take a beating.

“It’s quite robust and hardened in relation to sensors that would be in the bumper—you can really beat it up. And it does get beat up because it’s on the bottom of the vehicle,” Bolat said.

In fact, the original application for the technology was developed at MIT Lincoln Laboratory, a military research and development lab, for off-road scenarios. “At least as far as what was made public,” Bolat added.

Because every road in the world has a unique subsurface signature, WaveSense’s ultra-wideband radar is able to create a map of those subsurface signatures—scanning the ground 126 times per second—from which self-driving cars can navigate.

“You’re really doing localization without regard to what’s going on at the surface level. It doesn’t matter if it’s snowing or foggy or you can’t see the surface markings,” Bolat said. “You’re looking below the surface, so you’re not susceptible to the dynamism of the surface.”
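WaveSense has not published its matching algorithm, but map-based localization of this kind generally reduces to comparing the live scan against candidate positions in a prior map and taking the best match. A minimal sketch, with the map layout and every name assumed purely for illustration:

```python
import numpy as np

def localize(prior_map, live_scan):
    """Estimate along-track position by matching one live GPR scan
    against a prior map of subsurface signatures.
    prior_map: (M, D) array, one D-dimensional signature per map cell.
    live_scan: (D,) signature measured beneath the vehicle right now.
    Returns (best-matching cell index, correlation score in [-1, 1])."""
    # Pearson-style correlation: centering and normalizing makes the
    # match insensitive to overall gain changes (soil moisture,
    # antenna coupling) between mapping time and drive time.
    m = prior_map - prior_map.mean(axis=1, keepdims=True)
    m /= np.linalg.norm(m, axis=1, keepdims=True) + 1e-12
    s = live_scan - live_scan.mean()
    s /= np.linalg.norm(s) + 1e-12
    scores = m @ s
    best = int(np.argmax(scores))
    return best, float(scores[best])
```

In practice the search would presumably be seeded by odometry so only a small window of the map is compared, and the 126-Hz scan rate quoted above gives many chances per meter to refine the fix.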

He said that for autonomous vehicles to gain public acceptance and trust—an uphill battle according to a number of recent surveys—it’s critical to prove reliability in any conceivable driving condition or scenario.

“That’s where we have a real role to play, because we allow companies to demonstrate that, hey, it’s OK that there are no road signs or it’s snowing, because we’re not relying on those road-side markings,” he said. “Some of our partners have assumed they can reduce their LiDAR field-of-view because of how good our performance is.”

 

The competition heats up

The next piece of the sensor puzzle also goes beyond the visible world, as Paul Clayton, Vice President & General Manager, OEM and Emerging Markets Segment for FLIR, explains.

The ability to sense thermal infrared radiation, or heat, within the context of an ADAS or AEB (automatic emergency braking) system provides both complementary and distinct advantages over existing sensor technologies such as visible cameras, LiDAR, and radar.

For years now, the company has been working on thermal camera technology for cars that can help reduce accident rates by recognizing living things through body heat in conditions where other sensors may have trouble scanning the environment.

“What thermal brings to the space is the ability to see living things in all types of conditions—rain, glare, fog, going from a dark tunnel into the light, places where other cameras struggle,” Clayton explained. “If you look at the sensors in cars today, there are gaps in that sensor suite where the visible camera doesn’t work and where the LiDAR doesn’t work. Thermal fills in gaps and adds a redundant feature.”

Though he declined to name specifics, Clayton said the company is working with “a lot of OEMs and a lot of disruptors,” revealing that FLIR’s Tier 1 partner Veoneer has captured a “very large OEM” for a Level 4 autonomous car coming out in 2021 or 2022.

When combined with visible-light imagery and ranging data from LiDAR and radar, thermal data, paired with machine learning, creates a more comprehensive detection and classification system.
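Neither FLIR nor its partners detail the fusion pipeline here, but one common arrangement is “late” fusion: each sensor runs its own detector, and detections that land near each other in a shared vehicle frame are merged, with multi-sensor agreement raising confidence. A hypothetical sketch, with the data structures and matching rule invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g., "pedestrian"
    position: tuple    # (x, y) in a shared vehicle frame, meters
    confidence: float  # 0..1 from the per-sensor detector
    sensor: str        # "thermal", "visible", "radar", or "lidar"

def fuse(detections, match_radius=1.5):
    """Naive late fusion: detections from different sensors that fall
    within match_radius meters of each other are treated as the same
    object, and multi-sensor agreement raises the fused confidence."""
    fused = []
    for det in sorted(detections, key=lambda d: -d.confidence):
        for obj in fused:
            dx = det.position[0] - obj["position"][0]
            dy = det.position[1] - obj["position"][1]
            if (dx * dx + dy * dy) ** 0.5 <= match_radius:
                obj["sensors"].add(det.sensor)
                # Independent-evidence combination of miss probabilities.
                obj["confidence"] = 1 - (1 - obj["confidence"]) * (1 - det.confidence)
                break
        else:
            fused.append({"label": det.label, "position": det.position,
                          "confidence": det.confidence, "sensors": {det.sensor}})
    return fused
```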

Clayton noted that the company has also made starter thermal datasets available to accelerate localized testing of thermal sensors on self-driving systems; the data includes classification into five groups: people, dogs, cars, bicycles, and other vehicles.

The company offers a free ADK (Automotive Development Kit) with an initial set of more than 14,000 annotated summer-driving thermal images, captured day and night, along with their corresponding RGB imagery for reference.
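The article doesn’t describe FLIR’s annotation format; assuming a COCO-style JSON file (a common convention for such datasets, and strictly an assumption here), a first look at the class balance might be as simple as:

```python
import json
from collections import Counter

def count_labels(path):
    """Tally annotated objects per category in a COCO-style file.
    The file name and schema are assumptions, not FLIR's documented
    layout; check the ADK's own documentation for the real format."""
    with open(path) as f:
        data = json.load(f)
    names = {c["id"]: c["name"] for c in data["categories"]}
    return Counter(names[a["category_id"]] for a in data["annotations"])

print(count_labels("thermal_annotations.json"))  # hypothetical file name
```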

“We wanted to get people from 0-60 very quickly with a plug-and-play thermal camera with the ADK, and that’s why they came back and saw the benefit,” Clayton noted. “That’s why we did it—we wanted to help enable people to understand the benefit thermal brings. The development kit did its job, and now we’re having people come back to us.”

The company is currently working on refining the technology: making the hardware components smaller, extending the range, shrinking pixel size, and reducing costs, as well as optimizing where on the vehicle the module should be mounted—it can’t see through glass.

“For AEB, we’re talking to OEMs about very clever packaging that sits outside the car but is very streamlined,” Clayton said. “We are making sure it’s nice and neat and not obtrusive. You have to be a little careful about the packaging.”

 

Packaging with advanced plastics

To enable large-scale production of integrated sensor housings, German chemical company BASF provides plastics optimized for radar transmission and absorption that increase the accuracy of radar sensors.

Dr. Thomas Bayerl, Segment Marketing Manager for Automotive E&E applications at BASF, explained that there will be an increasing number of cars with radar sensors in the future and, therefore, vehicles in traffic will constantly be exposed to higher radar noise.

“With our engineering plastics, radar sensors can help reduce this noise,” he said. “The material we produce can absorb the noise by up to 65% or higher. When it comes to the sending of radar, the transmission rate is more than 90%. This is essential for the proper function of the sensor as well as for filtering out the interference signals from other vehicles or sensors.”
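Taking the quoted figures at face value and treating both as one-way numbers (a simplification for illustration; the transmitting and absorbing grades may well be different materials in different parts of the housing), a quick power budget shows why both matter:

```python
import math

# Illustrative one-way power budget from the figures quoted above.
transmission = 0.90      # wanted radar signal passing through the housing
absorption = 0.65        # stray interference soaked up by the material

signal_retained = transmission           # 90% of the wanted signal survives
interference_leaked = 1.0 - absorption   # only 35% of the noise gets in

# Signal-to-interference improvement vs. a cover that passed both equally:
gain_db = 10 * math.log10(signal_retained / interference_leaked)
print(f"SIR improvement: {gain_db:.1f} dB")  # about 4.1 dB
```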

He explained that, due to the requirements of the EU and Euro NCAP, radar sensors will become standard in the future, and robust scalability, paired with cost-effective use, will lead to new requirements as well as new types of sensor housings.

The company is currently working with OEMs and Tier 1 suppliers to develop and carry out individual parts testing as well as provide consultancy in design and simulation testing.

Its short- and long-range radar-compatible plastics have high dimensional stability—meaning resistance to temperature and other environmental changes—and are hydrolysis-resistant, meaning they perform well in hot water and steam environments.

Those are two essential elements to protect against potentially harmful and degrading environmental influences. BASF tests the safety of these solutions by using its specialized equipment to measure radar transparency and absorption at high frequencies.

“New areas of application are also being examined, including visible areas, invisible areas, and integration in lighting systems,” Bayerl said. “A wide variety of requirements, in particular transparent materials, also have to be developed.”

He noted that a balance has to be struck between the purely optical properties—in this case radar transparency—and the given boundary conditions, which are specified by the operating environment and processing and include mechanics, weldability, and chemical resistance.

 

Surveying the field

Bobby Hambrick, CEO of AutonomouStuff, has a good vantage point from which to survey the entire field of sensor suppliers; the company provides automated-driving solutions to thousands of customers worldwide with the aim of helping enable the future of autonomy.

“The market is not much different than it has been for the last decade in terms of the suite of sensors—the core tech is more or less the same,” he noted. “Most of the tech has some sort of trade-off, so you have to use multiple sensor modalities.”

He explained that most automakers are using a combination of ultrasonic sensors, radar, LiDAR, and cameras, not to mention the streaming data coming from a plethora of other sources—which some in the industry find superfluous.

“I think the utopia is a significant reduction of sensors in the future—Elon Musk [of Tesla] believes cameras and radar are enough. It depends on the use case and the market you’re after. In slow zones, for example, or in dedicated lanes in perfect weather, you need a lot fewer sensors,” Hambrick said.

Surveying the current sensor market, Hambrick noted that each sensor type has its own technical advantages and that there are market leaders in each area from a penetration point of view. He also noted that there is room for LiDAR companies to focus on certain applications for autonomous functionality.

“The end game is to get it cheaper, and better, and smaller. LiDAR is going all solid state at the chip level, but it depends on the application,” he said. “If we’re talking L5 with no driver, that’s a totally different application than lane centering.”

The market is currently going through an evolution. Radar sensors, for example, started with mechanical scanning; now they are all electronic, smaller, and more distributed.

“Not that long ago, everyone had 64-layer sensors spinning on the roof, and now the LiDAR companies are launching chipsets that can be distributed across the car,” he said. “The next step is getting electronic LiDARs and boosting compute power—which is still a bottleneck because of the data processing along with the fusion of putting all this data together.”

Though Hambrick points out that 5G technology will enable the offloading of some of that processing, he thinks sensor compute power will be the next big breakthrough. However, he believes that achieving better range while keeping power consumption low in a mass-market way will be a big challenge.

As goes the technology, so will go the number of suppliers, he said, as sensor makers bring costs down, improve reliability, and zero in on their strengths.

“There are 100 LiDAR companies out there right now, and not all of them are going to survive,” he said. “But there’s a lot more to automated driving than Uber driving—from agriculture, to mining, to defense, to logistics types of applications—so the market is big and there’s room for many suppliers, but not as many as there are today. Not if they want to make money, anyway.”

Hambrick counts himself a firm believer in technological progress, and he points out how far sensor technologies have come in a relatively short period of time. However, he is under no illusions as to the magnitude of the task OEMs and sensor developers have before them.

“The systems will continuously improve, but full automation is probably one of the most significant engineering challenges of all time,” he said. “It’s going to constantly be learning and growing and getting better and better.”

 

A Continental investment in radar

German automotive supply giant Continental is investing $110 million in the construction of a new plant in New Braunfels, TX, to expand its capacity for the production of radar sensors, emphasizing their place as a key sensor technology beside cameras and LiDARs for future autonomous vehicles.

“Radar sensors are superior in measuring speed and distance to objects surrounding the car, including in severe weather and illumination conditions,” Karl Haupt, Head of Continental’s Advanced Driver Assistance Systems business unit, explained.

Beyond those benefits of radar sensors, Haupt said Continental sees “great potential” in all types of sensors, and he expects demand for sensors to grow strongly in the coming years.

“At the moment, the demand is still mainly driven by safety regulations and consumer ratings like NCAP or IIHS Top Safety Pick,” he noted. “The number and also the design of the sensor setup strongly depends on the specific function to be implemented, be it driving or parking. However, the higher the degree of automation, the higher the number of sensors installed.”

He pointed out that systems for autonomous driving have to be equipped with different sensor technologies, on the one hand to create redundancies and on the other to create a precise and reliable environment model for validating sensor data.

“That is also why we offer all relevant components for both assisted and automated driving from a single source,” he said. “This includes the sensors for environment detection based on radar, camera, LiDAR, and ultrasound technology.”

In addition, the company supplies the central control units, such as its Assisted & Automated Driving Control Unit, with the computing power necessary to realize the higher functional scope; this includes end-to-end software solutions as well as system-integration competence covering the complete system architecture.

In the end, the future of autonomous vehicle technology won’t just be driven by one all-powerful sensor, but rather a collection of instruments working in tandem to see through, beneath, around, and over the immediate environment.

Smaller devices, faster processing, and more refined capabilities are among the most important goals sensor makers are working toward, and as the market grows more crowded—and competitive—the push to provide the most reliable and safest solution will continue to grow.

That’s the sense you get, anyhow.

 

Sensitive sensors drive up repair costs

by Nathan Eddy

While the addition of sensors is intended to make every automobile safer, fender-benders will always happen, and the placement of those delicate devices on bumpers, side-view mirrors, and other vulnerable parts of the vehicle is driving up the cost of repairs.

Those were the findings of an AAA study, which determined that vehicles equipped with ADAS can cost twice as much to repair after a collision, with the replacement of parts equipped with ADAS sensors potentially adding up to $3000 in extra repair costs.

Greg Brannon, Director of Automotive Engineering and Industry Relations for AAA, noted that sensors are commonly located in the front and rear bumpers, on windshields, and in side mirrors.

“These will likely remain common positions for ADAS sensors for the near term, though any location on the exterior of a vehicle is vulnerable to damage from collision or environmental conditions,” he said.

While previous AAA testing has shown that ADAS offers many safety benefits, minor vehicle damage that affects these systems may be inevitable.

For the vehicles in AAA’s study, the repair bill for a minor front or rear collision on a car with ADAS can run as high as $5300, almost two-and-a-half times the repair cost for a vehicle without these systems.

“As is the case with any new technology, automakers will refine design, durability, and placement of these sensors over time,” Brannon said. “However, even if sensors are made more durable, they will likely still need to be repaired or replaced if affected during a collision.”

He noted that another important aspect is the calibration of these sensors and systems; a traditional bumper repair has gotten much more technical because of the sensors behind the bumper.

“The tools and processes used to calibrate these safety systems will need to evolve to match expectations from consumers for long-term ownership costs,” he said.

That point of view was backed up by Mike Croker, Global Repair & Training Product Manager at Chief Collision, who said having to do multiple calibrations on a vehicle adds a lot more time to the repair process, and that time equates to a higher repair cost.

“These calibrations can be very time-consuming; in some cases a single calibration may take a couple of hours to complete. That is also assuming the vehicle structure and alignment are correct,” he said. “So even if a vehicle is in a minor collision, a wheel alignment may still be required, which is another step in the repair process that, in the past, wasn’t common on a small-impact front-end collision.”

In addition to the cost of repairing an otherwise inexpensive fender-bender, AutonomouStuff CEO Bobby Hambrick pointed out that changing or modifying vehicle components could have ramifications for the validity of the sensors.

Hambrick said that he agrees that putting a bunch of sensors in the bumper can become very expensive when a driver has an accident—and repair centers have “no idea” how to deal with those situations.

“What happens if you modify the vehicle in any way, say wider tires or other types of customization?” he explained. “You validate a safety system based on how it’s designed, and if it changes, it’s no longer valid. You would probably have to set laws around that; otherwise those sensor systems won’t be appropriately validated.”

As Hambrick pointed out, delicate sensors could, in theory, be moved to safer locations—behind the windshield, for example—but the AAA survey also noted windshield damage is especially common, with more than 14.5 million replacements annually.

“If the sensors aren’t properly calibrated, then they won’t work as the OEMs intended,” agreed Croker. “That’s important to understand.”

For example: Let’s say a vehicle equipped with a front-facing camera has a damaged windshield. A repair shop replaces the windshield, but an ADAS calibration isn’t performed and the car is sent back out on the road with the camera off by a millimeter.

Now the driver is going down the interstate. The car’s front-facing collision camera sees an object on the side of the road, but because the camera is slightly off, it thinks it’s in front of the vehicle and automatically brakes while doing 60 mph.
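The danger of such a tiny offset is easier to appreciate with a little geometry: an angular calibration error grows roughly linearly with distance. A back-of-envelope sketch, with every number chosen for illustration rather than taken from the AAA study:

```python
import math

# A camera bracket 100 mm across that is off by 1 mm at one end
# tilts the optical axis by roughly 0.57 degrees.
tilt_rad = math.atan(1.0 / 100.0)  # mounting offset / bracket span

for range_m in (25, 50, 100):
    lateral_error_m = range_m * math.tan(tilt_rad)
    print(f"at {range_m:>3} m: objects appear {lateral_error_m:.2f} m off")
# At 100 m the camera places objects about a meter sideways from where
# they really are, enough to confuse "roadside" with "in my lane".
```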

“The outcome could be disastrous,” Croker warned.