What is ADAS?
ADAS promises to make driving safer for motorists and pedestrians with features like collision avoidance and adaptive cruise control. Learn more in NI’s guide.
The appeal of autonomous vehicles is easy to understand: a car or truck that can safely drive from point A to point B without human intervention frees people up to read a book, take a nap, or just enjoy the scenery.
The reality of self-driving cars is more complicated. Automakers are unwilling to sacrifice safety in pursuit of the convenience self-driving cars provide, and ensuring safer roads by reducing and eventually eliminating collisions and injuries is a top priority for everyone involved. Currently, automotive engineers are working to tackle stubborn problems that threaten to halt progress on this promising technology. NI is doing its part by providing the hardware and software solutions needed to validate the systems that will drive autonomous vehicles forward.
Autonomous vehicles can navigate from one location to another without requiring human input. A fully self-driving car or truck not only operates without a human driver but also maintains control through a combination of advanced sensors, sophisticated computers, and connectivity with other road users and the surrounding environment, including infrastructure. Artificial intelligence (AI) also plays a pivotal role in helping these vehicles make informed decisions when they encounter both routine and unexpected road conditions and obstacles.
Safety stands as the non-negotiable bedrock of autonomous driving innovation. Automakers must ensure that self-driving vehicles adhere to safety standards and regulations through rigorous testing to ultimately usher in a societal shift toward widespread consumer trust and adoption of driverless vehicles.
There are no self-driving cars in commercial production as of 2023. However, many modern vehicles are equipped with advanced driver assistance systems (ADAS) that offer some level of automation. Common ADAS features include forward collision warning, automatic braking, blind-spot monitoring, adaptive cruise control, and lane departure warning, to name a few. In the case of automatic braking, the vehicle applies the brakes itself when it determines a collision is likely.
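For a sense of the logic behind automatic braking, the sketch below estimates a time-to-collision from range and closing speed and requests braking when it drops below a threshold. It is a minimal illustration with assumed function names and an assumed 1.5-second threshold, not a production algorithm.

```python
# Illustrative sketch (not a production braking algorithm): estimate time-to-collision
# from a range reading and decide whether to request automatic braking.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Return seconds until impact; infinity if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def should_brake(range_m: float, closing_speed_mps: float, threshold_s: float = 1.5) -> bool:
    # 1.5 s is an assumed example threshold; real systems tune this per speed and scenario.
    return time_to_collision(range_m, closing_speed_mps) < threshold_s

# Example: an obstacle 20 m ahead, closing at 15 m/s -> TTC of about 1.33 s -> brake.
print(should_brake(20.0, 15.0))  # True
```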
SAE International, formerly the Society of Automotive Engineers, has created a classification system that sorts autonomous driving capabilities into six levels, SAE Level 0™ to SAE Level 5™.
Levels 0 through 2 cover ADAS, where a human is still driving the vehicle. Levels 3 through 5 refer to what are commonly thought of as self-driving capabilities. Level 3 autonomy may require occasional driver intervention at the vehicle’s request, while Levels 4 and 5 allow for fully autonomous driving.
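For quick reference, here is a compact summary of the six levels. The wording below is a paraphrase of the SAE definitions (published in SAE J3016), not the official text:

```python
# Paraphrased summary of the SAE driving automation levels (not official SAE wording).
SAE_LEVELS = {
    0: "No automation: the driver performs all tasks; features may only warn or briefly assist",
    1: "Driver assistance: steering OR speed support (e.g., adaptive cruise control)",
    2: "Partial automation: steering AND speed support; the driver must supervise at all times",
    3: "Conditional automation: the system drives in limited conditions; the driver must take over on request",
    4: "High automation: the system drives itself within its operating domain, no takeover needed there",
    5: "Full automation: the system drives everywhere, under all conditions",
}

for level, summary in SAE_LEVELS.items():
    print(f"SAE Level {level}: {summary}")
```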
There aren’t any fully autonomous vehicles on the market. According to Engadget, in early 2023 Mercedes-Benz announced it had become the first automaker certified to offer an SAE Level 3 system in the United States.
While driverless vehicles aren’t yet a reality, many of the sensors and systems needed for autonomous transportation are in development or already deployed in ADAS. Self-driving cars will depend upon automotive LiDAR, radar, cameras, artificial intelligence software, powerful computers, V2X connectivity, and other cutting-edge technology to operate in the future.
V2X, or Vehicle-to-Everything, enables vehicles to exchange real-time data with other vehicles, infrastructure, pedestrians, and the environment. This constant information flow enhances situational awareness, allowing cars to anticipate and react to potential hazards beyond their line of sight. Whether optimizing speed for green lights or sharing sudden braking actions, V2X serves as the digital nerve center, fostering safer, more efficient, and interconnected transportation systems.
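To make this concrete, here is a minimal sketch of the kind of status message a vehicle might broadcast to its neighbors. The class, field names, and JSON encoding are simplified assumptions for illustration; real V2X stacks use standardized, compact binary message formats.

```python
# Illustrative V2X-style status broadcast. Field names and encoding are assumptions
# for illustration, not a real standardized message format.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class VehicleStatusMessage:
    vehicle_id: str
    timestamp: float      # seconds since epoch
    latitude: float       # degrees
    longitude: float      # degrees
    speed_mps: float      # meters per second
    heading_deg: float    # 0-360, clockwise from north
    hard_braking: bool    # flag peers can use to anticipate sudden slowdowns

    def encode(self) -> bytes:
        """Serialize to JSON bytes for broadcast (real systems use compact binary encodings)."""
        return json.dumps(asdict(self)).encode("utf-8")

# Example: announce a hard-braking event so following vehicles can react early.
msg = VehicleStatusMessage("veh-042", time.time(), 40.7128, -74.0060, 12.5, 87.0, True)
payload = msg.encode()
print(len(payload), "bytes:", payload[:60], b"...")
```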
Short for Light Detection and Ranging, LiDAR uses laser pulses to detect objects and measure their distances. The onboard computer builds a map of the scene (a point cloud), enabling the car to “see” other vehicles, people, animals, and obstacles so it can drive safely.
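The range measurement itself is a simple time-of-flight calculation: the sensor times how long a pulse takes to return and converts that round trip into distance. A minimal, idealized sketch:

```python
# Minimal time-of-flight range calculation for a single LiDAR return
# (idealized: ignores atmospheric effects, multiple returns, and sensor noise).
SPEED_OF_LIGHT_MPS = 299_792_458  # meters per second

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the target: the light covers the gap twice (out and back)."""
    return SPEED_OF_LIGHT_MPS * round_trip_seconds / 2

# Example: a pulse that returns after 200 nanoseconds hit something roughly 30 m away.
print(round(range_from_time_of_flight(200e-9), 2))  # ~29.98
```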
Modern vehicles can use radar for many features, including blind-spot detection and automatic parking assist. A sensor transmits radio waves and measures the reflections to detect objects. If an obstacle is nearby, the vehicle can stop or take evasive action to avoid a collision.
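Radar can also estimate how quickly an object is approaching from the Doppler shift of the reflected wave. The sketch below shows that relationship in idealized form, assuming a head-on approach and a typical 77 GHz automotive carrier frequency:

```python
# Idealized Doppler relationship for an automotive radar: the frequency shift of the
# reflected wave is proportional to the relative (closing) speed of the target.
SPEED_OF_LIGHT_MPS = 299_792_458  # meters per second

def relative_speed_from_doppler(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Closing speed in m/s for a 77 GHz automotive radar, assuming a head-on approach."""
    return doppler_shift_hz * SPEED_OF_LIGHT_MPS / (2 * carrier_hz)

# Example: a shift of about 5.1 kHz at 77 GHz corresponds to roughly 10 m/s (~36 km/h).
print(round(relative_speed_from_doppler(5.1e3), 1))  # ~9.9
```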
Visible-light cameras see the world in much the same way humans do, so they’re often used to identify traffic signs, lane lines, and other markings that would be difficult or impossible for LiDAR or radar to capture. Cameras are also crucial for classifying objects into categories like cars, trucks, pedestrians, bicyclists, and many more.
AI and machine learning algorithms are essential to interpreting sensor data for the safe operation of self-driving cars. Tasks that are simple for humans, such as differentiating between a scrap of rubber in the road and a small animal, require programming and continual training for machines. Unexpected situations, like a ball bouncing across your path (and anticipating what might follow the ball) or a traffic accident blocking the way, must be handled carefully, as a mistake on the part of the computer could lead to injury or death.
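As a toy illustration of how a vehicle might act on imperfect model output, the sketch below takes class probabilities from a hypothetical detector and falls back to cautious behavior when confidence is low. The detector, its label set, and the thresholds are all illustrative assumptions:

```python
# Toy decision layer on top of a hypothetical object detector. The detector, its
# label set, and the confidence thresholds are illustrative assumptions only.

def plan_response(class_probabilities: dict[str, float], confidence_threshold: float = 0.8) -> str:
    """Pick an action from detector output, defaulting to caution when unsure."""
    label, confidence = max(class_probabilities.items(), key=lambda item: item[1])
    if confidence < confidence_threshold:
        return "slow down and re-evaluate"   # uncertain: behave conservatively
    if label in {"pedestrian", "bicyclist", "animal"}:
        return "yield / prepare to stop"
    if label == "debris":
        return "steer around if safe, else brake"
    return "continue at current speed"

# Example: the detector is only 55% sure it sees debris, so the planner stays cautious.
print(plan_response({"debris": 0.55, "animal": 0.30, "pedestrian": 0.15}))
```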
Autonomous driving requires high-performance computers that can collect, process, and respond to sensor data quickly. There is no time for latency when an oncoming vehicle swerves into your lane at 60 mph (~100 km/h). While automotive computers are nothing new, those used in self-driving vehicles must be able to make decisions as fast as humans do, or faster.
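A quick back-of-the-envelope calculation shows why: even small processing delays translate into meters of travel at highway speeds. The latency value below is illustrative:

```python
# Back-of-the-envelope: distance covered during processing latency.
# 60 mph is roughly 26.8 m/s; two vehicles closing head-on at 60 mph doubles that.
MPH_TO_MPS = 0.44704

def distance_during_latency(speed_mph: float, latency_s: float, head_on: bool = False) -> float:
    closing_speed = speed_mph * MPH_TO_MPS * (2 if head_on else 1)
    return closing_speed * latency_s

# Example: 100 ms of end-to-end latency in a 60 mph head-on scenario costs about 5.4 m
# of reaction distance.
print(round(distance_during_latency(60, 0.100, head_on=True), 1))  # ~5.4
```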
Here are some of the reasons we don’t have self-driving cars yet:
When it comes to handing full control from human drivers to a car or truck, failure truly isn’t an option. Autonomous driving test and validation, including a focus on scenario simulation, is necessary to bring self-driving vehicles to market. NI’s hardware-in-the-loop (HIL) testing is used for evaluating LiDAR, radar, cameras, and associated vehicle sensors and electronics. Work is already well underway with companies like Jaguar Land Rover, which has partnered with NI to address myriad challenges.
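At a high level, HIL testing closes a loop between simulated scenarios and the electronics under test. The toy sketch below captures that structure with a stand-in controller; the scenarios, names, and pass/fail logic are illustrative and are not NI’s API:

```python
# Toy hardware-in-the-loop style loop: feed simulated sensor values to a unit
# under test and check its outputs against expectations for each scenario.
# Everything here (scenario data, the stand-in controller) is illustrative.

SCENARIOS = {
    "clear_road":        {"range_m": 120.0, "closing_speed_mps": 0.0,  "expect_brake": False},
    "stopped_vehicle":   {"range_m": 18.0,  "closing_speed_mps": 15.0, "expect_brake": True},
    "slow_lead_vehicle": {"range_m": 60.0,  "closing_speed_mps": 5.0,  "expect_brake": False},
}

def unit_under_test(range_m: float, closing_speed_mps: float) -> bool:
    """Stand-in for the real controller: brake if time-to-collision drops under 1.5 s."""
    return closing_speed_mps > 0 and (range_m / closing_speed_mps) < 1.5

for name, scenario in SCENARIOS.items():
    decision = unit_under_test(scenario["range_m"], scenario["closing_speed_mps"])
    result = "PASS" if decision == scenario["expect_brake"] else "FAIL"
    print(f"{name}: brake={decision} -> {result}")
```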
Since autonomous driving sits at the intersection of automotive design and technology, success often means collaboration with many companies. Self-driving or ADAS capabilities require a constellation of semiconductors to work—more than needed in traditional cars and trucks. This has led microchip suppliers, OEMs, and Tier 1 partners to rethink semiconductor supply chains and test frameworks to ensure their efforts are scalable and effective.
As we advance into the era of electrification and heightened ADAS/AD features, the role of semiconductors in reshaping the automotive landscape becomes increasingly pronounced, necessitating ongoing innovation and cooperation among multiple stakeholders to meet evolving business, regulatory, safety, and customer expectations. This collaboration will ultimately hasten the day when humans can take their eyes off the road and enjoy a safe drive across town or from one end of the country to the other.