Autonomous Cars

The buzzword 'autonomous car' has been trending in our midst for a while now. Autonomous cars, more familiarly known as self-driving cars, use a combination of sensors such as cameras, radar and LiDAR to perceive the environment. This perceived environment is then processed by the computing hardware inside the car to extract meaningful data. Techniques from data fusion, computer vision and artificial intelligence power modules such as object detection, lane detection and traffic sign detection. All these features work in sync to drive the car from one destination to another without any human intervention. Roughly 1.2 million people die in road accidents globally every year, and the idea behind autonomous cars is to reduce human error, the most common cause of these accidents.

SAE Levels of Autonomous Cars

Levels 0, 1 and 2 are the levels at which the human driver monitors the driving environment.

  1. Level 0 or No Automation– Cars in this category are incapable of self-driving. They may come with a proximity sensor that helps detect surrounding cars, but stopping still requires driver intervention.
  2. Level 1 or Driver Assistance– If your car can control itself to some degree based on information it gathers about its surroundings, it belongs to this category. Features like lane assist, where the car auto-steers if you drift out of your lane, or radar-based cruise control, which automatically slows down and then resumes your previous speed depending on what is ahead of you, are hallmarks of Level 1 vehicles.
  3. Level 2 or Partial Automation– This denotes cars that can drive completely on their own in specific situations by both steering and handling acceleration, instead of just one or the other. If your car has an advanced adaptive cruise control system that can nudge the car left and right, or a feature that can squeeze it into a tight parallel parking spot, you have a Level 2 car. Even so, the driver always has to stay attentive and ready to intervene.

Levels 3, 4 and 5 are where the automated driving system monitors the driving environment.


4. Level 3 or Conditional Automation– These cars can drive themselves on ordinary roads to a destination entered by the driver. Level 3 cars do not have a meaningful fail-safe: they still require a driver who can instantaneously take over if anything goes wrong.

5. Level 4 or High Automation– Like Level 3, this level uses a human driver as a fail-safe, but the system is much more intelligent and can handle much trickier situations. Level 4 cars can be designed to carry out a specific task fully or to drive within a limited geographic area, though they still require a human driver for difficult situations like a complicated merge on a busy highway. The biggest difference from Level 3 is that if a Level 4 car encounters a situation outside its limits, it aborts the drive safely, parking and stopping until the human takes over completely. Google's Waymo cars are currently Level 4: they can operate without anyone inside, but only within a limited geographical area.

6. Level 5 or Full Automation– The holy grail of completely hands-off driving is Level 5: a car that is always in complete control of itself, with zero intervention from the driver ever necessary or even possible. Many Level 5 concept cars do not even have a steering wheel. A Level 5 car should be able to handle any condition a human could, with more safety and precision. Many companies are working on Level 5 cars, but it is not clear when they will become a reality; significant legal hurdles and government regulations still need to be cleared.


Working of an Autonomous Car

An autonomous car can be divided into a few core components:

Computer Vision– This module uses colour, edges and gradients to find lanes on the road, while a deep neural network trained on large amounts of data draws bounding boxes around the other vehicles. Deep neural networks are a branch of machine learning, or artificial intelligence, in which computers learn what objects look like simply by being shown lots and lots of examples: after seeing many cars, the network understands what a car looks like. In this sense the system is quite similar to what human drivers do today, it sees the world through camera images.
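As a toy illustration of the gradient-based part of this pipeline, the sketch below thresholds horizontal intensity gradients to flag candidate lane-marking edges. It is a minimal, hypothetical example using NumPy (the image, threshold and function name are made up, not any production stack):

```python
import numpy as np

def detect_lane_edges(gray, threshold=50.0):
    """Flag strong vertical edges (candidate lane-line borders).

    gray: 2-D array of pixel intensities (0-255).
    Returns a boolean mask where the horizontal gradient exceeds threshold.
    """
    # Lane markings are bright stripes on dark asphalt, so their left and
    # right borders produce large horizontal gradient magnitudes.
    gx = np.abs(np.gradient(gray.astype(float), axis=1))
    return gx > threshold

# Toy image: a dark road with one bright vertical lane stripe in column 5.
road = np.zeros((4, 10))
road[:, 5] = 255.0
mask = detect_lane_edges(road)
# Columns 4 and 6 (the stripe's borders) are flagged; column 5 itself is not,
# because the gradient there is symmetric and cancels out.
```

A real pipeline would follow this with line fitting (e.g. a Hough transform) and the neural-network detector described above.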

Sensor Fusion– With a camera, LiDAR and radar all fitted to an autonomous car, it is often asked why so many sensors are needed. Each sensor has shortcomings that the others compensate for: a camera does not work well in low light, and both LiDAR and the camera perform poorly in foggy weather, whereas radar is largely unaffected. Since a lot of data about a single object can be obtained from different sensors, this information can be combined, or fused, to build a better understanding of the environment.
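A common way to fuse such overlapping readings is inverse-variance weighting, where noisier sensors count for less and the fused estimate is more certain than any single one. The sketch below is a minimal illustration; the sensor values and variances are invented for the example:

```python
def fuse_estimates(measurements):
    """Fuse independent distance estimates by inverse-variance weighting.

    measurements: list of (value, variance) pairs, one per sensor.
    A noisier sensor (larger variance) contributes less to the fused value.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)   # always smaller than the best input
    return fused, fused_var

# Hypothetical range readings of the same obstacle: camera, radar, LiDAR.
readings = [(25.0, 4.0), (24.0, 1.0), (24.2, 0.25)]
distance, variance = fuse_estimates(readings)
# The fused distance sits close to the most trusted (lowest-variance) sensor.
```

This is the same principle that underlies the Kalman filters typically used for fusion in practice.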

Localisation– GPS is only accurate to about 1-2 metres, which is a large error for driving. We therefore need more sophisticated mathematical algorithms and high-definition maps to reduce the localisation error to single-digit centimetres. To achieve this accuracy a particle filter is used, a sophisticated form of triangulation: as the vehicle moves through the real world, it measures its distance from various landmarks, compares those measurements to the map, and uses that to figure out where it is in the world. Landmarks can be street lights, traffic signs, mailboxes or even manhole covers.
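The core weight-and-resample loop of a particle filter can be shown in one dimension. The sketch below is a deliberately simplified toy (one landmark, a signed offset measurement to avoid ambiguity, and made-up positions), not a full localisation stack:

```python
import math
import random

def particle_filter_step(particles, landmark, measured_offset, noise=1.0):
    """One update of a toy 1-D particle filter.

    Each particle is a guess of the car's position. Particles whose predicted
    offset to a known, mapped landmark agrees with the measurement get higher
    weight; resampling then concentrates particles near the true position.
    """
    weights = [math.exp(-((landmark - p) - measured_offset) ** 2
                        / (2 * noise ** 2)) for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
landmark = 50.0                               # mapped landmark (e.g. a sign)
truth = 40.0                                  # true, unknown car position
particles = [random.uniform(0, 100) for _ in range(500)]
for _ in range(10):
    particles = particle_filter_step(particles, landmark, landmark - truth)
estimate = sum(particles) / len(particles)    # converges near 40
```

A real filter adds a motion model and process noise between updates, and uses many landmarks in two or three dimensions.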

Path Planning– Once we know where we are in the world and what the world around us looks like, the next step is to chart a path to where we want to go; this is path planning. The self-driving car predicts where the other vehicles are going and decides what manoeuvre it should take in response. It then builds a series of waypoints for the car to follow, generating different sets of points depending on the decision to be taken, such as turning right or left, or slowing down.
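The waypoint representation itself is simple to sketch. The example below just interpolates a straight line of evenly spaced points for a hypothetical lane change; real planners bend these points around predicted positions of other vehicles:

```python
def build_waypoints(start, goal, n=5):
    """Evenly spaced (x, y) waypoints from start to goal (straight-line toy).

    n: number of waypoints to emit, ending exactly at the goal.
    """
    sx, sy = start
    gx, gy = goal
    return [(sx + (gx - sx) * i / n, sy + (gy - sy) * i / n)
            for i in range(1, n + 1)]

# A gentle left lane change: 50 m forward, 3.5 m to the left.
path = build_waypoints((0.0, 0.0), (50.0, 3.5))
# The controller (next section) is fed these points one by one.
```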

Control– The final step in the pipeline, once all the above is known, is control: how we turn the steering wheel, hit the throttle and apply the brakes in order to execute the trajectory built during path planning.
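A classic way to implement this step is a PID loop on the cross-track error (the car's lateral offset from the planned path). The article does not specify a controller, so the sketch below is a generic illustration with made-up gains and a deliberately crude vehicle model:

```python
class PIDController:
    """Minimal PID loop: turns a cross-track error into a steering command."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt                       # accumulated error
        derivative = (error - self.prev_error) / dt       # rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy vehicle: steering output directly nudges lateral position each tick.
pid = PIDController(kp=0.5, ki=0.01, kd=0.1)
position, target = 2.0, 0.0            # start 2 m off the lane centre
for _ in range(50):
    steering = pid.step(target - position, dt=0.1)
    position += steering * 0.1
# position has been driven most of the way back toward the centre line
```

Real vehicles use a proper dynamics model and often more advanced controllers (e.g. model-predictive control), but the error-feedback idea is the same.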

Sensors in Autonomous Cars

  1. Ultrasonic Sensors– Ultrasonic sensors are used in back-up warning and advanced driver assistance systems (ADAS). They emit ultrasonic sound waves and use the time taken by the wave to return, the time of flight, to calculate distance. They detect nearby obstacles such as another vehicle, a pedestrian or a signboard.

d = ½ · c · t

c = c0 + 0.6 · T

c0 = 331 m/s

where T is the temperature in degrees Celsius, c is the speed of sound and d is the distance of the object.
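These relations translate directly into code. The sketch below is a minimal illustration of the formulas above; the 10 ms round trip and the 20 °C default are made-up example values:

```python
def speed_of_sound(temp_c):
    """c = c0 + 0.6 * T, with c0 = 331 m/s and T in degrees Celsius."""
    return 331.0 + 0.6 * temp_c

def echo_distance(round_trip_s, temp_c=20.0):
    """Distance from an ultrasonic echo: d = 1/2 * c * t.

    The factor 1/2 accounts for the pulse travelling out and back.
    """
    return 0.5 * speed_of_sound(temp_c) * round_trip_s

# A 10 ms round trip at 20 C (c = 343 m/s) corresponds to 1.715 m.
d = echo_distance(0.010)
```

Note how the temperature term matters: ignoring it at, say, 40 °C would bias every reading by a few percent, which is why calibrated sensors compensate for it.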

Advantages

  • Cost-effective
  • The colour and transparency of the object do not matter when using an ultrasonic sensor
  • Dark environment does not affect the reading of the sensor
  • It is great for mid-range detection of objects
  • Easy interface with controllers

Limitations

  • Objects covered in very soft fabric absorb more sound waves, making it hard for the sensor to see the target.
  • The accuracy of ultrasonic sensors varies with temperature; a change of 5-10 degrees affects sensitivity, although there are now sensors calibrated for immunity to voltage and temperature variations.
  • The maximum sensing range is about 10 metres, a great disadvantage for long-range sensing.


  2. Image Sensors– These capture images of the car's surroundings and present them to the driver and the driving software. Stereo cameras can produce a three-dimensional image for long-range detection.

Advantages

  • Can detect colour and text (e.g. on signs)
  • Cost Effective
  • Can act as a backup system
  • Increases the security of the vehicle

Limitations

  • Massive amounts of data must be processed for each image
  • Computationally complex algorithms
  • Weather conditions reduce accuracy
  • The camera's reach needs to improve for anticipatory driving
  • Recognition algorithms still need improvement


  3. Radar Sensors– Radar stands for Radio Detection and Ranging. Radar sensors emit radio waves which, when reflected from an obstacle, indicate the velocity, distance and angle of the approaching object.
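The velocity measurement comes from the Doppler effect: a reflected wave returns with a frequency shift proportional to the target's radial speed. The sketch below illustrates the standard relation v = f_d · c / (2 · f0); the 5137 Hz shift is a made-up example value:

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_velocity(doppler_shift_hz, carrier_hz=77e9):
    """Radial (closing) speed of a target from its Doppler shift.

    The reflection doubles the shift, hence the factor 2 in the denominator.
    carrier_hz defaults to the 77 GHz automotive radar band mentioned above.
    """
    return doppler_shift_hz * C / (2 * carrier_hz)

# A shift of about 5.1 kHz on a 77 GHz radar is roughly a 10 m/s
# (about 36 km/h) closing speed.
v = doppler_velocity(5137.0)
```

This is why radar gives speed essentially for free, while cameras and LiDAR must infer it by differencing positions over time.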

Advantages

  • Works in all weather conditions
  • High accuracy for speed, distance and angular-resolution measurements (77 GHz radar)
  • Lower interference problems (24 GHz radar)
  • Power loss can be controlled

Limitations

  • Expensive
  • 2-D radars cannot measure the height of an object


  4. LiDAR Sensors– LiDAR stands for Light Detection and Ranging. It scans the environment with invisible laser beams and creates a three-dimensional map of the surroundings, which is used for obstacle detection.
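Each LiDAR return is a range plus the beam's angles; converting these polar coordinates to Cartesian points and stacking them is what builds the 3-D map described above. A minimal sketch of that conversion (the example values are made up):

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return (range + beam angles) to an (x, y, z) point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return x, y, z

# A return from 20 m straight ahead at road level.
p = lidar_point(20.0, 0.0, 0.0)
# Sweeping azimuth through 360 degrees across many elevation layers and
# collecting these points yields the point cloud used for obstacle detection.
```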

Advantages

  • Fast and accurate data acquisition
  • Not affected by ambient light conditions
  • Can scan more than 100 metres in all directions

Limitations

  • Generates enormous amounts of data
  • Too expensive for OEMs (Original Equipment Manufacturers) to implement cheaply
  • A constant scanning beam must be used rather than a single flash


  5. Cloud– The cloud acts as a dynamic electronic horizon, offering real-time map data that vehicles draw on. It is constantly updated by an intelligent network of vehicles, informing them, for example, of closed lanes and defective traffic lights. It is therefore a type of input in its own right.

Advantages

  • Has the largest detection range of any input
  • Increases safety

Limitations

  • The map is not yet precise enough for highways and rural areas
  • Only works once a sizeable number of cars are connected


Environmental Conditions for Autonomous Cars

Winter– Winter conditions can be hazardous for autonomous driving. According to the FHWA, 24 percent of weather-related accidents occur on snowy, slushy or icy pavement. The following factors affect road accidents in adverse conditions:

  • Wind speed

  • Precipitation
  • Fog
  • Pavement temperature
  • Pavement conditions
  • Standing water

The LiDAR sensors present in these cars are not capable of sensing snow-covered roads.

Rains– LiDAR gives us a detailed three-dimensional picture of the space around the car by sending out numerous infrared beams and measuring how long each beam takes to come back after ricocheting off an object. Severe rain can corrupt this measurement: if a beam hits a drop of water and reflects back, the autonomous driving system will think it is an obstacle and apply the brakes. In heavy rain even the camera cannot provide a clear image. Hence rain is a big hindrance to autonomous driving.

Solar Storms– A solar storm is a large explosion in the sun's atmosphere caused by a disturbance in the sun's magnetic energy. Solar storms usually don't bother us on Earth, but they can affect our satellites in space and thereby massively disrupt GPS. These storms come in a cycle of roughly 11 years; the last occurred in 2014, so the next is predicted around 2025, the same period in which autonomous cars are expected to be in fashion. The makers of autonomous cars should therefore account for solar storms and GPS outages when designing them.

The DARPA Grand Challenge– The first large-scale competition for autonomous ground vehicles, in which vehicles demonstrated various concepts of autonomous driving under competitive, harsh and non-cooperative conditions, was held on March 13, 2004 in the Mojave Desert region of the United States. The vehicles had sets of sensors for detecting obstacles ahead, but the software that made use of the sensing information was not yet ready for the task. Winning required driving at an average speed of 25 mph, yet due to the lack of spatial data (landmarks) the vehicles' paths could not be planned well, and because the range sensors (LiDAR and stereo vision) operated reliably only over a limited distance, the achievable speed was quite slow. Eventually no winner was declared.


Autonomous Cars in Emergency Situations

Imagine you are suffering from a serious medical condition and need to be rushed to hospital. Your fully autonomous car will drive at its regular pace, carefully observing its surroundings, and take you there safely, but you may not have that much time. In such a case, would choosing an autonomous car be a terrible decision?

Makers could introduce an emergency button that makes the vehicle rush in such situations, but the vehicle might then break many traffic rules and endanger the public on the road. Some emergency behaviour can be pre-programmed, but how do you explain the concept of an emergency to an autonomous car without it violating traffic rules or endangering any pedestrian or car on the road? This remains a challenge for the makers to solve.

Joshua Neally, a lawyer and Tesla owner from Springfield, Missouri, uses the semi-autonomous driving system called Autopilot on his Tesla. Interviewed by the online magazine Slate, he said Autopilot drove him 20 miles down a freeway to a hospital after he suffered a potentially fatal blood-vessel blockage in his lung, known as a pulmonary embolism. The hospital was right off the freeway exit, and Neally was able to steer the car the last few metres and check himself into the emergency room, the report said. In this case, the man survived arguably because the vehicle was autonomous.

Waymo's Chrysler Pacifica autonomous car (Source: Wired)

Waymo teaches its autonomous cars to detect emergency vehicles. It uses a minivan with an upgraded sensor suite, including an audio detection system designed in-house and upgraded LiDAR and vision systems capable of seeing emergency vehicles and their flashing lights. Waymo had already taught its cars to detect a siren; now, by pinpointing the direction of the sound, a car can move aside to let the emergency vehicle pass first.


Companies testing Autonomous Cars in harsh conditions

Ford was one of the first companies to test driverless vehicles in snow, on icy roads at MCity, a simulated real-world environment at the University of Michigan. It has tested the cars in varying degrees of winter weather, with pedestrians, cyclists and normal traffic in operation.

French automaker PSA Group partnered with nuTonomy, a Massachusetts startup owned by Delphi spinoff Aptiv, to test self-driving in tropical Singapore, one of the most congested cities in the world. BMW has also applied to test its autonomous cars in China, unarguably the world's largest car market.

Waymo is undoubtedly one of the most advanced companies building autonomous cars. It has been testing vehicles in Texas, Arizona, Washington, Nevada, California and Michigan, in all sorts of conditions, be it sun or snow, and has been conducting cold-weather testing since 2012 around Lake Tahoe. Mapping the roads in these conditions is one of the main challenges Waymo faces.

Waymo then took its cars to Death Valley, CA, one of the hottest places on Earth. Its self-driving Chrysler Pacifica Hybrid minivans are equipped with all the latest sensors, and Waymo concluded that they are road-ready for these extreme conditions: even with full air conditioning running, the cars were able to support themselves in the middle of the desert.

Conclusion

Undoubtedly, by virtue of advances in technology, autonomous cars are going to become more commercially available in the coming years. They still face a social stigma regarding their acceptance, which must be overcome before they can replace ordinary cars at scale, and legal proceedings remain to be settled, though the companies claim the technological issues will be resolved soon. In the long run, autonomous cars will achieve their motive of reducing road mishaps, proving to be a boon for the automobile industry.


This article was written by Aditya and Eeshan. For any correction or guest article, mail us at [email protected]

