Timeline: Driverless Car Deaths
Driverless driving is real, and so are the mishaps associated with it. This article lays out a timeline of accidents involving these piloted systems. The intent is not to prove that the technology is bad, but to make people aware that they must understand the level of autonomy a vehicle manufacturer actually provides before placing their trust in it. Since Tesla's Autopilot is not considered a Level 3 system, the driver must stay aware of the surroundings and remain responsible for the car's decisions; at this level of autonomy, it is the driver's job to correct the car when it errs.
20th January 2016
Gao Yaning, a 23-year-old man, died after his Tesla Model S crashed into a parked truck. The footage above was captured by the car's dashboard camera; it recorded no images, sounds or jolts suggesting that either the driver or the car hit the brakes before impact. The incident happened on a highway near Handan, a city about 300 miles south of Beijing.
Tesla stated there is no way to determine whether the car was in Autopilot mode during Gao Yaning's accident. Chinese police concluded that Gao was at fault and that the emergency brakes were not applied before the crash. The victim's family, however, believes the car was in Autopilot mode, and Gao's father filed a lawsuit against Tesla and its authorized dealer in Beijing, with the case going to trial on September 20, 2016.
In its statement, Tesla said it "tried repeatedly to work with" Gao's family to determine the cause of the crash, but the family "has not provided us with any additional information that would allow us to do so."
In response, Gao Jubin, the victim's father, said: "The car is still there, and the data can still be extracted. A consumer can't read the data, but Tesla could read the data." "When it was approaching the road sweeper, the car didn't put on the brakes or avoid it. Instead, it crashed right into it," a police officer said in the CCTV report.
Initially, Gao sued Tesla for 1 yuan to raise public awareness, but later increased the claim to 10,000 yuan. The car's manual does tell drivers to stay vigilant on the road because Autopilot can have trouble identifying obstacles, especially on highways, which suggests Gao was not following the protocols listed.
This weakens the accusation against Tesla, but the way the company marketed the product raised many questions in China. Tesla issued an apology for the way it promoted the feature and changed the wording on its Chinese website from "self-driving" to "automated assisted driving".
China Central Television (CCTV) reports claim that Autopilot was in use, but no substantial evidence has been provided.
7th May 2016
In Williston, Florida, Joshua Brown died while driving his Tesla, just months after the crash in China. It became the first known fatal accident involving a partly autonomous vehicle, and it raised many questions about the safety of self-driving cars.
A tractor-trailer turning left in front of the Tesla Model S caused the collision. The truck was white and approaching from the opposite direction; against the brightly lit sky, the Model S's sensors failed to detect it, and the two vehicles collided.
A witness to the accident said the driver was watching a "Harry Potter" movie while driving. Tesla says this is not possible, as the infotainment touchscreen does not display video while the car is in use; it is quite possible that Brown was using another device.
Even in Autopilot mode, the company says both hands need to be on the steering wheel. Tesla stated that this was the first fatality in 130 million miles of Autopilot driving. The National Transportation Safety Board report has detailed information on the subject.
Brown's Tesla visually warned him seven times, and audibly warned him six times, to put his hands on the wheel before the crash. The journey lasted 41 minutes, of which the car was in Autopilot mode for 37; during those 37 minutes, Brown had his hands on the wheel for only 25 seconds.
An earlier investigation found no defect in Tesla's Autopilot feature. The driver made no effort to brake, steer or otherwise avert the accident, and the car was travelling at over 70 miles per hour. Two minutes before the crash, Brown had manually increased the vehicle's speed.
After the crash, Tesla updated Autopilot to include a strikeout system: drivers who repeatedly ignore safety warnings risk having Autopilot disabled until the next time they start the car.
18th March 2018
Elaine Herzberg was the first pedestrian killed by an autonomous car. Herzberg was walking a bicycle across a road in Tempe, Arizona when she was struck by an Uber test vehicle. At the time of the crash, the vehicle was operating autonomously with a safety driver in the driver's seat.
The safety driver, Rafaela Vasquez, was clearly seen not concentrating on the road, and charges were later filed against her. That said, she also had very little time to react; Vasquez told police that she had been "monitoring the self-driving system interface".
The Uber car had identified the pedestrian but failed to apply the brakes. It was travelling at 38 mph at the time of the crash. Its lidar and radar sensors detected the pedestrian six seconds before impact, but due to a software flaw the system classified her as a "false positive": an object the car can safely drive over, such as a plastic bag.
The Uber car was a Volvo XC90 SUV. 1.3 seconds before the crash, the vehicle's computer determined that emergency braking was needed, but the car's automatic emergency braking system had been disabled by Uber to avoid an erratic driving experience.
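As a rough sanity check on those numbers, the short calculation below (assuming a typical dry-road emergency deceleration of about 7 m/s², a figure not taken from the investigation) shows just how marginal a 1.3-second warning is at 38 mph:

```python
# Back-of-the-envelope check of the Uber crash figures.
# Assumption (not from the report): full braking at ~7 m/s^2 on dry road.
MPH_TO_MS = 0.44704

speed = 38 * MPH_TO_MS           # ~17.0 m/s at the time of the crash
warning_time = 1.3               # seconds before impact braking was flagged
decel = 7.0                      # assumed emergency deceleration, m/s^2

distance_to_impact = speed * warning_time     # ground covered in 1.3 s
stopping_distance = speed**2 / (2 * decel)    # v^2 / (2a) under full braking

print(f"distance to impact: {distance_to_impact:.1f} m")   # ~22.1 m
print(f"full-stop distance: {stopping_distance:.1f} m")    # ~20.6 m
```

Even under this optimistic assumption, the car covers in those 1.3 seconds roughly the same distance it needs to come to a complete stop, so immediate braking could at best have sharply reduced the impact speed rather than avoided the collision outright.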
Immediately after the crash, Uber suspended testing of all its self-driving car programs in Tempe, San Francisco, Pittsburgh and Toronto, and began an internal safety review. On May 22, Uber stated that it was permanently ending tests in Arizona, a decision affecting about 300 test drivers in the area.
It also emerged that Uber had reduced the number of lidar sensors, crucial hardware for autonomous driving: the car carried just one lidar unit, an HDL 64. In doing so, Uber introduced a blind zone around the perimeter of the SUV that cannot fully detect pedestrians, according to interviews with former employees and Raj Rajkumar, the head of Carnegie Mellon University's transportation center, who has worked on self-driving technology for over a decade.
23rd March 2018
Walter Huang, a 38-year-old Apple engineer, died in a fatal crash on Highway 101 in Mountain View, California. He was driving a Tesla Model X, and Autopilot had been activated prior to the crash.
The crash raised further questions about autonomous driving safety, and his family hired a law firm to explore legal options against the electric automaker. The Model X collided with a concrete barrier, and the fire that followed destroyed the front half of the vehicle.
The condition of the car after the crash
"The driver had received several visual and one audible hands-on warning earlier in the drive," Tesla wrote. "The driver's hands were not detected on the wheel for six seconds prior to the collision." Tesla said the driver had "about five seconds and 150 meters of unobstructed view" of the lane divider before the fatal crash. Tesla has admitted that Autopilot is an imperfect system, but maintains it is still safer than conventional driving.
In the victim's defence, his brother said Huang had complained 7-10 times that the car would swerve toward that same barrier while on Autopilot. Walter took the car to a dealership about the issue, but technicians could not reproduce the behaviour.
Tesla also stated:
“None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”
Tesla has now seen three Autopilot-related deaths and is losing trust in the market; around the time of this crash, the company's shares were down 22%.
Although Tesla maintains that Autopilot is not fully autonomous and that human intervention is required while driving, its advertising does not highlight these caveats. Tesla has a 190-page manual covering Autopilot, but how many consumers will actually read it? Tesla should take great care to advertise the product's limits as prominently as its capabilities.
One thing is clear from all of the tragic deaths above: this technology is still not mature. We are a long way from Level 4/5 autonomy, and until we get there it will always be the driver who must take over when the car fails to interpret its surroundings.