First ever fatal crash in a Tesla Model S fuels scrutiny over how we use automated cars


Ever-smarter cars are handling more of the complex decision-making during our drives, but we’re still struggling to draw the line between human and machine, and to work out the technology’s role when accidents happen. Now the first-ever fatal crash involving a Tesla Model S using Autopilot has pushed those questions to the forefront of the headlines.

40-year-old Joshua Brown was killed on a highway in Williston, Florida, when his Tesla Model S collided with the side of an 18-wheel semi turning left across his lane. The car slid under the trailer, which tore off its roof, then drifted off the side of the highway, hitting two fences and a power pole.

The crash happened on May 7th, but we’re only learning about it now because the National Highway Traffic Safety Administration has just opened an inquiry into Tesla’s Autopilot system.

The driver of the truck, Frank Baressi, told the Associated Press that Brown was “playing Harry Potter on the TV screen” at the time of the crash, and the NHTSA confirmed it found a running DVD player in the car. Tesla’s owner’s manual states that drivers must keep their hands on the wheel at all times when Autopilot is engaged.

Statistically speaking, the accident is also a testament to the increased safety that automated driving systems promise. As Tesla noted in its statement on the crash: “This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”
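To put those quoted figures side by side, here is a quick back-of-the-envelope comparison in Python. The numbers are simply the ones cited above, and the variable names are ours; with only a single Autopilot fatality on record, the comparison is very rough rather than statistically conclusive.

```python
# Rough comparison of the fatality rates quoted above (one fatality per N miles).
# A single Autopilot fatality is far too small a sample for firm conclusions;
# this just restates the headline figures on a common scale.

MILES_PER_FATALITY = {
    "Autopilot (reported)": 130_000_000,
    "US average": 94_000_000,
    "Worldwide average": 60_000_000,
}

for label, miles in MILES_PER_FATALITY.items():
    rate_per_billion_miles = 1e9 / miles  # fatalities per billion miles driven
    print(f"{label}: ~{rate_per_billion_miles:.1f} fatalities per billion miles")
```

On those figures alone, the reported Autopilot rate works out to roughly 7.7 fatalities per billion miles versus about 10.6 for the US average, though the tiny sample size means the gap could easily be noise.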

    Still, this accident raises the question of how we should be using automated driving systems in the first place.

Tesla’s Autopilot is more like the other advanced driver-assistance features we see today than the all-purpose self-driving car we imagine. Just like assisted parking or automatic collision avoidance, it’s meant to be an incremental step toward automation, one that still requires the driver’s full attention at all times. Tesla describes it as a kind of advanced cruise control and explicitly states it’s still in beta.

Photo: wikipedia.com

The NHTSA released a 0-4 scale of autonomy for cars, where 0 means your car is dumb as nails and 4 means it can drive entirely by itself. Tesla’s Autopilot falls somewhere between a 2 and a 3. But one has to wonder: if a car can seemingly drive itself, is it even reasonable to ask a driver to stay alert at all times? If a crash is only seconds away, is there enough time for the driver to realize that the system isn’t going to react and to take over in time?

Companies pursuing self-driving technology are taking different approaches. Google, for example, decided to pursue only a completely driverless car after 2012 test runs in which employees drove prototypes. Even though the employees were asked to pay attention at all times, they were filmed engaging in “silly behavior” behind the wheel; the car’s self-driving ability had lulled them into a false sense of security.

    “Developing a car that can shoulder the entire burden of driving is crucial to safety,” Chris Urmson, director of Google’s self-driving car program, told the United States Senate. “Human drivers can’t always be trusted to dip in and out of the task of driving when the car is encouraging them to sit back and relax.”

Meanwhile, Toyota wants to launch a “guardian angel” system in which the human does most of the driving and the car only takes over when it senses a collision is imminent. Ford Motor is looking at both level 3 and level 4 cars, according to Ken Washington, its vice president of research and advanced engineering.

Still, building level 4 technology that can handle all the complicated real-life scenarios possible on the road is an enormous undertaking, and Tesla has chosen to release a feature that can already improve the quality of the drive today.

Joshua Brown himself was a Tesla enthusiast who posted numerous YouTube videos of himself driving on Autopilot. The former Navy SEAL and technology company owner had even posted one video showing Autopilot narrowly avoiding an accident when a truck driver merged into his lane without looking.

Following the crash, various experts have weighed in on the technology. While opinions on Tesla’s Autopilot varied, there was clear consensus that automation is here to stay and that the focus should be on improving its performance.

“Accidents like this are tragic, but are sadly inevitable. One of the main advantages of driverless cars is reducing the risk of accidents. But driving heavy metal vehicles around at high speed is by its nature potentially dangerous, and risk can never be eliminated completely,” said Slawomir Nasuto, professor of cybernetics at the University of Reading.

Kelly Paik writes about science and technology for Fanvive. When she's not catching up on the latest innovations, she spends her free time painting and roaming to places whose languages she can't speak, because she rather enjoys fumbling through cities and picking things off menus by a process of eeny, meeny, miny, moe.