Here's Why Everyone's Talking About Tesla's Autopilot System

On May 7, 2016, Joshua Brown died in a car accident while riding in his Tesla Model S. He had engaged the vehicle's Autopilot feature, which activates an expanded set of driver-assist capabilities in Tesla cars. Brown's vehicle was traveling east on U.S. 27A in Florida when a tractor-trailer traveling west on the same highway made a left turn onto another street. Brown's car collided with the trailer as it spanned the eastbound section of the highway. The truck driver did not appear to be injured in the crash.

Neither Brown nor the Model S's Autopilot system engaged the brakes. According to Tesla, the sky was brightly lit and the rig's trailer was white, making it difficult to see. The camera system on the Model S failed to detect the trailer, and the radar system may have misidentified it as an overhead road sign. Tesla's radar typically ignores such signals; otherwise, the vehicle would brake every time an overhead sign came within range.
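To illustrate the trade-off Tesla is describing, here's a purely hypothetical sketch in Python of that kind of filtering logic. The threshold and field names are invented for illustration; this is not Tesla's actual code:

    # Hypothetical sketch: a radar filter that ignores "overhead" returns to
    # avoid false braking under signs can also ignore a high-riding trailer.
    def should_brake_for(radar_return):
        OVERHEAD_CLEARANCE_M = 1.0  # invented cutoff height, for illustration only
        if radar_return["bottom_height_m"] > OVERHEAD_CLEARANCE_M:
            # Looks like something the car can pass under (a sign, a bridge).
            return False
        return True

    sign = {"bottom_height_m": 5.0}     # overhead road sign
    trailer = {"bottom_height_m": 1.2}  # trailer floor riding above the sensor line
    print(should_brake_for(sign))       # False - correctly ignored
    print(should_brake_for(trailer))    # False - incorrectly ignored

The point of the sketch is that a rule tuned to suppress false positives (braking under every sign) can produce a false negative when an obstacle happens to resemble a sign from the sensor's point of view.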

It's a tragic story, and one that has sparked a heated conversation about whether autonomous cars are safe. Is it smart to entrust our safety to uncaring machines? And to what extent is Tesla responsible?

Complicating matters is the fact that Tesla didn't publicly acknowledge the accident until more than a month had gone by. Nine days after the accident occurred, Tesla filed a report with the National Highway Traffic Safety Administration (NHTSA). But the general public didn't hear about it until June 30, 2016.

Between the accident and Tesla's blog post, the company held a shareholder meeting. That meeting included a stock offering slated to raise more than $1.4 billion. Some people, like Fortune journalist Carol J. Loomis, have criticized the company for withholding information about the accident until after the shareholder meeting.

Tesla founder Elon Musk has dismissed this criticism. He pointed out that other automakers aren't expected to acknowledge and discuss every accident involving one of their automobiles. He also claimed that if Tesla's Autopilot had been standard on every vehicle in 2015, it would have saved 500,000 lives.

Tesla representatives say the company has logged more than 130 million miles (209 million kilometers) of Tesla vehicles operating in Autopilot mode. In the United States, there is on average one traffic fatality for every 94 million miles (151 million kilometers) driven; worldwide, the figure is one for every 60 million miles (nearly 97 million kilometers). One fatality in more than 130 million Autopilot miles is a lower rate than either average, and Tesla argues this metric supports the company's position that Autopilot is an important safety feature.
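To make the comparison concrete, here's a quick back-of-the-envelope calculation of the rates being compared, using only the figures quoted above (a sketch of Tesla's argument, not an official analysis):

    # Fatality rates, normalized to fatalities per 100 million miles.
    autopilot_rate = 1 / 130e6  # one fatality in 130 million Autopilot miles
    us_rate = 1 / 94e6          # U.S. average: one per 94 million miles
    world_rate = 1 / 60e6       # worldwide average: one per 60 million miles

    per_100m = 100e6
    print(autopilot_rate * per_100m)  # ~0.77
    print(us_rate * per_100m)         # ~1.06
    print(world_rate * per_100m)      # ~1.67

By this rough measure the Autopilot rate comes out below both averages, though a single fatality is a very small sample from which to draw statistical conclusions.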

On July 1, another accident allegedly involving Tesla's Autopilot happened on the Pennsylvania Turnpike. The driver of a Tesla Model X says he had put the vehicle in Autopilot mode before the car hit a guardrail on the right side of the road, veered across the turnpike and collided with a concrete median on the left. As of this writing, few details about the accident are publicly available.

Although some critics may say these accidents are examples of why driverless cars aren't yet ready to hit the streets, Tesla notes that it doesn't consider Autopilot to be the same thing as an autonomous vehicle. In fact, to engage Autopilot you must first acknowledge a message from Tesla stating that drivers should keep their hands on the steering wheel at all times and be ready to take full control of the car. Autopilot, says Tesla, is a driver-assist feature, not a replacement for a human driver.

It's also important to remember that the feature is in beta, meaning it's not a finished product. Tesla collects data from vehicles operating in Autopilot to analyze and refine the system. In other words, this is a feature that's supposed to improve the more people use it. It might one day lead to a fully autonomous car, but according to Tesla, that's not what it's meant to do today.

In general, driver-assist features and autonomous cars can react faster and more consistently to emergency situations than human drivers. Joshua Brown's accident was tragic, but it's also an outlier. It isn't necessarily indicative of an inherently unsafe approach to operating a vehicle.

It's likely we'll see a lot more conversation and debate in the autonomous car space moving forward, even as Tesla protests that Autopilot isn't really an autonomous car mode. And while statistics might back up Tesla's claims that these features really do contribute to driver safety, the emotional weight of Joshua Brown's story will still be a factor.
