Is Tesla’s Autopilot safe?
By Hyunjoo Jin, Mike Spector and David Shepardson
(Reuters) – Robin Geoulla had doubts about the automated driving technology in his Tesla (NASDAQ:TSLA) Model S when he bought the electric car in 2017.
“It was a little scary to, you know, rely on it and to just, you know, sit back and let it drive,” he told a U.S. investigator about Tesla’s Autopilot system, describing his initial feelings about the technology.
Geoulla spoke to the investigator on January 18, 2018, days after his Tesla, with Autopilot engaged, crashed into the back of an unoccupied firetruck parked along a California interstate highway. Reuters could not reach him for additional comment.
Over time, Geoulla came to trust Autopilot to track vehicles in front of him. But he noticed that the system sometimes seemed confused by direct sunlight or by a vehicle changing lanes ahead of him, according to transcripts of his interview with an investigator from the National Transportation Safety Board (NTSB).
He told the investigator that he had been driving into the sun when he ran into the firetruck.
Autopilot’s design allowed Geoulla to disengage from driving during his trip, and his hands were off the wheel for almost the entire period of roughly 30 minutes when the technology was activated, the NTSB found.
The NTSB, which makes safety recommendations but lacks enforcement power, had previously urged regulators at the National Highway Traffic Safety Administration (NHTSA) to examine Autopilot’s shortcomings, its potential for driver misuse and its safety risks following a number of fatal crashes involving the technology.
“The past has shown the focus has been on innovation over safety and I’m hoping we’re at a point where that tide is turning,” the NTSB’s new chair, Jennifer Homendy, told Reuters in an interview. She said there is no comparison between Tesla’s Autopilot and the autopilot systems used in aviation, which involve trained pilots, rules addressing fatigue, and testing for alcohol and drugs.
Tesla did not respond to written questions for this story.
The company says Autopilot is an advanced driver-assistance feature that does not make its cars autonomous. Before drivers can enable Autopilot, Tesla requires them to agree to maintain control of their cars and keep their hands on the steering wheel.
LIMITED VISIBILITY
Geoulla’s 2018 crash is one of 12 accidents involving Autopilot that NHTSA officials are scrutinizing as part of the agency’s farthest-reaching investigation since Tesla Inc introduced the semi-autonomous driving system in 2015.
Most of those crashes occurred after dark or in conditions that limited drivers’ visibility, according to an NHTSA statement, NTSB documents and police reports reviewed by Reuters. That raises questions about Autopilot’s capabilities in challenging driving conditions, autonomous-driving experts say.
“NHTSA’s enforcement and defect authority is broad, and we will act when we detect an unreasonable risk to public safety,” a NHTSA spokesperson said in a statement to Reuters.
Separately, U.S. car safety regulators have sent special teams of crash investigators to review 33 collisions involving Teslas in which 11 people were killed. NHTSA has ruled out Autopilot use in three of those nonfatal crashes.
The current NHTSA investigation https://www.reuters.com/business/autos-transportation/us-opens-formal-safety-probe-into-tesla-autopilot-crashes-2021-08-16 of Autopilot in effect reopens the question of whether the technology is safe. It represents the latest significant challenge for Elon Musk, the Tesla chief executive whose advocacy of driverless cars has helped his company become the world’s most valuable automaker https://www.reuters.com/article/tesla-stocks-int/tesla-market-value-crosses-800-billion-for-the-first-time-idUSKBN29D20B.
Tesla sells customers advanced driver-assistance features, such as automatic lane changing, for up to $10,000, with the promise that its cars will eventually be able to drive themselves using only cameras and sophisticated software. Other carmakers and self-driving companies use more expensive sensing hardware, such as radar and lidar, in addition to cameras.
Musk has said a Tesla with eight cameras will be safer than human drivers. But industry executives and experts say the camera technology’s ability to see can be affected by darkness, sun glare and adverse weather conditions such as rain, snow or fog.
“Today, computer vision is far from perfect,” said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University.
In the first known fatal U.S. crash involving Tesla’s semi-autonomous driving technology, which occurred in 2016 west of Williston, Florida, the company said both the driver and Autopilot failed to see the white side of a tractor trailer against a brightly lit sky. Instead of slowing down, the Tesla collided with the 18-wheel truck.
DRIVER MISUSE, FAILED BRAKING
NHTSA in January 2017 closed an investigation of Autopilot stemming from that fatal crash, finding no defect in Autopilot’s performance, after some contentious exchanges with Tesla officials, according to documents reviewed by Reuters.
As part of that investigation, NHTSA in December 2016 asked Tesla to provide details of the company’s response to any safety concerns raised about Autopilot, including the potential for driver misuse or abuse, according to a special order regulators sent to the automaker.
After an NHTSA lawyer found Tesla’s initial response inadequate, the company’s then-general counsel, Todd Maron, tried again. He told regulators the request was “grossly broad” and that it would be impossible to catalogue all the concerns raised during Autopilot’s development, according to documents reviewed by Reuters.
Still, Tesla wanted to work with regulators, Maron said. During Autopilot’s development, company employees or contractors had raised concerns about the potential for unintended or failed braking and acceleration; undesired or failed steering; and certain kinds of misuse and abuse by drivers, all of which Tesla addressed, Maron said, without providing further details.
Maron didn’t respond to requests for comment.
How regulators responded is unclear. One former U.S. official said Tesla generally cooperated with the probe and produced requested materials promptly. Regulators closed the investigation just before the inauguration of President Donald Trump, finding that Autopilot performed as designed and that Tesla had taken steps to prevent it from being misused.
LEADERSHIP VACUUM AT NHTSA
NHTSA has been without a Senate-confirmed chief for nearly five years. President Joe Biden has yet to nominate anyone to lead the agency.
NHTSA documents show that regulators want to know how Tesla vehicles attempt to see flashing lights on emergency vehicles, or detect police cars and firetrucks in their path. The agency has sought similar information from 12 other automakers.
“Tesla has been asked to produce and validate data as well as their interpretation of that data,” NHTSA said, adding that it would conduct its own validation and analysis of all the data.
Musk, an electric-car innovator, has worked hard to defend Autopilot against regulators and critics. Tesla has used Autopilot’s ability to update vehicle software over the air to outpace and sidestep the traditional vehicle-recall process.
Musk has repeatedly promoted Autopilot’s capabilities https://www.reuters.com/article/us-tesla-autonomous-factbox/elon-musk-on-teslas-self-driving-capabilities-idUSKCN1RY0QY, sometimes in ways that critics say mislead customers into believing Teslas can drive themselves – despite warnings to the contrary in owner’s manuals that tell drivers to remain engaged and outline the technology’s limitations.
Musk also continues to release what Tesla calls beta, or unfinished, versions of its “Full Self-Driving” software via over-the-air upgrades.
“Some manufacturers are going to do what they want to do to sell a car and it’s up to the government to rein that in,” the NTSB’s Homendy said.