Traffic footage taken in the Yerba Buena Island Tunnel of the San Francisco–Oakland Bay Bridge complex showed a previously reported eight-car pile-up that occurred after a Tesla Model S running the “Full Self-Driving” software malfunctioned and abruptly stopped in the fast lane.
KRON 4 reported the accident happened on Thanksgiving Day. The crash injured nine people, including a two-year-old who was hospitalized after suffering a bruise and an abrasion to the rear left side of his head.
The accident footage and photos were obtained by The Intercept via a California Public Records Act request.
It offered the first direct look at what happened from two different perspectives, confirming witness accounts that the Tesla moved into the far left lane at high speed with the flow of traffic and then stopped for no apparent reason.
“Footage released of a ‘Full Self-Driving’ crash in which the Tesla’s ‘left signal & brakes activated,’ & it moved into the left lane, ‘slowing to a stop directly in path of travel.’ The 8 car crash injured 9 people, including a 2yo & blocked traffic on the bridge for over an hour.” — ChudsOfTikTok (@ChudsOfTikTok)
The same day, just hours before the accident, Tesla CEO Elon Musk congratulated Tesla employees on a “major milestone” after announcing the company’s “Full Self-Driving” capability had been made available in North America.
According to the Austin, Texas-based automotive and clean energy company, the “Full Self-Driving” feature became available to over 285,000 Tesla owners in North America.
“Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen, assuming you have bought this option. Congrats to Tesla Autopilot/AI team on achieving a major milestone!” — Elon Musk (@elonmusk)
According to a report, the Tesla Model S driver behind the wheel during the crash told the California Highway Patrol the vehicle was in "Full Self-Driving" mode while traveling at 55 miles per hour.
The driver claimed the vehicle unexpectedly switched into the far-left lane of its own accord and slowed to 20 miles per hour, resulting in the pile-up.
A June 15, 2022 article from the Washington Post reported there had been 273 reported crashes involving Tesla vehicles using its Autopilot software over roughly the past year.
The numbers published by the National Highway Traffic Safety Administration (NHTSA) at the time showed:
"Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems reported since last July, and a majority of the fatalities and serious injuries—some of which date back further than a year."
“‘Teslas accounted for almost 70 percent of 329 crashes in which advanced driver assistance systems were involved, as well as a majority of fatalities and serious injuries associated with them.’ This stat + video are terrifying https://t.co/H6B3C4OgLk” — Eileen Guo (@eileenguo)
The Tesla Autopilot feature allows drivers to cede physical control of their electric vehicles, but requires them to remain actively attentive.
While activated, the automated vehicles are supposed to be able to maintain a safe distance, stay within lanes and make lane changes while sharing the highway with other vehicles.
“I don't even like another person driving me, let alone a damn car driving itself.” — reply to @eileenguo
The “Full Self-Driving” beta is part of an expanded feature of the Autopilot software.
It allows the vehicles to navigate city and residential streets and stop at stop signs.
“Yep....what could go wrong? This. Anyone with sense knows there is no safe way for this technology to be used yet.” — reply to @ChudsOfTikTok
Tesla has drawn scrutiny from transportation safety experts who are concerned about the Autopilot technology's safety because its AI is being tested and trained on public roads alongside other drivers.
“So,....is there just no regulation on self driving technology? Can they cause unlimited accidents from tech failures and everyone just has to deal with it?” — reply to @ChudsOfTikTok
“Phantom breaking at its finest there. It’s even scarier for the driver when the car just stops for nothing” — reply to @eileenguo
Complaints of "unexpected brake activation," or what drivers call "phantom braking," in which Tesla vehicles independently slam on the brakes at high speed, have increased in recent months.
The Washington Post reported roughly 100 such complaints have been filed within a three-month period.
Since 2016, the NHTSA has investigated a total of 35 crashes in which Tesla’s “Full Self-Driving” or “Autopilot” systems were likely in use. Those accidents resulted in a combined 19 deaths.
NHTSA’s administrator, Steven Cliff, expressed caution about the feature.
Cliff said:
“These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations."
“Sounds like it’s time to outlaw these self-driving Teslas until they can work out the bugs.” — reply to @ChudsOfTikTok
As more car companies introduce electric vehicles to their lineups, Musk has endeavored to make Tesla more "compelling" to stay ahead of the tightening competition.
Last June, he proclaimed that focusing on the "Full Self-Driving" technology was "essential," adding:
“It’s really the difference between Tesla being worth a lot of money or worth basically zero.”