Waymo got its start in 2009 as Google's self-driving car project, which grew into a fleet of sensor-laden vehicles guided by an AI "Sense, Solve, Go" system designed to navigate roads autonomously without human input. According to the company, its robotaxis now experience 91 percent fewer crashes and 91 percent fewer serious injuries than human drivers over the same distances.
But even as Waymo brags about its stats, co-CEO Tekedra Mawakana is already bracing for the inevitable: the first fatality caused by one of its cars. And she thinks society will accept it.
During TechCrunch’s Disrupt summit last week, Mawakana said plainly:
“We really worry as a company about those days. You know, we don’t say ‘whether.’ We say ‘when.’ And we plan for them.”
Ah yes, the classic Silicon Valley pep talk: innovate, disrupt, and maybe kill someone along the way.
The comment was startlingly blunt, but the logic is tough to dispute: if Waymo’s cars are truly safer than human drivers, then statistically, fewer people die overall. Squint a little, take a shot of tequila, and suddenly the moral math looks almost reasonable, body count and all.
When asked if the public is ready to face a Waymo-caused death, Mawakana replied:
“I think that society will. I think the challenge for us is making sure that society has a high enough bar on safety that companies are held to.”
Ah, yes, nothing like a little moral outsourcing to get the public on board.
Still, self-driving cars live in a regulatory gray zone, and no one seems entirely sure how to handle an AI-caused death. Are we ready to forgive a computer for something we’d never forgive a person for?
Waymo’s record looks squeaky clean compared to most competitors, but not spotless.
Between February and August 2025, its cars were involved in 45 reported crashes, according to government filings. Most were minor, and, as Understanding AI noted, “the large majority of these crashes were clearly not Waymo’s fault,” including 24 when the car wasn’t even moving and seven when it was rear-ended.
Still, a few hiccups stood out: three incidents where passengers opened doors into cyclists and scooters, and one car that literally lost a wheel mid-ride. Not ideal, but hey, at least no one’s getting autonomously ejected.
In The Atlantic, University of South Carolina law professor Bryant Walker Smith—an expert on autonomous-vehicle regulation—put it bluntly:
“I like to tell people that if Waymo worked as well as ChatGPT, they’d be dead.”
Good to know the new gold standard for AI is “didn’t kill you.”
But Mawakana has been clear that transparency is key, urging all autonomous-car companies to publish crash data, something Waymo proudly does on its online “safety hub.” That’s a not-so-subtle dig at Tesla and Cruise, whose records are, well, less brag-worthy.
Tesla’s Autopilot has been linked to fatal crashes and multiple lawsuits, while GM’s Cruise imploded last year after one of its robotaxis dragged a pedestrian 20 feet down a San Francisco street, then tried to hide the video from regulators. Nothing says “trust our technology” like obstructing justice.
Waymo, in contrast, has taken a slow-and-steady approach, pausing testing whenever things go sideways. Mawakana told TechCrunch the company “pulls back and retests all the time,” even halting operations when vehicles block emergency responders.
And Mawakana didn’t mince words:
“We need to make sure that the performance is backing what we’re saying we’re doing.”
Reasonable enough—though Waymo still won’t disclose how often its “remote operators” have to step in when the system falters. So maybe “driverless” is doing some heavy lifting there.
Then came Atlanta. Earlier this month, one of Waymo’s robotaxis was caught on video illegally passing a stopped school bus as children were getting off—an offense that usually earns human drivers a $1,000 fine and possibly jail time.
Georgia State Representative Clint Crowe told KGW8 he was stunned:
“I’m a big fan of new technologies and emerging technologies, and I think driverless cars are going to become more prevalent. But we’ve got to think about how they’re going to comply with the law.”
Crowe co-sponsored Addy’s Law, named after 8-year-old Addy Pierce, who was killed crossing to her bus stop in 2024. The law increased penalties for passing stopped school buses—penalties that, Crowe insists, should also apply to autonomous vehicles.
In response, Waymo issued the standard crisis-mode statement:
“The trust and safety of the communities we serve is our top priority. We continuously refine our system’s performance to navigate complex scenarios and are looking into this further.”
Ah, yes, the corporate equivalent of “thoughts and prayers,” now available in self-driving form.
But no one was injured, and the incident perfectly illustrated Mawakana’s point: perfection isn’t possible, and trust isn’t earned through spreadsheets and press releases.
So yes—the driverless future has arrived. Fasten your seatbelts, everybody. The car might not need one, but you definitely still do.