Are self-driving cars safe, or do they lack the nuanced decision-making of human drivers?
- Recently, a Waymo self-driving car was seen on video blocking an ambulance responding to a mass shooting in Austin.
- This is not an isolated incident, as Waymo vehicles are also being investigated for illegally passing school buses and driving through police activity scenes.
- Despite their advanced technology, these self-driving cars are not perfect and can malfunction in real-world scenarios.
Recent incidents suggest that Waymo’s self-driving cars could have a bigger impact on traffic and safety than some Americans expected.
TMZ has obtained footage of a Waymo self-driving car blocking the path of emergency workers in Austin who were responding to a tragic mass shooting on March 1st.
According to Fox News, Austin police officers arrived on the scene, gained entry to the car, and moved it out of the way into a garage. Did this incident expose a major flaw in the design of self-driving cars, or was it an isolated event?
Does the Waymo incident in Austin point to a larger underlying problem?
Waymo’s recent incident in Austin is not the only trouble the company’s self-driving cars have run into.
On March 3, Reuters reported on a National Transportation Safety Board investigation into a January incident in which a Waymo vehicle was suspected of illegally passing a school bus. According to Reuters, Waymo issued a recall in December 2025 “after Texas officials say it illegally passed school buses at least 19 times since the start of the school year.”
In December 2025, a Waymo driverless vehicle drove through a busy and potentially dangerous Los Angeles police scene, according to an NBC News article. In recent months, there have been multiple reports of Waymo vehicles doing dangerous things that an attentive human driver would likely have avoided.
Some of these incidents, like the recent ambulance issue, require immediate human intervention.
How do Waymo vehicles drive themselves?
Waymo’s self-driving cars “leverage AI to predict the behavior of other road users, leveraging information collected in real-time and experience built with over 20 million miles of real-world driving and over 20 billion miles of simulation,” the company’s website states.
These vehicles use cameras, lidar, imaging radar, and AI-powered perception to understand their surroundings. Waymo cars use this technology to navigate riders to their destinations while anticipating the movements of other drivers and pedestrians.
Like human drivers, Waymo’s vehicles are not foolproof; recent incidents show they can fail in some real-world scenarios.
Where will Waymo operate in 2026?
Waymo operates “24/7” in major cities including San Francisco, Phoenix, Los Angeles, Miami, Orlando, Dallas, Houston, and San Antonio. According to Google’s support page, it will also be available through the Uber app in Austin and Atlanta.
The global autonomous vehicle market is estimated at approximately $364 billion in 2026, according to data from Precedence Research, a global market insight firm. Companies like Waymo and Tesla aim to roll out their robotaxi services nationwide over the next few years.
Waymo and Tesla robotaxi vehicles can learn from data and real-world driving situations to improve over time. However, neither service is immune to software and AI errors that create awkward, illegal, and potentially dangerous situations for passengers and pedestrians on the road.

