Driverless Cars
While self-driving cars are on the horizon, today's vehicles still need trusted care from a human expert. AAA is here to help. We can connect you with a nearby AAA-owned Car Care location or a AAA Approved Auto Repair facility. Members enjoy a 10% discount on repair labor (up to $75) and receive a robust warranty on repairs: 36 months/36,000 miles at AAA-owned locations and 24 months/24,000 miles at approved facilities.
Autonomous vehicles (AVs) have long been presented as the future of transportation. This technological leap is designed to make our roads safer by reducing human error and improving traffic flow. For years, companies have worked to create a world where accidents are a thing of the past.
However, recent headlines tell a different story. Reports show self-driving cars committing traffic violations, from failing to yield to pedestrians to navigating intersections incorrectly. These incidents raise an important question: Are autonomous vehicles living up to their promise of enhanced safety, or are they introducing new challenges to our roads?
Let's explore why these advanced systems sometimes break the rules, compare their performance to human drivers and discuss the path forward for improving their reliability.
While AVs promise a safer future, an increasing number of real-world incidents show they are not immune to breaking traffic laws.
- In Phoenix, a Waymo vehicle was pulled over by police after it reportedly ran a red light and briefly drove into oncoming traffic.
- In San Bruno, California, officers stopped a driverless Waymo that performed an illegal U-turn. Because there was no human driver, they could not issue a citation.
- Waymo is also under federal investigation after a vehicle reportedly failed to yield to a stopped school bus. The AV initially stopped but then maneuvered around the front of the bus, even though the bus's red lights were flashing and its stop sign was out.
Understanding why these vehicles make errors is key to improving them. The issues often come down to a few core challenges, such as interpreting ambiguous road situations and handling rare edge cases the software was never trained on.
These incidents have sparked broad discussions about accountability, safety, and how to integrate driverless technology onto our roads. As AVs become more common, state and local governments must update their traffic enforcement frameworks.
Laws for self-driving vehicles vary by state. Generally, they require AVs to comply with existing traffic laws, meet federal safety standards, and have a system to alert a human driver or bring the vehicle to a safe stop if the technology fails.
When a self-driving car breaks a traffic law or gets into an accident, figuring out who is at fault isn't simple. The responsibility could fall on several different parties, and the laws for these situations are still being written.
To determine who is legally responsible, investigators look at the details of what happened. For example, did a software glitch cause the problem, or was it a hardware failure? Was the human operator not paying attention? The answer depends on the vehicle's level of automation and on what ultimately caused the incident. Depending on those factors, the fault could lie with the manufacturer, the software developer, the human operator, or a third party such as a maintenance provider.
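The reasoning above (automation level plus root cause pointing to a responsible party) can be sketched as a toy decision function. The automation levels follow the common SAE 0-5 scale, but the cause labels and outcomes here are illustrative assumptions, not legal rules:

```python
# Toy sketch: mapping an incident's automation level and root cause to a
# plausibly responsible party. Levels follow the common SAE 0-5 scale;
# the cause labels and outcomes are illustrative, not legal advice.

def likely_responsible_party(automation_level: int, cause: str) -> str:
    """Return an illustrative at-fault party for an incident."""
    if automation_level <= 2:
        # Driver-assistance systems: the human is expected to supervise.
        if cause == "inattentive_operator":
            return "human operator"
        if cause in ("software_glitch", "hardware_failure"):
            return "manufacturer"
        return "undetermined"
    # Highly automated (levels 3-5): responsibility shifts to the maker.
    if cause == "software_glitch":
        return "software developer"
    if cause == "hardware_failure":
        return "manufacturer"
    if cause == "inattentive_operator":
        # At level 3 a human may still be required to take over on request.
        return "human operator" if automation_level == 3 else "manufacturer"
    return "undetermined"
```

In practice this determination is made by investigators and courts from the evidence, not by a lookup table; the sketch only shows how the two factors interact.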
Figuring out responsibility is like solving a puzzle. Investigators start by looking at all the clues. This often means examining the data stored in the vehicle's "black box," which records everything the car's system does. They also review camera footage and check maintenance history. This evidence helps experts and insurance companies determine if the accident was caused by a person's mistake or a problem with the car's technology.
As we get closer to fully self-driving cars that do not need a human operator, the laws will likely change. In the future, responsibility for accidents may shift almost entirely to the companies that make the car or design its software. This could lead to a "strict liability" standard, under which a manufacturer is held responsible whenever a defect in the vehicle is proven to have caused an accident.
The journey toward fully autonomous vehicles is filled with both exciting possibilities and significant hurdles. While the promise of safer roads remains a powerful motivator, recent incidents show that the technology still has a long way to go.
As companies refine their systems and lawmakers work to create a clear legal framework, the conversation around AV safety and accountability is more important than ever. The path forward requires collaboration between developers, regulators and the public to build a future where we can truly trust our vehicles to follow the rules of the road.
Can we trust autonomous vehicles to follow the rules of the road, or will they always need a human hand to guide them?
What is an autonomous vehicle?
An autonomous vehicle, also known as a self-driving car, is a vehicle that can guide itself without human input. It uses a combination of sensors, cameras, radar, and artificial intelligence (AI) to travel between destinations.
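One small piece of the sensor-plus-AI pipeline described above can be illustrated with a minimal sketch: combining distance-to-obstacle estimates from several sensors into a single value, weighted by each sensor's confidence. The sensor names, readings, and confidence weights below are made-up assumptions for illustration, not any vendor's actual algorithm:

```python
# Minimal illustration of sensor fusion: combine distance-to-obstacle
# estimates from several sensors into one value, weighted by confidence.
# Sensor names, readings, and confidences are invented for this sketch.

def fuse_distance(readings: list[tuple[str, float, float]]) -> float:
    """Weighted average of (sensor, distance_m, confidence) readings."""
    total_weight = sum(conf for _, _, conf in readings)
    if total_weight == 0:
        raise ValueError("no usable sensor readings")
    return sum(dist * conf for _, dist, conf in readings) / total_weight

readings = [
    ("camera", 42.0, 0.6),  # vision-based estimate, moderate confidence
    ("radar",  40.0, 0.9),  # radar is typically more reliable at range
    ("lidar",  41.0, 0.8),
]
print(round(fuse_distance(readings), 2))  # prints 40.87
```

Real perception stacks fuse far richer data (object classes, velocities, uncertainty models), but the weighted-combination idea is the same in spirit.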
Are autonomous vehicles legal on public roads?
Yes, but with restrictions. Laws vary by state, but many allow the testing and operation of autonomous vehicles on public roads. Most require a human driver to be present and ready to take control.

Who is responsible when a self-driving car breaks a traffic law?
This is a complex and evolving area of law. Currently, if a human is in the driver's seat and expected to be monitoring the vehicle, they could be held responsible. However, new laws are being developed to allow citations to be issued directly to the vehicle manufacturer or operator company.

Are autonomous vehicles safer than human drivers?
Proponents argue that AVs have the potential to be much safer, since they eliminate human errors like distracted or drowsy driving, which cause most accidents. However, because the technology is still developing, AVs can make mistakes in unpredictable situations that a human driver might handle better.

What happens if an autonomous vehicle's system fails?
Autonomous vehicles are designed with fail-safes. If the system detects a critical failure, it is supposed to alert the human driver and, if necessary, bring the vehicle to a safe stop in what’s known as a “minimal risk condition.”
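The fallback sequence just described (detect a critical failure, alert the driver, then stop safely if no one takes over) can be sketched as a tiny state machine. The state names and the takeover timeout are illustrative assumptions, not values from any real system:

```python
# Toy sketch of the fail-safe sequence: on a critical failure the system
# alerts the human driver, and if no one takes over within a timeout it
# brings the vehicle to a "minimal risk condition" (a safe stop).
# States and the timeout value are illustrative assumptions.

NORMAL, ALERT_DRIVER, MINIMAL_RISK_STOP = "normal", "alert_driver", "minimal_risk_stop"

def next_state(state: str, critical_failure: bool, driver_took_over: bool,
               seconds_since_alert: float, takeover_timeout: float = 10.0) -> str:
    if state == NORMAL:
        return ALERT_DRIVER if critical_failure else NORMAL
    if state == ALERT_DRIVER:
        if driver_took_over:
            return NORMAL  # the human has resumed control
        if seconds_since_alert >= takeover_timeout:
            return MINIMAL_RISK_STOP  # pull over / stop safely
        return ALERT_DRIVER  # keep alerting until takeover or timeout
    return MINIMAL_RISK_STOP  # terminal until the vehicle is serviced
```

The key design point the sketch captures is that the safe stop is a last resort: the system always tries to hand control back to a human first.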