
When Self-Driving Cars Break the Law, Who’s at Fault?

Kevin Feather · 5 Min Read

Article overview

  • Recent incidents show that autonomous vehicles are not immune to traffic violations, ranging from running red lights to failing to yield to pedestrians.
  • Determining who is at fault when an AV breaks the law is complex, involving the operator, manufacturer, and software developer.
  • Laws and regulations are evolving to address these challenges, with a future focus likely shifting liability toward manufacturers as technology continues to advance.


The future isn’t quite here yet

While self-driving cars are on the horizon, today's vehicles still need trusted care from a human expert. AAA is here to help. We can connect you with a nearby AAA-owned Car Care location or a AAA Approved Auto Repair facility. Members enjoy a 10% discount on repair labor (up to $75) and receive a robust warranty on repairs: 36 months/36,000 miles at AAA-owned locations and 24 months/24,000 miles at approved facilities.

Find a Facility

Autonomous vehicles: Are they safe or are they creating new challenges?

Autonomous vehicles (AVs) have long been presented as the future of transportation. This technological leap is designed to make our roads safer by reducing human error and improving traffic flow. For years, companies have worked to create a world where accidents are a thing of the past.

However, recent headlines tell a different story. Reports show self-driving cars committing traffic violations, from failing to yield to pedestrians to navigating intersections incorrectly. These incidents raise an important question: Are autonomous vehicles living up to their promise of enhanced safety, or are they introducing new challenges to our roads?

Let's explore why these advanced systems sometimes break the rules, compare their performance with that of human drivers and discuss the path forward for improving their reliability.

Recent traffic violations by autonomous vehicles

While AVs promise a safer future, an increasing number of real-world incidents show they are not immune to breaking traffic laws.


Notable incidents of AVs breaking traffic laws

  • Running a red light: In one attention-grabbing case, a Waymo taxi drove through a red light after receiving a remote command. A moped rider nearby reacted, lost control and slid. Thankfully, no one was seriously hurt. Waymo explained that the remote operator missed the red light. This example demonstrates the challenges of blending human input with autonomous systems.
  • Failing to yield to pedestrians: A pedestrian in San Francisco reported a more alarming behavior. Over one week, they documented more than a dozen Waymo vehicles that failed to stop for them at marked crosswalks. These near misses highlight potentially dangerous decision-making in the vehicle's systems.
  • Wrong-lane and U-turn violations:
      ◦ In Phoenix, a Waymo vehicle was pulled over by police after it reportedly ran a red light and briefly drove into oncoming traffic.
      ◦ In San Bruno, California, officers stopped a driverless Waymo that performed an illegal U-turn. Because there was no human driver, they could not issue a citation.
      ◦ Waymo is also under federal investigation after a vehicle reportedly failed to yield to a stopped school bus. The AV initially stopped but then maneuvered around the front of the bus, even though the bus's red lights were flashing and its stop sign was out.


Why are autonomous vehicles making mistakes?

Understanding why these vehicles make errors is key to improving them. The issues often come down to a few core challenges:

  • Sensor limitations: AVs rely on sensors like cameras and radar to "see" the world. But these can be limited in complex environments, such as detecting a pedestrian partially hidden by another car.
  • Software flaws: The software that makes driving decisions is incredibly complex. Flaws in the code can lead to incorrect actions, like misinterpreting a traffic sign or a police officer's hand signals.
  • Human oversight: Many AVs still rely on some level of human monitoring. As the red-light incident above shows, errors can creep in when an operator's attention or communication breaks down and the vehicle is sent an incorrect command. A simplified sketch of how these failure modes can combine appears after this list.
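
To make these failure modes concrete, here is a minimal, hypothetical Python sketch of a planner's decision step where each one can surface. Every name in it (SensorReading, plan_action, the 0.5 confidence threshold) is invented for illustration and does not reflect any manufacturer's actual software.

```python
# Hypothetical illustration of how sensor limits, software thresholds and
# human overrides can each lead to a traffic violation. All names and
# values are invented; this is not real AV code.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    PROCEED = auto()
    YIELD_TO_PEDESTRIAN = auto()
    STOP_FOR_LIGHT = auto()


@dataclass
class SensorReading:
    pedestrian_confidence: float  # 0.0-1.0; drops when a pedestrian is partly hidden
    light_is_red: bool


PEDESTRIAN_THRESHOLD = 0.5  # assumed tuning value; set too high, pedestrians get missed


def plan_action(reading: SensorReading, remote_command: str = "") -> Action:
    # Human oversight failure: a remote operator's command overrides perception,
    # so an operator who misses a red light sends the car through it.
    if remote_command == "proceed":
        return Action.PROCEED
    # Sensor limitation meets software flaw: a partly occluded pedestrian can
    # score below the threshold and be treated as absent.
    if reading.pedestrian_confidence >= PEDESTRIAN_THRESHOLD:
        return Action.YIELD_TO_PEDESTRIAN
    if reading.light_is_red:
        return Action.STOP_FOR_LIGHT
    return Action.PROCEED


# A pedestrian half-hidden by a parked car scores 0.4, so the planner fails
# to yield, mirroring the crosswalk reports above.
print(plan_action(SensorReading(pedestrian_confidence=0.4, light_is_red=False)))

# A mistaken remote "proceed" overrides a red light, like the moped incident.
print(plan_action(SensorReading(pedestrian_confidence=0.0, light_is_red=True), "proceed"))
```

None of these failure modes requires recklessness; each is a plausible edge case in ordinary operation, which is why testing and regulation focus on exactly these interactions.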

Legal and regulatory challenges

These incidents have sparked broad discussions about accountability, safety, and how to integrate driverless technology onto our roads. As AVs become more common, state and local governments must update their traffic enforcement frameworks.

Current laws for self-driving vehicles

Laws for self-driving vehicles vary by state. Generally, they require AVs to comply with existing traffic laws, meet federal safety standards, and have a system to alert a human driver or bring the vehicle to a safe stop if the technology fails.

The role of governments in managing AV operations

  • The National Highway Traffic Safety Administration (NHTSA) has launched investigations into Waymo's software after receiving reports of vehicles driving into construction zones, failing to follow traffic laws, or misinterpreting road signage.
  • New legislation is coming. In California, for instance, a law set to take effect in July 2026 will let officers issue a “notice of autonomous vehicle noncompliance” directly to the manufacturers instead of individual drivers.

Liability questions: Who is responsible when an AV breaks the law?

When a self-driving car breaks a traffic law or gets into an accident, figuring out who is at fault isn't simple. The responsibility could fall on several different parties, and the laws for these situations are still being written.

To determine who is legally responsible, investigators look at the details of what happened. For example, did a software glitch cause the problem, or was it a hardware failure? Was the human operator not paying attention? The answer depends on the vehicle’s level of automation and on what ultimately caused the incident. This means the fault could lie with:

  • The person in the driver’s seat
  • The company that made the car
  • The developer who created the software

Figuring out responsibility is like solving a puzzle. Investigators start by looking at all the clues. This often means examining the data stored in the vehicle's "black box," which records everything the car's system does. They also review camera footage and check maintenance history. This evidence helps experts and insurance companies determine if the accident was caused by a person's mistake or a problem with the car's technology.

As we get closer to fully self-driving cars that do not need a human operator, the laws will likely change. In the future, responsibility for accidents may shift almost entirely to the companies that make the car or design its software, possibly under a “strict liability” standard: if a defect in the vehicle is proven to have caused an accident, the manufacturer is held responsible.


The future is now

The journey toward fully autonomous vehicles is filled with both exciting possibilities and significant hurdles. While the promise of safer roads remains a powerful motivator, recent incidents show that the technology still has a long way to go.

As companies refine their systems and lawmakers work to create a clear legal framework, the conversation around AV safety and accountability is more important than ever. The path forward requires collaboration between developers, regulators and the public to build a future where we can truly trust our vehicles to follow the rules of the road.

Can we trust autonomous vehicles to follow the rules of the road, or will they always need a human hand to guide them?


Frequently asked questions about autonomous vehicles and safety

What is an autonomous vehicle?

An autonomous vehicle, also known as a self-driving car, is a vehicle that can guide itself without human intervention. It uses a combination of sensors, cameras, radar, and artificial intelligence (AI) to travel between destinations.

Are self-driving cars legal to operate on public roads?

Yes, but with restrictions. Laws vary by state, but many allow the testing and operation of autonomous vehicles on public roads. Most require a human driver to be present and ready to take control.

Who gets the ticket when a self-driving car breaks a traffic law?

This is a complex and evolving area of law. Currently, if a human is in the driver's seat and expected to be monitoring the vehicle, they could be held responsible. However, new laws are being developed to allow citations to be issued directly to the vehicle manufacturer or operator company.

Are autonomous vehicles safer than human drivers?

Proponents argue that AVs have the potential to be much safer because they eliminate human errors like distracted or drowsy driving, which cause most accidents. However, because the technology is still developing, AVs can make mistakes in unpredictable situations that a human driver might handle better.

What happens if an autonomous vehicle's technology fails while driving?

Autonomous vehicles are designed with fail-safes. If the system detects a critical failure, it is supposed to alert the human driver and, if necessary, bring the vehicle to a safe stop in what’s known as a “minimal risk condition.” A simplified sketch of that sequence appears below.
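
This alert-then-stop logic can be pictured as a small state machine. The Python sketch below is purely illustrative: the names (VehicleState, handle_fault) and the 10-second takeover window are invented assumptions, not any manufacturer's actual fail-safe design.

```python
# Hypothetical sketch of the "minimal risk condition" fail-safe described
# above: alert the driver, then stop the car if no one takes over. All names
# and the timeout value are invented for illustration.

from enum import Enum, auto


class VehicleState(Enum):
    DRIVING = auto()
    ALERTING_DRIVER = auto()    # critical fault detected; asking the human to take over
    HUMAN_CONTROL = auto()      # the driver responded and took over
    MINIMAL_RISK_STOP = auto()  # no takeover: pull over and stop safely


TAKEOVER_TIMEOUT_S = 10.0  # assumed grace period for the driver to respond


def handle_fault(seconds_since_alert: float, driver_took_over: bool) -> VehicleState:
    """Decide the vehicle's next state after a critical system fault."""
    if driver_took_over:
        return VehicleState.HUMAN_CONTROL
    if seconds_since_alert < TAKEOVER_TIMEOUT_S:
        return VehicleState.ALERTING_DRIVER  # keep alerting and wait
    # The driver never responded, so the vehicle stops itself.
    return VehicleState.MINIMAL_RISK_STOP


# The driver ignores the alert for 12 seconds; the car executes a safe stop.
print(handle_fault(seconds_since_alert=12.0, driver_took_over=False))
# -> VehicleState.MINIMAL_RISK_STOP
```

In a real vehicle this decision would run continuously alongside many other checks; the point of the sketch is only that the system escalates from alerting the driver to stopping itself when no one takes over.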