Summary
- Bad weather is a major weakness due to sensor interference, especially for Tesla’s FSD system.
- Geofencing limits self-driving cars to specific areas, hampering universal rollout and scalability.
- Self-driving cars struggle with traffic obstructions, human interaction cues, emergencies, and unpredictable scenarios.
Every couple of years, like clockwork, some tech CEO promises that self-driving cars are just around the corner (looking at you, Elon). But even though we’ve gotten Teslas and Waymos with Level 2 and Level 4 autonomy respectively, Level 5, the Holy Grail of autonomy, is still out of reach. Here’s everything that’s still holding us back.
Bad Weather Is Still a Weakness
Most AVs rely on a combination of cameras, ultrasonic sensors, LiDAR, and radar to “see” the world, but the accuracy of these systems can be compromised in anything other than clear, sunny weather. Rain introduces noise into sensor data, fog reduces camera visibility, and snow covers lane markers, all of which make it harder for autonomous vehicles to navigate the roads safely.
As if these challenges weren’t difficult enough, some companies’ approach to self-driving is particularly susceptible to bad weather. Tesla CEO Elon Musk controversially ditched ultrasonic sensors and LiDAR in favor of a camera-only system for Tesla’s Full Self-Driving (FSD). He called LiDAR a “crutch,” and argued that self-driving cars should rely on cameras and silicon neural networks, just like humans use their eyes and biological neural networks.

The problem is that this approach eliminates redundancy. What happens when visibility is poor, or when the cameras make mistakes? An incident reported by NBC News brought these questions into sharp focus: a Tesla with FSD engaged failed to recognize a passing train in heavy fog, forcing the driver to take control at the last second to avoid a crash.
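To see why redundancy matters, here’s a rough sketch in Python. The sensor names, confidence numbers, and the should_brake logic are invented for illustration and aren’t pulled from any real autonomy stack; the point is simply that with multiple sensor types, radar can catch an obstacle the camera misses in fog, while a camera-only system has nothing to cross-check against.

```python
# Hypothetical sketch of sensor redundancy (not any carmaker's actual system).
# Each sensor reports whether it sees an obstacle and how confident it is;
# confidence degrades in bad weather.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str        # "camera", "radar", or "lidar"
    obstacle: bool     # did this sensor detect something in the vehicle's path?
    confidence: float  # 0.0 (useless, e.g. heavy fog) to 1.0 (clear conditions)

def should_brake(readings: list[Reading], threshold: float = 0.5) -> bool:
    """Brake if any sufficiently confident sensor reports an obstacle."""
    usable = [r for r in readings if r.confidence >= threshold]
    if not usable:
        # No sensor can be trusted at all; a cautious system slows down
        # and hands control back to the driver rather than guessing.
        return True
    return any(r.obstacle for r in usable)

# Foggy night: the camera fails to pick out a train at the crossing.
camera_only = [Reading("camera", False, 0.6)]
fused = [Reading("camera", False, 0.6), Reading("radar", True, 0.9)]

print(should_brake(camera_only))  # False -> nothing to cross-check the camera against
print(should_brake(fused))        # True  -> radar catches what the camera missed
```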
However, Tesla isn’t the only self-driving company struggling with bad weather. Even Waymo, one of the few companies to have achieved Level 4 autonomy, has yet to solve the problem. That’s why, for now, its robotaxis only operate in sunny cities like Phoenix, San Francisco, and Los Angeles. To its credit, Waymo has been testing in snowy locations like New York and Michigan, but whether its system can handle real winter conditions remains to be seen.
Geofencing Limits Their Usefulness
I mentioned earlier that Waymo is one of the few companies to have achieved Level 4 autonomy, and it has managed this largely because of geofencing.
Before Waymo’s robotaxis are rolled out in a new environment, teams with specialized vehicles drive around and scan the area, creating a hyper-detailed map of it. That mapped area becomes the geofence, and it’s within this boundary that the robotaxis are gradually rolled out, first with safety drivers at the wheel and then, as the cars learn the ins and outs of the roads, fully autonomously.
This slow and steady approach has helped Waymo avoid many of the safety incidents that have plagued its competitors, but it also means its robotaxis only work in specific areas that have been mapped and that its cars have been trained on. Scaling this kind of system is hard and expensive, which isn’t exactly promising for a universal rollout of self-driving cars.
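In software terms, the simplest version of a geofence is just a polygon check against the mapped service area. Here’s a minimal sketch where the inside_geofence function and the phoenix_fence coordinates are made up for illustration; Waymo’s actual maps encode far more than a boundary, but the basic idea is that a ride requested outside the polygon simply isn’t served.

```python
# Rough sketch of a geofence check. The polygon and coordinates are invented;
# real HD maps contain lane geometry, signage, and much more than a boundary.

def inside_geofence(lat: float, lon: float, fence: list[tuple[float, float]]) -> bool:
    """Ray-casting point-in-polygon test: is (lat, lon) inside the mapped service area?"""
    inside = False
    n = len(fence)
    for i in range(n):
        lat1, lon1 = fence[i]
        lat2, lon2 = fence[(i + 1) % n]
        # Count how many polygon edges a ray cast from the point crosses.
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < crossing_lat:
                inside = not inside
    return inside

# A made-up rectangular service area roughly around central Phoenix.
phoenix_fence = [(33.35, -112.15), (33.35, -111.95), (33.55, -111.95), (33.55, -112.15)]

print(inside_geofence(33.45, -112.05, phoenix_fence))  # True  -> inside the mapped area
print(inside_geofence(34.05, -118.25, phoenix_fence))  # False -> outside, no ride for you
```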
When in Doubt, They Stop—Even in the Worst Situations
Another weakness I’ve noticed with self-driving cars is that when they get confused, whether by unusual road layouts, unpredictable human drivers, or unexpected obstacles, their usual response is to stop and reassess. But sometimes, stopping is the worst thing they can do.
You’ve probably seen those YouTube videos of self-driving cars frozen dead in the middle of the road. In one, a Waymo attempting a U-turn encountered a motorcade and promptly froze mid-maneuver. It eventually had to be driven away manually by a police officer.
These kinds of traffic obstructions are annoying, but they are not even the real challenge. What happens when a self-driving vehicle gets confused and stops in a dangerous situation?
In one incident shared on YouTube, a Tesla with FSD encountered a railroad crossing immediately followed by a stop sign. The car spotted the stop sign too late and came to a halt right on the tracks. Fortunately, there wasn’t an incoming train, but these kinds of situations highlight the risk.
Beyond traffic hazards, this habit of stopping can make self-driving cars and their passengers easy targets. In one slightly disturbing case, two men stood in front of a robotaxi at a red light and harassed the passenger inside. Because the car detected an obstacle, it refused to move—even though the situation could have quickly turned dangerous.
They Struggle With Human Interaction
Driving isn’t just about following traffic lights and road signs—there’s a layer of subtle, unspoken communication between drivers. Whether it’s a quick wave to let someone merge or a nod at an intersection, human drivers rely on these cues to navigate smoothly. Self-driving cars, however, don’t always pick up on them.
Hand signals are a prime example. Road workers sometimes use them to direct traffic, and even though self-driving cars have been trained to recognize them, recognition isn’t perfect. A video posted on Reddit shows a Waymo robotaxi struggling to understand directions from a road worker, hesitating and failing to react properly.
Self-Driving Cars Have No Situational Awareness
Sometimes, self-driving cars struggle with basic common-sense situations that a human driver would instinctively understand.
Take accidents, for example. A human driver knows to go around the scene, but in a video posted to Reddit, a Waymo drove straight through an accident scene instead of rerouting like the other cars around it.
In another instance, multiple Waymos were caught on camera plowing full speed through a massive pothole, completely ignoring a public works vehicle trying to block off the area.
Then there’s how they handle emergency vehicles. A human driver knows they should yield to ambulances and fire trucks, but autonomous cars often lack that awareness. In 2023, the San Francisco Fire Department reported that self-driving vehicles had interfered with emergency responders at least 66 times. Some of the worst offenses include:
- Driving through yellow emergency tape and ignoring warning signs.
- Blocking fire station driveways.
- Stopping motionless on a one-way street, forcing a firetruck to back up and take a longer route to a burning building.
That’s not a technology ready for prime time, if you ask me.
There Are Edge Cases That We Can’t Predict Until They Happen
Perhaps the biggest issue with self-driving cars is that they can’t handle unusual scenarios—especially ones they’ve never been trained on. They can learn from data, but what happens when they encounter a situation no one thought to prepare for?
The 2023 Cruise accident highlighted this problem. In that incident, a pedestrian was hit by a human-driven car and knocked into the path of a Cruise robotaxi. The robotaxi couldn’t avoid hitting her, but it made a bad situation even worse by attempting to pull over, dragging the pedestrian 20 feet at nearly 8 mph before finally stopping.
An investigation into the cause of this behavior revealed that when the woman was knocked into the path of the AV, she fell, so only her legs were visible to the car. Because she was partially out of sight, the car failed to classify her as a human being and ended up doing further damage when it attempted to pull over.
This kind of edge case is nearly impossible to predict, and yet countless others we haven’t even considered are waiting out there. It’s like black ice on the road: you don’t know it’s there until you’re skidding, and by then, it might be too late.
Self-driving cars have made great strides, but they still have major flaws that make full autonomy an elusive goal. Until they can handle bad weather, unpredictable humans, and the unexpected, we’ll have to keep our hands on the wheel—whether we like it or not.