Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
Can this be solved with just cameras, or would this need additional hardware? I know they removed LIDAR, but I thought that would only be effective at short range and wouldn’t be too helpful at 65 km/h.
Theoretically yes, but in reality, not with current technology.
LIDAR actually has quite a long range. You can look up some of the images LIDAR creates; they’re pretty comprehensive.
Teslas never had LIDAR. They did have ultrasonic sensors and radar before they went to this vision-only crap.
If for some bizarre reason you wanted to stick with cameras only, you could use two cameras and calculate the distance to various points based on the difference between the images. That’s called stereoscopy, and it’s precisely what gives our brains depth perception. The issue is that this process is computationally expensive, so I’d guess it would be cheaper to go back to lidar.
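If it helps, the core geometry is simple enough to sketch. This is just the textbook disparity-to-depth relationship with made-up numbers, not anything specific to Tesla’s cameras; real systems do calibrated, dense matching across whole images, which is where the computational cost comes in:

```python
# Rough sketch of stereo depth: two cameras a known distance apart ("baseline")
# see the same point at slightly different horizontal pixel positions ("disparity").
# Depth falls out of similar triangles: depth = focal_length * baseline / disparity.
# All numbers below are illustrative, not any real camera's specs.

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance (in meters) to a point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("Point must appear shifted between the two images")
    return focal_length_px * baseline_m / disparity_px

# Example: ~1000 px focal length, cameras 30 cm apart, 5 px disparity -> 60 m away.
print(stereo_depth(1000.0, 0.30, 5.0))  # 60.0

# The expense comes from repeating this matching for every pixel across two
# full-resolution images, many times per second.
```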
Subaru does that!
Theoretically, yes. A human would be smart enough not to drive right into a painted wall, using only their eyeballs combined with their intelligence and sense of self-preservation. A smart enough vision system should be able to do the same.
Using something like LIDAR to directly sense obstacles would be a lot more practical and reliable. LIDAR certainly has enough range (airplanes use it too), though I don’t know about the systems Tesla used specifically.
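For what it’s worth, the ranging part of lidar is just timing a laser pulse’s round trip. A minimal sketch, with purely illustrative numbers rather than any real sensor’s spec:

```python
# Crude time-of-flight sketch: a lidar fires a laser pulse and times the echo.
# Distance is half the round trip multiplied by the speed of light.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A return after ~1 microsecond puts the obstacle roughly 150 m away,
# which is plenty of range to stop from highway speed.
print(lidar_distance_m(1e-6))  # ~149.9
```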
As I understand it, this is uncommon and mostly used for topographic mapping.
Most commercial aircraft use a radar, augmented with a GPS-based terrain map, for their ground proximity warning (EGPWS, “Enhanced Ground Proximity Warning System”).
I could be wrong though, I’m not a pilot.
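If it helps, the look-ahead idea is simple to sketch. Everything here (the toy terrain table, the threshold, the function name) is made up for illustration; real EGPWS logic is far more involved:

```python
# Very rough sketch of the EGPWS "look-ahead" idea: compare the aircraft's
# GPS position and altitude against a terrain elevation database along the
# projected path, and warn if the predicted clearance gets too small.

TERRAIN_ELEVATION_M = {  # toy terrain database keyed by (lat, lon) grid cell
    (47, 8): 500.0,
    (47, 9): 2200.0,
}

def terrain_warning(lat: int, lon: int, altitude_m: float,
                    min_clearance_m: float = 300.0) -> bool:
    """True if predicted clearance over the terrain ahead is below the threshold."""
    terrain = TERRAIN_ELEVATION_M.get((lat, lon), 0.0)
    return (altitude_m - terrain) < min_clearance_m

print(terrain_warning(47, 9, 2400.0))  # True: only 200 m above the ridge ahead
```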
Good question. I don’t know if they’ll succeed, but they have a point: humans do it with just vision, so why can’t AI do at least as well? We’ll see. I’m happy someone is trying a different approach. Maybe lidar is necessary, but until someone succeeds we won’t know the best approach, so let’s be glad there’s at least one competing attempt.
I gave it a try once and it was pretty amazing, but clearly not ready. Tesla is fantastic at “normal” driving, but the trial gave me a real appreciation of how driving is all edge cases. At this point I’m no longer confident that anyone will solve the problem adequately for general use.
Plus, there will be accidents. No matter how optimistic you may be, it will never be perfect. Are they ready for the liability and the reputation hit? Can any company survive that, even if they are demonstrably better than a human driver?
It works pretty well as a highway assist. I never use it on city streets because it’s so slow and hesitant there, which is worse.
For me the biggest downside was really poor road maintenance: lines worn off, long cracks that could be interpreted as lines, offset intersections with no lines where you can’t go straight across … or, at night, not enough cleared space, so the side camera decides it’s obscured.
I have this one really narrow, winding road that too many humans have trouble with. I really wanted to see what it would do, but decided there wasn’t enough room for me to take over if I needed to.