In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also striking a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
Painted wall? That’s high tech shit.
I got a Tesla from my work before Elon went full Reich 3, and try this:
To which I’ll add:
Say what you want about European cars, at least they got usability and integration right. As did most of the auto industry. Fuck Tesla, never again. Bunch of Steve Jobs wannabes.
This. If the so-called Tesla fans even drive the car, they know all of the above is more or less true. Newer cars have fewer of these issues, but the camera-based Autopilot system is still in place. The car doesn't even allow you to use cruise control under certain circumstances, because the car deems visibility too poor. The camera also only detects rain when it's pouring; in every other situation it will just randomly engage/disengage.
I drive a Tesla Model 3 (2024) daily and I wouldn’t trust the car driving itself towards a picture like that. It would be an interesting experiment to have these “Tesla Fans” do the same experiment and use a concrete wall for some additional fun. I bet they won’t even conduct the experiment, because they know the car won’t detect the wall.
It’s brake, the car brakes.
It probably breaks as well, but that’s not relevant right now.
I read that in Leslie Nielsen’s voice.
Walk on by. I break down and cry.
It’s brake, not break
In this case it might be both
Frunk is short for front trunk. The mp3 issues mostly go away if you pay for LTE on the car. The rest of the issues I can attest to. Especially randomly changing the cruise control speed on a highway because Google Maps says so, I guess? Just hard braking at high speeds for no fucking reason.
Our Mazda 3’s adaptive cruise thought a car that was exiting was in our lane and hit the brakes, right in front of a car I had just passed. Sorry, dude, I made the mistake of trusting the machine.
Incidents like that made me realize how far we have to go before self driving is a thing. Before we got that car, I thought it was just around the corner, but now I see all the situations that car identifies incorrectly, and it’s like, yeah, we’re going to be driving ourselves for a long time.
Tbh false stopping is a lot better than driving over children by mistake