In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also striking a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • FuglyDuck@lemmy.world · 10 hours ago

    As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

    This has been known.

    They do it so they can evade liability for the crash.

    • fibojoly@sh.itjust.works · edited · 1 hour ago

      That makes so little sense… It detects it’s about to crash, then gives up and lets you sort it out?
      That’s the opposite of my Audi, which detects I’m about to hit something and either gives me a warning or actively hits the brakes if I don’t have time to react.
      If this is true, it’s so fucking evil it’s kind of amazing it ever reached anywhere near prod.

    • Simulation6@sopuli.xyz · 5 hours ago

      If the disengage-to-avoid-legal-consequences feature does exist, you would expect some false positives where it turns off for no apparent reason. I found some with a search, which are attributed to buggy software. Owners are discussing new patches that fix some problems and introduce new ones. None of the incidents caused an accident, so maybe those owners never hit the malicious code path.

    • bazzzzzzz@lemm.ee · 8 hours ago

      Not sure how that helps in evading liability.

      Every Tesla driver would need superhuman reaction speed to respond in 17 frames, roughly 680 ms assuming 25 fps (I didn’t check the recording’s frame rate, but 25 fps is the slowest reasonable), i.e. well under a second.
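As a quick sanity check of the arithmetic above (the 17-frame count and the 25 fps rate are the commenter’s assumptions, not measured values), the handover window works out as follows:

```python
def reaction_window_ms(frames: int, fps: float) -> float:
    """Convert a frame count at a given frame rate to milliseconds."""
    return frames / fps * 1000.0

# 17 frames at 25 fps, per the comment above:
print(reaction_window_ms(17, 25))  # 680.0 ms
```

Commonly cited human brake-reaction times run on the order of a second or more, so a roughly 0.7-second handover leaves essentially no margin to respond.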

      • orcrist@lemm.ee · 6 hours ago

        They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

        And then that creates a discussion about how much time the human driver needs in order to actually solve the problem, or gray areas about who exactly controls what and when, and it complicates the situation enough that maybe Tesla can pay less money for the deaths it is obviously responsible for.

        • jimbolauski@lemm.ee · 3 hours ago

          > They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.

          The plaintiff’s lawyers would say: the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of dangerous decision-making followed by peacing out in the last 0.1.

          • FauxLiving@lemmy.world · 9 minutes ago

            Defense lawyers can make a lot of hay with details like that. Nothing that gets the lawsuit dismissed, but turning the question from “Tesla drove me into a wall” into “how much is each party responsible?” can help reduce settlement amounts (and these things rarely go to trial).

      • FuglyDuck@lemmy.world · 8 hours ago

        It’s not likely to work, but their swapping to human control after the system has determined a crash is imminent isn’t accidental.

        Anything they can do to mire the proceedings, they will do. It’s like how corporations file junk motions to wear plaintiffs down until they give up.