In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car crashing through not only the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • PraiseTheSoup@lemm.ee

    Autopilot shuts itself off just before a crash so Tesla can deny liability. It’s been observed in many real-world accidents before this. Others have said much the same, with sources, in this very thread.

    • melpomenesclevage@lemmy.dbzer0.com

Well yes, but as long as there’s deniability built into my toy, then YOU’RE JUST A BIG DUMB MEANIE-PANTS WHO HATES MY COOL TOYS BECAUSE YOU DON’T HAVE ONE, because there’s no other possible reason to hate a toy this cool.