• BlueLineBae@midwest.social · 1 day ago

    I have no sources for this, so take it with a grain of salt… But I’ve heard that Tesla turns off self-driving just before an accident so they can say it was the driver’s fault. Now in this case, if it was on while the car drove onto the tracks, I would think that would prove Tesla’s faulty self-driving plus human error for not correcting it. Either way, it would be partly Tesla’s fault if it was on at the time.

    • meco03211@lemmy.world · 1 day ago

      Pretty sure they can tell the method used when disengaging FSD/AP, so they would know whether it was manually turned off or whether the system lost enough info and shut itself down. They should be able to reconstruct the order of events to within a few seconds’ accuracy. I can’t imagine a scenario that wouldn’t be blatantly obvious, where the Tesla was able to determine an accident was imminent and shut off FSD/AP with enough time to “blame it on the driver”. What might be possible is that the logs show FSD shut off a millisecond before impact, and then someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don’t see that being a possibility.
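
      To make that “technically true” framing concrete, here’s a tiny sketch (Python, with made-up field names and values; it isn’t based on any real Tesla log format): a narrow “was FSD engaged at the exact moment of impact?” check comes back false even when the system disengaged only a millisecond earlier.

      ```python
      # Hypothetical illustration only -- invented fields, not a real Tesla log schema.
      from dataclasses import dataclass

      @dataclass
      class DisengagementEvent:
          cause: str                     # e.g. "driver_override" vs. "system_shutdown" (made up)
          seconds_before_impact: float   # how long before the crash FSD/AP switched off

      def engaged_at_moment_of_impact(event: DisengagementEvent) -> bool:
          # The narrow reading: FSD counts as "on" only if it never disengaged before impact.
          return event.seconds_before_impact <= 0.0

      event = DisengagementEvent(cause="system_shutdown", seconds_before_impact=0.001)
      print(engaged_at_moment_of_impact(event))  # False -- yet FSD was driving 1 ms earlier
      ```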

      • pixeltree@lemmy.blahaj.zone · 1 day ago

        Of course they know; they’re using it to hide the truth. Stop giving a corporation the benefit of the doubt where public safety is concerned, especially when they’ve been shown to abuse it in the past.

    • AA5B@lemmy.world · 1 day ago

      They supposedly also have a threshold, like ten seconds: if FSD cuts out less than that threshold before the accident, it’s still FSD’s fault.
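
      A rough sketch of how that attribution rule could work (the ten-second window is just the commenter’s example, not a documented Tesla or regulator figure):

      ```python
      # Sketch of the threshold idea: a crash still counts against FSD if it
      # disengaged within some window before impact. The window is a placeholder.
      ATTRIBUTION_WINDOW_S = 10.0

      def attributed_to_fsd(seconds_between_disengage_and_impact: float) -> bool:
          return seconds_between_disengage_and_impact <= ATTRIBUTION_WINDOW_S

      print(attributed_to_fsd(0.001))  # True  -- shut off a millisecond before the crash
      print(attributed_to_fsd(45.0))   # False -- driver had been in control for a while
      ```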

    • SoleInvictus@lemmy.blahaj.zone · 1 day ago

      That would require their self-driving algorithm to actually detect an accident. I doubt it’s capable of doing so consistently.

    • snooggums@lemmy.world · 1 day ago

      On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF turning onto the tracks wasn’t a drop of the same depth as the rails. Someone who is caught off guard isn’t going to be able to turn a passenger car off the tracks, because the rails are tall and there isn’t really an angle the wheels can take to get over them.

      So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, if they weren’t fast enough and even the front wheels ended up on the tracks, it may have been impossible to recover, and going forward might have been their best bet. Depends on how the track crossing is built.