Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.

Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.

The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

      • halcyoncmdr@piefed.social · 7 days ago

        I don’t think it’s necessarily about cost. They were removing sensors before costs rose and before supply became more limited with things like the tariffs.

        Too many sensors also cause issues; adding more is not an easy fix. Sensor fusion is a notoriously difficult part of robotics. It can help with edge cases and verification, but it can also exacerbate issues. Sensors will report different things at some point. Which one gets priority? Is a sensor failing, or just reporting inaccurate data? How do you determine what is inaccurate if the data is still within normal tolerances?
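
        To make the arbitration problem concrete, here’s a minimal sketch in Python (the names, confidence values, and the 10 m disagreement threshold are all hypothetical, not any vendor’s actual stack): a confidence-weighted fusion that has no good answer when its inputs conflict.

        ```python
        from dataclasses import dataclass

        @dataclass
        class Reading:
            sensor: str        # e.g. "radar" or "camera"
            distance_m: float  # reported distance to the nearest obstacle
            confidence: float  # sensor's self-reported confidence, 0..1

        def fuse(readings: list[Reading], disagreement_m: float = 10.0) -> float | None:
            """Confidence-weighted average of obstacle distances.

            Returns None when the sensors disagree by more than
            disagreement_m: fusion alone can't tell which sensor is
            wrong, only that one of them is.
            """
            if not readings:
                return None
            spread = max(r.distance_m for r in readings) - min(r.distance_m for r in readings)
            if spread > disagreement_m:
                return None  # conflicting data: which one gets priority?
            total = sum(r.confidence for r in readings)
            if total == 0:
                return None
            return sum(r.distance_m * r.confidence for r in readings) / total

        # Agreeing sensors fuse cleanly; a radar ghost return does not.
        print(fuse([Reading("radar", 42.0, 0.9), Reading("camera", 40.5, 0.8)]))   # ~41.3
        print(fuse([Reading("radar", 15.0, 0.9), Reading("camera", 120.0, 0.8)]))  # None
        ```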

        More on topic, though… my question is why the robotaxi accident rate is different from the regular FSD rate. Ostensibly they should be nearly identical.

          • halcyoncmdr@piefed.social · 6 days ago

            Alright, so the radar detects a large object in front of the vehicle while it’s travelling at highway speed. The vision system can see the road is clear.

            So under your assumption of listening to whichever sensor says there’s an issue, the car slams on the brakes. But the return is actually from an overpass, or an overhead sign the radar is reflecting off, while the road is clear. Now you have phantom braking.

            Now extend that to a sensor or connection failure. The radar or a wiring harness is failing and sporadically reporting back close contacts that don’t exist. More phantom braking, and this time with no obvious cause.
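
            A minimal sketch of why a “brake if anything raises the alarm” policy produces exactly this (the function name and the 5% ghost-return rate are made up for illustration):

            ```python
            import random

            def should_brake(radar_obstacle: bool, camera_obstacle: bool) -> bool:
                """Pessimistic policy: brake if *any* sensor reports an obstacle."""
                return radar_obstacle or camera_obstacle

            # Simulate driving under overpasses with a flaky radar harness: the
            # camera never sees an obstacle, but the radar sporadically does.
            random.seed(0)
            phantom_brakes = sum(
                should_brake(radar_obstacle=random.random() < 0.05,  # 5% ghost returns
                             camera_obstacle=False)
                for _ in range(1_000)
            )
            print(f"{phantom_brakes} phantom braking events in 1,000 ticks")  # roughly 50
            ```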

            • merc@sh.itjust.works · 6 days ago

              > Now you have phantom braking.

              Phantom braking is better than Wile E. Coyote-ing into a wall.

              > and this time with no obvious cause.

              Again, better than not braking because another sensor says there’s nothing ahead. I would hope flaky sensors are something that would cause the vehicle to show a “needs service” light or something. But even without that, if your car is doing phantom braking, I’d hope you’d take it in.

              But consider your scenario without radar and with only a camera sensor. The vision system “can see the road is clear”, and there’s no radar sensor to tell it otherwise. Turns out the vision system is buggy, or the lens is broken, or the camera got knocked out of alignment, or whatever. Now it’s claiming the road ahead is clear when in fact there’s a train currently in the train crossing directly ahead. Boom, now you hit the train. I’d much prefer phantom braking and having multiple sensors each trying to detect dangers ahead.
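
              That “needs service” idea can be sketched too: brake pessimistically in the moment, but track how often the sensors disagree over time so a flaky one gets flagged (the class name, window size, and threshold here are hypothetical):

              ```python
              from collections import deque

              class SensorHealthMonitor:
                  """Track recent radar/camera disagreements; a persistent
                  conflict suggests a failing sensor and should light a
                  service indicator."""

                  def __init__(self, window: int = 200, threshold: float = 0.10):
                      self.samples = deque(maxlen=window)
                      self.threshold = threshold  # disagreement rate that trips the warning

                  def record(self, radar_obstacle: bool, camera_obstacle: bool) -> None:
                      self.samples.append(radar_obstacle != camera_obstacle)

                  @property
                  def needs_service(self) -> bool:
                      if not self.samples:
                          return False
                      return sum(self.samples) / len(self.samples) > self.threshold

              monitor = SensorHealthMonitor()
              for _ in range(200):
                  monitor.record(radar_obstacle=True, camera_obstacle=False)  # flaky radar
              print(monitor.needs_service)  # True: the service light the driver needs
              ```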

              • NotMyOldRedditName@lemmy.world · 6 days ago

                FYI, the fake wall was not reproducible on the latest hardware. That test was done on an older HW3 car, not the HW4 cars operating as robotaxis.

                The new hardware existed at the time, but he chose to use outdated software and hardware for the test.

                  • NotMyOldRedditName@lemmy.world · edited · 6 days ago

                    As a consumer product, you are responsible: you are supposed to be paying attention at all times and be ready to take over.

                    It is completely acceptable that it does not function perfectly in every scenario and that something like a fake wall put on the road causes issues; that is why you need to pay attention.

                    There is nothing to recall about this situation.

                    If the car is failing on things it shouldn’t be, like both Tesla and Waymo failing to properly stop for school buses while in autonomous mode, that does require an update. Although I’ve seen zero reports of an autonomous Tesla doing this yet, only supervised ones.

                    A Tesla not stopping for a school bus in supervised mode is acceptable, though, because the driver is responsible for stopping.

                    Edit: and note, a problem like the school buses is a visual-processing/understanding problem. Lidar won’t help with that kind of problem.

                    Edit: and sorry, to be clear, that hardware is still on the road, but I’m saying it’s acceptable that it does this because it’s not autonomous. If the newer hardware running without supervision were doing it, that’s another story.

      • tomalley8342@lemmy.world · 7 days ago

        Nah, that one’s on Elon just being a stubborn bitch and thinking he knows better than everybody else (as usual).

        • ageedizzle@piefed.ca · 7 days ago

          He’s right that if current AI models were genuinely intelligent in the way humans are, then cameras would be enough to achieve at least human-level driving skill. The problem, of course, is that AI models are not nearly at that level yet.

          • kameecoding@lemmy.world · 7 days ago

            I am a human, and there have been occasions where I couldn’t tell whether something was an obstacle on the road or a weird shadow…