“Translation: all the times Tesla has vowed that all of its vehicles would soon be capable of fully driving themselves may have been a convenient act of salesmanship that ultimately turned out not to be true.”

Another way to say that is that Tesla scammed all of its customers, because, you know, everyone saw this coming…

  • ristoril_zip@lemmy.zip · 13 days ago

    I realized so-called self-driving on roads is impossible when someone pointed out what human drivers do when there’s, like, a flock of geese camped out in the middle of the road.

    We know that we should slowly move forward until they get out of the way, including bonking them with the car (gently). Do we want cars deciding that some obstruction in the road is “ok” to hit? I don’t. So what’s the solution? Something other than pure autonomous self-driving.

    We can probably have some very high level driver assist. Maybe.

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 13 days ago

      All the issues with self-driving could be solved if they actually gave a shit about making it work. You don’t let the machine choose. You give it hard fucking rules to follow. It doesn’t need to identify geese, human, ball, dog, child to react differently to each; it should see an obstruction and stop to avoid damaging the fucking object and car, regardless of what it is. They are making it way more complicated than it really has to be.
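      As a purely illustrative sketch of the “hard rules” idea above, here is roughly what a classify-nothing, stop-for-anything policy could look like. The distances, speeds, function name, and sensor interface are all assumptions made up for this example, not any vendor’s actual code.

      ```python
      # Hypothetical sketch of a rule-based policy: stop or creep for any
      # detected obstruction without classifying it. Thresholds and the
      # sensor interface are invented for illustration only.

      STOP_DISTANCE_M = 5.0    # assumed: come to a full stop inside this range
      CREEP_DISTANCE_M = 15.0  # assumed: slow to a crawl inside this range
      CREEP_SPEED_MPS = 2.0    # assumed crawl speed

      def commanded_speed(obstacle_ranges_m: list[float], current_speed_mps: float) -> float:
          """Return a target speed given ranges (metres) to objects detected ahead."""
          if not obstacle_ranges_m:
              return current_speed_mps          # clear path: keep going
          nearest = min(obstacle_ranges_m)
          if nearest <= STOP_DISTANCE_M:
              return 0.0                        # something is close: stop, whatever it is
          if nearest <= CREEP_DISTANCE_M:
              return min(current_speed_mps, CREEP_SPEED_MPS)  # something ahead: creep
          return current_speed_mps
      ```

      The catch raised in the replies below is exactly what this sketch ignores: a rule this blunt treats a shadow, a plastic bag, and a child identically.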

      • Eranziel@lemmy.world · 13 days ago

        You are making it far simpler than it actually is. Recognizing what a thing is in the first place is the essential problem. Is that a child, a ball, a goose, a pothole, or a shadow that the cameras see? It would be absurd and an absolute show stopper if the car stopped for dark shadows.

        We take for granted the vast amount of work the human brain does in this problem space. The system has to identify and categorize what it’s seeing, otherwise it’s useless.

        That leads to my actual opinion on the technology, which is that it’s going to be nearly impossible to have fully autonomous cars on roads as we know them. It’s fine if everything is normal, which is most of the time. But software can’t recognize and correctly react to the thousands of novel situations that can happen.

        They should be automating trains instead. (Oh wait, we pretty much did that already.)

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 13 days ago

          “It would be absurd and an absolute show stopper if the car stopped for dark shadows.”

          That’s why they use LIDAR and not just visual cameras. They don’t need to know the difference between different objects; they just need to know an object is there, in the way, or even moving in a way that could potentially put it in the path of the vehicle.

          They’re making it more complicated by working on both autonomous driving and image recognition for use by AI.
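          To make that concrete, here is a rough, hypothetical sketch of the “don’t classify it, just check whether anything is in, or heading into, the vehicle’s path” idea. The point format, corridor width, and time horizon are assumptions for illustration, not how any real perception stack is built.

          ```python
          # Hypothetical sketch: flag any tracked object that is inside the
          # vehicle's path corridor, or projected to enter it soon, without
          # ever classifying what the object is. All numbers are assumptions.
          from dataclasses import dataclass

          CORRIDOR_HALF_WIDTH_M = 1.5  # assumed half-width of the corridor the car sweeps
          LOOKAHEAD_S = 3.0            # assumed prediction horizon in seconds

          @dataclass
          class TrackedObject:
              x: float   # metres ahead of the vehicle (longitudinal, vehicle frame)
              y: float   # metres left/right of the centreline (lateral, vehicle frame)
              vx: float  # relative longitudinal velocity, m/s
              vy: float  # relative lateral velocity, m/s

          def is_threat(obj: TrackedObject) -> bool:
              """True if the object is in the corridor now, or projected to enter it soon."""
              in_corridor_now = obj.x > 0 and abs(obj.y) <= CORRIDOR_HALF_WIDTH_M
              # Constant-velocity projection: where will the object be at the horizon?
              future_x = obj.x + obj.vx * LOOKAHEAD_S
              future_y = obj.y + obj.vy * LOOKAHEAD_S
              in_corridor_soon = future_x > 0 and abs(future_y) <= CORRIDOR_HALF_WIDTH_M
              return in_corridor_now or in_corridor_soon

          # e.g. a goose 10 m ahead and 3 m to the right, waddling toward the centreline:
          # is_threat(TrackedObject(x=10.0, y=3.0, vx=-1.0, vy=-1.2)) -> True
          ```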

          • Eranziel@lemmy.world · 13 days ago

            I agree that LIDAR or radar are better solutions than image recognition. I mean, that’s literally what those technologies are for.

            But even then, that’s not enough. LIDAR/radar can’t help it identify its lane in inclement weather, drive well on gravel, and so on. These are the kinds of problems where automakers severely downplay the difficulty, and just how much a human driver actually does.

          • ristoril_zip@lemmy.zip · 13 days ago

            My point is that “if there’s an obstruction, stop” means these cars are going to be stopping and requiring human intervention all the time. That’s semi-autonomous at best.

            I don’t know if you’ve encountered intransigent geese in your driving adventures, but the only way to deal with them is to slowly drive through the flock until they move out of your way.

            Fully autonomous cars are never going to happen without major changes to our roads. We’d be better off investing in more buses and trains.

    • aramis87@fedia.io · 12 days ago

      The main issue I have with full self-driving is that it’ll probably never actually be full self-driving; there’ll always be use cases where people have to take over - ice, snow, slightly flooded roads, sand, whatever*. And humans will have to take over in conditions where it’s extremely helpful for them to have had extensive driving experience across a range of conditions - experience they’ll no longer have, because the car’s been driving them everywhere.

      * Yes, I know we’re not supposed to drive in some of these conditions, and yet sometimes we have to, even if it’s just to get to a safer place.