• UndercoverUlrikHD@programming.dev · 7 months ago

      Autopilot turns off because the car doesn’t know what to do and the driver is supposed to take control of the situation. The autopilot isn’t an autopilot, it’s driving assistance, and you want it to turn off if it doesn’t know what it should do.

        • UndercoverUlrikHD@programming.dev · 7 months ago

          Sure, what I meant though was that Tesla doesn’t have self-driving cars the way they try to market them. They’re no different from what other car manufacturers have got, they just use a more deceptive name.

      • Lemming6969@lemmy.world · 7 months ago

        If an incident is imminent within the next <2 seconds or so, autopilot must take the action or assist in an action. Manual override can happen at any time, but in such a short duration it’s unlikely, and only the autopilot has any chance; therefore it cannot turn off and absolve itself of liability.

    • Biyoo@lemmy.blahaj.zone · 6 months ago

      Autopilot turns off before collision because physical damage can cause unpredictable effects that could cause another accident.

      Let’s say you run into a wall, autopilot is broken, the car thinks it needs to go backwards. You now killed 3 more people.

      I hate Elon Musk and Teslas are bad, but let’s not spread misinformation.

      • Programmer Belch@lemmy.dbzer0.com · 6 months ago

        It seems reasonable for the autopilot to turn off just before a collision; my point was more along the lines of “You won’t get a penny from Elon”.

        People who rely on Full Self Driving or whatever it’s called now, should be liable for letting a robot control their cars. And I also think that the company that develops and advertises said robot shouldn’t get off scot-free but it’s easier to blame the shooter rather than the gun manufacturer.

        • Biyoo@lemmy.blahaj.zone · 6 months ago

          Yeah I agree. Both parties should be liable. Tesla for their misleading and dangerous marketing, drivers for believing in the marketing.

  • Hux@lemmy.ml · 7 months ago

    This reminds me of that Chinese law about being personally responsible for all medical debts of a person you run over—incentivizing killing the person, rather than injuring them.

  • lugal@sopuli.xyz · 7 months ago

    I hope this isn’t law anywhere. You’re liable for your car no matter what. You have to take control if necessary

    • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

      I saw a headline about Mercedes offering an autopilot that doesn’t require the driver to monitor, so it’s going to be interesting to see how laws play out. The Waymo taxi service in Phoenix seems to occasionally have run-ins with the law, and a remote service advisor has to field the call, advising the officer that the company is responsible for the car’s behavior, not the passenger.

      • Cyclist@lemmy.world · 7 months ago

        So in theory the manufacturer takes responsibility because they trust their software. This puts the oness on them and their insurance, thereby reducing your insurance considerably. In actuality your insurance doesn’t go down because insurance companies.

        • conditional_soup@lemm.ee · 7 months ago

          I’m not trying to be the grammar police, just thought you might like to know that it’s “onus”.

        • Baŝto@discuss.tchncs.de · 7 months ago

          It’s the reason why they prefer to offer only assistance systems. Aside from warning, they can act, but they don’t drive on their own. The EU will even require some systems for new cars. They’ll especially annoy people who ignore speed limits and don’t use turn signals.

    • cm0002@lemmy.world · 7 months ago

      You’re liable for your car no matter what

      Nope, it should be law that if an auto manufacturer sells an autonomous driving system that they advertise as usable while driving distracted, then they are liable if someone uses it as advertised and per instructions.

      What you wrote is probably an auto manufacturer executive’s wet dream.

      “You used our autonomous system to drive you home after drinking completely within advertised use and per manufacturer instructions and still got in an accident? Oh well tough shit the driver is liable for everything no matter what™️”

      • warm@kbin.earth · 7 months ago

        When autonomous cars are good enough to just drive people around then yeah the companies should be liable, but right now they’re not and drivers should be fully alert as if they are driving a regular vehicle.

        • monk@lemmy.unboiled.info · 7 months ago

          When autonomous cars are good enough to just drive people around

          they become autonomous cars. It’s not autopilot if I’m liable, simple as that.

        • cm0002@lemmy.world · 7 months ago

          but right now they’re not and drivers should be fully alert as if they are driving a regular vehicle.

          Which is what would be per manufacturer instructions, which still falls under my definition

          • warm@kbin.earth · 7 months ago

            Replies aren’t always in disagreement! I agree with what you are saying, just adding on my thoughts on information further up the thread too.

        • FlexibleToast@lemmy.world · 7 months ago

          There are already fully autonomous taxis in some cities. Tesla is nowhere near fully autonomous, but others have accomplished it.

          • kakes · 7 months ago

            “Accomplished” is a strong word for something as complex as autonomous driving.

            • FlexibleToast@lemmy.world · 7 months ago

              Fair, but when a company is given the authority to run fully autonomous taxis in cities that’s a huge accomplishment. Granted they are cities that don’t see things like snow storms and I’m sure there is a good reason for that.

        • azertyfun · 7 months ago
          1. Then don’t call it autopilot
          2. What’s the point of automated steering if you have to remain 100 % attentive? To spare the driver the terrible burden of moving the wheel a couple mm either way? It is well studied and observed that people are less attentive when they’re not actively driving, which, FUCKING DUH.

          Manufacturers provide this feature for the implicit purpose of enabling distracted driving. Yet they will not accept liability if someone drives distractedly.

          Next in We Are Not Liable For How Consumers Use Our Product, Elon will replace the speedometer by Candy Crush with small text that says “pwease do not use while dwiving UwU”.

          • warm@kbin.earth · 7 months ago

            You choose to activate that mode. While I understand your sentiment and do agree, it’s not as cut and dried as ‘company liable’ or ‘driver liable’; both can be at fault. Taking blame off drivers entirely could make people even less attentive, and the safety of lives is more important than some fines to a car manufacturer. The real problem is that mode being allowed to exist at all. It’s clearly not ready for use on public roads, and companies are just abusing advertising to try and pin their ‘autopilot’ as something it isn’t.

            Also note: Some manufacturers (Volvo & Mercedes, that I know of) have already said they will claim full responsibility for their cars in self-driving mode.

          • warm@kbin.earth · 7 months ago

            It’s still in its infancy; eventually it will replace humans entirely and the roads will be much safer. Right now it’s just like improved cruise control and kind of pointless.
            Some manufacturers have already said they will claim full responsibility for their cars in self-driving mode, which makes sense to do.

            • queermunist she/her@lemmy.ml · 7 months ago

              I’ll clarify: what is the actual purpose of giving customers access to this infantile technology? It doesn’t make following traffic laws easier like cruise control does, it doesn’t make drivers better at driving or safer behind the wheel, and it merely encourages distracted driving.

              So why did they ship this product? Again, it just seems like a dangerous toy.

      • lugal@sopuli.xyz · 7 months ago

        So I said it is law, as of the last time I checked (which was a while back tbh), and you say “no, it should be law” as your opinion. You see it, right?

        Autonomous systems aren’t that trustworthy yet and you shouldn’t drive drunk with them. Are they really advertised that way?

  • Technoguyfication · 7 months ago

    I’m not aware of a single jurisdiction on the planet that makes Tesla liable for what the vehicle does when autopilot is enabled. In order to activate autopilot you have to accept about 3 different disclaimers on the car’s screen that state VERY clearly how you are still responsible for the vehicle and you must intervene if it starts behaving dangerously.

    I’ve been driving with autopilot for over 2 years, and while it has done some stupid stuff before (taking wrong turns, getting in the wrong lane, etc.), it has NEVER come close to hitting another vehicle or person. Any time something out of the ordinary happens, I disengage autopilot and take over.

      • Zagorath@aussie.zone · 7 months ago

        Bro bought a Tesla just 2 years ago. Long after it was very widely known just how much of an arsehole Musk was, and after many other excellent EVs were on the market.

        I’ll let you draw the conclusions from those facts.

        • Technoguyfication · 7 months ago

          When I bought my car, there were no widespread plans for other manufacturers to adopt NACS, you couldn’t get your hands on a Rivian for less than $100k, and I was commonly driving long distances for work, so I needed a vehicle with long range that I could charge quickly on trips. Tesla checked all the boxes.

          I haven’t experienced any of these super widespread quality or reliability issues people on the internet talk about. It was delivered with no issues, has needed very little maintenance (just tire rotations basically), and it’s not falling apart like some would lead you to believe. I don’t know what to say other than that my personal experience with the vehicle has been great, and that’s what I really care about in a vehicle. I don’t buy cars based off what the CEO says on Twitter.

        • jose1324@lemmy.world · 7 months ago

          Hate Musk or not, the Tesla is still a very good car. In many markets it’s still often the better value.

          • pufferfisherpowder@lemmy.world · 7 months ago

            Yeah and while Elon is the fucking worst I assume not everyone knows that he is the Tesla man. It’s incredible actually how much he’s intertwined with the brand. I would totally buy a Toyota or whatever and I couldn’t tell you the name of their CEO, nor of any other car manufacturer, nor would I look up who they are beforehand.

            Granted the poster above is on Lemmy so I assume he knows more about musky boy than he would like.

            • Technoguyfication · 7 months ago

              I have a Ford too and couldn’t even tell you who the CEO of Ford is. Teslas are great daily drivers, I don’t care what the CEO does or says online.

            • kameecoding@lemmy.world · 7 months ago

              his username is technoguyfication, either it’s a troll account or he is rolling with the technobro moniker

              • Technoguyfication · 7 months ago

                I’ve had this username since I was 11 years old, you don’t need to read that deeply into it haha

          • Zagorath@aussie.zone · 7 months ago

            Everything I’ve heard says that Teslas have had huge reliability problems.

            • jose1324@lemmy.world · 7 months ago

              These days not really. I’m gonna get downvoted to oblivion obviously because this is Lemmy, but generally the cars are more than fine these days

            • Technoguyfication · 7 months ago

              Haven’t experienced any myself. I’m just a single data point, but my car has been nothing but reliable from day one. It’s a great daily driver.

      • Technoguyfication · 7 months ago

        You can think whatever you want, but my experience driving it has been perfectly fine. Range is great, the car is not falling apart like some people claim, it was not delivered with any issues, and chargers are plentiful where I live. Those are the main things I (and many others) care about in a vehicle. I don’t care what the CEO does or says online. I have a Ford as well and couldn’t even tell you who the CEO of Ford is.

    • orcrist@lemm.ee · 7 months ago

      If someone is injured or killed by a Tesla car, they can sue the company directly, regardless of any legal agreements you may have as the owner. Whether they win is a different question, but they might win if they could show that Tesla was negligent, and especially if Tesla was willfully negligent.

      Just because you think you’re responsible, even if you agreed in triplicate that you’re responsible, doesn’t necessarily make you legally responsible, depending on the circumstances. And that’s the way it should be.

      • Annoyed_🦀@monyet.cc · 7 months ago

        Nah bro if it’s the choice between raising insurance cost vs killing people + jail time for manslaughter + eating the guilt for the rest of my life, i’ll take the insurance.

        Also wth America, your capitalism and your priorities are wack.

      • ProgrammingSocks@pawb.social · 7 months ago

        I don’t like the spying aspect but it is unironically true that if you slam your brakes at every red light you are driving in a dangerous fashion. It’s more so about the pattern than a one off event though.

      • Timecircleline · 7 months ago

        I mean, without getting into the privacy nightmare piece, frequent hard braking probably means you have a habit of following too closely, or not paying attention to potential hazards and covering the brake. So I don’t think the car manufacturer should supply it, but I also think it would be good to let the person with the habit know, so that they can learn to be a safer driver.

      • Baŝto@discuss.tchncs.de · 7 months ago

        In the meantime the EU will require systems that automatically perform emergency braking, and also different signaling for emergency braking.

  • Dr. Moose@lemmy.world · 7 months ago

    Even with autopilot I feel it’s unlikely that the driver would not be liable. We haven’t had a case yet, but once one happens and goes up through the courts it’ll immediately establish a liability precedent.

    Some interesting headlines:

    So I’m pretty sure that autopilot drivers would be found liable very fast if this developed further.

    • dejected_warp_core@lemmy.world · 7 months ago

      I am not a lawyer.

      I think an argument can be made that a moving vehicle is no different than a lethal weapon, and the autopilot, nothing more than a safety mechanism on said weapon. Which is to say the person in the driver’s seat is responsible for the safe operation of that device at all times, in all but the most compromised of circumstances (e.g. unconscious, heart attack, taken hostage, etc.).

      Ruling otherwise would open up a transportation hellscape where violent acts are simply passed off to insurance and manufacturer as a bill. No doubt those parties would rush to close that window, but it would be open for a time.

      Cynically, a corrupt government in bed with big monied interests would never allow the common man to have this much power to commit violence. Especially at their expense, fiscal or otherwise.

      So just or unjust, I think we can expect the gavel to swing in favor of pushing all liability to the driver.

      • Hagdos@lemmy.world · 7 months ago

        Making that argument completely closes the door for fully autonomous cars though, which is sort of the Holy grail of vehicle automation.

        Fully autonomous doesn’t really exist yet, aside from some pilot projects, but give it a decade or two and it will be there. Truly being a passenger in your own vehicle is a huge selling point, you’d be able to do something else while moving, like reading, working or sleeping.

        These systems can probably be better drivers than humans, because humans suck at multitasking and staying focused. But they will never be 100% perfect, because the world is sometimes wildly unpredictable and unavoidable accidents are a thing. There will be some interesting questions about liability though.

    • SkyezOpen@lemmy.world · 7 months ago

      They’re most likely liable. “FSD” is not full self-driving, it’s still a test product, and I guarantee the conditions for using it include paying attention and keeping your hands on the wheel. The legal team at Tesla definitely made sure they weren’t on the hook.

      Now where there might be a case for liability is Elon and his stupid Twitter posts and false claims about FSD. Many people have been misled, and it’s probably contributed to a few of the autopilot crashes.

    • stom@lemmy.dbzer0.com · 7 months ago

      You’re still in control of the vehicle, therefore you’re still liable. Like plopping a 5 year old on your lap to drive while you nap, if they hit people it’s still your fault for handing over the control to something incapable of driving safely while you were responsible for the vehicle.

      • Norodix@lemmy.world · 7 months ago

        But a reasonable person would not consider a child capable of driving. An “extremely advanced algorithm that is better and safer than humans and everyone should use it” is very different in this case. After hearing all the stupid fluff, it is not unreasonable to think that self-driving is good.

        • stom@lemmy.dbzer0.com · 7 months ago

          Tesla’s own warnings and guidance assert that drivers should remain ready to take control when using the features. They do not claim it is infallible. Oversight and judgement still need to be used, which is why this argument wouldn’t hold up at all.

          • LovesTha🥧@floss.social · 7 months ago

            @stom @Norodix Pity Tesla hasn’t taken reasonable precautions to ensure the driver is driving.

            It isn’t unreasonable to have customers expect the thing they were sold to do the thing they were told it does.

  • CCF_100 · 7 months ago

    Slam on the brakes, but oh no, you drive a Cybertruck and the brake pedal stops working

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

    The cover-your-ass scenario.

    In the Philosophy Crash Course there was a scenario like this. I’ll paraphrase:

    You’re a traveler exploring a semi-developed nation in South America. Coming out of the wilderness you come across a squad of soldiers. They are forcing twenty villagers to dig a mass grave. The officer commanding the soldiers tells you these villagers committed the state crime of supporting a rival to their leader, and are to be executed. But as you are a guest in their country, he will make you an offer: if you shoot one of them yourself, he will set all the rest free, and they can then hike to the border and beg for asylum. (A rough trek, but the neighboring country may take them.)

    Do you shoot one of the villagers?

    Actually killing someone is rather hard on the psyche, and most of us cannot bear the thought (and might suffer from trauma as a result). But then, perhaps this is a small price to pay for nineteen human lives.

    Thomas Aquinas and Kant were happy to let the soldiers kill the villagers so as to avoid committing the sin of murder themselves. Aquinas and Kant would not even lie to the murderer at the door, or to Nazi Jew-hunters, to save the lives of fugitives hidden in their home, since lying was sin enough, and they would count on God to know His own. Both had contemporaries who disagreed and felt it was proper to suffer the trauma and do what was necessary (assuming the officer of the soldiers seemed inclined to keep his word and actually spare the remaining villagers).

    So, the cover your own ass response has a long history of backers, including known philosophers.

    • JohnDClay · 7 months ago

      How are you sure the soldiers will follow through with their end of the bargain? Once they give you the gun, can you try and shoot the soldiers? Could you bribe the soldiers to release all the prisoners?

      Thought experiments like this have two options, but real life is never only two options. Getting into that mindset can lead people to accept things for the greater good without exploring all the options.

      • exocrinous@startrek.website · 7 months ago

        Yeah, this is the problem I have with the fat man trolley problem. I am not a railway engineer, nor am I a physicist. I have enough knowledge of railways to understand how a switch works and what the consequences are in a few seconds of observation. But knowing a fat man would stop a tram? How do I know that? I don’t have a deep enough understanding of the physics and mechanics of trams to know that! Did a rail engineer tell me to push the fat man? Well then she can do it, I’m not risking making the wrong decision based on her hunch! I can’t even imagine looking at a fat man and seeing his body as a tool I can murder in order to stop a tram. Who the hell thinks that way during an emergency? I’d never come up with that idea even if I was a rail engineer!

    • mondoman712@lemmy.ml (OP) · 7 months ago

      You just described an alternative version of the well-known trolley problem, which the post is referencing.

      The answers to the problem from the philosophers are interesting.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

        The Trolley problem is a schoolbook example of the failure of creed-based philosophy (deontological ethics), but is also used (the various scenarios) to illustrate that circumstances that don’t affect the basic scenario or outcome do affect our feelings and our response to the scenario.

        It’s easier to pull a lever from a remote position than to actually assault someone or kill them by your own hand, for example.

        There are other scenarios that don’t necessarily involve trolleys, but involve the question of doing a wrongful act in order to produce a better outcome. Ozymandias in Watchmen killing millions of New Yorkers to prevent a nuclear exchange, thereby saving billions of people. (Alan Moore left it open-ended whether that was the right thing to do in the situation, but it did have the intended outcome.)

        We like the trolley problem because you can draw it easily on the blackboard, but other situations are much better at illustrating how subtle nuance can drastically change the emotions behind it.

        Try this one:

        The Queen of the land dies. On the day of her sister’s coronation, she declares that Anglicanism is now the faith and Catholics are now unlawful — a reversal of the old order — Catholics are to report to a town or city hall to convert or be executed. You are Catholic. Do you obey the law or flee? And if you obey the law, do you convert or perish at the hand of the state? Do you lie about your faith to state agents or to the national census?

        To a naturalist like myself, I’m glad to lie or convert to spare my own life, but to the devout, pretending to be another faith, or converting by force was a terrible sin, so it’s a very sober (and historically relevant) look at religious principle.

    • Sean@liberal.city · 7 months ago

      @uriel238 @mondoman712
      In the days before the Wannsee Conference (the Nazis setting up death camps) but after the invasion of Poland, when most executions occurred by firing squad, there were German tourists who would travel to partake in the firing squads. So the trauma is not universal across the human experience, and there are some circumstances that would cause individuals to kill. Lynchings and massacres in the US are examples of this occurring without a war to give cover to killings.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

        We’ve seen a similar phenomenon in some of the red states in the ideology conflict here in the US. There are people eager to kill someone just to have the experience, and who volunteer to hunt targeted groups (trans folk, lately) or as participants in an execution by firing squad. I remember in John Oliver’s first segment on the death penalty (he did a second one recently), executions were stalled due to difficulties obtaining the drugs used in lethal injections, and firing squads were brought up. The expert pointed out the difficulty of finding one executioner, let alone seven. The officials suggested recruiting volunteers from the gun-enthusiast citizenry, which the expert saw as naïve.

        I can’t speak to firing-squad executions during the German Reich and the early stages of the Holocaust, but I can speak to the Einsatzgruppen, who were tasked with evacuating villages (to mass graves) that harbored Jews, harbored enemies of Germany, or otherwise were deemed unworthy of life. The mass executions were hard on the troopers, and as a result Heydrich contended with high turnover rates.

        This figured largely into the movement towards the industrialized genocide machine that pivoted around the Auschwitz proof of concept. Earlier phases included wagons with an enclosed back into which the engine exhaust was piped. The process was found to be too slow, and exposed too many service people to the execution process. The death camps were staffed so that no one had to interact with the prisoners and process the bodies, so no one would have to confront the visceral reality of before and after. They were staffed so that anyone who engaged a mechanism was two steps away from the person authorizing (and taking responsibility for) the execution. The guy who flipped the switch was just following orders.

        Interestingly, we’d see a repeat of this during the International War on Terror, specifically the Disposition Matrix, which led to executions of persons of interest in the field by drone strike (a Hellfire missile launched from a Predator drone). During the CIA drone strike programs in Afghanistan and Pakistan, the drone operation crews suffered from a high turnover rate, with operators suffering combat PTSD from having pulled the trigger on the missile launches. It didn’t help that they were also required to scan the damage to assess the carnage and identify the casualties.

        Interestingly, this also presented an inverted demonstration of how the human mind can tell the difference between violent video games and the real thing. Plenty of normies play Call of Duty without dealing with the mental after-effects of war, but even when we conduct war operations from continents away, our brains recognize that we are killing actual human beings, and suffer trauma from the act. War continues to be Hell, and video games not so much.

  • bufalo1973@lemmy.ml
    link
    fedilink
    arrow-up
    16
    ·
    edit-2
    7 months ago

    The funny part will be once the car doesn’t have a driver and is fully autonomous. If the car kills someone, who’s to blame?

    • Schadrach@lemmy.sdf.org
      link
      fedilink
      English
      arrow-up
      8
      ·
      7 months ago

      You treat it like any other traffic accident, except if a self driving car is responsible, that responsibility lies with the vehicle’s owner.

      • Wogi@lemmy.world
        link
        fedilink
        arrow-up
        4
        arrow-down
        1
        ·
        7 months ago

        It would have to be the manufacturer.

        If someone steals your car and kills someone with it, then disappears without ever being identified, the car owner doesn’t assume liability. Liability falls on whoever was operating it at the time. If software was driving, then the software company assumes the liability.

        • Schadrach@lemmy.sdf.org
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          6 months ago

          Doubt it. I mean, any self driving car is going to make the driver agree to responsibility for what the car does and ensure the user has a manual override available just in case.

          No company is going to ship fully autonomous driving software (for example to have fully autonomous driverless taxis) without contractually making the fleet owner responsible for their fleet cars.

        • explodicle
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          6 months ago

          But you bought the driverless car and turned it on. You never agreed to the thief’s joyride. Where do you draw the line for “operation” - like operating a steering-assist car, or operating a Roomba?

    • Glytch@lemmy.world
      link
      fedilink
      arrow-up
      9
      arrow-down
      1
      ·
      7 months ago

      The company that rented it to you, because fully self-driving cars won’t be for private ownership, they’ll just replace rideshare drivers.

      • explodicle
        link
        fedilink
        English
        arrow-up
        2
        ·
        6 months ago

        Who’s to say that will be immediate? Many people won’t be quick to abandon their guaranteed-available vehicle, especially while every house and employer has parking.

          • explodicle
            link
            fedilink
            English
            arrow-up
            2
            ·
            6 months ago

            Not rhetorical question: has insurance ever immediately eliminated anything?

                • Sizzler@slrpnk.net
                  link
                  fedilink
                  arrow-up
                  2
                  ·
                  6 months ago

                  Ok so ten years then. In that time nearly all average family cars will be smart. They will have self-driving (they can come pick you up). After a few years of insurance claims and premiums showing they are not responsible for 99% of crashes, insurers will react accordingly, pushing up the premiums of the last holdouts so far that it becomes uneconomical for the average person to drive “manual”.

      • bufalo1973@lemmy.ml
        link
        fedilink
        arrow-up
        2
        ·
        7 months ago

        It’s not the same. When you have a dog you use a leash and, if needed, you can restrain the mouth.

        In this case you are not in control. And you can’t be. You are just a passenger. And you should have the same responsibility as a passenger in a train: none.

        • boatsnhos931@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          7 months ago

          I didn’t know about your parameters. I would think your example drives it home: no car should ever be fully autonomous, and each should have a “leash” that a human could “restrain” the car with if necessary. Is that no good?

    • supercriticalcheese@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      7 months ago

      Whichever party was at fault, in my non-lawyer opinion.

      What kind of penalty you apply to a self-driving car found guilty of causing an accident is a good question, though.

      • bufalo1973@lemmy.ml
        link
        fedilink
        arrow-up
        1
        ·
        7 months ago

        I guess it would be the car maker’s responsibility if you are only a passenger in the car.

  • KillingTimeItself@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    18
    arrow-down
    2
    ·
    7 months ago

    this is funny and all, but it doesn’t matter what you’re doing here: you’re technically liable for all of them, so uh.

    I’ll wait for a better version of this.

    • marcos@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      7 months ago

      I think that’s the point. There’s a follow-up about killing the people tying others to the rails that fits.

  • DNOS@lemmy.ml
    link
    fedilink
    arrow-up
    5
    arrow-down
    1
    ·
    7 months ago

    Imagine having a car that doesn’t pretend to drive itself but is enjoyable to drive. A car that doesn’t pretend to be a fucking movie, because it’s just a car. A car without two thousand different policies to accept, in which you’ll never know what’s written, but a car that you can drive even though you decided to wear a red shirt on a Thursday morning, which in your distorted future society is a political insult to some shithead CEO. A car that you own, not a subscription-based loan. A car that keeps very slowly polluting the environment, instead of polluting it with heavy chemicals dug up by children while still emitting, in CO2, exactly the same as the next 20 years of the slow-polluting one (not to mention where the current comes from). A car that will run forever if you treat it well, with minor fixes and relatively minor environmental impact, and that doesn’t need periodic battery replacement, which btw is like building a new vehicle… These are not only critical thoughts about greenwashing; they are meant to make you reflect on the different meanings of ownership in different time periods.

    And yes, I will always think that all environmentalists who absolutely need a car should drive a 1990s car, fix it, save it from the dump, and drive it till it crashes into a wall…

    • SirQuackTheDuck@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      7 months ago

      I would expect that that ’90s car could eventually be converted to hydrogen combustion. That would save on pumping petrol (if the hydrogen isn’t generated from petrol), and it wouldn’t require yet another car to be built.

  • Sibbo@sopuli.xyz
    link
    fedilink
    arrow-up
    4
    arrow-down
    1
    ·
    6 months ago

    Reminds me of the Chinese issue: you run over someone, but they are likely not dead. Do you save their life and accept having to pay whatever healthcare costs they incur until they recover? Or do you run them over again, to make sure they die and your punishment is a lot lighter?