this post was submitted on 10 Jun 2023
54 points (100.0% liked)

Technology

More or less, Tesla's Autopilot is not as safe as Tesla would have you believe.

[–] Wiitigo@lemmy.world 11 points 2 years ago (2 children)

Still almost exactly half the crash rate of human-only drivers. Therefore, we should ban human-only driving.

[–] RandomBit@sh.itjust.works 9 points 2 years ago (1 children)

I don’t think this is a fair comparison since an Autopilot crash is a 2 stage failure: the Autopilot and then the driver both failed to avoid the crash. The statistics do not include the incidents where Autopilot would have crashed but the human took control and prevented it. If all instances of human intervention were included, I doubt Autopilot would be ahead.

[–] Kepler@lemmy.world 1 points 2 years ago (1 children)

If all instances of human intervention were included, I doubt Autopilot would be ahead.

Why would you interpret non-crashes due to human intervention as crashes? If you're doing that for autopilot non-crashes you've gotta be consistent and also do that for non-autopilot non-crashes, which is basically...all of them.

[–] RandomBit@sh.itjust.works 3 points 2 years ago

If a human crashes and their action/vehicle is responsible for the crash, the crash should be attributed to the human (excepting mechanical failure, etc). I believe that if an advanced safety system, such as automatic braking, prevents a crash that otherwise would have occurred, the prevented crash should also be included in the human tally. Likewise, if Autopilot would have crashed if not for the intervention of the driver, the prevented crash should be attributable to Autopilot.

As has been often studied, the major problem for autonomous systems is that until they are better than humans WITHOUT human intervention, the result can be worse than both. People are much less likely to pay full attention and have the same reaction times if the autonomous system is in full control the majority of the time.
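The intervention-adjusted comparison being argued here can be sketched with a few lines of arithmetic. All the numbers below are invented for illustration; the point is only that counting driver takeovers against Autopilot can flip which side looks safer:

```python
# Hypothetical illustration of the intervention-adjusted comparison.
# Every figure here is made up; real fleet data would be needed for a fair comparison.

def crash_rate(crashes: int, miles: int) -> float:
    """Crashes per million miles driven, rounded for display."""
    return round(crashes / miles * 1_000_000, 2)

human_miles = 100_000_000
human_crashes = 200          # crashes attributed to human-only driving

ap_miles = 50_000_000
ap_crashes = 50              # crashes that actually occurred on Autopilot
ap_interventions = 120       # would-be crashes avoided only by driver takeover

print(crash_rate(human_crashes, human_miles))               # 2.0  (naive human rate)
print(crash_rate(ap_crashes, ap_miles))                     # 1.0  (naive Autopilot rate)
print(crash_rate(ap_crashes + ap_interventions, ap_miles))  # 3.4  (intervention-adjusted)
```

With these made-up numbers, Autopilot looks twice as safe until prevented crashes are charged to it, at which point it looks worse, which is exactly the asymmetry the comment describes.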

[–] darkmugglet@lemm.ee 6 points 2 years ago (2 children)

You're missing the point -- with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. With Autopilot, the question of "who is at fault" gets murky. And then you have the fact that Tesla is not obligated to report the crashes. And the failures of automated driving are very different from human errors.

I don't think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.

[–] Locrin@lemmy.world 3 points 2 years ago (1 children)

In these cases the human is still accountable. Do you think that if a Tesla plowed into a kindergarten while using Autopilot, the driver would avoid punishment? The driver is using a feature of the car. It tells you to stay alert and be prepared to take over on short notice. The ones crashing are the idiots who sit in the back seat, go to sleep, or play on their phones while Autopilot is on. The only self-driving right now where I would be in favour of punishing the company if something went wrong is those taxis where you are purely a passenger.

Sit behind the wheel, you are responsible for what happens.

[–] JillyB@beehaw.org 1 points 2 years ago

I don't think this is a practical take. If I'm driving a car, I'm in control and know my intentions. If I'm responsible for an accident, it's because I wasn't fully alert or did something stupid.

If autopilot is driving the car, I don't know the car's intentions. It might cause a dangerous situation before my brain can process that it has bad intentions and take over. If it sees something in the road that isn't there, it might swerve or brake and I won't recognize until it already happened. That's considering an alert driver with full concentration behind the wheel. The whole point of autopilot is to reduce the driver's workload. It does that by requiring less concentration. I think it's inherently dangerous to require human intervention in autopilot systems.

[–] Fubarberry@aiparadise.moe 1 points 2 years ago

I'm all for more accountability, but it's still better than human driving. Cutting human car deaths in half in exchange for murky accountability is clearly a worthwhile trade.