You're missing the point -- with a human driver there is accountability. If I, as a human, cause an accident, I have either criminal or civil liability. With an autonomous vehicle, the question of "who is at fault" gets murky. Then there's the fact that Tesla is not obligated to report the crashes. And the failures of automated driving are very different from human errors.
I don't think anyone is suggesting that we ban autonomous driving. But it needs better oversight and accountability.
In these cases the human is still accountable. Do you think that if a Tesla plowed into a kindergarten while using Autopilot the driver would avoid punishment? The driver is using a feature of the car. It tells you to stay alert and be prepared to take over on short notice. The ones crashing are the idiots who sit in the back seat, go to sleep, or play on their phones while Autopilot is on. The only self-driving setup right now where I would be in favour of punishing the company if something went wrong is those robotaxis where you're purely a passenger.
If you sit behind the wheel, you are responsible for what happens.
I don't think this is a practical take. If I'm driving a car, I'm in control and know my intentions. If I cause an accident, it's because I wasn't fully alert or did something stupid.
If Autopilot is driving the car, I don't know the car's intentions. It might create a dangerous situation before my brain can process that it's doing something bad and take over. If it sees something in the road that isn't there, it might swerve or brake, and I won't recognize it until it has already happened. And that's assuming an alert driver with full concentration behind the wheel. The whole point of Autopilot is to reduce the driver's workload, and it does that by requiring less concentration. I think it's inherently dangerous to require human intervention in autopilot systems.
I'm all for more accountability, but autonomous driving is still better than human driving. Cutting road deaths in half in exchange for murky accountability is clearly a worthwhile trade.