cross-posted from: https://derp.foo/post/81940

There is a discussion on Hacker News, but feel free to comment here as well.

[–] [email protected] 82 points 1 year ago (4 children)

Bill the manufacturer 100%, IMO. That's why I think self-driving cars raise an unanswerable legal question: when the car drives for you, why would you be at fault? And how will businesses survive if they have to take full accountability for accidents caused by self-driving cars?

I think it's almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

[–] [email protected] 27 points 1 year ago* (last edited 1 year ago) (5 children)

The most basic driving, like long stretches of highway, shouldn't be banned from using AI/automated driving. Fast-paced inner-city driving should be augmented but not fully automatic. The same goes for driving in inclement weather: augmented, with hard limits on speed and automated braking for anything that could result in a crash (a rough sketch of those limits follows below).

Edit: I meant this statement as referring to the technology in its current consumer form (what is available to the public right at this moment). I fully expect that as the technology matures, the rate of incidents will decline. We are likely to attain a largely driverless society one day in my lifetime.
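To make the "hard limits on speed and automated braking" idea above concrete, here's a minimal sketch of one control tick in such an augmented system. Everything in it (the SensorFrame fields, the weather caps, the three-second time-to-collision threshold) is invented for illustration, not taken from any real driver-assist system:

```python
# Hypothetical augmented-driving limiter: clamp speed by weather and
# trigger automated braking when a collision looks imminent.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    speed_mps: float             # current vehicle speed, meters/second
    obstacle_distance_m: float   # range to the nearest obstacle ahead
    closing_speed_mps: float     # how fast we are approaching it

def weather_speed_cap(condition: str) -> float:
    """Hard speed cap (m/s) for a given weather condition (made-up values)."""
    caps = {"clear": 33.0, "rain": 22.0, "snow": 14.0, "fog": 11.0}
    return caps.get(condition, 14.0)  # unknown weather -> cautious cap

def control_step(frame: SensorFrame, condition: str) -> dict:
    """One control tick: clamp the target speed and decide whether to brake."""
    target = min(frame.speed_mps, weather_speed_cap(condition))
    # Time-to-collision check: brake if impact is less than ~3 seconds away.
    brake = (frame.closing_speed_mps > 0
             and frame.obstacle_distance_m / frame.closing_speed_mps < 3.0)
    return {"target_speed_mps": target, "emergency_brake": brake}

# Doing 30 m/s in the rain, closing on an obstacle 40 m ahead at 20 m/s:
print(control_step(SensorFrame(30.0, 40.0, 20.0), "rain"))
# -> {'target_speed_mps': 22.0, 'emergency_brake': True}
```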

[–] [email protected] 20 points 1 year ago (2 children)

"Self driving with driver assist" or whatever they call it when it isn't 100% automated is basically super fancy cruise control and should be treated as such. The main problem with the term autopilot is that for airplanes it means 100% control and very misleading when used for fancy cruise control in cars.

I agree that its use should be limited to highways and other open roads, the same situations where cruise control should be used. People using cruise control in the city without being ready to brake is the same basic issue.

100% full automation, with no expectation of driver involvement, should only be allowed once it has surpassed regular drivers. To be honest, we might even be there already, considering how terrible human drivers are...

[–] [email protected] 22 points 1 year ago

Autopilot systems on airplanes make fewer claims about autonomous operation than Tesla does. No pilot relies completely on autopilot functionality.

[–] [email protected] 4 points 1 year ago (1 children)

Autopilot in aircraft is actually kinda comparable: it still needs a skilled human operator to set it up and monitor it (and the other flight controls) all of the time. And in most modes it's not even really all that autonomous; at most it follows a pre-programmed route.

[–] [email protected] -1 points 1 year ago (2 children)

Can’t the newer ones take off and land as well?

[–] [email protected] 2 points 1 year ago

They can, but the setup is still non-trivial, and full auto-landing capability isn't used all that much even when technically available. It also isn't just about the capability of the aircraft: it requires a shitton of supporting infrastructure on the ground (the airport), and many airports don't support it.

The equivalent would be installing new intersections that broadcast the current signal state for each lane, which would help self-driving cars immensely (and eventually regular cars too, via assistive technologies that help drivers drive more safely), but that's simply not a thing yet.

[–] [email protected] 2 points 1 year ago

Yes, but the pilot still needs to pay attention and be ready to intervene

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

At this point, I vote for whatever is best demonstrated to be safe. I like your ideas.

I was also thinking maybe a standardized protocol could be implemented where municipalities broadcast a signal containing the local road rules, which the car's processor could then interpret. With enough bandwidth you could feasibly even give site-specific instructions, like extra braking distance and signal time at a specific intersection (or, you know, the light status lol), road state characteristics like dryness, or lockout areas around road work or accidents.
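As a thought experiment, a broadcast message under that protocol might look something like the sketch below. Every field name and value here is hypothetical; real V2X standards (e.g. the signal phase and timing messages in SAE J2735) cover some of this ground, but this sketch isn't modeled on any of them:

```python
# Hypothetical municipal road-rules broadcast that a car's processor
# could parse; the schema is invented for illustration.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class IntersectionAdvisory:
    intersection_id: str
    signal_state: dict            # lane -> "red" | "yellow" | "green"
    extra_braking_distance_m: float = 0.0
    extra_signal_time_s: float = 0.0

@dataclass
class RoadRulesBroadcast:
    municipality: str
    speed_limit_kph: float
    road_surface: str                                  # "dry", "wet", "icy"
    lockout_zones: list = field(default_factory=list)  # closed segments
    intersections: list = field(default_factory=list)

msg = RoadRulesBroadcast(
    municipality="Exampleville",
    speed_limit_kph=50.0,
    road_surface="wet",
    lockout_zones=["5th Ave between Main and Oak (roadwork)"],
    intersections=[IntersectionAdvisory(
        intersection_id="5th+Main",
        signal_state={"northbound": "green", "eastbound": "red"},
        extra_braking_distance_m=15.0,   # ask for extra stopping margin
        extra_signal_time_s=2.0,         # and a longer signal window
    )],
)
print(json.dumps(asdict(msg), indent=2))  # the payload a car would interpret
```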

However, I also think that the driver should ultimately be responsible for the safety of the vehicle's operation for the time being, including when the vehicle is driving itself. The driver has the ability to hit the brake and take control. While the technology is this immature, it is irresponsible for the operator not to supervise it. Hit the manufacturer with a hefty fine for deploying unsafe technology, and fine the operator a much smaller but still meaningful amount for unsafe operation.

Unfortunately, due to the number of vehicles on the roads and the resource and pollution intensity of manufacture and maintenance, the best solution to these problems is to replace personal vehicle infrastructure, not to upgrade it.

[–] [email protected] 2 points 1 year ago (2 children)

Long stretches of highway are good unless there is a stopped emergency vehicle.

[–] [email protected] 5 points 1 year ago

I mean that's a huge issue for human drivers too.

We need assistive technologies that protect us, but if at any point the driver is no longer driving, the car manufacturer needs to take full responsibility.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

That's where the augmentation and the AI come in. ANYTHING that presents a potential hazard already takes a vehicle out of automated driving in most models, because after a few Teslas didn't stop, people started suing.

[–] [email protected] 2 points 1 year ago

I disagree. I feel that no matter how good the technology becomes, the odd one-in-a-million glitch that kills someone is not preferable to me over the accidents caused by humans (even if we assume self-driving cars crash at a lower rate than human drivers).

The less augmentation beyond lane assist and automated braking, the better, IMO. I definitely disagree with a speed cap built into the vehicle; speed should never be limited below whatever would melt engine components or something (and even that limiter should take time to kick in). The detriments such a system would cause when it malfunctions far outweigh the safety benefits it would bring.

[–] [email protected] 2 points 1 year ago

It's why I'm all for automated trucking. Truck drivers are a dwindling resource, and the lifestyle of a cross-country truck driver isn't a highly sought-after job. Self-driving should handle the long trip from hub to hub, and each hub should handle the last few miles. That keeps drivers local and fixes a problem that is only going to get worse.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

> I think it's almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

I agree on both points. Also, I think it's important to characterize the "innovation" of self-driving as more socioeconomic than technological.

The component systems (sensing, processing, communications, power, etc.) have a wide range of engineering applications, and research and development will inevitably continue no matter the future of self-driving. Self-driving only solves a very particular socioeconomic-technological issue, one that exists only because of how humans historically chose to address the same problem with older technology. Self-driving is more of a product than a "technology" in my book.

So my point is that I don't think a ban on full self-driving really qualifies as "holding back innovation" at all; it's just telling companies not to develop a specific product. Hyperbolic example, but nobody would say banning companies from creating a nuclear-powered oven was "holding back innovation". If anything, forcing us to re-envision human transportation without integrating legacy requirements advances innovation more than just using AI to solve the problems created by using humans to solve the original problem of moving people around in cars.

[–] [email protected] 4 points 1 year ago

I see it the same way, but an incredible number of people I've discussed this with say that it's stupid to hold back technological innovation "like self-driving cars". It's an unnecessary piece of technology.

I also just think the whole ethical complication is fucked. The way we have it now, every driver is responsible for their actions, and no driver ever glitches out on the freeway (and if they do, they bear the consequences). Imagine a man's wife and kids getting killed by a drunk driver vs. a self-driving car. In one scenario you can clearly place blame and take action in a much more meaningful way than just suing a car manufacturer.

[–] [email protected] 5 points 1 year ago (1 children)

Current laws in most places still require a licensed human behind the wheel of a self-driving car (and I don't see that changing any time soon, given how terrible self-driving cars still are). That makes the human driver, who should intervene in these scenarios, responsible.

Once we remove the human override, I would consider a self-driving car breaking the law to be a faulty product, possibly requiring a recall if it happens often. If any other part of the car were prone to failing, you'd demand a recall too.

As for the fines, you'd probably see something like "the driver receives a fine but they can hold the company that sold them the car liable for a faulty product".

Fining the manufacturer directly is a nice idea, but if Tesla does go bankrupt, where do we send the fines then?

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

I'm pretty sure there are autonomous cars driving around San Francisco, and there have been for some time.

EDIT: Here's an uplifting story about San Francisco-ians(?) interacting with the self-driving cars.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

The responsible party should be the owner of the vehicle, not the manufacturer or passenger. If a company runs an automated ride-share service, for example, that company should be liable. Likewise, if you own a car and use the self-driving feature, you are at fault if it goes wrong, so you use it at your own risk.

That said, for the owner to be truly responsible, they need ownership of the self-driving code, as well as diagnostics to monitor it. If they don't have that, do they truly own the car?

Also, there's nothing stopping a manufacturer or dealer from making a deal to cover self-driving fines.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

Well, exactly. I see no way that all the self-driving source code will be FOSS (I don't think corporations would ever willingly sign onto that). So the responsible party in the case of a malfunction should be the company, because in a full self-driving setup the occupant is not controlling the vehicle and has no reasonable way to ensure the safety of the code.

[–] [email protected] 0 points 1 year ago

Which is why it should be dual responsibility. The owner of the vehicle chose to use the feature, so they bear responsibility. If it malfunctions while the driver was following the instructions, the manufacturer bears responsibility. Both are culpable, so they should share it.