
Google says self-driving car hits municipal bus in minor crash

danarhea (Slayer of the DP Newsbot; DP Veteran; joined Aug 27, 2005; Houston, TX; Conservative)
Alphabet Inc’s Google self-driving car struck a municipal bus in a minor crash - in what may be the first case of one of its autonomous cars hitting another vehicle.

Damn, Google is going for a self driving car that is very realistic. It drives just like some of the drivers here in Houston. What's their next trick? Virtual road rage?

Article is here.
 
Damn, Google is going for a self driving car that is very realistic. It drives just like some of the drivers here in Houston. What's their next trick? Virtual road rage?

Article is here.

They just need to stay away from the Asian software developers and engineers. What they need to do is head to Alabama and get a couple of rednecks to wire the cars up. If you want a driver you've got to go where they drive! :lol:
 
Damn, Google is going for a self driving car that is very realistic. It drives just like some of the drivers here in Houston. What's their next trick? Virtual road rage?

Article is here.
If these self-driving cars catch on, the liability issues are going to be pretty dayem interesting! :doh
 
If these self-driving cars catch on, the liability issues are going to be pretty dayem interesting! :doh

Yes. The liability seems to be one of the main issues. What I understand is that it should be solvable, though. But as a technology it sounds like a great leap forward, and disruptive for taxi drivers.
 
Damn, Google is going for a self driving car that is very realistic. It drives just like some of the drivers here in Houston. What's their next trick? Virtual road rage?

Article is here.
Google trunk monkey!
 
If these self-driving cars catch on, the liability issues are going to be pretty dayem interesting! :doh
The ethics of the programming are quite interesting as well. In the event of an unavoidable crash should the software act to save the life of the driver above all else or act to save the most lives?
 
Damn, Google is going for a self driving car that is very realistic. It drives just like some of the drivers here in Houston. What's their next trick? Virtual road rage?

Article is here.
At least it was trying to drive; texters have given up on the idea altogether.
 
If these self-driving cars catch on, the liability issues are going to be pretty dayem interesting! :doh

I think most laws say someone has to be behind the wheel, and whoever is behind the wheel is responsible for any damage. Whether you can go after Google for any liability if their software screws up remains to be seen, but I am sure you are going to have to sign a waiver.
 
The ethics of the programming are quite interesting as well. In the event of an unavoidable crash should the software act to save the life of the driver above all else or act to save the most lives?

Regardless of the choice, there will be problems in both situations :

a) The car protects the driver, but 3-4 people die. No matter how you twist it, the driver doesn't have a better claim to life.
b) The driver dies, but 4 lives are saved. Great. Would you buy a car that makes that kind of decision for you? Would you buy a car that values the security of others more than your own? Would you get in a car that can decide "Meh, sorry owner. You gotta die. Greater good, ya know."?
 
The ethics of the programming are quite interesting as well. In the event of an unavoidable crash should the software act to save the life of the driver above all else or act to save the most lives?

Simple, self preservation.

Why? All the other cars that will be involved will act in self preservation as well, probably giving all drivers a better chance.

Also: manual override = driver makes his own decisions for himself.
 
Damn, Google is going for a self driving car that is very realistic. It drives just like some of the drivers here in Houston. What's their next trick? Virtual road rage?

Article is here.

Good morning, danarhea. :2wave:

Excellent! :lamo: :thumbs:
 
If these self-driving cars catch on, the liability issues are going to be pretty dayem interesting! :doh

This was the first instance of an at-fault collision for an autonomous car. Total liability could be exponentially reduced! Such a reality doesn't bode well for the casualty insurance industry.
 
The ethics of the programming are quite interesting as well. In the event of an unavoidable crash should the software act to save the life of the driver above all else or act to save the most lives?
Excellent point!

I first read about this quandary quite a while ago, and it weighs in my thoughts that I'll likely never own one of these vehicles unless forced upon me; I prefer to control these decisions myself, on a case-by-case basis as they arise. I do not believe it's possible for even our most brilliant minds to foresee every possible future event; as to every rule, there's always an exception!

This sounds right out of Isaac Asimov's "I, Robot" stories!

But then we already seem to have succeeded as a society in surpassing George Orwell's "1984", so it looks like the fantasy treatises of my childhood books are coming to light!

And as crazy as it sounds, we've even managed to march quite a bit down the path Ted Kaczynski predicted in his 'Unabomber Manifesto', which to me is *really* scary!
 
The ethics of the programming are quite interesting as well. In the event of an unavoidable crash should the software act to save the life of the driver above all else or act to save the most lives?

Regardless of the choice, there will be problems in both situations :

a) The car protects the driver, but 3-4 people die. No matter how you twist it, the driver doesn't have a better claim to life.
b) The driver dies, but 4 lives are saved. Great. Would you buy a car that makes that kind of decision for you? Would you buy a car that values the security of others more than your own? Would you get in a car that can decide "Meh, sorry owner. You gotta die. Greater good, ya know."?

Simple, self preservation.

Why? All the other cars that will be involved will act in self preservation as well, probably giving all drivers a better chance.

Also: manual override = driver makes his own decisions for himself.
It makes for an interesting philosophical question, but I don't think the researchers are devoting much time to it. For the car to even be capable of making those kinds of moralistic judgments, it would have to have enough situational awareness to understand the dilemma to begin with. And if the car had that kind of situational awareness, it would almost certainly have avoided the situation.
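For what it's worth, the "save the most lives" rule everyone is debating can be boiled down to a toy sketch. To be clear, this is entirely hypothetical: the maneuver names and risk numbers are made up, and it has nothing to do with Google's actual software. It just shows what a bare-bones utilitarian tiebreaker would look like once the car somehow had those casualty estimates:

```python
# Hypothetical sketch of a pure "minimize expected casualties" rule.
# All names and numbers are invented for illustration.

def choose_maneuver(options):
    """Pick the maneuver with the lowest expected total deaths.

    options: list of (name, p_occupant_death, expected_other_deaths)
    """
    def expected_casualties(option):
        _, p_occupant, others = option
        return p_occupant + others

    return min(options, key=expected_casualties)[0]

# Example: swerving spares bystanders but endangers the occupant.
options = [
    ("brake_straight", 0.1, 2.0),  # occupant likely fine, 2 others at risk
    ("swerve_left",    0.9, 0.0),  # occupant at high risk, bystanders spared
]
print(choose_maneuver(options))  # -> swerve_left under a purely utilitarian rule
```

Notice the whole argument in this thread is about that one line: weighting the occupant's death the same as everyone else's. A "self-preservation" car would just multiply p_occupant by some large factor, and the code gets no harder; the hard part is getting those probability estimates at all, which is the situational-awareness point above.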
 
This was the first instance of an at-fault collision for an autonomous car. Total liability could be exponentially reduced! Such a reality doesn't bode well for the casualty insurance industry.
Perhaps.

But I suspect it will open up a whole new genre of lawsuits against the auto manufacturer, since driver error or misuse has now been removed from the liability equation!

And this also brings interesting new wrinkles to the table, for example: "Is one liable to charges of DUI if 'operating' the vehicle while inebriated"?
 
Perhaps.

But I suspect it will open up a whole new genre of lawsuits against the auto manufacturer, since driver error or misuse has now been removed from the liability equation!

Sure. But that does not mean there will be more suits. IMO, that industry will be dramatically reduced, from insurance to civil damages to repair.

"Is one liable to charges of DUI if 'operating' the vehicle while inebriated"?

Is it a DUI to take an uber home?
 
Sure. But that does not mean there will be more suits. IMO, that industry will be dramatically reduced from insurance to civil damages to repair.



Is it a DUI to take an uber home?
The passenger is not the operator of an Uber, but he is the operator of a smart-car. He's the human in charge, so I think it's a grey area that will need to be explored and defined legally.

Can one be charged with DUI for turning the radio on while sitting in their parked, engine-off car in their own driveway, a car that is not being and has not been driven? The same precedents may apply here!
 
The passenger is not the operator of an Uber, but he is the operator of a smart-car.

Uber will be the largest single purchaser of autonomous cars when they are rolled out. I think they will be Teslas.

He's the human in charge, so I think it's a grey area that will need to be explored and defined legally.

At the moment? Sure. But i truly believe we will see reduced overall casualties.

Can one be charged with DUI for turning the radio on while in their car in their own driveway? The same precedents may apply here!
 
It makes for an interesting philosophical question, but I don't think the researchers are devoting much time to it. For the car to even be capable of making those kinds of moralistic judgments, it would have to have enough situational awareness to understand the dilemma to begin with. And if the car had that kind of situational awareness, it would almost certainly have avoided the situation.
The programmers may not be programming the cars to make these types of heuristic decisions (at this point), but when they are designing their algorithms they are taking these possible quandaries into account. They have to. How can they not?
 
I think most laws say someone has to be behind the wheel, and whoever is behind the wheel is responsible for any damage. Whether you can go after Google for any liability if their software screws up remains to be seen, but I am sure you are going to have to sign a waiver.

Which "laws"?

Most civil liability has a huge common law component.


And in any event, I'm not aware of any jurisdiction with a statute limiting the common law in the way you suggest. If your non-self-driving car has a defective gas pedal which gets stuck, causing you to get into an accident, you're not simply stuck with the liability. If the person you hit sues you, you turn around and sue the manufacturer.

Been a while since I took civil procedure, but I'm pretty sure it's called a "third party complaint."



I don't see why it would be any different with self-driving cars, unless a state decides to protect self-driving car manufacturers with a law that makes it impossible to sue the manufacturer with a third party complaint in those situations.
 
Last edited:
So, for any of the older nerds out there, I thought of this.

 
Creating software that can take into account the unpredictable nature of human drivers isn't going to be a piece of cake.

If the software is found at fault, it makes little sense to go after the 'rider' in that autonomous vehicle, they weren't driving.

Who does that leave as the next likely lawsuit target? The autonomous vehicle manufacturer, right?
 
So, for any of the older nerds out there, I thought of this.



That was a great movie for the first few minutes, until it was time to feed the pet, which was a balloon with feet. It all went downhill from there. LOL.
 
Regardless of the choice, there will be problems in both situations :

a) The car protects the driver, but 3-4 people die. No matter how you twist it, the driver doesn't have a better claim to life.
b) The driver dies, but 4 lives are saved. Great. Would you buy a car that makes that kind of decision for you? Would you buy a car that values the security of others more than your own? Would you get in a car that can decide "Meh, sorry owner. You gotta die. Greater good, ya know."?

As to b), yes, if it lowers my overall risk. As this was a minor crash with no injuries, compared to the several times a human driver has crashed into the Google cars, my bet is on Google. Also, I can take naps or whatever during a commute, and insurance costs will undoubtedly go down. So even if it were a slightly greater risk, and even if a computer could somehow *know* that by swerving left instead of right the crash will kill the driver and not others, AND if I had been driving myself the outcome would be different, AND the other cars in the crash don't make the same decision, it could be worth it.

But you know this technology will only improve. By the time these cars are actually for sale and at an affordable cost, the odds of your scenario will be low indeed.
 