
Should your self-driving car kill you to save a school bus full of kids?

Should your self-driving car kill you to save a school bus? | Digital Trends
Networked and "more intelligent," per se, or not, I have my concerns about self-driving cars. There are situations where the lesser of two evils has to be decided, and it should be decided by a human, not a mindless machine that has no context; networking supplies only a small portion of that context. Take their example: what if the school bus was empty and just returning to the yard? Now we're back to one-on-one for potential lives lost, unless there's more than one person in your vehicle, and how would the car know any of that? Sure, sooner or later perhaps all that information would be there, but I kind of doubt it.

Secondly, in that scenario, why would the smart car swerve to avoid the initial car at all if the only other choices were to plow into a bus or kill its own driver? Why not just let the original accident happen, braking as hard as possible to reduce speed and damage, so that perhaps there's injury instead of death, or no injury at all? Whoever wrote that scenario was way too high to be considering what he or she was.
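Just to put the problem in concrete terms, here's a purely hypothetical sketch in Python; the choose_maneuver function, the maneuver list, and every casualty number below are my own made-up assumptions, not anything from the article:

# Hypothetical sketch of the "least casualties" rule such a car would have to apply.
# The problem is that the casualty numbers below are exactly the context the car
# does not actually have.

def choose_maneuver(options):
    # options: list of (description, expected_casualties) pairs
    return min(options, key=lambda opt: opt[1])

full_bus = [
    ("brake hard and hit the stalled car", 2),
    ("swerve into the school bus", 20),            # assumed full of kids
    ("swerve off the road, killing the occupant", 1),
]
empty_bus = [
    ("brake hard and hit the stalled car", 2),
    ("swerve into the school bus", 0),             # actually empty, heading back to the yard
    ("swerve off the road, killing the occupant", 1),
]

print(choose_maneuver(full_bus))    # picks sacrificing the occupant
print(choose_maneuver(empty_bus))   # the same rule now picks the empty bus

The same rule gives opposite answers depending on occupancy data the car almost certainly won't have.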
 
Logically, yes.

Personally, however, I think those kinds of vehicles should pretty much always come with a manual override for emergency situations.
 
Logically, yes. Personally, however, I think those kinds of vehicles should pretty much always come with a manual override for emergency situations.
Problem is that in an emergency there's not likely time to manually override and have any level of success, do you think? I mean, an emergency does indicate a time-sensitive issue, and with driving it's usually seconds, not minutes. How quickly would an auto system release control? My guess, since I have no actual knowledge, is that the benefit of a manual override would be discounted by the time it takes to switch over.
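Just to put rough numbers on it (the speed, gap, reaction time, and switch-over time below are all assumptions, not measured values):

# Back-of-the-envelope check: time available vs. time needed to take over manually.
speed_mps = 30.0                      # about 67 mph (assumed)
gap_m = 40.0                          # distance to the hazard when it appears (assumed)
time_available = gap_m / speed_mps    # roughly 1.3 seconds to do anything at all

perceive_and_react_s = 1.0            # rough human reaction time (assumed)
disengage_autopilot_s = 0.5           # assumed time to hit the override and grab the wheel
time_needed = perceive_and_react_s + disengage_autopilot_s

print(f"time available: {time_available:.1f} s, time needed: {time_needed:.1f} s")
# ~1.3 s available vs. ~1.5 s needed: the override kicks in after the moment has passed.

If anything like those numbers holds, the switch-over time alone eats the benefit.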
 
Problem is that in an emergency there's not likely time to manually override and have any level of success, do you think? I mean, an emergency does indicate a time-sensitive issue, and with driving it's usually seconds, not minutes. How quickly would an auto system release control? My guess, since I have no actual knowledge, is that the benefit of a manual override would be discounted by the time it takes to switch over.

Perhaps. However, it's the thought that counts... For me, anyway.

I want to know that I have, at the very least, the theoretical ability to potentially determine my own fate, if not the actual ability to do so.
 
As someone who lives in the boonies, I can point to a dozen streets missing from the GPS to tell you why self-driving cars are a dumb idea.
 
Logically, yes.

Personally, however, I think those kinds of vehicles should pretty much always come with a manual override for emergency situations.

Haven't you seen the Google cars? No brakes, gas, or steering wheel...

The ridiculous thing is that, if the road were filled with self driving cars, why should there be any accidents in the first place?
 
Haven't you seen the Google cars? No brakes, gas, or steering wheel...

The ridiculous thing is that, if the road were filled with self driving cars, why should there be any accidents in the first place?

No machine can ever be perfect if designed by imperfect people. I know: have a machine design and build the perfect machine. Great idea, except where do you get that first perfect machine? As long as the machine can trace its roots back to imperfect people, it can never be perfect.

I think at some point in the future the car will be obsolete, the same as the horse and buggy. However, the horse and chariot or buggy were around for thousands of years before being replaced by the car. The car is only a little over 100 years old. Hmm?
 
**** no, it shouldn't!

No one sane or rational would ever buy a vehicle operated by a computer program that would deliberately kill them.
 
I consider self-driving cars just another example of human laziness.

If people simply learned to drive defensively, there would be exponentially fewer accidents than occur now.
 
That's how it will go: first, insurance costs will go up for anyone who doesn't have a computer-driven car, and all accidents caused by computers will be blamed on "human error," until insurance is prohibitively expensive... Then people will be forced into a computer-controlled car.
 
So, if a car is 100% computer driven, then people should be legally allowed to "drive" drunk, right?

I don't think so. The car would by necessity also have to have manual functions... even if Google or whoever starts updating roads constantly and perfectly, there is the possibility of off-road situations. You could not perform those functions if you were drunk.


Now that said, the way cops catch you for DUI is observation of your driving...
 
I don't think so. The car would by necessity also have to have manual functions... even if Google or whoever starts updating roads constantly and perfectly, there is the possibility of off-road situations. You could not perform those functions if you were drunk.

Now that said, the way cops catch you for DUI is observation of your driving...
Sure, but while the article did not specifically say that manual overrides would be literally impossible, I was asking the question based on the article's implication that they would be.

If literally impossible, I would see no legitimate reason for there to ever be another DUI again... though I'm sure we'd keep those laws on the books, because that's just what we do.
 
Issues like that are what lead me to believe that while yes, we will one day all have self driving cars, auto pilot will be used much in the same way we use cruise control. A human will still be required to be sitting behind the wheel ready to take over if need be.

The only way around that is if they construct special roads for self driving cars only. Something akin to an HOV lane with barriers. The networked cars would then be working in much the same way trains do. No cross traffic and pedestrians not allowed.

At least until AI is much, much, MUCH more sophisticated.
 
Issues like that are what lead me to believe that while yes, we will one day all have self driving cars, auto pilot will be used much in the same way we use cruise control. A human will still be required to be sitting behind the wheel ready to take over if need be.

The only way around that is if they construct special roads for self driving cars only. Something akin to an HOV lane with barriers. The networked cars would then be working in much the same way trains do. No cross traffic and pedestrians not allowed.

At least until AI is much, much, MUCH more sophisticated.
Pedestrians, bicycles, animals, road debris... all would throw a monkey wrench into the mix.
 
As someone who lives in the boonies, I can point to a dozen streets missing from the GPS to tell you why self-driving cars are a dumb idea.

After driving for 2 miles down a dirt road, we ended up on the front lawn of a country couple's house - they were clearing brush, the husband was mowing. The wife gave us the real directions to the dig site, since with GPS "it happens all the time."
 
Issues like that are what lead me to believe that while yes, we will one day all have self driving cars, auto pilot will be used much in the same way we use cruise control. A human will still be required to be sitting behind the wheel ready to take over if need be.

The only way around that is if they construct special roads for self driving cars only. Something akin to an HOV lane with barriers. The networked cars would then be working in much the same way trains do. No cross traffic and pedestrians not allowed.

At least until AI is much, much, MUCH more sophisticated.

The problem is that, without actively driving, people will quickly lose the ability to drive and respond.
 
Aircraft already have, and have had for decades, collision avoidance systems. The systems do not take into account surrounding terrain, value of life, or any factor other than other aircraft. The systems do calculate vertical paths that can be flown to avoid collision. Soon, they will also have horizontal and vertical paths to avoid collision. But, my point is, if the vehicle is in a position that it can't stop for the vehicle in front of it, there is a logic error already. If the system was correctly designed it would never have to make that decision to begin with. Further, if all vehicles (as the scenario suggests) are on "auto-pilot" then why is there an accident to begin with?
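That logic can be stated as a simple following-gap rule. Here's a sketch with assumed values; min_safe_gap, the 0.2 s system delay, and the 6 m/s² braking rate are my own placeholders, not anything from an actual system:

# Sketch: a correctly designed controller never lets the gap to the vehicle ahead
# shrink below the distance it needs to stop, so the "swerve or kill" dilemma
# should never come up in the first place.

def min_safe_gap(speed_mps, system_delay_s=0.2, decel_mps2=6.0):
    reaction_dist = speed_mps * system_delay_s        # distance covered before braking starts
    braking_dist = speed_mps ** 2 / (2 * decel_mps2)  # distance covered while braking to a stop
    return reaction_dist + braking_dist

speed = 30.0  # m/s, roughly highway speed (assumed)
print(f"at {speed:.0f} m/s the car should never follow closer than {min_safe_gap(speed):.0f} m")
# If the controller enforces that gap, "can't stop in time" means a constraint was
# already violated upstream - the logic error described above.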
 
After driving for 2 miles down a dirt road, we ended up on the front lawn of a country couple's house - they were clearing brush, the husband was mowing. The wife gave us the real directions to the dig site, since with GPS "it happens all the time."
Yeah, my wife works in HR, hiring for an oil company, and when she calls people from home every so often it's the same thing: every time she must give the warning about driving to the sites.

"Use the directions in the email, if you use GPS it WILL bring you to the wrong place"
 