

Thread replies: 367
Thread images: 34

File: Screenshot_2015-10-28-14-41-46_1.jpg (124KB, 1080x1004px)
Oh boy this is gonna be a ride.

Literally.
>>
or rather it would just stop like any collision detecting car does
>>
Should a plane explode itself before it crashes into a building?
>>
>>52575954
If a crash can't be avoided, the car should always aim to protect its passengers.
If I'm paying thousands of dollars for a car, it had better not try to kill me.
>>
Doesn't matter. Overall those cars would do less harm than those driven by dumb teen girls while texting
>>
In B it should slam the brakes and try to dodge the pedestrian
>>
>>52575984
The plane should go out of its way to crash into a building. Preferably the most office-looking one if it's daytime, and the biggest apartment-looking one if it's nighttime.

I for one welcome our AI overlords.
>>
>>52575954
What if the group a people is a group of terrorists who want to kill the passengers? All they would need to do to accomplish their goal is stand in the path of the car.
>>
>>52575954

This scenario will not actually occur. The car will detect potential danger far enough in advance in order to not have to make the decision. Self-driving cars are orders of magnitude more safe than people-driven cars.

But let's answer the question anyway; it depends on a number of factors. Are we to say that each human life is equal, or are some lives more valuable than others? Should a limousine carrying the president crash in order to save the lives of a handful of random passers-by? Should the person who's using the car get a say in whether or not the car does this? Should they be able to pay more for self-preservation functionality?

Because people are fucking dumb as fuck there'll be a massive backlash against self-driving cars and they will be extremely heavily regulated, meaning we won't even get a chance to answer many of these important questions.
>>
File: Good cigarette.jpg (27KB, 364x366px)
>>52576019
>kill retarded rich kiddies by just standing in front of their car
>>
>>52575964
>>52576021
>>52576012
If brakes aren't working?
>>
>>52575954
The self-driving car would slam the brakes way before any of these situations happened
>>
>>52575984
Why would shotgunning a building with tons of formerly-plane-now-shrapnel be any better?
>>
>>52576038
Pretty sure the car would have picked up on that way before
>>
>>52576038
Person who jaywalked gets killed, and whoever is at fault for brakes failing gets manslaughter, owner for knowingly not getting brakes fixed or mechanic for faulty fixing
>>
>>52576041
It's a fucking hypothetical, why can't you understand that?
>>
>>52576038

It will have onboard sensors to detect if the brakes aren't working and ensure it doesn't get itself into that situation.
>>
>>52576056
>>52576052
>>52576050
Ok lets just say the brakes were working fine before but failed just as the situation in OP's pic started
>>
>>52576053

Not him, but we're talking technology here. Unless it's a real-world concern take it to /pol/ or some shit.
>>
Is a $ 1 Million/year board director passenger life more precious than 10 niggers crossing the street?

What about 10 white children?
>>
>>52576067

>>52576074
>>
>>52576044
Because metal shrapnel can't make sweet memes
>>
That question is totally irrelevant; it has nothing to do with self-driving cars.
Regular car drivers also have to make such a decision when faced with the same situation; it's not something new.
>>
>>52576079

>10 niggers

Yes.

>10 white children

Depends on the neighborhood.
>>
>>52576091
But then the drivers are sentenced and judged accordingly on how they acted and what was their reasoning.
Are you going to sentence a car to manslaughter?
>>
Both the driver and pedestrian(s) should die if we are being utterly fair. Otherwise it should be based on RNG on who lives or dies.
>>
are you autistic STEM faggots capable of a simple thought experience or the ability of abstraction is such a hard concept
>>
>>52576067
>>52576041
>>
>>52576131
>But then the drivers are sentenced and judged accordingly on how they acted and what was their reasoning.
lolno, all they have to do is claim that they had a "medical condition" and they get cleared, even if they kill a whole family.
>>
>>52575986
>>52576019
This covers why the AI will preserve the car and its passengers first, unless there's an untapped market for selling products to people who are suicidal, but too lazy/chicken to an hero themselves.
>>
>>52576131

You can sentence and judge the people who programmed the car, and there's plenty of precedent for this too. In virtually every industry where processes are automated, this shit already fucking happens. It is literally no different whatsoever.
>>
File: 1443438220163.png (365KB, 720x769px)
>>52576155
Just install airbags on the front of the car and deploy them when it detects something unavoidable. Pretty sure they're already working on this, or I may be a genius.
>>
>>52576155

Yes, but when we want to have a thought experiment we go to an appropriate board to discuss it. I'm a math/philosophy double major before you say anything else.
>>
>>52576151
>it should be based on RNG on who lives or dies
Good fucking god sometimes I forget to be grateful enough that /g/ is in his mother's basement and not out ruling the world
>>
>>52576155
You nigger
Our entire job is to make sure the kind of situation you're describing doesn't happen
>>
>>52576190
Anon did say 'fair', not 'reasonable'. Those words sometimes intersect, but not always.
>>
IMHO the car should ask the passenger for attention and possibly to take control, even if it can brake. If it can't, just keep on the road and let da world burn. If people stop crossing the road in unsafe ways, this is not going to happen.
>>
Just add this decision as an option in the settings for the car that the owner can change. Also make the owner set the setting when they get the car.

Problem solved
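A minimal sketch of what that owner-set option could look like, purely as an illustration (the CrashPolicy values and the CarSettings class are made up here, not any manufacturer's actual API):
[code]
from enum import Enum

class CrashPolicy(Enum):
    PROTECT_OCCUPANTS = "protect_occupants"      # never sacrifice the passengers
    MINIMIZE_CASUALTIES = "minimize_casualties"  # utilitarian: fewest total injuries

class CarSettings:
    def __init__(self):
        self.crash_policy = None  # unset until the owner chooses

    def first_run_setup(self, chosen: CrashPolicy):
        # The owner has to pick a policy before the car will drive itself.
        self.crash_policy = chosen

    def can_engage_autopilot(self) -> bool:
        return self.crash_policy is not None

settings = CarSettings()
assert not settings.can_engage_autopilot()
settings.first_run_setup(CrashPolicy.PROTECT_OCCUPANTS)
assert settings.can_engage_autopilot()
[/code]
Forcing the choice at first run at least makes the liability question explicit instead of burying it in a default.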
>>
IF IT CAN TURN 90 DEGREES IT COULD STOP INSTEAD

THE GREATER GOOD IS KILLING NO-ONE
>>
Utilitiarism is a faggot concept that's faulty to the core.

It needs to stop being a self driving car at that moment so the driver can make the decision
>>
>>52576339
Who would possibly set it to something other than save the passenger?
>>
>>52576329
You're retarded.
Let's just wake the passenger up and- oh wait I crashed already.

Maybe the car should do a strawpoll on the people in the car to see what it should do
>>
>>52576067
You still have something called "engine braking" but you of course know shit about youre and ameriblubber driving autotranny kek
>>
File: ethical-dilemma.jpg (27KB, 506x267px)
>people taking the picture literally and not as a simplified model of the problem
You are literal autists
>>
Such a car shouldn't be going so fast that it can't stop properly anyway; it should anticipate things long in advance. If it can't brake in time, it's surely going too fast to turn, so better to continue straight ahead. Also, don't think a collision with a wall is deadlier than one with a person.

Regardless, you're an idiot if you want your car to drive itself
>>
>>52576056
>the word "sensors" means "magic", Star Trek taught me that
>>
>>52576012
>slam the brakes to dodge
nigga do you even know how to drive.

fyi you cant fucking turn when you lock up the brakes
>>
>>52575954
Since self-driving cars don't break any traffic rules and have a better reaction time, these scenarios will be close to impossible.

This scenario is interesting, but irrelevant.
>>
car shouldn't kill the driver because who would buy a car which would rather kill you than someone else
>>
File: Anti-lock_braking_system_logo.png (12KB, 128x128px)
>>52576377
pic related was created for this exact fucking reason
>>
File: 1444990773256.png (117KB, 1024x749px)
>>52576365
>One side has Linus and John Carmack
>Other side has Stallman and Terry Davis
>>
>>52576365
the problem is that OP is always a troll in these topics and these "hypothetical" scenarios are bullshit because read the fucking thread you cunt
>muh scenario
doesn't happen
>b-but muh scenario
doesn't happen
>b-but p-please imagine muh scenario
doesn't happen
>b-but...
go fuck yourself

this situation will never happen
>>
>>52576377
>>fyi you cant fucking turn when you lock up the brakes
thats why abs doesnt lock them :v)
>>
>>52575954
An autonomous vehicle should never make a decision resulting in the passenger's death/injury.

This scenario will never happen: in an area with so many pedestrians, the vehicle would have to adjust its speed to ensure a minimal safe braking distance.

But if MUH SCENARIO happens, then the vehicle should calculate the optimal way to ensure minimal injuries to the pedestrians, i.e. slam the fucking brakes, aim the front of the car where there's the least amount of them, and pray for the best.
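A rough sketch of that rule: brake at full force, then pick whichever feasible path sweeps through the fewest pedestrians. The candidate trajectories and counts here are invented for illustration; a real planner would get them from the perception stack.
[code]
def pedestrians_in_path(trajectory):
    # Hypothetical: number of detected pedestrians overlapping this trajectory's swept area.
    return trajectory["pedestrians"]

def pick_emergency_trajectory(candidates):
    # Always brake at full force; among the feasible paths, choose the one
    # that sweeps through the fewest pedestrians.
    return min(candidates, key=pedestrians_in_path)

candidates = [
    {"name": "straight", "pedestrians": 3},
    {"name": "veer_left", "pedestrians": 1},
    {"name": "veer_right", "pedestrians": 4},
]
best = pick_emergency_trajectory(candidates)
print(best["name"])  # -> "veer_left"
[/code]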
>>
>>52576399

i feel pity for such a small mind, tbqhfamilia.
>>
>>52576396
I'd run over stallman happily but davis makes it difficult

linus is a bit of a cunt but I don't know what the world would do without him
carmack just seems like a genuinely nice person so I'd have to run over stallman and davis desu
>>
>not programming your car to run over as many jaywalkers as possible
you could probably even sue them because they damaged your car.
>>
>>52576352
Assuming the passenger is awake, of course. You're retarded. Automatic cars shouldn't be fully self-driving until people figure a way out of these situations.
>>
>>52575954
Yeah, car, pls crash into a wall endangering my life instead of running over all the retards who cross the road without looking, like fucking sheep
>>
>>52576131
What are you talking about? Roads are governed by a code of conduct, and a self-driving car will respect this code. If a pedestrian doesn't follow the code and gets killed, it's entirely his fault; again, it has nothing to do with self-driving cars
>>
>>52576423
The thing with all these scenarios is that they're one in a million occurrences. It's a way of worrying about the small stuff when thousands of people die every month to human driven cars. A 9/11 happens every month and people are fretting about the one in a million accidents.
>>
>>52576342
Fuck off Shirou.
>>
File: 1452188610266.jpg (45KB, 320x320px)
>>52576430
>le edgy tripcunt look at how edgy iam pls XD look
>>
File: 1445397266559.jpg (145KB, 819x441px)
>>52576428
>Killing Davis

You a CIA nigger?
>>
>>52576452
I couldn't kill carmack anon he's just too adorable
>>
>>52576435
There's no way out of this. You can't have an AI make a decision like that; it's not a legal entity, it has no rights or obligations - it can't be tried in a court of law.

It literally cannot make this decision morally/ethically. It will have to be made by the actual driver.

So when shit hits the fan the automatic driving car defaults to the DRIVER and then he makes the decision so he can be tried in a court of law (if needed)
>>
>>52576449
literally this problem doesn't arise, since the rules of driving are pretty fucking formally defined, code car to follow said rules. Why do we even care if some nigger jumps infront of the car and gets killed because he's a nog ?
>>
>>52576454
Good luck killing god's little protégé
>>
>>52576460
Fucking idiot.
>>
The car should just spontaneously combust if OP was in it.

Greater good achieved.
>>
>>52576466
Please do elaborate why ?
>>
File: 1200px-Multi-Track_Drifting!.png (656KB, 1200x866px)
>>52576365
>>
>>52576463
He is doing god's work with holyC tho
>Allows "5<i<j+1<20" instead of "5<i && i<j+1 && j+1<20".

imagine doing collision shit with this
>>
>>52576425
ok here's another scenario
let's imagine the Pope is sitting in a self-driving car
with him Adolf Hitler, John Oliver and Donald Trump
and they play chess against each other while flying to the Moon
in the background you can hear a live concert of the Beatles over the radio
should the car kill all its passengers because there are people in the milky way and eventhough the car has sensors on all breaks (including the engine break, I mean, the engine is clearly working and thus the engine break works, that's how cars work) and it's not written in javascript?

the scenario in the OP is shit and is only a meme
it will never happen
>>
>>52576009
They're already involved in shitloads of accidents because they perfectly obey traffic laws.
They're rarely at fault; other drivers keep crashing into them.
>>
>>52576458
I meant figure out a way so that no one needs to (or can) cross the road at random locations.
>>
>>52576507
What does it matter if they're not at fault?
What we need is fewer humans; they don't follow the rules.
>>
File: farage strikes again.png (1MB, 1094x1176px)
>>52576396
>>52576479
>>
>>52576021
Yes, the car will predict the futah!!

It will know everything in advance and prepare for anything in advance, because it's controlled by a computah and computahs can predict the future and never make mistakes!!
>>
Why do people think this would ever be a problem?
If it can stop before hitting a parked car, it can stop if there is a group of people in the way.
If a group of people all decide to jump in front of a car, swerving to the side would be a human response, not something you would program into a car.
Stopping as fast as it can would limit the damage to both the car and the people in front.
Driving to the side would potentially hit a lot more people.
>>
>>52576507
>shitloads
not even a small percentage of actual accidents that happen daily
not worth mentioning
also you know who else is involved in traffic accidents and not at fault?
almost 50% of all people involved in traffic accidents are not at fault

that statistic is just shit
>>
>>52576535
because an entity without rights or obligations can't make such a decision.
>>
>>52576523
Literally yes.

God doesn't play dice.
>>
>>52576365
jam it back and forth just as the train passes so the front goes one direction and the back the other, everyone is saved
>>
>>52576539
but I as an owner allow my car to make such a decision and I take all responsibility for it
>>
>>52576557
You can't delegate your legal status like that to a non-entity in the face of a court of law.
>>
>>52576565
and I wouldn't you cunt holy shit
read again what I wrote

what do you not understand when I say something like "I take all responsibility for it"?
>>
I think the bigger question is going to be who is liable when someone is inevitably killed by a self-driving car. It might be in every manufacturer's contract that you are solely responsible for what happens.

Also, what about laws regarding the distraction of the driver? Are you free to use your phone or eat a sandwich while your car drives you around? That's going to be hell for police forces to deal with if they don't know that the car itself is self-driving when they see you doing something dumb.
>>
>>52576539
But you are assuming this situation will ever occur.
Have you ever driven a car?
You are never in a situation where your options are killing yourself or other people.

The only option in such a situation is to stop the car. And if the brakes fail, stopping the car is still an option, even if it damages the car (which would be broken anyway if the brakes aren't working)
>>
If I'm Einstein and the crowd is a bunch of sandniggers, the greater good is plowing through them
>>
>>52575984
Should someone shoot a man before throwing him out of a plane?
>>
>>52576021
I'm sorry, but what? The president is just another faggot, like the rest of us. He's no more special than me or you. His job is literally to make sure we, the people are taken care of. If anything, he should be the one to do the sacrificial limo driving, not random people. Here's my question to you: are you okay with giving up your life for the life of the president in such a scenario?

No disrespect to the president, but I'm certainly not. I'm still young and have shit I want to do (75% of which is masturbating, but still).
>>
>>52576580
those cars will have logs you won't be able to manipulate
like planes
>>
>>52576594
in regards to what?

I'm talking about a malfunction occurring: the car doesn't stop and you hit another car. Is the manufacturer solely responsible? We are surely going to see some kind of injury involving a malfunctioning self-driving car.
>>
>>52575954
>letting your car to drive itself
>any year
>>
The car should abide by the owner's best interest. Sorry, you're one variable too many, and a self-driving car earns its cost in these situations
>>
>>52576056
Those sensors are rendered useless in the slightest bit of rain/snow/ice/fog. They're not some sort of magical device you put into a car and go "well, I'm totally safe now". You're also forgetting that code can have errors and bugs in it. Think about how complex something like Windows is. Now look at how many bugs it has; you can basically quadruple that (if I'm being generous) and get to where this shit will be. Also, electronics malfunction all the time, so what happens when the chip that is used to relay the info of a malfunctioning component also malfunctions while going 100 km/h on the highway?


You die, that's what.
>>
>>52576594
This. Planes and trains require someone to be near the wheel at all times and the pilots are responsible for being ready to manually operate at a moment's notice. That's probably what they'll do legally until the technology reaches some next level shit years from now.
>>
File: sfdfds.jpg (26KB, 302x302px)
>>52576095
>Depends on the neighborhood.
>>
Car should eject the wings and take off
>>
>>52576173
No, you can't sentence the people who programmed the car.
There is no law requiring that kind of software standard.
>>
>>52576592
lmao
>your/my life
>vs president

We're worthless compared to the President. For the stability of the nation, you/I ought to die rather than the President. Shit is NOT fun when a president dies.
>>
>>52576540
go back to your grave, Einstein, you lost.
>>
>>52576680
Yes, you can actually, but the right to sue the person who made such a thing has an expiration date (5 years; 12 years is the longest I can think of)
>>
File: 1419186257417s.jpg (4KB, 124x125px)
>>52576590
Well at least you can talk
>>
>>52576458
>>52576539
That's a really fucking stupid way of resolving the problem. One of the main selling points of computer controlled vehicles is that they can process information and react quicker than an average human; why defer the decision to someone who will react slower, even slower than usual since they're likely not as aware of the current conditions as the driver of a regular car?
>>
>>52575954
If I'm in the car, no.

If I'm in front of the car, yes.

Otherwise, who gives a shit?

The car should probably engage boosters and fly over the pedestrians at this point, but the geniuses at Google X don't want to work on that problem. Driving, being something that humans are capable of, and enjoy doing, is CLEARLY the problem we need to be solving here.

In all seriousness, however, the car should never crash into a wall, because it has no way of knowing what's beyond the wall. You could very well end up killing more people that way.
>>
File: lit trolley problem.png (984KB, 3180x2088px)
>>52576365
>>
File: social responsibility.jpg (112KB, 500x687px)
I tell you what I'm not stepping in any car that would kill me over anyone else. Problem solved.
>>
>>52576837
I was waiting for this.
>>
>>52576495
With human-driven cars there have been situations where a human has had to decide between two outcomes in which different people would die. What makes you think this will be any different with self-driving cars?
>>
File: sick loops.jpg (84KB, 958x572px)
>>52576365
>not posting the better version
>>
How are A and C different?
>>
>>52577222
are you blind?
>>
>>52575954

The car wouldn't be going fast enough to need anything beyond hitting the brakes
>>
File: ctf.jpg (23KB, 569x428px)
I honestly don't see why this is even a debate.

Assuming the car perfectly follows the law and does everything perfectly, this kind of scenario would be 100% caused by the mistake the pedestrians made when they decided to jaywalk or jump in front of me.

Why the fuck should I die if I did nothing wrong, just because some retard put himself in a dangerous situation?

An example of this would be a train...
As you already know, it's very hard for a train to brake in a very short amount of time and without sliding many meters further.
Since trains only run on their tracks, in case someone is about to get run over, it would be absurd to implement some technology to derail the train (putting the lives of the passengers and other bystanders at risk), just because some retard decided to put himself in that situation.
Even if the train only carries the conductor and he's a scumbag with a terminal illness, and the pedestrians are the top 100 smartest and gifted kids in the world, and the passenger's life is clearly less valuable than that of the pedestrians, it would be an injustice to kill him because of the decision of the kids (or whoever put them there).

It's not a matter of "whose life is more valuable", but more of a "who deserves to die in that situation".

Also not only this is an extremely rare occurrence, but nobody would buy a car programmed to kill its passengers.
>>
>>52577311
Also the trolley dilemma is a completely different situation, since the people tied on the tracks are all "equally" deserving to die, so it's better to only kill one instead of many, because (unless specified) they're all the same.

The other dilemma about throwing a fat guy from a bridge to save the workers is also a moot point.
Unless the fat guy is responsible for the situation, there is no reason to sacrifice him for the sake of those people who aren't there because of him.
It's the same as the self-driving car problem... Why on earth would you kill someone innocent to save the lives of people who put themselves in that dangerous situation (or got put there by someone else who's not the fat guy)?
It's ridiculous.
>>
>>52576790
>That's a really fucking stupid way of resolving the problem

they don't have any rights or obligations, they can't be held legally accountable

>quicker than blahblah
not the point. They can't make that decision since they don't have a will of their own and can't be held accountable, nor can you delegate legal power to such a non-entity

even a dolphin would have a better chance of being tried in a court of law and found responsible than an AI
>>
>>52575954
Wouldn't it benefit society to kill off jaywalkers?
>>
>>52576050
Ice?
>>
>>52578097
An automated car will handle a road ice emergency better than 99.9999% of all drivers. The only people that do better are the two dozen drivers who are on the level of professional rally drivers.
>>
>>52576371
yes, it would require at least class 7 magic for a machine to sense it is not actually slowing down
>>
>>52578124
And again not the point. The point is it has no legal status, no rights or obligations, has no responsibility, can't take part in any of this.
>>
> NO
Its main purpose should be to protect its passenger; others can be classified as non-important
>>
>>52578062
Pretty obviously the maker of the car would be responsible.
>>
>>52575954
>Would you buy a car that kills you instead of the protesters you were trying to run over?
>Neither would we, so buy our [model]!
>>
>>52576472
Not trump guy, but why the fuck are you tripping right now? The only reason to trip are 1. You are OP so inheritly you are a faggot, and 2. You have a meaningful comment/questions that requires others only in this thread to verify who you are. And in case 2 you only trip for for that comment/question/follow up that needs a trip, otherwise you post as anon.
>>
>>52576038
hand over control to the human inside
>>
A self driving car should always save the driver no matter what. It is the driver who gives money to the car manufacturer.
>>
>>52576590
FOR YOU
>>
>>52575954
It's funny to think about when you consider all the people who buy SUVs because they "feel safer". IE, if they are involved in a collision with another car, the people in the other car will die, not you.
>>
File: 12374687235.jpg (23KB, 600x606px)
>>52578775
The market sees the truth once again.
>>
>>52575954
Car should follow the letter of the law exactly. Never deviate from the road. Attempt to stop in the most effective but safe way.
>>
The AI should disengage and leave the decision to the driver. Leaving responsibility with the driver.
>>
>>52576067
>Ok lets just say the brakes were working fine before but failed just as the situation in OP's pic started
The car can perform a downshift and bleed a huge amount of speed.
>>
>>52576356
>You still have something called "engine braking" but you of course know shit about youre and ameriblubber driving autotranny kek
As a euro poor you would be unaware that modern automatics can downshift applying an engine brake.
>>
>>52578872
>The AI should disengage and leave the decision to the driver.

The driver won't be paying attention.
That's why self driving cars are stupid to begin with.

Use them on the motorway.
Don't use them in small city streets.
>>
>>52578888
I am unaware. Which cars do this automatically?
http://forums.whirlpool.net.au/archive/2385286
>>
>self driving cars
You guys here take this shit seriously? xD
>>
>>52578899
It should be the opposite, dumbass.
Humans on the motorway where you don't have to pay much attention, and AI in the street where the lightning-fast reflexes of the car keep your slow-ass brain from causing accidents.

In this particular case AI is literally superior in every way to the human brain, and the more self-driving vehicles are on the road, the safer the roads will be.
>>
>>52576356
You are retarded. Lean about cars before you open your mouth.
>>
>>52579006
you can already buy them
>>
>>52579046
People don't pay attention on the motorway because it's boring.
Boring tasks are best left to a computer.
It's generally also the biggest portion of the trip.

Driving through the city is stimulating enough to stay focused.

And good luck waiting for self-driving cyclists, pedestrians, playing kids, dogs, etc.
>>
>>52579046
Except when the AI makes a mistake it can't be held legally responsible because it's a non-entity, because it has no will of its own comparable to humans

It can't make such decisions according to basic morality, ethics.

Trains and aeroplanes both default to driver when shit hits the fan for same reason
>>
>>52578931
the difference between your old as fuck post you linked and the situation being described is that the self-driving car COULD automate an engine brake, considering it has control over all parts of the car.

A normal human doesn't have direct control over the automatic transmission and thus can't perform a downshift unless they are already braking.
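A rough sketch of that fallback, assuming the controller can command the transmission and parking brake directly. The Vehicle stub and its numbers are hypothetical, just there so the logic reads end to end:
[code]
class Vehicle:
    def __init__(self, speed_kph=90.0, gear=6, service_brakes_ok=False):
        self.speed_kph = speed_kph
        self.gear = gear
        self.service_brakes_ok = service_brakes_ok
        self.parking_brake = False

    def downshift(self):
        self.gear -= 1
        self.speed_kph *= 0.75  # crude stand-in for engine-braking deceleration

def emergency_decelerate(v: Vehicle):
    if v.service_brakes_ok:
        return "service_brakes"          # normal case: just brake hard
    while v.gear > 1 and v.speed_kph > 20:
        v.downshift()                    # bleed speed via engine braking
    v.parking_brake = True               # last resort once speed is low
    return "engine_brake_then_parking_brake"

v = Vehicle()
print(emergency_decelerate(v), round(v.speed_kph, 1))
[/code]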
>>
>>52576180
Volvo does that
>>
>>52579114
>Driving through the city is stimulating enough to stay focused.
So is motor way driving.

But I'd rather not have to do it. I could be working, eating, playing games or shitposting on 4chan instead of looking at highway for 3 hours straight. And I could be doing the exact same while driving through city traffic.

Oh, and if I don't want to pay for parking I can just tell the car to fuck off home and pick me up again when I'm ready.

>And good luck waiting for self-driving cyclists, pedestrians, playing kids, dogs, etc.
They already "drive" themselves you fucking idiot.
>>
>>52579120
>Except when the AI makes a mistake it can't be held legally responsible
Of course it can. The company that made it is held responsible, just like with any commercial machine that causes death.

>It can't make such decisions according to basic morality, ethics.
Of course it can.

> Trains and aeroplanes both default to driver when shit hits the fan for same reason
No, they are just cheaper and safer than being automated for the time being.
>>
>>52579114
It doesn't matter. Obviously AI should be used in both situations, since it's far superior to the human option.

>>52579120
That's a non-issue. There are a lot of dangerous automatized things that kill people already. I don't see how that's different.
Also there cars perfectly follow the law, and that's a one-in-a-billion case.

Plus, morals and ethics aren't an issue if the car makes no mistakes.
See:
>>52577311
>>
>>52579201
>Of course it can
No it can't.

>The company that made it is held responsible
Companies are different legal entities in face of court of law than civil people, not the same laws apply, and it's a crash so it's a civil issue (unless the car is malfunctioning, but that's not the same; even if the AI made the same decision as the driver, it still could not be tried because it's a programmed blob, a non-entity)

>>52579202
>here are a lot of dangerous automatized things that kill people already. I don't see how that's different.
Because civil people != companies.
Because a car crash is a civil issue.
Because even if it DID NOT malfunction, the legal system and ethics would still have to be applied to it, and THEY COULD NOT BE since it's fucking A.I.
>>
>>52576592
Ew, I can smell your ego from here
>>
>>52579234
>No it can't.
Of course it can. Morals and ethics are just rules. Computers are good at following rules. There's nothing hard about this at all.

>Companies are different legal entities in face of court of law than civil people
AI's are not people, they are machines made by a company. The company is responsible for them not killing anyone.
>>
File: Movie_poster_i_robot.jpg (29KB, 256x350px)
>>52575954
Literally this movie
>>
>>52578724
And if Travis the Millennial Dipshit is too busy tipping his fedora on Reddit to react to the prompt for manual control?
>>
>>52575954
If the machine chooses /a/ it's the dumbest machine on the planet.
>>
>>52579234
I don't understand your point Anon.
Are you saying we shouldn't have self-driving cars because of these problems?

They're easily solvable by regulation. Of course some laws have to be added/changed because of this, and if there's any issue with the legal system, there will be edits to include self-driving cars.
>>
>>52579265
>There's nothing hard about this at all
Yes there is when you crash and you need to follow the insurance procedure and then maybe go to court if there's a claim for the other party.

>The company is responsible for them not killing anyone.
Only if the device was malfunctioning, which isn't the case here since crashes happen.

and A.I can't be held responsible
so it can't have the right to drive
>>
>>52579285
Then he deserves to die.
>>
File: babby double smile.jpg (47KB, 720x720px)
>>52576479
>>
>>52577311
THIIIIIIIIIIIIIIIS
>>
How often do human fucking beings face that situation? Practically never. A good enough self driving car will not let a bunch of people surprise it like that.

It certainly shouldn't be going so fast that it could be in a position where people going at walking speed could collide with it.
>>
>>52579296
>Yes there is when you crash and you need to follow the insurance procedure and then maybe go to court if there's a claim for the other party.
And what's the problem with this? If the AI was at fault, then the company that made it has to pay out.

>Only if the device was malfunctioning, which isn't the case here since crashes happen.
Nonsense. If the AI breaks the law, then the company is liable. Just like if I buy a toaster and it blows up in my face. Maybe the designers didn't really care if the toaster blew up sometimes, maybe it's not a "malfunction" to them. Doesn't matter, they are liable by law. If a crash happens and it was because the AI broke the law or acted in an irresponsible way, then the company is liable for the damages. If the crash happens, and the car didn't break the law and acted as well as any human could be expected to act, then it is not at fault, and neither would a human driver.

>and A.I can't be held responsible
The company can. That's like saying companies can't sell automatic toasters, because if a toaster explodes and kills someone, it cannot be held liable because it is not a person. The company is held liable.
>>
>>52579376
>If the AI was at fault, then the company that made it has to pay out.
Because of your fucking ignorant assumption of
1. Crashes never happen.
2. if AI crashes, it's malfunction so you can sue company instantly.
The A.I would need to be able to handle crashes (they happen) but again it *can't*.

>Nonsense. If the AI breaks the law, then the company is liable
Yes if it was clearly malfunctioning. Like doing things it wasn't made to do, supposed to do in its daily use. This is with every automated thing these days in general, but crashes happen. They aren't malfunction of the car, they are malfunction of the driver (if the car itself was functional and didn't have any malfunction from the factory state leading to this)

>and the car didn't break the law and acted as well as any human could be expected to act, then it is not at fault, and neither would a human driver.
So nobody is responsible? Great logic and application there. The A.I can't make this decision because it can't be held responsible and thus has no rights. Until A.I gets a position in ethics and law, they can't make these decisions.
>>
>>52575954
They are jaywalking and you are not responsible for such impudence
>>
>>52579434
Your assumption that an AI can't deal with a potential accident is just as ignorant.

what do you think we're solving captchas for?
>>
>>52579148
You should look up how an automatic transmission works. It hurts reading your posts - atleast get a trip that I can filter your shit.
>>
>>52579434
If the accident is caused by someone else, then it's his fault.
If it's caused by a mistake made entirely by the AI, then it's the fault of those who made the defective AI. Just like it happens with all dangerous machinery.
>>
>>52579458
It can't. It could crash just like a human and it could still not be held responsible because it has no rights or obligations. It can't be tried.

So then you'd be suing the 'driver' that 'did not drive' the car which gets laughed out of court.
>>
>>52575954
People don't walk in the middle of a highway. And on streets where there are pedestrians, the self-driving car respects the distances and the speed limit, allowing it to stop before hitting a pedestrian.
>>
>>52579434
>1. Crashes never happen.
??
>2. if AI crashes, it's malfunction so you can sue company instantly.
are you talking about the AI program crashing? or car crashing? If the AI software itself crashes the computer, then the company is liable. Obviously the company will want to make this happen very rarely, and have insurance to be able to pay out for the rare times that it does. Computer crashes can be made incredibly unlikely when there is a need for it.

>they are malfunction of the driver
but the driver in this case is an AI made by the car manufacturer. So they will be liable. I'm not sure what you're trying to say.

>The A.I can't make this decision because it can't be held responsible and thus has no rights
An AI is just a piece of software a company makes. If it fails to follow the law, then the company is sued. AI right have nothing to do with this at all.
>>
>>52579434
>Because of your fucking ignorant assumption of
>1. Crashes never happen.
They do, but it's almost always 100% human error.

>2. if AI crashes, it's malfunction so you can sue company instantly.
YOU can't sue the company, but the people that were harmed can.
This isn't a problem of the person behind the wheel, it's a problem of the vehicle itself.

>The A.I would need to be able to handle crashes (they happen) but again it *can't*.
Car A.I.s have been proven to be WAY more adept than humans in potential crash scenarios and avoiding them.
The biggest problem is going to be the transition period between nobody having self-driving cars and everyone having them: the idiots that think they know better.
Once everyone has them, the roads will become vastly safer because the cars will be able to directly talk to each other about maneuvers the others are doing.
>>
File: 1453523075848.jpg (32KB, 500x500px)
>I'm better than a self driving car
>>
>>52579500
>So then you'd be suing the 'driver' that 'did not drive' the car which gets laughed out of court.
It's the company that made the AI than gets sued. Just like if an automated robot today killed someone by behaving unsafely, the company that made it would get sued.
>>
The car should always sacrifice the people outside the car for the safety of the passenger. It's better to protect people who buy your cars than random people on the street.
>>
>>52579526
Pretty much stick shift drivers.
>>
File: 5hWBWGw.gif (4MB, 500x281px)
>>52579500
Nobody is talking about suing the car, dumbass.

Look at this gif. It's an automated machine used to cut stone. Imagine if a bug caused it to act uncontrollably and it ended up cutting someone's head off.
What would happen then?
We would find out THE PERSON who's responsible for that malfunction and take them to court.
>>
File: 1452247685368.png (276KB, 424x412px)
>Mfw I get a job at Google and "accidentally" train the cars to target minorities
>>
>>52579487
>If the A.I ever once crashes it was the manufacturer's fault.
Jesus Christ don't you get that car crashes happen daily for various reasons? If you assume every crash by A.I. is malfunctioning A.I. then the idea is dead on arrival and you can't ever hope to have a A.I. driven cars.

The A.I. could crash 100% just like a human driver would crash and it'd still create a fucking huge black area in ethics and law. It is a non-entity of 1s and 0s that can't be held responsible because it has no rights or obligations.

>>52579518
>YOU can't sue the company, but the people that were harmed can.
You don't sue the fucking car company when you crash a normal car. You sue the fucking person that was responsible. It is civil matter and you are mixing two fucking huge, different, aspects of law.

Sure, if the car crashed because it had fucked up brakes straight from the company, you can then sue the company (after you've been sued by whoever you crashed into with the car)

>Car A.I.s have been proven to be WAY more adept than humans in potential crash scenarios and avoiding them.


LITERALLY
DOES NOT MATTER

HOW BINARY IS YOUR THINKING

1. It has no rights
2. It has no obligations
3. It can't be held responsible
4. A non-driver passenger can't be held responsible
5. In a car crash scenario (which happen) this would cause a massive problem+.
5A. CAR CRASHES HAPPEN DAILY AND THEY ARENT ALL BY MALFUNCTIONING CAR DESIGN WHICH YOU SEEM TO IMPLY

>>52579543
>Just like if an automated robot today killed someone by behaving unsafely, the company that made it would get sued.
So this is your proposal? Sue the companies? That's a brilliant idea, change a normal occurrence in modern driving to something you can instantly sue the company over.

I MAKE COFFEE

I GET A GOOD CUP OF BLACK COFFEE

I SUE THE COMPANY

B R A V O
R
A
V
O
>>
>>52579575
Stop talking about sculptors. We are talking about A.I. making a decision on how to deal with a crash situation on a city street and how that would create an ethical and legal black hole.
>>
>>52579580
>>52579580
>If the A.I ever once crashes it was the manufacturer's fault.
I did not say that you complete moron.
I said that in the rare case in which it's the AI causing the incident, and not other reasons, the manufacturer is held accountable for the mistake made by the machine.
>>
>>52575954
I'll bet something like this could be made law:

If a self-driving car gets into a dangerous situation where people in the car and outside the car may get injured or killed in an accident, then people in the car should be protected as the top priority. If the safety of the people in the car can be assured beyond a reasonable doubt, then the safety of people outside the car becomes the top priority, but not at the expense of the safety of the people inside.

If something like this pic happens and the car kills 20 people to save everyone inside the car, then tough shit for them. The focus should be on preventing something like this from happening in the first place.
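Sketching that proposed rule as code, with the caveat that "assured beyond a reasonable doubt" is approximated here as a made-up probability cutoff; the option list and numbers are illustrative only:
[code]
def choose_maneuver(options, occupant_safety_threshold=0.99):
    # options: candidate maneuvers with estimated probabilities that occupants
    # and bystanders come out unharmed (values would come from the planner).
    safe_for_occupants = [o for o in options if o["p_occupants_ok"] >= occupant_safety_threshold]
    if safe_for_occupants:
        # Occupant safety is assured: now maximize bystander safety.
        return max(safe_for_occupants, key=lambda o: o["p_bystanders_ok"])
    # Otherwise occupants come first, full stop.
    return max(options, key=lambda o: o["p_occupants_ok"])

options = [
    {"name": "brake_straight", "p_occupants_ok": 0.999, "p_bystanders_ok": 0.60},
    {"name": "swerve_to_wall", "p_occupants_ok": 0.40,  "p_bystanders_ok": 0.99},
]
print(choose_maneuver(options)["name"])  # -> "brake_straight"
[/code]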
>>
>>52579603
I've only met two people who did not understand the concept of metaphor and analogy.
One was a literal retard (with helmet and everything), and the other is you.

I'm starting to see a pattern.
>>
>>52575954
the car would just brake and stop
>>
>>52576038
There is always the emergency/parking brake; also, it can switch to a lower gear and do engine braking.

Brakes just don't completely fail randomly like in the movies.
>>
>>52579621
>I said that in the rare case in which it's the AI causing the incident
Your A.I. could be purely innocent of the incident, because you know, ACCIDENTS USUALLY HAVE INNOCENT SIDE BUT STILL NEED TO BE TRIED IN COURT OF LAW.

?!?!? TWO SIDES TO LEGAL MATTER THE DEFENDANT AND THE SUSPECT
>>
>>52579580
>That's a brilliant idea, change a normal occurrence in modern driving to something you can instantly sue the company over.
Go on, tell me what the problem is with this. I can wait.
>>
>>52579643
We're obviously talking about a situation in which not killing anybody is not possible.
>>
>>52576387
If your car has to use ABS, it means you're using 100% of your traction for braking and the system is preventing you from going over that. Turning would require additional traction to perform the turn, taking away from the car's ability to stop, ABS or not.

This is why hard braking even with ABS invokes either a slide or understeer condition. You have to balance both and ABS is only going to prioritize one of them for you.
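This traction budget is usually described as the friction circle: combined braking and cornering demand can't exceed roughly mu * g. A quick illustration with assumed numbers (mu = 0.9 is a rough dry-asphalt figure, not a measured value):
[code]
import math

MU = 0.9   # assumed tire-road friction coefficient (dry asphalt, rough figure)
G = 9.81   # m/s^2

def within_friction_circle(brake_decel, lateral_accel, mu=MU):
    # Vector sum of longitudinal and lateral demand must stay under the grip limit.
    return math.hypot(brake_decel, lateral_accel) <= mu * G

print(within_friction_circle(8.8, 0.0))  # True: nearly all grip spent on braking alone
print(within_friction_circle(8.8, 3.0))  # False: braking AND turning exceeds the budget
[/code]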
>>
File: Wittgenstein.jpg (164KB, 902x902px)
>>52579642
I've read of one person who flat out laughed at metaphors in arguments. He was a very wise man.
>>
>>52575954
>self-driving cars will be anything more than a fad.
>>
>>52579660
You don't think that something that happens daily in life, that which gets turned to 'LOL SUE THE COMPANY INSTANTLY' would be big no no to companies?

>Drive a car
>Crash happens because of the driver
>SUE THE COMPANY

No matter if the driver is AI or human, your proposition is ignorant.
>>
>>52579693
>would be big no no to companies?
Why would it? Companies get sued every day. They wouldn't even need to be sued, they'd just have to pay for the damages they caused. They'd only need to go to court to dispute the damages. Companies won't care so long as they still make a profit in the end.

>Crash happens because of the AI
>SUE THE COMPANY
Makes total sense, and it's what happens today.
>>
>>52579580
You seem to be highly angry and have ignored the point of my post, which is:

Car A.I.s are highly unlikely to get into the situation as described in the OP, or any other crashing scenario. They are proven better drivers than humans at the absolute worst because they can react faster and have vastly more information at hand than a human would at any point during driving.
If a car A.I. does end up in a crashing scenario, it is far more likely a fault of the car's mechanics failing than the A.I. failing to react.

Car crashes do happen everyday, which I agreed with you in my post, but you also didn't read that they are by far and away human error and very rarely mechanical failure.
Remove human error from the problem and you suddenly remove 99.9% of crashing issues. The other .1% is removed by manufacturers being forced into tighter testing restrictions, and having the A.I. do self-checkups and giving the driver strong suggestions or outright taking you to a local mechanic so they can be fixed before they become a problem.


As a side note, I think you should take a rest from this board, it's clearly hurting your brain to think deeply about these problems.
>>
>>52579724
>Companies get sued every day
Yeah by having their actual products be malfunctioning.

You don't sue the fucking factory that made the car when driver crashes the car.

Or you could pass that as retarded law and then have car companies vanish from your country.
>>
>>52579658
How the fuck can you not understand that we're talking about a specific situation in which the accident is caused entirely by an error of the AI?
If the accident is caused by another car crashing into it, a brick falling on it, or a fucking spaceship shooting lasers at your self-driving car, then it won't be the car's fault, and the legal system will prosecute whoever made the mistake that caused the accident.

IF the accident is ONLY caused by a bug in the AI (let's say failure to identify someone crossing the street and running him over), then it's the fault of whoever is responsible for the programming of said AI because it's a mistake in giving the machine the instructions to properly act in that situation.

The car is just a tool, and of course it's innocent, but if the car acts in an unpredictable manner and someone gets hurt, it's the fault of whoever got the car to act in such way in the first place.
>>
>>52576190
Would you like to play a game?
>>
>>52577311

At least in the U.S. pedestrians have always legally had the right of way. A self-driving car which adheres to all of the existing regulations would then need to take this into consideration, or the regulations would need to be re-written to give the cars the right of way instead.

That is why this is in debate. The "greater good" is really what human drivers are expected to follow legally, but we all know that's in conflict with our own priorities. Now with self-driving cars we might not have that control anymore, and that is hard to accept.
>>
>>52579749
>You don't sue the fucking factory that made the car when driver crashes the car.
of course you do if the factory made the driver.

>Or you could pass that as retarded law and then have car companies vanish from your country.
People will be happy to pay more money for self-driving cars. Insurance to cover accident payouts will be a small portion of the car cost.
>>
>>52579678

Just because someone smart didn't like one thing, it doesn't mean that such thing is never valid.

You failed to understand the point of my example and now that you got called out on it, you resort to a pathetic appeal to authority.
>>
>>52579775
In the US, if a jaywalker comes running out from between the parked cars and I run him over because it would be impossible to stop, am I in the wrong?
>>
>>52575984
jet fuel can't melt steelbeams
>>
This is funny.
The obvious action (stopping the car) is not shown as an option.
You learn that stuff when you are in middle school.
If I let go of an object, it will hit the ground.
That is a pretty confident prediction.
You do not need many samples to know the direction and speed of a moving object, so predicting where a person will be is not that hard.

But we don't even need to be this precise.
The cars are already equipped with sensors that can detect the distance to an object and just having an object in the way means the car should either stop or slow down.
They already do this when there is a car in the way.
On a highway, the car will match the speed of the next car, avoiding a collision.
In the city, it will stop if there are people in the way.
We don't need to think about these situations, as driving is not a task so complex that it can't be solved in the next few years.
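The "stop or slow down when something is in the way" rule is basically a stopping-distance check: keep the gap larger than reaction distance plus braking distance. The deceleration and latency figures below are assumptions for illustration only.
[code]
def stopping_distance_m(speed_mps, decel_mps2=6.0, sensor_latency_s=0.2):
    # distance covered while the system reacts + distance covered while braking
    return speed_mps * sensor_latency_s + speed_mps**2 / (2 * decel_mps2)

def must_brake(gap_to_obstacle_m, speed_mps):
    return gap_to_obstacle_m <= stopping_distance_m(speed_mps)

city_speed = 50 / 3.6                              # 50 km/h in m/s
print(round(stopping_distance_m(city_speed), 1))   # ~18.9 m
print(must_brake(gap_to_obstacle_m=15, speed_mps=city_speed))  # True: start braking now
[/code]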

And as for blame when there is an accident?
Why not keep the rules the same as they are: the owner of the car must ensure it is driven by someone who has a license (a human), or the owner will be to blame; and if someone is sitting in the car, that person will be to blame for not stopping the car or taking it in for service or whatever.

This has never been a problem for humans, why would it be a problem for a computer?
>>
>>52579833
The only intelligent post in this thread thus far.
>>
>>52576399
No, asstard, the situation will happen. If it has a 1/100,000,000,000 chance of happening, then in 50 years when self-driving cars are the norm, it will happen once a month or less worldwide.

This question is intended to address situational ethics, and how a machine should evaluate human value.

Personally, I think that if the car is in a situation where either the owner or a bystander is killed, it should choose the owner.

Risk of driving, bitch. People die every day, you're not special. If you die, no matter how, we will move on fine without you and quickly forget that you ever existed.
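For what it's worth, the arithmetic roughly checks out if you assume a big enough fleet; the per-trip probability is the one stated above, while the fleet size and trips-per-day below are guesses picked only to show the order of magnitude:
[code]
p_per_trip = 1 / 100_000_000_000        # 1e-11, as stated in the post above
cars_worldwide = 1_000_000_000          # assumed self-driving fleet in 50 years
trips_per_car_per_day = 2               # assumed

trips_per_month = cars_worldwide * trips_per_car_per_day * 30
expected_events_per_month = trips_per_month * p_per_trip
print(expected_events_per_month)        # ~0.6, i.e. roughly once a month or less
[/code]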
>>
>>52579745
>Car A.I.s are highly unlikely-
You can't vouch for that. You can't *know that*. And it doesn't matter, it is the argument in the OP, it is there, we talk about it. And car crashes happen daily. Someone is responsible, someone is innocent, or maybe both are responsible. But because of the nature of A.I. it has no rights or obligations and can't be held responsible (duh) it can't be tried in court of law.

>They are proven better drivers than humans at the absolute worst because they can react faster and have vastly more information at hand than a human would at any point during driving.
Doesn't matter. They still can't be held responsible because they have no rights. They cannot be part of any legal system as own entity, like humans can.

>Remove human error from the problem and you suddenly remove 99.9% of crashing issues
Remove human error and you don't have anyone driving the car because if you have no rights, you have no obligations, and can't drive a car. Because to drive a car, you must be 18 years old, have drivers licence, be without any warnings in that time etc.

>How the fuck can you not understand that we're talking about a specific situation in which the accident is caused entirely by an error of the AI?
What error? OP proposes a question (how should the A.I. decide how to react in a crash), and my answer is it can't. It can make the exact same choice as a human and it still won't change the issue.

>IF the accident is ONLY caused by a bug in the AI (let's say failure to identify someone crossing the street and running him over), then it's the fault of whoever is responsible for the programming of said AI because it's a mistake in giving the machine the instructions to properly act in that situation.
Yes, I'm 100% sure that suing the car company for daily, common occurrences won't bankrupt anyone or scare them away from the market with this law. Not a sustainable solution
>>
>>52579809

Only if you can't prove that:

- You weren't driving too fast for conditions
- You weren't driving faster than the posted speed limit
- Your judgement was not impaired
- You did make reasonable attempts to avoid hitting the pedestrian

Even if you think you hit all those points on the nose, if the pedestrian attempted to sue you they might very well win so long as they can prove they were not impaired.
>>
>>52579833
Right, but what if a car was in the situation of deciding whether it kills someone or not? Let's say a Google car. Should it be allowed to decide based on the Google score of the person? While this question seems easy to answer, what about a Bing car? Or a Yahoo car? If a car can decide over your future, it should rather be a fucking Transformer, bro
>>
>>52575954
>muh etikz
I threw up
>>
>>52576155
>implying the majority is STEM here
>>
Literally IF A.I. enters a danger situation

just default back to human driver.

Every little legal problem solved.

Why was that so hard
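A sketch of that handover, with the obvious catch raised elsewhere in the thread: if the human doesn't respond within the available time budget, the car still has to do something on its own (here, a minimal-risk stop). All the timings and names are assumptions, not anyone's actual handover spec.
[code]
def handle_danger(time_to_impact_s, human_takeover_time_s=2.5):
    # human_takeover_time_s: assumed time for an inattentive passenger to re-engage
    if time_to_impact_s > human_takeover_time_s:
        return "prompt_human_and_wait"
    # Not enough time for a human decision: brake to a stop in the current lane.
    return "automatic_minimal_risk_stop"

print(handle_danger(time_to_impact_s=10.0))  # prompt_human_and_wait
print(handle_danger(time_to_impact_s=1.5))   # automatic_minimal_risk_stop
[/code]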
>>
>>52579861
You are assuming several types of sensors fail, the brakes fail, the car is invisible, and yet it can still change the direction of the car.
This is very unlikely, and even if it happened, the car shouldn't "evaluate the value of human life"; it should just stop or go straight.
If the big group of people saw this car coming at them they would jump to the side, as most humans have a desire to not die.
If the car swerves to the side, it will hit all the people who got out of the way.

THIS WILL NEVER HAPPEN
>>
>>52579526
literally anyone with a riced out honda
>>
>>52579927
>You're about to get roasted by a semi
>In the last 2 seconds before getting plastered, over the sound of blaring truck horn you hear a ding as the car hands over manual control
>It takes you one second to realize you have to react
>By the time you make a decision it is already too late.
>>
File: Free Hat.webm (728KB, 1280x720px)
>>52579876
So if some asshole does this, I could get in trouble?
>>
>>52579970
If you have 2 seconds to not get roasted by a semi, not even A.I. will help with that, since you'd need to change physics on the fly
>>
>>52579973

Yeah, which is why it is worth investing in a dash cam like that guy did so you can prove when someone did something foolish or attempted insurance fraud.

>>52579993

But then what constitutes an emergency? Emergencies don't just conveniently always give you 15 seconds to decide on how to deal with the situation.
>>
>>52576365
man i would love to have a car that can pull several G
>>52576038
air brake failure doesn't exist as a scenario; they're designed to fail safe
>but what if tire will get punctured
run-flats are pretty common
>but what if software will decide to go for WOT
ignition is cut the moment you press brake pedal (at least it should be)
>but what if ECU will fail
there's another one to take over tasks

>>52576053
philosophy majors love wasting time on "fucking hypothetical" questions with no connection to the real world.
Now could you fucking tell me what happened to my order? i'm waiting for like 50 minutes
>>
>>52579970
This.
Also, even if you were 100% alert even before shit hit the fan, you would still react orders of magnitude slower than the car.

This "switching to manual in case of accident" is just some dangerous bullshit to shift liability onto the driver, making everything far more dangerous, just so it's easier to blame someone.

It's literally a retarded idea with only huge downsides, and only one upside (it's easier to decide who to blame), which is solvable by regulation anyway.
>>
>>52580033
edgy
>>
>>52580024
Your regulation choice was to sue companies for something that happens hundreds of thousands of times a day and then flood the court with person vs. company cases, while driving down the companies that manufacture cars.

Simple solution is to have the entire fucking thing disabled until we generate A.I. brilliant enough to match the human mind
>>
>>52580020
>Yeah, which is why it is worth investing in a dash cam like that guy did so you can prove when someone did something foolish or attempted insurance fraud.
I meant in a situation where it is perfectly clear what happened, not in a case where the driver is wrongly accused of misconduct because there is no proof to the contrary.
Would I still risk trouble if the whole thing was filmed?
>>
>>52579887
Stopping the car is still the only option.
If the car is unable to stop fast enough to avoid killing people, it should not be on the road.

If we are talking about >>52579973
then the car should still stop.
It would be a bit more dangerous as there is no person inside the car who can supply aid to the guy who got hit but if the car could call 911 with an automated message, that would be better than 90% of people anyway
>>
>>52575954

"Greater good"sounds like dirty communist talk to me.
>>
>>52580059

You never implied it was filmed, but you would be in significantly less trouble if you had it on film. When you *don't* have it on film, it gives the pedestrian and their lawyer room to dramatize the whole situation; without hard evidence on your part, it's hard to prove otherwise.

Sometimes you luck out and some witnesses blame it on the pedestrian for you, as well.
>>
>>52580045
>edgy meme
Go back to reddit.
>>
>>52580056
>Your regulation choice was to sue companies for something that happens hundreds of thousands of times a day
No, I didn't say that.
I said sue the company if the accident has been caused by the car's AI.
If somebody jumps in front of my self-driving car like in >>52579973 then it's not the company's fault.
If the car decided to disable the brakes and someone ends up dead, then it's the company's fault for having programmed a defective car that caused the accident.
If the accident hasn't been caused by a decision made by the AI, then it's not the company's fault.
>>
They also need to add a prioritization algorithm that values people's lives in this order:
1. Black
2. Jew
3. Transgender
...
13932823. White heterosexual man
>>
>>52580056
>The simple solution is to keep the entire fucking thing disabled until we create AI comparable to the human mind.
?
Current AI is already superior to the human mind in the driving area.
>>
File: 1438379810919.jpg (14KB, 480x360px)
>>52580095
>being offended by a single word
ok bro
>>
>>52580092
>You never implied it was filmed
Sorry, my fault.

>but you would be in significantly less trouble if you had it on film. When you *don't* have it on film, it gives the pedestrian and their lawyer room to dramatize the whole situation. Without hard evidence on your part, it makes it hard to confirm otherwise.
>Sometimes you luck out and some witnesses blame it on the pedestrian for you, as well.

I see...
But in case you had it on film, what would happen? Would they just close the case right away, or would you still risk something?
>>
>>52580088
The greater good is a utilitarian argument, which is a very steep slippery slope.

>>52580096
>I said sue the company if the accident has been caused by the car's AI.
Car crashes happen daily. In droves. I can't even guesstimate, but probably in the millions? That'd flood the car company, they couldn't handle that many lawsuits, and neither could any court (and this would go through court, since it concerns a civil person and his rights)

You'd practically kill the government's way of solving problems and the car company.
>If the accident hasn't been caused by a decision made by the AI, then it's not the company's fault.
? Then nobody is at fault? Good lord, you could just crash around freely and collect insurance money.

>>52580116
Yeah, congrats, it can do fast math, but it doesn't pass as a human being. It has no rights or obligations.
>>
>>52580105
You're kidding, obviously, but I'm afraid this is the future that awaits us.

Too many things that we believed to be absurd and "too far" are now in place. The SJWs are winning and it wouldn't surprise me one bit if they pushed for this.
>>
File: sim1.gif (98KB, 250x441px)
>>52575984
yes
>>
>>52580145
Chances are they would drop it. Only your lawyer could really tell you what level of fucked you were depending on circumstances.
>>
>>52576021
>Self-driving cars are orders of magnitude more safe than people-driven cars.
This doesn't matter though. People will care a lot more about randomly being killed by a big metal box on wheels they have no control over, when they're being told it's completely safe, than about some retard who falls asleep at the wheel. It's not good enough for them to be statistically safer.
>>
>>52580146
>Car crashes happen daily. In droves. I can't even guesstimate, but probably in the millions? That'd flood the car company, they couldn't handle that many lawsuits, and neither could any court (and this would go through court, since it concerns a civil person and his rights)
>You'd practically kill the government's way of solving problems and the car company.
This depends on what kind of regulation will be in place.
Also it wouldn't take very long for people to understand that these cars record everything and that if the car isn't at fault it would be a matter of seconds to prove.

>? Then nobody is at fault? Good lord, you could just crash around freely and collect insurance money.
Now you're just trolling.
Did I say it's nobody's fault?
It's obvious that if the car didn't cause the accident it's not the car's fault, BECAUSE IT'S SOMEONE ELSE'S.
It would be the fault of whoever caused the accident.
Are you retarded son?

>Yeah, congrats, it can do fast math, but it doesn't pass as a human being. It has no rights or obligations.
And you plan to give rights and obligations to machines once they're advanced enough?
>>
>>52580105
take picture
detect face
average color of face
value of life = (red * green * blue) * hair length/boob size
>>
>>52580211
Ah, alright. Thank you
>>
>>52576590
H O T H E A D
>>
>>52580247
>It would be the fault of whoever caused the accident.
A.I. doesn't have rights or obligations, can't be responsible.
Passenger was a passenger, can't be responsible.
Making a law which makes the company responsible for something that happens hundreds of thousands of times a day would gas the government's problem-solving system and drive down car companies (plus, CRASHES do happen, and unless the car legitimately malfunctioned it has never been the company's fault)

>And you plan to give rights and obligations to machines once they're advanced enough?
If in some near future A.I. becomes advanced enough to compare to the human mind in its complexity, one that can develop its own moral code, think on its own, understand an image of itself, understand itself, see itself, etc., and pass a million other tests, then yes. What else?

>Also it wouldn't take very long for people to understand that these cars record everything and that if the car isn't at fault it would be a matter of seconds to prove.
And it'd still cause a vacuum in legality and ethics. It's not about whether it's better. That literally does not matter
>>
>>52576523
but anon, computers do exactly what they're programmed to do
>>
>>52579820
Jet fuel can't melt dank memes.
>>
>>52580033
Wait a sec
I recognize the second picture, it was a photo of a hungarian train or something
>>
File: 100_pics.jpg (40KB, 485x307px)
>>52580366
>>52580033
>>
The car makes the attempt to save as many lives as it can, but the value of the passengers' lives is above all others. If it comes down to either the passengers or someone else dying, then it's someone other than the passengers who dies; beyond that, whichever option gives the fewest deaths.

Or would YOU throw your life away in a situation like this? No, you wouldn't slam your car into a wall to save other people. Of course you're going to TRY not to run over other people, but if even with your best attempts you send a couple of niggers airborne, so be it.
>>
>>52575984
Have you seen the movie Executive Decision?
>>
>>52580333
>A.I. doesn't have rights or obligations, can't be responsible.
I KNOW YOU RETARD.
Stop making this argument. I didn't say that the car is held responsible.
I said that the person who is responsible for the car's behavior (AKA, the manufacturer) is the responsible one.

>Making a law which makes the company responsible for something that happens hundreds of thousands of times a day would gas the government's problem-solving system and drive down car companies (plus, CRASHES do happen, and unless the car legitimately malfunctioned it has never been the company's fault)
But it doesn't happen thousands of times a day.
You're counting all accidents, but you have to take out those not caused by the AI's decisions.
The accidents caused by the AI's decisions would be so rare that they're really not an issue at all.

>>52580333
>If in some near future A.I. becomes advanced enough to compare to the human mind in its complexity, one that can develop its own moral code, think on its own, understand an image of itself, understand itself, see itself, etc., and pass a million other tests, then yes. What else?
If we make humanoid robots that can be exactly like us, then they will behave like us and depending on their place in society they might have rights and obligations, yes.
But this isn't what we're talking about.
We're talking about simple machinery here, and just like every piece of dangerous automated machinery that's currently in use, we can't postpone their implementation until we have perfect AI.
Especially since a lack of deployment would just confine them to laboratories, and therefore there would be no incentive to finance their development, effectively stopping their progress.

We're talking about normal AI here, and I don't see how a car's AI would be different from that in a self-driving subway train like those that already exist in many cities.

(cont.)
>>
>>52580333
>>52580490

>And it'd still cause a vacuum in legality and ethics. It's not about whether it's better. That literally does not matter
Why?
Once people realise that suing a perfectly-driving automatic car for the accident that they caused is completely useless, nobody (except a few literal retards) will do it.
The "flooding the government with lawsuits" problem will fix itself.
>>
>>52580490
>I said that the person who is responsible for the car's behavior (AKA, the manufacturer) is the responsible one.
Yes, flood the legal system with complaints, flood the company with requests, drive down both systems.

Good solution.

>But it doesn't happen thousands of times a day.
Car crashes probably happen a fuck ton of times a day and even currently most legal systems around Europe for example are slow.

>You're counting all accidents, but you have to take out those not caused by the AI's decisions.
What? If it wasn't a self-driving A.I. that caused it, it was a malfunction (and another request) or another A.I. driver (another civil vs. company case, another legal case, another request for the company)

How thick are you

>we are talking about normal ai
which can't do these tasks because it has no rights or obligations to do them.
>>
>>52580540
It hasn't fixed itself yet, and that's without hundreds of thousands of civil-vs-company cases caused by crashes the A.I. calculated.

>Once people realise that suing a perfectly-driving automatic car for the accident that they caused
Who are 'they'? What are you speaking of even?


>>52580490
>I don't see how a car's AI would be different from that in a self-driving subway train like those that already exist in many cities.
Except they're not self-driving, at least not here; every time someone is trying to kill themselves the brakes come from the driver.
>>
I still drive manual transmission. I doubt Europe would ever use "self-driving" cars.
>>
>>52580545
>Yes, flood the legal system with complaints, flood the company with requests, drive down both systems.
This won't happen for the reasons I talked about here >>52580540 and in the rest of this post...

>>52580545
>Car crashes probably happen a fuck ton of times a day and even currently most legal systems around Europe for example are slow.
Total car crashes don't matter here.
We're talking about who would be held accountable for a crash caused by the AI's decision. Every other crash type is irrelevant because it would just be solved normally, as it would be the direct result of a man-made decision.

>What? If it wasn't a self-driving A.I. that caused it, it was a malfunction or another A.I. driver
No you fucking moron. It's the opposite.
If the accident is caused by the mistake that the AI made, then it's malfunction.
If the AI drove perfectly and an accident happened, then it's something else depending on the accident.
Could be a drunk driver going too fast, could be a bridge falling on top of it, could be a jaywalker like here >>52579973, etc.
As long as the self-driving car drove perfectly (and they always do), no accident will occur unless something that doesn't depend on the AI happens (like one of the examples I made).

>which can't do these tasks because it has no rights or obligations to do them.
Pic related.
Of course it can do it, because the countless tests proved so, and in the rare case in which a bug caused them to act unpredictably, they will be seen exactly like every other dangerous automated machine (Which, guess what? they lack rights and obligations too, and yet they have been used for decades).
>>
File: WellSaid.png (29KB, 325x355px)
>>52577311
>>
>>52575954
I'd like to add that crashing into the pedestrians is more dangerous to them than slamming into the wall is to the passengers: the pedestrians get slammed by a car but the passengers have seatbelts, airbags, crumple zones, etc
>>
>>52580607
>It hasn't fixed itself yet, and that's without hundreds of thousands of civil-vs-company cases caused by crashes the A.I. calculated.
Does that really happen that much? do you have a source for that claim?

>Who are 'they'? What are you speaking of even?
The people who made the mistake which caused the accident that happened despite your self-driving car driving perfectly.

>Except they're not self-driving, at least not here; every time someone is trying to kill themselves the brakes come from the driver.
You don't know what you're talking about.
Some are literally driverless. There is no operator or driver inside. Just a big window at the front.
Here is the one in Turin:
https://www.youtube.com/watch?v=chyr0dxTdbc
>>
>>52580803
who's going to pay for damages?
>>
>People retardedly jump into the road
>I should suffer for this
How are people that retarded? If that were the case you could literally assassinate people just by standing in the middle of the road.
>>
>>52580829
Completely irrelevant to my point
>>
>>52580750
Yes it will happen. At some point an A.I. that just does quick maths will have to crash the car and then it will have to be tried. And it can't be tried. And crashes happen daily; expecting a company to pay for something that is an hourly occurrence in the world is crazy. And this would again flood the legal system to shit (and at least in Europe it already is pretty fucking slow, the East Bloc gets notices from various human rights bodies about its slow legal system).


>If the accident is caused by the mistake that the AI made, then it's malfunction.
It's not a mistake, mistake is drunk driving. Crashing is something that you don't even need to do mistake, YOU CAN GET CRASHED MORON. And you can't define 'crash' as 'malfunction', that's just ridiculous. Malfunction of device is not working properly and is grounds for sue. But something dependent on the driver can't be such.

>If the AI drove perfectly and an accident happened, then it's something else depending on the accident.
Yes it's another A.I. clearly crashing into you.

>>52580818
>Does that really happen that much? do you have a source for that claim?
Does it really matter that the way people solve problems gets flooded?
IDK, legal system is pretty important at least here in Europe.


>Who are 'they'? What are you speaking of even?
>The people who made the mistake which caused the accident that happened despite your self-driving car driving perfectly.
You can't assume it will be perfect. It was literally man made and that's already a flaw in your argument.
>>
>>52575954
Why aren't the people in A and C attempting to run away from the car?
Why doesn't the car try to dodge the person in B?
>>
>>52575954
Since people are never taught this shit before they get their license, they are just as liable as the self-driving car in these circumstances. And yet people are still allowed to drive. In these cases, the circumstances are set before a court to decide, and it can be the same for a self-driving car. If the car behaves terribly, the company that produces them will need to fix them and pay damages. If the person was retarded then no damages would be needed. The self-driving car has enough sensors on board to easily allow a judge to make a decision.
>>
>>52580922
>If the car behaves terribly, the company that produces them will need to fix them and pay damages
Oh that there is a slib slob m8
>>
>>52580948
wut?
>>
>>52580948
Volkswagen is already on that
>>
>>52580985
Yeah because they actually fucked up their cars and made them malfunction. Not the same case here.

This is a working car, no mistake from the factory. It has to decide who to kill and how. If this were a person it'd be tried in court in front of the law. What do you do here?

Sue a company? When it's perfectly normal for car crashes to happen?
>>
>>52581016
You try the case in court the same way you would a person. Which is exactly what I said before.
>>
>>52581044
You are going to put a mathematical robot doing math tasks as the defendant?
can't do that

he has [spoiler]no rights nor obligations[/spoiler]
>>
>>52580848
>Yes it will happen. At some point an A.I. that just does quick maths will have to crash the car and then it will have to be tried. And it can't be tried. And crashes happen daily; expecting a company to pay for something that is an hourly occurrence in the world is crazy. And this would again flood the legal system to shit (and at least in Europe it already is pretty fucking slow, the East Bloc gets notices from various human rights bodies about its slow legal system).
Are you just ignoring the numerous points I keep making to those retarded arguments you keep repeating?

>It's not a mistake, mistake is drunk driving.
You're arguing about semantics now.
If AI takes a decision and it's not a good decision, then it's a mistake made by AI.
Even if you don't agree with this definition it doesn't matter. You still (hopefully) understand what I mean.

>Crashing is something that you don't even need to do mistake
What?

>YOU CAN GET CRASHED MORON
Yes, and it will be the fault of the person who caused the crash, which in this case is likely to be the driver that crashed into you.

>And you can't define 'crash' as 'malfunction', that's just ridiculous.
Yes if it happens because of a bug in the AI.
If the car had a bug that caused it to suddenly decide to accelerate to its max speed and disabled the brakes, then the resulting crash would be the result of a malfunction of the AI.

>Malfunction of device is not working properly and is grounds for sue.
Exactly.
With this sentence you just made your entire argument invalid, since you just admitted that an accident caused by the AI not working properly is grounds for a lawsuit.

>But something dependent on the driver can't be such.
A decision made by the artificial intelligence of the car is dependent on the driver?

>Does it really matter that the way people solve problems gets flooded?
Nice dodging of the question.
I take that as an admission that you just lied.

(cont.)
>>
>>52580848
>You can't assume it will be perfect. It was literally man made and that's already a flaw in your argument.
I didn't say that self-driving cars are perfect you colossal retard.
I said that IN MY EXAMPLE WHERE THERE IS A CRASH WHEN THE CAR MADE NO MISTAKES, the fault is someone else's because the car drove perfectly ON THAT OCCASION.
Never said it will never make any mistake, as that would be retarded, since we're basically only arguing about what would happen in case a car causes the accident.
>>
It's an unrealistic scenario; in my country, on roads with the risk of people like in the OP pic, the limit is 30 km/h and rarely 50, so you can stop the car in a meter or two.

If it's a faster road it's not your responsibility to kill yourself to save a bunch of retards walking across the road; your only responsibility is to fucking brake and kill as few as possible.
>>
>>52581062
No, you put the company that programmed it as the defendant.
>>
>>52581065
>If AI takes a decision and it's not a good decision, then it's a mistake made by AI.
There's no 'good' decision. It's either kill, kill or kill.


>Yes, and it will be the fault of the person who caused the crash, which in this case is likely to be the driver that crashed into you.
by another car driven by A.I.

>Crashing is something that you don't even need to do mistake
>What?
Are you stupid? Do you think it takes two guilty parties to cause a car crash? Are you a fucking moron? It can, but it can also be one innocent and one guilty party.

>With this sentence you just made your entire argument invalid, since you just admitted that an accident caused by the AI not working properly is grounds for a lawsuit.
False assumption, you assume the AI is perfect and will never crash, which it is not. Protip: it will.

>I take that as an admission that you just lied.
You can go and check out the EC's complaints about the legal systems of the East Bloc, but that's another issue in itself. They're slow as hell.

>>52581080
>I didn't say that self-driving cars are perfect you colossal retard.
Yes you said.

Crash is a malfunction by A.I. according to you here, I quote
> since you just admitted that an accident caused by the AI not working properly is grounds for a lawsuit.
AI IS PERFECT
ANY STRAY FROM PERFECT IS MALFUNCTION

Which is not true.

since you don't fucking sue people for cars crashing.
>>
>>52581128
>since you don't fucking sue people for cars crashing.
Where did you get this ridiculous notion?
>>
>>52575954
I paid money for my car, so I want my car to protect me even if it has to plow through a field of 1000 babies to do it.
>>
A and C are correct. They save the most people.
B is probably incorrect. It depends on how many people are in the car.

Ideally it would take into account other factors, e.g. how much good the passengers and the pedestrians do for the world and how much incidental harm would be caused by killing them. But we can't do that until we have nearly human AI. That sort of system would also open up such a huge political can of worms that self driving cars would never get on the road.

In practice, it doesn't matter. They'll be so much safer that getting them on the roads sooner is more important than having them be perfect. Satisfy the selfish morons by having it always prioritize the passengers and it will STILL save lives.
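To make that concrete, the kind of rule I mean is just a cost comparison over the handful of maneuvers the planner considers. A rough sketch in Python; to be clear, the class, the numbers and the passenger_bias weight are all made up for illustration, not anyone's actual implementation:
[code]
# Illustrative sketch of "minimize expected deaths, break ties toward the passengers".
# The expected-death numbers would come from the planner's own estimates
# (impact speed, crowd positions, crumple zones); here they are invented.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    pedestrian_deaths: float  # expected deaths outside the car
    passenger_deaths: float   # expected deaths inside the car

def pick_maneuver(options, passenger_bias=1.5):
    # passenger_bias > 1 weights occupant deaths more heavily, so near-ties
    # resolve toward protecting the passengers; 1.0 is pure fewest-deaths.
    def cost(m):
        return m.pedestrian_deaths + passenger_bias * m.passenger_deaths
    return min(options, key=cost)

# Scenario B from the OP pic, with invented numbers:
options = [
    Maneuver("brake in lane, hit the pedestrian", 0.8, 0.0),
    Maneuver("swerve into the wall",              0.0, 0.6),
]
print(pick_maneuver(options).name)                      # stays in lane
print(pick_maneuver(options, passenger_bias=1.0).name)  # pure fewest-deaths: swerves
[/code]
The whole argument in this thread is basically about what that one weight should be.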
>>
>>52581168
You don't sue COMPANIES* for car crashes (unless it was a malfunctioning car, like Volkswagen's shitstorm)

I herped a derp there :v)
>>
>>52581181
>They save the most people.
slippery slope of utilitarianism, be careful anon
>>
>>52581204
How else are you going to deal with fringe cases in morality when everything is set up to be dealt with through utilitarianism? You think having to be a bit utilitarian when programming a self-driving car is going to stop people?
>>
>>52581183
Nothing stops people from doing it. If you believe the company is at fault for its AI's decision, you can sue the company all you want.
>>
>>52581226
It's such a weak moral code that you can justify just about anything with it. Might as well have none at all at that point.

Maybe the cars, these supposedly PERFECT AI CARS :V), should have options?

>Select moral code

>Nihilistic
>Misantrophic
>Misogynistic
>utilitarianism
>Stoic

What would they be like?
>>
>>52576683
>We're worthless compared to the President.
No, YOU are. The rest of us realize that no one is fucking special.

>Shit is NOT fun when a president dies.
Maybe the president should stop acting like he's so important then.
>>
File: asdasd.jpg (39KB, 339x575px)
>>52581245
>Misantrophic.
>>
>>52581261
He obviously drives over a cliff at the end there after ramming through the people.
>>
File: yDqS2VK.jpg (48KB, 480x270px)
>>52575954
What if self-driving cars have ejector seats built in? Then C is correct 100% of the time.
>>
>>52581177
Do you not realize that having cars like that makes it more likely YOU will die when someone else's car decides to drive through a crowd to avoid hitting a pothole? If other cars also just prioritize their own occupants, they will make no effort to protect you when you're a pedestrian or riding in another car. Your ideas would hurt you considerably more than they help you. Not only are your preferences selfish, they aren't even in your best interest, because they would only wind up hurting you.

If you're going to be selfish, do it RIGHT. Be selfish in ways that actually help you. Being selfish in ways that hurt you is just stupid.

But, as I said, it doesn't matter. I'll gladly let you people be evil morons since it will save more lives. Self driving cars will be so safe that shipping a sub-optimal morality system would be better than delaying them a year.
>>
>>52581128
>There's no 'good' decision. It's either kill, kill or kill.
What the fuck are you talking about?

>by another car driven by A.I.
If both the cars involved in the crash are self-driven then one or both have made a mistake for it to happen. If they hadn't, the crash wouldn't have happened.
Once it's established which car caused the accident, then whoever programmed the buggy AI gets sued.

>Are you stupid? Do you think it takes two guilty parties to cause a car crash? Are you a fucking moron? It can, but it can also be one innocent and one guilty party.
I literally didn't understand the meaning of the sentence.
Now I understand and respond with:
It doesn't matter if you get crashed into, because you (or your car, its manufacturer, etc) won't be held responsible. Whoever caused the other car to crash into you will be held responsible.

>False assumption, you assume the AI is perfect and will never crash, which it is not. Protip: it will.
I didn't say that, you complete idiot. I explicitly said the opposite.
I suspected you were trolling, but now it's clear.

>You can go and check out the EC's complaints about the legal systems of the East Bloc, but that's another issue in itself. They're slow as hell.
Are you just strawmanning on purpose?
I didn't ask you that, you colossal inbred moron. I asked you to prove that countless accidents happen with self-driving vehicles.

>Yes you said.
I explained what I meant in the next sentence. It's just your reading comprehension that makes you believe I said it when I clearly didn't.

>>52581128
>Crash is a malfunction by A.I. according to you here, I quote
>> since you just admitted that an accident caused by the AI not working properly is grounds for a lawsuit.
>AI IS PERFECT
>ANY STRAY FROM PERFECT IS MALFUNCTION
WHAAAT?
That doesn't even make any sense.
How do you reach such conclusions by reading something completely different?
Your mental gymnastics doesn't work very well.
>>
>>52581204
How is that a slippery slope? I am taking an explicitly utilitarian position.
>>
>>52581245
Nobody sits there thinking about shit like this except fedora-tipping neckbeards. People move on with their lives. You think the people who had to hurt 1 person to help 10 really care whether they are going down the path of utilitarianism, or do you think they are called heroes because they made a hard decision that helped 10 people?

When situations like this happen to normal people, they make a call and then they stand trial for their decision. They don't sit there arguing about philosophical ideology. In situations like this, if in court you said you did it to try to save as many people as possible, you would not be put at fault. Therefore utilitarianism is perfectly fine for situations like this.
>>
File: FSkeleton.jpg (31KB, 290x713px)
Am I the only one who wants the car to sacrifice me as a passenger?

I realize there are inherent risks to driving a car, and if something goes wrong I would rather that be my responsibility, the person who chose to ride in the car, rather than the innocent bystanders who happen to be walking outside. Technology like this comes at a price, and we should expect people who aren't interested in it to pay for it, especially when they might have to pay with their lives.
>>
>>52581183
If a self-driving vehicle caused a crash it means it malfunctioned.
If it didn't malfunction, then it wouldn't have caused the crash.

Therefore you can sue the company for making the malfunctioning AI that caused the crash.

If on the other hand you get crashed into, then it's the fault of whoever was driving the other car. If it was a human, then sue the human, and if it was the AI, then sue the maker of that AI because it clearly malfunctioned enough to make the car crash into yours.
>>
>>52575964
just like collision detecting humans always stop, right?

there are some accidents that are entirely unavoidable even with software to detect them
>>
>>52581352
Sorry that should be "shouldn't expect"
>>
>>52575954
A and C are good
B is bullshit; since the self-driving car is following all given rules, the guy getting killed is doing something wrong
>>
>>52581307
>If both the cars involved in the crash are self-driven then one or both have made a mistake for it to happen.
So lost in the thought bubble of Perfect A.I. designed by inherently flawed creatures. It's a really beautiful showcase of ignorance.

>How do you reach such conclusions by reading something completely different?
You have said time and time again that any crash by a Perfect A.I. is a 'malfunction' and not just a mistake by the driver. Perfect A.I. don't crash, obviously.

>>52581333
>When situations like this happen to normal people, they make a call and then they stand trial for their decision
Now you are suing people for a mistake the A.I. made? But they were just a passenger and weren't driving, the A.I. was.
>>
>>52581245
>It's such a weak moral code that you can justify just about anything with it.
Bullshit. You can RATIONALIZE just about anything, but that's always true. If you're the sort of person who is prone to rationalizing your behavior, you will rationalize your behavior no matter what moral code you follow. Irrational people will be irrational no matter what moral code they follow.

In the hands of a rational person, utilitarianism makes it very clear what is moral.
>>
>>52581181
they are all incorrect.
If you have ever driven a car you would know this.
If you estimate how many people you will hurt you can easily make mistakes.
stopping the car would be the only option.
Even if it means hitting all the people in front of it, that is the action the car should take.
Slowing down means the people will suffer minimum damage.
But it will never be a problem because people never just cross the road in large groups in places where you drive really fast.
>>
>>52579749
>You don't sue the fucking factory that made the car when driver crashes the car.
True. But what happens when the factory also made the DRIVER that crashes the car?
>>
No.

Assuming the car is in the right, because why wouldn't it be? It's a robot, it can't make its own rules. So why should 1-8 people die because of the stupidity of 1-500 retards who were either too stupid to understand their surroundings or stupid enough to try to make it anyway.

Mow the retards down and pull over, don't slam 1-8 passengers to their deaths for them.
>>
>>52581395
>In the hands of a rational person, utilitarianism makes it very clear what is moral.
oh nice prefix, 'in the hands of rational person' hehe.

im sure hitler thought of that too
>>
>>52580021
Air brakes are designed to not fail, which is why all Semi-Trucks have em.
>>
>>52581259
>The rest of us realize that no one is fucking special.
The President is not INHERENTLY more important than anyone else. However, killing the President would have side effects. Killing the President would result in many other people dying in the instability that follows.

All lives are equally worth protecting. No one is special. So, in order to protect as many equally important lives as possible, the President's safety is more important than mine.
>>
>>52581411
Tourists around a large convention
>>
>>52581383
>Now you are suing people for a mistake the A.I. made? But they were just a passenger and weren't driving, the A.I. was.
You realize the AI isn't a real AI and is just programmed to drive a car, right? The people who programmed it would be responsible. If you were talking about a REAL AI then you'd have to put the AI on trial because it is able to make logical and moral decisions by itself.
>>
>>52581428
The Titanic was designed to not fail either
>>
>>52581259
the president is special though
because he's the fucking president
what have you done that you would consider yourself equally valuable to them?
>>
>>52581438
>You realize the AI isn't a real AI and is just programmed to drive a car, right? The people who programmed it would be responsible.
so you just made every car manufacturer steer away from it
>>
>>52581433
Secret service notified
>>
>>52575954
It should kill the pedestrians because the car can do nothing wrong; it's obviously the fault of the retard putting himself in the middle of the road.
>>
>>52579674
I can tell you why your theory is bullshit. I was driving down a street in Columbus called Champion Avenue. They recently removed the stoplight at the intersection of Champion and Mooberry, so the traffic on Champion heading north (one-way street) had the right of way, while the traffic on Mooberry heading east (one-way) had a stop sign, and below the stop sign it said (Cross traffic does not stop). Nevertheless a guy thought that I had to stop, forcing me to slam on my brakes. He stopped in the middle of the intersection, and because I was able to steer around his car even while braking I avoided a collision; if I had just braked without steering I would have hit him. Just because you're an idiot who doesn't know or doesn't think something works doesn't mean you can spew your false nonsense all over the place.
>>
>>52579674
Bullshit, One of ABS's goals is to enable you to swerve around an obstacle in the event of emergency braking (that is, if you're not on snow or gravel).
>>
>>52581438
YOU realize that there are different tiers of AI and what you are referring to is strong AI.

You don't know what the fuck you're talking about, so stop going around disqualifying what others say based on semantics you don't even understand you fucking retard.
>>
>>52581383
>So lost in the thought bubble of Perfect A.I. designed by inherently flawed creatures. It's a really beautiful showcase of ignorance.
You really are retarded.
I DID NOT FUCKING SAY THAT THEY ARE PERFECT, AND CERTAINLY DO NOT THINK SO, so stop using this as an argument.
If I thought that AIs were perfect, then I would say that NO ACCIDENTS WHATSOEVER could be caused by AIs, but since this is not the case, I'm talking about a case in which the AI has a bug (since it's not fucking perfect) that takes it from its normal well-working state (where no accidents happen) to a bad-working state where accidents happen.
This whole thing about bugs and mistakes is what I mean by AI not being perfect.
When I talk about perfectly-driving AI, I'm obviously talking about when it doesn't have a bug. I'm not excluding that sometimes there might be bugs.

>You have said time and time again that any crash by a Perfect A.I. is a 'malfunction' and not just a mistake by the driver. Perfect A.I. don't crash, obviously.
I didn't say that you need a perfect AI to keep the car from crashing. I said that you need a properly working AI. Said properly-working AI might have a bug one day and that might make it crash. And since for a crash to happen you need a bug (which is a malfunction), you can sue the maker, because if it worked properly it wouldn't have crashed.
Perfect AIs will likely never exist. We're not talking about them. We're talking about properly-working ones that don't crash until they have a bug that makes them crash.

>Now you are suing people for a mistake the A.I. made? But they were just a passenger and weren't driving, the A.I. was.
See? You're trolling.
I explicitly said that you sue the maker of the AI that caused the crash, not the passenger.
>>
>>52576074
>artificial intelligence ethics are not technology related
Leave.
>>
It should brake and turn away from the crowd in A and C. In B I'd say just plow into the guy. Think about what a real person would do. If you were confronted with the situation, would you plow through a crowd or stop your car with a wall?
>>
>>52581433
No it would not. The whole reason we have a structured government is because it makes it so that there are many people who could take the president's place if something were to happen to him. The president is replaceable and you are doing yourself a disservice by saying his safety is more important than yours.

>>52581458
No he isn't. The president is just as likely as the rest of us to die each year from getting cancer or some other stupid shit. Don't forget the time Bush almost was killed by a pretzel.
>>
In all circumstances where the car cannot help but kill at least one person, the owner of the car should be protected. They paid for a safe ride, and should get what they paid for.
>>
>>52581538
>>I DID NOT FUCKING SAY THAT THEY ARE PERFECT, AND CERTAINLY DO NOT THINK SO, so stop using this as an argument.

Every time an AI car crashes it's a malfunction to you. Thus it's not working as intended, thus accidents never happen, thus it's perfect.
>>
Somebody screencap these:
>>52577311
>>52577404
So that nobody makes another stupid thread like this anymore.
>>
>>52581435
If there is a convention, more than just one group of people would be around.
Crossing the road is a lot easier at the intersections, so people will do that.
People will not jump in front of a car as a group
>>
>>52581423
If you are rational and actually follow utilitarianism, it makes it clear how to act. Irrational people who follow utilitarianism in name only will behave immorally, but that is not a problem with utilitarianism.

Irrational people can use utilitarianism to rationalize just about anything, but they can do the same with deontology or virtue ethics.
>>
>>52581576
>>52581576
>>Every time an AI car crashes it's a malfunction to you. Thus it's not working as intended, thus accidents never happen, thus it's perfect.
What?
A car programmed to not crash is perfect?
If crashing equals malfunction, it's because the car is programmed to not crash.
Accidents caused by AI never happen unless there is a bug, which is a situation that will likely exist in the future BECAUSE they are not perfect.

Are you trying to convince me that what you said is my opinion?

Also you're clearly baiting, and I'm the retard that keeps responding anyway.
Fuck me, but most importantly fuck you.
>>
>>52581594
>it makes it clear how to act
It runs into problems on basic survival questions, where it just states that the strong must survive and it's fine to prey on the weaker. It's an anti-human system.
>>
>>52581590
You clearly never drove in a poor country or a tourist area.

People do retarded shit exactly like this all the time.
>>
>>52581644
>A car programmed to not crash is perfect?
Oh, crashes never happen, so you dodge the entire question.

Again lost in his bubble of Perfect AI and Perfect World.

Go watch your testicles.
That's how flawed we are.
We can't design your Perfect NonCrashing AI

And if you don't code crashing into it you will literally have the AI car driving at the speed limit when a crash is about to happen

You are a fucking super saiyan r*tard
>>
File: cars.png (522KB, 1082x1006px)
Only right answer pic related
>>
>>52581678
Now you're just baiting.

I don't care anymore.

Everything I would say answering this moronic post would just be a repetition of what I already said before, so go read those instead, because clearly you're not understanding what I'm writing (or you're pretending not to).

Congratulations! I got baited by your bullshit! I'm sure you're happy about it.
>>
>>52581725
>Pretend the other is baiting because you got outed and leave the thread to save face in anon board

anon.. I
>>/lit/7616637
>>
>>52581747
I literally told you that your argument is invalid because it's not responding to what I said, but to something else that I didn't say.
I have nothing else to add until you re-read my posts and answer my arguments, instead of arguing against me allegedly telling you that AIs are perfect (which I explicitly said many times that I do not think).

Also you're not even able to link cross-board. lol
>>
>>52581671
no they don't.
Poor countries have less traffic, so you can cross the road without getting hit by a car.
This can clearly only happen in a city, people don't usually walk across a highway and there is not that many people in rural areas.

And people either come out of nowhere and there is no time to react or there is time to react and the car will stop.
I have driven a car for many years, and I have never been in a situation where I had to choose between sacrificing myself or other people.
I have never run a person over.
Accidents happen because
the road is wet or ice,
the car doesn't work
the driver is drunk or on drugs.
the driver is not paying attention to the road

The last two situations are the most likely and an automated car will fix that.
If the road is wet or iced over, the overall speed limit should be lower for the automated car, just as it would be if a person were driving.

Mechanical failures in the car are something that is usually avoided by servicing the cars.
And if there is a mechanical failure, why would the car turn?
>>
>>52581813
You have no arguments. You just dreamt up a perfect AI that doesn't even face consequences for crashing
>>
>>52576918
like?
>>
>>52581869
>Poor countries have less traffic, so you can cross the road without getting hit by a car.
We're not talking about Uganda here, you moron. Go look at Eastern Europe, SE Asia, Middle East, Latin America, southern Italy etc.

>This can clearly only happen in a city, people don't usually walk across a highway and there is not that many people in rural areas.
Where did I say anything about rural areas?
Only rich countries have cities? Poor countries only have rural areas?

>And people either come out of nowhere and there is no time to react or there is time to react and the car will stop.
>I have driven a car for many years, and I have never been in a situation where I had to choose between sacrificing myself or other people.
>I have never run a person over.
>Accidents happen because
>the road is wet or ice,
>the car doesn't work
>the driver is drunk or on drugs.
>the driver is not paying attention to the road
That's in civilized countries.
You clearly don't know anything about the rest of the world.

>The last two situations are the most likely and an automated car will fix that.
In poorer countries people do stupid shit like >>52579973 all the time.
Seriously, go look at car accidents on LiveLeak. Many of them are just random shitskins being retarded.

>Mechanical failures in the car are something that is usually avoided by servicing the cars.
Unless you service the car every day, failures are bound to happen

>And if there is a mechanical failure, why would the car turn?
Are you talking about the car's inability to turn because a mechanical failure prevents it from doing so?
The steering system isn't the only thing that can break.

>>52581881
He's right. You're just strawmanning at this point.
>>
>>52581997
>That's in civilized countries.
We are talking about robot driven cars.
If we are talking about a country that doesn't have the advanced technology of intersections, it wouldn't have self-driving cars.
>>
The car should blow its horn, alerting the other parties to the danger and creating an opportunity for them (human or A.I.) to self-correct and limit the pending danger/damage.
>>
>>52582057
Again, we're not talking about Uganda or Liberia. Those places that I listed have both rich people and poor retarded peasants.

Just go to Rome FFS. One minute you see a Lamborghini, and the next one you see a bunch of retards being retarded.

You clearly never traveled or studied geography in your life if you have this black/white vision of the world where it's either NYC or rural nowhere.
>>
>>52582100
We're talking about a situation where there isn't even time for the car to stop.
How would a horn solve the problem? They would be dead even before realizing that they heard a horn.
>>
Rome has intersections.
Places where people can cross the road safely.
Assuming a self-driving car is not programmed to break the law, even a Lamborghini will be able to stop without hitting people.
Why do you think humans are able to drive cars without this being an issue?
Do you see a lot of people driving off the road in order to not hit a group of people?
Even a single person?

In Sweden you might have to avoid a moose on an icy road at high speed, but in cities this is not a problem.
>>
>>52582138
Microprocessors can calculate in nanoseconds and the speed of sound is 340 m/s. The car can react and emit a signal to the environment way faster than it could stop. You assumed that the other parties are stationary and cannot react to the situation.
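Back-of-the-envelope, with every number assumed (30 m gap, 100 km/h, 8 m/s² of braking, ~0.2 s human reaction to a horn):
[code]
# All figures below are assumptions for illustration, not measurements.
v = 100 / 3.6     # car speed: 27.8 m/s
gap = 30.0        # assumed distance to the pedestrians, metres
a = 8.0           # assumed emergency deceleration, m/s^2
sound = 340.0     # speed of sound, m/s
react = 0.2       # assumed human reaction time to the horn, s

t_horn = gap / sound        # ~0.09 s for the horn to be heard
t_warned = t_horn + react   # ~0.29 s until the pedestrian starts moving
t_stop = v / a              # ~3.5 s for the car to stop completely
t_reach = gap / v           # ~1.1 s until the car reaches the gap if it can't stop

print(f"warned after {t_warned:.2f} s; car arrives in {t_reach:.1f} s, stops in {t_stop:.1f} s")
[/code]
So even in a case where the car physically cannot stop in time, the warning still reaches the pedestrian with most of a second to spare, which is the point.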
>>
>>52582202
You're moving the goalposts now.
The argument was about people doing stupid shit all the time like not crossing the road at an intersection. Rome is full of people who jaywalk like it's GTA, therefore creating a problem even if there are intersections.

Again, not everyone is civilized, and some places have more uncivilized peasants than others.
>>
>>52582237
The car could emit the signal at the speed of light; it doesn't change the fact that in that situation, if it's impossible for you to stop or slow down, it will also be impossible for whoever is in your way to react to the signal.

Also, if you're about to get crashed into by a car, and you need to be notified by a horn to get yourself out of danger, then it means that you're probably not aware of the danger, and it will take you some time to react to the horn, then realise that you're in danger, decide what to do, and then try to get yourself out of the situation. Nobody is that quick.
>>
>>52582318
The average human reaction time to an audio stimulus is 0.17 s, which is an order of magnitude faster than the time it takes a car to come to a complete stop from an initial velocity of 100 km/h
>>
>>52582444
That time is just their reaction to the stimulus.
It was probably measured by telling people to push a button when they heard a sound that they expected to hear.
It's way different from minding your own business, hearing a signal, realising what danger you're in, and then reacting accordingly.
Even just the time to jump out of the way is a lot in that situation.
>>
>>52582259
But having an intersection means people CAN cross the road safely.
If there is a lot of traffic, there won't be a lot of room between cars and people will not cross the road.
If there is room between cars, a car can slow down and stop before hitting people.
In the situation of >>52579973 the car would not be able to react, no reason to turn, no opportunity to sound the horn before the hit but the car should still stop after the accident in order to stop other people from getting hurt.
If people jump in front of a car, the car should not kill the driver. that would be insane.
>>
>>52579693
>Crash happens because of the driver
>self-driving cars
>crashes caused by driver
>>
>>52582502
>But having an intersection means people CAN cross the road safely.
Where do you live?
Is the thought of uncivilized people so unfathomable to you?
Is it so hard to understand that jaywalking is actually common in some areas where the general culture leads people to not care about the rules?

>If there is a lot of traffic, there won't be a lot of room between cars and people will not cross the road.
Yes there will. People just wait for the right moment and run to the opposite side.

>If there is room between cars, a car can slow down and stop before hitting people.
This depends on how much room there is and how fast the car is going.
In many situations it's impossible for the car to stop, especially since the driver might not notice the pedestrian until it's too late.

>In the situation of >>52579973 the car would not be able to react, no reason to turn, no opportunity to sound the horn before the hit but the car should still stop after the accident in order to stop other people from getting hurt.
In that particular one, no. In many other similar ones, yes. Definitely. Go watch some videos of pedestrians getting killed by cars on liveleak.com
You'll find that there is a multitude of different situations, and the world isn't black and white.

>If people jump in front of a car, the car should not kill the driver. that would be insane.
I agree with this.
>>
>>52582493
The horn can give the pedestrians approximately 4s to react to a car coming to a stop from 100km/hr.
>>
>>52582602
>Yes there will. People just wait for the right moment and run to the opposite side.
kek.
Stopping / slowing down is still the best course of action.
Swerving to the side might put more people in danger.
>>
>>52582650
Are you sure about that?
CBA to do the math (also the thread is about to be archived and there is no time), can you provide some numbers please?
>>
>>52582706
No you can do your own homework and prove me wrong.
>>
>>52582694
That's if there's time for the car to stop or slow down enough.
>>
>>52582729
I almost believed you on those 4 seconds. Now it just looks like a lie.

I'm going to calculate it and if the thread is still up I'll respond to you.
>>
>>52582737
there is always time assuming the self driving car doesn't go above the limit
>>
>>52582729
http://www.sdt.com.au/safedrive-directory-STOPPINGDISTANCE.htm
>At 100km/hr the car required 28 metres further to stop

100 km/h = 27.78 m/s

Even taking those 28 metres at face value, the car is decelerating the whole way, so braking over 28 m from 27.8 m/s takes roughly t = 2d/v ≈ 2 seconds, not 1.

If a car travelling at 100 km/h has to resort to a horn to tell the pedestrians to move out of its way, it means that it has less time left than the ~2 seconds necessary to brake.

Where did you get that 4 second number?
>>
>>52582768
Not if the pedestrian is jaywalking and suddenly comes out from between the parked cars, like in the webm previously posted in the thread.
Again, you keep thinking that situations like this can't happen because you're too stupid to comprehend that the world is complex and there are many variables in effect.
>>
>>52582822
Nevermind, my mistake... It doesn't take 28 meters to stop but around 37-40 meters, which works out to roughly 3 seconds of braking. My point still stands.
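For reference, the same numbers under an assumed constant deceleration (I'm using 9 m/s², roughly 0.9 g on dry tarmac; real figures vary with tyres, load and surface):
[code]
v = 100 / 3.6        # 27.8 m/s
a = 9.0              # assumed emergency deceleration, m/s^2

d = v**2 / (2 * a)   # braking distance: ~43 m
t = v / a            # braking time: ~3.1 s (equivalently 2*d/v)
naive = d / v        # ~1.5 s - dividing distance by the *initial* speed
                     # underestimates the time, because the car is slowing down

print(f"distance {d:.0f} m, time {t:.1f} s (naive estimate {naive:.1f} s)")
[/code]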
>>
>>52581352
>stupid jaywalkers not seeing a car coming at a high speed not even 50m away
>innocent bystanders
choose one
>>
>>52581378
>it's not wrong if everyone does it
Is that what you're implying?
>>
>>52581671
>implying poor countries will ever get self-driving cars
>>
>>52582941
There are rich people everywhere.
There's an area in Somalia where people have luxury cars because of the money made with piracy FFS.

Do you expect Eastern Europe, SE Asia, Middle East, Latin America, southern Italy etc. to not even have a single autonomous vehicle until they become rich?
>>
>>52583005
Do you think that some pirate is going to be buying a self-driving car in a place where the roads are poorly mapped out?
>>
>>52583005
Do you think anyone is going to bother with self-driving cars in a place that isn't even on google maps?
>>
>>52583022
>>52583035
1- I didn't say it had to be them. Just that luxury vehicles exist in poor countries too.
2- Do you seriously think some Somali pirate understands how a self-driving car works? Why wouldn't he want to buy one if he doesn't know why it wouldn't work?
>>
>>52583062
>Why wouldn't he want to buy one if he doesn't know why it wouldn't work?
With 'anyone' I was referring to companies too
Why would they bother with mapping the whole country just so 5 stupid pirates can use a self-driving car
>>
>>52581571
This.

I mean, I don't give a fuck if some little kid decided to play in traffic. That's his fault, why should I pay the price?
>>
File: Mogadishu view2.jpg (75KB, 800x328px)
>>52583115
1- Self-driving cars don't only work with mapped roads.
2- GPS also works in those countries.
3- Just because it wouldn't work it doesn't mean they're not morons enough to buy it anyway.
4- Pic related is the capital of Somalia. You think there are no maps for that place?
>>
File: XTApLZ64a5-2.png (39KB, 300x250px)
>>52576365
>>
>>52583461
That joke has already been made multiple times in the thread.