Uber Self-Driving Vehicle Involved in Arizona Crash
> how will they ever recover?
https://www.bloomberg.com/news/articles/2017-03-25/uber-autonomous-vehicle-gets-in-accident-in-tempe-arizona
>>1898316
Absolutely no info about the crash. Probably not the car's fault.
>>1898316
But self-driving cars can't get into accidents because they remove the human factor. This must be fake news.
>>1898323
there's plenty of human factor on the other end
>>1898316
No system is 100% infallible.
Still, public opinion is more important than actual figures, so incidents like this are a blow to the concept.
The argument will be: if a crash into one of two separate groups of civilians is 100% unavoidable, which will the computer choose?
kills family of 4 black people instead of 4 white people
> that's racist.
Kills 2 senior citizens instead of 2 teens
> but they've already had plenty of time to live. Children are our future
Kills 1 person with tattoos instead of person without tattoos
> tattoos carry a probability of poor decision making and crime, so their value to society is relatively expendable
The list can go on. How can a computer decide the fate of people?
> protip: it can't and won't.
>>1898419
Is there even any indication this was an accident with ethical considerations?
But to answer in general: Yes, they will have to make their algorithm available for court review and there probably will be several spectacular cases of shitty coding killing people. But in the end we will probably settle on a rather mundane utilitarian algorithm.
We don't have to (and shouldn't) hold computers accountable to the same standard as humans - we should hold them to a higher one! This is not a question of what a panicked driver would have done or what a bystander would find ethical. This is a question of optimizing odds. The computer has a completely different set of data available and doesn't care about skin colour or tattoos.
You don't like that it chose to mow down black children over white children? Here's my 120-page print-out of telemetric data and medical reasoning based on 500,000 collisions in the database and a complex physics simulation. The 2 white kids had only a 0.01% chance of survival while the 2 black kids had 0.03%, so naturally the algorithm told the car to crash into the black kids to give the best odds of survival and minimize damage. How would you contest that? It's just maths. You feed it all the data and tell it to minimize harm. Who cares about these ethical fringe cases?
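The utilitarian logic described above boils down to "pick the unavoidable-crash option with the best expected survival". A toy sketch (all names and probabilities are hypothetical illustration, not anyone's actual code or telemetry):

```python
# Toy harm-minimization: among unavoidable crash options, steer toward
# the group whose members have the best estimated survival odds.
# Numbers are made up for illustration.

def pick_least_harmful(options):
    """options: list of (label, [survival probability per person]).

    Returns the option maximizing expected survivors,
    i.e. the sum of individual survival probabilities."""
    return max(options, key=lambda opt: sum(opt[1]))

options = [
    ("group A", [0.0001, 0.0001]),  # 0.01% survival chance each
    ("group B", [0.0003, 0.0003]),  # 0.03% survival chance each
]

choice = pick_least_harmful(options)
print(choice[0])  # prints "group B" - the group with better odds
```

Note this is exactly the "mundane utilitarian algorithm" the post predicts: no notion of who the people are, only aggregate odds fed in from the crash database.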
>>1898419
>How can a computer decide the fate of people?
It doesn't. The programmer does. And they do it all the time already.
>>1898319
dindu nuffin
>>1898419
this is such a faggy argument; an ordinary driver could "decide the fate of people" in exactly the same way. Such perfect cases would probably never happen, especially with the superior senses of a computer that could detect a probable accident long before you could
>>1898419
It's quite simple. Just follow the traffic rules. Slow down when an accident is about to happen. Only dodge when doing so would cause no accident.
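That rule-based policy (drive normally, brake when a collision is predicted, swerve only if the alternative path is verified clear) can be sketched roughly; the function and its inputs are hypothetical:

```python
# Rough sketch of the simple policy above: obey the rules, brake when
# a collision is predicted, and only dodge when the escape path is
# clear so no new accident is created. Hypothetical illustration.

def plan_action(collision_ahead: bool, adjacent_lane_clear: bool) -> str:
    if not collision_ahead:
        return "continue"   # follow traffic rules as normal
    if adjacent_lane_clear:
        return "swerve"     # dodge only when it causes no new accident
    return "brake"          # otherwise brake as hard as possible

print(plan_action(collision_ahead=True, adjacent_lane_clear=False))  # brake
print(plan_action(collision_ahead=True, adjacent_lane_clear=True))   # swerve
```

The point of the post is that this decision table never weighs one victim against another: the only branch is "is the escape path clear or not".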
>>1898419
Will never happen
The vehicle will brake as hard as its ABS allows and attempt to stay in its lane; if it plows into people, so be it