From the guy who first hacked the iPhone and the PS4, this month-old project is already on the streets of San Francisco. In the creator's own words: "I'm the next tech billionaire."
No he isn't. Until a few things get legislated you can't have street-legal, mass-produced self-driving cars.
If a car with his code and tech drives someone over, whose fault is it?
If the car makes the decision to slam into another car when something falls off the back of the truck in front of it, do you blame the programmer? The distributor? Or the person inside for not taking control fast enough? If a person were driving, it would just be seen as instinctual swerving: no reasoning, just something done on reflex. The fact that someone programmed the decision into the car's behavior changes things dramatically from a legal and ethical standpoint.
This and plenty of other things need to be addressed before the product is available to the general public... unless we're all complete idiots, of course.
>every death from self-driving cars charges the person who coded it
>every time someone dies, one more case of manslaughter against Ted from code-tech.
>there is a standard form in courts across the country to add yet another life sentence for Ted
Depends on circumstances.
>Was the pedestrian jaywalking?
>Is there an emergency back up? Does the system alert the driver to override the automatic driver?
In theory it should be nearly crash-proof, especially if all vehicles on the road have a similar system and can communicate with each other.
Of course there's the random suicide. But I'd imagine the car would automatically alert 911 of an accident, which might negate some liability.
let me elaborate on this part, then:
>If the car makes the decision to slam into another car when something falls off the back of the truck in front of it do you blame the programmer?
Here's the scenario: you're driving along the highway in your Auto-Car 1000 in a rural area when it comes up behind a truck hauling bales of hay and finds itself boxed in by an SUV to the left, an economy car behind it, and a motorcycle to the right. Suddenly, and despite every regulation being followed, one of the bales falls off the back of the truck and lands squarely in the path of your Auto-Car.
The programming realizes three things: one, the object is solid and heavy and will kill you if the car slams into it; two, there isn't enough distance between it and the bale of hay to stop, put the human occupant in charge, or otherwise avoid a crash; and three, it is going to plow into something before this day is over.
If it were a normal crash it wouldn't matter; it's an accident, an act of god, and we simply throw our hands in the air and quite rightly say no one is to blame regardless of the outcome. In this case, however, we have a problem.
Someone, at some point, told the Auto-Car what to do.
Does it crash into the SUV because it has the highest safety rating? Does it crash into the motorcycle because that puts only a single life at risk? Does it slam into the bale of hay, killing you but potentially saving 14 people? This has to be coded and written into the programming of the Auto-Car; it basically has to act as a judge of lives in extreme circumstances.
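Just to make that concrete, here's a toy sketch of what "coding the judgment" would literally look like. The crash options, risk numbers, and the passenger weight below are all invented for illustration; this is nobody's actual product code. The point is that somebody has to pick those numbers before the car ever ships.

```python
# Toy sketch of the "judge of lives" problem. All numbers are made up.

from dataclasses import dataclass

@dataclass
class CrashOption:
    name: str                # e.g. "SUV", "motorcycle", "hay bale"
    people_at_risk: int      # people outside the Auto-Car who could be hurt
    occupant_risk: float     # estimated risk to the Auto-Car's own passenger, 0..1
    external_risk: float     # estimated risk to each person outside, 0..1

def choose_crash_target(options: list[CrashOption],
                        own_passenger_weight: float = 1.0) -> CrashOption:
    """Pick the 'least bad' option by expected harm.

    own_passenger_weight is the morally loaded knob: > 1 favors the buyer of
    the car, < 1 sacrifices them for bystanders. Someone has to choose it.
    """
    def expected_harm(opt: CrashOption) -> float:
        return (own_passenger_weight * opt.occupant_risk
                + opt.people_at_risk * opt.external_risk)
    return min(options, key=expected_harm)

# The scenario from the post:
options = [
    CrashOption("SUV",        people_at_risk=2, occupant_risk=0.2, external_risk=0.3),
    CrashOption("motorcycle", people_at_risk=1, occupant_risk=0.1, external_risk=0.9),
    CrashOption("hay bale",   people_at_risk=0, occupant_risk=0.9, external_risk=0.0),
]
print(choose_crash_target(options).name)  # whatever the chosen weights say
```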
In every case you have lawsuits; in every case you have to assign blame to SOMEONE, and this shit needs to be figured out before the Auto-Car is on the road. The moment you assign blame to the corporations, though (and to be fair it's legally the logical choice), you'll make every single company stfu about self-driving cars really damn quick.
>In theory it should be nearly crash proof
Not really, no. No one outside of marketing teams is saying the system will be crash-proof in anything but the most controlled laboratory conditions. Real-world projections all show that self-driving cars will still have measurable accident rates, just far lower than humans'. The problem is that for there to be no legal quandary over assigning blame when accidents do happen, the accident rate would have to be exactly 0.00%.
If this question is not satisfactorily answered by the powers that be before the widespread rollout of self-driving automobiles, the likely result is that when an accident forces the question to the Supreme Court, the entire nationwide fleet of self-driving cars will, quite reasonably, be confined to their garages until a decision is made, resulting in millions if not billions in lost revenue across the whole country.
>term made by auto industrial complex to shift blame of road injuries to victims
Supposedly, in some countries, poor control of a 1500-5000 lb weapon is actually an offense.
In this country, our thoughts and prayers are with the families of the victims. That's the thin layer of schmoo we have to offer, and that's incidentally where the bus gets off, folks.
>If a car with his code and tech drives someone over, whose fault is it?
The driver's, of course.
If an airliner flies into a mountain under autopilot control, do you blame the autopilot? Of course not, you blame the pilot who was responsible for the aircraft, including setting and monitoring the autopilot itself.
The manufacturer, of course. If self-driving cars really are much safer than human drivers, manufacturers should be able to afford to pay out for the rare death or injury. And making them liable will incentivize them to constantly improve the safety of their vehicles.
The manufacturer is ultimately responsible for every component that goes into a finished vehicle, which would include the code that makes those decisions. Maybe they didn't program it themselves, but they definitely signed off on it.
I would assume a program would be written to analyze the situation and choose the path of least destruction, so hitting a parked car over a pedestrian, or hitting an SUV over a motorcycle; the car would automatically take whatever option causes the least damage, which in some cases is better than a person could manage.
>If a car with his code and tech drives someone over, whose fault is it?
What a loaded question.
If his car has the basic programming to stop at a red light, or to stop whenever a large foreign object appears in front of it, then it will always be the fault of the pedestrian.
Nah, fuck toost. There's too much traffic in cities for it to be useful, and poor road conditions are common in the countryside.
About the only place I see it being useful is a small suburban town
I think the real challenge is going to be upgrading our infrastructure and changing traffic law to allow for dedicated autonomous travel
like, think about how much fuel usage we can eliminate by making the US traffic system uniform, and programming freight vehicles to operate at peak efficiency through networked coordination
it's exciting to think that after a century of the petrol industry dictating transit policy in the US, we may actually be within a generation's reach of real mass transit, maybe even in my lifetime
Then it's a glitch/malfunction in the hardware of the car (assuming the code has been tested extensively). All recently released models will be recalled and retested or fixed.
What would happen if Toyota or some big auto company accidentally released defective cars and some people died? Would the CEO go to prison? No.
Wait a minute.
If all the cars are self-driving, why can they not coordinate with each other? Why do they have to operate in a vacuum?
So the SUV pulls off to the shoulder at the same time as your car pulls right. All the cars behind you automatically brake as needed.
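If the cars are networked, the coordination itself isn't the hard part. Here's a rough sketch of what the message handling could look like; the message format and lane-numbering convention are made up for illustration, not any real vehicle-to-vehicle standard:

```python
# Rough sketch of networked cars coordinating around a hazard. No real
# V2V protocol is implied; fields and behavior are invented.

from dataclasses import dataclass

@dataclass
class ManeuverAlert:
    sender_id: str
    lane: int            # lane of the car announcing the maneuver
    action: str          # e.g. "swerve_right", "hard_brake"
    position_m: float    # distance along the road

def react_to_alert(my_lane: int, my_position_m: float, alert: ManeuverAlert) -> str:
    """How a networked car near the sender could respond to its broadcast."""
    if alert.action == "swerve_right" and my_lane == alert.lane + 1:
        return "pull_onto_shoulder"          # the SUV makes room
    if my_lane == alert.lane and my_position_m < alert.position_m:
        return "hard_brake"                  # everyone following brakes in unison
    return "maintain"

alert = ManeuverAlert("auto_car_1000", lane=1, action="swerve_right", position_m=120.0)
print(react_to_alert(my_lane=2, my_position_m=118.0, alert=alert))  # pull_onto_shoulder
print(react_to_alert(my_lane=1, my_position_m=90.0, alert=alert))   # hard_brake
```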
It would be pretty easy to make manually driven cars illegal on the freeway, thus avoiding pretty much all the expected accidents.
Anybody who violates this would be at fault for any damage to themselves or others.
Or you could otherwise build new infrastructure for self driving cars only, but that's pretty expensive.
>make manually driven cars illegal
Thanks for making vehicles more expensive and forcing everyone to buy a robot car.
What's the point of self-driving cars? They are just glorified buses. Remove private vehicles from roads and invest in state-of-the-art public transportation.
The money you'd save on insurance would more than make up for it.
The most expensive part about having a car isn't the car itself, it's paying some guys to insure it because it's a literal deathtrap.
If you take the deathtrap out of the car, insurance prices go way down.
But it only works if everybody does it. It's only safer if each and every car on the road drives itself.
Otherwise I agree that we should be focusing on bolstering public transportation rather than continuing to deal with the clusterfuck that is cars.
But that's somehow even more of a pipedream in America.
the "it" here doesn't refer to insurance, he means the safe flow of traffic, i.e. if all the cars follow the same predictable protocols and/or are networked, then error is reduced. but one guy driving manually can spill his coke on his lap and randomly plow into a whole column of traffic, thus negating any benefit of the self-driving system. so then nobody's insurance actually goes down, either.
i'm told the software is pretty good at dealing with random situations, but comparatively that's still much more dangerous than one standard of computerized driving, particularly if it's networked.
all major manufacturers have already explained that self-driving cars will never swerve off the road; they will simply stop. They don't have the right to choose who lives and who dies, so they will follow the traffic laws, which always say to stop in case of danger, not play Schumacher dodging left and right.
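In other words, the published policy is dead simple, something like the sketch below (the function and its interface are invented for illustration; a real controller is nothing like this):

```python
# Sketch of the "never swerve, just stop" policy described above.

def respond_to_hazard(hazard_detected: bool,
                      stopping_distance_m: float,
                      distance_to_hazard_m: float) -> str:
    """Brake in your own lane; never pick a victim to swerve into."""
    if not hazard_detected:
        return "continue"
    if distance_to_hazard_m >= stopping_distance_m:
        return "brake_to_stop"
    # Even when a full stop is impossible, the policy is still to brake hard
    # in lane rather than swerve into other traffic.
    return "emergency_brake_in_lane"
```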
You have to consider that SDCs have tons and tons of telemetry data and cameras with video feeds. Finding what went wrong will be fairly easy. When a plane goes down you don't ground every airline; you only do that if it's proven there is a severe flaw in the aircraft's design.
>question of the century
are you serious
It's a computer algorithm designed to make absolute decisions based on obeying the law, and it assumes everyone else is also obeying the law.
>computer, obeying the law, hits somebody breaking the law
>WELL IT'S IMPOSSIBLE TO KNOW WHOSE FAULT THIS IS
>decisions based on obeying the law
not really. the algo isn't programmed with the law and told "go!"; a good human driver drives and the algo learns for a while, i.e. it learns by training, not by being programmed with the rules.
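i.e. something much closer in spirit to the toy behavioral-cloning sketch below. The data, features, and model here are placeholders, not any vendor's actual pipeline; the point is just that no traffic statute appears anywhere in it:

```python
# Toy behavioral cloning: fit a model to imitate logged human driving.

import numpy as np

rng = np.random.default_rng(0)

# Pretend sensor logs: 8 features per moment (distances, speeds, lane offset...)
# paired with the steering angle a good human driver actually chose.
X = rng.normal(size=(1000, 8))
true_behavior = rng.normal(size=8)
y = X @ true_behavior + rng.normal(scale=0.1, size=1000)

# "Training" is just fitting the imitation (least squares here; real systems
# use deep nets, but the principle is the same: no statute is written in).
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def steer(sensor_features: np.ndarray) -> float:
    """Return the steering angle the learned imitation would pick."""
    return float(sensor_features @ w)

print(steer(X[0]), y[0])  # the clone roughly reproduces the human's choice
```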
If a human driver randomly and erratically swerved to avoid something falling spontaneously into the street, they could wipe out 6 grandmas and it'd just be a factor of the overall tragedy.
Why people expect a machine to be held to impossibly high, inhuman standards of performance when a not-quite-illegally drunk asshole on a moped next to it would still be judged by the legal standard we have now is beyond me.
OK guys, I have a solution:
>auto car needs to decide how to dodge the obstacle in a morally ambiguous situation
>programmer needs to code the car for this situation
>in order to avoid any moral responsibility, he lets the car generate a random number and makes it act on whatever number comes out
>can't be blamed for the murder since he didn't decide who to kill
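As a sketch (purely illustrative; whether a court would accept "the RNG did it" is another question entirely):

```python
# The "solution" above, as code.

import random

def dodge(options: list[str]) -> str:
    """Pick a crash target uniformly at random, so nobody 'decided' who to hit."""
    return random.choice(options)

print(dodge(["SUV", "motorcycle", "hay bale"]))
```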