
If a self-driving car hits a school bus at high speed (despite braking), I'm pretty sure a lawyer will make a case that it could have avoided the school bus by driving up on the sidewalk, which may or may not have had people on it.

I get your point, but I think it applies more to people making a split-second decision than to a car, where the decision has already been made in code and someone (or some company) has to take responsibility for why it was made that way.



Today’s AIs are happy to correctly identify which lane they should be in - counting how many people are on a given vehicle is so far out of scope that there is no point continuing.


This is a strawman.

If the option to avoid a collision with another car is available and the car brakes instead, colliding with and killing someone else, that will be a lawsuit, and that lawsuit will easily win.


You'll have lawyers willing to make a case against you even if you make a provably perfect choice.

And the real issue is whatever got you into that situation, not the split-second choice you make.



