
That's not what happened. The AI was not programmed to drive through anything. It was incorrectly programmed to dump previous data whenever the categorization changed. It correctly identified at each point that it was meant to slow down and/or stop but, by the time it determined what the obstacle was, the previous data had been thrown out and it didn't have enough time to stop properly. In your example, it was more like "There's something at 12 o'clock. Bike? Person? Unknown? Stop!!" just before actually hitting the person.
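To make the failure mode concrete, here is a minimal, hypothetical sketch (not Uber's actual code, and the class and field names are invented for illustration): a tracker that wipes an object's observation history every time the classifier changes its label can never accumulate the two-plus observations needed to estimate a closing speed.

```python
# Hypothetical sketch of the bug described above: resetting an object's
# motion history on every reclassification means a trajectory can never
# be estimated across label changes.

class Track:
    def __init__(self):
        self.label = None
        self.positions = []  # (timestamp_s, distance_m) observations

    def update(self, timestamp_s, distance_m, label):
        if label != self.label:
            # The bug: reclassification discards all prior observations.
            self.positions = []
            self.label = label
        self.positions.append((timestamp_s, distance_m))

    def closing_speed(self):
        # Needs at least two observations under the same label.
        if len(self.positions) < 2:
            return None
        (t0, d0), (t1, d1) = self.positions[0], self.positions[-1]
        return (d0 - d1) / (t1 - t0)  # m/s toward the vehicle


# Simulated detections: the classifier flip-flops on every frame,
# as described above ("Bike? Person? Unknown?").
track = Track()
for t, dist, label in [(0.0, 60, "unknown"), (0.5, 55, "bicycle"),
                       (1.0, 50, "unknown"), (1.5, 45, "pedestrian")]:
    track.update(t, dist, label)

print(track.closing_speed())  # None: history was reset on each frame
```

With persistent history across label changes, the same four observations would yield a closing speed of 10 m/s and plenty of warning; with the reset, the tracker effectively sees a brand-new stationary object on every frame.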


The car did "see" an obstacle for over 6 seconds and did not brake for it; now someone is dead. You are haggling over semantics to make it look like this did not happen and/or this is not a bug. Atrocious.

(Or, more charitably, "oops, somebody forgot that object persistence is a thing" does not excuse the result)


What? That's not at all what I'm doing and you're being extremely disingenuous to suggest that. I'm simply correcting misinformation. The car wasn't programmed to drive through anything. It was programmed to throw away information. Either way, it's an atrocious mistake and I've even said, elsewhere in these comments, that the people responsible for that code should be held liable for criminal negligence. There's no need to lie about my point or my position to defend yourself. That's just silly.


I have misunderstood you then, and I apologize.


Then I forgive you and I'm glad we see eye-to-eye on this. Everyone should be appalled at Uber's role in this and their response along with the lack of repercussions for them.



