Hacker News

If people developing a system that controls two-ton death machines get their panties in a twist about having to demonstrate basic competency in a memory-unsafe language… it’s probably good that they quit.

Airline pilots don’t quit in a huff because they have to demonstrate basic competency annually.



This take might have more credibility if Tesla weren't facing criminal charges for its failure to deliver autopilot.


Criminal? No one is going to jail for failure to deliver on a very difficult AI problem.


> Criminal?

Yes: https://www.reuters.com/legal/exclusive-tesla-faces-us-crimi...

> failure to deliver on a very difficult AI problem

It's the combination of claims that the feature could be delivered and a failure to deliver.

Failure is never illegal. Lying about failure often is.



