
Having driven in Canadian winters, I honestly agree that reliable autonomous driving in inclement weather is indeed decades away.

The visual recognition needed is well beyond the systems today.



Isn't that just "image de-obfuscation" though? Seems like narrow AI will be able to out-class humans at that in no time. You can generate as much training data for that as you want. Doesn't really require human-type intelligence. Though I guess you might mean that the obfuscation makes the edge cases even harder, which makes sense.
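The "generate as much training data as you want" point can be sketched concretely: if you can synthesize the obfuscation (snow, spray, glare), every clean image yields unlimited (corrupted, clean) training pairs for a de-obfuscation model. A minimal sketch, assuming images normalized to [0, 1] and a crude "snow speckle" corruption (the function name and noise model are illustrative, not from any real pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_training_pair(clean, snow_fraction=0.1):
    """Corrupt a clean image with synthetic 'snow' (random white speckles).

    Returns (corrupted, clean) so a model can be trained to undo the
    corruption. Any single clean image yields unlimited corrupted variants.
    """
    corrupted = clean.copy()
    # Each pixel is independently hit by a speckle with prob. snow_fraction.
    mask = rng.random(clean.shape) < snow_fraction
    corrupted[mask] = 1.0  # white speckle
    return corrupted, clean

# One clean frame, endless supervised examples:
clean = rng.random((64, 64))
x, y = make_training_pair(clean)
```

Real pipelines use far more realistic corruption models (rendered precipitation, real sensor captures), but the supervision trick is the same: the label comes for free because you applied the corruption yourself.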


There's like fifty caveats that go along with this statement, but this is the internet so I'm just going to skip all that.

Something like half your human brain is devoted to visual processing.

There's a tendency to think that things like language are what make the human brain special, or our ability to plan or think abstractly, and we talk about things like "eagle eyes", but the truth is humans are seeing machines, with most everything else as an afterthought.

The reason your cat will attack paint spots on glass for hours and flip the hell out about laser pointers is that their visual systems are too simple to distinguish between those and the objects that actually interest them, like insects.

Vision is not the easy part of AI.


> Vision is not the easy part of AI.

I think it is, actually. Going from raw pixels to objects is the (relatively speaking) easy part. It's the next part (using that for planning and common-sense reasoning) that's the hard part. Machine learning has already advanced past humans in this regard for many classes of problems - which is part of the reason why captchas are getting so hard.

This was several years ago, hence the move away from obfuscated text (which was getting harder and harder to read): https://spectrum.ieee.org/tech-talk/artificial-intelligence/...

I'd be surprised if basic perception tasks as human-ness tests last more than a few more years.


citation for "half your human brain is devoted to visual processing"?


Since the internet places no weight on things like "common knowledge to anyone in the field" or "I took a bunch of classes on the brain in college", here's a random quote from someone at MIT: http://news.mit.edu/1996/visualprocessing


I have a car (Honda Pilot) where the company decided to make the lift gate window too high, probably to accommodate mounting the spare tire inside the cabin. This design makes you dependent on the rear camera for most reverse use cases.

It probably made a lot of sense in the Southern California design center. In Upstate New York, that camera gets covered in road spray and salt, and I cannot see anything or reverse effectively without cleaning it first. Even then, it gets dirty again after a few minutes of driving.

I’d guess that at least a few dozen people will be hurt by this decision.

Take this problem to the self-driving car and things get even worse. You’re going to have a lot of problems with sensor effectiveness that cannot be magically fixed with software.


It sometimes seems as if half the purpose of assistive driving systems is to compensate for the absolutely horrible sight lines in a lot of newer vehicles.


And I have heard rumors of lobbying going on to get the requirement for rear view mirrors dropped when video feeds are provided to replace the functionality.


Here's my go-to example about the challenges of driving in a Canadian winter.

I was waiting for my bus to work one morning after a large snowfall. The snow clearing crews were hard at work, but the street was effectively blocked by piles of snow, men, and machines.

Yet, my bus arrived on time *driving down the sidewalk*.

I am not sure how any self-driving system could have figured that out :)


And if it did, hollow sidewalks are a thing in some places, so...



