I believe every car manufacturer has a disclaimer that the autopilot can only be used as an assist - that the driver needs to keep their eyes on the road, ready to intervene at any given time.
We're not at the self-driving level of kicking the seat back and watching Netflix on your phone yet.
I doubt we will ever get there; there will always be edge cases which are difficult for a computer to grasp. Faded lane markings, some non-self-driving car doing something totally unexpected, extreme weather conditions limiting visibility for the cameras, etc.
> I believe every car manufacturer has a disclaimer that the autopilot can only be used as an assist - that the driver needs to keep their eyes on the road, ready to intervene at any given time.
This is the scariest bit, IMHO. Basically, autopilot is developed well enough to mostly work under normal conditions, and humans aren't very good at staying alert for extended periods while just monitoring something that mostly minds its own business.
The result is that the 'assist' runs the show until it suddenly veers off the road or into a concrete barrier, a bicyclist, whatever. The 'driver' then blames the autopilot; the autopilot supplier blames the driver, stating that the autopilot is just an aid, not a proper autopilot.
This is the worst of both worlds. Driver aids should either be just that - aids, in that they ease the cognitive burden but still require you to pay attention and be ready to intervene at all times - or you shouldn't be a driver anymore, but a passenger. Today's 'It mostly works, except occasionally when it doesn't' is terrifying.
This "driver aid" model itself is starting to sound like a problem to me. You either have safe, autonomous driving or you don't.
A model where a driver is assumed to disengage their attention, but is then expected to re-engage in a fraction of a second to respond to an anomalous event, is fundamentally flawed, I think. It's like asking a human to drive and not drive at the same time. Most driving laws assume a driver should be alert and at the wheel; this is what...? Assuming you're not alert and at the wheel?
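To put that 'fraction of a second' in perspective, here's a back-of-the-envelope sketch (my own illustration; the reaction-time figures are assumptions, not measurements from any study) of how far a car travels while a disengaged driver re-orients:

```python
# How far a car travels before a disengaged driver re-orients and
# actually takes over. Latency figures below are illustrative guesses.

def takeover_distance(speed_mph: float, takeover_latency_s: float) -> float:
    """Distance travelled (metres) before the human re-engages control."""
    speed_ms = speed_mph * 0.44704  # mph -> m/s
    return speed_ms * takeover_latency_s

# An attentive driver might react in ~1.5 s; one lulled by an "assist"
# that mostly works could plausibly need several seconds to re-orient.
for latency in (1.5, 3.0, 6.0):
    print(f"{latency:.1f} s at 70 mph -> {takeover_distance(70, latency):.1f} m")
```

Even under the charitable assumptions above, the car covers the length of a football field before a lulled driver is back in the loop.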
As you're pointing out, this gives the manufacturer a convenient legal out: they can just say "you weren't using it correctly."
I fail to see the point of autopilot at all if you're supposed to be able to correct it at any instant in real-world driving conditions.
> I fail to see the point of autopilot at all if you're supposed to be able to correct it at any instant in real-world driving conditions.
The cynic in me suggests we need autopilot as a testbed on the way to the holy grail of Level 5 autonomous vehicles.
The engineer in me fears that problem may be a tad too difficult to solve given existing infrastructure - that is, we'd probably need to retrofit all sorts of sensors and beacons and whatnot to roads in order to help the vehicles travelling on them.
Road sensors ain't gonna fix the long tail of L5. We can't even keep up the roads as is - like the crash attenuator that would have mitigated the fatality in the OP article.
Also, highway lane splits are very dangerous in general. It's a concrete spear with 70 mph cars whizzing right towards it. Around here, they just use barrels of material, sand I believe. Somebody crashes into one, they clear the wreck and lug out some more sand barrels. Easy and quick.
It isn't the SOLE action for L5 to be feasible, but I believe it is a REQUIRED action. (Emphasis added not to insinuate you'd need it, but rather to show, well, my emphasis. :))
For the foreseeable future, there are simply too many variables outside autopilot manufacturers' control; I cannot see how car-borne sensors alone will be able to provide the level of confidence needed to do L5 safely.
Oh, and a mix of self-driving cars and ones piloted by bipedal, carbon-based drivers on the roads doesn't make anything simpler, as those bipedal, carbon-based drivers tend to do unpredictable things every now and then. It'll probably be easier when (if) all cars are L5.
I see this stated often, that humans are unpredictable drivers. What's the proof that automated systems will be predictable? They too will be dealing with a huge number of variables, and trying to interpret things like intent etc.
Yes, automated systems will also do unpredictable things. The point I was (poorly, as it were) trying to make was that the mix of autopilots and humans is likely to create new problems. Without being able to dig it out now, I remember a study which found that humans had problems interacting with autonomous vehicles because the latter never fudged their way through traffic like a human would - say, approaching a traffic light that turned yellow, the autonomous vehicle would come to a hard stop, whereas a human driver would likely just scoot through the intersection on yellow. The result: autonomous vehicles got rear-ended much more frequently than normal ones.
So - humans need to adapt to new behaviour from other vehicles on the road.
When ALL vehicles are L5, though, they (hopefully) will all obey the same rules and be able to communicate intent and negotiate who goes where when /prior/ to occupying the same space at the same time...
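As a toy illustration of what 'communicating intent and negotiating' could look like, here's a minimal sketch. The message format and the earlier-arrival-wins rule are entirely hypothetical, for illustration only - real V2V protocols are far more involved:

```python
# Toy sketch of vehicle-to-vehicle intent negotiation. All fields and the
# priority rule are made up for illustration; this is not any real standard.

from dataclasses import dataclass

@dataclass
class IntentClaim:
    vehicle_id: str
    zone: str         # e.g. an intersection cell identifier
    arrival_s: float  # when the vehicle expects to enter the zone
    exit_s: float     # when it expects to have cleared the zone

def conflicts(a: IntentClaim, b: IntentClaim) -> bool:
    """Two claims conflict if they occupy the same zone at overlapping times."""
    return a.zone == b.zone and a.arrival_s < b.exit_s and b.arrival_s < a.exit_s

def negotiate(a: IntentClaim, b: IntentClaim) -> str:
    """Trivially resolve a conflict: earlier arrival wins, the other yields."""
    if not conflicts(a, b):
        return "no conflict"
    winner = a if a.arrival_s <= b.arrival_s else b
    return f"{winner.vehicle_id} proceeds, the other yields"

print(negotiate(IntentClaim("car_A", "junction_7", 4.0, 6.0),
                IntentClaim("car_B", "junction_7", 5.0, 7.0)))
```

The hard part, of course, is getting every manufacturer to speak the same protocol and honour the same priority rule.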
I think that unless a single form of AI is dictated for all vehicles, we can't safely assume that autonomous vehicles will obey the same rules. Hell, we can't even get computers to obey the same rules now, either programmatically or at a physical level.
And, of course, they should all obey the same rules (traffic regulations being one part, but also how they handle the unexpected - it would be a tough sell for a manufacturer whose vehicles would rather damage themselves than other objects in the vicinity in the event of a pending collision, if other manufacturers didn't follow suit...).
Autonomous Mad Max-style vehicles probably isn't a good thing. :/
It's only a problem if you believe in driverless cars; then it becomes a Hard Problem. "It works in situations where it's irrelevant" - but so does plain old not holding the wheel: look, it's self-driving!* (*in ideal conditions)
Which is why most car companies said long ago that they wanted to skip Level 3 and go directly to Level 4. With Level 4, when the car can't drive, it will stop and give the human plenty of time to take over.
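A minimal sketch of that Level 4 idea, assuming a simple fallback state machine (my own illustration, not any manufacturer's actual logic; state names and timings are made up): leaving the operational domain triggers a takeover request with a generous deadline, and if the human never shows up the car stops safely on its own instead of panic-handing control back.

```python
# The key property of Level 4: leaving the operational domain never
# demands an instant human takeover - the fallback is a safe stop.

from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()         # system drives within its operational domain
    REQUEST_TAKEOVER = auto()   # human invited to take over, generous deadline
    MANUAL = auto()             # human is driving
    MINIMAL_RISK_STOP = auto()  # car pulls over / stops safely on its own

def next_mode(mode: Mode, in_domain: bool, human_ready: bool,
              seconds_waited: float, deadline_s: float = 30.0) -> Mode:
    """One step of the fallback state machine (deadline is an assumption)."""
    if mode is Mode.AUTONOMOUS and not in_domain:
        return Mode.REQUEST_TAKEOVER
    if mode is Mode.REQUEST_TAKEOVER:
        if human_ready:
            return Mode.MANUAL
        if seconds_waited >= deadline_s:
            return Mode.MINIMAL_RISK_STOP  # nobody took over: stop safely
    return mode

# Leaving the domain triggers a request, not an instant handoff:
print(next_mode(Mode.AUTONOMOUS, in_domain=False,
                human_ready=False, seconds_waited=0.0))
```

Contrast this with Level 2/3, where the fallback *is* the human - which is exactly the fraction-of-a-second problem discussed above.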