If a self-driving car hits a school bus at high speed (despite braking), I'm pretty sure a lawyer will make the case that it could have avoided the school bus by driving up onto the sidewalk, which may or may not have had people on it.
I get your point, but I think it applies more to people making a split-second decision than to a car, where the decision has already been made in code and someone (or some company) has to take responsibility for why it was made that way.
Today’s AIs are happy just to correctly identify which lane they should be in - counting how many people are on a given vehicle is so far out of scope that there is no point continuing.
If the option to avoid a collision with another car is available and the car brakes instead, colliding with and killing someone else, that will be a lawsuit, and that lawsuit will easily be won.