Would you say the same about code that, for example, is designed to fail when used in a missile? E.g. navigation code that deliberately pilots into the ocean if it detects it's onboard something going faster than Mach 3?
I think the poster meant navigation code that was not intended to be used in a missile, being used in a missile. Consumer grade GPS units do exactly that; they intentionally stop working above a certain speed or altitude, so that they can't be used to build missiles.
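For reference, the restriction described above is usually called the CoCom limit: export-controlled GPS receivers are expected to stop reporting a fix above roughly 1,000 knots or 18,000 m altitude. A minimal sketch of such a gate might look like this (thresholds are the commonly cited figures; actual firmware behavior varies by vendor, and some famously disable only when *both* limits are exceeded):

```python
# Illustrative CoCom-style limit check, as commonly described for
# consumer GPS firmware. Thresholds are the widely cited values;
# real implementations differ between vendors.

COCOM_SPEED_KNOTS = 1000.0   # ~1,900 km/h
COCOM_ALTITUDE_M = 18000.0   # ~59,000 ft

def position_fix_allowed(speed_knots: float, altitude_m: float) -> bool:
    """Return False (suppress the fix) when either limit is exceeded.

    Some firmwares instead require BOTH limits to be exceeded, which
    is why high-altitude balloons sometimes lose GPS; this sketch uses
    the stricter either-limit interpretation.
    """
    return speed_knots <= COCOM_SPEED_KNOTS and altitude_m <= COCOM_ALTITUDE_M

# Mach 3 is roughly 2,000 knots, so it trips the speed limit:
print(position_fix_allowed(2000.0, 15000.0))  # False
# An airliner at cruise stays well inside both limits:
print(position_fix_allowed(450.0, 11000.0))   # True
```

The point of the thread is exactly this kind of check: it is a deliberate, documented failure mode baked in at the device level, not a hidden trap in a general-purpose library.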
There’s a big difference between doing that in a consumer-grade device and doing it in an open source project you publish for the world to use; e.g. when someone decides to use it in a hypersonic passenger jet.
It's perfectly ethical to write open source code that will crash a passenger jet if used in such a manner. It's unethical (and probably criminally negligent) to include it in a passenger jet.
It's perfectly ethical to write open source code that will crash a passenger jet
Really?
We’re not talking about bugs here, we’re talking about the deliberate inclusion of dangerous logic traps based on assumptions that could be wrong.
In engineering terms, this is trying to solve the problem at the wrong level of abstraction. Maybe you don’t like the thought of your code being used in weapons; that’s perfectly reasonable. Maybe you want to rid the world of such weapons; also reasonable. My advice would be that open source is not the right model in those circumstances, and that to really solve the problem you’ll have to be politically engaged.
Littering code with booby traps and publishing it is an absolutely awful idea, one that should not be acceptable to us as a professional community.
Just stating that you offer no warranty doesn’t absolve you of all ethical considerations. You can’t publish something littered with poorly thought out booby traps and then absolve yourself with a license file! If you’re not interested in people using your code for specific purposes, maybe don’t publish it for all to use?