So you're proposing replacing a hypothetical risk (Geoffrey Hinton puts the probability at 1%) with guaranteed pain and destruction, which might escalate to nuclear war? Thank you, but no thank you.
No, I'm proposing replacing a high-risk scenario (I disagree with Hinton here) with a low-risk scenario, one with a low chance of escalating to any serious fighting.
It seems people have missed what the proposed alternative is, so let me spell it out: it's about getting governments to SIGN AN INTERNATIONAL TREATY. It is NOT about having the US (or anyone else) police the rest of the world.
It's not fundamentally more dangerous than the few existing treaties of a similar kind, such as those on nuclear non-proliferation. And obviously, all nuclear powers need to be on board with it, because being implicitly backed by nukes is the only way any agreement can truly stick at the international level.
This level of coordination may seem near-impossible to achieve now, but it's still far more possible than surviving the accidental creation of an AGI.
> No, I'm proposing replacing a high risk scenario (I disagree with Hinton here)
I disagree with you here, I think the risk is low, not high.
> it's about getting governments to SIGN AN INTERNATIONAL TREATY. It is NOT about having US (or anyone else) policing the rest of the world
We haven't been able to do that for climate change. When we do, then I'll be convinced enough that it would be feasible for AI. Until then, show me this coordination for the damage that's already happening (climate change).
> This level of coordination may seem near-impossible to achieve now, but then it's much more possible than surviving an accidental creation of an AGI.
I think the coordination required is far less likely than a scenario where we need to "survive" some sort of danger from the creation of an AGI. But we can find out for sure with climate change as an example. Let's see the global coordination. Have we solved that actual problem yet?
Remember, everyone has to first agree to this "bomb AI" to avoid war. Otherwise, bombing/war starts. The equivalent for climate change would be bombing carbon producers. I don't see either agreement happening globally.
To continue contributing to climate change takes very little: you need some guy willing to spin a stick for long enough to start a fire, or to feed and protect a flock of cows. To continue contributing to AI, you need to maintain a global multi-billion supply chain with cutting edge technology that might have a four-digit bus factor.
The mechanisms that advance climate change are also grandfathered in to the point that we are struggling to conceive of a society that does not use them, which makes "stop doing all of that" a hard sell. On the other hand, every society at least has cultural memory of living without several necessary ingredients of AI.
Continuing to contribute to the harm caused by AI only requires that one person use an existing model running off their laptop to foment real-world violence, spread disinformation en masse on social media, or run a cheap swarm of weaponized/suicide drones, to give a few examples.
There's a qualitative barrier there. The AI risk people in the know are afraid of is not a flood of AI-generated articles, but something that probably can't yet be achieved with current levels of AI (and that more slop from current-day AI won't hasten). Modern-day greenhouse gases, on the other hand, are exactly the greenhouse gases that climate activists are afraid of in the limit.
The AI risks people in the know are afraid of are indeed what I listed: pretty much any person, anywhere, using an existing model running off their laptop to foment real-world violence, spread disinformation en masse on social media, or run a cheap swarm of weaponized/suicide drones, to give a few examples.
That's what we're already seeing today, so we know the risk is there and has a probability of 100%. The "skynet" AI risk is far more fringe and far-fetched.
So, like you said about climate change, the harm can come from one person. In the case of climate change, though, the risks people in the know are afraid of aren't "some guy willing to spin a stick for long enough to start a fire, or to feed and protect a flock of cows".