On a core level, why are you trying to create an AGI?
Anyone who has thought seriously about the emergence of AGI puts the chance that AGI causes a human-extinction-level event at roughly 20%, if not higher.
Various discussion groups I am a part of now see developing AGI as equivalent to building a stockpile of nuclear warheads in your basement that you're not sure won't immediately go off on completion.
As an open question: if one believes that
1. We do not know how to control an AGI
2. AGI has a very credible chance of causing a human-extinction-level event
3. We do not know what this chance or percentage is
4. We can identify who is actively working to create an AGI
why should we not immediately arrest people who are working toward an "AGI future" and try them for crimes against humanity? Certainly, in my nuclear warhead example, I would be arrested by the government of the country I live in the moment they discovered the stockpile.
The problem is that if the United States doesn't do it, China or other countries will. That's exactly why we can't afford to fall behind on such a technology from a political and national perspective.
For what it's worth though, I think you're right that there are a lot of parallels with nuclear warheads and other dangerous technologies.
There needs to be a level of serious discourse, which doesn't currently appear to be in the air, around what to do: international treaties, enforcement, and repercussions.
I have no idea why people aren't treating this with grave importance. The development of AI technologies is clearly well ahead of where anyone thought it would be.
With exponential growth rates, acting early is always seen as an 'overreaction', but waiting too long is sure to produce a bad outcome (see: the world re coronavirus).
There is some hope in that, as a world, we seem to have banned human cloning, and that technology has been around since Dolly in the late '90s.
On the other hand, the USA can't seem to reach a consensus that a deadly virus is a problem, even as it kills its own citizens.