Thank you, Jack. That "equally germane" is what strikes us about the letter. We are with you that the deep innovation-diffusion discourse, including Jeff's new work, applies to national leadership and outcomes. On that note, I find the same diffusion point in the military innovation discourse too (e.g., Stephen Rosen, Bill McNeil, and Williamson Murray, each in their own ways). Since the NSCAI, we've been consistently advancing that AI is a general purpose technology (a point on which we agree with Jeff) and that it is also different from fission (unless one wants to make the energy connection and argue that energy itself is a GPT). This is why we note, "A nuclear chain reaction and the arrival of super-intelligence are not one and the same." What has transfer value is how a nation gets organized for a very large technology transformation. This transfer value is in the spirit of "history doesn't (exactly) repeat itself, but it often rhymes." Thanks again!
A creative exercise with a lot of important and helpful analogies. There are, however, some fundamental differences between a nuclear weapon and AGI.
By the start of WWII, the physics, mathematics, and chemistry governing a nuclear reaction (fission/fusion) were unassailable. No such parallel with AGI, other than to extrapolate from current data and compute to future data and compute (an extrapolation that many AI experts don’t agree with — calling instead for an entirely new research path to AGI).
A nuclear explosion is a singular and extremely well-defined event. The whole point of AGI is that it is supposed to be everything everywhere all at once.
Put the 50 top physicists in the world in the same room in 1944, and they would all have agreed on the meaning of nuclear fission/fusion. By 1946, they would have all agreed on the global ramifications of a nuclear war.
The top 50 AI leaders in the world are still all over the place when it comes to defining AGI and its potential ramifications, ranging from "meh" to "the end is nigh!" That's because it's not nearly the same clean physics/mathematics/chemistry problem. In fact, it's as much a socio-economic issue as it is anything else.
I struggle with treating AGI as the outcome. Or viewing this as a race. I’m much more in Jeff Ding’s camp: that long-term societal diffusion matters most.
Many of the nuclear-weapon-related recommendations you raise are equally germane to AI diffusion. It's just a much messier problem, with far more complicated and uncertain outcomes, than the race for nuclear weapons.