There’s a 2x2 here for transformative AI like: generalizes versus specializes, harmonizes versus destabilizes.
The former axis tracks centralization: the more generalized the state, the fewer, bigger models. The latter is more about AIs frequency-locking to human timescales, coordination patterns, and cycles that matter to us.
So you could have a generalize / harmonize world, with gentle giants that lawfully work together, or a specialize / destabilize one, an asymmetric war of all against all, etc.
Wait, did I just reinvent alignment charts?
People making AI models do not realize how confusing the numbering system is. Why is this one at 3 and that one at 4.1 and that one at 4.7? And they're all "state of the art" so are they equally good or no?
— 🎭 (@deepfates) December 22, 2025
I suspect we will end up with some kind of yearly cycle, where…
