
Degrees of freedom

“Degrees of freedom” is a powerful concept that is underused outside of this kind of cybernetic/systems-thinking lineage. It’s the inverse of constraints: another way to look at this is that focusing on the most constrained part of a system gives you the highest leverage.

You also see this in Ashby’s law of requisite variety: to control a given system with n degrees of freedom, you need ≥n degrees of freedom in your controller. The controller has to be complex enough to model and contain the system it controls.
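
To make the n-vs-n intuition concrete, here’s a minimal toy sketch in Python (my own illustration, not from Ashby): a disturbance can take one of n values, the regulator gets to pick from k counter-moves, and the outcome only stays pinned to a single value when k ≥ n.

```python
def leaked_variety(n_disturbances: int, n_responses: int) -> int:
    """How many distinct outcomes remain after the regulator makes its best
    counter-move, where outcome = (disturbance - response) mod n."""
    outcomes = set()
    for d in range(n_disturbances):
        # The regulator only has n_responses distinct moves available;
        # it picks the one that comes closest to cancelling d.
        r = min(d, n_responses - 1)
        outcomes.add((d - r) % n_disturbances)
    return len(outcomes)

# Enough counter-moves (>= disturbances): the outcome is pinned to one value.
print(leaked_variety(8, 8))  # 1
# Too few counter-moves: the surplus variety leaks into the outcome.
print(leaked_variety(8, 4))  # 5
```

Eight counter-moves pin the outcome; four leave five possible outcomes, i.e. the unabsorbed variety shows up downstream.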

And in fact that’s exactly what we do with deep learning models. That’s what the parameters are! A 7B model has 7 billion tunable weights in it (insert caveats here if you are getting nerd-sniped by the specifics), and it can theoretically model any system with 7 billion or fewer moving parts.
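
If you want to poke at this yourself, a standard way to count a model’s degrees of freedom in PyTorch is just to sum its trainable parameters (a sketch; the toy model here is mine, and a “7B” model is the same idea with ~7×10⁹ of them):

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Total number of tunable weights, i.e. the model's degrees of freedom."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Tiny stand-in model, just to show the counting.
toy = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 512))
print(count_parameters(toy))  # 2099712 trainable parameters
```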

We don’t really know yet how many “parameters” complex systems like language, vision, climate, or economics actually have, but we’re starting to be able to predict them, which suggests we’re using something like the right number of parameters. Who knows, maybe we have already found the bare minimum degrees of freedom needed to model a given thing, and our models can get no smaller than this. But they probably don’t have to get much larger…?
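
(The “starting to be able to predict them” part is a nod to empirical scaling laws; one commonly fit parametric form, e.g. in Hoffmann et al.’s Chinchilla analysis, writes expected loss in terms of parameter count N and training tokens D:)

```latex
% A commonly fit scaling-law form: E, A, B, \alpha, \beta are empirical constants,
% N is parameter count, D is training tokens.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

That curves like this extrapolate at all is the hint that N is tracking something real about the system being modeled.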
