AI Does Not Actually Learn “Probability Distributions”
In AI, when we say a model "learns a probability distribution," this is not a literal statement.
In reality, we have no direct way of knowing what that thing is (what the model actually learned). For convenience, we call it a "probability distribution" because in simple systems, that is exactly what we see and use.
Under complexity, "probability" is an analogy: a conceptual framework for describing what a model is doing, not a literal, observable distribution.
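To make the contrast concrete, here is a minimal sketch in plain NumPy. The data, network shapes, and the untrained MLP are all hypothetical, chosen purely for illustration: in a simple system the learned distribution is two numbers you can read off directly, while in a network, all you can ever observe is one conditional output per query.

```python
import numpy as np

# Simple system: the learned "distribution" is explicit and inspectable.
# A maximum-likelihood Gaussian fit is just two parameters you can read,
# plot, and reason about directly.
rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=0.5, size=1000)
mu, sigma = samples.mean(), samples.std()
print(f"Gaussian fit: mu={mu:.3f}, sigma={sigma:.3f}")  # the entire model

# Complex system: the "distribution" lives implicitly in the weights.
# A tiny untrained MLP (hypothetical shapes, illustration only): the only
# thing we can observe is one conditional output for one input at a time.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 3))

def class_probs(x):
    """Softmax output for a single input: a local slice of the model's
    behavior, not a global, extractable distribution."""
    h = np.tanh(x @ W1)
    logits = h @ W2
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

print(class_probs(np.array([1.0, 0.0, -1.0, 0.5])))
```

Everything we query from the network looks like `class_probs`: conditional, local, and input-specific. The global object we casually call "the distribution" never appears.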
There’s no direct way to extract and visualize the “distribution” from an AI model without losing the nuances captured by the model.
Imagine how much more complex nature is than an AI model. Reductionist extraction is at the heart of most of today’s science, and it is utter nonsense.
If you want to understand nature, you have to observe it where it stands. In situ.
AI does not run on equations and distributions; that is just a useful way to speak about it.
Know the difference.