No, AI Does Not Work “Because of Math”
“AI works because of math!”
No. At least not in the way you probably think.
If you take enough naive pieces and mash them together, you get something very different from the pieces. Computing provides the scale and speed to do this mashing as engineering. But what those pieces are matters nowhere near as much as you might think.
Consider reinforcement learning: take a real-world system, capture its dynamics with some functional abstraction (e.g. basis functions), throw those basis functions into a linear operator, apply that operator millions of times via iteration, and get something that works (e.g. a reinforcement learning model that drives cars).
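To make the shape of that loop concrete, here is a minimal sketch of fitted value iteration with linear function approximation in Python. The toy environment, the Gaussian basis functions, and every parameter are illustrative assumptions, not the car-driving model above; the point is only that the "solution" emerges from blindly iterating a backup-and-refit loop over a pile of features.

```python
# A minimal sketch (illustrative assumptions throughout): fitted value
# iteration with linear function approximation over basis functions.
import numpy as np

# Toy 1-D problem: state s in [0, 1], actions nudge left/right, goal near s = 1.
ACTIONS = np.array([-0.1, 0.1])
GAMMA = 0.95

def step(s, a):
    """Assumed toy dynamics and reward, purely for illustration."""
    s_next = np.clip(s + a, 0.0, 1.0)
    reward = 1.0 if s_next > 0.9 else 0.0
    return s_next, reward

# Radial basis functions: the "functional abstraction" of the state space.
CENTERS = np.linspace(0.0, 1.0, 20)
WIDTH = 0.05

def phi(s):
    """Feature vector: one Gaussian bump per center."""
    return np.exp(-((s - CENTERS) ** 2) / (2 * WIDTH ** 2))

# Sample states used to refit the value function at each iteration.
STATES = np.linspace(0.0, 1.0, 200)
PHI = np.vstack([phi(s) for s in STATES])   # design matrix, 200 x 20

w = np.zeros(len(CENTERS))                  # weights of the linear operator
for _ in range(500):                        # iterate the Bellman backup
    targets = []
    for s in STATES:
        backups = []
        for a in ACTIONS:
            s_next, r = step(s, a)
            backups.append(r + GAMMA * phi(s_next) @ w)
        targets.append(max(backups))        # greedy one-step lookahead
    # Project the backed-up values onto the span of the basis functions.
    w, *_ = np.linalg.lstsq(PHI, np.array(targets), rcond=None)

print("estimated value near the goal:", phi(0.95) @ w)
print("estimated value far from goal:", phi(0.05) @ w)
```

Nothing in the basis functions "knows" how to reach the goal; the usable value function falls out of repeatedly blending backed-up data and re-projecting it.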
Saying that this occurs thanks to mathematics is like saying The Constitution exists because someone built a Hall and handed people pens.
Systems need interactions, and mathematics is how we achieve interactions in the machine. It is the mashing together and the iteration that solve the problem, not mathematical elegance or precise models. Mathematics is not “solving” anything; it is taking data, changing its representation, and amalgamating, blending, merging, fusing, conglomerating and jumbling its way towards what works.
As we move towards building as nature builds, mathematics is a naive starting point, not some recipe for producing things that work. It is not the sacred properties of mathematical constructs that matter, since, by definition, those properties will not map to the outputs we need.
So, is math needed? Well, yes: that is how we tell machines to mush stuff together. But that’s all it is, a carrier of data that enables the mushing. If we had something else, we would use it.
Stop giving credit to human-made constructs. We bring the pieces that get discarded; nature makes it work.