All Things Techie With Huge, Unstructured, Intuitive Leaps

A New Branch of Artificial Intelligence Mathematics Is Needed For Artificial Neural Networks


Isaac Newton described himself as a "natural philosopher".  He did experiments in motion and gravity by rolling balls down inclines and measuring and extrapolating the results. He was acquainted with Archimedes of Syracuse's Method of Exhaustion for calculating the area inside a circle. He studied the works of the French mathematician Pierre de Fermat and his work with tangents. He was willing to work with infinite series and alternate forms of expressing them.  He gravitated towards the idea (sorry for the pun, but it is fitting, so to speak) of the infinitesimal.  He determined the area under a curve from its rate of change, and in doing so invented calculus.  Leibniz also invented calculus independently of Newton, and he too introduced notation that made the concepts understandable and calculable.

It was Thomas Bayes who defined Bayesian inference and rules of probability. Although he developed some of the ideas in relation to proving the existence of a divine being, his formulas have been co-opted as the basis of machine learning.

The process repeated itself again with George Boole, who created a representational language and branch of mathematical logic called Boolean Algebra that is the basis of the information age.

Mathematics is a universal language with its own abstract symbology that describes concepts, observations, events, predictions and values based on numbers.  The whole shebang is independent of spoken cultural communications and human experience. It is the purest form of description, in the most concise manner possible.

Those who are bound by the strictures of conventional thinking have the idea that almost everything that we know about mathematics has been described or discovered.  I beg to differ.

The very recently developed mathematics of modularity (the modularity theorem for elliptic curves) had to be used to solve Fermat's Last Theorem https://en.wikipedia.org/wiki/Fermat%27s_Last_Theorem There are new discoveries almost every day in mathematics, and the velocity of discovery will increase.

It is my feeling that the field of mathematics, as used to describe how things work on earth and in the universe, is evolving into a more precise way of stating imprecise events -- which happens to be the macro human experience on earth, and may also be the static of quantum dynamics.

The reason that I bring this up is that I just read a piece about DeepMind, Google's new AI acquisition.  It just beat the European Go champion five times in a row. Go is an ancient Chinese board game.  If you are thinking "So what?", I'd like to take this opportunity to point out that the number of possible permutations and combinations in the game of Go outnumbers the number of atoms in the universe.  Obviously this was not a brute force solution.
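As a rough sanity check on that claim (using the standard back-of-the-envelope figures, not numbers from the article), each of Go's 361 board points can be empty, black or white, which already gives more configurations than the roughly 10^80 atoms usually estimated for the observable universe:

```python
import math

# Back-of-the-envelope comparison (illustrative only): 3**361 is a crude
# upper bound on the number of Go board configurations, since each of the
# 361 points can be empty, black or white.  ~10**80 is the usual estimate
# for the number of atoms in the observable universe.
board_configs = 3 ** 361
atoms_estimate = 10 ** 80

print(f"3^361 is roughly 10^{math.log10(board_configs):.0f}")
print(f"atoms in the observable universe: roughly 10^{math.log10(atoms_estimate):.0f}")
```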

The way that DeepMind created this machine was to combine search trees with deep neural networks and reinforcement learning.  The program would play games across 50 computers and learn with each game.  This is how journalist John Naughton of the Guardian described what was happening:

The really significant thing about AlphaGo is that it (and its creators) cannot explain its moves. And yet it plays a very difficult game expertly. So it’s displaying a capability eerily similar to what we call intuition – “knowledge obtained without conscious reasoning”. Up to now, we have regarded that as an exclusively human prerogative. It’s what Newton was on about when he wrote “Hypotheses non fingo” in the second edition of his Principia: “I don’t make hypotheses,” he’s saying, “I just know.”

But if AlphaGo really is a demonstration that machines could be intuitive, then we have definitely crossed a Rubicon of some kind. For intuition is a slippery idea and until now we have thought about it exclusively in human terms. Because Newton was a genius, we’re prepared to take him at his word, just as we are inclined to trust the intuition of a mother who thinks there’s something wrong with her child or the suspicion one has that a particular individual is not telling the truth.  Link: http://www.theguardian.com/commentisfree/2016/jan/31/google-alphago-deepmind-artificial-intelligence-intuititive

There is something deep in me that rebels at the thought of intuition at this stage of progress in Artificial Intelligence.  I think that intuition, as well as artificial consciousness, may eventually be built with massive, massively parallel neural networks, but we are not there yet. We haven't crossed the Rubicon as far as that is concerned.

What we really need is a new branch of mathematics to describe what goes on in an artificial neural network.  We need a new Fermat, Leibniz, Boole or Newton to start playing with the concepts and logic states of neural networks and to synthesize the universal language, theories and concepts that will describe the inner workings of a massively parallel artificial neural network.

This new field could start anywhere.  The math of an individual neuron is completely understood. We know the basic arithmetic that multiplies weights by inputs and sums them. We understand how to use a sigmoid activation function or a rectified linear function to fire the artificial neuron. We understand how to use backpropagation, gradient descent and a variety of other tools to adjust the weights and thresholds towards a more correct overall response.  We even understand how to arrange the neurons in a Cartesian coordinate matrix and apply transformation matrices. But that is as far as it goes.
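To make the "completely understood" part of that concrete, here is a minimal sketch of the arithmetic of a single neuron -- the weighted sum, a sigmoid (or rectified linear) activation, and one gradient-descent update via backpropagation. The variable names, toy inputs and learning rate are my own illustration, not anyone's reference implementation:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes the weighted sum into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    """Rectified linear activation: passes positive sums, zeroes the rest."""
    return np.maximum(0.0, z)

# One neuron: multiply weights by inputs, sum them, add a bias, activate.
x = np.array([0.5, -1.2, 3.0])   # inputs (toy values)
w = np.array([0.1, 0.4, -0.2])   # weights
b = 0.05                         # bias / threshold

z = np.dot(w, x) + b             # weighted sum
a = sigmoid(z)                   # activation (relu(z) would work the same way)

# One step of gradient descent on a squared-error loss for a target t.
# Backpropagation is just the chain rule: dL/dw = dL/da * da/dz * dz/dw.
t = 1.0                          # desired output
lr = 0.1                         # learning rate
error = a - t
grad_w = error * a * (1.0 - a) * x
grad_b = error * a * (1.0 - a)

w -= lr * grad_w                 # nudge the weights toward a more correct response
b -= lr * grad_b

print(f"weighted sum z = {z:.4f}, activation a = {a:.4f}")
print("updated weights:", w)
```

Every step here is elementary arithmetic; the point of the paragraph above is that nothing comparably crisp exists for describing what millions of such neurons do together.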

We cannot define an overall function of the entire network in mathematical terms. We cannot tie the field effects arising from the inner workings and relationships of the neural network to specific values and outcomes.  We cannot representationally take the macro function that a neural network solves, couple it to other macro functions, and get useful computational ability from a theoretical or high-level point of view.

The time has come. We need some serious work on the mathematics of Artificial Intelligence.  We need to be able to describe what the nets have learned and to replicate that with new nets. We need a new language, a new understanding and a new framework built on this branch of mathematics.  It is this field that will prevent a technological singularity.




