All Things Techie With Huge, Unstructured, Intuitive Leaps

Asynchronous Neural Nets Are Primitive Cave Man Neural Nets



The first electronic circuit that I ever designed was a logic fall-through. (The circuit was for a team quiz-show type of game, and it determined which button was pressed first, and by which member of which team.) It was asynchronous. That means that when a signal arrived at the inputs of the silicon chip, it was processed right away. It wasn't held for a state change of the chip, the way signals are in modern digital systems.
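
Just to make that concrete, here is a little toy sketch in Python (my own recreation for illustration, not the original circuit) of the fall-through idea. The first button press to arrive latches immediately and locks everyone else out, and there is no clock involved at all.

```python
# Toy sketch of an asynchronous "fall-through" lockout, as used in quiz-show
# buzzers: the first press to arrive is processed immediately and latches,
# with no clock gating the inputs. (Illustrative only, not the real circuit.)

class LockoutLatch:
    def __init__(self):
        self.winner = None          # nothing has fallen through yet

    def press(self, team):
        """Handle a button press the instant it arrives (no waiting for a clock)."""
        if self.winner is None:
            self.winner = team      # first arrival latches the output
            return True             # this press falls through
        return False                # later arrivals are locked out

panel = LockoutLatch()
print(panel.press("Team A"))        # True  -- first press wins
print(panel.press("Team B"))        # False -- locked out
```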

Modern digital systems have a bus architecture. That means that every chip on the circuit board is connected to a central set of traces or wires called a bus. The chips share the traffic, or signals, on the bus. The way this happens is that a regular clock signal generator controls the timing. So if an input receives a signal, it in turn signals the bus controller that it needs to put its data on the bus, and that the data needs to go to a certain scratchpad register or memory unit to be held.

The bus controller signals all of the other chips that it is going to commandeer the bus. All of the other chips finish up what they are doing and clear the decks. The bus controller then signals the necessary registers to receive the data, and signals the originating input to load its data onto the bus. Once the data is loaded, it signals the register to capture the data, and then frees the bus for the next operation. All operations are controlled by a nice, orderly clock signal: a square wave that rises and falls. That square wave represents the computer's binary language of zeros and ones. So a clock signal, in computer talk, always looks like this: 01010101010101010101 and so on. The important thing is that there is a frequency, or a set timing, between the state changes of zero and one, the up and down of the waveform.
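
Here is a toy sketch in Python of that clocked hand-off (a deliberate over-simplification of my own, not any real bus protocol). Whatever an input puts on the shared bus only gets latched into a register on the rising edge of the clock, that is, on the 0-to-1 transitions of that 010101... square wave.

```python
# Over-simplified sketch of synchronous bus behaviour: a register only latches
# what is on the bus at the rising edge of the clock (the 0 -> 1 transitions).
# The values and names here are made up for illustration.

clock = [0, 1, 0, 1]                    # the square wave: 0101...
pending = ["0x2A", None, "0x07", None]  # data an input wants to send each half-cycle
bus = None                              # the shared traces
register = None                         # the scratchpad register

prev = 0
for level, data in zip(clock, pending):
    if data is not None:
        bus = data                      # the input loads its data onto the bus
    if prev == 0 and level == 1:        # rising edge: the only moment a latch happens
        register = bus
        print("rising edge: register <-", register)
    prev = level
```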

This frequency is important. If you remember back to the days of dial-up internet (if you are old enough), you will remember the distinctive sound of the modem connecting to the internet through the phone line. It was an oscillating sound. The binary signals were converted to tones at set frequencies that the modem at the other end understood to be a zero or a one. This was called frequency shift keying, and it was a way to turn binary computer language into a sound that could traverse a telephone line. It preserved coherent computer data by using a set frequency, and a set timing for that frequency. It was all rather ingenious, and it was the baby steps toward where we are today with high-speed internet.
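
For the curious, here is a rough sketch of the sending side of FSK in Python. The mark and space tones and the bit timing are illustrative numbers I picked, not any real modem standard, but they show the trick: each bit becomes a fixed-length burst of one of two frequencies.

```python
import math

# Rough sketch of frequency shift keying (illustrative numbers, not a real
# modem spec): each bit is sent as a fixed-length burst of one of two tones,
# so the receiver only has to decide which frequency it is hearing.

MARK = 1200.0     # tone (Hz) standing in for a binary 1
SPACE = 2200.0    # tone (Hz) standing in for a binary 0
RATE = 8000       # samples per second
BIT_TIME = 0.01   # seconds per bit -- the fixed timing that keeps the data coherent

def fsk_modulate(bits):
    """Turn a bit string like '0101' into a list of audio samples."""
    samples = []
    for bit in bits:
        freq = MARK if bit == "1" else SPACE
        for n in range(int(RATE * BIT_TIME)):
            samples.append(math.sin(2 * math.pi * freq * n / RATE))
    return samples

signal = fsk_modulate("0101")
print(len(signal), "samples for 4 bits")   # 80 samples per bit, 320 in total
```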

Well, a good idea can always be re-used. FSK, or frequency shift keying, helped take us from asynchronous to synchronous systems, and the same idea could be used to make Artificial Neural Networks a lot smarter.

I was just reading some of the latest brain science research on the real neural networks in our brains. They are fall-through asynchronous in general, meaning that when a nerve sends a signal, it is immediately fed into a massively parallel network of neurons. However, it has been discovered that frequency also plays a role in the neural network.

This layered sort of encoding is almost like the palimpsest that Carl Sagan talked about in his book "Contact". In its truest sense, a palimpsest is an ancient manuscript of a book, from the days before paper, that had another book written over it. The old words were scraped off, and new words were written. However, you could still read the old words, so the book carried double the information. In the book and movie "Contact", the information to build the machine was a palimpsest where additional information was encoded in the polarization, or rotation, of the waveform.

Apparently the human brain uses frequency as an additional information encoder. This has been measured in studies of emotional response in the brain, where frequency plays a huge part. That component is entirely missing from computer Artificial Neural Networks. All computer neurons are asynchronous fall-through.

I am by no means suggesting that they become synchronous in the sense of a clocked computer system (although that could be a possible paradigm), but that frequency somehow be incorporated as an additional tool, paradigm, algorithm, or species of neuron. A good start would be to incorporate Frequency Shift Keying into Artificial Neural Networks. I don't have an exact methodology for how to do this yet, but you can be sure that I will devote some of my internal brain cycles to trying to figure this thing out.
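
Just to give a flavour of what I mean, here is a purely speculative doodle in Python. The neuron, its weights, and the made-up "salience" value are all my own inventions for illustration; the idea is simply that the usual weighted-sum activation rides on an oscillation whose frequency carries a second piece of information, palimpsest-style.

```python
import math

# Purely speculative sketch: an ordinary artificial neuron whose output rides
# on a carrier wave, with the carrier frequency encoding a second, made-up
# "salience" channel. Not a methodology, just a doodle of the idea.

def frequency_coded_neuron(inputs, weights, salience, t):
    """Usual weighted-sum activation, carried on an oscillation at 'salience' Hz."""
    activation = math.tanh(sum(i * w for i, w in zip(inputs, weights)))
    carrier = math.sin(2 * math.pi * salience * t)   # the extra, FSK-style channel
    return activation * carrier

# A downstream unit could read the amplitude (the usual signal) and the
# frequency (the extra channel) separately -- a crude palimpsest.
print(frequency_coded_neuron([0.5, -0.2], [0.8, 0.4], salience=4.0, t=0.1))
```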

As a matter of fact, it is a fascinating thought experiment to contemplate how a Von Neumann machine might behave if it were frequency-aware, and what new, ingenious compute dynamics that awareness might open up.

Obviously a lot more research needs to happen, but this is an avenue worth exploring for Machine Learning, Deep Learning, Artificial Neural Networks, and Artificial Consciousness. More ruminations on this topic to come.
