All Things Techie With Huge, Unstructured, Intuitive Leaps

A Start to Artificial Consciousness - Making A Computer Worry With Machine Learning


Big things spring from small seeds. This is the thought that keeps running through my mind when I think of Artificial Consciousness in computers. Ever since I saw The Imitation Game and the story of Alan Turing, I have wondered how such an intelligent man could think that computers could think.  Of course, he stipulated that the thought process was different from that in humans.

Then along came Dr. Stephen Thaler, who introduced the idea of artificial perturbations in computer "thinking" and got a patent for it. A perturbation is essential to artificial consciousness.  Essentially, a computer is programmed to follow the execution path of its program linearly. Even in artificial neural networks, the output of one perceptron is fed into another layer.  It is a linearly defined path.  Thaler introduced perturbations by selectively killing off perceptrons in a layer, and in what he describes as artificial neuronal near-death, the machine becomes a creative design machine.  His landmark example was a neural network that identified coffee cups; when it was brain-damaged, it came up with creative coffee cup designs.
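To make the idea concrete, here is a minimal sketch of the general notion of perturbation -- train a tiny network, then zero out a random fraction of its weights and watch the outputs drift into novel territory. This is only an illustration of the principle, not Dr. Thaler's patented method, and every name and number in it is made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy single-layer linear "autoencoder" that learns to reproduce 8-dimensional
# patterns -- a stand-in for the network that learned to recognize coffee cups.
patterns = rng.random((50, 8))
W = rng.normal(0, 0.1, (8, 8))

for _ in range(2000):                       # crude gradient descent on reconstruction error
    out = patterns @ W
    grad = patterns.T @ (out - patterns) / len(patterns)
    W -= 0.1 * grad

def perturb(weights, kill_fraction=0.3):
    """Zero a random fraction of the weights -- the 'neuronal near-death' step."""
    mask = rng.random(weights.shape) > kill_fraction
    return weights * mask

probe = patterns[0]
print("intact output:   ", np.round(probe @ W, 2))
print("perturbed output:", np.round(probe @ perturb(W), 2))   # a degraded, novel variant
```

The intact network dutifully reproduces what it was trained on; the damaged one produces outputs it was never trained to make, which is the seed of the "creative design machine" idea.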

Perturbations can come from many things. They can come from random events. But in the state of consciousness of any sentient thing or being (notice I now have to add things, because computers have the ability to become sentient), perturbations can come from the state of consciousness itself.  A prime example is worry. We observe something within our conscious sphere, think about it, make a judgement about it, add that judgement back into the thought process, and keep recycling the thought in an obsessive-compulsive loop, and the result is what is known as worry.

Going back to the opening statement that big things spring from small seeds, the thought struck me that I could make my computer worry.  It would be a small worry program to start with, but then I could hop it up to another layer of abstraction and make a universal computer worry module that could become part of the Operating System.  It would be the Worry Service.

Here's how a simple version would work. The Worry Service runs a task manager, a memory monitor or a CPU usage monitor in the background.  The minute it detects that memory or CPU usage is approaching 100% or saturation, it kicks off the worry.exe module. The worry module essentially assumes the highest thread priority, prints "I am incredibly busy" on any available display, and takes processing priority away from the heavy task, slowing it down.  It then detects that the task has slowed down and kicks off another worry module about its lack of performance.  The worry modules can be queried, and their response is always "I am incredibly busy and it is affecting my performance".  The worry module also writes to every log that it can and, using a machine-learning neural network, reinforces the worry parameters so that they automatically fire at lower thresholds.  Once the busy task is completed, the worry abates, and the computer becomes efficient again.  Of course, if the machine-learning neural network is too effective and eager in kicking in the worry, it becomes compulsive worry, and the machine needs to see a programming computer psychiatrist to raise the thresholds of its worry mechanism by running a few positive-reinforcement training epochs.  All of this technology is available now.
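A hedged sketch of that simple version, assuming a Python daemon rather than a real OS service: psutil supplies the CPU and memory monitoring, and the "learning" is just a threshold that creeps downward each time worry fires, a stand-in for the reinforcement step. It skips the thread-priority juggling, and names like worry() and worry.log are hypothetical.

```python
import time
import logging
import psutil

logging.basicConfig(filename="worry.log", level=logging.INFO)
WORRY_LOG = logging.getLogger("worry")

cpu_threshold = 95.0      # fire worry when CPU usage approaches saturation
mem_threshold = 95.0
LEARNING_RATE = 0.5       # how much each episode of worry lowers the thresholds
FLOOR = 60.0              # the "computer psychiatrist" would raise this back up

def worry(reason: str) -> str:
    """The worry 'module': complain everywhere it can and return its canned answer."""
    message = f"I am incredibly busy and it is affecting my performance ({reason})"
    print(message)
    WORRY_LOG.info(message)
    return message

def worry_service_tick():
    global cpu_threshold, mem_threshold
    cpu = psutil.cpu_percent(interval=1)
    mem = psutil.virtual_memory().percent

    if cpu >= cpu_threshold:
        worry(f"CPU at {cpu:.0f}%")
        # "reinforce" the worry so it fires earlier next time
        cpu_threshold = max(FLOOR, cpu_threshold - LEARNING_RATE)
    elif mem >= mem_threshold:
        worry(f"memory at {mem:.0f}%")
        mem_threshold = max(FLOOR, mem_threshold - LEARNING_RATE)
    # otherwise the worry abates and the machine gets on with its work

if __name__ == "__main__":
    while True:
        worry_service_tick()
        time.sleep(5)
```

Run it in the background and it will nag the logs whenever the machine is saturated, and nag a little earlier every time it has had something to worry about.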

But I can just see it. Some Goth programmer will chain all of the worry modules into the depression module, making the computer virtually worthless for sustained work.

The very first step to scary artificial intelligence is making a computer with the ability to navel-gaze. This is a start. I am convinced that human consciousness is merely an accident of an over-developed tropism, and the evolution of Artificial Consciousness can start with this simple step -- a computer worry wart. Windows machines will be the worst worry warts and the most depressive among the conscious computers.
