Neural Turing Machine Is The Artificial Human Brain The Robot Apocalypse Always Wanted

Artificial neural networks are great for building computers that can recognize patterns with startling accuracy (Facebook uses one for its face recognition system, for example), but they're terrible at basic computational tasks even a calculator from the 80s can perform.  DeepMind, the highly secretive startup that Google acquired early this year for $400 million, is looking to change that with its Neural Turing Machine.

A neural network that's also capable of carrying out basic computer functions, it's basically the early beginnings of an artificial human brain.  Yep, we're now totally going to build robots with the sophistication of the organ sitting in our noggins.  Absolutely nothing can go wrong with that.

Based on pioneering computer scientist Alan Turing's Turing machine architecture, the Neural Turing Machine is a prototype computer that can perform high-level pattern recognition as well as the best neural networks, all while being able to do normal computer things, such as copying information, sorting data, and performing associative recall.  It achieves this by coupling a neural network to external memory resources, which it interacts with using attentional processes.  The memory acts as a ticker tape of sorts, passing information back and forth with the neural network, similar to a human brain's short-term memory.
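For the curious, here's a rough sketch of what that attention-based memory access looks like in plain Python/NumPy.  This is an illustration under our own assumptions (made-up memory sizes and function names, not DeepMind's actual code): the network emits a key, each memory row is scored by similarity to that key, and a softmax turns the scores into soft weights so every location gets read and written a little, which is what keeps the whole thing trainable.

```python
import numpy as np

def cosine_similarity(key, memory):
    # Similarity between the controller's key vector and each memory row.
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    return memory @ key / norms

def content_addressing(key, beta, memory):
    # A softmax over similarities gives a soft "attention" weighting over
    # memory rows; beta sharpens or flattens the focus.
    scores = beta * cosine_similarity(key, memory)
    scores -= scores.max()                      # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum()

def read(memory, weights):
    # The read vector is a weighted sum of memory rows.
    return weights @ memory

def write(memory, weights, erase, add):
    # Each row is partially erased and then added to, in proportion to its weight.
    memory = memory * (1 - np.outer(weights, erase))
    return memory + np.outer(weights, add)

# Toy usage: a 6-row memory of 4-dimensional vectors (dimensions are arbitrary).
M = np.random.randn(6, 4)
key = np.random.randn(4)
w = content_addressing(key, beta=5.0, memory=M)
r = read(M, w)
M = write(M, w, erase=np.ones(4) * 0.5, add=np.random.randn(4))
```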

This new component lets the computer learn new behaviors without requiring any new programming: a machine that can, basically, pick up new skills on its own.  In the study, for instance, the researchers trained the system to copy a data set.  Once it knew that, they asked it to reproduce the same data in a specific order, which it was able to do with help from its newfound short-term memory.  In other words, it can learn new algorithms, provided it sees enough examples of similar processes.  They achieved similar results when teaching the computer to sort and recall data.
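To give a flavor of what "learning from examples" means for the copy task: the network is never handed a copy algorithm, it only sees lots of pairs of random input sequences and their copied-out targets.  The toy generator below is a hypothetical illustration of that kind of training data (our own simplified setup, not the paper's exact one).

```python
import numpy as np

def copy_task_example(seq_len, width=8):
    # A random binary sequence followed by a delimiter flag; the target is to
    # reproduce the sequence after the delimiter.  The network learns the
    # copying behavior purely from many (input, target) pairs like this one.
    seq = np.random.randint(0, 2, size=(seq_len, width)).astype(float)
    inp = np.zeros((2 * seq_len + 1, width + 1))
    inp[:seq_len, :width] = seq
    inp[seq_len, width] = 1.0            # delimiter flag
    target = np.zeros((2 * seq_len + 1, width))
    target[seq_len + 1:, :] = seq        # the copy appears after the delimiter
    return inp, target

x, y = copy_task_example(seq_len=5)
```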

You can read the full paper at the link below.

Check It Out