Technology & Innovation


AI finally comes of age (just don’t call it artificial intelligence)

Two new concepts in IT, cognitive and neuromorphic computing, may finally bring the AI fantasies of the past 50 years to life.

The history of attempts to reproduce human intelligence in machines is riddled with pitfalls. What used to be called artificial intelligence (AI) fell into disrepute in the 1970s, but it’s making a comeback. It’s been rebranded as “cognitive computing”, and it’s advancing very quickly.

Arguably the most powerful cognitive computer in the world today is IBM’s Watson. American TV viewers saw just how far artificial intelligence (sorry, cognitive computing) has come in February 2011, when Watson trounced two former champions of the game show Jeopardy!.

If Deep Blue thrashing Garry Kasparov in the late 1990s was impressive, then Watson’s defeat of Ken Jennings and Brad Rutter was much more so. Unlike chess, which a computer can play without knowing anything about the real world, Jeopardy! requires encyclopaedic knowledge and an ability to understand the nuances of natural language, including puns, metaphors and slang.

Cognitive computers draw most of their wizardry from two branches of computer science: natural language processing and machine learning.

Natural language processing helps computers understand and generate human language; Apple’s Siri uses it, as does speech-recognition software such as Dragon NaturallySpeaking. Machine learning allows computers to extract rules and patterns from data: the more data such a system is fed, the “smarter” it becomes.
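To make that concrete, here is a toy sketch in Python (assuming the scikit-learn library is installed, and bearing no resemblance to Watson’s actual pipeline) of a statistical model learning to associate words with topics from a handful of labelled sentences:

```python
# A toy illustration, not how Watson works: a statistical text classifier
# that "learns" topic labels from a few example sentences.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

training_sentences = [
    "the patient presented with a fever and a persistent cough",
    "the scan revealed a fracture in the left wrist",
    "the central bank raised interest rates by half a point",
    "quarterly profits fell short of analyst expectations",
]
labels = ["health", "health", "finance", "finance"]

# Turn raw text into word-frequency features, then fit a simple
# probabilistic model that associates those features with labels.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(training_sentences, labels)

# With this toy data the model should guess "health" for a new sentence.
print(model.predict(["the doctor prescribed antibiotics for the infection"]))
```

Feed it more, and more varied, sentences and its guesses become steadily less naive, which is the essence of the “smarter with more data” claim.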

Together, natural language processing and machine learning make cognitive computers seem almost human. There were even recent claims that a cognitive computer had duped 10 out of 30 judges at the Royal Society into believing it was a 13-year-old Ukrainian boy.

But there’s another group of researchers trying to build systems that do not merely behave as if a human brain were at work, but actually mimic the brain’s functionality, with its neurons, synapses and action potentials. These machines are called “neuromorphic computers”.
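For illustration only, here is a crude software sketch of the kind of spiking, “integrate-and-fire” neuron model that neuromorphic designs take as their inspiration. The parameter values are invented, and real chips implement this behaviour in silicon rather than in a Python loop:

```python
# A toy leaky integrate-and-fire neuron: a crude software stand-in for the
# spiking behaviour neuromorphic chips implement in hardware.
# All parameter values here are illustrative, not taken from any real chip.

def simulate_lif_neuron(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Integrate input over time; emit a 'spike' (an action potential)
    when the membrane potential crosses the threshold, then reset."""
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        potential = leak * potential + current   # leaky integration
        if potential >= threshold:               # threshold crossing
            spike_times.append(t)                # record the spike
            potential = reset                    # reset after firing
    return spike_times

# A weak, constant input: the neuron fires only occasionally and does
# nothing between spikes -- event-driven rather than clock-driven work.
print(simulate_lif_neuron([0.3] * 20))   # e.g. spikes at steps 3, 7, 11, ...
```

The point of the exercise is that nothing happens between spikes, which is where the energy savings discussed below come from.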

Steve Furber, a leader in the field of neuromorphic computing, says that a lot of machine learning has diverged quite a long way from biologically inspired models. “Watson is based on very deep statistical modelling—its link with neural systems is, at best, tenuous,” he says.

Companies such as Qualcomm, IBM and HRL are trying to build neuromorphic computers because Moore’s law, the principle that the computational capacity of processors doubles roughly every 18 months, will soon run out of steam. Heat dissipation is rapidly becoming the limiting factor for silicon chips. The human brain, on the other hand, is an astonishingly power-efficient computing device: its neurons fire at around 10 Hz, and it has a power density of roughly 10 milliwatts per square centimetre.

Dharmendra Modha, head of IBM’s neuromorphic computing programme, says the brain does what is necessary, when it is necessary, and only that which is necessary. “It doesn’t just run a clock,” he says. “That’s a very inefficient way to do things.”

Neuromorphic computer scientists may well sniff at power-hungry supercomputers, but while they tinker in their labs, cognitive computers are already delivering commercial applications. In January 2014, IBM announced the launch of the Watson Business Group, effectively putting the power of Watson in the hands of any organisation that wishes to use it, for a fee.

Organisations using Watson tend to come from highly data-driven sectors such as healthcare (WellPoint) and financial services (Citi). Watson can help clinicians diagnose patients more accurately, help insurance firms detect fraud by spotting anomalous patterns in petabytes of data, and help banks identify trading patterns.

In the meantime, Qualcomm and IBM are racing each other to be the first to deliver a commercially viable neuromorphic chipset and programming language. Exactly what the first use case will be remains to be seen, but most pundits are betting they will be the eyes and ears of robots and drones. For Mr Modha, they will provide a low-cost way of finding patterns in the mountains of data that will be generated by the Internet of Things.

Will neuromorphic and cognitive computing deliver the long-hoped for promise of artificial intelligence? Share your view over on the Future Realities LinkedIn group, sponsored by Dassault Systèmes.
