January 23, 2018

AI Definitions: Machine Learning vs. Deep Learning vs. Cognitive Computing vs. Robotics vs. Strong AI….


AI is the compelling topic of tech conversations du jour, yet within these conversations confusion often reigns – confusion caused by loose use of AI terminology.

The problem is that AI comes in a variety of forms, each with its own distinct range of capabilities and techniques, and each at its own stage of development. Some forms of AI that we frequently hear about, such as Artificial General Intelligence (the kind of AI that might someday automate all work and that we might lose control of), may never come to pass. Others are already doing useful work and are driving growth in the high-performance sector of the technology industry.

These definitions aren’t meant to be the final word on AI terminology; the industry is growing and changing so fast that terms will change and new ones will be added. Instead, this is an attempt to frame the language we use now. We invite your feedback in the hope of encouraging discussion and greater clarity, and we plan to update this list over time.

Our source for all but the last of these definitions is a company well versed in AI: Pegasystems, which has developed operations and customer engagement software for more than 30 years and studies the implications and impacts of AI in the workplace.

Artificial Intelligence, in Pegasystems’ definition, “is a broad term that covers many sub-fields of computer science that aim to build machines that can do things that require intelligence when done by humans. These sub-fields include:

Machine learning – rooted in statistics and mathematical optimization, machine learning is the ability of computer systems to improve their performance by exposure to data without the need to follow explicitly programmed instructions. Machine learning is the process of automatically spotting patterns in large amounts of data that can then be used to make predictions.
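To make the idea concrete (this illustration is ours, not Pegasystems’), the short Python sketch below uses the scikit-learn library to learn patterns from labeled examples and then scores its predictions on data it has never seen; no classification rules are written by hand.

```python
# A minimal sketch of supervised machine learning: the model improves by
# being exposed to labeled data rather than following explicit instructions.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)              # flower measurements and species labels
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000)      # no hand-written classification rules
model.fit(X_train, y_train)                    # patterns are learned from the data

print("held-out accuracy:", model.score(X_test, y_test))
```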

Deep learning – this is a relatively new and hugely powerful technique that involves a family of algorithms that processes information in deep “neural” networks where the output from one layer becomes the input for the next one. Deep learning algorithms have proved hugely successful in, for example, detecting cancerous cells or forecasting disease but with one huge caveat: there’s no way to identify which factors the deep learning program uses to reach its conclusion.
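A bare-bones sketch of that “layers feeding layers” idea, using only NumPy: the weights below are random and untrained, purely to show how the output of one layer becomes the input to the next.

```python
# A minimal sketch of a deep feedforward network. Weights are random here,
# only to illustrate the data flow; real networks learn them from data.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)                  # a common nonlinearity between layers

layer_sizes = [8, 16, 16, 3]                   # input -> two hidden layers -> output
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

x = rng.standard_normal((1, 8))                # one example with 8 input features
for W in weights:
    x = relu(x @ W)                            # this layer's output feeds the next layer

print("network output:", x)
```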

Computer vision – the ability of computers to identify objects, scenes and activities in images, using techniques that decompose the task of analyzing images into manageable pieces: detecting the edges and textures of objects in an image and comparing images to known objects for classification.
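As an illustration of one of those low-level pieces, edge detection, the sketch below (ours, not Pegasystems’) runs a Sobel filter over a synthetic image of a bright square so the example has no file dependencies.

```python
# A minimal sketch of edge detection, one building block of computer vision.
import numpy as np
from scipy import ndimage

image = np.zeros((64, 64))
image[16:48, 16:48] = 1.0                 # a bright square: its border is the object "edge"

gx = ndimage.sobel(image, axis=0)         # horizontal intensity changes
gy = ndimage.sobel(image, axis=1)         # vertical intensity changes
edges = np.hypot(gx, gy)                  # edge strength at each pixel

print("strongest edge response:", edges.max())
print("edge pixels found:", int((edges > edges.max() * 0.5).sum()))
```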

Natural language/speech processing – the ability of computers to work with text and language the way humans do, for instance, extracting meaning from text/speech or even generating text that is readable, stylistically natural, and grammatically correct.
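A toy sketch of extracting one simple kind of meaning, sentiment, from text with scikit-learn; the four training sentences below are invented purely for illustration, and a real system would train on far more data.

```python
# A minimal sketch of natural language processing: raw text is turned into
# numeric word counts, and a classifier learns word/sentiment associations.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts  = ["great product, works well", "terrible support, very slow",
          "love the new interface",    "awful experience, would not recommend"]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)                           # learn from the tiny labeled corpus

print(model.predict(["the interface is great"]))   # -> ['positive'] on this toy data
```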

Cognitive computing – a relatively new term, favored by IBM, cognitive computing applies knowledge from cognitive science to build an architecture of multiple AI subsystems – including machine learning, natural language processing, vision, and human-computer interaction – to simulate human thought processes with the aim of making high level decisions in complex situations. According to IBM, the aim is to help humans make better decisions, rather than making the decisions for them.
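One way to picture that multi-subsystem architecture is the hypothetical decision-support sketch below: hard-coded stand-ins for a language, vision and prediction subsystem are combined into a recommendation that a human reviews, rather than an automatic decision. Every field name and weight here is invented for illustration.

```python
# A hypothetical sketch of combining several AI subsystem outputs into a
# recommendation for a human reviewer (e.g., an insurance claim handler).
def recommend(case):
    signals = {
        "nlp_sentiment": case["nlp_sentiment"],    # stand-in for a language-processing subsystem
        "vision_damage": case["vision_damage"],    # stand-in for a computer-vision subsystem
        "ml_risk_score": case["ml_risk_score"],    # stand-in for a predictive model
    }
    score = (0.4 * signals["ml_risk_score"]
             + 0.3 * (1.0 if signals["vision_damage"] else 0.0)
             + 0.3 * (1.0 if signals["nlp_sentiment"] == "negative" else 0.0))
    action = "escalate to senior adjuster" if score > 0.5 else "fast-track"
    return {"suggested_action": action, "supporting_signals": signals}

print(recommend({"nlp_sentiment": "negative", "vision_damage": True, "ml_risk_score": 0.7}))
```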

Robotic Process Automation (RPA) – computer software that is configured to automatically capture and interpret existing applications for processing a transaction, manipulating data, triggering responses and communicating with other digital systems. The key difference…from enterprise automation tools like business process management (BPM) is that RPA uses software or cognitive robots to perform and optimize process operations rather than human operators.”
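A deliberately simple sketch of that idea: a software “robot” applies a configured rule to each transaction and notifies other systems, with no human operator in the loop. The record fields, approval limit and notify step are all invented for illustration.

```python
# A minimal sketch of robotic process automation: rule-driven transaction
# handling with responses sent to other (simulated) digital systems.
transactions = [
    {"id": "T-1001", "amount": 420.00, "approved": None},
    {"id": "T-1002", "amount": 9800.00, "approved": None},
]

APPROVAL_LIMIT = 5000.00   # assumed business rule

def notify(system, message):
    # stand-in for communicating with another digital system (email, API, queue)
    print(f"[{system}] {message}")

for txn in transactions:
    txn["approved"] = txn["amount"] <= APPROVAL_LIMIT      # interpret and act on the data
    target = "ERP" if txn["approved"] else "review-queue"  # trigger the appropriate response
    notify(target, f"{txn['id']} processed, approved={txn['approved']}")
```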

Artificial general intelligence (AGI) – this is a futuristic term applied to the potential for machines to “successfully perform any intellectual task that a human being can.” Also known as “strong AI,” “super-intelligent AI” and “full AI,” the definition typically encompasses powers of intuition, emotion and aesthetic discernment – or, in a word, consciousness. Related to AGI is “the singularity,” another futuristic concept around the idea that AGI will trigger “runaway technological growth…, a ‘runaway reaction’ of self-improvement cycles…resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.” AGI contrasts with “applied AI,” “narrow AI” and “weak AI,” which is AI limited in scope to handling a specific task or problem.

Whether AI, broadly defined, remains applied/narrow/weak, as it is today, or becomes general/strong/super/full is the great technology debate of our time.

Related Items:

Machine Learning, Deep Learning, and AI: What’s the Difference?

5 Things AI Is Better At Than You

AI to Surpass Human Perception in 5 to 10 Years, Zuckerberg Says
