Posted on 13/Apr/2015

Cognitive Computing is going to transform and improve our lives. But it also presents challenges that we need to be conscious of in order to make the best use of this technology.

Co-authored by: Mateo Valero and Jordi Torres

Big Data technology allows companies to gain the edge over their business competitors and, in many ways, to increase customer benefits. For customers, the influences of big data are far reaching, but the technology is often so subtle that consumers have no idea that big data is actually helping make their lives easier.

For instance, in the online shopping arena, Amazon’s recommendation engine uses big data and its database of around 250 million customers to suggest products by looking at previous purchases, what other people looking at similar things have purchased, and other variables.

Amazon is also developing a new technology that predicts which items you might want, based on the factors mentioned above, and ships them to your nearest delivery hub, meaning faster deliveries for us.

To do so, they use predictive models: collections of mathematical and programming techniques that analyze historical and current data to estimate the probability of future outcomes.
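As a toy illustration of the idea, a recommendation-style predictive model can be as simple as counting co-occurrences in purchase histories. The data and item names below are hypothetical; this is a minimal sketch, not how Amazon's engine actually works.

```python
# Minimal sketch of a predictive model: estimate the probability that a
# customer who bought one item will also buy another, from (hypothetical)
# historical purchase data.
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories: one set of items per customer.
histories = [
    {"camera", "tripod", "sd_card"},
    {"camera", "sd_card"},
    {"camera", "tripod"},
    {"tripod", "sd_card"},
]

item_counts = Counter()
pair_counts = Counter()
for basket in histories:
    item_counts.update(basket)
    # Count every unordered pair of items bought together.
    pair_counts.update(frozenset(p) for p in combinations(sorted(basket), 2))

def p_buys(target, given):
    """P(customer buys `target` | they bought `given`), estimated from history."""
    return pair_counts[frozenset((target, given))] / item_counts[given]

print(p_buys("sd_card", "camera"))  # 2 of 3 camera buyers also bought an sd_card
```

Real systems replace these raw counts with far richer models, but the principle is the same: past behavior is turned into an estimated probability of a future event.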

Today, predictive models form the basis of many of the things that we do online: search engines, computer translation, voice recognition systems, etc. Thanks to the advent of Big Data these models can be improved, or “trained”, by exposing them to large data sets that were previously unavailable.

And it is for this reason that we are now at a turning point in the history of computing. Throughout its short history, computing has undergone a number of profound changes with different computing waves.

In its first wave, computing made numbers computable.

The second wave made text and rich media computable and digitally accessible. We are now experiencing a third wave that will also make context computable: systems that embed predictive capabilities will provide the right functionality and content at the right time, for the right application, by continuously learning about their users and predicting what they will need.

For example, such systems can identify and extract context features, such as the hour, location, task, history or profile, to present an information set that is appropriate for a person at a specific time and place.
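To make the idea concrete, here is a deliberately naive sketch of context-aware selection. The rules, locations and content labels are all invented for illustration; a cognitive system would learn such mappings from data rather than have them hard-coded.

```python
# Toy sketch of context-aware content selection (hand-written rules,
# purely illustrative): map context features to the information to surface.

def select_content(hour, location):
    """Return the most relevant content type for a given context."""
    if location == "office" and 9 <= hour < 18:
        return "work_dashboard"       # working hours at the office
    if location == "home" and hour >= 19:
        return "evening_news"         # evening at home
    if location == "commute":
        return "podcast_suggestions"  # on the move
    return "general_feed"             # no strong signal

print(select_content(10, "office"))  # work_dashboard
print(select_content(21, "home"))    # evening_news
```

The point of the next wave is precisely that these if-statements disappear: the system infers the mapping from each user's history instead of a programmer writing it out.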

The general idea is that instead of instructing a computer what to do, we are going to simply throw data at the problem and tell the computer to figure it out itself.

We have changed the nature of the problem from one in which we tried to explain to the computer how to drive, to one in which we say: "Here is a lot of data; figure out how to drive yourself." To this end, the software mimics functions of the brain, such as inference, prediction, correlation and abstraction, giving systems the ability to work these things out by themselves. Hence the use of the word "cognitive" to describe this new kind of computing.
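The contrast between instructing and learning can be shown in a few lines. Instead of writing rules, we hand the program labelled examples and let it generalize. The data below is hypothetical, and a 1-nearest-neighbour classifier is about the simplest possible instance of learning from data.

```python
# Minimal "learn from data" sketch: no hand-coded rules, just labelled
# examples. A 1-nearest-neighbour classifier predicts the label of the
# training point closest to the query.

def nearest_neighbour(train, query):
    """Return the label of the training example nearest to `query`."""
    def dist2(a, b):
        # Squared Euclidean distance between feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda example: dist2(example[0], query))
    return label

# Hypothetical labelled examples: (features, label).
train = [((1.0, 1.0), "slow"), ((1.2, 0.9), "slow"),
         ((8.0, 9.0), "fast"), ((9.1, 8.5), "fast")]

print(nearest_neighbour(train, (1.1, 1.0)))  # slow
print(nearest_neighbour(train, (8.5, 9.0)))  # fast
```

Nothing in the code says what makes something "slow" or "fast"; the distinction lives entirely in the data, which is the essential shift this wave of computing represents.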

These reasoning capabilities, data complexity, and time to value expectations are driving the need for a new class of supercomputer systems such as those investigated in our research group in Barcelona.

Continuous development of supercomputing systems is required to enable the convergence of advanced analytic algorithms and big data technologies, driving new insights from the massive amounts of available data. We will use the term "Cognitive Computing" (others use Smart Computing, Intelligent Computing, etc.) to label this new line of computing research.

We can find different examples of the strides made by cognitive computing in industry. The accuracy of Google’s voice recognition technology, for instance, improved from 84 percent in 2012 to 98 percent less than two years later. DeepFace technology from Facebook can now recognize faces with 97 percent accuracy.

IBM was able to double the precision of Watson's answers in the few years leading up to its famous victory in the quiz show Jeopardy!. This is a very active field.

From 2011 through May 2014, over 100 companies in the area merged or were acquired. During the same period, over $2 billion in venture capital was invested in companies building cognitive computing products and services.

Cognitive Computing will improve our lives. Healthcare organizations are using predictive modeling to assist in diagnosing patients and identifying risks associated with care, and farmers are using predictive modeling to manage and protect crops from planting through harvest.

But there are problems that we need to be conscious of. The first is the prospect of being controlled by algorithms that can predict what we are about to do.

Privacy was the central challenge of the second wave. In the coming wave of Cognitive Computing, the challenge will be safeguarding free will. After the Snowden revelations, we realize how easy it is to abuse access to data.

There is another problem. Cognitive Computing is going to challenge white-collar, professional knowledge work in the 21st century in the same way that factory automation and the assembly line challenged blue-collar labor in the 20th century.

For instance, one of Narrative Science's co-founders estimates that 90 percent of news could be algorithmically generated by the mid-2020s, much of it without human intervention. And researchers at Oxford have published a study estimating that 47 percent of total US employment is "at risk" due to the automation of cognitive tasks.

Cognitive Computing is going to transform how we live, how we work and how we think, and that’s why Cognitive Computing will be a big deal. Cognitive computing is a powerful tool, but a tool nevertheless – and the humans wielding the tool must decide how to best use it.


photo credits: Robert Course-Baker