How Cognitive Computing Will Rethink Analytics

The future of business intelligence is going to look a lot different than it does today. Last week, IBM announced a breakthrough with its new “SyNAPSE Chip,” and in doing so opened the door for discussions about the future of computing that would have been considered ‘far-fetched’ even just one year ago.

The SyNAPSE Chip is the product of nearly 10 years of research under the auspices of the Defense Advanced Research Projects Agency (DARPA). It represents a very important innovation: replicating the brain’s neural networks through heavily interconnected transistors and emphasizing important functions such as pattern recognition. I’ve written before about how these networks are the key to our brain’s speed and efficiency. They are what allow us to have supercomputer-level processing power in a form factor small enough to fit in our heads.¹
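As a point of reference, here is a minimal Python sketch (my own illustration, not IBM’s design) of the software abstraction the chip draws on: a tiny network of interconnected units that learns a simple pattern (XOR) by adjusting its connection strengths. The SyNAPSE chip’s spiking-neuron hardware is engineered very differently, but the underlying idea of pattern recognition emerging from weighted interconnections is the same.

import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy pattern-recognition task: learn XOR, which no single-layer network can solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# The weights are the "interconnections"; one hidden layer of 4 units.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

for _ in range(5000):
    hidden = sigmoid(X @ W1 + b1)        # forward pass
    output = sigmoid(hidden @ W2 + b2)
    # Backpropagate the prediction error and nudge every connection's weight.
    d_out = (y - output) * output * (1 - output)
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 += 0.5 * hidden.T @ d_out
    b2 += 0.5 * d_out.sum(axis=0)
    W1 += 0.5 * X.T @ d_hid
    b1 += 0.5 * d_hid.sum(axis=0)

print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))  # approaches [[0], [1], [1], [0]]

A few dozen lines in software, in other words, can mimic what the brain does natively in hardware; the chip’s achievement is doing it at scale and at a tiny fraction of the power budget.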

These types of advances have made possible the beginning of a transition away from traditional programmatic computing toward a newer arena of cognitive computing. In essence, it is the transition to enabling computers to ‘think’ or ‘reason’ in a way similar to the human mind. IBM has been the most visible innovator in this arena, championing the advent of cognitive computing and committing to growing its Watson unit into a $10 billion business.

Of course, IBM is not alone. Project Adam represents Microsoft’s own impressive endeavors in machine learning and artificial intelligence; it made headlines recently when Microsoft demonstrated its ability to successfully identify the breed of a dog from an image database. Google’s ‘deep learning’ initiatives are similarly impressive, as the work dubbed by many as ‘Google Brain’ has already demonstrated an ability to reason through complex tasks that were once squarely out of the realm of machine possibility. HP Labs has showcased interesting advances in its own cognitive computing efforts, an investment that should see significant synergies from its next-generation supercomputer, “The Machine.”

So what does all this mean for analytics? For starters, it means a fundamental change in how we interact with data.

Up front, it is important to note that cognitive computing and analytics are not synonymous. Rather, cognitive computing and all of its semantic iterations (artificial intelligence, deep learning, etc.) represent a broader set of capabilities that happen to have a profound impact on analytics. Analytics solutions are the tools that provide the answers to our questions, while cognitive computing is something those tools can leverage to arrive at the answers.


Consider the state of analytics as we know it today. We access data through queries, searches, menus, or other such commands. A major battleground for analytics software vendors has been the user interface. There is a constant tug of war between simplifying the user experience and preserving the robustness of capabilities. Millions of dollars are staked on delivering the ever-important, yet often elusive, “intuitive” design.

But what if the user interface of the future is our own voice?

I’ll be tackling the opportunities associated specifically with natural language processing in my next blog post, but I will say that this is a monumental opportunity to transform how we interact with data. Computers with the ability to reason present the chance for data models to be built without the human guidance that is usually required. Delivering data within the right context is a difficult task, but one that cognitive computing may be able to accomplish if it can mimic human intuition for data manipulation.

Consider the real-life example of the ‘expert personal shopper’ that The North Face is creating with Watson. This intelligent assistant is able to answer online shoppers’ questions, ask its own qualifying questions, and suggest products that match their personal situations. Now, what if we take this example and apply it to analytics?

In today’s world, an executive’s ability to ask data-driven questions may be limited by either their own ability to navigate existing business intelligence software or the availability of their team to perform analysis for them. While not every executive is tech-savvy enough to perform their own queries, they certainly are able to ask questions such as “which location was our best-performing store last year?”

If solutions could address this type of question, that alone would represent a massive increase in the accessibility of analytics. There is an opportunity for contextualized and well-reasoned answers to be delivered to users in a way that could never be done before.
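To make the idea concrete, here is a deliberately naive Python sketch of such a natural-language front end. Every store name, figure, and the single hard-coded question pattern below are invented stand-ins of my own; a genuine cognitive system would parse intent statistically rather than by string matching.

import pandas as pd

# Hypothetical sales data; all names and figures are invented for illustration.
sales = pd.DataFrame({
    "store":   ["Boston", "Austin", "Denver", "Boston", "Austin", "Denver"],
    "year":    [2013, 2013, 2013, 2014, 2014, 2014],
    "revenue": [1.2e6, 0.9e6, 1.5e6, 1.4e6, 1.1e6, 1.3e6],
})

def answer(question: str, current_year: int = 2014) -> str:
    """Map one hard-coded question pattern to a query; a stand-in for real NLP."""
    q = question.lower()
    if "best performing store" in q and "last year" in q:
        last_year = sales[sales["year"] == current_year - 1]
        best = last_year.loc[last_year["revenue"].idxmax()]
        return f"{best['store']} (revenue: ${best['revenue']:,.0f})"
    return "Sorry, I can't answer that question yet."

print(answer("Which location was our best performing store last year?"))
# -> Denver (revenue: $1,500,000)

The hard part, of course, is everything this toy skips: understanding an unbounded range of phrasings, choosing the right data model, and supplying the context behind the answer. That is precisely the gap cognitive computing aims to close.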

IBM is uniquely positioned to deliver on these ideas, as it not only has made significant developments in cognitive computing but also boasts a deep portfolio of its own analytics and business intelligence offerings. Microsoft is also in a relatively unique spot to take the lead on this trend, as its deep learning investments could, in the future, be paired with its existing business intelligence suite, Power BI. Still, more specialized players such as DataRPM, which hang their hats squarely on being providers of cognitive analytics, are beginning to step into the fold. Expect the space to heat up as the value proposition for businesses to massively distribute insight evolves from a cutting-edge “nice to have” into a fundamental expectation.

To be sure, much of the potential promised by cognitive analytics is still in its infancy, but the future is beginning to look a whole lot closer. As the convergence of cognitive computing and analytics marches on, the line separating the realms of reality and fantasy will continue to be redrawn.


¹ This is an accomplishment made even more impressive by the fact that the brain runs on little more electricity than a low-wattage light bulb. Contrast this with room-sized supercomputers that use as much electricity as thousands of homes, and you can’t help but be impressed by the brain’s efficiency.

About James Haight

James Haight is a principal analyst at Blue Hill Research focusing on analytics and emerging enterprise technologies. His primary research includes exploring the business case development and solution assessment for data warehousing, data integration, advanced analytics and business intelligence applications. He also hosts Blue Hill's Emerging Tech Roundup Podcast, which features interviews with industry leaders and CEOs on the forefront of a variety of emerging technologies. Prior to Blue Hill Research, James worked in Radford Consulting's Executive and Board of Director Compensation practice, specializing in the high tech and life sciences industries. Currently he serves on the strategic advisory board of the Bentley Microfinance Group, a 501(c)(3) non-profit organization dedicated to community development through funding and consulting entrepreneurs in the Greater Boston area.
Posted on August 19, 2014 by James Haight
