Much attention has been given to the exploitation of big data within the enterprise, but little focus has been paid to how these mega-data applications can be used effectively once they have been built. In most cases, enterprises are left to architect an analytics environment that allows data scientists to query these data stores. In other words, rather than democratizing data and using it to inform and enable decision making at every level, we are creating a new IT ivory tower: one staffed with specialists in white lab coats whom we depend on to formulate appropriate queries on our behalf.
Of course, analytic software applications exist that enable general queries in a big data application environment. Many of these are quite intuitive; however, many of the most important enterprise business and networking-related decisions do not lend themselves to the sort of simple questioning that most of these analytics engines support. Instead, what's needed is an analytics and big data tool designed to apply all of the data available to the particular problem at hand -- a tool that can also exploit data that is ambiguous or fuzzy. This is the realm of cognitive computing: using an intelligent engine to draw conclusions from massive data sets.
The answer is Watson. What's the question?
IBM, of course, has been considering the application of cognitive computing to big data analysis for some time. In 2011, it demonstrated its Watson cognitive computing technology on the game show Jeopardy. Watson, playing at an expert level, defeated its two human opponents. How? By considering the question and answer pairs in the context of a massive database that had been created to support the exercise.
Since then, IBM has further developed Watson into a cloud-based capability that can be called by applications that need to quickly find answers in complex data sets. It supports a natural language front end and can be used to analyze relationships in large data collections. In particular, its Watson Explorer and Watson Analytics applications allow decision makers to consider all of the data available in a particular domain to inform their decisions. Rather than paralysis through analysis, Watson enables analytical excellence by leveraging all of the big data application assets of the enterprise.
At a recent World of Watson event, IBM demonstrated the value of using all available data to solve a problem. In this case, IBM showed how Watson is improving cancer treatments by matching a patient's genetic information to potential treatment options. Because the volume of medical research is growing exponentially, this task is extremely complex and virtually impossible for individual physicians to manage. Watson, though, can quickly conduct the necessary matching, make recommendations to the treating physician and even learn from the physician when a treatment option is selected. It gets better over time.
One avenue that illustrates the value of cognitive computing
IBM has made Watson Analytics available to developers and enterprise IT through an online portal that provides free access to explore its capabilities. For actual applications, IBM sells enterprises and IT organizations access to its Watson cloud environment through subscriptions. Additionally, IBM just announced a new hybrid cloud product for enterprises that wish to maintain their computing environment on premises.
Watson, of course, is not the only approach to analytics and big data cognitive computing. There are many artificial intelligence start-ups exploring this space. SmartAction, a developer of artificially intelligent interactive voice response tools, for example, is tackling the customer calling center space with its IVA platform. Additionally, the Cognitive Systems Institute is a great resource to track developments in this space.
Watson, however, stands apart because it is the first attempt at a market-ready cognitive engine designed for general purpose use. And with open APIs, it is clearly aimed at creating a cognitive ecosystem that will ultimately drive a new wave of rational computing.
Enterprise IT professionals need to become familiar with cognitive computing, advanced analytics and artificial intelligence technology. Data growth is accelerating exponentially, with the potential for more than 20 zettabytes (20 trillion gigabytes) of enterprise data by the year 2020, according to one Frost & Sullivan Stratecast forecast. Simply storing this data will be challenging; analyzing it using tools like spreadsheets will be virtually impossible. More advanced tools will be essential.
Nevertheless, implementing cognitive computing is not trivial. Developing the infrastructure to support it is not something the average IT shop should attempt without help. Even Watson comes with a heavy dose of upfront professional services to define the use case being supported and to fine-tune the approach. As part of this process, the business case is tied to business metrics so that, once the application is implemented, ROI can be determined. As with any technology implementation, business metrics are essential: It is difficult to gauge the impact of a new technology without understanding what it is being compared to.
A legitimate question might be whether all of the upfront effort is worth the return. Stratecast believes that in each industry vertical, the company that masters cognitive computing to enable its business will acquire an almost insurmountable competitive advantage. Cognitive computing is transformational: It will redefine the competitive landscape. It is worth the effort.