Stratecast, where I work, is very interested in these questions. Traditional, siloed enterprise IT is quickly giving way to virtualized IT. Commodity programs handle most routine computing tasks, while IT strategists increasingly focus on implementing new, compute-heavy architectures built on huge collections of structured and unstructured data. Increasingly, the computing and data storage demands are so great they can only live comfortably, and economically, in the cloud. But why do it in the first place?
Indeed, the question of why a company would want to deploy these new technologies stumps most of our survey respondents. This seems odd, since IT professionals are asking management to fund these initiatives without solid answers to such basic questions as how the new architecture will be secured. Stratecast believes the answer is pretty straightforward: The reason companies do big data, cloud and virtualized IT is to accelerate and optimize business decisions. The reason is decision support.
The quicker the decision, the better
In today's hypercompetitive markets, making quick, accurate business decisions is the most valuable capability a company can have. Being able to identify threats and opportunities at the speed of the market -- while exploiting all the data assets the corporation possesses -- is the new competitive advantage. Practically everything else can be outsourced.
With the advent of automated data processing and big computers, it became a common refrain of decision-makers that there could never be too much data applied to a decision. In many cases, however, that resulted in analysis paralysis. Decisions could theoretically be postponed forever, because one could always find more data to throw at a particular problem. By making every shred of data available for decision support, big data was supposed to solve the logjam, yet that isn't what occurred.
Instead, big data gave rise to advanced analytics, which were required to make sense of the huge data lakes that enterprises were amassing. To make sense out of the advanced analytics, a new kind of IT professional was required: a data scientist. Data scientists were to return us to the old days of IT professionals in white lab coats and large machine rooms. What could be more perfect?
Making big data available to all
Waiting for an IT organization to build big data queries and then interpret the outputs using statistical routines is a nonstarter. Businesses that invest in the new IT technologies want a quick return on their investment; they want a measurable increase in decision accuracy and speed. In organizations that don't track metrics like these and aren't working to adopt a decision support model, the new IT can be a hard sell.
For today's IT professionals, the new objective must be to make the new technologies encompassing virtualized IT available to company decision-makers, but without the delay that white lab coats usually introduce into a process. IT, to be valuable, needs to be quick, effective and accurate. IT organizations contemplating big data architectures and advanced analytics must keep in mind that the only reason a company would fund such technology is if it can prove the business will be better off for doing so.