The death of well-known application problems

Jasmine Noel
There is a lot of buzz about utility computing and how it makes IT hardware more dynamic and adaptable to business needs. However, the thing that most utility computing vendors fail to recognize is that hardware capacity is not the only part of the infrastructure stack that is becoming dynamic. Applications are changing more frequently with the advent of modular application architectures and off-the-shelf application servers. Integration is becoming dynamic as service oriented architectures gain popularity. Business processes are becoming more dynamic as process modeling tools mature. In other words, the entire business computing stack is becoming more flexible and changeable.

With all of this flexibility and change happening, how on earth can service-level and application management vendors assume that IT will continue to have well-known applications with well-known problems whose automated resolutions can be cast in concrete? Any solution that makes this assumption will ultimately fail: it will not be able to deal with a dynamic infrastructure stack.

Instead, what is needed is an application management solution that embeds learning in every part of the problem resolution process. For monitoring, we need to learn performance baselines, not set static thresholds. For analysis, we need to learn the current infrastructure dependency map, not rely on a manual map created a month ago. For planning, we need to learn the actual configuration and capacity allocation in real time, not rely on the original deployment specifications. For execution, we need to learn whether the applied fix provided a systemic resolution or only temporary relief.
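The first of these ideas, learning a performance baseline instead of setting a static threshold, can be sketched with a few lines of code. This is an illustrative sketch only, not any vendor's actual algorithm: it assumes a simple exponentially weighted moving average as the "learned" baseline, and all class and parameter names here are hypothetical.

```python
class BaselineMonitor:
    """Illustrative sketch: learn a performance baseline from observed
    samples rather than comparing against a fixed threshold.
    All names and parameters are hypothetical, not a vendor API."""

    def __init__(self, alpha=0.1, tolerance=3.0, warmup=10):
        self.alpha = alpha          # smoothing factor for the moving averages
        self.tolerance = tolerance  # alert when a sample deviates this many
                                    # mean absolute deviations from baseline
        self.warmup = warmup        # samples to observe before alerting
        self.baseline = None        # learned mean of the metric
        self.deviation = 0.0        # learned mean absolute deviation
        self.count = 0

    def observe(self, sample):
        """Fold one sample into the baseline; return True if anomalous."""
        self.count += 1
        if self.baseline is None:
            self.baseline = sample
            return False
        error = abs(sample - self.baseline)
        anomalous = (self.count > self.warmup
                     and self.deviation > 0
                     and error > self.tolerance * self.deviation)
        # Keep learning continuously, so the baseline tracks an
        # environment that changes instead of freezing at deploy time.
        self.baseline += self.alpha * (sample - self.baseline)
        self.deviation += self.alpha * (error - self.deviation)
        return anomalous
```

A static threshold fixed at deploy time would fire spuriously whenever normal load drifts; here the baseline and its expected variability are relearned on every sample, so only deviations that are large relative to recent history raise an alert. For example, a monitor fed response times around 100 ms stays quiet, while a sudden 500 ms sample is flagged.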

When solutions can provide that level of learning out of the box then we will have truly automated management for dynamic application environments.

Large management vendors are busy working on next-generation solutions that provide this kind of learning. Last week, Tivoli upgraded its Provisioning and Orchestration solutions to include storage devices. HP recently demonstrated beta software that links performance analysis with real-time capacity planning. BMC announced its strategic roadmap to deliver similar capabilities from acquired technologies. Smaller startups such as Vieo, Quantiva and n-Layers are betting on statistical analysis to provide the missing link between performance analysis and real-time capacity provisioning. Ptak, Noel & Associates believes all of these efforts are strong steps in the right direction. This is a good thing, because flexibility without control is worse than useless -- it is downright dangerous.


Jasmine Noel is Founder and Partner of Ptak, Noel & Associates. A recognized expert in infrastructure management, Jasmine served previously as director of systems and applications management at Hurwitz Group. She was also a senior analyst at D.H. Brown Associates, where her responsibilities included technology trend analysis in the network and systems management space.

This was first published in July 2004
