Recent research commissioned by BT found that Europe's top businesses are wasting more than three million working hours every year -- equating to approximately a quarter of a billion euros -- trying to get to the root of poor application performance. These are worrying statistics, given that business success relies heavily on mission-critical applications running smoothly.
As convergence becomes the norm and organizations start to run increasing numbers of applications over a single corporate network, the task of assuring application performance looks set to get even more complex. Each application has different operating metrics and requirements, and each has the potential to negatively impact the performance of another. With the rise of real-time applications, the effect becomes even more apparent -- voice and video, for example, can tolerate only minuscule delay before their quality deteriorates audibly and visibly, impacting the end-user's experience.
However, it is not just legitimate business practice that IT departments need to take into account. Rogue activity, such as excessive internet surfing and the downloading of music and film, can also have a detrimental effect, stealing away bandwidth from mission-critical applications.
Until now, the standard tactic has been to throw extra bandwidth at the problem of poor application performance. But with network prices beginning to stabilize and IT budgets under continued rigorous control, this cannot remain a viable long-term option.
Problem ownership is also an area of concern, particularly as organizations look to converge their voice and data networks. At present, when performance problems occur, it is all too easy for network managers and data managers to blame each other, resulting in a stand-off that helps nobody.
In addition, companies' operations and customers are becoming increasingly dispersed, so business success relies more heavily on the underlying information and communications technology (ICT) being failsafe. As a result, it is more important now than ever before for IT departments to get to grips with the activities taking place on their corporate networks.
One way this can be achieved is by centrally analyzing activity on the corporate network so that applications can be prioritized and managed more effectively to support the performance of the business as a whole -- and businesses are not short of solutions to choose from. There are hundreds of different network and application monitoring tools on the market, all of which promise to help IT departments gain better visibility and greater understanding of the traffic running across their corporate network.
However, our research suggests that over-stretched IT departments are finding it difficult to get to grips with this sort of specialized work. Over 60% are struggling to resolve issues around application performance due to a lack of resources (money, people or time) and 11% are short of the necessary skills and expertise.
The tricky part is not putting probes into a network to get a view of performance. The real challenge lies in interpreting the data collected, and in bringing network, application and consultancy skills together to boost the efficiency of the infrastructure and overall performance of the business.
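To illustrate the kind of interpretation involved, consider a minimal sketch of what a first pass over probe data might look like. The flow records and function names here are hypothetical examples, not any specific monitoring product: the idea is simply to aggregate per-application traffic so the heaviest bandwidth consumers stand out from the noise.

```python
from collections import defaultdict

# Hypothetical flow records as (application, bytes transferred) pairs,
# of the kind a network probe might export.
flow_records = [
    ("voip", 120_000),
    ("crm", 450_000),
    ("media_streaming", 2_300_000),
    ("crm", 600_000),
    ("media_streaming", 1_800_000),
    ("email", 300_000),
]

def top_talkers(records, n=3):
    """Aggregate bytes per application and return the n heaviest consumers."""
    totals = defaultdict(int)
    for app, nbytes in records:
        totals[app] += nbytes
    # Sort applications by total bytes, largest first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

for app, total in top_talkers(flow_records):
    print(f"{app}: {total / 1_000_000:.2f} MB")
```

The mechanics are trivial; the hard part the article describes is what comes next -- deciding whether a heavy consumer is a mission-critical application that needs protecting or rogue traffic that needs throttling, which is where combined network, application and consultancy skills come in.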
It is important that companies evaluate whether they have the required expertise to carry out this task in-house. If not, one solution could be to hand over application and network monitoring to a trusted third party who has the necessary resources, skills and expertise to turn the information captured into something meaningful.
What is more, with an economic upturn hopefully on the horizon, IT managers will be looking to spend their time exploiting new opportunities rather than acquiring new skills and fire-fighting.
To conclude, as organizations start to plan their convergence strategies, there has never been a better time for IT departments to de-clutter their corporate networks. However, the sheer choice of monitoring tools on the market and the complexity of interpreting the data collected mean many organizations will find it beneficial to work with a third-party provider who can help guide them through the maze. Ultimately, those businesses that get their infrastructure in order now will fare best in the future.
About the author:
Ivor Kendall, General manager of IP infrastructure, BT
Ivor Kendall is responsible for developing and marketing the portfolio of BT's IP products and services for the UK corporate market. With more than 20 years' experience in ICT sales and marketing and a rich legacy in IP, Ivor became a valuable addition to the BT team when he joined just over a year ago from Cable & Wireless. Ivor had been vice-president of Cable & Wireless's IP Convergence Centre of Excellence, where he was responsible for the development of a focused group of sales, marketing and engineering resources that supported the company's global IP proposition development and sales.