
Application delivery systems for the ROI-driven data center

As innovation and demand converge in the new ROI-driven data center, yesterday's load balancer is being supplanted by a rapidly evolving category, the application delivery system.

Prabakar Sundarrajan

The conventional enterprise data center typically employs five core categories of equipment: the router, the firewall (including intrusion-detection and protection systems), the Layer 2-3 switch, the traffic management system (load balancer or Layer 4-7 content switch), and the Web or application server. Of these established categories, traffic management is arguably evolving most rapidly. Enterprises' bottom-line demand for ROI through selective investments in IT infrastructure has fueled market growth, and vendors have responded with innovations in application-centric functionality, performance and application-layer security capabilities. In fact, industry analysts indicate that the long-awaited growth in the IT industry is largely being driven by the ROI associated with such innovation.

As vendor innovation and enterprise demand converge in the new ROI-driven data center, yesterday's load balancer or traffic management device is being supplanted by a rapidly evolving category, the application delivery system. These systems go beyond the legacy load balancer or content switch by offering additional capabilities in the areas of performance and security that are designed to further optimize and protect the delivery of enterprise ERP/MRP, CRM, e-business, databases, messaging, file access and other business-critical applications.

Optimization, switching, attack protection, secure access and more
Ensuring the efficiency, reliability, and security of applications delivered over IP-based networks introduces new requirements for optimization, switching, secure remote access, and attack protection. These new capabilities must enable enterprise applications to adapt to the uncertainties that lie beyond corporate network boundaries while achieving the right balance of security and performance.

However, optimization, switching, access and protection capabilities often are not incorporated within conventional traffic management, application switching, or host-based infrastructure. As a result, enterprise applications such as Oracle, PeopleSoft, and Outlook/Exchange often perform far worse across the Internet or the enterprise WAN than they do on the corporate LAN. Meanwhile, achieving the optimum combination of security, performance, and scalability continues to challenge organizations delivering business-critical applications over their extended networks.

Today's unified application delivery systems now also incorporate SSL VPN-based secure remote access capabilities, which provide fast access to all applications for remote users of all types. SSL VPNs can be configured to provide granular access to a specific set of applications without the expense and ongoing complexity of managing client software.

Let's consider the impact of an application delivery system on an enterprise messaging environment. Microsoft Exchange, for example, is a firewall-challenging product. To understand why, let's take a look at the process used to establish a connection between the Outlook client and Exchange server. The first step is a TCP connection to the Windows RPC port (TCP port 135). This connection is used to establish two additional TCP ports used for communication. The exact port numbers of the two connections are not known until this first connection from Outlook occurs. Security-conscious administrators should already be noting that the Windows RPC port is often used by hackers either seeking system access or trying to make the server unavailable through a denial of service (DoS) attack.

Once the two additional TCP ports are negotiated, Outlook initiates connections to them. All three TCP connections combined give Outlook full access to the Exchange server. The two TCP connections use port numbers anywhere between 1024 and 65535.
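To make the firewall implications concrete, the following minimal Python sketch models a static inbound rule table. The two "negotiated" port numbers in it are made up for illustration: the endpoint-mapper connection on TCP port 135 is easy to permit in advance, but the follow-up ports are chosen at session time, so a static rule set either blocks them or must open the entire 1024-65535 range.

```python
# Minimal sketch of a static firewall rule table; the specific "negotiated"
# port numbers below are hypothetical examples.
ALLOWED_INBOUND = {("tcp", 135)}        # Windows RPC endpoint mapper
DYNAMIC_RANGE = range(1024, 65536)      # where the two follow-up ports may land

def permitted(proto: str, port: int) -> bool:
    """Return True if a static rule already allows this inbound flow."""
    return (proto, port) in ALLOWED_INBOUND

# Step 1: Outlook reaches the endpoint mapper -- this rule is easy to write.
print(permitted("tcp", 135))            # True

# Steps 2-3: the two data ports are only known after step 1 completes, so a
# static rule set either blocks them or must open the whole range.
negotiated = (4211, 60333)              # example values chosen at session time
print(all(permitted("tcp", p) for p in negotiated))   # False: the session breaks
print(len(DYNAMIC_RANGE))               # 64512 ports an admin would otherwise open
```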

The final piece of conventional Outlook/Exchange communication is new-mail notification, which is delivered as a UDP packet from the Exchange server to the Outlook client. From an administrator's point of view, a firewall that must support Exchange looks like Swiss cheese. Microsoft has tried to compensate by offering two registry entries on the Exchange server that let the administrator hardcode the two TCP ports used in addition to port 135. Unfortunately, the only way to configure these values is with RegEdit, and the only place they are documented is on Microsoft TechNet. The note includes a stern warning to administrators about the perils of making a mistake in the registry editor. Even so, these two registry entries do nothing to address the concern over exposing TCP port 135 to the Internet.
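For illustration, here is a sketch of that registry workaround written with Python's standard winreg module (Windows only). The key paths and the "TCP/IP Port" value name are the ones commonly cited in Microsoft's documentation of this workaround, and the port numbers are arbitrary examples; verify all of them against the TechNet note before touching a production server.

```python
# Sketch: pinning Exchange's two dynamic RPC ports via the registry. The key
# paths, the "TCP/IP Port" value name and the port numbers are assumptions
# for illustration -- confirm them against Microsoft's documentation.
import winreg

STATIC_PORTS = {
    r"SYSTEM\CurrentControlSet\Services\MSExchangeIS\ParametersSystem": 5500,  # information store (example port)
    r"SYSTEM\CurrentControlSet\Services\MSExchangeSA\Parameters": 5501,        # system attendant (example port)
}

for key_path, port in STATIC_PORTS.items():
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "TCP/IP Port", 0, winreg.REG_DWORD, port)
        print(f"Set {key_path}\\TCP/IP Port = {port}")
```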

One option for remote connectivity is an IPsec-based VPN. The good news with this solution is that it offers a secure, encrypted and authenticated tunnel. The bad news is that, without an additional firewall, the administrator cannot control which other resources remote users can reach. And even if an administrator installs a firewall to control that access, Exchange's registry settings must still be "tweaked" so that it uses fixed ports.

A superior option is to use an application delivery system's SSL VPN capabilities that support dynamic ports. With an SSL VPN, remote access is secured, authenticated, accelerated, and encrypted. The administrator does not need to make any changes to the Exchange server or to the Outlook client, and no additional firewalls need to be purchased. The SSL VPN can limit access based on client credentials (e.g. user groups) so that end users accessing the network remotely reach only the portions of the internal network they are entitled to. The application delivery system's built-in SSL encryption, data compression, dynamic application caching and load balancing capabilities assure rapid, continuous operation of the business-critical enterprise messaging system.
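As a simplified illustration of that group-based control, the sketch below maps user groups to the internal applications they may reach and denies everything else. The group names and hostnames are invented for the example and are not drawn from any particular product.

```python
# Hypothetical sketch of SSL VPN-style, group-based access control: each user
# group is mapped to the internal applications it may reach; anything outside
# that set is denied. Group names and hostnames are made up.
ACCESS_POLICY = {
    "employees":   {"mail.corp.example", "hr.corp.example"},
    "contractors": {"mail.corp.example"},
    "partners":    {"extranet.corp.example"},
}

def authorize(user_groups: set[str], destination: str) -> bool:
    """Allow the connection only if some group the user belongs to permits it."""
    return any(destination in ACCESS_POLICY.get(group, set())
               for group in user_groups)

print(authorize({"contractors"}, "mail.corp.example"))  # True
print(authorize({"contractors"}, "hr.corp.example"))    # False: not in the policy
```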

New equipment for the ROI-driven data center
At the same time that application delivery systems improve performance and scale capacity, they also protect applications with advanced security capabilities, including intrusion filtering to block worms and viruses such as Code Red and Nimda. The systems defend against various types of DoS attacks, such as SYN floods and the recent MyDoom set of attacks, and serve as an additional ring of application security, providing integrated packet filtering and attack protection as a second line of defense behind an organization's perimeter firewalls.
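As a simplified illustration of one such defense, the sketch below caps the number of half-open connections accepted per source address. Real devices use more sophisticated techniques such as SYN cookies; the threshold here is arbitrary and the model is only a toy.

```python
# Toy model of one SYN-flood heuristic: cap half-open (SYN received, handshake
# not completed) connections per source address. The threshold is made up.
from collections import defaultdict

MAX_HALF_OPEN_PER_SOURCE = 100
half_open = defaultdict(int)

def on_syn(src_ip: str) -> bool:
    """Return True if the SYN should be accepted, False if it should be dropped."""
    if half_open[src_ip] >= MAX_HALF_OPEN_PER_SOURCE:
        return False
    half_open[src_ip] += 1
    return True

def on_handshake_complete(src_ip: str) -> None:
    """A completed handshake frees the half-open slot for that source."""
    half_open[src_ip] = max(0, half_open[src_ip] - 1)

# A burst of SYNs from one source quickly hits the cap and gets dropped.
accepted = sum(on_syn("203.0.113.7") for _ in range(500))
print(accepted)  # 100
```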

Application delivery system users also report significant improvements in response time for remotely accessed applications, ranging from Web-based as well as native client e-mail to CRM, human resources, file access and extranet applications. This translates into less time waiting for application data to be delivered and increased employee productivity and customer/partner satisfaction. Best of all, users benefit whether they access the applications over wireless LANs, the WAN, from home, or on the road.

While thwarting illegitimate traffic, the application delivery system also ensures that legitimate requests get through, preserving uninterrupted availability for users. Rather than dropping connections during peak traffic, the system queues the requests so users do not see the dreaded "server not available" message. An online shopping site, for example, is able to take unpredictable traffic spikes in stride with existing server capacity, rather than adding horsepower that goes unused except at peak times. Similar improvements are possible for enterprise applications that typically must be over-provisioned to handle "flash crowds" such as open enrollment periods.
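The sketch below is a toy model of that queuing behavior: when every back-end slot is busy, new requests wait in a queue instead of being refused and are dispatched as slots free up. The capacity figures are arbitrary illustration values, not taken from any product.

```python
# Toy model of surge protection: requests that arrive while all back-end slots
# are busy are queued rather than dropped. Capacities are arbitrary examples.
from collections import deque

SERVER_SLOTS = 4          # concurrent requests the back end can handle
queue = deque()           # requests waiting for a free slot
in_flight = 0

def on_request(req: str) -> str:
    """Dispatch immediately if a slot is free; otherwise queue the request."""
    global in_flight
    if in_flight < SERVER_SLOTS:
        in_flight += 1
        return f"{req}: dispatched to server"
    queue.append(req)                       # queued, not dropped
    return f"{req}: queued (position {len(queue)})"

def on_response():
    """A completed response frees a slot; the oldest queued request gets it."""
    global in_flight
    in_flight -= 1
    if queue:
        in_flight += 1
        return f"{queue.popleft()}: dispatched after wait"
    return None

for i in range(6):
    print(on_request(f"req{i}"))            # req4 and req5 queue rather than fail
print(on_response())                        # a finished response frees a slot for req4
```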

Optimized application performance, advanced switching capabilities, remote access security via SSL VPN, and protection against attacks are some of the benefits that today's application delivery systems can bring to the new enterprise data center. These next-generation systems provide a unified solution that can do much more for the performance and security of enterprise applications than conventional load balancers or traffic management systems. As enterprises seek immediate payback in server capacity and network bandwidth and forward-going gains in user productivity, security and application availability and performance, the application delivery system is becoming a core component of the new, ROI-driven data center.



About the author:
Prabakar Sundarrajan is CTO and Executive Vice President of Strategic Planning and Corporate Development for NetScaler, Inc. Prabakar has more than 20 years of experience in networking, enterprise applications and e-business. He is responsible for driving NetScaler's technology vision, product and corporate strategy and NetScaler's role in powering the groundbreaking eNet data center at this year's Networld+Interop.