Yesterday, we ran an event at the Duck and Waffle Restaurant (40th Floor, Heron Tower) on Optimising the Digital Experience with Riverbed's Performance Management platform, and in particular SteelApp.
We made a few undue assumptions: that all of our guests knew what Load Balancing and "Application Delivery Controllers" are and how they work. So we thought we would give a brief overview of each.
Acronyms and Terminology
GLBs, ADCs, Traffic Managers, Reverse Proxies, Application Proxies. These words and phrases were used a lot yesterday, in different contexts. We thought it was worth clarifying exactly what these “ADCs” and “Traffic Managers” actually do, and how they differ from plain and simple “Load Balancers” or DNS round-robin processes.
So – what is a “Traffic Manager?”
A Traffic Manager performs a similar function for a website (or web service) to the one a call flow management system performs in a call centre. Think about a customer service representative in a small business, for example. Their direct-dial number is published in the phone book or online, and they handle inbound and outbound customer queries, ranging from account queries and technical support questions through to escalations and complaints.
This is where the trouble starts.
As this "small" company begins to become more successful, the volume of calls increases and customer service levels begin to decrease as calls are left unanswered or dealt with in-correctly. For example, the phone line may be engaged, or the customer service rep may be away from his desk, so calls get missed. In this scenario, the customer service rep is also busy dealing with all kinds of “unwanted calls” - : sales call, wrong numbers, spam calling from other company’s for example as well as personal calls.
What the company needs is a way to control how phone calls are routed to employees, to ensure that its customers are served by the right people in the fastest time possible.
So what’s the solution?
In the case of the example above, many businesses implement call centre technology. A call centre is a great way to solve the problem: the company can sit a number of “operators” on a call management system to balance the phone calls across the members of staff, to route particular calls to particular departments and, most importantly, to gain more control - such as stopping calls from certain locations, screening out nuisance calls and, in some cases, even responding directly to customer inquiries. Above all, it's about improving the customer experience…
How does this work for my applications then?
In a very similar way actually.
In just the same way as the example above, a business may have an online application (let’s call it an e-commerce website) hosted on a single web server with a public IP address. As the business grows, it quickly progresses to building a farm of web servers, which may be hosted on-premise, in the cloud, or in a hybrid combination of the two.
To ensure that these applications are delivered in a timely, secure and efficient way, businesses choose to deploy “Traffic Management” in front of them. These traffic management systems (often referred to as “Load Balancers” or “Proxies”) are application delivery controllers. Their job is to manage the delivery of the critical applications and services that the business publishes. The effectiveness of an ADC can be measured by one simple metric: the degree of control it gives you in delivering the application to the user in the fastest and most personalised way.
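To make that idea concrete, here is a minimal sketch of the most basic thing a traffic manager does: expose one public endpoint and spread incoming requests round-robin across a farm of web servers. This is an illustrative Go example, not SteelApp itself, and the backend addresses are hypothetical.

```go
// loadbalancer.go - a minimal round-robin reverse proxy, the core idea behind
// a "Load Balancer" / traffic manager. The backend addresses are placeholders.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	// Hypothetical web farm: three identical web servers behind one public address.
	backends := []*url.URL{
		mustParse("http://10.0.0.11:8080"),
		mustParse("http://10.0.0.12:8080"),
		mustParse("http://10.0.0.13:8080"),
	}

	var next uint64
	proxy := &httputil.ReverseProxy{
		// Director rewrites each incoming request to point at the next backend in turn.
		Director: func(r *http.Request) {
			target := backends[atomic.AddUint64(&next, 1)%uint64(len(backends))]
			r.URL.Scheme = target.Scheme
			r.URL.Host = target.Host
		},
	}

	// The proxy listens on the single public endpoint the business publishes.
	log.Fatal(http.ListenAndServe(":80", proxy))
}

func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		log.Fatal(err)
	}
	return u
}
```

A production traffic manager builds on this basic pattern with health checks and the richer controls described in the rest of this post.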
Ok, I got it! So where and how do I deploy them?
Virtualisation and cloud have had a significant impact on the architecture of applications and servers running in a business, and this deployment model applies equally to ADCs.
- The old architecture was monolithic: all elements of an application were deployed within a physical data centre, with physical ADCs in close proximity to manage the traffic. It was a static environment, with long lead times required to change or upgrade the overall application deployment.
- The new architecture is more flexible, following the advantages of virtualisation. With elements of an application spread across a range of IT environments (and often in different locations), we can now place an ADC around each part of the application. This means we can ensure application availability whatever the demand, without buying expensive, static and “future proofed” hardware, so we can not only better control costs but also support an environment that is more dynamic and distributed in nature (a small sketch of such a dynamic, health-checked backend pool follows this list).
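To illustrate the "dynamic and distributed" point, here is a small sketch of the kind of health-checked backend pool a virtual ADC maintains: the pool is rebuilt on every probe, so servers can join or leave without manual reconfiguration. The backend URLs and the /healthz path are assumptions for illustration only; this is not SteelApp's API.

```go
// healthcheck.go - sketch: a dynamic backend pool that a virtual ADC could keep
// up to date, so traffic only ever reaches servers that are actually healthy.
package main

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

// Pool holds the set of backends currently considered healthy.
type Pool struct {
	mu      sync.RWMutex
	healthy []string
}

// Check probes every candidate backend and rebuilds the healthy list.
func (p *Pool) Check(candidates []string) {
	var alive []string
	client := &http.Client{Timeout: 2 * time.Second}
	for _, b := range candidates {
		if resp, err := client.Get(b + "/healthz"); err == nil && resp.StatusCode == http.StatusOK {
			resp.Body.Close()
			alive = append(alive, b)
		}
	}
	p.mu.Lock()
	p.healthy = alive
	p.mu.Unlock()
}

func main() {
	pool := &Pool{}
	candidates := []string{"http://10.0.0.11:8080", "http://10.0.0.12:8080"}
	for {
		pool.Check(candidates)
		fmt.Println("healthy backends:", pool.healthy)
		time.Sleep(10 * time.Second)
	}
}
```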
And an example of how Riverbed SteelApp does this?
Yesterday, Steve Mavin (from Riverbed) spoke about ‘See Tickets’. See Tickets is one of the largest online ticket resellers in Europe, with over 34 million online unit sales per year. The IT team at See Tickets was becoming increasingly aware that its website struggled to cope with huge and sudden spikes in web traffic, which meant it was missing sales. With 85 percent of ticket sales made online, and over 1 million page views per day, See Tickets is heavily reliant on the performance of its website infrastructure.
When See Tickets launches a new event, thousands of people hit its website at the same time, and there is a real danger that the website will collapse. SteelApp steps in to act as a shock absorber, managing the incoming requests to protect the application and speed up the response time.
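The "shock absorber" behaviour can be pictured as a limit on how many requests are allowed through to the application at once, with the remainder queued briefly or politely turned away. The sketch below illustrates that idea in Go; the concurrency limit, queue timeout and handler are made up for the example and are not how SteelApp is configured.

```go
// shockabsorber.go - sketch of the "shock absorber" idea: cap how many requests
// reach the back-end at once and queue the rest briefly, instead of letting a
// traffic spike overwhelm the application. The limit and timeout are made up.
package main

import (
	"net/http"
	"time"
)

// limit wraps a handler so that at most maxInFlight requests run concurrently;
// callers beyond that wait up to queueWait before receiving a 503.
func limit(next http.Handler, maxInFlight int, queueWait time.Duration) http.Handler {
	slots := make(chan struct{}, maxInFlight)
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		select {
		case slots <- struct{}{}: // a slot is free: forward to the application
			defer func() { <-slots }()
			next.ServeHTTP(w, r)
		case <-time.After(queueWait): // queued too long: shed load gracefully
			http.Error(w, "busy, please retry", http.StatusServiceUnavailable)
		}
	})
}

func main() {
	app := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("tickets on sale\n")) // stand-in for the real application
	})
	// Allow 100 concurrent requests through; queue the rest for up to 2 seconds.
	http.ListenAndServe(":80", limit(app, 100, 2*time.Second))
}
```

A production ADC adds much more on top of this, such as the traffic policies and web application firewall described below, but the queueing principle is the same.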
Riverbed SteelApp was chosen for its flexibility and for reporting metrics that let See Tickets monitor and review website traffic patterns, identify trends, and make informed business decisions. See Tickets uses SteelApp to apply a series of application business policies to control the type, flow, and priority of traffic to its website. This enables See Tickets to manage traffic to the back-end servers and cope with the huge peaks of visitors coming to the site.
In addition, SteelApp Web App Firewall is a key part of the solution, providing an additional layer of security, giving See Tickets optimum protection for their online presence. See Tickets’ customers have the reassurance that their personal and financial information is protected, as they comply with the Payment Card Industry Data Security Standard (PCI DSS).
Online ticket sales and similar high-volume transactions are a great example of the way in which SteelApp can make a real difference to a web application: put simply, shorter response time means they make more money, and downtime means losing money.
Anything else?
Riverbed SteelApp is the #1 virtual application delivery controller (ADC) for scalable, secure, and elastic delivery of your enterprise, cloud, and e-commerce applications.
We'd love to hear from you to talk through your application and business performance needs. Please get in touch @Cisilion or contact me directly @rquickenden