A load balancer is a device that distributes network or application traffic across a cluster of servers to optimize resource utilization, improve responsiveness, and increase availability. It sits between the client and the server farm, accepting incoming network and application traffic and distributing it across multiple backend servers using various methods. By balancing requests across multiple servers, a load balancer prevents any single application server from becoming a single point of failure, improving overall application availability and responsiveness.
Load balancing is the most straightforward method of scaling out an application server infrastructure. As application demand increases, new servers can be easily added to the resource pool, and the load balancer will immediately begin sending traffic to the new server.
When one application server becomes unavailable, the load balancer directs all new application requests to other available servers in the pool.
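The two behaviors above can be sketched together: scaling out adds a server to the pool, and a failed health check removes it from rotation so new requests go only to the remaining healthy servers. This is a minimal illustrative sketch, not any particular product's API; the `ServerPool` class and its method names are hypothetical.

```python
class ServerPool:
    """Hypothetical sketch of a load balancer's backend pool.

    New servers can join at any time (scale-out), and servers that
    fail health checks are skipped when picking a target (failover).
    """

    def __init__(self, servers):
        self.servers = list(servers)       # all known backend servers
        self.healthy = set(self.servers)   # servers currently passing health checks
        self._next = 0                     # round-robin cursor

    def add_server(self, server):
        # Scale out: a newly added server immediately starts receiving traffic.
        self.servers.append(server)
        self.healthy.add(server)

    def mark_down(self, server):
        # Failover: remove a server from rotation after a failed health check.
        self.healthy.discard(server)

    def mark_up(self, server):
        # Return a recovered server to rotation.
        self.healthy.add(server)

    def pick(self):
        # Round-robin over healthy servers only.
        if not self.healthy:
            raise RuntimeError("no healthy servers available")
        while True:
            server = self.servers[self._next % len(self.servers)]
            self._next += 1
            if server in self.healthy:
                return server
```

For example, with servers `app1` and `app2` in the pool, adding `app3` and then marking `app2` down means subsequent picks alternate between `app1` and `app3` only.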
To handle more advanced application delivery requirements, an application delivery controller (ADC) is used to improve the performance, security, and resiliency of applications delivered over the web. An ADC is not only a load balancer but a platform for delivering network, application, and mobile services in the fastest, safest, and most consistent manner, regardless of where, when, and how they are accessed.
Load balancing uses various algorithms, called load balancing methods, to define the criteria that the ADC appliance uses to select the server to which each client request is redirected. Different load balancing methods use different criteria: round robin, for example, cycles through the servers in order, least connections picks the server with the fewest active connections, and source-IP hashing consistently maps a given client to the same server.
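Two of these methods can be sketched in a few lines. This is an illustrative sketch only; the function names, the `active_conns` mapping, and the example server names are assumptions, not any vendor's API.

```python
import hashlib

def least_connections(active_conns):
    """Least connections: pick the server with the fewest active connections.

    active_conns is an assumed dict mapping server -> current connection count.
    """
    return min(active_conns, key=active_conns.get)

def source_ip_hash(servers, client_ip):
    """Source-IP hash: hash the client's address so the same client is
    consistently directed to the same server (useful for session persistence)."""
    digest = hashlib.sha256(client_ip.encode()).hexdigest()
    return servers[int(digest, 16) % len(servers)]
```

For example, `least_connections({"app1": 12, "app2": 3, "app3": 7})` selects `app2`, while `source_ip_hash` returns the same server every time it is called with the same client address. Least connections adapts to uneven request costs; hashing trades that adaptivity for client-to-server stickiness.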
Load balancers provide the bedrock for building flexible networks, improving the performance and security of many types of traffic and services, including applications.