
Load Balancers

A Load Balancer is a tool that distributes incoming requests across Virtual Machines and Bare Metal servers to improve your infrastructure's fault tolerance.

Gcore Load Balancers come with various configuration options to fit different network requirements. We’ve also conducted multiple performance tests on available flavors to help you make an informed decision and select the most effective solution for your infrastructure.

Our Load Balancers also support long-lived (keepalive) connections via Server-Sent Events (SSE), long polling, and WebSockets. To keep these connections stable, adjust the data timeouts to values appropriate for your application's requirements.
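As an illustration, a listener's data timeouts could be raised with a payload along these lines. The field names `timeout_client_data` and `timeout_member_data` (milliseconds) are assumptions borrowed from Octavia-style load balancer APIs and may differ in the Gcore API; check the API reference for the exact names.

```json
{
  "name": "sse-listener",
  "protocol": "HTTP",
  "protocol_port": 80,
  "timeout_client_data": 300000,
  "timeout_member_data": 300000
}
```

Here both the client-side and backend-side idle timeouts are set to 5 minutes, so an SSE or WebSocket connection is not dropped between messages.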

Performance analysis

We’ve tested our Load Balancers to determine the performance of different flavors.

For each flavor, we ran the client in multithreaded mode with 36 concurrent threads and 400 connections for a test duration of 30 seconds.

The results show:

  • Throughput: The number of requests per second (RPS) a Load Balancer can handle under concurrent user requests.

  • Latency: Response times for both HTTP and HTTPS traffic across different Load Balancer flavors.
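To make the throughput metric concrete, here is a minimal single-threaded sketch that measures requests per second against a local toy server using only the Python standard library. It is an illustration of what RPS means, not a reproduction of the actual test setup (which used 36 threads and 400 connections against real Load Balancer flavors).

```python
import http.server
import threading
import time
import urllib.request

# Start a trivial local HTTP server on an ephemeral port to benchmark against.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Issue sequential requests for a fixed duration and count completions.
deadline = time.monotonic() + 1.0
count = 0
while time.monotonic() < deadline:
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        resp.read()
    count += 1
server.shutdown()

# Requests completed in one second ~ requests per second (RPS).
print(f"~{count} requests/second")
```

A real load test multiplies this idea across many threads and open connections, which is why the published figures reach tens of thousands of RPS.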

| Flavor | HTTP throughput (RPS) | HTTP latency (ms) | HTTPS throughput (RPS) | HTTPS latency (ms) |
|---|---|---|---|---|
| 1 vCPU - 2 GiB | 21k | 4 | 20k | 20 |
| 2 vCPU - 4 GiB | 45k | 3 | 34k | 12 |
| 4 vCPU - 8 GiB | 91k | 5 | 51k | 8 |
| 8 vCPU - 16 GiB | 142k | 3 | 117k | 4 |
