Load Balancer Pressure Test Solution
A load balancer distributes high-volume service traffic, so its performance and stability directly affect service reliability. Before using a load balancer to distribute real service traffic, you can perform pressure tests on ELB to simulate extreme service scenarios and verify that the load balancer stays highly available under heavy load.
Pressure Test Solution Architecture
Figure 1 shows the reference architecture of the pressure test solution.
Pressure Test Solution
- Metrics for pressure testing on TCP or TLS listeners
You can refer to the three key metrics in Table 1 when performing pressure tests on TCP or TLS listeners. For details about the monitoring metrics, see ELB Monitoring Metrics.
Table 1 Metrics for pressure testing on TCP or TLS listeners

| Metric | Pressure Test Suggestions |
|---|---|
| New Connections | Use short connections to test how well the load balancer and backend servers can process new connections. |
| Concurrent Connections | Use persistent connections to test how well the load balancer and backend servers can process concurrent connections. |
| Bandwidth | Use large packets to test the load balancer's ability to handle high bandwidth. This ensures it can manage heavy traffic efficiently and reliably in actual scenarios. |
- Metrics for pressure testing on HTTP or HTTPS listeners
You can refer to the four key metrics in Table 2 when performing pressure tests on HTTP or HTTPS listeners. For details about the monitoring metrics, see ELB Monitoring Metrics.
Table 2 Metrics for pressure testing on HTTP or HTTPS listeners

| Metric | Pressure Test Suggestions |
|---|---|
| New Connections | Use short connections to test how well the load balancer and backend servers can process new connections. |
| Concurrent Connections | Use persistent connections to test how well the load balancer and backend servers can process concurrent connections. |
| Bandwidth | Use large packets to test the load balancer's ability to handle high bandwidth. This ensures it can manage heavy traffic efficiently and reliably in actual scenarios. |
| Queries per second (QPS) | Configure a high request rate to test how well the load balancer and backend servers can process requests. |
- Backend server group configuration
- You are advised to use cloud servers as the backend servers for the pressure test. There are some restrictions on using IP addresses as backend servers. For details, see .
- You are not advised to use source IP hash as the load balancing algorithm or to enable sticky sessions. If you do, requests from the same client will all be sent to one backend server, causing uneven loads across backend servers.
Pressure Test Tool Suggestions
You are advised to use CodeArts PerfTest to perform pressure tests. CodeArts PerfTest simulates high user traffic during peak times. It allows you to define the contents and time sequences of packets and supports complex combinations of multiple transactions. After tests are complete, CodeArts PerfTest provides professional test reports to evaluate your service quality.
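If you only want a quick, rough look at the new-connection, concurrent-connection, and QPS behavior described above before building a full CodeArts PerfTest scenario, you can run an open-source tool such as ApacheBench (ab) from a test client. This is only a sketch and is not part of the recommended solution; <load-balancer-address> is a placeholder for your listener's address.
Short connections, to exercise new connections:
ab -n 100000 -c 200 http://<load-balancer-address>/
Persistent (keep-alive) connections, to exercise concurrent connections and QPS:
ab -n 100000 -c 200 -k http://<load-balancer-address>/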
Pressure Test Example
This practice uses HTTP listeners as an example. Before pressure testing an HTTP listener, you need to:
Create an HTTP backend server group, add two ECSs to the backend server group, and set the backend port to 80 for each ECS.
Create a dedicated load balancer that supports both network and application load balancing, add an HTTP listener, and configure the backend server group to which traffic will be directed.
Configure the two ECSs as described in Table 3.
Table 3 ECS configuration

| Item | Specifications |
|---|---|
| vCPUs | 4 vCPUs |
| Memory | 8 GiB |
| OS | Huawei Cloud EulerOS 2.0 Standard 64-bit (10 GiB) |
Procedure
- Remotely log in to each ECS and install the HTTP service.
- Install Nginx:
yum install -y nginx
- Initialize the default page:
echo "performance test" > /usr/share/nginx/html/index.html
- Nginx listens on TCP port 80 by default. If the pressure test requires additional ports, modify /etc/nginx/nginx.conf so that Nginx also listens on them.
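A minimal sketch of such a server block, assuming ports 81 and 82 are the extra test ports (adjust to your own port plan):
server {
    listen 80;
    listen 81;
    listen 82;
    root /usr/share/nginx/html;
}
You can either add the extra listen directives to the existing default server block or define a block like this inside the http block. Run nginx -t to validate the configuration before starting the service.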
- Start the HTTP service:
systemctl start nginx
- Run either of the following commands to check whether the HTTP service can be accessed (port 80 is used by default):
curl -X GET http://localhost or curl -X GET http://127.0.0.1:80
- Use CodeArts PerfTest to perform pressure tests by referring to .
Causes of Poor Pressure Test Results
- Poor CPU performance of the backend server
If the backend servers do not have enough CPU resources, pressure test results will be poor.
Solution: Check the CPU usage of all backend servers and upgrade their specifications if the CPUs are the bottleneck.
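A quick way to check CPU usage on each backend server while the test is running (a sketch using standard Linux tools; sar requires the sysstat package):
top -bn1 | head -n 5
sar -u 1 5
If CPU usage stays close to 100% during the test, the backend servers, not the load balancer, are the bottleneck.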
- Poor performance of dependency services
The service running on the backend servers may rely on other services (such as databases and DNS) to function properly. Poor performance of these dependencies can lead to poor pressure test results.
Solution: Check all the services that the backend service depends on and improve their performance.
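One lightweight way to see whether a dependency such as DNS or an upstream HTTP endpoint is slow is to time each phase of a request with curl (a sketch; dependency.example.com is a placeholder for the service you want to measure):
curl -o /dev/null -s -w "dns=%{time_namelookup}s connect=%{time_connect}s total=%{time_total}s\n" http://dependency.example.com/
A large dns value points to slow name resolution, while a large gap between connect and total points to slow processing in the dependency itself.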
- Insufficient ports
Insufficient ports on the client or server may reduce the number of concurrent connections.
Check the following possible causes and take the corresponding measures:
- A TCP connection that is actively closed enters the TIME_WAIT state. A large number of connections in the TIME_WAIT state can exhaust the available ports.
- Use persistent connections instead of short connections to reduce the need for repeatedly opening and closing connections.
- Run sysctl -w net.ipv4.tcp_tw_reuse=1 on the client to allow connections in the TIME_WAIT state to be reused for new outbound connections.
- Run sysctl -w net.ipv4.ip_local_port_range="1024 65535" on the client to expand the range of available local ports and prevent port exhaustion.
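A minimal sketch of these client-side adjustments, plus a quick check of how many connections are currently in TIME_WAIT (the sysctl changes take effect immediately but do not persist across reboots unless added to /etc/sysctl.conf):
Count connections in the TIME_WAIT state (the first output line is a header):
ss -tan state time-wait | wc -l
Apply the two kernel settings on the client:
sysctl -w net.ipv4.tcp_tw_reuse=1
sysctl -w net.ipv4.ip_local_port_range="1024 65535"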
- The numbers of clients and backend servers used for the pressure test are limited, causing insufficient ports.
Suggestion: Increase the number of clients and backend servers.