
Locating Performance Problems Using APM Profiler

Use APM Profiler (a performance profiling tool) to locate the code that consumes excessive resources.

Prerequisites

  1. An APM Agent has been connected.
  2. The Profiler function has been enabled.
  3. You have logged in to the APM console.

Handling High CPU Usage

  1. In the navigation pane, choose Application Monitoring > Metrics.
  2. In the tree on the left, expand the target environment.
  3. Click the JVM tab. On the displayed page, select JVMMonitor from the monitoring item drop-down list.

    Figure 1 Viewing JVM monitoring data

  4. Check the cpu(%) graph. The CPU usage remains above 80%.

    Figure 2 CPU (%)

  5. Click the Profiler Performance Analysis tab.
  6. Click Performance Analysis. On the displayed page, select CPU Time for Type.

    Figure 3 Profiler flame graph

  7. Analyze the flame graph data. The java.util.LinkedList.node(int) method accounts for 66% of the CPU time, and the corresponding service code method is countPages(List).

    Figure 4 Profiler flame graph analysis

  8. Analyze the service code. countPages(List) traverses the input list by position index. When the input is a LinkedList, each get(i) call walks the nodes from one end of the list, so index-based traversal of the whole list is O(n^2) and inefficient.

    Figure 5 Code analysis

  9. Fix the code. Specifically, change the list traversal to an enhanced for loop, which iterates with the list's Iterator (see the sketch after this procedure).

    Figure 6 Fixing code

  10. After the optimization, repeat steps 4 and 5. The CPU usage is now below 1%.

    Figure 7 CPU usage after optimization
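
The following sketch reproduces the pattern behind Figures 5 and 6; the class, constant, and variable names are assumptions, not the actual service code. On a LinkedList, every get(i) call walks the nodes from one end, which is what the java.util.LinkedList.node(int) frames in the flame graph show, so the index-based loop costs O(n^2) overall, while the enhanced for loop iterates with the list's Iterator and costs O(n).

    import java.util.LinkedList;
    import java.util.List;

    public class PageCounter {
        private static final int PAGE_SIZE = 10;

        // Before: list.get(i) on a LinkedList walks the nodes on every call,
        // so the whole loop is O(n^2) and burns CPU in LinkedList.node(int).
        static int countPagesSlow(List<String> list) {
            int items = 0;
            for (int i = 0; i < list.size(); i++) {
                if (list.get(i) != null) { // position index-based traversal
                    items++;
                }
            }
            return (items + PAGE_SIZE - 1) / PAGE_SIZE;
        }

        // After: the enhanced for loop uses the list's Iterator and visits
        // each node exactly once, so the whole loop is O(n).
        static int countPages(List<String> list) {
            int items = 0;
            for (String item : list) {
                if (item != null) {
                    items++;
                }
            }
            return (items + PAGE_SIZE - 1) / PAGE_SIZE;
        }

        public static void main(String[] args) {
            List<String> data = new LinkedList<>();
            for (int i = 0; i < 100_000; i++) {
                data.add("row-" + i);
            }
            System.out.println(countPages(data)); // same result, far less CPU
        }
    }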

Handling High Memory Usage

Prerequisite: The test program has been started with the heap size set to 2 GB (-Xms2g -Xmx2g).

  1. In the navigation pane, choose Application Monitoring > Metrics.
  2. In the tree on the left, expand the target environment.
  3. Click the JVM tab and select GC from the monitoring item drop-down list. GC events occur frequently.
  4. Select the JVMMonitor monitoring item to check JVM monitoring data.
  5. Click the Profiler Performance Analysis tab.
  6. Click Performance Analysis. On the displayed page, select Allocated Memory for Type. Locate the method with the largest allocated memory based on the Self column on the right.

    Figure 8 Memory flame graph

  7. Check the code. LargeEnum is an enumeration class that defines a large number of constants. The enum method values() returns a clone of the underlying constant array, so each call to values() copies the whole array on the heap. As a result, heap memory is allocated frequently and GC occurs often.

    Figure 9 Checking the code

  8. Cache the result of values() in a constant to avoid calling enum.values() repeatedly (see the sketch after this procedure).

    Figure 10 Resolving the problem

  9. Repeat steps 3 to 6. GC occurs far less frequently, and the flame graph no longer shows memory allocated by enum.values().

    Figure 11 Flame graph after optimization
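
A minimal sketch of the pattern in Figures 9 and 10; the constant names and the lookup method are hypothetical. An enum's values() method returns a defensive clone of the constant array, so calling it in a hot path allocates a new array on every call. Caching the array once in a static final field removes the per-call allocation.

    public enum LargeEnum {
        A, B, C; // stands in for a large number of constants

        // Before: values() clones the constant array on every call,
        // allocating heap memory each time and triggering frequent GC.
        public static LargeEnum slowLookup(String name) {
            for (LargeEnum e : LargeEnum.values()) { // new array per call
                if (e.name().equals(name)) {
                    return e;
                }
            }
            return null;
        }

        // After: clone the array once and reuse it for every lookup.
        private static final LargeEnum[] VALUES = LargeEnum.values();

        public static LargeEnum fastLookup(String name) {
            for (LargeEnum e : VALUES) { // no allocation per call
                if (e.name().equals(name)) {
                    return e;
                }
            }
            return null;
        }
    }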

Handling Slow API Response

  1. In the navigation pane, choose Application Monitoring > Metrics.
  2. In the tree on the left, expand the target environment.
  3. Click the URL tab. The API responds slowly: the average response time is about 80s.
  4. Click the Profiler Performance Analysis tab.
  5. Click Performance Analysis. On the displayed page, select Latency for Type and enter the API method name.

    Figure 12 Performance analysis

  6. Check the call stack and find the time-consuming method. As shown in the following figure, the executeUpdate() method in NegativeWorkService#handle consumes the most time.

    Figure 13 Checking the call stack

  7. Check the NegativeWorkService#handle method. The cause is that a database insert is executed inside a loop, one statement per record.

    Figure 14 Checking NegativeWorkService#handle

  8. Switch to batch data insertion to resolve the problem (see the sketch after this procedure).

    Figure 15 Resolving the problem

  9. Check the average response time again. It has dropped from about 80s to 0.2s.
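
A minimal JDBC sketch of the fix in Figures 14 and 15; the table, columns, and class name are hypothetical, and the real service may go through a persistence framework instead of raw JDBC. The slow version calls executeUpdate() once per record inside the loop, one database round trip per row; the fix queues the rows with addBatch() and flushes them together with executeBatch().

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    public class NegativeWorkDao {
        private static final String SQL =
                "INSERT INTO negative_work (id, detail) VALUES (?, ?)";

        // Before: one executeUpdate() per record, i.e. one database
        // round trip per row, which dominates the API response time.
        void insertOneByOne(Connection conn, List<String> details) throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(SQL)) {
                int id = 0;
                for (String detail : details) {
                    ps.setInt(1, ++id);
                    ps.setString(2, detail);
                    ps.executeUpdate(); // slow: executed inside the loop
                }
            }
        }

        // After: queue the rows locally and send them in one batch.
        void insertInBatch(Connection conn, List<String> details) throws SQLException {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(SQL)) {
                int id = 0;
                for (String detail : details) {
                    ps.setInt(1, ++id);
                    ps.setString(2, detail);
                    ps.addBatch();     // queue the row, no round trip yet
                }
                ps.executeBatch();     // one round trip for all rows
                conn.commit();
            }
        }
    }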