
Obtaining Container Logs of a Running Spark Application

Container logs of running Spark applications are distributed across multiple nodes. This section describes how to quickly obtain these container logs.

Scenario Description

You can run the yarn logs command to obtain the logs of applications running on Yarn. Depending on the scenario, run one of the following commands to obtain the required logs:

  1. Obtain the complete logs of an application: yarn logs -applicationId <appId> -out <outputDir>.

    Example: yarn logs -applicationId application_1574856994802_0016 -out /opt/test


    1. If the application is running, logs of containers in the dead state cannot be obtained.
    2. If the application is stopped, all archived container logs can be obtained.
  2. Obtain logs of a specified container: yarn logs -applicationId <appId> -containerId <containerId>.

    Example: yarn logs -applicationId application_1574856994802_0018 -containerId container_e01_1574856994802_0018_01_000003


    1. If the application is running, logs of containers in the dead state cannot be obtained.
    2. If the application is stopped, you can obtain logs of any container.
  3. Obtain logs of a container in any state: yarn logs -applicationId <appId> -containerId <containerId> -nodeAddress <nodeAddress>.

    Example: yarn logs -applicationId application_1574856994802_0019 -containerId container_e01_1574856994802_0019_01_000003 -nodeAddress 192-168-1-1:8041

    Execution result: logs of the specified container can be obtained, regardless of its state.

    You must specify nodeAddress in this command. You can obtain its value by running the following command; the sample script after this list chains these steps together:

    yarn node -list -all
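
The following is a minimal shell sketch that chains the commands above into one script. The application ID, container ID, output directory, and node address are placeholder values taken from the examples in this section; replace them with the values of your own application.

    #!/bin/bash
    # Sketch: fetch Yarn container logs for a Spark application.
    # All values below are placeholders; replace them with your own.
    APP_ID="application_1574856994802_0016"
    CONTAINER_ID="container_e01_1574856994802_0016_01_000003"
    OUT_DIR="/opt/test"

    # Download the complete aggregated logs of the application to a local directory.
    yarn logs -applicationId "${APP_ID}" -out "${OUT_DIR}"

    # List all NodeManagers; the Node-Id column provides the value for -nodeAddress.
    yarn node -list -all

    # Fetch the logs of a single container in any state by also specifying its node address.
    NODE_ADDRESS="192-168-1-1:8041"
    yarn logs -applicationId "${APP_ID}" -containerId "${CONTAINER_ID}" -nodeAddress "${NODE_ADDRESS}"

As noted above, while the application is still running, the call without -nodeAddress cannot return logs of containers in the dead state; the call with -nodeAddress can retrieve logs of a container in any state.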