Updated on 2024-10-23 GMT+08:00

Viewing the Spark Program Commissioning Result in the Linux Environment

Scenario

After a Spark application is run, you can check the running result using any of the following methods:

  • Viewing the command output.
  • Logging in to the Spark web UI.
  • Viewing Spark logs.

Procedure

  • Check the output data of the Spark application.

    The data storage directory and format are specified in the Spark application itself. You can obtain the result data from the specified files.
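    For example, if the application writes its results to HDFS, you can list and view the output files with the HDFS client. The output path below is a placeholder; replace it with the directory your application actually writes to.

    ```shell
    # List the result files under the application's output directory (example path).
    hdfs dfs -ls /user/spark/output

    # Print the content of one result file to the console.
    hdfs dfs -cat /user/spark/output/part-00000
    ```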

  • Check the status of the Spark application.

    Spark provides the following two web UIs:

    • The Spark UI displays the status of applications being executed.

      The Spark UI contains the Jobs, Stages, Storage, Environment, and Executors tabs. For a Streaming application, a Streaming tab is also displayed.

      On the YARN web UI, find the Spark application. Click ApplicationMaster in the last column of the application information. The Spark UI page is displayed.

    • The History Server UI displays the status of all Spark applications.

      The History Server UI displays information such as the application ID, application name, start time, end time, execution duration, and the user to whom the application belongs. Click an application ID to open the Spark UI of that application.
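    Besides the web UIs, you can also query application status from the command line on a cluster node using the standard YARN client. The application ID below is an example; obtain the real ID from the `yarn application -list` output or the YARN web UI.

    ```shell
    # List running applications with their IDs, states, and tracking URLs.
    yarn application -list

    # Query the state and final status of a specific application (example ID).
    yarn application -status application_1234567890123_0001
    ```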

  • View Spark logs to learn the application running status.

    Spark logs offer immediate visibility into how an application runs, and you can adjust the application based on them. For details about the logs, see Spark2x Logs.
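    For a finished application, the aggregated container logs can be fetched with the YARN client as sketched below (this assumes YARN log aggregation is enabled on the cluster; the application ID is an example).

    ```shell
    # Aggregate and print all container logs of a finished application (example ID).
    yarn logs -applicationId application_1234567890123_0001

    # Filter the aggregated logs for error messages.
    yarn logs -applicationId application_1234567890123_0001 | grep -i error
    ```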