Viewing the Spark Program Commissioning Result in the Linux Environment
Scenario
After running a Spark application, you can check its result in any of the following ways:
- Viewing the command output.
- Logging in to the Spark web UI.
- Viewing Spark logs.
Procedure
- Check the output data of the Spark application.
The storage directory and format of the output data are specified by the user in the Spark application. You can obtain the data from the specified directory.
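If the application writes its results to HDFS, you can list and print them from the command line. The sketch below assumes a hypothetical output directory; substitute the directory your application actually writes to.

```shell
# Hypothetical output path; replace with the directory your application writes to.
OUTPUT_DIR="/user/sparkuser/output"
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls "$OUTPUT_DIR"          # list the result files (e.g. part-00000)
  hdfs dfs -cat "$OUTPUT_DIR"/part-*  # print the result data
fi
```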
- Check the status of the Spark application.
Spark provides the following two web UIs:
- The Spark UI displays the status of applications being executed.
The Spark UI consists of the Jobs, Stages, Storage, Environment, and Executors tabs. In addition, a Streaming tab is displayed for Streaming applications.
To access the UI: on the YARN web UI, find the target Spark application and click ApplicationMaster in the last column of its row. The Spark UI is displayed.
- The History Server UI displays the status of all Spark applications.
The History Server UI displays information such as the application ID, application name, start time, end time, execution duration, and the user the application belongs to. Clicking an application ID opens the Spark UI of that application.
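Besides the web UIs, application status can also be queried from the YARN command line. This is a sketch assuming a hypothetical application ID; copy the real ID from the list output or from the YARN web UI.

```shell
# Hypothetical application ID; copy the real one from `yarn application -list`.
APP_ID="application_1234567890123_0001"
if command -v yarn >/dev/null 2>&1; then
  yarn application -list -appTypes SPARK   # running/accepted Spark applications
  yarn application -status "$APP_ID"       # detailed status of one application
fi
```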
- View Spark logs to learn how the application is running.
Spark logs give immediate visibility into the application's running status, and you can adjust the application program based on them. For details about the logs, see Spark2x Logs.
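When the application runs on YARN, its aggregated container logs can be fetched with the `yarn logs` command after the application finishes. The application ID below is hypothetical, and log aggregation must be enabled on the cluster.

```shell
# Hypothetical application ID; use your application's real ID.
APP_ID="application_1234567890123_0001"
if command -v yarn >/dev/null 2>&1; then
  # Aggregated container logs are available after the application finishes
  # (requires YARN log aggregation to be enabled on the cluster).
  yarn logs -applicationId "$APP_ID" > "${APP_ID}.log"
fi
```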