Viewing Job Configurations and Logs

Updated at: Nov 06, 2019 GMT+08:00

This section describes how to view job configurations and logs.

Background

  • You can view configurations of all jobs.
  • You can only view logs of running jobs.

    Because Spark SQL and DistCp jobs do not retain logs in the background, you cannot view logs of running Spark SQL and DistCp jobs.
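The job-type rule above can be sketched as a small helper. This is an illustrative sketch only; the function and constant names are hypothetical and are not part of the MRS console or API.

```python
# Hypothetical helper mirroring the log-viewing rule described in this
# section. Names below are illustrative, not part of the MRS API.

# Job types whose logs can be viewed from the console.
LOG_VIEWABLE_TYPES = {"MapReduce", "Spark", "Spark Script", "Hive Script"}


def can_view_log(job_type: str) -> bool:
    """Return True if the console offers View Log for this job type.

    Spark SQL and DistCp jobs do not retain logs in the background,
    so their logs cannot be viewed.
    """
    return job_type in LOG_VIEWABLE_TYPES


print(can_view_log("MapReduce"))  # True
print(can_view_log("Spark SQL"))  # False
```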

Procedure

  1. Log in to the MRS management console.
  2. Choose Clusters > Active Clusters, select a running cluster, and click its name to switch to the cluster details page.
  3. Click Jobs.
  4. In the Operation column corresponding to the selected job, click View Details.

    The View Details window that is displayed shows the configuration of the selected job.

  5. Select a MapReduce job, and click View Log in its Operation column.

    The page that is displayed shows the log information of the selected job.

    The MapReduce job is used only as an example. You can view log information about MapReduce, Spark, Spark Script, and Hive Script jobs regardless of their status.

    Each tenant can concurrently submit a maximum of 10 jobs and query a maximum of 10 logs.
