Updated on 2023-05-19 GMT+08:00

Why Does the Job Fail to Be Executed Due to Insufficient Database and Table Permissions?

Symptom

When a Spark job is running, an error message is displayed indicating that the user does not have the required database permission. The error information is as follows:
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Permission denied for resource: databases.xxx,action:SPARK_APP_ACCESS_META)
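
This error is typically raised when the job queries a table in a database that the job user has not been granted access to. A minimal PySpark sketch of such a query is shown below; the names db_name and table_name are placeholders for illustration, not values from the original error.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("permission_check").getOrCreate()

# Querying a table in a database the current user cannot access
# fails with the AnalysisException shown above when the metastore
# permission check is performed.
df = spark.sql("SELECT * FROM db_name.table_name")
df.show()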

Solution

You need to assign the database permission to the user who executes the job. The procedure is as follows:

  1. In the navigation pane on the left of the management console, choose Data Management > Databases and Tables.
  2. Locate the row containing the target database and click Permissions in the Operation column.
  3. On the displayed page, click Grant Permission in the upper right corner.
  4. In the displayed dialog box, select User or Project, enter the username or select the project that needs the permission, and select the desired permissions.
  5. Click OK.
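
After the permission is granted, you can confirm that the job user can access the database by running a lightweight query from the Spark job. The following is a minimal PySpark sketch, assuming the placeholder database name db_name.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("verify_grant").getOrCreate()

# If the grant took effect, listing the tables in the database
# succeeds instead of raising the permission error shown in the symptom.
spark.sql("SHOW TABLES IN db_name").show()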
