
Failed to Create or Delete a Table in Spark Beeline

Issue

When a large number of users are frequently created or deleted in the cluster, some of these users occasionally fail to create or delete tables in Spark Beeline.

Symptom

The table is created by running the following statement:

CREATE TABLE wlg_test001 (start_time STRING, value INT);

The following error message is displayed:

Error: org.apache.spark.sql.AnalysisException: 
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Failed to grant permission on HDFSjava.lang.reflect.UndeclaredThrowableException); (state=,code=0)
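
For reference, reproducing the symptom from a node with the cluster client installed can look like the following minimal sketch; the client installation path, the user name, and the spark-beeline launch command are assumptions that vary by deployment.

source /opt/client/bigdata_env                            # load the client environment (installation path is an assumption)
kinit sparkuser                                           # authenticate first in a security-mode cluster (example user)
spark-beeline                                             # open the Spark Beeline session
CREATE TABLE wlg_test001 (start_time STRING, value INT);  # run inside the Beeline session; occasionally fails with the error above for affected users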

Cause Analysis

  1. View MetaStore logs.

  2. View HDFS logs.

  3. Compare the HDFS permissions of the two tables' directories (test001 is a table created by a user in the abnormal state, and test002 is a table created by a user in the normal state); see the sketch after this list.

  4. An error similar to the following is reported when a table is dropped:
    dataplan_modela_csbch2;
    Error: Error while compiling statement: FAILED:
    SemanticException Unable to fetch table dataplan_modela_csbch2.
    java.security.AccessControlException: Permission denied: user=CSB_csb_3f8_x48ssrbt,
    access=READ,
    inode="/user/hive/warehouse/hive_csb_csb_3f8_x48ssrbt_5lbi2edu.db/dataplan_modela_csbch2":spark:hive:drwx------
  5. Analyze the cause.

    The default users created during cluster creation share the same UID, which leads to user identity confusion. The problem is triggered when a large number of users are created; as a result, the Hive user occasionally does not have the permission to create tables.
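
The following sketch shows one way to perform the permission comparison in step 3 and to confirm the UID collision described in step 5; the warehouse path is taken from the error above, and the database, table, and user names are examples to adjust for the actual cluster.

# Compare the HDFS permissions and owners of the two tables' directories.
hdfs dfs -ls /user/hive/warehouse/hive_csb_csb_3f8_x48ssrbt_5lbi2edu.db/

# Check the UIDs of the relevant OS users and list any UID that is assigned more than once.
id spark
id hive
getent passwd | awk -F: '{print $3}' | sort | uniq -d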

Procedure

Restart the sssd process on the cluster nodes.

Run the service sssd restart command as user root to restart the sssd process, and then run the ps -ef | grep sssd command to check whether the sssd process is running properly.

In normal cases, the /usr/sbin/sssd process and its three sub-processes /usr/libexec/sssd/sssd_be, /usr/libexec/sssd/sssd_nss, and /usr/libexec/sssd/sssd_pam exist.
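
The verification can be run as follows on each node as user root; the commands mirror the procedure above, and the exact sub-process list may differ slightly between OS versions.

service sssd restart      # restart the sssd process
ps -ef | grep sssd        # check that /usr/sbin/sssd and the sssd_be, sssd_nss, and sssd_pam sub-processes are running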