Updated on 2025-08-22 GMT+08:00

Permission Parameters of the Spark Client and Server

This section describes how to configure the Spark SQL permission management function (the client configuration is similar to the server configuration). To enable table permission, add the following configurations on the client and server:
  • spark-defaults.conf configuration file
    Table 1 Parameter description (1)

    Parameter: spark.sql.authorization.enabled
    Description: Specifies whether to enable permission authentication for datasource statements. It is recommended that this parameter be set to true to enable permission authentication.
    Example Value: true
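In the spark-defaults.conf property format, this entry might look as follows (a minimal sketch; the value comes from Table 1):

```properties
# spark-defaults.conf (client and server)
# Enable permission authentication for datasource statements
spark.sql.authorization.enabled true
```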

  • hive-site.xml configuration file
    Table 2 Parameter description (2)

    Parameter: hive.metastore.uris
    Description: Address of the Hive MetaStore service.
    Example Value: thrift://10.10.169.84:21088,thrift://10.10.81.37:21088

    Parameter: hive.metastore.sasl.enabled
    Description: Specifies whether the MetaStore service uses SASL to improve security. This parameter must be set to true when the table permission function is enabled.
    Example Value: true

    Parameter: hive.metastore.kerberos.principal
    Description: Principal of the Hive MetaStore service, for example, hive/hadoop.<system domain name>@<system domain name>.
    Example Value: hive-metastore/_HOST@EXAMPLE.COM

    Parameter: hive.metastore.thrift.sasl.qop
    Description: Set this parameter to auth-conf after the Spark SQL permission management function is enabled.
    Example Value: auth-conf

    Parameter: hive.metastore.token.signature
    Description: Token identifier of the MetaStore service.
    Example Value: HiveServer2ImpersonationToken

    Parameter: hive.security.authenticator.manager
    Description: Authentication manager of the Hive client.
    Example Value: org.apache.hadoop.hive.ql.security.SessionStateUserMSGroupAuthenticator

    Parameter: hive.security.authorization.enabled
    Description: Specifies whether to enable client authorization.
    Example Value: true

    Parameter: hive.security.authorization.createtable.owner.grants
    Description: Permissions granted to the owner who creates a table.
    Example Value: ALL
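Assembled into hive-site.xml, the parameters above take the standard Hadoop property form. A minimal sketch using the example values from Table 2 (the MetaStore addresses, port, and principal are placeholders from the table, not real endpoints; replace them with your cluster's values):

```xml
<configuration>
  <!-- Address of the Hive MetaStore service (example addresses) -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://10.10.169.84:21088,thrift://10.10.81.37:21088</value>
  </property>
  <!-- Use SASL for the MetaStore connection; required for table permission -->
  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>
  <!-- Kerberos principal of the Hive MetaStore service (example principal) -->
  <property>
    <name>hive.metastore.kerberos.principal</name>
    <value>hive-metastore/_HOST@EXAMPLE.COM</value>
  </property>
  <!-- Quality of protection; auth-conf once permission management is enabled -->
  <property>
    <name>hive.metastore.thrift.sasl.qop</name>
    <value>auth-conf</value>
  </property>
  <!-- Token identifier of the MetaStore service -->
  <property>
    <name>hive.metastore.token.signature</name>
    <value>HiveServer2ImpersonationToken</value>
  </property>
  <!-- Authentication manager of the Hive client -->
  <property>
    <name>hive.security.authenticator.manager</name>
    <value>org.apache.hadoop.hive.ql.security.SessionStateUserMSGroupAuthenticator</value>
  </property>
  <!-- Enable client authorization -->
  <property>
    <name>hive.security.authorization.enabled</name>
    <value>true</value>
  </property>
  <!-- Permissions granted to the owner who creates a table -->
  <property>
    <name>hive.security.authorization.createtable.owner.grants</name>
    <value>ALL</value>
  </property>
</configuration>
```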

  • core-site.xml configuration file of the MetaStore service
    Table 3 Parameter description (3)

    Parameter: hadoop.proxyuser.spark.hosts
    Description: Specifies the hosts from which the spark user is allowed to impersonate other users. Set this parameter to *, indicating all hosts.

    Parameter: hadoop.proxyuser.spark.groups
    Description: Specifies the user groups whose members the spark user is allowed to impersonate. Set this parameter to *, indicating all user groups.
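In core-site.xml on the MetaStore service, the two proxy-user parameters above might be written as follows (a minimal sketch using the wildcard values from Table 3):

```xml
<configuration>
  <!-- Allow the spark user to impersonate users connecting from any host -->
  <property>
    <name>hadoop.proxyuser.spark.hosts</name>
    <value>*</value>
  </property>
  <!-- Allow the spark user to impersonate members of any user group -->
  <property>
    <name>hadoop.proxyuser.spark.groups</name>
    <value>*</value>
  </property>
</configuration>
```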