Updated on 2022-12-14 GMT+08:00

When the Hadoop Client Is Used to Delete Data from OBS, It Does Not Have the Permission for the .Trash Directory

Issue

When a user deletes data from OBS using the Hadoop client, an error message is displayed indicating that the user does not have permission on the .Trash directory.

Symptom

After the hadoop fs -rm obs://<obs_path> command is executed, the following error information is displayed:

exception [java.nio.file.AccessDeniedException: user/root/.Trash/Current/: getFileStatus on user/root/.Trash/Current/: status [403]

Cause Analysis

When a file is deleted, Hadoop first moves it to the .Trash directory rather than removing it immediately. If the user does not have permission on this directory, a 403 error is reported.
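
Whether deleted files are moved to the trash is controlled by the Hadoop configuration item fs.trash.interval (the trash mechanism is active when the value is greater than 0). As a quick check on a configured Hadoop client, the effective value can be read with the following command; the actual value depends on the cluster configuration:

hdfs getconf -confKey fs.trash.interval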

Procedure

Solution 1:

Run the hadoop fs -rm -skipTrash command to delete the file.
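
For example, the following command deletes a single object and bypasses the trash; add -r when the target is a directory, and replace <obs_path> with the actual path:

hadoop fs -rm -skipTrash obs://<obs_path>

Data removed with -skipTrash is not moved to the .Trash directory and therefore cannot be restored from it.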

Solution 2:

Grant the agency bound to the cluster the permission to access the .Trash directory.

  1. On the Dashboard tab page of the cluster, query and record the name of the agency bound to the cluster.
  2. Log in to the IAM console.
  3. Choose Permissions. On the displayed page, click Create Custom Policy.

    • Policy Name: Enter a policy name.
    • Scope: Select Global services.
    • Policy View: Select Visual editor.
    • Policy Content:
      1. Allow: Select Allow.
      2. Select service: Select Object Storage Service (OBS).
      3. Select all operation permissions.
      4. Specific resources:
        1. Set object to Specify resource path, click Add resource path, and enter the path of the .Trash directory in Path, for example, obs_bucket_name/user/root/.Trash/*.
        2. Set bucket to Specify resource path, click Add resource path, and enter obs_bucket_name in Path.

        Replace obs_bucket_name with the actual OBS bucket name.

      5. (Optional) Request conditions: none need to be added in this case.
    Figure 1 Custom policy

  4. Click OK.
  5. Choose Agencies and click Assign Permissions in the Operation column of the agency recorded in 1.
  6. Search for and select the policy created in 3.
  7. Click OK.
  8. Run the hadoop fs -rm obs://<obs_path> command again.
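
After the new permissions take effect, the deletion should succeed and the object should be moved to the trash instead of triggering the 403 error. The following is a minimal verification sketch, assuming the trash location shown in the error message and the obs_bucket_name placeholder used above:

hadoop fs -rm obs://<obs_path>
hadoop fs -ls obs://obs_bucket_name/user/root/.Trash/Current/

The second command lists the trash directory to confirm that the deleted object was moved there.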