
File Read Fails, and "No common protection layer" Is Displayed

Symptom

HDFS operations fail on the shell client or other clients, and the error message "No common protection layer between client and server" is displayed.

Running any hadoop command, such as hadoop fs -ls /, on a node outside the cluster fails, and the underlying error message "No common protection layer between client and server" is reported.

2017-05-13 19:14:19,060 | ERROR | [pool-1-thread-1] |  Server startup failure  | org.apache.sqoop.core.SqoopServer.initializeServer(SqoopServer.java:69)
org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0028:Failed to operate HDFS - Failed to get the file /user/loader/etl_dirty_data_dir status
        at org.apache.sqoop.job.mr.HDFSClient.fileExist(HDFSClient.java:85)
...
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Couldn't setup connection for loader/hadoop@HADOOP.COM to loader37/10.162.0.37:25000; Host Details : local host is: "loader37/10.162.0.37"; destination host is: "loader37":25000;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
...
        ... 10 more
Caused by: java.io.IOException: Couldn't setup connection for loader/hadoop@HADOOP.COM to loader37/10.162.0.37:25000
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:674)
        ... 28 more
Caused by: javax.security.sasl.SaslException: No common protection layer between client and server
        at com.sun.security.sasl.gsskerb.GssKrb5Client.doFinalHandshake(GssKrb5Client.java:251)
...
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:720)

Cause Analysis

  1. HDFS clients and servers communicate over the RPC protocol, which supports multiple protection (encryption) modes. The hadoop.rpc.protection parameter specifies which mode is used.
  2. If the value of hadoop.rpc.protection on the client differs from the value on the server, the "No common protection layer between client and server" error is reported. (A quick way to compare the two values is shown after the following list.)

    hadoop.rpc.protection specifies one of the following modes for transmitting data between nodes:

    • privacy: Data is authenticated, integrity-checked, and encrypted before transmission. This is the most secure mode, but it reduces performance.
    • authentication: Data is transmitted after authentication only, without integrity checking or encryption. This mode delivers the best performance but carries security risks.
    • integrity: Data is authenticated and integrity-checked, but not encrypted, during transmission.
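
    To confirm whether the values differ, you can query the effective setting on both sides, for example with hdfs getconf. The following is only a minimal sketch; the core-site.xml path is an example of a typical client installation directory and may differ in your environment.

    # On the client node, as the user who runs the failing hadoop commands:
    hdfs getconf -confKey hadoop.rpc.protection

    # Or read the value directly from the client configuration file
    # (example path; adjust it to your client installation):
    grep -A 1 "hadoop.rpc.protection" /opt/client/HDFS/hadoop/etc/hadoop/core-site.xml

    # Run the same hdfs getconf command on a node inside the cluster.
    # The two values must be identical.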

Solution

  1. Download the client again so that its configuration matches the server side. If the client is embedded in an application, update the configuration file (core-site.xml) used by the application instead. A manual alternative is sketched below.
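
     If re-downloading the client is inconvenient, a manual alternative is to align the client's core-site.xml with the server value and retry the operation. The following is only a sketch; the path is an example of a typical client installation directory and may differ in your environment.

     # 1. Query the value in effect on the server side (run on a cluster node):
     hdfs getconf -confKey hadoop.rpc.protection

     # 2. On the client node, set hadoop.rpc.protection in core-site.xml to the
     #    same value (example path):
     vi /opt/client/HDFS/hadoop/etc/hadoop/core-site.xml

     # 3. Verify that the client now resolves the same value, then retry:
     hdfs getconf -confKey hadoop.rpc.protection
     hadoop fs -ls /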