
Why Is Data in the Cache Lost When Small Files Are Stored?

Question

The system loses power while it is saving small files. As a result, the data in the cache is lost.

Answer

Cached blocks are not written to the disk immediately, so a power failure causes them to be lost. To synchronously write cached blocks to the disk when a block file is closed, set dfs.datanode.synconclose to true in Client installation path/HDFS/hadoop/etc/hadoop/hdfs-site.xml.
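
For example, a minimal entry for this setting in hdfs-site.xml could look as follows (the enclosing <configuration> element of the existing file is assumed and omitted here):

<property>
  <!-- Synchronously flush block files to disk when they are closed. -->
  <name>dfs.datanode.synconclose</name>
  <value>true</value>
</property>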

By default, dfs.datanode.synconclose is set to false. This provides higher performance, but data in the cache is lost if a power failure occurs. Setting dfs.datanode.synconclose to true prevents this data loss but significantly degrades write performance. Set this parameter based on the application scenario.