Setting Storage Policies
Function
You can specify a storage policy for a file or folder in HDFS.
Example Code
- Log in to the FusionInsight Manager portal, choose Cluster > Name of the desired cluster > Services > HDFS > Configurations > All Configurations.
- Check whether dfs.storage.policy.enabled is set to its default value true. If not, change it to true, click Save, and restart HDFS.
- Check the code.
The following code segment is only an example. For details, see the HdfsMain class in com.huawei.bigdata.hdfs.examples.
/**
 * Set a storage policy on a path.
 *
 * @param policyName accepted policy names:
 * <li>HOT
 * <li>WARM
 * <li>COLD
 * <li>LAZY_PERSIST
 * <li>ALL_SSD
 * <li>ONE_SSD
 * @throws IOException
 */
private void setStoragePolicy(String policyName) throws IOException {
    if (fSystem instanceof DistributedFileSystem) {
        DistributedFileSystem dfs = (DistributedFileSystem) fSystem;
        Path destPath = new Path(DEST_PATH);
        boolean flag = false;

        mkdir();
        // Check whether the requested policy is supported by the cluster.
        BlockStoragePolicySpi[] storage = dfs.getStoragePolicies();
        for (BlockStoragePolicySpi bs : storage) {
            if (bs.getName().equals(policyName)) {
                flag = true;
            }
            LOG.info("StoragePolicy:" + bs.getName());
        }
        // Fall back to the first available policy if the requested one is not found.
        if (!flag) {
            policyName = storage[0].getName();
        }
        dfs.setStoragePolicy(destPath, policyName);
        LOG.info("success to set Storage Policy path " + DEST_PATH);
        rmdir();
    } else {
        LOG.info("SmallFile not support to set Storage Policy !!!");
    }
}
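The method above relies on members of the HdfsMain class (fSystem, DEST_PATH, mkdir(), rmdir()). For reference only, the following is a minimal self-contained sketch, not part of the sample project, that sets a policy on a directory and reads it back with getStoragePolicy() to confirm the result. The class name, the path /user/example/policy-test, and the HOT policy are placeholders to replace with your own values.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockStoragePolicySpi;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class StoragePolicyCheck {
    public static void main(String[] args) throws IOException {
        // Loads core-site.xml and hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Storage policies are an HDFS-specific feature.
        if (!(fs instanceof DistributedFileSystem)) {
            System.err.println("Not an HDFS file system; storage policies are not supported.");
            return;
        }
        DistributedFileSystem dfs = (DistributedFileSystem) fs;

        // Placeholder directory; replace with the folder you want to manage.
        Path dir = new Path("/user/example/policy-test");
        if (!dfs.exists(dir)) {
            dfs.mkdirs(dir);
        }

        // Apply the policy, then read it back to confirm it took effect.
        dfs.setStoragePolicy(dir, "HOT");
        BlockStoragePolicySpi applied = dfs.getStoragePolicy(dir);
        System.out.println("Policy on " + dir + ": " + applied.getName());

        fs.close();
    }
}

Reading the policy back is optional, but it is a quick way to verify that dfs.storage.policy.enabled is in effect on the cluster before applying policies in bulk.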