Data Lake Factory
Data Lake Factory (DLF) is a one-stop big data development platform to help you quickly build big data processing centers.
Progressive Knowledge
DLF knowledge for users from beginner level to expert level
01
Understand
Data Lake Factory (Data Development) is a big data platform designed specifically for HUAWEI CLOUD. It manages diverse big data services and provides a one-stop big data development environment and fully managed big data scheduling capabilities.
Service Overview
03
APIs
Several DLF APIs and calling examples help you manage DLF.
API Reference
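The API Reference documents REST endpoints for managing DLF resources such as jobs and scripts. As a rough illustration only — the endpoint host, path, project ID, and token below are placeholders, not values taken from the reference — a call could be constructed like this:

```python
import urllib.request

# Placeholder values: substitute real ones from your HUAWEI CLOUD account
# and the DLF API Reference. These names are illustrative, not documented here.
ENDPOINT = "https://dlf.example-region.myhuaweicloud.com"
PROJECT_ID = "your-project-id"
AUTH_TOKEN = "your-iam-token"  # obtained from the IAM token API

def build_list_jobs_request(endpoint, project_id, token):
    """Construct (but do not send) a GET request for the job list."""
    url = f"{endpoint}/v1/{project_id}/jobs"  # hypothetical path for illustration
    return urllib.request.Request(
        url,
        headers={
            "X-Auth-Token": token,            # IAM token authenticates the call
            "Content-Type": "application/json",
        },
        method="GET",
    )

req = build_list_jobs_request(ENDPOINT, PROJECT_ID, AUTH_TOKEN)
print(req.full_url)
# Sending the request would be: urllib.request.urlopen(req)
```

The request is built but not sent here, so the sketch can be read without credentials; in practice you would obtain a token from IAM first and parse the JSON response body.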
02
Be a Power User
Data Management
Data Development
Job Development
O&M and Monitoring
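In job development, a DLF job is built from nodes (individual task steps) linked into a dependency graph and run on a schedule. The node types and field names below are illustrative assumptions, not the product's actual schema; a job could be sketched as:

```python
# Illustrative sketch of a DLF-style job: nodes, dependencies, and a schedule.
# Field names and node types here are assumptions for illustration only.
job = {
    "name": "daily_sales_pipeline",
    "nodes": [
        {"name": "ingest", "type": "CDM Job"},     # migrate source data
        {"name": "transform", "type": "DLI SQL"},  # run an SQL transformation
    ],
    "dependencies": [
        # transform runs only after ingest succeeds
        {"from": "ingest", "to": "transform"},
    ],
    "schedule": {"type": "cron", "cron": "0 2 * * *"},  # daily at 02:00
}

def execution_order(job):
    """Topologically sort node names by the declared dependencies."""
    pending = {n["name"]: set() for n in job["nodes"]}
    for d in job["dependencies"]:
        pending[d["to"]].add(d["from"])
    order = []
    while pending:
        ready = [n for n, pre in pending.items() if pre <= set(order)]
        if not ready:
            raise ValueError("dependency cycle detected")
        order.extend(ready)
        for n in ready:
            del pending[n]
    return order

print(execution_order(job))  # ['ingest', 'transform']
```

The helper shows why the dependency edges matter: the scheduler must run each node only after everything it depends on has finished, which is exactly a topological ordering of the graph.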
FAQs
Learn more about common issues and solutions.
Typical Cases
- What Is DLF Used For?
- What Is a Job?
- How Many Jobs Can Be Created in DLF?
- What Is a Node?
- How Can I Quickly Rectify a Deleted CDM Cluster Associated with a Job?
- What Can I Do When a Job Fails?
- Does DLF Provide Job Templates?
Technical Topics
Technologies, expert opinions, and courses