Updated on 2022-11-18 GMT+08:00

Introduction to HDFS

The Hadoop Distributed File System (HDFS) is a distributed file system with high fault tolerance. HDFS provides high-throughput data access and is suitable for processing large data sets.

HDFS applies to the following application scenarios:

  • Massive data processing (at the TB or PB level and beyond).
  • Scenarios that require high throughput.
  • Scenarios that require high reliability.
  • Scenarios that require good scalability.

Introduction to the HDFS Interface

HDFS applications can be developed in Java. For details about the APIs, see Java API.
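As a minimal sketch of what Java development against HDFS looks like, the following example uses the standard `org.apache.hadoop.fs.FileSystem` API to write a file, read it back, and delete it. The class name and the `/tmp/hdfs-example.txt` path are illustrative; running it requires the Hadoop client libraries on the classpath and a reachable cluster whose settings (such as `fs.defaultFS`) are provided via the loaded configuration files.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        // Loads HDFS settings (e.g. fs.defaultFS) from core-site.xml / hdfs-site.xml
        // found on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Example path; adjust it for your cluster.
        Path file = new Path("/tmp/hdfs-example.txt");

        // Write a line to HDFS (overwrite if the file already exists).
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("Hello, HDFS!\n".getBytes(StandardCharsets.UTF_8));
        }

        // Read the line back and print it.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(fs.open(file), StandardCharsets.UTF_8))) {
            System.out.println(reader.readLine());
        }

        // Clean up (non-recursive delete).
        fs.delete(file, false);
        fs.close();
    }
}
```

The same `FileSystem` abstraction also covers directory operations such as `mkdirs`, `listStatus`, and `rename`, so most application code is written against this interface rather than against HDFS internals.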