Introduction to HDFS
The Hadoop Distributed File System (HDFS) is a distributed file system with high fault tolerance. HDFS supports high-throughput data access and is suitable for processing large data sets.
HDFS applies to the following application scenarios:
- Massive data processing (at the TB or PB level and above).
- Scenarios that require high throughput.
- Scenarios that require high reliability.
- Scenarios that require good scalability.
Introduction to HDFS Interface
HDFS applications can be developed in the Java language. For details about the API, see the Java API documentation.
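As a minimal sketch of the Java API, the following program writes a small file to HDFS and reads it back through the Hadoop `FileSystem` interface. The file path `/tmp/hdfs-demo.txt` and the use of the default cluster configuration are illustrative assumptions, not values from this article; running it requires the `hadoop-client` library on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml/hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            // Illustrative path, not one defined by this article.
            Path file = new Path("/tmp/hdfs-demo.txt");

            // Write a small text file (overwrite if it exists).
            try (FSDataOutputStream out = fs.create(file, true)) {
                out.write("Hello, HDFS!".getBytes(StandardCharsets.UTF_8));
            }

            // Read the file back and print its first line.
            try (FSDataInputStream in = fs.open(file);
                 BufferedReader reader = new BufferedReader(
                         new InputStreamReader(in, StandardCharsets.UTF_8))) {
                System.out.println(reader.readLine());
            }
        }
    }
}
```

The same `FileSystem` abstraction also covers operations such as `delete`, `mkdirs`, and `listStatus`, so code written against it can run unchanged on a local file system during development.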