Typical Scenario: Exporting Data from HBase to HDFS or OBS

Updated on 2024-11-29 GMT+08:00

Scenario

This section describes how to use Loader to export data from HBase to HDFS or OBS.

Prerequisites

  • You have obtained the service username and password for creating a Loader job.
  • You have the permission to access the HDFS or OBS directories and the data involved in job execution (a quick client-side check is sketched after this list).
  • You have the permission to access the HBase or Phoenix tables used during job execution.
  • No disk space alarm has been reported, and the available disk space is sufficient for importing and exporting data.
  • If the job uses a specific Yarn queue, the user has been granted the related Yarn queue permissions.
  • The user who configures the job has the execution permission on the job and the usage permission on the connections used by the job.
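
For the HDFS directory permission in particular, a quick client-side check can be run before the job is configured. The following is a minimal sketch, not part of the Loader procedure; it assumes an MRS client with the cluster's core-site.xml and hdfs-site.xml on the classpath and uses the example output directory /user/test from this section.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsAction;

public class CheckHdfsAccess {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml and hdfs-site.xml from the client classpath.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            Path outputDir = new Path("/user/test");  // example output directory from this section
            if (fs.exists(outputDir)) {
                // Throws AccessControlException if the current user cannot write to the directory.
                fs.access(outputDir, FsAction.WRITE);
                System.out.println("Write access to " + outputDir + " confirmed.");
            } else {
                System.out.println(outputDir + " does not exist yet; check the parent directory instead.");
            }
        }
    }
}
```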

Procedure

Configure basic job information.

  1. Access the Loader web UI.

    1. Log in to FusionInsight Manager.
    2. Choose Cluster > Services > Loader.
    3. Click LoaderServer(Node name, Active). The Loader web UI is displayed.
      Figure 1 Loader web UI

  2. Click New Job. Configure basic job parameters on the Basic Information page displayed.

    Figure 2 Basic Information page
    1. Enter a job name in Name.
    2. Set Type to Export.
    3. Set Group to the group to which the job belongs. There is no group created by default. Click Add, enter the group name, and click OK.
    4. Set Queue to the Yarn queue that executes the job. The default value is root.default.
    5. Set Priority to the priority of the Yarn queue that executes the job. The default value is NORMAL. The options are VERY_LOW, LOW, NORMAL, HIGH, and VERY_HIGH.

  3. In the Connection area, click Add to create a connection, set Connector to hdfs-connector, set connection parameters, and click Test to verify whether the connection is available. When "Test Success" is displayed, click OK.

Configure data source information.

  1. Click Next. On the displayed From page, set Source type to HBASE.

    Table 1 Parameter description

    Parameter: HBase instance
    Description: HBase service instance that Loader selects from all available HBase service instances in the cluster. If the selected HBase service instance has not been added to the cluster, the HBase job cannot run properly.
    Example Value: HBase

    Parameter: Quantity
    Description: Number of map tasks started concurrently in the MapReduce job of this data operation. The value must be less than or equal to 3000.
    Example Value: 20
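
Before running the job, it can also help to confirm that the source HBase table is reachable and contains data. The following is a minimal sketch, not part of the Loader procedure; it assumes the standard HBase 2.x Java client with the cluster's hbase-site.xml on the classpath, and the table name t1 is only an example.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class CheckSourceTable {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml from the classpath of the MRS client.
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("t1"));         // example table name
             ResultScanner scanner = table.getScanner(new Scan().setLimit(5))) { // fetch a few rows only
            for (Result row : scanner) {
                System.out.println(row);  // prints the row key and cells as a quick sanity check
            }
        }
    }
}
```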

Configure data transformation.

  1. Click Next. On the displayed Transform page, set the transformation operations in the data transformation process. For details about how to select operators and set parameters, see Operator Help and Table 2.

    Table 2 Input and output parameters of the operator

    Input Type: HBase Input
    Output Type: File Output

    On the input side, drag HBase Input to the grid; on the output side, drag File Output to the grid. Then connect HBase Input to File Output with an arrow.

Set data storage information and execute the job.

  1. Click Next. On the displayed To page, set the data storage mode.

    Table 3 Parameter description

    Parameter: Output path
    Description: Output directory or file name of the exported files in HDFS or OBS.
    NOTE: You can use macros to define path parameters. For details, see Using Macro Definitions in Configuration Items.
    Example Value: /user/test

    Parameter: File Format
    Description: File export format.
    • TEXT_FILE: exports the data as a text file.
    • SEQUENCE_FILE: exports the data as a sequence file (see the inspection sketch after this step).
    • BINARY_FILE: exports files of any format using binary streams.
    Example Value: TEXT_FILE

    Parameter: Compression codec
    Description: Compression format of the files exported to HDFS or OBS. Select a format from the drop-down list. If you select NONE or leave this parameter unset, the data is not compressed.
    Example Value: NONE

  2. Click Save and Run to save and run the job.
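
If SEQUENCE_FILE was selected as the file format, the exported files are Hadoop sequence files rather than plain text, and their headers can be inspected once the job has completed (next step). The following is a minimal sketch, not part of the Loader procedure; the output path /user/test is the example value above, and the file name part-m-00000 is hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;

public class InspectSequenceFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();          // picks up core-site.xml/hdfs-site.xml
        Path file = new Path("/user/test/part-m-00000");   // hypothetical exported file name
        try (SequenceFile.Reader reader =
                 new SequenceFile.Reader(conf, SequenceFile.Reader.file(file))) {
            // Print the key/value classes and compression type recorded in the file header.
            System.out.println("key class:   " + reader.getKeyClassName());
            System.out.println("value class: " + reader.getValueClassName());
            System.out.println("compression: " + reader.getCompressionType());
        }
    }
}
```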

View the job execution result.

  1. Go to the Loader web UI. When Status is Succeeded, the job is complete.

    Figure 3 Viewing job details
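
After the job succeeds, the exported files can also be verified directly in HDFS. Below is a minimal sketch for a TEXT_FILE export, assuming the example output path /user/test; the file name part-m-00000 is hypothetical, and the actual names depend on the job.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckExportOutput {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();  // picks up core-site.xml/hdfs-site.xml
        try (FileSystem fs = FileSystem.get(conf)) {
            // List every file written under the example output path.
            for (FileStatus status : fs.listStatus(new Path("/user/test"))) {
                System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
            }
            // Print the first few lines of one exported text file (hypothetical name).
            Path sample = new Path("/user/test/part-m-00000");
            if (fs.exists(sample)) {
                try (BufferedReader reader = new BufferedReader(
                        new InputStreamReader(fs.open(sample), StandardCharsets.UTF_8))) {
                    for (int i = 0; i < 5; i++) {
                        String line = reader.readLine();
                        if (line == null) {
                            break;
                        }
                        System.out.println(line);
                    }
                }
            }
        }
    }
}
```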
