Importing Table Data from DLI to a GaussDB(DWS) Cluster

Updated on 2024-11-08 GMT+08:00

This exercise demonstrates how to use the GaussDB(DWS) foreign table function to import data from DLI to GaussDB(DWS).

For details about DLI, see What Is Data Lake Insight?

This exercise takes approximately 60 minutes and uses several cloud services, including Virtual Private Cloud (VPC) and subnet, Data Lake Insight (DLI), Object Storage Service (OBS), and GaussDB(DWS). The following is an outline of the exercise.

  1. Preparations
  2. Step 1: Preparing DLI Source Data
  3. Step 2: Creating a GaussDB(DWS) Cluster
  4. Step 3: Obtaining Authentication Information Required by the GaussDB(DWS) External Server
  5. Step 4: Importing DLI Table Data Using a Foreign Table

Preparations

  • You have registered a Huawei ID and enabled Huawei Cloud services. The account cannot be in arrears or frozen.
  • You have created a VPC and subnet. For details, see Creating a VPC.
  • You have obtained the AK and SK of your Huawei account. For details, see Access Keys.

Step 1: Preparing DLI Source Data

  1. Create a DLI elastic resource pool and queue.

    1. Log in to the Huawei Cloud console and choose Analytics > Data Lake Insight from the service list. The DLI console is displayed.
    2. In the navigation pane on the left, choose Resources > Resource Pool.
    3. Click Buy Resource Pool in the upper right corner, set the following parameters, and retain the default values for other parameters that are not described in the table.
      Table 1 DLI elastic resource pool parameters

      Parameter       Value
      --------------  -------------
      Billing Mode    Pay-per-use
      Region          CN-Hong Kong
      Name            dli_dws
      Specifications  Standard
      CIDR Block      172.16.0.0/18

    4. Click Buy and click Submit.

      After the resource pool is created, go to the next step.

    5. On the elastic resource pool page, locate the row that contains the created resource pool, click Add Queue in the Operation column, and set the following parameters. Retain the default values for other parameters that are not described in the table.
      Table 2 Adding a queue

      Parameter  Value
      ---------  -------
      Name       dli_dws
      Type       For SQL

    6. Click Next and click OK. The queue is created.

  2. Upload the source data to the OBS bucket.

    1. Create an OBS bucket with a custom name, for example, dli-obs01 (if this name is already taken, use another name, such as dli-obs02). Set the bucket region to CN-Hong Kong.
    2. Download the data sample file.
    3. Create a folder dli_order in the OBS bucket and upload the downloaded data file to that folder.

  3. Go back to the DLI management console. In the navigation pane, click SQL Editor. Select dli_dws for Queue and Default for Database. Run the following command to create a database named dli_data:

    CREATE DATABASE dli_data;
    

  4. Create a table.

    NOTE:

    LOCATION specifies the OBS directory where the data file is stored, in the format obs://OBS bucket name/folder name. In this example, the directory is obs://dli-obs01/dli_order. If you used a different bucket or folder name, adjust the path accordingly.

    CREATE EXTERNAL TABLE dli_data.dli_order
         ( order_id      VARCHAR(12),
           order_channel VARCHAR(32),
           order_time    TIMESTAMP,
           cust_code     VARCHAR(6),
           pay_amount    DOUBLE,
           real_pay      DOUBLE ) 
    STORED AS parquet
    LOCATION 'obs://dli-obs01/dli_order';
    

  5. Run the following statement to query the data:

    SELECT * FROM dli_data.dli_order;
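
    Optionally, you can also confirm the table definition and the number of rows loaded from OBS before moving on. This is a quick sanity check in the same SQL editor; it assumes the DLI SQL queue runs standard Spark SQL, which supports DESCRIBE:

    -- Check that the column definitions match the source file layout.
    DESCRIBE dli_data.dli_order;

    -- Check that the Parquet files under obs://dli-obs01/dli_order were picked up.
    SELECT COUNT(*) AS row_count FROM dli_data.dli_order;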
    

Step 2: Creating a GaussDB(DWS) Cluster

  1. Create a cluster. To ensure network connectivity, set the region of the GaussDB(DWS) cluster to CN-Hong Kong.

Step 3: Obtaining Authentication Information Required by the GaussDB(DWS) External Server

  1. Obtain the endpoint of the OBS bucket.

    1. Log in to the OBS management console.
    2. Click the bucket name, choose Overview on the left, and record the endpoint.

  2. Visit Endpoints to obtain the endpoint of DLI.

    In this example (CN-Hong Kong), the endpoint is dli.ap-southeast-1.myhuaweicloud.com. If your resources are in the EU-Dublin region instead, the endpoint is dli.eu-west-101.myhuaweicloud.com.

  3. Obtain the project ID for the specific region of the account used to create DLI.

    1. Move the cursor to the account name in the upper right corner and click My Credentials.
    2. Choose API Credentials on the left.
    3. In the list, find the region where the DLI instance is deployed, for example, CN-Hong Kong, and record the project ID corresponding to the region name.

  4. Obtain the AK and SK of your account. For details, see Preparations.

Step 4: Importing DLI Table Data Using a Foreign Table

  1. Log in to the GaussDB(DWS) database as system administrator dbadmin. By default, you log in to the gaussdb database.
  2. Run the following SQL statement to create a foreign server. The OBS endpoint is the value obtained in 1, the AK and SK are the values obtained in Preparations, and the DLI endpoint is the value obtained in 2.

    NOTE:

    If the GaussDB(DWS) cluster and the DLI resources are created under the same account, enter the same AK and SK in both places.

    CREATE SERVER dli_server FOREIGN DATA WRAPPER DFS_FDW OPTIONS (
      ADDRESS 'OBS endpoint',
      ACCESS_KEY 'AK value',
      SECRET_ACCESS_KEY 'SK value',
      TYPE 'DLI',
      DLI_ADDRESS 'DLI endpoint',
      DLI_ACCESS_KEY 'AK value',
      DLI_SECRET_ACCESS_KEY 'SK value'
    );
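
    Optionally, confirm that the server object exists. This is a minimal check that assumes GaussDB(DWS) exposes the standard PostgreSQL pg_foreign_server catalog (GaussDB(DWS) is PostgreSQL-based); adjust it if your cluster restricts catalog access:

    -- The query should return one row named dli_server.
    SELECT srvname FROM pg_foreign_server WHERE srvname = 'dli_server';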
    

  3. Run the following SQL statement to create a target schema:

    CREATE SCHEMA dws_data;
    

  4. Run the following SQL statement to create a foreign table. Replace Project ID with the actual value obtained in 3.

    CREATE FOREIGN TABLE dws_data.dli_pq_order (
      order_id VARCHAR(14) PRIMARY KEY NOT ENFORCED,
      order_channel VARCHAR(32),
      order_time TIMESTAMP,
      cust_code VARCHAR(6),
      pay_amount DOUBLE PRECISION,
      real_pay DOUBLE PRECISION
    )
    SERVER dli_server
    OPTIONS (
      FORMAT 'parquet',
      ENCODING 'utf8',
      DLI_PROJECT_ID 'Project ID',
      DLI_DATABASE_NAME 'dli_data',
      DLI_TABLE_NAME 'dli_order')
    DISTRIBUTE BY roundrobin;
    

  5. Run the following SQL statement to query the DLI table data through the foreign table. If the query returns data, the DLI table data has been accessed successfully.
    SELECT * FROM dws_data.dli_pq_order;
    

  6. Run the following SQL statement to create a local table for storing the imported DLI data:

    CREATE TABLE dws_data.dws_monthly_order
         ( order_month       CHAR(8),
           cust_code         VARCHAR(6),
           order_count       INT,
           total_pay_amount  DOUBLE PRECISION,
           total_real_pay    DOUBLE PRECISION );
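
    The order_month column is CHAR(8) because the next step groups by TO_CHAR(order_time, 'MON-YYYY'), which produces eight-character values such as JAN-2023. If you want to preview the format, run a quick query with an arbitrary sample timestamp (the timestamp below is only an illustration):

    -- Returns JAN-2023, an 8-character month key that fits the CHAR(8) column.
    SELECT TO_CHAR(TIMESTAMP '2023-01-15 08:30:00', 'MON-YYYY') AS order_month;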
    

  7. Run the following SQL statement to query the monthly order details for 2023 and import the result into the GaussDB(DWS) table:

    INSERT INTO dws_data.dws_monthly_order
         ( order_month, cust_code, order_count     
         , total_pay_amount, total_real_pay )
    SELECT TO_CHAR(order_time, 'MON-YYYY'), cust_code, COUNT(*)
         , SUM(pay_amount), SUM(real_pay)
      FROM dws_data.dli_pq_order
     WHERE DATE_PART('Year', order_time) = 2023
    GROUP BY TO_CHAR(order_time, 'MON-YYYY'), cust_code;
    

  8. Run the following SQL statement to query the table data. If the aggregated monthly rows are returned, the DLI table data has been successfully imported to the GaussDB(DWS) database.

    SELECT * FROM dws_data.dws_monthly_order;
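
    As a final optional check, you can compare the number of orders aggregated into the local table with the number of 2023 rows visible through the foreign table. This is a sketch of a consistency check; both values should match if the import completed successfully:

    -- imported_orders and source_orders should be equal after a complete import.
    SELECT (SELECT SUM(order_count)
              FROM dws_data.dws_monthly_order)           AS imported_orders,
           (SELECT COUNT(*)
              FROM dws_data.dli_pq_order
             WHERE DATE_PART('Year', order_time) = 2023) AS source_orders;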
    
