Adding DWS Dump Tasks

Function

This API is used to add DWS dump tasks.

Calling Method

For details, see Calling APIs.

URI

POST /v2/{project_id}/streams/{stream_name}/transfer-tasks

Table 1 Path Parameters

  project_id (String, mandatory): Project ID.

  stream_name (String, mandatory): Name of the stream. Maximum length: 60 characters.

Request Parameters

Table 2 Request header parameters

  X-Auth-Token (String, mandatory): User token. It can be obtained by calling the IAM API used to obtain a user token; the value of X-Subject-Token in the response header is the user token.

Table 3 Request body parameters

  destination_type (String, mandatory): Type of the dump task.
    • OBS: Data is dumped to OBS.
    • MRS: Data is dumped to MRS.
    • DLI: Data is dumped to DLI.
    • CLOUDTABLE: Data is dumped to CloudTable.
    • DWS: Data is dumped to DWS.
  Default: NOWHERE. Enumeration value for this API: DWS.

  dws_destination_descriptor (DWSDestinationDescriptorRequest object, optional): Parameter list of the DWS database to which data in the DIS stream will be dumped. See Table 4.

Table 4 DWSDestinationDescriptorRequest

  task_name (String, mandatory): Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_), and contains 1 to 64 characters.

  agency_name (String, mandatory): Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency parameter settings:
    • Agency Type: Cloud service
    • Cloud Service: DIS
    • Validity Period: Unlimited
    • Policy: Tenant Administrator on the OBS project in the Global service region
  If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and its value cannot exceed 64 characters. If there are dump tasks on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum length: 64 characters.

  deliver_time_interval (Integer, mandatory): User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during an interval, no dump file package is generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300.

  consumer_strategy (String, optional): Offset from which data is extracted.
    • LATEST: maximum offset, indicating that the latest data will be extracted.
    • TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted.
  Default: LATEST. Enumeration values: LATEST, TRIM_HORIZON.

  dws_cluster_name (String, mandatory): Name of the DWS cluster that stores the data in the stream.

  dws_cluster_id (String, mandatory): ID of the DWS cluster to which data will be dumped.

  dws_database_name (String, mandatory): Name of the DWS database that stores the data in the stream.

  dws_schema (String, mandatory): Schema of the DWS database to which data will be dumped.

  dws_table_name (String, mandatory): Name of the DWS table that stores the data in the stream.

  dws_delimiter (String, mandatory): Delimiter used to separate the columns in the DWS tables into which user data is inserted. The delimiter can be a comma (,), semicolon (;), or vertical bar (|).

  user_name (String, mandatory): Username of the DWS database to which data will be dumped.

  user_password (String, mandatory): Password of the DWS database to which data will be dumped.

  kms_user_key_name (String, mandatory): Name of the key created in KMS and used to encrypt the password of the DWS database.

  kms_user_key_id (String, mandatory): ID of the key created in KMS and used to encrypt the password of the DWS database.

  obs_bucket_path (String, mandatory): Name of the OBS bucket used to temporarily store data in the DIS stream.

  file_prefix (String, optional): Self-defined directory created in the OBS bucket and used to temporarily store data in the DIS stream. Directory levels are separated by slashes (/), and the value cannot start with a slash. The value can contain a maximum of 50 characters, including letters, digits, underscores (_), and slashes (/). This parameter is left blank by default.

  retry_duration (String, optional): Duration for which DIS keeps retrying a failed dump to DWS. If the duration expires and the dump still fails, the data is backed up to the <OBS bucket name>/<file_prefix>/dws_error directory. Value range: 0 to 7200. Unit: second. Default: 1800.

  dws_table_columns (String, optional): Columns to be dumped to the DWS table. If the value is null or empty, all columns are dumped by default. For example, c1,c2 indicates that columns c1 and c2 in the schema are dumped to DWS. This parameter is left blank by default.

  options (Options object, optional): DWS fault tolerance options (used to specify parameters of foreign table data). See Table 5.
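The formatting constraints in Table 4 can be checked client-side before calling the API. The sketch below is illustrative and not part of any DIS SDK; note that the example request later in this page passes an empty dws_delimiter, so the delimiter rule may be looser in practice than the table states.

```python
import re

def validate_dws_descriptor(task_name: str, dws_delimiter: str,
                            deliver_time_interval: int) -> list:
    """Check a few Table 4 constraints locally; returns a list of problems."""
    problems = []
    # task_name: 1 to 64 characters of letters, digits, hyphens, underscores
    if not re.fullmatch(r"[A-Za-z0-9_-]{1,64}", task_name):
        problems.append("task_name must be 1-64 letters, digits, '-' or '_'")
    # dws_delimiter: comma, semicolon, or vertical bar per Table 4
    if dws_delimiter not in (",", ";", "|"):
        problems.append("dws_delimiter must be ',', ';' or '|'")
    # deliver_time_interval: 30 to 900 seconds
    if not 30 <= deliver_time_interval <= 900:
        problems.append("deliver_time_interval must be 30-900 seconds")
    return problems
```

Rejecting a malformed descriptor locally avoids a round trip that would fail server-side validation.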

Table 5 Options

  fill_missing_fields (String, optional): Whether, when the last field in a row of the data source file is missing during database import, to set the field to Null or to record an error in the error table. Options: true/on, false/off. Default: false/off.

  ignore_extra_data (String, optional): Whether to ignore extra columns when the number of fields in the data source file is greater than the number of columns defined in the external table. This parameter is used only during data import. Options: true/on, false/off. Default: false/off.

  compatible_illegal_chars (String, optional): Whether to tolerate invalid characters during data import, either converting them based on the conversion rule and importing them to the database, or reporting an error and stopping the import. Options: true/on, false/off. Default: false/off.

  reject_limit (String, optional): Maximum number of data format errors allowed during the data import. If the number of errors stays within this limit, the import succeeds. Options: an integer value, or unlimited. Default: 0, meaning an error is returned as soon as a data format error occurs.

  error_table_name (String, optional): Name of the error table that records data format errors. After the parallel import is complete, you can query this table for detailed error information.

Response Parameters

None

Example Requests

Adding DWS Dump Tasks

POST https://{Endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks

{
  "destination_type" : "DWS",
  "dws_destination_descriptor" : {
    "task_name" : "dwstask",
    "consumer_strategy" : "LATEST",
    "agency_name" : "dis_admin_agency",
    "dws_cluster_name" : "dwscluster",
    "dws_cluster_id" : "f82dc227-3691-47eb-bca7-e7851f509b2a",
    "dws_database_name" : "postgres",
    "dws_schema" : "dbadmin",
    "dws_table_name" : "dwstablename",
    "dws_delimiter" : "",
    "user_name" : "dbadmin",
    "user_password" : "userpassword",
    "kms_user_key_name" : "kmskey",
    "kms_user_key_id" : "1e759f06-9188-4d21-afab-a75e57c04d2b",
    "obs_bucket_path" : "obsbucket",
    "file_prefix" : "",
    "deliver_time_interval" : 60,
    "retry_duration" : 1800,
    "options" : {
      "fill_missing_fields" : "false",
      "ignore_extra_data" : "false",
      "compatible_illegal_chars" : "false"
    }
  }
}
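The request above can be issued with any HTTP client. A minimal Python sketch using only the standard library is shown below; the endpoint, project ID, stream name, token, and descriptor values are placeholders you must supply, and the helper function is illustrative rather than part of any DIS SDK.

```python
import json
import urllib.request

def build_dump_task_request(endpoint: str, project_id: str, stream_name: str,
                            token: str, descriptor: dict):
    """Assemble the URL, headers, and JSON body for the transfer-tasks call."""
    url = f"{endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks"
    headers = {"Content-Type": "application/json", "X-Auth-Token": token}
    body = {"destination_type": "DWS",
            "dws_destination_descriptor": descriptor}
    return url, headers, json.dumps(body)

# Sending the request (requires a real endpoint, project, and token):
# url, headers, data = build_dump_task_request(endpoint, project_id,
#                                              stream_name, token, descriptor)
# req = urllib.request.Request(url, data=data.encode("utf-8"),
#                              headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)  # 201 indicates a normal response
```

Separating request construction from sending makes the payload easy to inspect or log before credentials are involved.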

Example Responses

None

Status Codes

  201: Normal response

Error Codes

See Error Codes.
