Adding CloudTable Dump Tasks

Function

This API is used to add CloudTable dump tasks.

Calling Method

For details, see Calling APIs.

URI

POST /v2/{project_id}/streams/{stream_name}/transfer-tasks

Table 1 Path Parameters

| Parameter   | Mandatory | Type   | Description                                 |
|-------------|-----------|--------|---------------------------------------------|
| project_id  | Yes       | String | Project ID                                  |
| stream_name | Yes       | String | Name of the stream. Maximum: 60 characters. |

Request Parameters

Table 2 Request header parameters

| Parameter    | Mandatory | Type   | Description |
|--------------|-----------|--------|-------------|
| X-Auth-Token | Yes       | String | User token. It can be obtained by calling the IAM API used to obtain a user token. The value of X-Subject-Token in the response header is the user token. |
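
For reference, the following is a minimal Python sketch of how such a request might be assembled and sent. The endpoint, project ID, stream name, and token values are placeholders, and the requests library is assumed to be available; the body values mirror the HBase example request later on this page.

    import requests

    # Placeholder values: substitute your DIS endpoint, project ID, stream name,
    # and a valid IAM user token (X-Subject-Token from the IAM token API).
    endpoint = "https://dis.example.com"
    project_id = "your-project-id"
    stream_name = "your-stream-name"
    token = "your-iam-token"

    url = f"{endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks"
    headers = {"X-Auth-Token": token, "Content-Type": "application/json"}

    body = {
        "destination_type": "CLOUDTABLE",
        "cloudtable_destination_descriptor": {
            # See Table 4 for the full parameter list.
            "task_name": "hbasetask",
            "agency_name": "dis_admin_agency",
            "cloudtable_cluster_name": "cloudtablecluster",
            "cloudtable_cluster_id": "b8c095e2-db5f-4732-8a1d-eacd662e35dc",
            "cloudtable_table_name": "cloudtabletable",
            "cloudtable_schema": {
                "row_key": [{"value": "datavalue", "type": "String"}],
                "columns": [{
                    "column_family_name": "cfname1",
                    "column_name": "ID",
                    "value": "datavalue1",
                    "type": "String"
                }]
            }
        }
    }

    response = requests.post(url, headers=headers, json=body)
    print(response.status_code)  # 201 indicates a normal response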

Table 3 Request body parameters

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| destination_type | Yes | String | Type of the dump task. OBS: data is dumped to OBS. MRS: data is dumped to MRS. DLI: data is dumped to DLI. CLOUDTABLE: data is dumped to CloudTable. DWS: data is dumped to DWS. Default: NOWHERE. Enumeration value for this API: CLOUDTABLE. |
| cloudtable_destination_descriptor | No | CloudtableDestinationDescriptorRequest object | Parameter list of the CloudTable destination to which data in the DIS stream will be dumped. See Table 4. |

Table 4 CloudtableDestinationDescriptorRequest

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| task_name | Yes | String | Name of the dump task. The task name consists of letters, digits, hyphens (-), and underscores (_) and contains 1 to 64 characters. |
| agency_name | Yes | String | Name of the agency created on IAM. DIS uses an agency to access your specified resources. Agency settings: Agency Type is Cloud service, Cloud Service is DIS, Validity Period is Unlimited, and the policy is Tenant Administrator on the OBS project in the Global service region. If agencies are available, you can use an IAM API to obtain them. This parameter cannot be left unspecified, and the value cannot exceed 64 characters. If dump tasks are created on the console, the system displays a message indicating that an agency will be automatically created; the automatically created agency is named dis_admin_agency. Maximum: 64 characters. |
| deliver_time_interval | Yes | Integer | User-defined interval at which data is imported from the current DIS stream into OBS. If no data is pushed to the DIS stream during the current interval, no dump file package will be generated. Unit: second. Minimum: 30. Maximum: 900. Default: 300. |
| consumer_strategy | No | String | Offset. LATEST: maximum offset, indicating that the latest data will be extracted. TRIM_HORIZON: minimum offset, indicating that the earliest data will be extracted. Default: LATEST. Enumeration values: LATEST, TRIM_HORIZON. |
| cloudtable_cluster_name | Yes | String | Name of the CloudTable cluster to which data will be dumped. If you choose to dump data to OpenTSDB, OpenTSDB must be enabled for the cluster. |
| cloudtable_cluster_id | Yes | String | ID of the CloudTable cluster to which data will be dumped. If you choose to dump data to OpenTSDB, OpenTSDB must be enabled for the cluster. |
| cloudtable_table_name | No | String | HBase table name of the CloudTable cluster to which data will be dumped. This parameter is mandatory when data is dumped to CloudTable HBase. |
| cloudtable_schema | No | CloudtableSchema object | Schema configuration of the CloudTable HBase data. You can set either this parameter or opentsdb_schema, but this parameter is mandatory when data will be dumped to HBase. After this parameter is set, the JSON data in the stream can be converted to another format and then imported to CloudTable HBase. |
| opentsdb_schema | No | Array of OpenTSDBSchema objects | Schema configuration of the CloudTable OpenTSDB data. You can set either this parameter or cloudtable_schema, but this parameter is mandatory when data will be dumped to OpenTSDB. After this parameter is set, the JSON data in the stream can be converted to another format and then imported to CloudTable OpenTSDB. |
| cloudtable_row_key_delimiter | No | String | Delimiter used to separate the user data that generates HBase row keys. Value range: comma (,), period (.), vertical bar, semicolon (;), hyphen (-), underscore (_), and tilde (~). Default value: period (.). |
| obs_backup_bucket_path | No | String | Name of the OBS bucket used to back up data that failed to be dumped to CloudTable. |
| backup_file_prefix | No | String | Self-defined directory created in the OBS bucket and used to back up data that failed to be dumped to CloudTable. Directory levels are separated by slashes (/), and the value cannot start with a slash. Value range: a string of letters, digits, and underscores (_). Maximum length: 50 characters. This parameter is left blank by default. |
| retry_duration | No | String | Time duration for DIS to retry if data fails to be dumped to CloudTable. If the duration is exceeded but the dump still fails, the data will be backed up to "OBS bucket name/backup_file_prefix/cloudtable_error" or "OBS bucket name/backup_file_prefix/opentsdb_error". Value range: 0–7200. Unit: second. Default value: 1800. |

Table 5 CloudtableSchema

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| row_key | Yes | Array of RowKey objects | HBase rowkey schema used by the CloudTable cluster to convert JSON data into HBase rowkeys. Value range: 1–64. |
| columns | Yes | Array of Column objects | HBase column schema used by the CloudTable cluster to convert JSON data into HBase columns. Value range: 1–4096. |

Table 6 RowKey

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| value | Yes | String | JSON attribute name, which is used to generate HBase rowkeys for JSON data in the DIS stream. |
| type | Yes | String | JSON attribute type of JSON data in the DIS stream. Enumeration values: Bigint, Double, Boolean, Timestamp, String, Decimal. |

Table 7 Column

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| column_family_name | Yes | String | Name of the HBase column family to which data will be dumped. |
| column_name | Yes | String | Name of the HBase column to which data will be dumped. Value range: 1–32 characters. The value can contain only letters, digits, and underscores (_). |
| value | Yes | String | JSON attribute name, which is used to generate HBase column values for JSON data in the DIS stream. |
| type | Yes | String | JSON attribute type of JSON data in the DIS stream. Enumeration values: Bigint, Double, Boolean, Timestamp, String, Decimal. |
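
To make the HBase schema concrete, here is a rough sketch of how a stream record could be mapped under the cloudtable_schema used in the HBase example request below. The record contents and the join-by-delimiter reading of the row key are illustrative assumptions inferred from the parameter descriptions above, not documented API behavior.

    # Illustrative sketch only: inferred from the row_key, columns, and
    # cloudtable_row_key_delimiter descriptions above; not normative behavior.
    record = {"datavalue": "user-001", "datavalue1": "1", "datavalue2": "42"}

    row_key_attrs = ["datavalue"]   # from cloudtable_schema.row_key[*].value
    delimiter = "|"                 # from cloudtable_row_key_delimiter

    # Row key: values of the listed JSON attributes joined by the delimiter.
    hbase_row_key = delimiter.join(str(record[a]) for a in row_key_attrs)

    # Columns: each Column entry writes record[value] into family:qualifier.
    hbase_cells = {
        "cfname1:ID": record["datavalue1"],
        "cfname2:VALUE": record["datavalue2"],
    }

    print(hbase_row_key)   # user-001
    print(hbase_cells)     # {'cfname1:ID': '1', 'cfname2:VALUE': '42'}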

Table 8 OpenTSDBSchema

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| metric | Yes | Array of OpenTSDBMetric objects | Schema configuration of the OpenTSDB data metric in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the metric of the OpenTSDB data. |
| timestamp | Yes | OpenTSDBTimestamp object | Schema configuration of the OpenTSDB data timestamp in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the timestamp of the OpenTSDB data. |
| value | Yes | OpenTSDBValue object | Schema configuration of the OpenTSDB data value in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the value of the OpenTSDB data. |
| tags | Yes | Array of OpenTSDBTags objects | Schema configuration of the OpenTSDB data tags in the CloudTable cluster. After this parameter is set, the JSON data in the stream can be converted to the tags of the OpenTSDB data. |

Table 9 OpenTSDBMetric

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| type | Yes | String | Constant: the value of metric is the value of the value field. String: the value of metric is the value of the named JSON attribute of the user data in the stream. Enumeration values: Constant, String. |
| value | Yes | String | Constant or JSON attribute name of the user data in the stream. The value contains 1 to 32 characters. Only letters, digits, and periods (.) are allowed. |

Table 10 OpenTSDBTimestamp

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| type | Yes | String | Timestamp: the value type of the JSON attribute of the user data in the stream is Timestamp, and the OpenTSDB timestamp can be generated without converting the data format. String: the value type of the JSON attribute of the user data in the stream is Date, and the OpenTSDB timestamp can be generated only after the data format is converted. |
| value | Yes | String | JSON attribute name of the user data in the stream. Value range: 1–32 characters. The value can contain only letters, digits, and underscores (_). |
| format | Yes | String | Mandatory when type is set to String. When the value type of the JSON attribute of the user data in the stream is Date, format is required to convert the data format and generate the OpenTSDB timestamp. Enumeration values: yyyy/MM/dd HH:mm:ss, MM/dd/yyyy HH:mm:ss, dd/MM/yyyy HH:mm:ss, yyyy-MM-dd HH:mm:ss, MM-dd-yyyy HH:mm:ss, dd-MM-yyyy HH:mm:ss. |

Table 11 OpenTSDBValue

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| type | Yes | String | Type name of the JSON attribute of the user data in the stream. Options: Bigint, Double, Boolean, Timestamp, String, Decimal. |
| value | Yes | String | Constant or JSON attribute name of the user data in the stream. Value range: 1–32 characters. The value can contain only letters, digits, and underscores (_). |

Table 12 OpenTSDBTags

| Parameter | Mandatory | Type | Description |
|-----------|-----------|------|-------------|
| name | Yes | String | Tag name of the OpenTSDB data that stores the data in the stream. Value range: 1–32 characters. The value can contain only letters, digits, and underscores (_). |
| type | Yes | String | Type name of the JSON attribute of the user data in the stream. Options: Bigint, Double, Boolean, Timestamp, String, Decimal. |
| value | Yes | String | Constant or JSON attribute name of the user data in the stream. Value range: 1–32 characters. The value can contain only letters, digits, and underscores (_). |
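
To tie Tables 8–12 together, the sketch below shows one plausible reading of how a stream record could be turned into an OpenTSDB data point under the opentsdb_schema used in the OpenTSDB example request below. The record contents and the resulting data point are illustrative assumptions, not documented API output.

    from datetime import datetime

    # Illustrative sketch only: inferred from the OpenTSDBSchema descriptions
    # above and the OpenTSDB example request below; not normative behavior.
    record = {"date": "2024/01/31 08:15:00", "value": 73, "name": 1}

    # metric: type "Constant", so the metric name is the literal string "age".
    metric = "age"

    # timestamp: type "String" with format "yyyy/MM/dd HH:mm:ss", so the "date"
    # attribute is parsed into a time value ("%Y/%m/%d %H:%M:%S" in Python).
    ts = datetime.strptime(record["date"], "%Y/%m/%d %H:%M:%S")

    # value and tags: taken from the "value" and "name" attributes (Bigint).
    data_point = {
        "metric": metric,
        "timestamp": int(ts.timestamp()),
        "value": record["value"],
        "tags": {"name": record["name"]},
    }
    print(data_point)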

Response Parameters

None

Example Requests

  • Adding CloudTable HBase Dump Tasks

    POST https://{Endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks
    
    {
      "destination_type" : "CLOUDTABLE",
      "cloudtable_destination_descriptor" : {
        "task_name" : "hbasetask",
        "consumer_strategy" : "TRIM_HORIZON",
        "agency_name" : "dis_admin_agency",
        "cloudtable_cluster_name" : "cloudtablecluster",
        "cloudtable_cluster_id" : "b8c095e2-db5f-4732-8a1d-eacd662e35dc",
        "cloudtable_table_name" : "cloudtabletable",
        "cloudtable_row_key_delimiter" : "|",
        "retry_duration" : 1800,
        "obs_backup_bucket_path" : "obsbackupbucket",
        "backup_file_prefix" : "",
        "cloudtable_schema" : {
          "row_key" : [ {
            "value" : "datavalue",
            "type" : "String"
          } ],
          "columns" : [ {
            "column_family_name" : "cfname1",
            "column_name" : "ID",
            "value" : "datavalue1",
            "type" : "String"
          }, {
            "column_family_name" : "cfname2",
            "column_name" : "VALUE",
            "value" : "datavalue2",
            "type" : "String"
          } ]
        }
      }
    }
  • Adding CloudTable OpenTSDB Dump Tasks

    POST https://{Endpoint}/v2/{project_id}/streams/{stream_name}/transfer-tasks
    
    {
      "destination_type" : "CLOUDTABLE",
      "cloudtable_destination_descriptor" : {
        "task_name" : "opentsdbtask",
        "consumer_strategy" : "LATEST",
        "agency_name" : "dis_admin_agency",
        "cloudtable_cluster_name" : "cloudtablecluster",
        "cloudtable_cluster_id" : "b8c095e2-db5f-4732-8a1d-eacd662e35dc",
        "retry_duration" : 1800,
        "obs_backup_bucket_path" : "obsbackupbucket",
        "backup_file_prefix" : "",
        "opentsdb_schema" : [ {
          "metric" : [ {
            "type" : "Constant",
            "value" : "age"
          } ],
          "timestamp" : {
            "value" : "date",
            "type" : "String",
            "format" : "yyyy/MM/dd HH:mm:ss"
          },
          "value" : {
            "value" : "value",
            "type" : "Bigint"
          },
          "tags" : [ {
            "name" : "name",
            "value" : "name",
            "type" : "Bigint"
          } ]
        } ]
      }
    }

Example Responses

None

Status Codes

| Status Code | Description     |
|-------------|-----------------|
| 201         | Normal response |

Error Codes

See Error Codes.
