Querying a Dumping Task
Function
This API is used to query a dumping task.
Debugging
You can use API Explorer to debug this API.
URI
GET /v2/{project_id}/connectors/{connector_id}/sink-tasks/{task_id}
Path Parameters

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| project_id | Yes | String | Tenant's project ID. |
| connector_id | Yes | String | Instance dump ID. The value can be obtained from the response of the API used to query an instance. |
| task_id | Yes | String | Dumping task ID. |
Query Parameters

| Parameter | Mandatory | Type | Description |
|---|---|---|---|
| topic-info | No | String | Whether to include topic information in the response. The default value is false. |
Request Parameters
None
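As a minimal sketch, the request URL can be assembled as follows. The endpoint, project ID, connector ID, and task ID below are placeholders, and the helper function name is illustrative, not part of the API:

```python
# Build the GET URL for querying a dumping task.
# topic-info is the only query parameter; its default is false.
from urllib.parse import urlencode

def build_sink_task_url(endpoint, project_id, connector_id, task_id,
                        topic_info=False):
    """Assemble /v2/{project_id}/connectors/{connector_id}/sink-tasks/{task_id}."""
    url = (f"{endpoint}/v2/{project_id}/connectors/{connector_id}"
           f"/sink-tasks/{task_id}")
    if topic_info:
        url += "?" + urlencode({"topic-info": "true"})
    return url

# Placeholder values for illustration only.
print(build_sink_task_url("https://dms.example.com", "my-project",
                          "my-connector", "my-task", topic_info=True))
```

The resulting URL can then be sent with any HTTP client, supplying the usual authentication headers for your account.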
Response Parameters
Status code: 200
| Parameter | Type | Description |
|---|---|---|
| task_name | String | Name of the dumping task. |
| destination_type | String | Type of the dumping task. |
| create_time | Long | Time when the dumping task is created. |
| status | String | Dumping task status. |
| topics | String | Topic list or topic regular expression of the dumping task. |
| obs_destination_descriptor | obs_destination_descriptor object | Description of the dump. |
| topics_info | Array of topics_info objects | Topic information. |
obs_destination_descriptor

| Parameter | Type | Description |
|---|---|---|
| consumer_strategy | String | Message consumption policy. The value can be latest or earliest. The default value is latest. |
| destination_file_type | String | Dump file format. Currently, only TEXT is supported. |
| obs_bucket_name | String | Name of the OBS bucket used to store the data. |
| obs_path | String | OBS path. |
| partition_format | String | Directory structure of the object file written into OBS, in the format yyyy/MM/dd/HH/mm (the time at which the dump task was created). Default value: empty. After the data is dumped successfully, the storage directory structure is obs_bucket_path/file_prefix/partition_format. The default time zone is GMT+08:00. |
| record_delimiter | String | Delimiter for the dump file, used to separate the user data written into the dump file. Default value: newline (\n). |
| deliver_time_interval | Integer | User-defined interval, in seconds, at which data is imported into OBS. If no data is pushed during an interval, no dump file package is generated for it. Value range: 30 to 900. Default value: 300. NOTE: This parameter is mandatory if streaming data is dumped to OBS. |
| obs_part_size | Long | Size (in bytes) of each file to be uploaded. Default value: 5242880. |
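The partition_format row above can be made concrete with a small sketch that renders the yyyy/MM/dd/HH/mm directory from a millisecond timestamp in GMT+08:00. The helper name and its arguments are illustrative, not part of the API:

```python
# Sketch of how the OBS directory obs_path/partition_format is derived,
# assuming the default pattern yyyy/MM/dd/HH/mm and the GMT+08:00 zone
# described in the table above.
from datetime import datetime, timezone, timedelta

GMT8 = timezone(timedelta(hours=8))

def obs_directory(obs_path, epoch_ms, partition_format="yyyy/MM/dd/HH/mm"):
    t = datetime.fromtimestamp(epoch_ms / 1000, tz=GMT8)
    # Map the Java-style pattern letters onto strftime equivalents.
    rendered = (partition_format
                .replace("yyyy", t.strftime("%Y"))
                .replace("MM", t.strftime("%m"))
                .replace("dd", t.strftime("%d"))
                .replace("HH", t.strftime("%H"))
                .replace("mm", t.strftime("%M")))
    return f"{obs_path}/{rendered}"

# create_time from the example response, in milliseconds.
print(obs_directory("obsTransfer-56997523", 1628126621283))
```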
topics_info

| Parameter | Type | Description |
|---|---|---|
| topic | String | Topic name. |
| partitions | Array of partitions objects | Partition list. |
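The topics_info structure can be post-processed to check task health; a minimal sketch, using the field names that appear in the example response below (the helper names themselves are illustrative, not part of the API):

```python
# Interpret topics_info: each topic carries a list of partitions whose
# offsets and lag are returned as strings, so convert before comparing.
def total_lag(topics_info):
    """Sum the remaining lag across all partitions."""
    return sum(int(p["lag"])
               for t in topics_info
               for p in t["partitions"])

def is_caught_up(topics_info):
    """True when every partition is RUNNING with zero lag."""
    return all(p["status"] == "RUNNING" and int(p["lag"]) == 0
               for t in topics_info
               for p in t["partitions"])

sample = [{"topic": "topic-sdk-no-delete", "partitions": [
    {"partition_id": "0", "status": "RUNNING",
     "last_transfer_offset": "3", "log_end_offset": "3", "lag": "0"}]}]
print(total_lag(sample), is_caught_up(sample))  # 0 True
```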
Example Requests
GET https://{endpoint}/v2/{project_id}/connectors/{connector_id}/sink-tasks/{task_id}?topic-info=true

Example Responses
Status code: 200
The dumping task is queried successfully.
{
"task_name" : "obsTransfer-56997523",
"destination_type" : "OBS",
"create_time" : 1628126621283,
"status" : "RUNNING",
"topics" : "topic-sdk-no-delete",
"obs_destination_descriptor" : {
"consumer_strategy" : "earliest",
"destination_file_type" : "TEXT",
"obs_bucket_name" : "testobs",
"obs_path" : "obsTransfer-56997523",
"partition_format" : "yyyy/MM/dd/HH/mm",
"record_delimiter" : "\n",
"deliver_time_interval" : 300,
"obs_part_size" : 5242880,
"flush_size" : 1000000,
"connector_class" : "com.huawei.dms.connector.obs.OBSSinkConnector",
"storage_class" : "com.huawei.dms.connector.obs.storage.OBSStorage",
"format_class" : "com.huawei.dms.connector.obs.format.bytearray.ByteArrayFormat",
"schema_generator_class" : "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
"partitioner_class" : "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
"value_converter" : "org.apache.kafka.connect.converters.ByteArrayConverter",
"key_converter" : "org.apache.kafka.connect.converters.ByteArrayConverter"
},
"topics_info" : [ {
"topic" : "topic-sdk-no-delete",
"partitions" : [ {
"partition_id" : "2",
"status" : "RUNNING",
"last_transfer_offset" : "3",
"log_end_offset" : "3",
"lag" : "0"
}, {
"partition_id" : "1",
"status" : "RUNNING",
"last_transfer_offset" : "3",
"log_end_offset" : "3",
"lag" : "0"
}, {
"partition_id" : "0",
"status" : "RUNNING",
"last_transfer_offset" : "3",
"log_end_offset" : "3",
"lag" : "0"
} ]
} ]
}

Status Codes
| Status Code | Description |
|---|---|
| 200 | The dumping task is queried successfully. |
Error Codes
See Error Codes.