Querying the Log Details of a Single Task

Updated on 2023-06-29 GMT+08:00

Function

This API is used to query all execution logs of a single task.

URI

GET /v2/{project_id}/fdi/instances/{instance_id}/tasks/{task_id}/monitor-logs

Table 1 Path Parameters

• project_id (String, mandatory): Project ID. For details about how to obtain the project ID, see Appendix > Obtaining a Project ID in the ROMA Connect API Reference. Minimum: 1. Maximum: 30.
• instance_id (String, mandatory): Instance ID. Minimum: 1. Maximum: 30.
• task_id (String, mandatory): Task ID. Minimum: 1. Maximum: 30.

Table 2 Query Parameters

• offset (Integer, optional): Offset from which the query starts. The value must be greater than or equal to 1. Minimum: 1. Maximum: 999999. Default: 1.
• limit (Integer, optional): Number of records displayed on each page. At most 999 records are returned; excess records are not returned. Minimum: 0. Maximum: 999999. Default: 10.
• begin_time (Integer, optional): Start time of the logs to be queried. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
• end_time (Integer, optional): End time of the logs to be queried. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
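
Because begin_time and end_time are UTC timestamps in milliseconds, a frequent mistake is passing seconds or local time. The following Python sketch (illustrative only; the helper name and the chosen time window are not part of this reference) shows one way to build the query parameters:

```python
from datetime import datetime, timedelta, timezone

def to_millis(dt: datetime) -> int:
    """Convert an aware datetime to a UTC timestamp in milliseconds."""
    return int(dt.astimezone(timezone.utc).timestamp() * 1000)

# Example: query the last 24 hours of logs, 50 records per page, starting at offset 1.
now = datetime.now(timezone.utc)
query_params = {
    "offset": 1,                                       # >= 1, default 1
    "limit": 50,                                       # default 10; at most 999 records are returned
    "begin_time": to_millis(now - timedelta(days=1)),  # millisecond timestamp, UTC
    "end_time": to_millis(now),
}
```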

Request Parameters

Table 3 Request header parameters

• X-Auth-Token (String, mandatory): User token, which can be obtained by calling the IAM API (value of X-Subject-Token in the response header).
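
To show how the token header combines with the path and query parameters, here is a hedged Python sketch using the third-party requests library; the endpoint host and all IDs are placeholders, not values from this reference:

```python
import requests  # third-party HTTP client, used here only for illustration

# Placeholder values; substitute your own project, instance, and task IDs.
endpoint = "https://roma.example-region.myhuaweicloud.com"  # assumed ROMA Connect endpoint
project_id = "<project_id>"
instance_id = "<instance_id>"
task_id = "<task_id>"
token = "<X-Subject-Token value returned by the IAM API>"

url = (f"{endpoint}/v2/{project_id}/fdi/instances/{instance_id}"
       f"/tasks/{task_id}/monitor-logs")

response = requests.get(
    url,
    headers={"X-Auth-Token": token},  # mandatory user token header
    params=query_params,              # optional offset/limit/begin_time/end_time from the sketch above
)
response.raise_for_status()
monitor_logs = response.json()
```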

Response Parameters

Status code: 200

Table 4 Response body parameters

• total (Integer): Total number of logs. Minimum: 1. Maximum: 99999.
• size (Integer): Number of logs on the current page. Minimum: 1. Maximum: 99999.
• entities (Array of TaskMonitorLog objects): Task monitoring logs on the current page. See Table 5.

Table 5 TaskMonitorLog

• id (String): Trace ID of a single task. Minimum: 10. Maximum: 40.
• start_time (Integer): Start time of the current execution. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
• dispatch_time (Integer): Scheduled execution time. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
• end_time (Integer): End time of the current execution. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
• execute_status (String): Task execution status. Minimum: 3. Maximum: 10. Enumerated values:
  • UNSTARTED
  • WAITING
  • RUNNING
  • SUCCESS
  • CANCELLED
  • ERROR
• position (String): Phase that the task execution has reached. Minimum: 3. Maximum: 10. Enumerated values:
  • ADAPTER (initialization phase)
  • READER (reader phase)
  • WRITER (writer phase)
• position_status (String): Status of the current task execution phase. Minimum: 3. Maximum: 20. Enumerated values:
  • NORMAL (currently running)
  • NODE_END (current node ends normally)
  • RUNTIME_CANCEL (task canceled)
  • TASK_END (current task ends normally)
  • RUNTIME_ERR (runtime error)
  • INTERNAL_ERR (internal program error)
• status (Integer): Detailed task execution status. The status codes are classified as follows: 100 to 499 for the reader side, 500 to 899 for the writer side, and 900 and beyond for others. Minimum: 1. Maximum: 1000. Enumerated values:
  • 16 (task forcibly canceled)
  • 99 (task starts)
  • 100 (reader task starts)
  • 101 (reader task ends)
  • 102 (reading data)
  • 103 (reading data source exception)
  • 104 (reading data ends)
  • 105 (0 data records read)
  • 106 (reader task forcibly canceled)
  • 107 (task interrupted in the reader plugin)
  • 108 (reader task resumes)
  • 500 (writer task starts)
  • 501 (writer task ends)
  • 502 (data being written)
  • 503 (destination exception)
  • 504 (data writing ends)
  • 505 (writer task forcibly canceled)
  • 506 (task interrupted in the writer plugin)
  • 507 (writer task resumed)
  • 900 (scheduling request received)
  • 901 (task execution ends)
  • 902 (task execution ends and data integrity check is being performed)
  • 903 (output of data integrity check results)
  • 904 (data loss detected after data integrity check; data compensation is being performed)
  • 905 (output of data compensation results)
  • 906 (reader task queued, waiting for platform resources)
  • 907 (reader task rejected because the last scheduling has not completed yet)
  • 908 (writer task queued, waiting for platform resources)
  • 909 (writer task rejected because the last scheduling has not completed yet)
  • 911 (reader task not started normally; check whether the network is normal and whether the parameters are correct)
  • 912 (writer task not started normally; check whether the network is normal and whether the parameters are correct)
  • 913 (task scheduling request failed)
  • 914 (task rejected because the last scheduling has not completed yet)
  • 915 (task not running properly)
  • 916 (task log reporting abnormal)
• dirty_data_count (Integer): Number of abnormal data records. Minimum: 0. Maximum: 9999999999999.
• data_count (Integer): Number of successful data records. Minimum: 0. Maximum: 9999999999999.
• data_size (Number): Size of successfully processed data. The value is a floating-point number. Minimum: 0. Maximum: 9999999999999.
• data_size_unit (String): Unit for measuring the size of successfully processed data. Minimum: 1. Maximum: 2.
• spend_time (Integer): Execution duration, in milliseconds. Minimum: 0. Maximum: 9999999999999.
• read_spend_time (Integer): Read execution duration, in milliseconds. This parameter is available only for scheduled tasks. Minimum: 0. Maximum: 9999999999999.
• write_spend_time (Integer): Write execution duration, in milliseconds. Minimum: 0. Maximum: 9999999999999.
• remarks (String): Brief information about the execution result. Minimum: 0. Maximum: 1000.
• detail_logs (Array of TaskMonitorDetailLog objects): Detailed trace information about the current execution. See Table 6.
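
As a small worked example of the status ranges described above (100 to 499 reader side, 500 to 899 writer side, 900 and beyond for others), the following sketch classifies and summarizes each log entry; it reuses the hypothetical monitor_logs variable from the request sketch and assumes the field names in Table 5:

```python
def classify_status(status: int) -> str:
    """Map a detailed status code to its documented range:
    100-499 reader side, 500-899 writer side, everything else 'other'."""
    if 100 <= status <= 499:
        return "reader"
    if 500 <= status <= 899:
        return "writer"
    return "other"

# Print a one-line summary per task execution in the response.
for log in monitor_logs.get("entities", []):
    print(f"{log['id']}: {log['execute_status']} at {log['position']} "
          f"({classify_status(log['status'])} side), "
          f"{log.get('data_count', 0)} records in {log.get('spend_time', 0)} ms, "
          f"{log.get('dirty_data_count', 0)} dirty records")
```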

Table 6 TaskMonitorDetailLog

• id (String): Unique ID generated each time a task is executed. Minimum: 10. Maximum: 40.
• status (Integer): Detailed task execution status. The status codes are classified as follows: 100 to 499 for the reader side, 500 to 899 for the writer side, and 900 and beyond for others. The enumerated values are the same as those of status in Table 5. Minimum: 1. Maximum: 1000.
• position (String): Phase to which the current step belongs. Minimum: 3. Maximum: 10. Enumerated values:
  • ADAPTER (initialization phase)
  • READER (reader phase)
  • WRITER (writer phase)
• position_status (String): Status of the current step of the task. Minimum: 3. Maximum: 20. Enumerated values:
  • NORMAL (currently running)
  • NODE_END (current node ends normally)
  • RUNTIME_CANCEL (task canceled)
  • TASK_END (current task ends normally)
  • RUNTIME_ERR (runtime error)
  • INTERNAL_ERR (internal program error)
• stage (String): FDI plug-in to which the current step belongs, for example, adapter, apireader, or rdbwriter. Minimum: 3. Maximum: 20.
• dirty_data_count (Integer): Number of abnormal data records. Minimum: 0. Maximum: 9999999999999.
• data_count (Integer): Number of successful data records. Minimum: 0. Maximum: 9999999999999.
• data_size (Number): Size of successfully processed data. The value is a floating-point number. Minimum: 0. Maximum: 9999999999999.
• data_size_unit (String): Unit for measuring the size of successfully processed data. Minimum: 1. Maximum: 2.
• spend_time (Integer): Execution duration, in milliseconds. Minimum: 0. Maximum: 9999999999999.
• remarks (String): Execution details. Minimum: 1. Maximum: 256.
• step_begin_time (Integer): Start time of this step. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
• step_end_time (Integer): End time of this step. The value is a timestamp in milliseconds (UTC). Minimum: 1. Maximum: 9999999999999.
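
A hedged sketch of how the per-step fields in Table 6 might be consumed, again reusing the hypothetical monitor_logs variable from the request sketch; the printed layout is illustrative only:

```python
# Walk the per-step detail logs of each execution and report how long each
# FDI plug-in stage took, using the fields described in Table 6.
for log in monitor_logs.get("entities", []):
    for step in log.get("detail_logs", []):
        # Assumes the step has finished, so both timestamps are present (milliseconds, UTC).
        step_ms = step["step_end_time"] - step["step_begin_time"]
        print(f"  step {step['stage']} ({step['position']}/{step['position_status']}): "
              f"{step_ms} ms wall clock, {step.get('spend_time', 0)} ms reported, "
              f"remarks: {step.get('remarks', '')}")
```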

Status code: 400

Table 7 Response body parameters

• error_code (String): Error code. Minimum: 8. Maximum: 36.
• error_msg (String): Error message. Minimum: 2. Maximum: 512.

Status code: 404

Table 8 Response body parameters

• error_code (String): Error code. Minimum: 8. Maximum: 36.
• error_msg (String): Error message. Minimum: 2. Maximum: 512.

Status code: 500

Table 9 Response body parameters

• error_code (String): Error code. Minimum: 8. Maximum: 36.
• error_msg (String): Error message. Minimum: 2. Maximum: 512.
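
For the 400, 404, and 500 responses above, the body carries error_code and error_msg. A hedged sketch of handling them, extending the earlier request sketch (requests, url, token, and query_params are the same assumed placeholders):

```python
# Send the query and branch on the documented status codes.
response = requests.get(url, headers={"X-Auth-Token": token}, params=query_params)
if response.status_code == 200:
    monitor_logs = response.json()
elif response.status_code in (400, 404, 500):
    error = response.json()
    print(f"Request failed with HTTP {response.status_code}: "
          f"{error.get('error_code')}: {error.get('error_msg')}")
else:
    # Any other status is unexpected for this API; surface it as an exception.
    response.raise_for_status()
```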

Example Requests

None

Example Responses

None

Status Codes

• 200: OK
• 400: Bad Request
• 404: Not Found
• 500: Internal Server Error

Error Codes

See Error Codes.
