Updated on 2025-11-24 GMT+08:00

Performing a Global Search

Function

This API is used to perform a global search.

Calling Method

For details, see Calling APIs.

URI

GET /v2/{project_id}/factory/search

Table 1 Path Parameters

Parameter

Mandatory

Type

Description

project_id

Yes

String

Project ID. For details about how to obtain a project ID, see Project ID and Account ID.

Table 2 Query Parameters

Parameter

Mandatory

Type

Description

workspace_id

No

String

Workspace scope of the search.

By default, this parameter is omitted and all workspaces are searched.

  • Workspace ID: search only the specified workspace

search_text

Yes

String

Global search keyword. Enter at least two characters.

job_type

No

String

Job type. Multiple values can be specified. Example: job_type=BATCH

The default is all job types.

  • BATCH: batch job

  • REAL_TIME: stream job

script_type

No

String

Script type. Multiple values can be specified. Example: script_type=HIVE,DLI

By default, scripts are not filtered by type.

  • HIVE: Hive SQL

  • DLI: DLI SQL

  • DWS: DWS SQL

  • SparkSQL: Spark SQL

  • SparkPython: Spark Python

  • FlinkSQL: Flink SQL

  • RDS: RDS SQL

  • PRESTO: Presto SQL

  • HETUENGINE: HetuEngine

  • ClickHouse: ClickHouse

  • IMPALA: Impala SQL

  • SHELL: Shell

  • PYTHON: Python

node_type

No

String

Node type. Multiple values can be specified. Example: node_type=com.cloud.datacraft.processactivity.ExecuteHiveJob

The default is all node types.

  • com.cloud.datacraft.processactivity.ExecuteHiveJob: MRS Hive SQL

  • com.cloud.datacraft.activity.ExecuteSparkSQL: MRS Spark SQL

  • com.cloud.datacraft.activity.MRSSparkPython: MRS Spark Python

  • com.cloud.datacraft.processactivity.ExecuteImpalaJob: MRS Impala SQL

  • com.cloud.datacraft.activity.DLISQL: DLI SQL

  • com.cloud.datacraft.activity.DliFlinkJob: DLI Flink Job

  • com.cloud.datacraft.processactivity.ExecuteDWSJob: DWS SQL

  • com.cloud.datacraft.activity.ExecuteQuery: RDS SQL

  • com.cloud.datacraft.activity.MRSPrestoSQL: MRS Presto SQL

  • com.cloud.datacraft.processactivity.ExecuteScript: Shell

  • com.cloud.datacraft.processactivity.ExecutePythonScript: Python

  • com.cloud.datacraft.processactivity.ExecuteClickHouseJob: ClickHouse

  • com.cloud.datacraft.processactivity.ExecuteHetuEngineJob: HetuEngine

  • com.cloud.datacraft.activity.DataMigration: DataMigration

new_save_or_commit

No

String

Latest modification type. Example: new_save_or_commit=save

The default value is save.

  • save: latest saved

  • commit: latest commit

owners

No

String

Owner name. Multiple values can be specified (a list of users). Example: owners=dayu_wm

By default, results are not filtered by owner.

doc_types

No

String

Search scope. Multiple values can be specified. Example: doc_types=script

The default is all scopes.

  • node: development job

  • script: script

begin_time

No

Long

Start time, which is used together with the end time. By default, there is no time range. Example: begin_time=1746633600000

end_time

No

Long

End time, which is used together with the start time. By default, there is no time range. Example: end_time=1746806399999

limit

No

Integer

The maximum number of records on each page. The value ranges from 1 to 100. Example: limit=10

The default value is 10.

offset

No

Integer

Start page. The value must be greater than or equal to 0. Example: offset=0

Default value: 0.

if_query_parameters

No

String

Whether to search for configuration parameters, for example, if_query_parameters=false.

The default value is false, indicating that configuration parameters are not searched.

  • true: Yes.

  • false: No.

match_type

No

Integer

Matching mode, for example, match_type=0.

The default value is 0.

  • 0: general

  • 1: fuzzy

schedule_state

No

String

Scheduling status. This filter applies only when searching for jobs with new_save_or_commit set to commit.

The default value is all.

  • running: scheduled

  • stop: not scheduled

is_exact

No

String

Whether to perform exact search. This parameter is used together with the exact_field parameter.

The default value is false, indicating non-exact search.

  • true: exact search

  • false: non-exact search

exact_field

No

String

Field for exact search, which takes effect when exact search is enabled.

  • jobName: job name

  • scriptName: script name

  • jobId: job ID

  • scriptId: script ID
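The query parameters above can be assembled into a request URL; the sketch below is illustrative only (the `build_search_url` helper is an assumption, not part of this API):

```python
from urllib.parse import urlencode

def build_search_url(project_id, search_text, **filters):
    """Assemble the GET /v2/{project_id}/factory/search URL.
    Multi-choice filters (job_type, script_type, ...) are passed as
    comma-separated strings, e.g. script_type="HIVE,DLI"."""
    if len(search_text) < 2:
        raise ValueError("search_text must contain at least two characters")
    params = {"search_text": search_text}
    params.update({k: v for k, v in filters.items() if v is not None})
    # Keep commas literal so multi-choice values stay readable.
    query = urlencode(params, safe=",")
    return f"/v2/{project_id}/factory/search?{query}"

url = build_search_url("62099355b894428e8916573ae635f1f9", "select",
                       script_type="HIVE,DLI", limit=10, offset=0)
```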

Request Parameters

Table 3 Request header parameters

Parameter

Mandatory

Type

Description

workspace

Yes

String

Workspace ID. For details about how to obtain the workspace ID, see Instance ID and Workspace ID.

X-Auth-Token

No

String

IAM token, which is obtained by calling the IAM API for obtaining a user token (value of X-Subject-Token in the response header). This parameter is mandatory when token authentication is used. The value contains 0 to 4096 characters.
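A sketch of assembling the request headers from Table 3 (header names as documented; the helper function itself is illustrative):

```python
def build_headers(workspace_id, token=None):
    """workspace is mandatory; X-Auth-Token is required only when
    token authentication is used."""
    headers = {"workspace": workspace_id}
    if token is not None:
        headers["X-Auth-Token"] = token
    return headers
```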

Response Parameters

Status code: 200

Table 4 Response body parameters

Parameter

Type

Description

limit

Integer

Page size limit.

The value range is [1, 100].

offset

Integer

Current page.

search_details

Array of SearchDetailV2 objects

Search results matching the query.

total_hits

Long

Total number of hits.

Table 5 SearchDetailV2

Parameter

Type

Description

id

String

Unique identifier

tenant_id

String

Tenant ID.

project_id

String

The project ID.

dgc_instance_id

String

DGC instance ID

workspace

String

Workspace ID

doc_type

String

Search scope: (multiple choices)

The default is all.

  • node: development job

  • script: script

owner

String

Owner

new_save_or_commit

String

Latest modification:

The default value is save.

  • save: latest saved

  • commit: latest commit

version

Integer

Numeric version number or draft ID

last_modified_time

Long

Last modified time

job_name

String

Job Name

job_type

String

Job type: (Multiple choices) Example: job_type=BATCH

The default is all.

  • BATCH: batch job

  • REAL_TIME: stream job

job_params

String

Job parameter

node_name

String

Node name

node_type

String

Node Type: (Multiple choices) List of node types.

The default is all.

  • com.cloud.datacraft.processactivity.ExecuteHiveJob: MRS Hive SQL

  • com.cloud.datacraft.activity.ExecuteSparkSQL: MRS Spark SQL

  • com.cloud.datacraft.activity.MRSSparkPython: MRS Spark Python

  • com.cloud.datacraft.processactivity.ExecuteImpalaJob: MRS Impala SQL

  • com.cloud.datacraft.activity.DLISQL: DLI SQL

  • com.cloud.datacraft.activity.DliFlinkJob: DLI Flink Job

  • com.cloud.datacraft.processactivity.ExecuteDWSJob: DWS SQL

  • com.cloud.datacraft.activity.ExecuteQuery: RDS SQL

  • com.cloud.datacraft.activity.MRSPrestoSQL: MRS Presto SQL

  • com.cloud.datacraft.processactivity.ExecuteScript: Shell

  • com.cloud.datacraft.processactivity.ExecutePythonScript: Python

  • com.cloud.datacraft.processactivity.ExecuteClickHouseJob: ClickHouse

  • com.cloud.datacraft.processactivity.ExecuteHetuEngineJob: HetuEngine

  • com.cloud.datacraft.activity.DataMigration: DataMigration

script_name

String

Script name

script_type

String

Script type: (Multiple choices) Example: script_type=HIVE,DLI.

By default, all types of scripts are not filtered.

  • HIVE: Hive SQL

  • DLI: DLI SQL

  • DWS: DWS SQL

  • SparkSQL: Spark SQL

  • SparkPython: Spark Python

  • FlinkSQL: Flink SQL

  • RDS: RDS SQL

  • PRESTO: Presto SQL

  • HETUENGINE: HetuEngine

  • ClickHouse: ClickHouse

  • IMPALA: Impala SQL

  • SHELL: Shell

  • PYTHON: Python

script_params

String

Script parameters

cdm_cluster_name

String

CDM Cluster Name

cdm_job_name

String

CDM Job Name

cdm_params

String

CDM parameters

workspace_name

String

Workspace name

job_id

String

Job ID

script_id

String

Script ID.

single_node_job_type

String

Single-node job type

schedule_state

String

Scheduling status:

The default is all.

  • running: scheduled

  • stop: not scheduled

Status code: 400

Table 6 Response body parameters

Parameter

Type

Description

error_code

String

Error code.

error_msg

String

Error description.

Example Requests

Search for keyword "select". The current page is the first page. A maximum of 10 records can be displayed on a page.

GET /v2/62099355b894428e8916573ae635f1f9/factory/search?search_text=select&offset=0&limit=10

Example Responses

Status code: 200

The query is successful and the search result is returned.

{
  "limit" : 10,
  "offset" : 0,
  "search_details" : [ {
    "dgc_instance_id" : "9e763f354c6e46e1a87e2f63fc471d85",
    "doc_type" : "node",
    "id" : "ClvD7ZQBjnPSvBMriixA",
    "job_id" : "549",
    "job_name" : "job_9944",
    "job_type" : "BATCH",
    "last_modified_time" : 1739155370115,
    "new_save_or_commit" : "save",
    "node_name" : "DLI_SQL_4403",
    "node_type" : "com.cloud.datacraft.activity.DLISQL",
    "owner" : "ei_dlf_l00341563",
    "project_id" : "62099355b894428e8916573ae635f1f9",
    "tenant_id" : "61c2cde19d9c47cfb2123d4c8ec0d1e8",
    "version" : 0,
    "workspace" : "1390a9d8c7154126ad662be0f2ebe2e0",
    "workspace_name" : "testRDS"
  } ],
  "total_hits" : 1
}
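The 200 body can be consumed with any JSON library; below is a minimal parsing sketch in Python. The JSON literal is a subset of the example above, and the page arithmetic assumes the offset/limit paging described in Table 2:

```python
import json
import math

# Subset of the example 200 response shown above.
body = '''{
  "limit": 10,
  "offset": 0,
  "search_details": [ {
    "doc_type": "node",
    "job_id": "549",
    "job_name": "job_9944",
    "job_type": "BATCH",
    "node_type": "com.cloud.datacraft.activity.DLISQL",
    "workspace_name": "testRDS"
  } ],
  "total_hits": 1
}'''

result = json.loads(body)
# Pair each hit's doc_type with its job or script name.
hits = [(d["doc_type"], d.get("job_name") or d.get("script_name"))
        for d in result["search_details"]]

# Pages needed to fetch every hit at `limit` results per page.
pages = math.ceil(result["total_hits"] / result["limit"])
```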

Status code: 400

If the query fails, an error message is returned.

{
  "error_code" : "DLF.3051",
  "error_msg" : "The request parameter is invalid."
}
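Handling the two documented status codes can be sketched as follows (`check_response` is an illustrative helper, not part of the API):

```python
def check_response(status_code, body):
    """Return the parsed body on 200; raise on the documented 400 error."""
    if status_code == 400:
        raise RuntimeError(f"{body['error_code']}: {body['error_msg']}")
    return body
```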

Status Codes

Status Code

Description

200

The query is successful and the search result is returned.

400

If the query fails, an error message is returned.