Performing a Global Search

Function

This API is used to perform a global search.

Calling Method

For details, see Calling APIs.

Authorization Information

Each account has all the permissions required to call all APIs, but IAM users must be assigned the required permissions.

  • If you are using role/policy-based authorization, see Permissions Policies and Supported Actions for details on the required permissions.
  • If you are using identity policy-based authorization, no identity policy-based permissions are required to call this API.

URI

GET /v2/{project_id}/factory/search

Table 1 Path Parameters

project_id (String, mandatory)
Project ID. For details about how to obtain a project ID, see Project ID and Account ID.

Table 2 Query Parameters

workspace_id (String, optional)
Search scope. If this parameter is not transferred (the default), all workspaces are searched. To search a single workspace, set this parameter to the ID of that workspace.

search_text (String, mandatory)
Global search keyword. Enter at least two characters.

job_type (String, optional)
Job type. Multiple values are allowed. Example: job_type=BATCH. By default, all job types are returned.
  • BATCH: batch job
  • REAL_TIME: stream job

script_type (String, optional)
Script type. Multiple values are allowed. Example: script_type=HIVE,DLI. By default, scripts are not filtered by type.
  • HIVE: Hive SQL
  • DLI: DLI SQL
  • DWS: DWS SQL
  • SparkSQL: Spark SQL
  • SparkPython: Spark Python
  • FlinkSQL: Flink SQL
  • RDS: RDS SQL
  • PRESTO: Presto SQL
  • HETUENGINE: HetuEngine
  • ClickHouse: ClickHouse
  • IMPALA: Impala SQL
  • SHELL: Shell
  • PYTHON: Python

node_type (String, optional)
Node type. Multiple values are allowed. Example: node_type=com.cloud.datacraft.processactivity.ExecuteHiveJob. By default, all node types are returned.
  • com.cloud.datacraft.processactivity.ExecuteHiveJob: MRS Hive SQL
  • com.cloud.datacraft.activity.ExecuteSparkSQL: MRS Spark SQL
  • com.cloud.datacraft.activity.MRSSparkPython: MRS Spark Python
  • com.cloud.datacraft.processactivity.ExecuteImpalaJob: MRS Impala SQL
  • com.cloud.datacraft.activity.DLISQL: DLI SQL
  • com.cloud.datacraft.activity.DliFlinkJob: DLI Flink Job
  • com.cloud.datacraft.processactivity.ExecuteDWSJob: DWS SQL
  • com.cloud.datacraft.activity.ExecuteQuery: RDS SQL
  • com.cloud.datacraft.activity.MRSPrestoSQL: MRS Presto SQL
  • com.cloud.datacraft.processactivity.ExecuteScript: Shell
  • com.cloud.datacraft.processactivity.ExecutePythonScript: Python
  • com.cloud.datacraft.processactivity.ExecuteClickHouseJob: ClickHouse
  • com.cloud.datacraft.processactivity.ExecuteHetuEngineJob: HetuEngine
  • com.cloud.datacraft.activity.DataMigration: DataMigration

new_save_or_commit (String, optional)
Latest modification type. Example: new_save_or_commit=save. The default value is save.
  • save: latest saved version
  • commit: latest committed version

owners (String, optional)
Owner names. Multiple values are allowed. Example: owners=dayu_wm. By default, results are not filtered by owner.

doc_types (String, optional)
Search scope. Multiple values are allowed. Example: doc_types=script. The default value is all.
  • node: development job
  • script: script

begin_time (Long, optional)
Start time, used together with end_time. By default, there is no time range. Example: begin_time=1746633600000

end_time (Long, optional)
End time, used together with begin_time. By default, there is no time range. Example: end_time=1746806399999

limit (Integer, optional)
Maximum number of records on each page. The value ranges from 1 to 100. The default value is 10. Example: limit=10

offset (Integer, optional)
Start page. The value must be greater than or equal to 0. The default value is 0. Example: offset=0

if_query_parameters (String, optional)
Whether to search configuration parameters. Example: if_query_parameters=false. The default value is false, indicating that configuration parameters are not searched.
  • true: search configuration parameters
  • false: do not search configuration parameters

match_type (Integer, optional)
Matching mode. Example: match_type=0. The default value is 0.
  • 0: general match
  • 1: fuzzy match

schedule_state (String, optional)
Scheduling status. Supported only when searching for jobs; new_save_or_commit must be set to commit. The default value is all.
  • running: scheduled
  • stop: not scheduled

is_exact (String, optional)
Whether to perform an exact search. Used together with exact_field. The default value is false, indicating a non-exact search.
  • true: exact search
  • false: non-exact search

exact_field (String, optional)
Field to match in an exact search. Takes effect only when is_exact is true.
  • jobName: job name
  • scriptName: script name
  • jobId: job ID
  • scriptId: script ID
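
For illustration, the filters above can be combined in a single request. The following example is a sketch with placeholder values only:

GET /v2/{project_id}/factory/search?search_text=select&job_type=BATCH&doc_types=node&begin_time=1746633600000&end_time=1746806399999&limit=10&offset=0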

Request Parameters

Table 3 Request header parameters

workspace (String, mandatory)
Workspace ID. For details about how to obtain the workspace ID, see Instance ID and Workspace ID.

X-Auth-Token (String, optional)
IAM token, obtained by calling the IAM API for obtaining a user token (the value of X-Subject-Token in the response header). This parameter is mandatory when token-based authentication is used. The value contains 0 to 4096 characters.
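
Putting the URI, query parameters, and headers together, a raw HTTP request might look like the following sketch (the workspace ID and token are placeholders):

GET /v2/{project_id}/factory/search?search_text=select&limit=10&offset=0
workspace: {workspace_id}
X-Auth-Token: {token}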

Response Parameters

Status code: 200

Table 4 Response body parameters

limit (Integer)
Page size limit. The value range is [1, 100].

offset (Integer)
Current page.

search_details (Array of SearchDetailV2 objects)
Search results that match the query.

total_hits (Long)
Total number of hits.

Table 5 SearchDetailV2

id (String)
Unique identifier.

tenant_id (String)
Tenant ID.

project_id (String)
Project ID.

dgc_instance_id (String)
DGC instance ID.

workspace (String)
Workspace ID.

doc_type (String)
Search scope.
  • node: development job
  • script: script

owner (String)
Owner.

new_save_or_commit (String)
Latest modification type.
  • save: latest saved version
  • commit: latest committed version

version (Integer)
Numeric version number or draft ID.

last_modified_time (Long)
Time when the item was last modified.

job_name (String)
Job name.

job_type (String)
Job type.
  • BATCH: batch job
  • REAL_TIME: stream job

job_params (String)
Job parameters.

node_name (String)
Node name.

node_type (String)
Node type.
  • com.cloud.datacraft.processactivity.ExecuteHiveJob: MRS Hive SQL
  • com.cloud.datacraft.activity.ExecuteSparkSQL: MRS Spark SQL
  • com.cloud.datacraft.activity.MRSSparkPython: MRS Spark Python
  • com.cloud.datacraft.processactivity.ExecuteImpalaJob: MRS Impala SQL
  • com.cloud.datacraft.activity.DLISQL: DLI SQL
  • com.cloud.datacraft.activity.DliFlinkJob: DLI Flink Job
  • com.cloud.datacraft.processactivity.ExecuteDWSJob: DWS SQL
  • com.cloud.datacraft.activity.ExecuteQuery: RDS SQL
  • com.cloud.datacraft.activity.MRSPrestoSQL: MRS Presto SQL
  • com.cloud.datacraft.processactivity.ExecuteScript: Shell
  • com.cloud.datacraft.processactivity.ExecutePythonScript: Python
  • com.cloud.datacraft.processactivity.ExecuteClickHouseJob: ClickHouse
  • com.cloud.datacraft.processactivity.ExecuteHetuEngineJob: HetuEngine
  • com.cloud.datacraft.activity.DataMigration: DataMigration

script_name (String)
Script name.

script_type (String)
Script type.
  • HIVE: Hive SQL
  • DLI: DLI SQL
  • DWS: DWS SQL
  • SparkSQL: Spark SQL
  • SparkPython: Spark Python
  • FlinkSQL: Flink SQL
  • RDS: RDS SQL
  • PRESTO: Presto SQL
  • HETUENGINE: HetuEngine
  • ClickHouse: ClickHouse
  • IMPALA: Impala SQL
  • SHELL: Shell
  • PYTHON: Python

script_params (String)
Script parameters.

cdm_cluster_name (String)
CDM cluster name.

cdm_job_name (String)
CDM job name.

cdm_params (String)
CDM parameters.

workspace_name (String)
Workspace name.

job_id (String)
Job ID.

script_id (String)
Script ID.

single_node_job_type (String)
Single-node job type.

schedule_state (String)
Scheduling status.
  • running: scheduled
  • stop: not scheduled

Status code: 400

Table 6 Response body parameters

error_code (String)
Error code.

error_msg (String)
Error description.

Example Requests

Search for the keyword "select", starting from the first page, with a maximum of 10 records per page.

GET /v2/62099355b894428e8916573ae635f1f9/factory/search?search_text=select&offset=0&limit=10
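
A second, hypothetical example that uses exact search to match a job name:

GET /v2/62099355b894428e8916573ae635f1f9/factory/search?search_text=job_9944&is_exact=true&exact_field=jobName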

Example Responses

Status code: 200

The query is successful and the search result is returned.

{
  "limit" : 10,
  "offset" : 0,
  "search_details" : [ {
    "dgc_instance_id" : "9e763f354c6e46e1a87e2f63fc471d85",
    "doc_type" : "node",
    "id" : "ClvD7ZQBjnPSvBMriixA",
    "job_id" : "549",
    "job_name" : "job_9944",
    "job_type" : "BATCH",
    "last_modified_time" : 1739155370115,
    "new_save_or_commit" : "save",
    "node_name" : "DLI_SQL_4403",
    "node_type" : "com.cloud.datacraft.activity.DLISQL",
    "owner" : "ei_dlf_l00341563",
    "project_id" : "62099355b894428e8916573ae635f1f9",
    "tenant_id" : "61c2cde19d9c47cfb2123d4c8ec0d1e8",
    "version" : 0,
    "workspace" : "1390a9d8c7154126ad662be0f2ebe2e0",
    "workspace_name" : "testRDS"
  } ],
  "total_hits" : 1
}
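
The limit, offset, and total_hits fields in the response can drive pagination. The following is a minimal sketch, not part of the official samples: it assumes a valid endpoint and IAM token, calls the REST API directly with the requests library, and treats offset as a page index as described in Table 2.

import requests

# Placeholder values; replace with your endpoint, project ID, workspace ID, and IAM token.
ENDPOINT = "https://{endpoint}"
PROJECT_ID = "{project_id}"
WORKSPACE_ID = "{workspace_id}"
TOKEN = "{token}"

def search_all(search_text):
    """Collect all hits for a keyword by paging with limit and offset."""
    url = f"{ENDPOINT}/v2/{PROJECT_ID}/factory/search"
    headers = {"workspace": WORKSPACE_ID, "X-Auth-Token": TOKEN}
    limit, page, hits = 100, 0, []
    while True:
        resp = requests.get(url, headers=headers, params={
            "search_text": search_text, "limit": limit, "offset": page})
        resp.raise_for_status()
        body = resp.json()
        hits.extend(body.get("search_details") or [])
        # Stop once every record reported by total_hits has been fetched,
        # or when a page comes back empty.
        if len(hits) >= body.get("total_hits", 0) or not body.get("search_details"):
            break
        page += 1
    return hits

print(search_all("select"))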

Status code: 400

If the query fails, an error message is returned.

{
  "error_code" : "DLF.3051",
  "error_msg" : "The request parameter is invalid."
}

SDK Sample Code

The SDK sample code is as follows.

package com.huaweicloud.sdk.test;

import com.huaweicloud.sdk.core.auth.ICredential;
import com.huaweicloud.sdk.core.auth.BasicCredentials;
import com.huaweicloud.sdk.core.exception.ConnectionException;
import com.huaweicloud.sdk.core.exception.RequestTimeoutException;
import com.huaweicloud.sdk.core.exception.ServiceResponseException;
import com.huaweicloud.sdk.dataartsstudio.v1.region.DataArtsStudioRegion;
import com.huaweicloud.sdk.dataartsstudio.v1.*;
import com.huaweicloud.sdk.dataartsstudio.v1.model.*;


public class ShowFactoryFullTextSolution {

    public static void main(String[] args) {
        // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
        // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
        String ak = System.getenv("CLOUD_SDK_AK");
        String sk = System.getenv("CLOUD_SDK_SK");
        String projectId = "{project_id}";

        ICredential auth = new BasicCredentials()
                .withProjectId(projectId)
                .withAk(ak)
                .withSk(sk);

        DataArtsStudioClient client = DataArtsStudioClient.newBuilder()
                .withCredential(auth)
                .withRegion(DataArtsStudioRegion.valueOf("<YOUR REGION>"))
                .build();
        ShowFactoryFullTextRequest request = new ShowFactoryFullTextRequest();
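        // search_text is a mandatory query parameter; set it (and any optional
        // filters) on the request before sending. The exact setter names depend
        // on the SDK version.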
        try {
            ShowFactoryFullTextResponse response = client.showFactoryFullText(request);
            System.out.println(response.toString());
        } catch (ConnectionException e) {
            e.printStackTrace();
        } catch (RequestTimeoutException e) {
            e.printStackTrace();
        } catch (ServiceResponseException e) {
            e.printStackTrace();
            System.out.println(e.getHttpStatusCode());
            System.out.println(e.getRequestId());
            System.out.println(e.getErrorCode());
            System.out.println(e.getErrorMsg());
        }
    }
}

# coding: utf-8

import os
from huaweicloudsdkcore.auth.credentials import BasicCredentials
from huaweicloudsdkdataartsstudio.v1.region.dataartsstudio_region import DataArtsStudioRegion
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkdataartsstudio.v1 import *

if __name__ == "__main__":
    # The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    # In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak = os.environ["CLOUD_SDK_AK"]
    sk = os.environ["CLOUD_SDK_SK"]
    projectId = "{project_id}"

    credentials = BasicCredentials(ak, sk, projectId)

    client = DataArtsStudioClient.new_builder() \
        .with_credentials(credentials) \
        .with_region(DataArtsStudioRegion.value_of("<YOUR REGION>")) \
        .build()

    try:
        request = ShowFactoryFullTextRequest()
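        # search_text is a mandatory query parameter; set it (and any optional
        # filters) on the request before sending. The exact attribute names
        # depend on the SDK version.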
        response = client.show_factory_full_text(request)
        print(response)
    except exceptions.ClientRequestException as e:
        print(e.status_code)
        print(e.request_id)
        print(e.error_code)
        print(e.error_msg)

package main

import (
	"fmt"
	"os"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/core/auth/basic"
	dataartsstudio "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dataartsstudio/v1"
	"github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dataartsstudio/v1/model"
	region "github.com/huaweicloud/huaweicloud-sdk-go-v3/services/dataartsstudio/v1/region"
)

func main() {
    // The AK and SK used for authentication are hard-coded or stored in plaintext, which has great security risks. It is recommended that the AK and SK be stored in ciphertext in configuration files or environment variables and decrypted during use to ensure security.
    // In this example, AK and SK are stored in environment variables for authentication. Before running this example, set environment variables CLOUD_SDK_AK and CLOUD_SDK_SK in the local environment
    ak := os.Getenv("CLOUD_SDK_AK")
    sk := os.Getenv("CLOUD_SDK_SK")
    projectId := "{project_id}"

    auth := basic.NewCredentialsBuilder().
        WithAk(ak).
        WithSk(sk).
        WithProjectId(projectId).
        Build()

    client := dataartsstudio.NewDataArtsStudioClient(
        dataartsstudio.DataArtsStudioClientBuilder().
            WithRegion(region.ValueOf("<YOUR REGION>")).
            WithCredential(auth).
            Build())

    request := &model.ShowFactoryFullTextRequest{}
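    // search_text is a mandatory query parameter; set it (and any optional
    // filters) on the request before sending. The exact field names depend
    // on the SDK version.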
    response, err := client.ShowFactoryFullText(request)
    if err == nil {
        fmt.Printf("%+v\n", response)
    } else {
        fmt.Println(err)
    }
}

For SDK sample code in more programming languages, see the Sample Code tab in API Explorer, which can generate SDK sample code automatically.

Status Codes

200: The query is successful and the search result is returned.

400: The query failed and an error message is returned.