Job-related SDKs

Updated on 2025-02-11 GMT+08:00

Importing Data

DLI provides an API for importing data. You can use it to import data stored in OBS to a created DLI or OBS table. The example code is as follows:

// Instantiate the ImportJob object. The constructor takes the queue, database name, table name (obtained by instantiating the Table object), and data path.
private static void importData(Queue queue, Table DLITable) throws DLIException {
    String dataPath = "OBS Path"; // Replace with the actual OBS path of the source data.
    queue = client.getQueue("queueName"); // client is the DLIClient initialized during SDK setup.
    CsvFormatInfo formatInfo = new CsvFormatInfo();
    formatInfo.setWithColumnHeader(true);
    formatInfo.setDelimiter(",");
    formatInfo.setQuoteChar("\"");
    formatInfo.setEscapeChar("\\");
    formatInfo.setDateFormat("yyyy/MM/dd");
    formatInfo.setTimestampFormat("yyyy-MM-dd HH:mm:ss");
    String dbName = DLITable.getDb().getDatabaseName();
    String tableName = DLITable.getTableName();
    ImportJob importJob = new ImportJob(queue, dbName, tableName, dataPath);
    importJob.setStorageType(StorageType.CSV);
    importJob.setCsvFormatInfo(formatInfo);
    System.out.println("start submit import table: " + DLITable.getTableName());
    // Call the submit interface of the ImportJob object to submit the data import job.
    importJob.submit();
    // Call the getStatus interface of the ImportJob object to query the status of the data import job.
    JobStatus status = importJob.getStatus();
    System.out.println("Job id: " + importJob.getJobId() + ", Status : " + status.getName());
}
NOTE:
  • Before submitting the import job, you can set the format of the data to be imported. In the sample code, the setStorageType interface of the ImportJob object sets the data storage type to CSV, and the setCsvFormatInfo interface sets the CSV format details.
  • Before submitting the import job, you can also set the partition of the data to be imported and whether to overwrite existing data. Call the setPartitionSpec API of the ImportJob object to set the partition information, for example, importJob.setPartitionSpec(new PartitionSpec("part1=value1,part2=value2")); the partition can also be passed as a constructor parameter when creating the ImportJob object. By default, an import job appends data. To overwrite the existing data, call the setOverWrite API of the ImportJob object, for example, importJob.setOverWrite(Boolean.TRUE). A sketch of both calls follows this list.
  • If a folder and a file in an OBS bucket directory have the same name, the file is preferentially loaded rather than the folder. You are advised to give files and folders at the same level different names when creating OBS objects.
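
As referenced in the note above, here is a minimal sketch of setting the partition information and the overwrite flag on an already constructed ImportJob (like the one in the preceding example) before submission; part1 and part2 are placeholder partition columns:

// A minimal sketch: set partition information and overwrite behavior on an
// ImportJob before submitting it. part1/part2 are placeholder partition columns.
private static void importWithOverwrite(ImportJob importJob) throws DLIException {
    importJob.setPartitionSpec(new PartitionSpec("part1=value1,part2=value2"));
    // Overwrite existing data in the specified partition instead of appending (the default).
    importJob.setOverWrite(Boolean.TRUE);
    importJob.submit();
}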

Importing Partition Data

DLI provides an API for importing data. You can use it to import data stored in OBS to a specified partition of the created DLI or OBS table. The example code is as follows:
// Instantiate the ImportJob object. The constructor takes the queue, database name, table name (obtained by instantiating the Table object), data path, partition information, and overwrite flag.
private static void importData(Queue queue, Table DLITable) throws DLIException {
    String dataPath = "OBS Path";
    queue = client.getQueue("queueName");
    CsvFormatInfo formatInfo = new CsvFormatInfo();
    formatInfo.setWithColumnHeader(true);
    formatInfo.setDelimiter(",");
    formatInfo.setQuoteChar("\"");
    formatInfo.setEscapeChar("\\");
    formatInfo.setDateFormat("yyyy/MM/dd");
    formatInfo.setTimestampFormat("yyyy-MM-dd HH:mm:ss");
    String dbName = DLITable.getDb().getDatabaseName();
    String tableName = DLITable.getTableName();
    PartitionSpec partitionSpec = new PartitionSpec("part1=value1,part2=value2");
    Boolean isOverWrite = true;
    ImportJob importJob = new ImportJob(queue, dbName, tableName, dataPath, partitionSpec, isOverWrite);
    importJob.setStorageType(StorageType.CSV);
    importJob.setCsvFormatInfo(formatInfo);
    System.out.println("start submit import table: " + DLITable.getTableName());
    // Call the submit interface of the ImportJob object to submit the data import job.
    importJob.submit();
    // Call the getStatus interface of the ImportJob object to query the status of the data import job.
    JobStatus status = importJob.getStatus();
    System.out.println("Job id: " + importJob.getJobId() + ", Status : " + status.getName());
}
NOTE:
  • When creating the ImportJob object, the partition information can also be passed directly as a partition string instead of a PartitionSpec object.
  • If partitionSpec specifies only some of the table's partition columns, the unspecified partition columns will contain abnormal values such as null after the data is imported.
  • In the example, isOverWrite indicates whether to overwrite data: true overwrites and false appends. Currently, overwriting the entire table is not supported; only the specified partition can be overwritten. To append data to a specified partition, set isOverWrite to false when creating the import job, as in the sketch after this list.
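
As mentioned in the last note, a minimal sketch of the append case, reusing the same constructor with isOverWrite set to Boolean.FALSE (the partition values are placeholders):

// A minimal sketch: append data to the specified partition rather than
// overwriting it, by passing Boolean.FALSE as the isOverWrite argument.
private static void appendToPartition(Queue queue, String dbName, String tableName,
        String dataPath) throws DLIException {
    PartitionSpec partitionSpec = new PartitionSpec("part1=value1,part2=value2");
    ImportJob appendJob = new ImportJob(queue, dbName, tableName, dataPath, partitionSpec, Boolean.FALSE);
    appendJob.setStorageType(StorageType.CSV);
    appendJob.submit();
}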

Exporting Data

DLI provides an API for exporting data. You can use it to export data from a DLI table to OBS. The example code is as follows:

// Instantiate the ExportJob object and pass in the queue, database name, table name (obtained by instantiating the Table object), and storage path of the exported data. The table type must be MANAGED.
private static void exportData(Queue queue, Table DLITable) throws DLIException {
    String dataPath = "OBS Path";
    queue = client.getQueue("queueName");
    String dbName = DLITable.getDb().getDatabaseName();
    String tableName = DLITable.getTableName();
    ExportJob exportJob = new ExportJob(queue, dbName, tableName, dataPath);
    exportJob.setStorageType(StorageType.CSV);
    exportJob.setCompressType(CompressType.GZIP);
    exportJob.setExportMode(ExportMode.ERRORIFEXISTS);
    System.out.println("start export DLI Table data...");
    // Call the submit interface of the ExportJob object to submit the data exporting job.
    exportJob.submit();
    // Call the getStatus interface of the ExportJob object to query the status of the data exporting job.
    JobStatus status = exportJob.getStatus();
    System.out.println("Job id: " + exportJob.getJobId() + ", Status : " + status.getName());
}
NOTE:
  • Before submitting the export job, you can optionally set the data format, compression type, and export mode. In the preceding sample code, the setStorageType, setCompressType, and setExportMode interfaces of the ExportJob object set the data format, compression type, and export mode, respectively. The setStorageType interface supports only the CSV format.
  • If a folder and a file in an OBS bucket directory have the same name, the file is preferentially used rather than the folder. You are advised to give files and folders at the same level different names when creating OBS objects.

Submitting a Job

DLI provides APIs for submitting jobs and querying their results. You can call these APIs to submit a job and then query its result. The example code is as follows:

// Instantiate the SQLJob object and construct input parameters for executing SQL, including the queue, database name, and SQL statements.
private static void runSqlJob(Queue queue, Table obsTable) throws DLIException {
    String sql = "select * from " + obsTable.getTableName();
    String queryResultPath = "OBS Path";
    SQLJob sqlJob = new SQLJob(queue, obsTable.getDb().getDatabaseName(), sql);
    System.out.println("start submit SQL job...");
    // Call the submit interface of the SQLJob object to submit the querying job.
    sqlJob.submit();
    // Call the getStatus interface of the SQLJob object to query the status of the querying job.
    JobStatus status = sqlJob.getStatus();
    System.out.println(status);
    System.out.println("start export Result...");
    // Call the exportResult interface of the SQLJob object to export the query result. queryResultPath is the path of the data to be exported.
    sqlJob.exportResult(queryResultPath, StorageType.CSV,
            CompressType.GZIP, ExportMode.ERRORIFEXISTS, null);
    System.out.println("Job id: " + sqlJob.getJobId() + ", Status : " + status.getName());
}
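
Note that submit() returns once the job has been created, so the job may still be queued or running when getStatus is first called. The following is a minimal polling sketch; it assumes that JobStatus also defines RUNNING and FINISHED values, since this document shows only LAUNCHING and mentions the Launching and Running states:

// A minimal polling sketch. Assumption: JobStatus defines RUNNING and FINISHED
// in addition to the LAUNCHING state shown elsewhere in this document.
private static void waitAndExport(SQLJob sqlJob, String queryResultPath)
        throws DLIException, InterruptedException {
    JobStatus status = sqlJob.getStatus();
    while (JobStatus.LAUNCHING.equals(status) || JobStatus.RUNNING.equals(status)) {
        Thread.sleep(1000); // Poll once per second.
        status = sqlJob.getStatus();
    }
    if (JobStatus.FINISHED.equals(status)) { // Export only after the job succeeds.
        sqlJob.exportResult(queryResultPath, StorageType.CSV,
                CompressType.GZIP, ExportMode.ERRORIFEXISTS, null);
    }
}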

Canceling a Job

DLI provides an API for canceling jobs. You can use it to cancel jobs in the Launching or Running state. The following sample cancels jobs in the Launching state; a variant that also covers the Running state follows the sample:
private static void cancelSqlJob(DLIClient client) throws DLIException {
    List<JobResultInfo> jobResultInfos = client.listAllJobs(JobType.QUERY);
    for (JobResultInfo jobResultInfo : jobResultInfos) {
        // Cancel jobs in the LAUNCHING state.
        if (JobStatus.LAUNCHING.equals(jobResultInfo.getJobStatus())) {
            // Cancel the job with a specific job ID.
            client.cancelJob(jobResultInfo.getJobId());
        }
    }
}
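
The same loop can also cover the Running state mentioned above. A minimal variant, assuming JobStatus.RUNNING exists alongside JobStatus.LAUNCHING:

// A minimal variant: cancel jobs in either the LAUNCHING or RUNNING state.
// Assumption: JobStatus.RUNNING exists, matching the "Running" state named above.
private static void cancelActiveSqlJobs(DLIClient client) throws DLIException {
    for (JobResultInfo jobResultInfo : client.listAllJobs(JobType.QUERY)) {
        JobStatus status = jobResultInfo.getJobStatus();
        if (JobStatus.LAUNCHING.equals(status) || JobStatus.RUNNING.equals(status)) {
            client.cancelJob(jobResultInfo.getJobId());
        }
    }
}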

Querying All Jobs

DLI provides an API for querying jobs. You can use it to query all jobs of the current project. The example code is as follows:
private static void listAllSqlJobs(DLIClient client) throws DLIException {
    // Return the collection of JobResultInfo lists.
    List<JobResultInfo> jobResultInfos = client.listAllJobs();
    // Traverse the list to view job information.
    for (JobResultInfo jobResultInfo : jobResultInfos) {
        // Job ID
        System.out.println(jobResultInfo.getJobId());
        // Job description
        System.out.println(jobResultInfo.getDetail());
        // Job status
        System.out.println(jobResultInfo.getJobStatus());
        // Job type
        System.out.println(jobResultInfo.getJobType());
    }
    // Filter the query result by job type.
    List<JobResultInfo> ddlJobs = client.listAllJobs(JobType.DDL);
    // Filter the query result by job type and a start/end time in Unix timestamp format.
    List<JobResultInfo> ddlJobsInRange = client.listAllJobs(1502349803729L, 1502349821460L, JobType.DDL);
    // Filter the query result by page.
    List<JobResultInfo> ddlJobsPaged = client.listAllJobs(100, 1, JobType.DDL);
    // Filter the query result by page, time range, and job type.
    List<JobResultInfo> ddlJobsPagedInRange = client.listAllJobs(100, 1, 1502349803729L, 1502349821460L, JobType.DDL);

    // Use tags to query jobs that meet the conditions.
    JobFilter jobFilter = new JobFilter();
    jobFilter.setTags("workspace=space002,jobName=name002");
    List<JobResultInfo> taggedJobs = client.listAllJobs(jobFilter);

    // Use tags to query jobs on a specified page.
    JobFilter pagedJobFilter = new JobFilter();
    pagedJobFilter.setTags("workspace=space002,jobName=name002");
    pagedJobFilter.setPageSize(100);
    pagedJobFilter.setCurrentPage(0);
    List<JobResultInfo> taggedJobsPage = client.listJobsByPage(pagedJobFilter);
}
NOTE:
  • Parameters in the overloaded methods above can be set to null, indicating that no filter condition is specified. Otherwise, ensure that all parameters are set to valid values; for example, if the page parameter is set to -1, the query fails.
  • APIs in this SDK do not support SQL patterns, so you cannot filter job queries by SQL pattern.

    To query DLI jobs, use the Querying All Jobs API.
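
When a project holds more jobs than one page, the tag-based page query shown above can be repeated page by page. A minimal sketch, under the assumption that a returned page smaller than pageSize signals the last page:

// A minimal pagination sketch. Assumption: a returned page smaller than
// pageSize means there are no further pages to fetch.
private static void listTaggedJobsPageByPage(DLIClient client) throws DLIException {
    JobFilter filter = new JobFilter();
    filter.setTags("workspace=space002,jobName=name002");
    filter.setPageSize(100);
    int currentPage = 0;
    List<JobResultInfo> pageJobs;
    do {
        filter.setCurrentPage(currentPage);
        pageJobs = client.listJobsByPage(filter);
        for (JobResultInfo info : pageJobs) {
            System.out.println(info.getJobId());
        }
        currentPage++;
    } while (pageJobs.size() == 100);
}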

Querying Job Results

DLI provides an API for querying job results. You can use it to query information about a job with a specific job ID. The example code is as follows:
private static void getJobResultInfo(DLIClient client) throws DLIException {
    String jobId = "4c4f7168-5bc4-45bd-8c8a-43dfc85055d0";
    JobResultInfo jobResultInfo = client.queryJobResultInfo(jobId);
    // View information about the job.
    System.out.println(jobResultInfo.getJobId());
    System.out.println(jobResultInfo.getDetail());
    System.out.println(jobResultInfo.getJobStatus());
    System.out.println(jobResultInfo.getJobType());
}

Querying Jobs of the SQL Type

DLI provides an API for querying SQL jobs. You can use it to query information about recently executed jobs submitted using SQL statements in the current project. The example code is as follows:
private static void getJobResultInfos(DLIClient client) throws DLIException {
    // Return the collection of JobResultInfo lists.
    List<JobResultInfo> jobResultInfos = client.listSQLJobs();
    // Traverse the list to view job information.
    for (JobResultInfo jobResultInfo : jobResultInfos) {
        // Job ID
        System.out.println(jobResultInfo.getJobId());
        // Job description
        System.out.println(jobResultInfo.getDetail());
        // Job status
        System.out.println(jobResultInfo.getJobStatus());
        // Job type
        System.out.println(jobResultInfo.getJobType());
    }

    // Use tags to query SQL jobs that meet the conditions.
    JobFilter jobFilter = new JobFilter();
    jobFilter.setTags("workspace=space002,jobName=name002");
    List<JobResultInfo> taggedSqlJobs = client.listAllSQLJobs(jobFilter);

    // Use tags to query SQL jobs on a specified page.
    JobFilter pagedJobFilter = new JobFilter();
    pagedJobFilter.setTags("workspace=space002,jobName=name002");
    pagedJobFilter.setPageSize(100);
    pagedJobFilter.setCurrentPage(0);
    List<JobResultInfo> taggedSqlJobsPage = client.listSQLJobsByPage(pagedJobFilter);
}

Exporting Query Results

DLI provides an API for exporting query results. You can use it to export the result of a query job submitted in the SQL editor of the current project. The example code is as follows:
// Instantiate the SQLJob object and construct input parameters for executing SQL, including the queue, database name, and SQL statements.
private static void exportSqlResult(Queue queue, Table obsTable) throws DLIException {
    String sql = "select * from " + obsTable.getTableName();
    String exportPath = "OBS Path"; // Replace with the actual OBS path for the exported result.
    String queueName = "queueName"; // Queue that executes the export job.
    SQLJob sqlJob = new SQLJob(queue, obsTable.getDb().getDatabaseName(), sql);
    System.out.println("start submit SQL job...");
    // Call the submit interface of the SQLJob object to submit the querying job.
    sqlJob.submit();
    // Call the getStatus interface of the SQLJob object to query the status of the querying job.
    JobStatus status = sqlJob.getStatus();
    System.out.println(status);
    System.out.println("start export Result...");
    // Call the exportResult interface of the SQLJob object to export the query result.
    // exportPath is the path for the exported data, JSON is the export format, queueName
    // is the queue that executes the export job, and the last argument (limitNum) is the
    // number of results to export; 0 exports all data.
    sqlJob.exportResult(exportPath + "result", StorageType.JSON, CompressType.NONE,
        ExportMode.ERRORIFEXISTS, queueName, true, 5);
}

Previewing Job Results

DLI provides an API for previewing job results. You can call this API to obtain the first 1000 records in the result set.
// Initialize a SQLJob object and pass the queue, database name, and SQL statement to execute the SQL.
private static void getPreviewJobResult(Queue queue, Table obsTable) throws DLIException {
    String sql = "select * from " + obsTable.getTableName();
    SQLJob sqlJob = new SQLJob(queue, obsTable.getDb().getDatabaseName(), sql);
    System.out.println("start submit SQL job...");
    // Call the submit method on the SQLJob object.
    sqlJob.submit();
    // Call the previewJobResult method on the SQLJob object to query the first 1000 records in the result set.
    List<Row> rows = sqlJob.previewJobResult();
    if (rows.size() > 0) {
        Integer value = rows.get(0).getInt(0);
        System.out.println("Obtain the data value in the first column at the first row." + value);
    }
    System.out.println("Job id: " + sqlJob.getJobId() + ", previewJobResultSize : " + rows.size());
}
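
To process the whole preview rather than a single cell, the returned List<Row> can be iterated like any list. A minimal sketch, assuming the first column of every row is an integer as in the example above:

// A minimal sketch: print the first column of every previewed row.
// Assumption: the first column is an integer, as in the example above.
private static void printPreview(List<Row> rows) {
    for (Row row : rows) {
        System.out.println("First column: " + row.getInt(0));
    }
}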

Deprecated API

The getJobResult method has been deprecated. Call DownloadJob instead to obtain the job result.

For details about the DownloadJob method, obtain the dli-sdk-java-x.x.x.zip package by referring to Obtaining and Installing the SDK and decompress the package.

