Example Use Case: Creating an Elastic Resource Pool and Running Jobs

Updated on 2024-12-28 GMT+08:00
This section walks you through creating an elastic resource pool, adding a queue to it, binding an enhanced datasource connection to the pool, and running a job on the queue.
Figure 1 Process of creating an elastic resource pool
Table 1 Procedure

| Step | Description | Reference |
| --- | --- | --- |
| Create an elastic resource pool | Create an elastic resource pool and configure basic information, such as the billing mode, CU range, and CIDR block. | Creating an Elastic Resource Pool and Creating Queues Within It |
| Add a queue to the elastic resource pool | Add the queue where your jobs will run. Set basic information about the queue, such as the name and type, and then configure its scaling policy, including the priority, period, and the maximum and minimum CUs allowed for scaling. | Creating an Elastic Resource Pool and Creating Queues Within It; Adjusting Scaling Policies for Queues in an Elastic Resource Pool |
| (Optional) Create an enhanced datasource connection | If a job needs to access data from other data sources, for example, GaussDB(DWS) and RDS, create an enhanced datasource connection and bind it to the elastic resource pool. | Creating an Enhanced Datasource Connection |
| Run a job | Create and submit the job as needed. | Managing SQL Jobs; Flink Job Overview; Creating a Spark Job |

Step 1: Create an Elastic Resource Pool

  1. Log in to the DLI management console. In the navigation pane on the left, choose Resources > Resource Pool.
  2. On the displayed page, click Buy Resource Pool in the upper right corner.
  3. On the displayed page, set the following parameters:
    • Name: Enter the name of the elastic resource pool. For example, pool_test.
    • CU range: Minimum and maximum CUs of the elastic resource pool.
    • CIDR Block: Network segment of the elastic resource pool. For example, 172.16.0.0/18.
    • Set other parameters as required.
    Figure 2 Creating an elastic resource pool

    For details about how to create an elastic resource pool, see Creating an Elastic Resource Pool and Creating Queues Within It.

  4. Click Buy. Confirm the configuration and click Pay.
  5. Go to the Resource Pool page to view the creation status. If the status is Available, the elastic resource pool is ready for use.

Step 2: Add a Queue to the Elastic Resource Pool

  1. In the Operation column of the created elastic resource pool, click Add Queue.
  2. Specify the basic information about the queue. The configuration parameters are as follows:
    • Name: Queue name
    • Type: Queue type. In this example, select For general purpose.

      For SQL: The queue is used to run Spark SQL and HetuEngine jobs.

      For general purpose: The queue is used to run Flink and Spark Jar jobs.

    • Set other parameters as required.
    Figure 3 Creating a queue
  3. Click Next. On the displayed page, set Min CU to 64 and Max CU to 64. Setting the same minimum and maximum keeps the queue at a fixed size of 64 CUs.
    Figure 4 Setting the scaling policy for the queue
  4. Click OK. The queue is added.

(Optional) Step 3: Create an Enhanced Datasource Connection

In this example, the job needs to access RDS, so an enhanced datasource connection is required. If your job does not need to connect to an external data source, skip this step.

  1. Log in to the RDS console and create an RDS DB instance.

    For details, see Buying an RDS for MySQL DB Instance.

  2. Click Create Database. In the dialog box that appears, enter the database name test2 and click OK. (Equivalent SQL for this step and the next is sketched after this list.)
  3. Locate the row that contains the test2 database and click Query SQL Statements in the Operation column. On the displayed page, enter the following statement to create the tabletest2 table, and then click Execute SQL:
    CREATE TABLE `tabletest2` (
      `id` int(11) unsigned,
      `name` VARCHAR(32)
    ) ENGINE = InnoDB DEFAULT CHARACTER SET = utf8mb4;
  4. On the RDS console, choose Instances from the navigation pane. Click the name of the created RDS DB instance to view its basic information.
  5. In the Connection Information pane, obtain the floating IP address, database port, VPC, and subnet.
  6. Click the security group name. On the Inbound Rules tab, add a rule that allows access from the CIDR block of the elastic resource pool. For example, if the CIDR block of the elastic resource pool is 172.16.0.0/18 and the database port is 3306, set Priority to 1, Action to Allow, Protocol to TCP, Port to 3306, Type to IPv4, and Source to 172.16.0.0/18.

    Click OK. The security group rule is added.

  7. Log in to the DLI management console. In the navigation pane on the left, choose Datasource Connections. On the displayed page, click Create in the Enhanced tab.
  8. In the displayed dialog box, set the following parameters:
    • Connection Name: Name of the enhanced datasource connection
    • Resource Pool: Select the elastic resource pool created in Step 1: Create an Elastic Resource Pool.
      NOTE:

      If you have not decided on an elastic resource pool yet, you can skip this parameter. After the datasource connection is created, go to the Enhanced tab and choose More > Bind Resource Pool in the Operation column of the row that contains the connection.

    • VPC: Select the VPC of the RDS DB instance obtained in step 5.
    • Subnet: Select the subnet of the RDS DB instance obtained in step 5.
    • Set other parameters as you need.

    Click OK. Click the name of the created datasource connection to view its status. You can perform subsequent steps only after the connection status changes to Active.

  9. Choose Resources > Queue Management and locate the target queue, for example, general_test. In the Operation column, choose More > Test Address Connectivity.
  10. In the displayed dialog box, enter the floating IP address and database port of the RDS DB instance in the format IP address:port in the Address box, and click Test to check whether the database is reachable.
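
If you prefer to run SQL statements directly rather than use the console dialog in steps 2 and 3, the following minimal sketch shows equivalent MySQL statements. The utf8mb4 character set for the database is an assumption carried over from the table definition; adjust it to match your instance settings.

    -- Create the example database (the character set is an assumption).
    CREATE DATABASE test2 DEFAULT CHARACTER SET utf8mb4;

    USE test2;

    -- Create the example table (same definition as in step 3).
    CREATE TABLE `tabletest2` (
      `id` int(11) unsigned,
      `name` VARCHAR(32)
    ) ENGINE = InnoDB DEFAULT CHARACTER SET = utf8mb4;

    -- Verify the table definition.
    SHOW CREATE TABLE tabletest2;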

Step 4: Run a Job

Run a Flink SQL job on a queue in the elastic resource pool.

  1. On the DLI management console, choose Job Management > Flink Jobs. On the Flink Jobs page, click Create Job.
  2. In the Create Job dialog box, set Type to Flink SQL and Name to testFlinkSqlJob. Click OK.
  3. On the job editing page, set the following parameters:
    Figure 5 Creating a Flink SQL job
    • Queue: Select the general_test queue added to the elastic resource pool in Step 2: Add a Queue to the Elastic Resource Pool.
    • Save Job Log: Enable this function.
    • OBS Bucket: Select an OBS bucket for storing job logs and grant access permissions of the OBS bucket as prompted.
    • Enable Checkpointing: Enable this function.
    • Enter the SQL statement in the editing pane. The following is an example. Modify the region, authentication, connection, and table parameters as needed.
      CREATE SINK STREAM car_info (id INT, name STRING) WITH (
        type = "rds",
        region = "", /* Change the value to the current region ID. */
        pwd_auth_name = "xxxxx", /* Name of the password-type datasource authentication created on DLI. If datasource authentication is used, you do not need to set a username and password for the job. */
        db_url = "mysql://192.168.x.x:3306/test2", /* Format: mysql://floating IP address:port of the RDS DB instance/database name */
        table_name = "tabletest2" /* Name of the table in the RDS database */
      );
      INSERT INTO
        car_info
      SELECT
        13,
        'abc';
  4. Click Check Semantic and ensure that the SQL statement passes the check. Click Save. Click Start, confirm the job parameters, and click Start Now to execute the job.
  5. Wait until the job is complete. The job status changes to Completed.
  6. Log in to the RDS console and click the name of the RDS DB instance. On the displayed page, click the name of the created database, for example, test2, and click Query SQL Statements in the Operation column of the row that contains the tabletest2 table.
  7. On the displayed page, click Execute SQL to check whether data has been written into the RDS table. (A sample verification query follows this procedure.)
    Figure 6 Query result
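
As a quick check, the following query is a minimal verification sketch: after one successful run of testFlinkSqlJob, tabletest2 should contain a single row matching the INSERT statement in the job.

    SELECT * FROM tabletest2;
    -- Expected after one successful run: id = 13, name = 'abc'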
