
Real-time Data Processing Practices

This document is written based on the practices of CloudTable Service (CloudTable for short) on HUAWEI CLOUD and provides guidance for you to use CloudTable to implement real-time IoT data storage and analysis.

This document covers the scenario, the solution, and the operation procedure (Step 1 to Step 7).

Scenario

Description

This project uses boiler anomaly detection at a water supply company as an example to familiarize you with real-time data ingestion, processing, and display. Through this project, you will learn how to use Data Ingestion Service (DIS), Cloud Stream Service (CS), and CloudTable Service (CloudTable) on HUAWEI CLOUD.

In this project, real-time data generated by boiler monitoring devices is simulated and uploaded to HUAWEI CLOUD. HUAWEI CLOUD services detect boiler anomalies and output reports that show the running status of boilers in different dimensions, so that alarms are generated when a boiler is abnormal and its running status can be observed continuously. The following figure shows a typical scenario.

Figure 1 Typical scenario

A customer expects that the running status of boilers can be detected and reported so that when the temperature, pressure, and water line of boilers exceed their thresholds, the regional inspectors can be notified for emergency handling. In addition, real-time data must be observed continuously to help the customer check the health status of boilers and decide whether the boilers need to be replaced or repaired by an engineering maintenance team.

Data Description

The following is a sample data record reported by a monitoring device.

{"equId":"15","equType":"2","zoneId":"1","uploadTime":1527000545560,"runningTime":52.0,"temperature":66.9,"pressure":1.31,"waterLine":42.6,"targetTemperature":68.0,"targetPressure":2.07,"targetWaterLine":38.0,"feedWater":0.116,"noxEmissions":50.3,"unitLoad":83.9}

The data definition and details are as follows:

Table 1 Data definition

| Parameter | Type | Description |
| --- | --- | --- |
| equId | string | Device ID |
| equType | string | Device type |
| zoneId | string | Zone ID |
| uploadTime | long | Upload time |
| runningTime | double | Running time (hour) |
| temperature | double | Temperature (°C) |
| pressure | double | Pressure (MPa) |
| waterLine | double | Water line (cm) |
| targetTemperature | double | Target temperature |
| targetPressure | double | Target pressure |
| targetWaterLine | double | Target water line |
| feedWater | double | Feedwater (m³/h) |
| noxEmissions | double | NOx emissions (mg/Nm³) |
| unitLoad | double | Unit load (%) |
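
For reference, the following is a minimal Python sketch (illustration only) that models one monitoring record with the field names and types from Table 1 and re-encodes the sample record shown above as JSON.

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class BoilerReading:
        """One monitoring record, with field names and types as defined in Table 1."""
        equId: str
        equType: str
        zoneId: str
        uploadTime: int          # epoch time in milliseconds
        runningTime: float       # hours
        temperature: float       # degrees Celsius
        pressure: float          # MPa
        waterLine: float         # cm
        targetTemperature: float
        targetPressure: float
        targetWaterLine: float
        feedWater: float         # m3/h
        noxEmissions: float      # mg/Nm3
        unitLoad: float          # %

    # The sample record shown above, re-encoded as the JSON that is uploaded via DIS.
    sample = BoilerReading(
        equId="15", equType="2", zoneId="1", uploadTime=1527000545560,
        runningTime=52.0, temperature=66.9, pressure=1.31, waterLine=42.6,
        targetTemperature=68.0, targetPressure=2.07, targetWaterLine=38.0,
        feedWater=0.116, noxEmissions=50.3, unitLoad=83.9,
    )
    print(json.dumps(asdict(sample)))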

Solution

Solution architecture: Device running data is collected by the boiler monitoring device and uploaded to HUAWEI CLOUD using DIS. The data is then processed based on the rules configured in CS. If an exception occurs, an alarm is triggered and sent to you via SMS using the Simple Message Notification (SMN) service. At the same time, the data is written to CloudTable (OpenTSDB), and Grafana continuously monitors the running data of the devices so that you can check their health status.

Figure 2 Solution architecture

The following cloud services are involved:

  • DIS: DIS uploads data from the detection device to the cloud and imports the data to CS in real time.
  • CS: CS configures data processing rules and triggers SMN to send alarm messages when exceptions occur.
  • CloudTable: CloudTable provides OpenTSDB and interconnects with Grafana so that you can observe the running status of devices in real time and check their health status (see the data point sketch after this list).
  • SMN: After you subscribe to SMS notifications on SMN, your mobile phone receives an SMS message when an alarm is triggered, for example, when boiler pressure is too high.
  • ECS: Elastic Cloud Server. The ECS is preinstalled with the DemoUI, which demonstrates EI real-time data. You can use the image to quickly interconnect with your production environment and enable data display. For example, in this scenario, the customer can query the boiler monitoring information on the web UI.
  • DLF: Data Lake Factory. On the DLF Dashboard, you purchase the IoT Real-time Data Processing and Storage package, which enables and configures DIS, CS, CloudTable, and DemoUI in one click to deploy an application that can quickly process IoT real-time streams.
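
As a rough illustration of how the device data is stored, the sketch below writes a single data point through OpenTSDB's standard HTTP /api/put endpoint. In this practice the CS job writes the points automatically, so this sketch only shows the metric/timestamp/value/tags data model; the OpenTSDB link is a hypothetical placeholder.

    import json
    import urllib.request

    # Hypothetical OpenTSDB link of a CloudTable cluster; replace it with the link shown
    # on your cluster details page (for example, http://opentsdb-xxxx.cloudtable.com:4242).
    OPENTSDB_URL = "http://opentsdb-example.cloudtable.com:4242"

    # One data point: metric + millisecond timestamp + value + tags. This mirrors the
    # tsdb_metrics / tsdb_timestamps / tsdb_values / tsdb_tags mapping used by the CS job later.
    point = {
        "metric": "pressure",
        "timestamp": 1527000545560,
        "value": 1.31,
        "tags": {"equId": "15", "equType": "2", "zoneId": "1"},
    }

    req = urllib.request.Request(
        OPENTSDB_URL + "/api/put",
        data=json.dumps(point).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # OpenTSDB returns 204 No Content when the data point is accepted.
        print(resp.status)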

Tasks

1. Your mobile phone can receive alarm messages.

2. You can view the running status of devices on Grafana.

Step 1: Purchase the IoT Real-time Data Processing and Storage Package

On the Dashboard of DLF, purchase the IoT Real-time Data Processing and Storage package so that you can enable and configure DIS, CS, CloudTable, and DemoUI in one click to deploy an application that can quickly process IoT real-time streams.

  1. Log in to the public cloud management console, click in the upper left corner, and choose CN North-Beijing1.

    Figure 3 Selecting a region

  2. Create a VPC.

    Virtual Private Cloud (VPC) creates an isolated virtual network that can be configured and managed by users as required for cloud services, improving resource security and simplifying network deployment.

    Before purchasing the package, perform the following steps to create a VPC:

    1. In the top navigation bar on the management console, click Service List and choose Network > Virtual Private Cloud. The VPC service page is displayed.
    2. On the service page of VPC, click Create VPC in the upper right corner.
      Figure 4 VPC page
    3. On the Create VPC page, set the following parameters and click Create Now.
      • Region: CN North-Beijing1
      • Name: vpc-demo (You can specify a name according to the naming rule.)
      • CIDR Block: Retain the default value.
      • Tag: This field is left blank by default.
      • AZ: AZ2
      • Name: subnet-demo (You can specify a name according to the naming rule.)
      • CIDR Block: Retain the default value.
      • Advanced Settings: Select Default.
      Figure 5 Creating a VPC
    4. Return to the VPC list and view the newly created VPC.
      Figure 6 Viewing the VPC

  3. If you use CS for the first time, you need to enable CS first. For details, see Enabling CS.

    If you have enabled CS before, skip this step.

  4. In the top navigation bar on the management console, click Service List, and choose EI Enterprise Intelligence > Data Lake Factory. The DLF management console is displayed.
  5. In the IoT Real-Time Data Processing and Storage area, click Purchase Package.

    Figure 7 Purchasing the package

  6. On the purchase page, set the following parameters:

    • Name: Enter a name as required, for example, IoT-0109. This package includes DIS, CS, CloudTable, and ECSs preinstalled with DemoUI, all of which are named after the name you enter.
    • AZ: Retain the default value.
    • Partitions: Retain the default value.
    • SPU Quota: Retain the default value.
    • RS Units: Retain the default value.
    • TSD Units: Retain the default value.
    • Advanced Feature: Select this option for CloudTable.
    • VPC and Subnet: Select the VPC and subnet created in 2 (vpc-demo and subnet-demo).
    • Security Group: Select the default security group Sys-default.
    • Application Name: Select DemoUI. After you purchase DemoUI, the ECS and the EIP are automatically included. The username and password for logging in to the ECS are root and test@1234, respectively. Ensure that ports 8080 and 3000 of the security group are open.
    • Sample Name: Select [IoT]Boiler exception detection.
    • Select SMN Topic: Select the topic created in Step 2: Enable SMN and Subscribe to SMS Notification.
    Figure 8 Purchasing an IoT solution
    Figure 9 Template installation

  7. Click Buy Now. Then the order details page is displayed. Select "I have read and agree to the Disclaimer", and click Submit.

    Figure 10 Confirming the order

  8. Click Homepage and view the package purchase information. The purchase takes some time to complete. After the package is purchased, Created successfully is displayed in the package area.

    Figure 11 Successful creation

Step 2: Enable SMN and Subscribe to SMS Notification

Enable SMN and subscribe to SMS notifications so that your mobile phone can receive SMS messages when a device alarm is triggered. Perform the following steps:

  1. Log in to the public cloud management console. In the top navigation bar, click Service List, and then select Simple Message Notification under Application.
  2. Click in the upper left corner, and select CN North-Beijing1.
  3. In the left navigation pane, choose Topic Management > Topics, and then click Create Topic in the upper right corner.

    Figure 12 Topic management

    In the displayed Create Topic dialog box, enter a topic name, for example, boiler-alarm, and click OK. After the topic is created, return to the topic list.

    Figure 13 Creating a topic

  4. View the newly created topic in the topic list, and choose More > Add Subscription in the row where the topic resides to add topic subscriptions.

    Figure 14 Topic subscription

    In the displayed dialog box, select SMS for Protocol, and enter your mobile phone number in the Endpoint text box, for example, 15999*****6. Then click OK.

    NOTE:

    Ensure that the registered mobile number can receive SMS messages, and confirm the subscription as prompted in the SMS message.

    Figure 15 Adding subscriptions

  5. Request subscription confirmation on your mobile phone.

    In the left navigation pane, choose Topic Management > Subscriptions. In the Endpoint area, locate the newly added subscription in the subscription list, and click Request Confirmation in the row where the subscription resides.

    Figure 16 Request confirmation

    Click OK in the displayed dialog box.

    Figure 17 Request confirmation

  6. Confirm the subscription.

    After your mobile phone receives the subscription confirmation message, click the link in the message to confirm the subscription. Then refresh the subscription status on the SMN subscription page. The status changes to Confirmed.

    Figure 18 Viewing the subscription status

  7. On the Topics page, locate the topic named boiler-alarm and record its topic URN.

    Example: urn:smn:cn-north-1:fa9ec10d1fcc476f9b33a2c98002fd8c:boiler-alarm

    Figure 19 Obtaining the topic URN
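
The topic URN follows the pattern urn:smn:<region>:<project ID>:<topic name>, as in the example above. The small Python sketch below (assuming that five-field format) splits a URN into its parts so you can double-check the region and topic name before pasting the URN into the CS job script in the next step.

    # Split an SMN topic URN into its parts (format as in the example above:
    # urn:smn:<region>:<project ID>:<topic name>).
    topic_urn = "urn:smn:cn-north-1:fa9ec10d1fcc476f9b33a2c98002fd8c:boiler-alarm"

    prefix, service, region, project_id, topic_name = topic_urn.split(":")
    assert (prefix, service) == ("urn", "smn"), "not an SMN topic URN"

    print("region:", region)          # should match the region used in the CS job, e.g. cn-north-1
    print("topic name:", topic_name)  # should be the topic you created, e.g. boiler-alarm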

Step 3: Configure CS Job Parameters in a Package

A streaming job is automatically created in CS based on the predefined template. The job is named in the boiler-<package name> format, for example, boiler-IoT-0109. This job detects boiler anomalies in real time. You need to edit the job and set its parameters according to actual conditions.

  1. In the top navigation bar on the management console, click Service List, and choose EI Enterprise Intelligence > Data Lake Factory. The DLF management console is displayed.
  2. On the Dashboard page, click View Details in the area of IoT Real-Time Data Processing and Storage. The job development page is displayed.

    Figure 20 IoT package

  3. On the job development page, click the CloudStream operator. On the Properties page on the right, choose Select an existing CS job, and select boiler-IoT-0109 from the Streaming Job Name drop-down list. Click Save on the upper part of the page.

    Figure 21 Job development page

  4. On the Properties page, click next to the Streaming Job Name drop-down list. The job management page is displayed.

    Figure 22 Viewing the job

  5. On the Job Management page, select boiler-IoT-0109 and click Stop. In the confirmation dialog box, click OK. After the job is stopped, click Edit in the row where boiler-IoT-0109 is located.

    Figure 23 Stopping the job

  6. On the job editing page, edit the job based on actual conditions.

    Figure 24 Editing the job
    The following job script is displayed in the job editing window. Configure the parameters that are commented in the script (such as region, channel, topic_urn, and cluster_id) according to actual conditions. For more information, see the parameter description below the job script.
    CREATE SOURCE STREAM boiler (
      equId STRING,
      equType STRING,
      zoneId STRING,
      uploadTime long,
      runningTime double,
      temperature double,
      pressure double,
      waterLine double,
      targetTemperature double,
      targetPressure double,
      targetWaterLine double,
      feedWater double,
      noxEmissions double,
      unitLoad double
    )
    WITH (
      type = "dis",
      region = "cn-north-1",/*Indicates region information.*/
      channel = "IoT-0109",/*Indicates the name of the created DIS stream.*/
      partition_count = "1",
      encode = "json",
      json_config = "equId=equId;equType=equType;zoneId=zoneId;uploadTime=uploadTime;runningTime=runningTime;temperature=temperature;pressure=pressure;waterLine=waterLine;targetTemperature=targetTemperature;targetPressure=targetPressure;targetWaterLine=targetWaterLine;feedWater=feedWater;noxEmissions=noxEmissions;unitLoad=unitLoad;"
    )TIMESTAMP BY proctime.proctime;
     
     
    CREATE SINK STREAM over_pressure_msg (
      equId STRING, 
      equType STRING,
      zoneId STRING,
      pressure double,
      targetPressure double,
      subject STRING
    )
    WITH (
      type = "smn",
      region = "cn-north-1",/*Indicates region information.*/
      topic_urn = "urn:smn:cn-north-1:fa9ec10d1fcc476f9b33a2c98002fd8c:boiler-alarm",/*Indicates the topic URN created using SMN.*/
      message_subject = "Boiler-Alarm",
      message_column = "subject"
    );
     
    /** Creates the output stream and outputs results to CloudTable.
      *
      * Modify the following options based on actual conditions. (For configuration items that support dynamic column names, the dynamic column names are represented by ${Column name}. If multiple data points need to be inserted, separate them with semicolons (;).)
      * cluster_id: ID of the cluster to which the data table to be inserted belongs
      * tsdb_metrics: Metric of a data point. Metrics can be optionally dynamic.
      * tsdb_timestamps: Timestamp of a data point. Only dynamic columns are supported.
      * tsdb_values: Value of a data point. The value can be a dynamic column or a constant value.
      * tsdb_tags: Tags of a data point. Each data point contains at least one and at most eight tags. Tags can be optionally dynamic.
      * batch_insert_data_num: Amount of data to be written in batches at a time. The value must be a positive integer. The upper limit is 100. The default value is 8.
      **/
    CREATE SINK STREAM boiler_tsdb (
      equId STRING,
      equType STRING,
      zoneId STRING,
      uploadTime long,
      runningTime double,
      temperature double,
      pressure double,
      waterLine double,
      targetTemperature double,
      targetPressure double,
      targetWaterLine double,
      feedWater double,
      noxEmissions double,
      unitLoad double
    )
    WITH (
      type = "opentsdb",
      region = "cn-north-1",/*Indicates region information.*/
      cluster_id = "2ada63e3-b237-45e0-8d89-4d108a6a2e9f",/*Indicates the ID of the cluster created using CloudTable.*/
      tsdb_metrics = "runningTime;temperature;pressure;waterLine;targetTemperature;targetPressure;targetWaterLine;feedWater;noxEmissions;unitLoad",
      tsdb_timestamps = "${uploadTime}",
      tsdb_values = "${runningTime};${temperature};${pressure};${waterLine};${targetTemperature};${targetPressure};${targetWaterLine};${feedWater};${noxEmissions};${unitLoad}",
      tsdb_tags = "equType:${equType},zoneId:${zoneId},equId:${equId}",
      batch_insert_data_num = "100"
    );
     
    /**Outputs some fields.**/
    INSERT INTO over_pressure_msg
    SELECT equId,equType,zoneId,pressure,targetPressure,"EquId "||equId||" targetPressure is "||CAST(targetPressure as VARCHAR(8))||", but now pressure is "||CAST(pressure as VARCHAR(8))||", please check it quickly."
    FROM 
    (SELECT equId, equType, zoneId, targetPressure, MAX(pressure) as pressure, count(pressure) as cnt1 FROM boiler WHERE pressure > targetPressure GROUP BY TUMBLE(proctime, INTERVAL '30' SECOND), equId, equType, zoneId, targetPressure) WHERE cnt1 >= 4;
     
     
    /**Outputs the information.**/
    INSERT INTO boiler_tsdb
    SELECT 
     equId,
     equType,
     zoneId,
     uploadTime,
     runningTime,
     temperature,
     pressure,
     waterLine,
     targetTemperature,
     targetPressure,
     targetWaterLine,
     feedWater,
     noxEmissions,
     unitLoad
    FROM boiler;
    
    The parameters are described as follows:
    • region: Set this parameter to the region you select when purchasing the package. For example, set it to cn-north-1, which represents the CN North-Beijing1 region. For more information about regions, see Regions and Endpoints.
    • channel: name of the DIS stream. After a package is purchased, a DIS stream named after the package name is automatically created. In this example, the DIS stream name is IoT-0109.
    • topic_urn: Set this parameter to the topic URN obtained in Step 2: Enable SMN and Subscribe to SMS Notification.
    • cluster_id: ID of the CloudTable cluster
      Log in to the CloudTable management console. In the left navigation pane, choose Cluster Mode. In the cluster list, locate the desired cluster and click the cluster name. On the cluster details page that is displayed, obtain the CloudTable cluster ID. See the following figure.
      Figure 25 Cluster information

  7. Click Check Semantics. If the check is successful, click Save.

    Figure 26 Editing the job
    NOTE:

    The following statement in the job controls the threshold for boiler anomaly detection. Alarms are generated when the pressure exceeds the targetPressure value at least four times within a 30-second window (see the Python sketch after this procedure).

    SELECT equId,equType,zoneId,pressure,targetPressure,"EquId "||equId||" targetPressure is "||CAST(targetPressure as VARCHAR(8))||", but now pressure is "||CAST(pressure as VARCHAR(8))||", please check it quickly."
    FROM
    (SELECT equId, equType, zoneId, targetPressure, MAX(pressure) as pressure, count(pressure) as cnt1 FROM boiler WHERE pressure > targetPressure GROUP BY TUMBLE(proctime, INTERVAL '30' SECOND), equId, equType, zoneId, targetPressure) WHERE cnt1 >= 4;

  8. Click Submit. In the confirmation dialog box that is displayed, click OK.
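
For intuition about the alarm rule described in the note above, here is a minimal Python sketch of the same logic applied to a small in-memory list of hypothetical readings. Note that the CS job groups records by processing time (proctime), whereas this sketch groups them by the record's uploadTime for simplicity.

    from collections import defaultdict

    # Simplified illustration of the alarm rule in the CS job: group readings of each boiler
    # into 30-second tumbling windows and raise an alarm when the pressure exceeds
    # targetPressure at least 4 times within one window.
    WINDOW_MS = 30_000
    ALARM_COUNT = 4

    readings = [
        # (equId, uploadTime in ms, pressure in MPa, targetPressure in MPa); hypothetical data
        ("15", 1527000545560 + i * 5_000, 2.30, 2.07) for i in range(5)
    ]

    over_pressure = defaultdict(int)  # (equId, window start) -> count of over-pressure readings
    for equ_id, upload_time, pressure, target_pressure in readings:
        if pressure > target_pressure:
            window_start = upload_time // WINDOW_MS * WINDOW_MS
            over_pressure[(equ_id, window_start)] += 1

    for (equ_id, window_start), count in over_pressure.items():
        if count >= ALARM_COUNT:
            print(f"ALARM: equId {equ_id} exceeded targetPressure {count} times "
                  f"in the 30s window starting at {window_start}")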

Step 4: Log In to the Web UI and Add a Data Source on the Grafana Page

  1. Obtain the EIP of the DemoUI.

    In the top navigation bar on the management console, click Service List, and choose Computing > Elastic Cloud Server. The ECS management console is displayed. Locate the ECS named IoT-0109 and record the EIP of this ECS.

    Figure 27 Viewing the EIP

  2. Launch the browser and enter a URL, for example, http://ecs.ip:3000 (ecs.ip indicates the EIP of the ECS obtained in 1). Enter the username and password to log in to the IoT service. The default username and password are admin and admin, respectively. Then the following page is displayed:

    Figure 28 WebUI

  3. Click Add data source to add a data source.

    Figure 29 Adding a data source

  4. On the configuration page of the data source, set the following parameters:

    • Name: Enter a user-defined name.
    • Type: Choose OpenTSDB.
    • URL: Set this parameter to http://<OpenTSDB link of the CloudTable cluster>, for example, http://opentsdb-rh3lqn383ikxjv3.cloudtable.com:4242.
      Log in to the CloudTable management console. In the left navigation pane, choose Cluster Mode. In the cluster list, locate the desired cluster and click the cluster name. On the cluster details page that is displayed, obtain the OpenTSDB link of the CloudTable cluster. See the following figure.
      Figure 30 Cluster information
    • Version: Select ==2.3.
    Figure 31 Configuring a data source

  5. Click Save&Test.
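
If Save&Test fails, you can first check that the OpenTSDB link entered above is reachable from your network (for example, from the DemoUI ECS, which is in the same VPC). The following is a minimal Python sketch that calls OpenTSDB's /api/version endpoint, assuming a placeholder link.

    import json
    import urllib.request

    # Placeholder OpenTSDB link of the CloudTable cluster; use the same value entered
    # in the URL field of the Grafana data source.
    OPENTSDB_URL = "http://opentsdb-example.cloudtable.com:4242"

    # /api/version returns OpenTSDB build information as JSON; a successful response
    # means the link is reachable from this host.
    with urllib.request.urlopen(OPENTSDB_URL + "/api/version", timeout=5) as resp:
        info = json.load(resp)
        print("OpenTSDB version:", info.get("version"))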

Step 5: Use the DIS Client to Send Data

A DIS client is provided for this practice. The client simulates boilers to generate abnormal data and uploads the data to HUAWEI CLOUD. You can download the DIS client from OBS on HUAWEI CLOUD.

  1. Set up a computer running the Windows operating system, log in to HUAWEI CLOUD, and access the following address to obtain the DIS client:

    https://dis-publish.obs.myhwclouds.com/datasets/demo/20180529_huarui/dis-boiler-1.0.0.zip

  2. Configure the DIS client.

    Decompress the downloaded dis-boiler-1.0.0.zip package to a local directory and edit the dis.properties file.

    Figure 32 DIS client

    The following provides an example of the dis.properties configuration file. Set the region, ak, sk, projectId, endpoint, and stream_name parameters based on the parameter description below (a quick validation sketch follows this procedure).

    region=cn-north-1
    ak=UKP1SHOLFXYBWTYF7LRH
    sk=pYuYeWtoJsioQHLG8DxzJsb68hNf2N2ueCGXgHpy
    projectId=cbb9c2d6c9d1474f9a50f39baff62337
    endpoint=https://dis.cn-north-1.myhuaweicloud.com:20004
    stream_name=IoT-0109
    data.timestamp=now
    # Number of sent messages. (The value -1 indicates that there is no limit.)
    producer_record_num=-1
    # Length of each message (B)
    producer_record_length=1024
    # Number of sending threads
    producer_thread_num=1
    # Number of messages contained in each request
    producer_request_record_num=200
    # Sleep interval of each request (ms)
    producer_request_sleep_time=1000
     
    # Consumption start position on Consumer (-2 indicates that consumption starts at the earliest location; -1 indicates that consumption starts at the latest location; N (N ≥ 0) indicates that consumption starts at the specified location.)
    consumer_offset=-1
    # Number of consumed partitions (auto: automatically obtains and consumes all partitions; positive integer N (N > 0): consumes data of only the first N partitions.)
    consumer_partition_num=1
    # Maximum number of messages expected to be received each time a consumption request is sent.
    consumer_limit=10000
    # CloudTable data input
    producer_record_data=aaaaaa
     
     
     
    # Number of partitions for creating the stream
    create_partition_num=100
    # Type of the created stream (COMMON indicates a common stream; ADVANCED indicates an advanced stream.)
    create_stream_type=ADVANCED
    # Life cycle of the created stream (1 to 7)
    create_data_duration=7
    # Name of the OBS bucket for dumping the stream (If this parameter is left blank, the stream is not dumped.)
    create_obs_bucket_name=
    # Agency name for dumping the created stream to OBS 
    create_agency_name=
     
    # Number of cycles for automatic tests (creation/uploading/details/deletion) (-1 indicates unlimited runs; N (N > 0) indicates the specified number of runs.)
    auto_run_num=1
    # Character string carried in the stream name during the automatic test, used to mark the stream source.
    auto_run_user_name=dis

    The parameters are described as follows:

    • region: Set this parameter to the region selected in Step 1: Purchase the IoT Real-time Data Processing and Storage Package. For example, set it to cn-north-1, which represents the CN North-Beijing1 region. For more information about regions, see Regions and Endpoints.
    • ak and sk: AK/SK (Access Key ID/Secret Access Key)

      Log in to the HUAWEI CLOUD management console, move your cursor over your account in the upper right corner, and click My Credential from the drop-down list.

      Figure 33 My credential

      On the My Credential page, click Access Keys, and then click Add Access Key to generate a key file. Download and save the file, and keep it secure. If an access key has already been created, view the key file to obtain the AK/SK information.

      Figure 34 Adding an access key
    • projectId: project ID. Log in to the HUAWEI CLOUD management console, move your cursor over your account in the upper right corner, click My Credential from the drop-down list, and then view project IDs of different regions on the Projects tab page.
      Figure 35 Obtaining a project ID
    • endpoint: Set this parameter to https://<DIS endpoint>. The following provides endpoint information of DIS. For details about regions and endpoints, see Regions and Endpoints.
      Table 2 Information about regions and endpoints

      | Region Name | Region | Endpoint | Protocol Type |
      | --- | --- | --- | --- |
      | CN North-Beijing1 | cn-north-1 | dis.cn-north-1.myhuaweicloud.com:20004 | HTTPS |
      | CN South-Guangzhou | cn-south-1 | dis.cn-south-1.myhuaweicloud.com | HTTPS |

    • stream_name: name of the DIS stream. After Step 1: Purchase the IoT Real-time Data Processing and Storage Package has been completed successfully, a DIS stream named after the package name is automatically created. In this example, the DIS stream name is IoT-0109.

  3. Start the DIS client to send data.

    On the DIS client host, go to the DIS client directory dis-boiler-1.0.0\bin\. Double-click start_producer.bat to start the DIS client to send data.

    Figure 36 Sending data
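
Before starting the client, you can optionally run a quick sanity check to confirm that the required keys in dis.properties have been filled in. The following Python sketch assumes the file is in the current directory; adjust the path as needed.

    # Quick sanity check of dis.properties before starting the DIS client.
    REQUIRED_KEYS = ["region", "ak", "sk", "projectId", "endpoint", "stream_name"]

    props = {}
    with open("dis.properties", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines and comments
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()

    missing = [k for k in REQUIRED_KEYS if not props.get(k)]
    if missing:
        print("Set the following keys in dis.properties:", ", ".join(missing))
    else:
        print("All required keys are set; stream_name =", props["stream_name"])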

Step 6: View Data on the Web UI

Viewing the Pressure of Boilers on Grafana

  1. Log in to the web UI. For details on the login method, see Step 4: Log In to the Web UI and Add a Data Source on the Grafana Page.
  2. On Grafana, click New dashboard.

    Figure 37 Creating a dashboard

  3. On the New dashboard page, click Graph.

    Figure 38 Selecting Graph

  4. Click Panel Title and then select Edit from the drop-down list.

    Figure 39 Editing

  5. Click the Metric tab, and then select Boiler-Data from the Data Source drop-down list.

    Figure 40 Selecting a data source

  6. On the Metric tab page, set Metric to pressure (boiler pressure). Select avg from the Aggregator drop-down list, and then you can view the average pressure of boilers.

    Figure 41 Viewing results
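
The Grafana panel configured above issues an OpenTSDB query behind the scenes. If you prefer to check the data programmatically, the following Python sketch sends an equivalent query (metric pressure, aggregator avg, last hour) to OpenTSDB's /api/query endpoint, again assuming a placeholder cluster link.

    import json
    import urllib.request

    # Placeholder OpenTSDB link of the CloudTable cluster (the Grafana data source URL).
    OPENTSDB_URL = "http://opentsdb-example.cloudtable.com:4242"

    # Same query as the Grafana panel: average boiler pressure over the last hour.
    query = {
        "start": "1h-ago",
        "queries": [
            {"aggregator": "avg", "metric": "pressure"},
        ],
    }

    req = urllib.request.Request(
        OPENTSDB_URL + "/api/query",
        data=json.dumps(query).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        for series in json.load(resp):
            # Each series contains the metric name, its tags, and timestamp -> value pairs.
            print(series["metric"], series["tags"], list(series["dps"].items())[:3])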

Viewing Alarms on Your Mobile Phone

Your mobile phone receives alarms about the boilers. For details about the alarm rules, see the job script settings in Step 3: Configure CS Job Parameters in a Package.

Step 7: Clear Resources

If you do not want to use the purchased boiler anomaly detection package, you can delete it. The deleted package cannot be recovered. Exercise caution when performing this operation.

  1. Log in to the public cloud management console. In the top navigation bar on the management console, click Service List, and choose EI Enterprise Intelligence > Data Lake Factory. The DLF management console is displayed.
  2. Click in the upper left corner, and select CN North-Beijing1.
  3. In the IoT Real-Time Data Processing and Storage area, click Delete Package. In the displayed dialog box, click OK.

    Figure 42 Deleting the package

  4. After the package is deleted, manually delete the jobs in CS by performing the following steps:

    1. In the top navigation bar on the management console, click Service List, and choose EI Enterprise Intelligence > Cloud Stream Service. The CS management console is displayed.
    2. Click Job Management, select boiler-IoT-0109 from the job list, and click Delete.
    3. In the displayed dialog box, click OK.

  5. Delete the mobile phone subscription.

    1. In the top navigation bar of the management console, click Service List, and then select Simple Message Notification under Application.
    2. In the left navigation pane, choose Topic Management > Subscriptions. In the subscription list, locate the desired subscription URN, and click Delete.
    3. In the left navigation pane, choose Topic Management > Topics. In the topic list, locate the desired topic and click Delete.

  6. Delete the VPC and subnet.

    1. In the top navigation bar on the management console, click Service List and choose Network > Virtual Private Cloud. The VPC service page is displayed.
    2. Delete the subnet. Before deleting the subnet, ensure that the subnet is not bound to other resources.

      In the left navigation pane, choose Virtual Private Cloud. In the VPC list, click vpc-demo. On the subnet page that is displayed, locate subnet-demo in the subnet list and click Delete.

    3. Delete the VPC. Before deleting the VPC, ensure that the VPC is not bound to other resources.

      In the left navigation pane, choose Virtual Private Cloud. In the VPC list, locate vpc-demo and click Delete.