
Creating a Data Delivery

Updated on 2024-07-18 GMT+08:00

Scenario

SecMaster can deliver data in real time to other pipelines or to other cloud products so that you can store the data or consume it with other systems. After data delivery is configured, SecMaster periodically delivers the collected data to the specified pipelines or cloud products.

Currently, data can be delivered to the following cloud products: Object Storage Service (OBS) and Log Tank Service (LTS).

This section describes how to create a data delivery task.

Prerequisites

  • To deliver data to OBS, ensure there is an available bucket whose bucket policy is Public Read and Write.
  • To deliver data to LTS, ensure there is an available log group and log streams.
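The OBS prerequisite corresponds to selecting the Public Read and Write preset in the bucket's policy settings. As a rough illustration only, the sketch below builds such a policy as a JSON document, assuming OBS's S3-style policy grammar; the bucket name is a hypothetical placeholder, and the exact field names may differ from what the OBS console generates.

```python
import json

# Sketch only: a public read/write bucket policy, assuming OBS's
# S3-style policy grammar. "example-secmaster-delivery" is a
# hypothetical bucket name. In the console, this corresponds to
# choosing the "Public Read and Write" preset for the bucket.
policy = {
    "Statement": [
        {
            "Sid": "PublicReadWrite",
            "Effect": "Allow",
            "Principal": {"ID": ["*"]},          # any account
            "Action": ["GetObject", "PutObject"],  # read and write objects
            "Resource": ["example-secmaster-delivery/*"],
        }
    ]
}

print(json.dumps(policy, indent=2))
```

In practice you would apply the preset in the OBS console rather than hand-writing the JSON; the sketch just makes explicit what "Public Read and Write" grants.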

Limitations and Constraints

In cross-account delivery, data can be delivered only to pipelines of the other account, not to its cloud services.
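The constraint above can be encoded as a simple validity check. This is an illustrative sketch: the parameter names (`account_type`, `delivery_type`) mirror the console fields described later in this section, not SecMaster's actual API schema.

```python
# Sketch: encode SecMaster's cross-account delivery constraint as a check.
# Field names are illustrative, not the actual API schema.

def is_valid_delivery(account_type: str, delivery_type: str) -> bool:
    """Cross-account ("Other") deliveries may target only PIPE,
    not cloud services such as LTS or OBS."""
    if account_type == "Other":
        return delivery_type == "PIPE"
    # Current-account deliveries may target PIPE, LTS, or OBS.
    return delivery_type in ("PIPE", "LTS", "OBS")

print(is_valid_delivery("Other", "PIPE"))   # cross-account to a pipeline: allowed
print(is_valid_delivery("Other", "OBS"))    # cross-account to a cloud service: rejected
```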

Creating a Data Delivery

  1. Log in to the management console.
  2. Click the service list icon in the upper part of the page and choose Security > SecMaster.
  3. In the navigation pane on the left, choose Workspaces > Management. In the workspace list, click the name of the target workspace.

    Figure 1 Workspace management page

  4. In the navigation pane on the left, choose Threat Operations > Security Analysis. The security analysis page is displayed.

    Figure 2 Accessing the Security Analysis tab page

  5. In the data space navigation tree on the left, click the data space name to expand all pipelines. Next to the name of the target pipeline, click More > Deliver.

    Figure 3 Accessing data delivery settings page

  6. (Optional) Grant authorization for the destination type if this is your first delivery to that type. If authorization has already been granted, skip this step.

    Confirm the authorization information, select Agree to authorize, and click OK.

  7. On the Create Delivery page, set data delivery parameters.

    1. Configure basic information.
      Table 1 Basic information

      • Delivery Name: Custom name of the delivery rule.
      • Resource Consumption: Generated by default; no configuration is needed.

    2. Configure the data source.
      In the Data Source Settings area, the details of the current pipeline are displayed. You do not need to set these parameters.
      Table 2 Data source parameters

      • Delivery Type: Delivery destination type. The default value is PIPE.
      • Region: Region where the current pipeline is located.
      • Workspace: Workspace to which the current pipeline belongs.
      • Data Spaces: Data space to which the current pipeline belongs.
      • Pipeline: Name of the current pipeline.
      • Data Read Policy: Data read policy of the current pipeline.
      • Read By: Identity of the data source reader.

    3. Configure the delivery destination.
      • PIPE: Deliver the current pipeline data to other pipelines of the current account or pipelines of other accounts. Set this parameter as required.
        • Current: Deliver the current pipeline data to another pipeline of the current account. For details about the parameters, see Table 3.
          Table 3 Destination parameters - Current account pipeline

          • Account Type: Account type of the data delivery destination. Select Current.
          • Delivery Type: Delivery type. Select PIPE.
          • Workspace: Workspace where the destination pipeline is located.
          • Data Spaces: Data space where the destination pipeline is located.
          • Pipeline: Destination pipeline.
          • Written To: Generated by default; no configuration is needed.

        • Cross-account delivery: Deliver the current pipeline data to the pipeline of another account. For details about the parameters, see Table 4.
          Table 4 Destination parameters - Pipeline of another account

          • Account Type: Account type of the data delivery destination. Select Other.
          • Delivery Type: Delivery type. Select PIPE.
          • Account ID: ID of the account to which the destination pipeline belongs.
          • Workspace ID: ID of the workspace where the destination pipeline is located. For how to query the workspace ID, see 6.
          • Data Space ID: ID of the data space where the destination pipeline is located. For how to query the data space ID, see 6.
          • Pipeline ID: ID of the destination pipeline. For how to query the pipeline ID, see 6.
          • Written To: Generated by default; no configuration is needed.
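The cross-account parameters can be pictured as one record that must be fully filled in before the delivery is created. The sketch below is illustrative only: the field names and all ID values are hypothetical placeholders, and the real workspace, data space, and pipeline IDs must be queried in the destination account.

```python
# Sketch: the cross-account destination parameters from Table 4 gathered
# into one record. All IDs below are hypothetical placeholders; query the
# real IDs in the destination account before creating the delivery.
destination = {
    "account_type": "Other",        # cross-account delivery
    "delivery_type": "PIPE",        # only pipelines are allowed cross-account
    "account_id": "0000000000example",
    "workspace_id": "ws-example",
    "data_space_id": "ds-example",
    "pipeline_id": "pipe-example",
}

# Every ID must be filled in before the delivery can be created.
required = ("account_id", "workspace_id", "data_space_id", "pipeline_id")
missing = [k for k in required if not destination.get(k)]
assert not missing, f"missing destination IDs: {missing}"
```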

      • LTS: Deliver the pipeline data to LTS. For details about the parameter settings, see Table 5.

        To deliver data to LTS, ensure there is an available log group and log streams.

        Table 5 Destination parameters - LTS

        • Account Type: Account type of the data delivery destination. When delivering data to LTS, only the Current account type can be selected.
        • Delivery Type: Delivery type. Select LTS.
        • Log Group: Destination LTS log group.
        • Log Stream: Destination LTS log stream.
        • Written To: Generated by default; no configuration is needed.

      • OBS: Deliver the pipeline data to OBS. For details about the parameter settings, see Table 6.
        To deliver data to OBS, ensure there is an available bucket whose bucket policy is Public Read and Write.
        Table 6 Destination parameters - OBS

        • Account Type: Account type of the data delivery destination. When delivering data to OBS, only the Current account type can be selected.
        • Delivery Type: Delivery type. Select OBS.
        • Bucket Name: Name of the destination OBS bucket.
        • Written To: Generated by default; no configuration is needed.

    4. Under Access Authorization, view the permissions granted in step 6.

      A delivery task requires read and write permissions on your cloud resources. It can access them only after you grant the authorization.

  8. Click OK.
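The choices made in step 7 can be summarized as a single delivery definition: a name, a fixed source (the current pipeline), and one destination. The sketch below uses an LTS destination as the example; all field names and values are illustrative placeholders, not SecMaster's actual API schema.

```python
# Sketch: the choices from step 7 gathered into one delivery definition.
# Field names and values are illustrative placeholders only.
delivery = {
    "name": "example-delivery",      # Table 1: Delivery Name
    "source": {                      # Table 2: fixed to the current pipeline
        "delivery_type": "PIPE",
        "workspace": "example-workspace",
        "pipeline": "example-pipeline",
    },
    "destination": {                 # Table 5: an LTS destination
        "account_type": "Current",
        "delivery_type": "LTS",
        "log_group": "example-log-group",
        "log_stream": "example-log-stream",
    },
}

# LTS and OBS destinations support only the current account (Tables 5 and 6).
if delivery["destination"]["delivery_type"] in ("LTS", "OBS"):
    assert delivery["destination"]["account_type"] == "Current"
```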

Follow-up Operation

After a data delivery task is added, you need to grant the delivery permission. The delivery takes effect only after you accept the authorization. For details, see Data Delivery Authorization.
