Creating a Data Delivery
Scenario
SecMaster can deliver data to other pipelines or to other cloud products in real time, so that you can store the data or consume it in other systems. After data delivery is configured, SecMaster periodically delivers the collected data to the specified pipelines or cloud products.
Currently, data can be delivered to the following cloud products: Object Storage Service (OBS) and Log Tank Service (LTS).
This section describes how to create a data delivery task.
Prerequisites
- To deliver data to OBS, ensure there is an available bucket whose bucket policy is Public Read and Write.
- To deliver data to LTS, ensure there are an available log group and log stream.
Limitations and Constraints
For cross-account delivery, data can be delivered only to pipelines of other accounts, not to their cloud services.
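The constraint above can be expressed as a simple validation rule. The sketch below is illustrative only; the parameter names (account_type, delivery_type) are assumptions for this example, not the actual SecMaster API schema:

```python
# Illustrative sketch only: "account_type" and "delivery_type" are assumed
# names mirroring the console parameters, not the real SecMaster API.

VALID_DESTINATIONS = {"PIPE", "LTS", "OBS"}

def validate_destination(account_type: str, delivery_type: str) -> None:
    """Reject destination combinations that SecMaster does not support."""
    if delivery_type not in VALID_DESTINATIONS:
        raise ValueError(f"unknown delivery type: {delivery_type}")
    # Cross-account delivery is limited to pipelines: LTS and OBS
    # destinations are available only under the current account.
    if account_type == "Other" and delivery_type != "PIPE":
        raise ValueError("cross-account delivery supports only PIPE destinations")

validate_destination("Other", "PIPE")   # allowed: cross-account pipeline
validate_destination("Current", "OBS")  # allowed: OBS under the current account
```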
Creating a Data Delivery
- Log in to the management console.
- Click in the upper part of the page and choose Security > SecMaster.
- In the navigation pane on the left, choose Workspaces > Management. In the workspace list, click the name of the target workspace.
Figure 1 Workspace management page
- In the navigation pane on the left, choose Threat Operations > Security Analysis. The security analysis page is displayed.
Figure 2 Accessing the Security Analysis tab page
- In the data space navigation tree on the left, click the data space name to expand all pipelines. Next to the name of the target pipeline, click More > Deliver.
Figure 3 Accessing data delivery settings page
- (Optional) If this is the first delivery to the destination type, grant the required authorization. If authorization has already been granted, skip this step.
Confirm the authorization information, select Agree to authorize, and click OK.
- On the Create Delivery page, set data delivery parameters.
- Configure basic information.
Table 1 Basic information
- Delivery Name: Customized delivery rule name.
- Resource Consumption: The value is generated by default and does not need to be configured.
- Configure the data source.
In the Data Source Settings area, details about the current pipeline are displayed. You do not need to configure these parameters.
Table 2 Data source parameters
- Delivery Type: Delivery destination type. The default value is PIPE.
- Region: Region where the current pipeline is located.
- Workspace: Workspace to which the current pipeline belongs.
- Data Spaces: Data space to which the current pipeline belongs.
- Pipeline: Pipeline name.
- Data Read Policy: Data read policy of the current pipeline.
- Read By: Identity of the data source reader.
- Configure the delivery destination.
- PIPE: Deliver the current pipeline data to other pipelines of the current account or pipelines of other accounts. Set this parameter as required.
- Current: Deliver the current pipeline data to another pipeline of the current account. For details about the parameters, see Table 3.
Table 3 Destination parameters - Current account pipeline
- Account Type: Account type of the data delivery destination. Select Current.
- Delivery Type: Delivery type. Select PIPE.
- Workspace: Workspace where the destination pipeline is located.
- Data Spaces: Data space where the destination pipeline is located.
- Pipeline: Destination pipeline.
- Written To: The value is generated by default and does not need to be configured.
- Cross-account delivery: Deliver the current pipeline data to the pipeline of another account. For details about the parameters, see Table 4.
Table 4 Destination parameters - Pipeline of another account
- Account Type: Account type of the data delivery destination. Select Other.
- Delivery Type: Delivery type. Select PIPE.
- Account ID: ID of the account to which the destination pipeline belongs.
- Workspace ID: ID of the workspace where the destination pipeline is located. For details about how to query the workspace ID, see 6.
- Data Space ID: ID of the data space where the destination pipeline is located. For details about how to query the data space ID, see 6.
- Pipeline ID: ID of the destination pipeline. For details about how to query the pipeline ID, see 6.
- Written To: The value is generated by default and does not need to be configured.
- LTS: Deliver the pipeline data to LTS. For details about the parameter settings, see Table 5.
To deliver data to LTS, ensure there are an available log group and log stream.
Table 5 Destination parameters - LTS
- Account Type: Account type of the data delivery destination. When delivering data to LTS, only Current can be selected.
- Delivery Type: Delivery type. Select LTS.
- Log Group: Destination LTS log group.
- Log Stream: Destination LTS log stream.
- Written To: The value is generated by default and does not need to be configured.
- OBS: Deliver the pipeline data to OBS. For details about the parameter settings, see Table 6.
To deliver data to OBS, ensure there is an available bucket whose bucket policy is Public Read and Write.
Table 6 Destination parameters - OBS
- Account Type: Account type of the data delivery destination. When delivering data to OBS, only Current can be selected.
- Delivery Type: Delivery type. Select OBS.
- Bucket Name: Name of the destination OBS bucket.
- Written To: The value is generated by default and does not need to be configured.
- Under Access Authorization, view the permissions granted in 6.
A delivery task requires read and write permissions to access your cloud resources. After authorization, the delivery task can access your cloud resources.
- Click OK.
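The parameters collected in the console map naturally onto a structured delivery definition. The sketch below is purely illustrative, mirroring the fields in Tables 3, 5, and 6; the key names are assumptions for this example, not the actual SecMaster API request schema:

```python
# Purely illustrative: the dictionary keys below mirror the console
# parameters in Tables 3, 5, and 6, and are assumptions for this sketch,
# not the real SecMaster API schema.

REQUIRED_FIELDS = {
    "PIPE": {"workspace", "data_space", "pipeline"},
    "LTS": {"log_group", "log_stream"},
    "OBS": {"bucket_name"},
}

def make_delivery(name: str, destination: dict) -> dict:
    """Assemble a delivery definition, checking the destination block
    has every field its delivery type requires."""
    dtype = destination["delivery_type"]
    missing = REQUIRED_FIELDS[dtype] - destination.keys()
    if missing:
        raise ValueError(f"missing fields for {dtype}: {sorted(missing)}")
    return {"delivery_name": name, "destination": destination}

# Destination block per Table 5 (LTS): current account only.
lts_dest = {
    "account_type": "Current",
    "delivery_type": "LTS",
    "log_group": "secmaster-logs",      # hypothetical log group name
    "log_stream": "delivery-stream",    # hypothetical log stream name
}
task = make_delivery("demo-delivery", lts_dest)
print(task["destination"]["delivery_type"])  # LTS
```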
Follow-up Operation
After a data delivery task is added, you need to grant the delivery permission. The delivery takes effect only after you accept the authorization. For details, see Data Delivery Authorization.