Data Delivery Overview
Updated on 2025-01-20 GMT+08:00
Scenario
SecMaster can deliver data to other pipelines or other cloud products in real time so that you can store the data or consume it with other systems. After data delivery is configured, SecMaster periodically delivers the collected data to the specified pipelines or cloud products.
Currently, SecMaster supports the following data delivery destinations:
- Other pipelines: You can deliver log data to other pipelines.
- OBS buckets: You can deliver log data to Object Storage Service (OBS) buckets.
- LTS: You can deliver log data to Log Tank Service (LTS).
You can manage data delivery tasks, including viewing, suspending, starting, and deleting them.
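To make the destination types concrete, the following is an illustrative sketch in Python. It is hypothetical: the field names, pipeline names, bucket, and log group/stream identifiers are invented for illustration and do not reflect SecMaster's actual delivery configuration schema or API.
```python
# Hypothetical model of SecMaster data delivery tasks (illustration only).
# Each task pairs a source pipeline with one destination of a supported type:
# another pipeline, an OBS bucket, or an LTS log group/stream.
delivery_tasks = [
    {
        "name": "waf-logs-to-obs",                  # hypothetical task name
        "source_pipeline": "secmaster-waf-access",  # hypothetical pipeline
        "destination": {"type": "OBS", "bucket": "secmaster-delivery-logs"},
        "status": "RUNNING",  # tasks can be suspended, started, or deleted
    },
    {
        "name": "hss-alerts-to-lts",
        "source_pipeline": "secmaster-hss-alarm",
        "destination": {
            "type": "LTS",
            "log_group": "lts-group-secmaster",
            "log_stream": "lts-stream-hss",
        },
        "status": "SUSPENDED",
    },
    {
        "name": "share-flow-logs",
        "source_pipeline": "secmaster-cfw-flow",
        "destination": {"type": "PIPELINE", "target_pipeline": "soc-team-pipeline"},
        "status": "RUNNING",
    },
]

# Group task names by destination type to see where each category of data goes.
by_destination = {}
for task in delivery_tasks:
    by_destination.setdefault(task["destination"]["type"], []).append(task["name"])
print(by_destination)
# {'OBS': ['waf-logs-to-obs'], 'LTS': ['hss-alerts-to-lts'], 'PIPELINE': ['share-flow-logs']}
```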
Advantages
- Simple operation: You only need to complete a few configurations on the console to deliver SecMaster data to other cloud products such as OBS.
- Data centralization: SecMaster centralizes data from different services. You only need to deliver the collected data to other cloud products such as OBS for centralized data management.
- Category management: SecMaster manages collected data by category. You can use this function to deliver data of different projects and types to different cloud products.
Prerequisites
- If you want to deliver data to an OBS bucket, the bucket must use a private, public read, or public read/write policy. Currently, parallel file systems are not supported. For details, see Creating an OBS Bucket. (A sketch of creating such a bucket programmatically follows this list.)
- To deliver data to LTS, ensure there are available log groups and log streams. For details, see Managing Log Groups and Managing Log Streams.
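If you prefer to prepare the OBS bucket programmatically rather than on the console, the following is a minimal sketch assuming the OBS Python SDK (esdk-obs-python). The endpoint, region, credentials, and bucket name are placeholders; creating the bucket on the OBS console as described in Creating an OBS Bucket works just as well.
```python
# Minimal sketch, assuming the OBS Python SDK (esdk-obs-python).
# Endpoint, region, credentials, and bucket name below are placeholders.
from obs import ObsClient, CreateBucketHeader

client = ObsClient(
    access_key_id="YOUR_AK",
    secret_access_key="YOUR_SK",
    server="https://obs.ap-southeast-1.myhuaweicloud.com",  # placeholder endpoint
)
try:
    # aclControl accepts 'private', 'public-read', or 'public-read-write',
    # matching the bucket policies that SecMaster data delivery supports.
    header = CreateBucketHeader(aclControl="private")
    resp = client.createBucket("secmaster-delivery-logs", header, "ap-southeast-1")
    if resp.status < 300:
        print("Bucket created, request ID:", resp.requestId)
    else:
        print("Failed:", resp.errorCode, resp.errorMessage)
finally:
    client.close()
```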
Limitations and Constraints
- Cross-account delivery can send data only to pipelines of other accounts, not to their cloud services.
- If a new data delivery task is cross-account, you need to log in to SecMaster as the destination account and authorize the delivery.
Parent topic: Data Delivery