Connector Rules
Source Connectors
SecMaster provides a wide range of source connectors for you to collect security data from your security products.
Table 1 Source connectors

Connector Type | Logstash Plugin | Description
---|---|---
TCP | tcp | Receives TCP logs. For details about the configuration rules, see Table 2.
UDP | udp | Receives UDP logs. For details about the configuration rules, see Table 3.
OBS | obs | Obtains log data from an OBS bucket. For details about the configuration rules, see Table 4.
Kafka | kafka | Obtains Kafka network log data. For details about the configuration rules, see Table 5.
SecMaster | pipe | Transfers SecMaster data out to you. For details about the configuration rules, see Table 6.
Elasticsearch | elasticsearch | Reads data from an Elasticsearch cluster. For details about the configuration rules, see Table 7.
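Each connector corresponds to a Logstash plugin: a source connector becomes an input block and a destination connector becomes an output block in a pipeline configuration. A minimal sketch of that pairing, using hypothetical TCP-in/Kafka-out values, looks like this:

```
# Sketch only: a source connector maps to a Logstash input,
# a destination connector maps to a Logstash output.
input {
  tcp {                 # "TCP" source connector
    port => 1025
  }
}
output {
  kafka {               # "Kafka" destination connector
    bootstrap_servers => "192.168.21.21:9092"   # hypothetical address
    topic_id          => "logstash"
  }
}
```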
Table 2 TCP connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Port | port | number | 1025 | Yes | Port for the collection node. The port ranges from 1025 to 65535 and must be unique for each connector.
Codec | codec | string | plain | Yes | Encoding format.
SSL enable | ssl_enable | boolean | false | No | Whether to enable SSL authentication.
SSL certificate | ssl_cert | file | null | No | SSL certificate.
SSL key | ssl_key | file | -- | No | SSL key.
SSL key passphrase | ssl_key_passphrase | string | -- | No | Passphrase for the SSL key.
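A TCP input assembled from these settings might look like the following sketch; the certificate paths and passphrase are placeholders, and the SSL lines are needed only when ssl_enable is true:

```
input {
  tcp {
    port  => 1025
    codec => "plain"
    # Optional SSL authentication (placeholder paths)
    ssl_enable         => true
    ssl_cert           => "/opt/cloud/logstash/certs/server.crt"
    ssl_key            => "/opt/cloud/logstash/certs/server.key"
    ssl_key_passphrase => "<passphrase>"
  }
}
```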
Table 3 UDP connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Port | port | number | 1025 | Yes | Port for the collection node. The port ranges from 1025 to 65535 and must be unique for each connector.
Codec | codec | string | plain | Yes | Decoding type.
Type | type | string | udp | No | Packet label used for subsequent processing. One data collection channel maps to multiple data connections; you are advised to give each data connection a different packet label so its packets can be distinguished.
Queue size | queue_size | number | 1000000 | No | Queue size.
Receive buffer bytes | receive_buffer_bytes | number | 1000000 | No | Size of the receive buffer, in bytes. Retain the default value.
Buffer size | buffer_size | number | 1000000 | No | Buffer size. Retain the default value.
Workers | workers | number | 1 | No | Number of worker threads. Retain the default value.
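A UDP input built from these settings might look like the sketch below; the packet label value is a placeholder, and the sizing options keep the defaults the table advises:

```
input {
  udp {
    port                 => 1025
    codec                => "plain"
    type                 => "udp-conn-1"     # placeholder packet label
    queue_size           => 1000000
    receive_buffer_bytes => 1000000
    buffer_size          => 1000000
    workers              => 1
  }
}
```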
Table 4 OBS connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Region | Region | string | -- | Yes | Region of the workspace where the data source is located.
Bucket | bucket | string | demo-obs-sec-mrd-datas | Yes | Name of the bucket where the data source is located.
Endpoint | Endpoint | string | -- | Yes | Endpoint address, which must start with https. You can obtain it from the bucket details.
AK | ak | string | -- | No | Access key ID (AK).
SK | sk | string | -- | No | Secret access key (SK).
Prefix | prefix | string | /test | No | Prefix of the folder from which logs are read. Enter the folder prefix that corresponds to the storage path of the data source in the bucket.
Cache folder | temporary_directory | string | /temp | No | Cache folder for reading logs. Data is cached in this folder first and transferred out once it reaches a certain volume. Create the /opt/cloud/logstash/myobs directory on the ECS and use it as the cache folder path.
Packet label | type | string | -- | No | Packet label used for subsequent processing. One data collection channel maps to multiple data connections; you are advised to give each data connection a different packet label so its packets can be distinguished.
Memory path | sincedb_path | string | /opt/cloud/logstash/pipeline/file_name | No | Position from which logs are read. This setting prevents a full-text traversal after a restart. You are advised to retain the default value.
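An OBS input assembled from these settings might look like the sketch below. The obs plugin is SecMaster-specific; the setting names follow the table (written in lowercase, as Logstash options conventionally are), and the region, endpoint, and credentials are placeholders:

```
input {
  obs {
    region              => "<region>"
    bucket              => "demo-obs-sec-mrd-datas"
    endpoint            => "https://obs.<region>.example.com"   # placeholder endpoint
    ak                  => "<access-key>"
    sk                  => "<secret-key>"
    prefix              => "/test"
    temporary_directory => "/opt/cloud/logstash/myobs"
    type                => "obs-conn-1"      # placeholder packet label
    sincedb_path        => "/opt/cloud/logstash/pipeline/file_name"
  }
}
```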
Table 5 Kafka connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Bootstrap servers | bootstrap_servers | string | -- | Yes | Kafka service address.
Topics | topics | array | logstash | Yes | Topics. Multiple topics can be consumed at the same time.
Consumer threads | consumer_threads | number | 1 | Yes | Number of consumer threads.
Auto offset reset | auto_offset_reset | string | latest | No | Offset reset policy.
SSL certificate | ssl_truststore_location | file | -- | No | SSL certificate. This parameter is mandatory when SSL is selected.
SSL key | ssl_truststore_password | string | -- | No | SSL key. This parameter is mandatory when SSL is selected.
Security protocol | security_protocol | string | SASL_SSL | No | Security protocol.
SASL connection configuration | sasl_jaas_config | string | -- | No | SASL connection configuration.
Is_pw_encrypted | is_pw_encrypted | string | false | No | Whether the value is encrypted.
SASL mechanism | sasl_mechanism | string | PLAIN | No | SASL mechanism.
Group ID | group_id | string | -- | No | Consumer group ID.
Set sasl_jaas_config based on the Kafka specifications.
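For the PLAIN mechanism, a typical value takes the following form; the username and password are placeholders:

```
# Placeholder credentials for the PLAIN SASL mechanism
sasl_jaas_config => "org.apache.kafka.common.security.plain.PlainLoginModule required username='<username>' password='<password>';"
```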
Table 6 SecMaster connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Type | type | string | Tenant | Yes | Type.
Pipeline | pipeId | string | -- | Yes | Pipeline ID.
Domain name | domain_name | string | domain_name | Yes | Domain name of the user.
Username | user_name | string | user_name | Yes | Username of the user.
Password | user_password | string | -- | Yes | Password of the user.
Subscription type | subscription_type | string | true | No | Subscription type.
Subscription start | subscription_initial_position | string | true | No | Position from which the subscription starts.
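A pipe input assembled from the mandatory settings might look like the sketch below, assuming the SecMaster-specific plugin name pipe from Table 1; all values are placeholders:

```
input {
  pipe {
    type          => "Tenant"
    pipeId        => "<pipeline-id>"
    domain_name   => "<domain-name>"
    user_name     => "<username>"
    user_password => "<password>"
    # Optional: subscription_type, subscription_initial_position
  }
}
```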
Table 7 Elasticsearch connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Hosts | hosts | array | -- | Yes | Address of the host where the data source is located.
Index | index | string | -- | Yes | Index name.
Query | query | string | -- | Yes | Retrieval statement for reading data.
User | user | string | -- | Yes | Username for logging in to Elasticsearch.
User password | user_password | string | -- | Yes | Password for logging in to Elasticsearch.
Size | size | number | 20 | Yes | Number of hits returned per query.
Scroll | scroll | string | 5m | Yes | How long the scroll search context is kept alive.
Docinfo | docinfo | boolean | true | Yes | Whether to include document metadata.
Is_pw_encrypted | is_pw_encrypted | boolean | true | Yes | Whether the password is encrypted.
SSL | ssl | boolean | true | No | Whether to enable SSL.
CA file | ca_file | file | -- | No | Certificate file.
SSL certificate verification | ssl_certificate_verification | boolean | true | No | Whether to verify the SSL certificate.
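An Elasticsearch input built from these settings might look like the sketch below; the host, index, query, credentials, and certificate path are placeholders, and the setting names follow the table (including user_password):

```
input {
  elasticsearch {
    hosts         => ["<es-host>:9200"]                   # placeholder address
    index         => "<index-name>"
    query         => '{ "query": { "match_all": {} } }'   # placeholder query
    user          => "<username>"
    user_password => "<password>"
    size          => 20
    scroll        => "5m"
    docinfo       => true
    ssl           => true
    ca_file       => "/opt/cloud/logstash/certs/ca.crt"   # placeholder path
  }
}
```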
Destination Connectors
SecMaster provides a wide range of destination connectors for you to send security data to your security products and storage services.
Table 8 Destination connectors

Connector Type | Logstash Plugin | Description
---|---|---
TCP | tcp | Sends TCP logs. For details about the configuration rules, see Table 9.
UDP | udp | Sends UDP logs. For details about the configuration rules, see Table 10.
Kafka | kafka | Writes logs to Kafka message queues. For details about the configuration rules, see Table 11.
OBS | obs | Writes logs to OBS buckets. For details about the configuration rules, see Table 12.
SecMaster pipeline | pipe | Writes logs to a SecMaster pipeline. For details about the configuration rules, see Table 13.
Table 9 TCP connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Port | port | number | 1025 | Yes | Port.
Codec | codec | string | plain | Yes | Decoding type, which can be json_lines or plain.
Host | host | string | 192.168.0.66 | Yes | Host address. Note: The network between the host and the node must be connected.
SSL enable | ssl_enable | boolean | false | No | Whether to enable SSL authentication.
SSL certificate | ssl_cert | file | -- | No | SSL certificate.
SSL key | ssl_key | file | -- | No | SSL key file.
SSL key passphrase | ssl_key_passphrase | string | -- | No | Passphrase for the SSL key.
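A TCP output built from these settings might look like this sketch; the host is the table's sample address, the SSL paths are placeholders, and the SSL lines are needed only when ssl_enable is true:

```
output {
  tcp {
    host  => "192.168.0.66"
    port  => 1025
    codec => "json_lines"
    # Optional SSL authentication (placeholder paths)
    ssl_enable => true
    ssl_cert   => "/opt/cloud/logstash/certs/client.crt"
    ssl_key    => "/opt/cloud/logstash/certs/client.key"
  }
}
```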
Table 10 UDP connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Host | host | string | -- | Yes | Host IP address. Note: The network between the host and the node must be connected.
Port | port | number | 1025 | Yes | Port.
Decoding type | codec | string | json_lines | Yes | Decoding type, which can be json_lines or plain.
Retry count | retry_count | number | 3 | No | Number of retry attempts.
Retry backoff (ms) | retry_backoff_ms | number | 200 | No | Retry backoff interval, in milliseconds.
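A UDP output assembled from these settings might look like the following sketch; the host address is a placeholder:

```
output {
  udp {
    host             => "<host-ip>"        # placeholder address
    port             => 1025
    codec            => "json_lines"
    retry_count      => 3
    retry_backoff_ms => 200
  }
}
```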
Table 11 Kafka connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Service address | bootstrap_servers | string | -- | Yes | Service address, for example, 192.168.21.21:9092,192.168.21.24:9999.
Topic | topic_id | string | logstash | Yes | Topic to write to.
Decoding type | codec | string | plain | Yes | Decoding type, which can be json or plain.
Maximum request size | max_request_size | number | 10485760 | Yes | Maximum length of a request, in bytes.
Security protocol | security_protocol | string | PLAINTEXT | No | Security protocol.
SASL connection configuration | sasl_jaas_config | string | -- | No | SASL connection configuration.
Encrypted | is_pw_encrypted | string | true | No | Whether the value is encrypted.
SASL mechanism | sasl_mechanism | string | PLAIN | No | SASL mechanism.
Set sasl_jaas_config based on the Kafka specifications, using the same format as for the Kafka source connector.
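Putting the table together, a Kafka output might look like the following sketch; the service address is the table's example and the SASL credentials are placeholders:

```
output {
  kafka {
    bootstrap_servers => "192.168.21.21:9092,192.168.21.24:9999"
    topic_id          => "logstash"
    codec             => "plain"
    max_request_size  => 10485760
    security_protocol => "SASL_SSL"
    sasl_mechanism    => "PLAIN"
    # Placeholder credentials, same format as in the Kafka source connector example
    sasl_jaas_config  => "org.apache.kafka.common.security.plain.PlainLoginModule required username='<username>' password='<password>';"
  }
}
```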
Table 12 OBS connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Title | title | string | -- | Yes | Name of the custom connector. It must meet the connector naming requirements.
Description | description | string | -- | Yes | Description of the custom connector. It must meet the connector description requirements.
Region | Region | string | -- | Yes | Region.
Bucket | bucket | string | demo-obs-sec-mrd-datas | Yes | Bucket name.
Endpoint | Endpoint | string | -- | Yes | Endpoint address.
Cache folder | temporary_directory | string | /temp/logstash/ | Yes | Cache path.
Encoding type | codec | string | plain | No | Encoding format, which can be plain or JSON.
AK | ak | string | -- | No | Access key ID (AK).
SK | sk | string | -- | No | Secret access key (SK).
Prefix | prefix | string | test | No | Path prefix.
Encoding format | encoding | string | gzip | No | Encoding format: gzip or an uncompressed file.
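An OBS output using these settings might look like the sketch below. As with the OBS source connector, the obs plugin is SecMaster-specific; the region, endpoint, and credentials are placeholders:

```
output {
  obs {
    region              => "<region>"
    bucket              => "demo-obs-sec-mrd-datas"
    endpoint            => "https://obs.<region>.example.com"   # placeholder endpoint
    temporary_directory => "/temp/logstash/"
    codec               => "plain"
    ak                  => "<access-key>"
    sk                  => "<secret-key>"
    prefix              => "test"
    encoding            => "gzip"
  }
}
```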
Table 13 SecMaster pipeline connector rules

Rule | Logstash Setting | Type | Default Value | Mandatory | Description
---|---|---|---|---|---
Type | type | string | Tenant | Yes | Type.
Pipeline | pipeId | string | -- | Yes | Pipeline ID.
AK | ak | string | -- | Yes | AK. This parameter is mandatory when the platform type is selected.
SK | sk | string | -- | Yes | SK. This parameter is mandatory when the platform type is selected.
Domain name | domain_name | string | domain_name | Yes | Domain name of the user. This parameter is mandatory when the tenant type is selected.
Username | user_name | string | user_name | Yes | Username of the user. This parameter is mandatory when the tenant type is selected.
Password | user_password | string | -- | Yes | Password of the user. This parameter is mandatory when the tenant type is selected.
Compression type | compression_type | string | NONE | No | Packet compression type.
Block if the queue is full | block_if_queue_full | boolean | true | No | Whether to block access when the queue is full.
Enable batch processing | enable_batching | boolean | true | No | Whether to enable batch processing.
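A pipe output for the tenant type, assembled from the settings above, might look like the sketch below, again assuming the SecMaster-specific plugin name pipe; all credential values are placeholders:

```
output {
  pipe {
    type                => "Tenant"
    pipeId              => "<pipeline-id>"
    domain_name         => "<domain-name>"
    user_name           => "<username>"
    user_password       => "<password>"
    compression_type    => "NONE"
    block_if_queue_full => true
    enable_batching     => true
  }
}
```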