Connector Rules
Source Connectors
SecMaster provides a wide range of source connectors for you to collect security data from your security products.
| Connector Type | Logstash Plugin | Description |
|---|---|---|
| TCP | tcp | Receives TCP logs. For details about the configuration rules, see Table 2. |
| UDP | udp | Receives UDP logs. For details about the configuration rules, see Table 3. |
| OBS | obs | Obtains log data from an OBS bucket. For details about the configuration rules, see Table 4. |
| Kafka | kafka | Obtains log data from Kafka. For details about the configuration rules, see Table 5. |
| SecMaster | pipe | Transfers SecMaster data externally. For details about the configuration rules, see Table 6. |
| Elasticsearch | elasticsearch | Reads data from an Elasticsearch cluster. For details about the configuration rules, see Table 7. |
Table 2 TCP connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Port | port | number | 1025 | Yes | Port for the collection node. The port ranges from 1025 to 65535 and must be unique for each connector. |
| Codec | codec | string | plain | Yes | Encoding format. |
| SSL_enable | ssl_enable | boolean | false | No | Whether to enable SSL authentication. |
| SSL certificate | ssl_cert | file | null | No | SSL certificate. |
| SSL key | ssl_key | file | -- | No | SSL key file. |
| SSL key passphrase | ssl_key_passphrase | string | -- | No | Passphrase for the SSL key. |
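Taken together, the settings above map onto a Logstash `tcp` input block. The following is a minimal sketch; the certificate paths and passphrase are illustrative placeholders:

```
input {
  tcp {
    port       => 1025      # must be unique per connector, range 1025-65535
    codec      => "plain"   # encoding format
    ssl_enable => false     # set to true to enable SSL authentication
    # When ssl_enable is true, also supply the certificate and key
    # (paths and passphrase below are placeholders):
    # ssl_cert           => "/path/to/server.crt"
    # ssl_key            => "/path/to/server.key"
    # ssl_key_passphrase => "***"
  }
}
```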
Table 3 UDP connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Port | port | number | 1025 | Yes | Port for the collection node. The port ranges from 1025 to 65535 and must be unique for each connector. |
| Codec | codec | string | plain | Yes | Decoding type. |
| Type | type | string | udp | No | Packet label used in subsequent processing. One data collection channel maps to multiple data connections; adding a distinct packet label to each data connection helps distinguish their packets. |
| Queue_size | queue_size | number | 1000000 | No | Queue size. |
| Receive_buffer_bytes | receive_buffer_bytes | number | 1000000 | No | Size of the receive buffer, in bytes. Retain the default value. |
| Buffer_size | buffer_size | number | 1000000 | No | Buffer size. Retain the default value. |
| Workers | workers | number | 1 | No | Number of worker threads. Retain the default value. |
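A minimal `udp` input sketch using the settings above (the `type` label is an illustrative choice):

```
input {
  udp {
    port                 => 1025       # must be unique per connector
    codec                => "plain"
    type                 => "udp"      # packet label used in later processing
    queue_size           => 1000000
    receive_buffer_bytes => 1000000    # retain the default value
    buffer_size          => 1000000    # retain the default value
    workers              => 1          # retain the default value
  }
}
```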
Table 4 OBS connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Region | Region | string | -- | Yes | Region of the workspace where the data source is located. |
| Bucket | bucket | string | demo-obs-sec-mrd-datas | Yes | Name of the bucket where the data source is located. |
| Endpoint | Endpoint | string | https://obs.huawei.com | Yes | Endpoint address, which must include "https". You can obtain it from the bucket details. |
| AK | ak | string | -- | No | Access key ID (AK). |
| SK | sk | string | -- | No | Secret access key (SK). |
| Prefix | prefix | string | /test | No | Prefix of the folder that logs are read from. Enter the folder prefix corresponding to the storage path of the data source in the bucket. |
| Cache folder | temporary_directory | string | /temp | No | Cache folder for reading logs. Data is cached in this folder first and transferred once its volume reaches a threshold. Create the /opt/cloud/logstash/myobs directory on the ECS and use it as the cache folder path. |
| Packet label | type | string | -- | No | Packet label used in subsequent processing. One data collection channel maps to multiple data connections; adding a distinct packet label to each data connection helps distinguish their packets. |
| Memory path | sincedb_path | string | /opt/cloud/logstash/pipeline/file_name | No | Position from which logs are read. This setting prevents a full re-read after a restart. Retaining the default value is recommended. |
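Assuming the settings above map directly onto the `obs` input plugin, a sketch might look as follows; the region, endpoint, credentials, and label are placeholders:

```
input {
  obs {
    Region              => "cn-north-4"                   # placeholder region
    bucket              => "demo-obs-sec-mrd-datas"
    Endpoint            => "https://obs.example.com"      # must include "https"
    ak                  => "***"                          # placeholder credentials
    sk                  => "***"
    prefix              => "/test"                        # folder prefix of the data source
    temporary_directory => "/opt/cloud/logstash/myobs"    # cache folder created on the ECS
    type                => "obs"                          # packet label
    sincedb_path        => "/opt/cloud/logstash/pipeline/file_name"
  }
}
```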
Table 5 Kafka connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Bootstrap_servers | bootstrap_servers | string | -- | Yes | Kafka service address. |
| Topics | topics | array | logstash | Yes | Topics to consume. Multiple topics can be consumed at the same time. |
| Consumer threads | consumer_threads | number | 1 | Yes | Number of consumer threads. |
| Auto offset reset | auto_offset_reset | string | latest | No | Offset reset policy. |
| SSL certificate | ssl_truststore_location | file | -- | No | SSL certificate. This setting is mandatory when SSL is selected. |
| SSL key | ssl_truststore_password | string | -- | No | SSL truststore password. This setting is mandatory when SSL is selected. |
| Security_protocol | security_protocol | string | SASL_SSL | No | Security protocol. |
| Sasl_jaas_config | sasl_jaas_config | string | -- | No | SASL connection configuration. |
| Is_pw_encrypted | is_pw_encrypted | string | false | No | Whether the password value is encrypted. |
| Sasl_mechanism | sasl_mechanism | string | PLAIN | No | SASL mechanism. |
| Group_id | group_id | string | -- | No | Consumer group ID. |

Set sasl_jaas_config based on the Kafka specifications.
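For instance, with the PLAIN mechanism, sasl_jaas_config follows the standard Kafka JAAS string format. A sketch of the `kafka` input using the settings above; the servers, group ID, and credentials are placeholders:

```
input {
  kafka {
    bootstrap_servers => "192.168.21.21:9092"    # placeholder address
    topics            => ["logstash"]
    consumer_threads  => 1
    auto_offset_reset => "latest"
    group_id          => "secmaster-demo"        # placeholder consumer group ID
    security_protocol => "SASL_SSL"
    sasl_mechanism    => "PLAIN"
    # Standard Kafka JAAS string: login module followed by credentials.
    sasl_jaas_config  => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="kafka-user" password="***";'
  }
}
```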
Table 6 SecMaster connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Type | type | string | Tenant | Yes | Authentication type. The options are Tenant, Platform, and ECS Agency. ECS Agency is recommended. |
| StreamTable | pipe_id | string | -- | Yes | Stream table selected from the drop-down list. This table stores the source data you want to transfer out of SecMaster. |
| AK | ak | string | -- | Yes | Access key ID. Required only when Type is set to Platform in the AK/SK authentication scenario. |
| SK | sk | string | -- | Yes | Secret access key. Required only when Type is set to Platform in the AK/SK authentication scenario. |
| Domain_name | domain_name | string | domain_name | Yes | Domain name of the IAM user. Required only when Type is set to Tenant in the IAM authentication scenario. |
| User_name | user_name | string | user_name | Yes | Username of the IAM user. Required only when Type is set to Tenant in the IAM authentication scenario. |
| User_password | user_password | string | -- | Yes | Password of the IAM user. Required only when Type is set to Tenant in the IAM authentication scenario. |
| Subscription_type | subscription_type | string | Shared | No | Subscription type. Retain the default value. |
| Subscription_initial_position | subscription_initial_position | string | Latest | No | Initial subscription position. Retain the default value. |
Table 7 Elasticsearch connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Hosts | hosts | array | -- | Yes | Address of the host where the data source is located. |
| Index | index | string | -- | Yes | Index name. |
| Query | query | string | -- | Yes | Retrieval statement for reading data. |
| User | user | string | -- | Yes | Username for logging in to Elasticsearch. |
| User_password | user_password | string | -- | Yes | Password for logging in to Elasticsearch. |
| Size | size | number | 20 | Yes | Number of records returned in each query. |
| Scroll | scroll | string | 5m | Yes | How long the scroll search context is kept alive. |
| Docinfo | docinfo | boolean | true | Yes | Whether to include document metadata in each event. |
| Is_pw_encrypted | is_pw_encrypted | boolean | true | Yes | Whether the password is encrypted. |
| Ssl | ssl | boolean | true | No | Whether to enable SSL. |
| Ca_file | ca_file | file | -- | No | Certificate file. |
| SSL_certificate_verification | ssl_certificate_verification | boolean | true | No | Whether to enable SSL certificate verification. |
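An `elasticsearch` input sketch using the setting names from the table above; the hosts, index, query, credentials, and certificate path are placeholders:

```
input {
  elasticsearch {
    hosts         => ["https://192.168.0.10:9200"]    # placeholder address
    index         => "security-logs-*"                # placeholder index
    query         => '{ "query": { "match_all": {} } }'
    user          => "elastic-user"                   # placeholder credentials
    user_password => "***"
    size          => 20       # records returned per query
    scroll        => "5m"     # scroll context lifetime
    docinfo       => true     # attach document metadata to each event
    is_pw_encrypted => true
    ssl           => true
    ca_file       => "/path/to/ca.crt"                # placeholder path
    ssl_certificate_verification => true
  }
}
```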
Destination Connectors
SecMaster provides a wide range of destination connectors for you to send security data to your security products and systems.

| Connector Type | Logstash Plugin | Description |
|---|---|---|
| TCP | tcp | Sends TCP logs. For details about the configuration rules, see Table 9. |
| UDP | udp | Sends UDP logs. For details about the configuration rules, see Table 10. |
| Kafka | kafka | Writes logs to Kafka message queues. For details about the configuration rules, see Table 11. |
| OBS | obs | Writes logs to OBS buckets. For details about the configuration rules, see Table 12. |
| SecMaster pipeline | pipe | Writes logs to the SecMaster pipeline. For details about the configuration rules, see Table 13. |
Table 9 TCP connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Port | port | number | 1025 | Yes | Port. |
| Codec | codec | string | plain | Yes | Encoding type: json_lines or plain. |
| Hosts | host | string | 192.168.0.66 | Yes | Host address. Note: The host must be reachable from the node. |
| Ssl_enable | ssl_enable | boolean | false | No | Whether to enable SSL authentication. |
| SSL certificate | ssl_cert | file | -- | No | SSL certificate. |
| SSL key | ssl_key | file | -- | No | SSL key file. |
| SSL key passphrase | ssl_key_passphrase | string | -- | No | Passphrase for the SSL key. |
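A minimal `tcp` output sketch using the settings above; the host address is the table's example value:

```
output {
  tcp {
    host       => "192.168.0.66"    # receiving host, must be reachable from the node
    port       => 1025
    codec      => "json_lines"      # or "plain"
    ssl_enable => false             # set to true to enable SSL authentication
  }
}
```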
Table 10 UDP connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Hosts | host | string | -- | Yes | Host IP address. Note: The host must be reachable from the node. |
| Port | port | number | 1025 | Yes | Port. |
| Decoding type | codec | string | json_lines | Yes | Encoding type: json_lines or plain. |
| Retry count | retry_count | number | 3 | No | Number of retry attempts. |
| Retry backoff (ms) | retry_backoff_ms | number | 200 | No | Retry backoff interval, in milliseconds. |
Table 11 Kafka connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Service address | bootstrap_servers | string | -- | Yes | Kafka service address, for example, 192.168.21.21:9092,192.168.21.24:9999. |
| Topics | topic_id | string | logstash | Yes | Topic that logs are written to. |
| Decoding type | codec | string | plain | Yes | Encoding type: json or plain. |
| Maximum length of the request | max_request_size | number | 10485760 | Yes | Maximum request size, in bytes. |
| Security_protocol | security_protocol | string | PLAINTEXT | No | Security protocol. |
| SASL connection configuration | sasl_jaas_config | string | -- | No | SASL connection configuration. |
| Encrypted | is_pw_encrypted | string | true | No | Whether the password is encrypted. |
| SASL mechanism | sasl_mechanism | string | PLAIN | No | SASL mechanism. |

Set sasl_jaas_config based on the Kafka specifications.
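A `kafka` output sketch using the settings above; the server addresses come from the table's example, and the JAAS credentials are placeholders in the standard Kafka PLAIN-mechanism format:

```
output {
  kafka {
    bootstrap_servers => "192.168.21.21:9092,192.168.21.24:9999"
    topic_id          => "logstash"
    codec             => "plain"       # or "json"
    max_request_size  => 10485760      # maximum request size, in bytes
    security_protocol => "SASL_SSL"
    sasl_mechanism    => "PLAIN"
    sasl_jaas_config  => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="kafka-user" password="***";'
  }
}
```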
Table 12 OBS connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Region | Region | string | -- | Yes | Region where the destination bucket is located. |
| Bucket | bucket | string | demo-obs-sec-mrd-datas | Yes | Bucket name. |
| Endpoint | Endpoint | string | https://obs.huawei.com | Yes | Endpoint address. |
| Cache folder | temporary_directory | string | /temp/logstash/ | Yes | Cache path. |
| Encoding type | codec | string | plain | No | Encoding format: plain or json. |
| AK | ak | string | -- | No | Access key ID (AK). |
| SK | sk | string | -- | No | Secret access key (SK). |
| Prefix | prefix | string | test | No | Path prefix. |
| Encoding format | encoding | string | gzip | No | File encoding: gzip or plain file. |
Table 13 SecMaster pipeline connector configuration rules

| Rule | Logstash Setting | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Title | title | string | -- | Yes | Name of the custom connector. |
| Description | description | string | -- | Yes | Description of the custom connector. |
| Type | type | string | Tenant | Yes | Authentication type. The options are Tenant, Platform, and ECS Agency. |
| StreamTable | pipe_id | string | -- | Yes | Stream table selected from the drop-down list. This table stores the data you want to transfer into SecMaster. |
| AK | ak | string | -- | Yes | Access key ID. Required only when Type is set to Platform in the AK/SK authentication scenario. |
| SK | sk | string | -- | Yes | Secret access key. Required only when Type is set to Platform in the AK/SK authentication scenario. |
| Domain_name | domain_name | string | domain_name | Yes | Domain name of the IAM user. Required only when Type is set to Tenant in the IAM authentication scenario. |
| User_name | user_name | string | user_name | Yes | Username of the IAM user. Required only when Type is set to Tenant in the IAM authentication scenario. |
| User_password | user_password | string | -- | Yes | Password of the IAM user. Required only when Type is set to Tenant in the IAM authentication scenario. |
| Compression_type | compression_type | string | NONE | No | Packet compression type. Retaining the default value is recommended. |
| Block_if_queue_full | block_if_queue_full | boolean | true | Yes | Whether to reject new requests when the queue is full. Retaining the default value is recommended. |
| Enable_batching | enable_batching | boolean | true | Yes | Whether to enable batch processing. Retaining the default value is recommended. |