Updated on 2024-07-18 GMT+08:00

Rules for Configuring Connectors

Source Connectors

SecMaster provides a wide range of source connectors for you to collect security data from your security products.

Table 1 Source connector types

| Connector Type | In-use Logstash | Description |
|---|---|---|
| TCP | tcp | This collector is used to receive TCP logs. For details about the configuration rules, see Table 2. |
| User file | file | This collector is used to receive logs in local files. For details about the configuration rules, see Table 3. |
| UDP | udp | This collector is used to receive UDP logs. For details about the configuration rules, see Table 4. |
| OBS | obs | This collector is used to obtain log data from an OBS bucket. For details about the configuration rules, see Table 5. |
| Kafka | kafka | This collector is used to obtain Kafka network log data. For details about the configuration rules, see Table 6. |
| SecMaster | pipe | This collector is used to obtain data from SecMaster pipelines. For details about the configuration rules, see Table 7. |
| Elasticsearch | elasticsearch | This collector is used to obtain data from Elasticsearch clusters. For details about the configuration rules, see Table 8. |

Table 2 TCP connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Port | port | number | 1025 | Yes | Port number of the collection node. |
| Codec | codec | string | plain | Yes | Encoding format. plain: reads the original content; json: processes the content in JSON format. |
| Packet label | type | string | tcp | Yes | Used to label logs. |
| SSL_enable | ssl_enable | boolean | false | No | Whether to enable SSL verification. |
| SSL certificate | ssl_cert | file | null | No | SSL certificate. |
| SSL key | ssl_key | file | -- | No | SSL key file. |
| SSL key passphrase | ssl_key_passphrase | string | -- | No | Passphrase of the SSL key. |
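
The settings above map directly onto a Logstash tcp input. The following is a minimal sketch using the defaults from Table 2; adjust the port and codec to match your log source, and add the SSL files only if ssl_enable is set to true.

    input {
      tcp {
        port => 1025          # listening port of the collection node
        codec => "plain"      # use "json" if the logs are JSON formatted
        type => "tcp"         # packet label used by later pipeline stages
        ssl_enable => false   # set to true and configure ssl_cert/ssl_key to enable SSL
      }
    }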

Table 3 File connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| File path | path | array | /opt/cloud/logstash/config/in.txt | Yes | Path of the files to read. |
| Start position | start_position | string | beginning | Yes | Position from which reading starts. |
| Decoding type | codec | string | json | Yes | Decoding type. plain: reads the original content; json: processes the content in JSON format. |
| Packet label | type | string | file | No | Packet label, which is used for subsequent processing. |
| Enable metric | enable_metric | boolean | true | No | Whether to enable metrics. |
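
A minimal file input sketch using the defaults from Table 3 might look as follows:

    input {
      file {
        path => ["/opt/cloud/logstash/config/in.txt"]   # one or more file paths to read
        start_position => "beginning"                   # read existing content from the start
        codec => "json"                                 # use "plain" to read raw content
        type => "file"                                  # packet label for later processing
        enable_metric => true
      }
    }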

Table 4 UDP connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Port | port | number | 1025 | Yes | Port of the collection node. |
| Codec | codec | string | plain | Yes | Decoding type. plain: reads the original content; json: processes the content in JSON format. |
| Packet label | type | string | udp | No | Packet label, which is used for subsequent processing. |
| Queue size | queue_size | number | 20000 | No | Queue size. |
| Number of bytes in the receiving buffer | receive_buffer_bytes | number | 20000 | No | Size of the receive buffer, in bytes. |
| Buffer size | buffer_size | number | 10000 | No | Buffer size. |
| Worker thread | workers | number | 1 | No | Number of worker threads. |
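
Using the defaults from Table 4, a minimal udp input sketch might look as follows:

    input {
      udp {
        port => 1025                    # listening port of the collection node
        codec => "plain"                # use "json" for JSON-formatted logs
        type => "udp"                   # packet label for later processing
        queue_size => 20000             # unprocessed packets held in memory
        receive_buffer_bytes => 20000   # socket receive buffer, in bytes
        buffer_size => 10000            # maximum packet size
        workers => 1                    # number of worker threads
      }
    }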

Table 5 OBS connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| region | region | string | -- | Yes | Region. |
| Bucket | bucket | string | demo-obs-sec-mrd-datas | Yes | OBS bucket name. |
| endpoint | endpoint | string | https://obs.huawei.com | Yes | Endpoint address. The https:// prefix must be included. |
| AK | ak | string | -- | No | Access key (AK). |
| SK | sk | string | -- | No | Secret key (SK). |
| Prefix | prefix | string | /test | No | Prefix of the folder from which logs are read. |
| Cache folder | temporary_directory | string | /temp | No | Cache folder for log reads. |
| Packet label | type | string | -- | No | Packet label. |
| Memory path | sincedb_path | string | /opt/cloud/logstash/pipeline/file_name | No | Log read position, used to prevent full re-reads after a restart. |
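
The obs input is provided with the SecMaster collector rather than stock Logstash, so the exact behavior may differ from this sketch. The snippet below only illustrates how the Table 5 settings fit together; the region, AK, and SK values are placeholders you must replace.

    input {
      obs {
        region => "<region>"                     # placeholder: your region
        bucket => "demo-obs-sec-mrd-datas"
        endpoint => "https://obs.huawei.com"     # must include the https:// prefix
        ak => "<access key>"                     # placeholder
        sk => "<secret key>"                     # placeholder
        prefix => "/test"                        # folder prefix for log reads
        temporary_directory => "/temp"           # cache folder
        sincedb_path => "/opt/cloud/logstash/pipeline/file_name"  # remembers the read position across restarts
      }
    }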

Table 6 Kafka connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Service address | bootstrap_servers | string | -- | Yes | Kafka service address. |
| Topics | topics | array | logstash | Yes | Topics. Multiple topics can be consumed at the same time. |
| Consumer threads | consumer_threads | number | 1 | Yes | Number of consumer threads. |
| Auto offset reset | auto_offset_reset | string | latest | No | Offset reset policy. earliest: reads the earliest messages; latest: reads the latest messages. |
| SSL certificate | ssl_truststore_location | file | -- | No | SSL certificate (truststore) file. This parameter is mandatory when SSL is selected. |
| SSL private key | ssl_truststore_password | string | -- | No | SSL private key. This parameter is mandatory when SSL is selected. |
| Security protocol | security_protocol | string | SASL_SSL | No | Security protocol. |
| SASL connection configuration | sasl_jaas_config | string | -- | No | SASL connection configuration. |
| Encrypted | is_pw_encrypted | string | false | No | Whether the password is encrypted. |
| SASL mechanism | sasl_mechanism | string | PLAIN | No | SASL mechanism. |
| Group ID | group_id | string | -- | No | Consumer group ID. |

Set sasl_jaas_config based on the Kafka specifications. Example:

  • Plaintext connection configuration
    org.apache.kafka.common.security.plain.PlainLoginModule required username='kafka username' password='kafka password';
  • Ciphertext connection configuration
    org.apache.kafka.common.security.scram.ScramLoginModule required username='kafka username' password='kafka password';
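
Putting the Table 6 settings and the sasl_jaas_config example together, a kafka input sketch for a SASL_SSL cluster might look as follows; the broker addresses, credentials, and truststore values are placeholders.

    input {
      kafka {
        bootstrap_servers => "<broker1:9092,broker2:9092>"   # placeholder service address
        topics => ["logstash"]
        consumer_threads => 1
        auto_offset_reset => "latest"
        group_id => "<consumer group>"                       # placeholder
        security_protocol => "SASL_SSL"
        sasl_mechanism => "PLAIN"
        sasl_jaas_config => "org.apache.kafka.common.security.plain.PlainLoginModule required username='kafka username' password='kafka password';"
        ssl_truststore_location => "<path to truststore file>"   # placeholder; mandatory when SSL is selected
        ssl_truststore_password => "<truststore password>"       # placeholder; mandatory when SSL is selected
      }
    }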

Table 7 Pipe connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Type | type | string | Tenant | Yes | Type. |
| Pipeline | pipeId | string | -- | Yes | Pipeline ID. |
| domain_name | domain_name | string | domain_name | Yes | Domain name of the IAM user. |
| User_name | user_name | string | user_name | Yes | Username of the IAM user. |
| Password | user_password | string | -- | Yes | Password of the IAM user. |
| Subscription type | subscription_type | string | true | No | Subscription type. Shared: shared mode; Exclusive: exclusive mode; Failover: disaster recovery mode. |
| Subscription Start | subscription_initial_position | string | true | No | Subscription start position. |
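
The pipe input is a SecMaster-specific plugin, so the snippet below is only an illustrative sketch that combines the mandatory settings from Table 7; the pipeline ID and IAM credentials are placeholders.

    input {
      pipe {
        type => "Tenant"
        pipeId => "<pipeline ID>"            # placeholder
        domain_name => "<IAM domain name>"   # placeholder
        user_name => "<IAM username>"        # placeholder
        user_password => "<IAM password>"    # placeholder
      }
    }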

Table 8 Elasticsearch connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Hosts | hosts | array | -- | Yes | Host IP addresses. |
| Index | index | string | -- | Yes | Index to read from. |
| Retrieval statement | query | string | -- | Yes | Query statement used to retrieve data. |
| User_name | user | string | -- | Yes | Username. |
| Password | user_password | string | -- | Yes | Password. |
| Queries | size | number | 20 | Yes | Number of records returned in each query. |
| Scroll | scroll | string | 5m | Yes | Scroll keep-alive duration. |
| Docinfo | docinfo | boolean | true | Yes | Whether to include document metadata. |
| Is pw encrypted | is_pw_encrypted | boolean | true | Yes | Whether the password is encrypted. |
| Whether to enable SSL | ssl | boolean | true | No | Whether to enable SSL. |
| Ssl | ca_file | file | -- | No | CA certificate file. |
| SSL_certificate_verification | ssl_certificate_verification | boolean | true | No | Whether to verify the SSL certificate. |
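
A sketch that combines the Table 8 settings is shown below. The setting names follow the table; note that the bundled plugin may differ from the stock Logstash elasticsearch input, which, for example, names the password setting password rather than user_password. Hosts, index, query, and credentials are placeholders.

    input {
      elasticsearch {
        hosts => ["<host:9200>"]                     # placeholder
        index => "<index name>"                      # placeholder
        query => '{ "query": { "match_all": {} } }'  # placeholder retrieval statement
        user => "<username>"                         # placeholder
        user_password => "<password>"                # placeholder; see note above
        size => 20                                   # records returned per query
        scroll => "5m"                               # scroll keep-alive duration
        docinfo => true                              # include document metadata
        ssl => true
        ca_file => "<path to CA certificate>"        # placeholder
        ssl_certificate_verification => true
      }
    }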

Destination Connectors

SecMaster provides a wide range of destination connectors for you to send collected security data to destinations such as local files, TCP/UDP services, Kafka message queues, OBS buckets, and SecMaster pipelines.

Table 9 Destination connectors

| Connector Type | In-use Logstash | Description |
|---|---|---|
| File | file | This collector is used to write data to local files on nodes. For details about the configuration rules, see Table 10. |
| TCP | tcp | This collector is used to send TCP logs. For details about the configuration rules, see Table 11. |
| UDP | udp | This collector is used to send UDP logs. For details about the configuration rules, see Table 12. |
| Kafka | kafka | This collector is used to write logs to Kafka message queues. For details about the configuration rules, see Table 13. |
| OBS | obs | This collector is used to write logs to OBS buckets. For details about the configuration rules, see Table 14. |
| SecMaster pipeline | pipe | This collector is used to write logs to SecMaster pipelines. For details about the configuration rules, see Table 15. |

Table 10 File connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Path | path | string | /opt/cloud/logstash/config/out.txt | Yes | File path on the output node. |
| Create if deleted | create_if_deleted | boolean | true | Yes | Create the file if it does not exist. |
| Decoding type | codec | string | json_lines | Yes | Codec. plain: writes the original content; json_lines: writes the content in JSON lines format. |
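
Using the defaults from Table 10, a minimal file output sketch might look as follows:

    output {
      file {
        path => "/opt/cloud/logstash/config/out.txt"   # file path on the output node
        create_if_deleted => true                      # recreate the file if it is deleted
        codec => "json_lines"                          # one JSON document per line
      }
    }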

Table 11 TCP connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Port | port | number | 1025 | Yes | Port. |
| Decoding type | codec | string | plain | Yes | Decoding type, which can be json_lines or plain. plain: writes the original content; json_lines: writes the content in JSON lines format. |
| Hosts | host | string | 192.168.0.66 | Yes | Host address. Note: ensure that the network between the host and the node is reachable. |
| SSL certificate | ssl_cert | file | -- | No | SSL certificate. |
| Whether to enable SSL | ssl_enable | boolean | false | No | Whether to enable SSL authentication. |
| SSL key | ssl_key | file | -- | No | SSL key file. |
| SSL key passphrase | ssl_key_passphrase | string | -- | No | Passphrase of the SSL key. |
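
A minimal tcp output sketch using the Table 11 defaults might look as follows; ensure the destination host is reachable from the node.

    output {
      tcp {
        host => "192.168.0.66"   # destination host; must be reachable from the node
        port => 1025
        codec => "json_lines"    # use "plain" to send raw content
        ssl_enable => false      # set to true and configure ssl_cert/ssl_key to enable SSL
      }
    }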

Table 12 UDP connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Hosts | host | string | -- | Yes | Host IP address. Note: ensure that the network between the host and the node is reachable. |
| Port | port | number | 1025 | Yes | Port. |
| Decoding type | codec | string | json_lines | Yes | Decoding type, which can be json_lines or plain. plain: writes the original content; json_lines: writes the content in JSON lines format. |
| Retry count | retry_count | number | 3 | No | Number of retry attempts. |
| Retry backoff (ms) | retry_backoff_ms | number | 200 | No | Retry backoff, in milliseconds. |
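
A minimal udp output sketch based on Table 12 might look as follows; the host value is a placeholder.

    output {
      udp {
        host => "<destination IP>"   # placeholder; must be reachable from the node
        port => 1025
        codec => "json_lines"
        retry_count => 3             # number of retry attempts
        retry_backoff_ms => 200      # wait between retries, in milliseconds
      }
    }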

Table 13 Kafka connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Service address | bootstrap_servers | string | -- | Yes | Service address, for example, 192.168.21.21:9092,192.168.21.24:9999. |
| Topics | topic_id | string | logstash | Yes | Topic. |
| Decoding type | codec | string | plain | Yes | Decoding type, which can be json or plain. |
| Maximum length of the request | max_request_size | number | 10485760 | Yes | Maximum size of a request, in bytes. |
| SSL certificate | ssl_truststore_location | file | -- | No | SSL certificate (truststore) file. This parameter is mandatory when SSL is selected. |
| SSL private key | ssl_truststore_password | string | -- | No | SSL private key. This parameter is mandatory when SSL is selected. |
| Security protocol | security_protocol | string | PLAINTEXT | No | Security protocol. |
| SASL connection configuration | sasl_jaas_config | string | -- | No | SASL connection configuration. |
| is_pw_encrypted | is_pw_encrypted | string | true | No | Whether the password is encrypted. |
| SASL mechanism | sasl_mechanism | string | PLAIN | No | SASL mechanism. |

Set sasl_jaas_config based on the Kafka specifications. The following is an example:

  • Plaintext connection configuration
    org.apache.kafka.common.security.plain.PlainLoginModule required username='kafka username' password='kafka password';
  • Ciphertext connection configuration
    org.apache.kafka.common.security.scram.ScramLoginModule required username='kafka username' password='kafka password';
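
A kafka output sketch that combines the Table 13 settings for a plaintext cluster might look as follows; the broker addresses are placeholders.

    output {
      kafka {
        bootstrap_servers => "<broker1:9092,broker2:9092>"   # placeholder service address
        topic_id => "logstash"
        codec => "plain"
        max_request_size => 10485760       # maximum request size, in bytes
        security_protocol => "PLAINTEXT"   # use SASL_SSL with the SASL/SSL settings above for secured clusters
      }
    }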

Table 14 OBS connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| region | region | string | -- | Yes | Region. |
| Bucket | bucket | string | demo-obs-sec-mrd-datas | Yes | Bucket name. |
| endpoint | endpoint | string | https://obs.huawei.com | Yes | Endpoint address. |
| Cache folder | temporary_directory | string | /temp/logstash/ | Yes | Cache path. |
| Encoding type | codec | string | plain | No | Encoding format: plain or json. |
| AK | ak | string | -- | No | Access key (AK). |
| SK | sk | string | -- | No | Secret key (SK). |
| Prefix | prefix | string | test | No | Path prefix. |
| Encoding format | encoding | string | gzip | No | Encoding format: gzip or uncompressed file. |
| Memory path | sincedb_path | string | /opt/cloud/logstash/pipeline/file_name | No | Log read position, used to prevent full re-reads after a restart. |
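
The obs output is provided with the SecMaster collector rather than stock Logstash, so the snippet below only illustrates how the Table 14 settings fit together; the region, AK, and SK values are placeholders.

    output {
      obs {
        region => "<region>"                       # placeholder
        bucket => "demo-obs-sec-mrd-datas"
        endpoint => "https://obs.huawei.com"
        temporary_directory => "/temp/logstash/"   # cache path
        codec => "plain"
        ak => "<access key>"                       # placeholder
        sk => "<secret key>"                       # placeholder
        prefix => "test"
        encoding => "gzip"                         # write gzip-compressed objects
      }
    }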

Table 15 Pipe connector configuration rules

| Rule | Logstash Settings | Type | Default Value | Mandatory | Description |
|---|---|---|---|---|---|
| Type | type | string | Tenant | Yes | Type. |
| Pipeline | pipeId | string | -- | Yes | Pipeline ID. |
| AK | ak | string | -- | Yes | Access key (AK). This parameter is mandatory when the platform type is selected. |
| SK | sk | string | -- | Yes | Secret key (SK). This parameter is mandatory when the platform type is selected. |
| domain_name | domain_name | string | domain_name | Yes | Domain name of the IAM user. This parameter is mandatory when the tenant type is selected. |
| User_name | user_name | string | user_name | Yes | Username of the IAM user. This parameter is mandatory when the tenant type is selected. |
| Password | user_password | string | -- | Yes | Password of the IAM user. This parameter is mandatory when the tenant type is selected. |
| Compression type | compression_type | string | NONE | No | Packet compression type. |
| Block if the queue is full | block_if_queue_full | boolean | true | No | Whether to block access when the queue is full. |
| Enable batch processing | enable_batching | boolean | true | No | Whether to enable batch processing. |
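
The pipe output is a SecMaster-specific plugin, so the snippet below is only an illustrative sketch that combines the tenant-type settings from Table 15; the pipeline ID and IAM credentials are placeholders.

    output {
      pipe {
        type => "Tenant"
        pipeId => "<pipeline ID>"            # placeholder
        domain_name => "<IAM domain name>"   # placeholder
        user_name => "<IAM username>"        # placeholder
        user_password => "<IAM password>"    # placeholder
        compression_type => "NONE"
        block_if_queue_full => true
        enable_batching => true
      }
    }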