
Configuring the DIS Logstash Plugin

The DIS Logstash Plugin consists of an Input plugin and an Output plugin. This section describes the configuration items of both plugins.

Configuring DIS Logstash Input

The configuration template (used to download data from a DIS stream to a local file) is as follows:

input
{
    dis {
        streams => ["YOUR_DIS_STREAM_NAME"]
        endpoint => "https://dis.${region}.myhuaweicloud.com"
        ak => "YOUR_ACCESS_KEY_ID"
        sk => "YOUR_SECRET_KEY_ID"
        region => "YOUR_Region"
        project_id => "YOUR_PROJECT_ID"
        group_id => "YOUR_APP_ID"
        client_id => "YOUR_CLIENT_ID"
        auto_offset_reset => "earliest"
    }
}
output
{
    file {
        path => ["/tmp/test.log"]
    }
}
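
To verify consumption quickly before attaching a real downstream system, the file output in the template above can be swapped for a stdout output. The following is only a sketch of such a variant: the stream name, credentials, and project ID remain placeholders, and cn-north-1 is used purely as an example region.

input
{
    dis {
        streams => ["YOUR_DIS_STREAM_NAME"]
        # Example region; substitute the region of your DIS stream.
        endpoint => "https://dis.cn-north-1.myhuaweicloud.com"
        ak => "YOUR_ACCESS_KEY_ID"
        sk => "YOUR_SECRET_KEY_ID"
        region => "cn-north-1"
        project_id => "YOUR_PROJECT_ID"
        group_id => "YOUR_APP_ID"
        client_id => "YOUR_CLIENT_ID"
        auto_offset_reset => "earliest"
    }
}
output
{
    # Print each consumed record to the console for a quick check.
    stdout { codec => rubydebug }
}
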
Table 1 DIS Logstash Input configuration parameters

| Parameter | Mandatory | Description | Default Value |
| --- | --- | --- | --- |
| streams | Yes | DIS stream name. The entered DIS stream name must be the same as the stream name specified when the stream was created on the DIS console. | - |
| ak | Yes | User's AK. For details about how to obtain an AK, see Checking Authentication Information. | - |
| sk | Yes | User's SK. For details about how to obtain an SK, see Checking Authentication Information. | - |
| region | Yes | Region in which DIS is located. | - |
| project_id | Yes | Project ID specific to your region. For details about how to obtain a project ID, see Checking Authentication Information. | - |
| client_id | No | Client ID, which identifies a consumer in a consumer group. If multiple pipelines or Logstash instances consume the same stream, set a different value for each one, for example, client1 for instance 1 and client2 for instance 2 (see the sketch after this table). | logstash |
| endpoint | Yes | Data API address of the region where DIS resides. | - |
| group_id | Yes | DIS App name, used to identify a consumer group. The value can be any character string. | - |
| auto_offset_reset | No | Position in the stream from which consumption starts. earliest: data is consumed from the earliest data in the stream. latest: data is consumed from the latest data in the stream. | latest |
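
As a concrete illustration of client_id and group_id, the sketch below shows how a second Logstash instance would join the same consumer group. The client names are illustrative assumptions, not values required by the plugin.

# Instance 1
input
{
    dis {
        streams => ["YOUR_DIS_STREAM_NAME"]
        endpoint => "https://dis.${region}.myhuaweicloud.com"
        ak => "YOUR_ACCESS_KEY_ID"
        sk => "YOUR_SECRET_KEY_ID"
        region => "YOUR_Region"
        project_id => "YOUR_PROJECT_ID"
        group_id => "YOUR_APP_ID"    # shared by both instances, so they form one consumer group
        client_id => "client1"       # unique to this instance
    }
}

# Instance 2 uses the same configuration except for the client ID:
#     client_id => "client2"
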

Configuring DIS Logstash Output

The configuration template (used to read data from a local file and upload it to a DIS stream) is as follows:

input
{
    file {
        path => ["/tmp/test.log"]
        type => "log4j"
        start_position => "beginning"
    }
}
output
{
    dis {
        stream => ["YOUR_DIS_STREAM_NAME"]
        endpoint => "https://dis.${region}.myhuaweicloud.com"
        ak => "YOUR_ACCESS_KEY_ID"
        sk => "YOUR_SECRET_KEY_ID"
        region => "YOUR_Region"
        project_id => "YOUR_PROJECT_ID"
    }
}
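
For a quick upload test, the file input in the template above can be replaced with a stdin input so that lines typed on the console are sent to the stream. This is a minimal sketch that keeps the same placeholder values as the template.

input
{
    # Read lines typed on the console instead of tailing a file.
    stdin {}
}
output
{
    dis {
        stream => ["YOUR_DIS_STREAM_NAME"]
        endpoint => "https://dis.${region}.myhuaweicloud.com"
        ak => "YOUR_ACCESS_KEY_ID"
        sk => "YOUR_SECRET_KEY_ID"
        region => "YOUR_Region"
        project_id => "YOUR_PROJECT_ID"
    }
}
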
Table 2 DIS Logstash Output configuration parameters

| Parameter | Mandatory | Description | Default Value |
| --- | --- | --- | --- |
| stream | Yes | DIS stream name. The entered DIS stream name must be the same as the stream name specified when the stream was created on the DIS console. | - |
| ak | Yes | User's AK. For details about how to obtain an AK, see Checking Authentication Information. | - |
| sk | Yes | User's SK. For details about how to obtain an SK, see Checking Authentication Information. | - |
| region | Yes | Region in which DIS is located. | - |
| project_id | Yes | Project ID specific to your region. For details about how to obtain a project ID, see Checking Authentication Information. | - |
| body_compress_enabled | No | Specifies whether to enable data compression (see the sketch after this table). | No |
| body_compress_type | No | Data compression type. Supported algorithms: lz4, a fast algorithm with high compression efficiency; snappy, which aims for high compression speed and a reasonable compression ratio rather than maximum compression or compatibility with other formats; zstd, a newer lossless algorithm with a fast compression speed and a high compression ratio. | lz4 |
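
As an illustration of the two compression parameters in the table above, the sketch below enables compression on the Output plugin. The choice of zstd and the boolean literal form of the switch are assumptions for illustration; which algorithm suits a given workload depends on the data.

output
{
    dis {
        stream => ["YOUR_DIS_STREAM_NAME"]
        endpoint => "https://dis.${region}.myhuaweicloud.com"
        ak => "YOUR_ACCESS_KEY_ID"
        sk => "YOUR_SECRET_KEY_ID"
        region => "YOUR_Region"
        project_id => "YOUR_PROJECT_ID"
        body_compress_enabled => true    # assumed boolean literal; compression is disabled by default
        body_compress_type => "zstd"     # one of lz4 (default), snappy, zstd
    }
}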