Configuring ICAgent Collection
You can use ICAgent to collect logs to LTS. When creating a log ingestion task, you can customize collection policies, such as parsing rules, whitelist and blacklist rules, and raw log upload settings. An ICAgent collection configuration defines how logs of the same type on a server are collected, parsed, and sent to a specified log stream.
![](https://support.huaweicloud.com/intl/en-us/usermanual-lts/public_sys-resources/note_3.0-en-us.png)
ICAgent structuring parsing is in the closed beta test and is available only to whitelisted users. To use this function, submit a service ticket.
Advantages
- Collecting logs in non-intrusive mode based on log files: You do not need to modify the application code, and log collection does not affect the running of your application.
- Handling various exceptions during log collection: Security measures such as proactive retry and local cache are taken when a network or server exception occurs.
- Centralized management: After installing ICAgent, you only need to configure information such as host groups and ICAgent collection configurations on LTS.
- Comprehensive self-protection mechanisms: To ensure that ICAgent does not significantly affect the performance of other services on the same server, ICAgent has strict restrictions and protection mechanisms in terms of CPU, memory, and network usage.
Before configuring log ingestion, familiarize yourself with the structuring parsing rules of ICAgent collection. ICAgent 5.12.147 or later is required. This function reduces costs and supports combined parsing. A different structuring parsing rule can be configured for each collection configuration of a log stream.
ICAgent collection supports the following log structuring parsing rules:
- Single-Line - Full-Text Log: Each log line is displayed as a single log event.
- Multi-Line - Full-Text Log: Multiple lines of exception log events can be displayed as a single log event. This is helpful when you check logs to locate problems.
- JSON: applicable to JSON logs and splits them into key-value pairs.
- Delimiter: Fields are extracted using delimiters (such as commas or spaces).
- Single-Line - Completely Regular: applicable to single-line logs in any format and uses a regular expression to extract fields. After entering a regular expression, click Verify to verify it.
- Multi-Line - Completely Regular: applicable to multi-line logs in any format and uses a regular expression to extract fields. The regular expression of the first line can be automatically generated or manually entered. After entering a regular expression, click Verify to verify it.
- Combined Parsing: applicable to logs in multiple nested formats, for example, delimiter+JSON.
![](https://support.huaweicloud.com/intl/en-us/usermanual-lts/public_sys-resources/note_3.0-en-us.png)
- From now: queries log data generated in a time range that ends with the current time, such as the previous 1, 5, or 15 minutes. For example, if the current time is 19:20:31 and 1 hour is selected as the relative time from now, the charts on the dashboard display the log data that is generated from 18:20:31 to 19:20:31.
- From last: queries log data generated in a whole time unit that ends at the start of the current unit, such as the previous 1 or 15 minutes. For example, if the current time is 19:20:31 and 1 hour is selected as the relative time from last, the charts on the dashboard display the log data that is generated from 18:00:00 to 19:00:00.
- Specified: queries log data that is generated in a specified time range.
Single-Line - Full-Text Log
If you want to display each line of log data as a single log event on the LTS console, select Single-Line - Full-Text Log.
- Select Single-Line - Full-Text Log.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
- Add a whitelist rule (available only when Log Filtering is enabled).
A whitelist rule retains only valuable log data. The rule is a regular expression applied to the value of a specified key and works as a matching rule: only logs that match the expression are collected and reported. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to collect only logs that contain hello, set the rule to .*hello.*.
- Add a blacklist rule (available only when Log Filtering is enabled).
A blacklist rule discards unneeded log data. The rule is a regular expression applied to the value of a specified key and works as a discarding rule: logs that match the expression are discarded. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to skip logs that contain hello, set the rule to .*hello.*. Both rule types are illustrated in the sketch below.
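The following Python sketch models, conceptually, how whitelist and blacklist rules with OR semantics could interact. The sample log lines, rule patterns, and the keep() helper are hypothetical illustrations of the behavior described above, not ICAgent internals.

```python
import re

# Hypothetical log lines; "content" is the default key in full-text mode.
logs = [
    {"content": "2024-01-01 INFO hello world"},
    {"content": "2024-01-01 ERROR disk full"},
    {"content": "2024-01-01 DEBUG hello again"},
]

whitelist = [r".*hello.*"]   # keep logs matching ANY rule (OR)
blacklist = [r".*DEBUG.*"]   # drop logs matching ANY rule (OR)

def keep(log):
    value = log["content"]
    if whitelist and not any(re.fullmatch(p, value) for p in whitelist):
        return False         # no whitelist rule matched: not collected
    if any(re.fullmatch(p, value) for p in blacklist):
        return False         # a blacklist rule matched: discarded
    return True

collected = [log for log in logs if keep(log)]
print(collected)  # only the INFO "hello world" line survives both filters
```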
Multi-Line - Full-Text Log
Multiple lines of exception log events can be displayed as a single log event. This is helpful when you check logs to locate problems.
- Select Multi-Line - Full-Text Log.
- Select a log example from existing logs or paste it from the clipboard, and enter a regular expression for the first line or let the system generate one (the sketch at the end of this section shows how first-line matching groups lines).
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content into the Log Example box.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
- Add a whitelist rule (available only when Log Filtering is enabled).
A whitelist rule retains only valuable log data. The rule is a regular expression applied to the value of a specified key and works as a matching rule: only logs that match the expression are collected and reported. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to collect only logs that contain hello, set the rule to .*hello.*.
- Add a blacklist rule (available only when Log Filtering is enabled).
A blacklist rule discards unneeded log data. The rule is a regular expression applied to the value of a specified key and works as a discarding rule: logs that match the expression are discarded. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to skip logs that contain hello, set the rule to .*hello.*.
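As a rough illustration of first-line matching, the Python sketch below groups raw lines into multi-line events. The timestamp pattern and sample lines are assumptions for demonstration only, not the expression LTS generates.

```python
import re

# A new log event starts with a timestamp (illustrative first-line regex).
first_line = re.compile(r"^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}")

raw_lines = [
    "2024-01-01 10:00:00 ERROR something failed",
    "Traceback (most recent call last):",
    '  File "app.py", line 10, in <module>',
    "ValueError: bad input",
    "2024-01-01 10:00:01 INFO recovered",
]

events, current = [], []
for line in raw_lines:
    if first_line.match(line) and current:
        events.append("\n".join(current))  # close the previous event
        current = []
    current.append(line)
if current:
    events.append("\n".join(current))

print(len(events))  # 2: the stack trace stays attached to its ERROR line
```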
JSON
This option is applicable to JSON logs and splits them into key-value pairs.
- Select JSON.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
- Add a whitelist rule (available only when Log Filtering is enabled).
A whitelist rule retains only valuable log data. The rule is a regular expression applied to the value of a specified key and works as a matching rule: only logs that match the expression are collected and reported. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to collect only logs that contain hello, set the rule to .*hello.*.
- Add a blacklist rule (available only when Log Filtering is enabled).
A blacklist rule discards unneeded log data. The rule is a regular expression applied to the value of a specified key and works as a discarding rule: logs that match the expression are discarded. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to skip logs that contain hello, set the rule to .*hello.*.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the log time is the time set during ingestion configuration.
- JSON Parsing Layers: Set 1 to 4 parsing layers. The value must be an integer; the default is 1.
This setting expands the fields of a JSON log. For example, for the raw log {"key1":{"key2":"value"}}, parsing into 1 layer leaves the log as {"key1":{"key2":"value"}}, while parsing into 2 layers produces {"key1.key2":"value"}, as the sketch below illustrates.
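A minimal sketch of the layer-by-layer expansion, assuming a dot connector; the expand() helper is defined here for illustration and is not an ICAgent API.

```python
def expand(obj, depth, connector=".", level=1, prefix=""):
    """Flatten nested JSON up to `depth` layers, joining keys with `connector`."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}{connector}{key}" if prefix else key
        if isinstance(value, dict) and level < depth:
            out.update(expand(value, depth, connector, level + 1, name))
        else:
            out[name] = value
    return out

log = {"key1": {"key2": "value"}}
print(expand(log, depth=1))  # {'key1': {'key2': 'value'}}
print(expand(log, depth=2))  # {'key1.key2': 'value'}
```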
Delimiter
Logs can be parsed by delimiters, such as commas (,), spaces, or other special characters (see the sketch at the end of this section).
- Select Delimiter.
- Select or customize a delimiter.
- Select a log example from existing logs or paste it from the clipboard, click Verify, and view the results under Extraction Results.
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content into the Log Example box.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
- Add a whitelist rule (available only when Log Filtering is enabled).
A whitelist rule retains only valuable log data. The rule is a regular expression applied to the value of a specified key and works as a matching rule: only logs that match the expression are collected and reported. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to collect only logs that contain hello, set the rule to .*hello.*.
- Add a blacklist rule (available only when Log Filtering is enabled).
A blacklist rule discards unneeded log data. The rule is a regular expression applied to the value of a specified key and works as a discarding rule: logs that match the expression are discarded. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to skip logs that contain hello, set the rule to .*hello.*.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the log time is the time set during ingestion configuration.
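To make the delimiter extraction in this section concrete, here is a minimal sketch; the sample line, the comma delimiter, and the key names are hypothetical.

```python
# Hypothetical access log split by a comma delimiter into named fields.
line = "192.168.0.1,GET,/index.html,200"
keys = ["client_ip", "method", "path", "status"]

fields = dict(zip(keys, line.split(",")))
print(fields)
# {'client_ip': '192.168.0.1', 'method': 'GET', 'path': '/index.html', 'status': '200'}
```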
Single-Line - Completely Regular
This option is applicable to single-line logs in any format and uses a regular expression to extract fields (see the sketch at the end of this section).
- Select Single-Line - Completely Regular.
- Select a log example from existing logs or paste it from the clipboard, enter a regular expression in the Extraction Regular Expression box, click Verify, and view the results under Extraction Results.
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content into the Log Example box.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
- Add a whitelist rule (available only when Log Filtering is enabled).
A whitelist rule retains only valuable log data. The rule is a regular expression applied to the value of a specified key and works as a matching rule: only logs that match the expression are collected and reported. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to collect only logs that contain hello, set the rule to .*hello.*.
- Add a blacklist rule (available only when Log Filtering is enabled).
A blacklist rule discards unneeded log data. The rule is a regular expression applied to the value of a specified key and works as a discarding rule: logs that match the expression are discarded. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to skip logs that contain hello, set the rule to .*hello.*.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the log time is the time set during ingestion configuration.
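A sketch of what regex field extraction amounts to, using a hypothetical pattern and log line; the fallback branch mirrors the Upload Parsing Failure Log option described above.

```python
import re

# Hypothetical pattern for lines like "2024-01-01 10:00:00 ERROR msg".
pattern = re.compile(
    r"(?P<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (?P<level>\w+) (?P<msg>.*)"
)

line = "2024-01-01 10:00:00 ERROR disk full"
match = pattern.match(line)
if match:
    print(match.groupdict())
    # {'time': '2024-01-01 10:00:00', 'level': 'ERROR', 'msg': 'disk full'}
else:
    print("parse failure: the raw line could go to _content_parse_fail_")
```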
Multi-Line - Completely Regular
This option is applicable to multi-line logs in any format and uses a regular expression to extract fields (see the sketch at the end of this section).
- Select Multi-Line - Completely Regular.
- Select a log example from existing logs or paste it from the clipboard, enter a regular expression for the first line or let the system generate one, enter an extraction regular expression in the Extraction Regular Expression box, click Verify, and view the results under Extraction Results.
- Click Select from Existing Logs, select a log event, and click OK. You can select different time ranges to filter logs.
- Click Paste from Clipboard to paste the copied log content into the Log Example box.
- Enable Log Filtering (disabled by default) as required and add a maximum of 20 whitelist or blacklist rules.
- Add a whitelist rule (available only when Log Filtering is enabled).
A whitelist rule retains only valuable log data. The rule is a regular expression applied to the value of a specified key and works as a matching rule: only logs that match the expression are collected and reported. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to collect only logs that contain hello, set the rule to .*hello.*.
- Add a blacklist rule (available only when Log Filtering is enabled).
A blacklist rule discards unneeded log data. The rule is a regular expression applied to the value of a specified key and works as a discarding rule: logs that match the expression are discarded. In single-line or multi-line full-text mode, content is used as the key name of the full text by default. Multiple rules are combined with OR logic. For example, to skip logs that contain hello, set the rule to .*hello.*.
- Raw Log Upload:
After this function is enabled, raw logs are uploaded to LTS as the value of the content field.
- Upload Parsing Failure Log:
After this function is enabled, raw logs are uploaded to LTS as the value of the _content_parse_fail_ field.
- Custom Time:
Enabling this lets you specify a field as the log time. Otherwise, the log time is the time set during ingestion configuration.
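A sketch of applying an extraction regex to a whole multi-line event (for example, one produced by first-line grouping); the event text and pattern are illustrative assumptions.

```python
import re

# One multi-line event, e.g. produced by first-line grouping.
event = (
    "2024-01-01 10:00:00 ERROR job failed\n"
    "Caused by: timeout after 30s"
)

# Hypothetical extraction regex; re.DOTALL lets "." cross newlines.
pattern = re.compile(
    r"(?P<time>\S+ \S+) (?P<level>\w+) (?P<msg>.*?)\nCaused by: (?P<cause>.*)",
    re.DOTALL,
)
match = pattern.match(event)
print(match.groupdict() if match else "no match")
# {'time': '2024-01-01 10:00:00', 'level': 'ERROR',
#  'msg': 'job failed', 'cause': 'timeout after 30s'}
```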
Combined Parsing
This option is applicable to logs in multiple nested formats, for example, delimiter+JSON. You can customize parsing rules based on the syntax.
- Select Combined Parsing.
- Select a log example from existing logs or paste it from the clipboard and enter the configuration content under Plug-in Settings.
- Customize the settings based on the log content by referring to the following plug-in syntaxes.
- processor_regex

Table 1 Regular expression extraction

| Parameter | Type | Description |
|---|---|---|
| source_key | string | Original field name. |
| regex | string | Regular expression; () marks the fields to be extracted. |
| keys | string | Field names for the extracted content. |
| keep_source | boolean | Whether to retain the original field. |
| keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs. |

- processor_split_string

Table 2 Delimiter parsing

| Parameter | Type | Description |
|---|---|---|
| source_key | string | Original field name. |
| split_sep | string | Delimiter string. |
| keys | string | Field names for the extracted content. |
| keep_source | boolean | Whether to retain the original field in the parsed log. |
| split_type | char/special_char/string | Delimiter type: char (single character), special_char (invisible character), or string. |
| keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs. |

- processor_split_key_value

Table 3 Key-value pair splitting

| Parameter | Type | Description |
|---|---|---|
| source_key | string | Original field name. |
| delimiter | string | Delimiter between key-value pairs. The default value is the tab character (\t). |
| separator | string | Delimiter between the key and value in a key-value pair. The default value is a colon (:). |
| keep_source | boolean | Whether to retain the original field in the parsed log. |

- processor_add_fields

Table 4 Adding fields

| Parameter | Type | Description |
|---|---|---|
| fields | json/object | Names and values of the fields to add, in key-value pair format. Multiple key-value pairs can be added. |

- processor_drop

Table 5 Dropping fields

| Parameter | Type | Description |
|---|---|---|
| drop_keys | string | List of fields to drop. |

- processor_rename

Table 6 Renaming fields

| Parameter | Type | Description |
|---|---|---|
| source_keys | string | Fields to rename. |
| dest_keys | string | New field names. |

- processor_json

Table 7 JSON expansion and extraction

| Parameter | Type | Description |
|---|---|---|
| source_key | string | Original field name. |
| keep_source | boolean | Whether to retain the original field in the parsed log. |
| expand_depth | int | JSON expansion depth. The default value 0 indicates no depth limit; other values, such as 1, indicate the number of layers to expand. |
| expand_connector | string | Connector for expanded keys. The default value is an underscore (_). |
| prefix | string | Prefix added to field names when JSON is expanded. |
| keep_source_if_parse_error | boolean | Whether to retain the original field when a parsing error occurs. |

- processor_filter_regex

Table 8 Filtering

| Parameter | Type | Description |
|---|---|---|
| include | json/object | The key is a log field and the value is the regular expression to match. Logs that match are retained. |
| exclude | json/object | The key is a log field and the value is the regular expression to match. Logs that match are discarded. |

- processor_gotime

Table 9 Time extraction

| Parameter | Type | Description |
|---|---|---|
| source_key | string | Original field name. |
| source_format | string | Original time format. |
| source_location | int | Original time zone. If this parameter is empty, the time zone of the host or container where the agent resides is used. |
| dest_key | string | Target field after parsing. |
| dest_format | string | Time format after parsing. |
| dest_location | int | Time zone after parsing. If this parameter is empty, the time zone of the local host is used. |
| set_time | boolean | Whether to set the parsed time as the log time. |
| keep_source | boolean | Whether to retain the original field in the parsed log. |
- Example:

    [
      {
        "type": "processor_regex",
        "detail": {
          "source_key": "content",
          "regex": ".*",
          "keys": ["key1", "key2"],
          "multi_line_regex": ".*",
          "keep_source": true,
          "keep_source_if_parse_error": true
        }
      },
      {
        "type": "processor_split_string",
        "detail": {
          "split_sep": ".",
          "split_type": "char",
          "split_keys": ["key1", "key2"],
          "source_key": "context",
          "keep_source": true,
          "keep_source_if_parse_error": true
        }
      },
      {
        "type": "processor_add_fields",
        "detail": {
          "fields": [{"key1": "value1"}, {"key2": "value2"}]
        }
      },
      {
        "type": "processor_drop",
        "detail": {
          "drop_keys": ["key1", "key2"]
        }
      },
      {
        "type": "processor_rename",
        "detail": {
          "source_keys": ["skey1", "skey2"],
          "dest_keys": ["dkey1", "dkey2"]
        }
      },
      {
        "type": "processor_json",
        "detail": {
          "source_key": "context",
          "expand_depth": 4,
          "expand_connector": "_",
          "prefix": "prefix",
          "keep_source": true,
          "keep_source_if_parse_error": true
        }
      },
      {
        "type": "processor_gotime",
        "detail": {
          "source_key": "skey",
          "source_format": "ydm",
          "source_location": 8,
          "dest_key": "dkey",
          "dest_format": "ydm",
          "dest_location": 8,
          "set_time": true,
          "keep_source": true,
          "keep_source_if_parse_error": true
        }
      },
      {
        "type": "processor_filter_regex",
        "detail": {
          "include": {"ikey1": ".*", "ikey2": ".*"},
          "exclude": {"ekey1": ".*", "ekey2": ".*"}
        }
      }
    ]
Custom Time
Enable Custom Time and set parameters by referring to Table 10.
![](https://support.huaweicloud.com/intl/en-us/usermanual-lts/public_sys-resources/note_3.0-en-us.png)
- If the time format is incorrect or the specified field does not exist, the log time is the time set during ingestion configuration.
- The time field must be verified again after operations such as renaming a field, deleting a field, or changing a field type in the structuring parsing configuration.
![](https://support.huaweicloud.com/intl/en-us/usermanual-lts/en-us_image_0000001698158716.png)
Table 10 Custom time parameters

| Parameter | Description | Example |
|---|---|---|
| Key Name of the Time Field | Name of an extracted field. You can select an extracted field from the drop-down list. The field is of the string or long type. | test |
| Field Value | Value of the extracted field. After a key is selected, the value is filled in automatically. NOTE: The value must be within 24 hours before or after the current time. | 2023-07-19 12:12:00 |
| Time Format | For details, see Common Log Time Formats. | yyyy-MM-dd HH:mm:ss |
| Operation | Click the verification icon to verify the time field. | - |
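As a sketch of what custom-time parsing involves, the following assumes the Java-style pattern yyyy-MM-dd HH:mm:ss maps to Python's %Y-%m-%d %H:%M:%S; the 24-hour check mirrors the constraint in Table 10 and the fallback described in the note above.

```python
from datetime import datetime, timedelta, timezone

# "yyyy-MM-dd HH:mm:ss" corresponds roughly to "%Y-%m-%d %H:%M:%S" in Python.
value = "2023-07-19 12:12:00"
parsed = datetime.strptime(value, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)

# Illustrative check mirroring the 24-hour constraint; for this 2023 sample
# the fallback branch is taken.
now = datetime.now(timezone.utc)
if abs(now - parsed) > timedelta(hours=24):
    print("out of range: fall back to the ingestion-time setting")
else:
    print("use", parsed, "as the log time")
```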
Common Log Time Formats
The following table lists the available common log time formats.
![](https://support.huaweicloud.com/intl/en-us/usermanual-lts/public_sys-resources/note_3.0-en-us.png)
By default, log timestamps in LTS are accurate to seconds. You do not need to configure information such as milliseconds and microseconds.
| Format | Description | Example |
|---|---|---|
| EEE | Abbreviated day of the week. | Fri |
| EEEE | Full day of the week. | Friday |
| MMM | Abbreviated month name. | Jan |
| MMMM | Full month name. | January |
| dd | Day of the month, ranging from 01 to 31 (decimal). | 07, 31 |
| HH | Hour, in 24-hour format. | 22 |
| hh | Hour, in 12-hour format. | 11 |
| MM | Month, ranging from 01 to 12 (decimal). | 08 |
| mm | Minute, ranging from 00 to 59 (decimal). | 59 |
| a | AM or PM. | AM, PM |
| hh:mm:ss a | Time in 12-hour format. | 11:59:59 AM |
| HH:mm | Hour and minute. | 23:59 |
| ss | Second, ranging from 00 to 59 (decimal). | 59 |
| yy | Year without century, ranging from 00 to 99 (decimal). | 04, 98 |
| yyyy | Year (decimal). | 2004, 1998 |
| d | Day of the month, ranging from 1 to 31 (decimal). A single-digit value is preceded by a space. | 7, 31 |
| DDD | Day of the year, ranging from 001 to 366 (decimal). | 365 |
| u | Day of the week, ranging from 1 to 7 (decimal). The value 1 indicates Monday. | 2 |
| w | Week of the year, with Sunday as the first day of a week, ranging from 00 to 53. | 23 |
| W | Week of the year, with Monday as the first day of a week, ranging from 01 to 53. A week is counted as the first week of a year if it has at least four days in that year; otherwise, the following week is the first week. | 24 |
| U | Day of the week, ranging from 0 to 6 (decimal). The value 0 indicates Sunday. | 5 |
| EEE MMM dd HH:mm:ss yyyy | Standard date and time. | Tue Nov 20 14:12:58 2020 |
| EEE MMM dd yyyy | Standard date without time. | Tue Nov 20 2020 |
| HH:mm:ss | Standard time without date. | 11:59:59 |
| %s | UNIX timestamp. | 147618725 |
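For readers validating formats locally, here is a sketch that parses a few of the formats above with Python's strptime. The Java-style-to-strptime mapping is an assumption for illustration, and the sample strings are made up.

```python
from datetime import datetime

# Java-style LTS patterns and rough Python strptime equivalents (assumed mapping).
samples = {
    "EEE MMM dd HH:mm:ss yyyy": ("Fri Nov 20 14:12:58 2020", "%a %b %d %H:%M:%S %Y"),
    "yyyy-MM-dd HH:mm:ss":      ("2020-11-20 14:12:58",      "%Y-%m-%d %H:%M:%S"),
    "HH:mm:ss":                 ("11:59:59",                 "%H:%M:%S"),
}

for lts_format, (example, py_format) in samples.items():
    print(lts_format, "->", datetime.strptime(example, py_format))
```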