Ingesting BMS Text Logs to LTS
A Bare Metal Server (BMS) features both the scalability of VMs and high performance of physical servers. It provides dedicated servers on the cloud, delivering the performance and security required by core databases, critical applications, high-performance computing (HPC), and big data.
After you configure BMS log ingestion, ICAgent collects logs from BMS based on your specified rules, and sends the logs to LTS by log stream. You can view and analyze these logs on the LTS console for improving host running stability and information security.
Perform the following steps to complete the ingestion configuration:
- Step 1: Select a Log Stream
- Step 2: (Optional) Select a Host Group
- Step 3: Configure the Collection
- Step 4: Configure Indexing
- Step 5: Complete the Ingestion Configuration
To collect logs for multiple scenarios, you can set multiple ingestion configurations in a batch. For details, see Setting Multiple Ingestion Configurations in a Batch.
Prerequisites
ICAgent has been installed and added to the host group. You have enabled ICAgent Diagnosis to view exceptions, overall status, and collection status of ICAgent. For details, see Setting ICAgent Collection.
Step 1: Select a Log Stream
- Log in to the LTS console.
- Choose Log Ingestion > Ingestion Center in the navigation pane. Then, click BMS (Bare Metal Server).
Alternatively, choose Log Ingestion > Ingestion Management in the navigation pane, and click Ingest Log > BMS (Bare Metal Server) on the displayed page.
Alternatively, choose Log Management in the navigation pane and click the target log stream to access its details page. Click in the upper right corner. On the displayed page, click the Log Ingestion tab and click Ingest Log. In the displayed dialog box, click BMS (Bare Metal Server).
- Select a log group from the Log Group drop-down list. If there is no desired log group, click Create Log Group. For details, see Managing Log Groups.
- Select a log stream from the Log Stream drop-down list. If there is no desired log stream, click Create Log Stream. For details, see Managing Log Streams.
- Click Next: (Optional) Select Host Group.
Step 2: (Optional) Select a Host Group
- Select one or more host groups from which you want to collect logs. If there are no desired host groups, click Create above the host group list to create one. For details, see Managing Host Groups.
You can also skip this step, but the collection configuration will not take effect until host groups are associated. You are advised to select a host group during the first ingestion configuration. If you skip this step, use either of the following methods to configure host groups after the ingestion configuration is complete:
- Choose Host Management > Host Groups in the navigation pane and associate host groups with ingestion configurations.
- On the Ingestion Rule tab page, click Edit in the Operation column. On the displayed page, select required host groups.
- Click Next: Configurations.
Step 3: Configure the Collection
After selecting host groups, configure the collection as follows:
- Ensure that sensitive information is not collected.
- If a collection path of a host has been configured in AOM, do not configure the path in LTS.
- If log files were last modified more than 12 hours earlier than the time when the path is added, the files are not collected.
- LTS cannot collect logs of PostgreSQL (database) instances.
- Collection Configuration Name: Enter 1 to 64 characters. Only letters, digits, hyphens (-), underscores (_), and periods (.) are allowed. The name cannot start with a period or underscore, or end with a period (see the validation sketch below).
If you want to reuse existing collection configurations, click Import Configuration next to the text box. On the Import Configuration page, select a configuration and click OK.
Import Old-Edition Configuration: Imports an old-edition host ingestion configuration into the new-edition log ingestion.
- If LTS is newly installed and Import Old-Edition Configuration is not displayed, you can directly create a configuration without importing the old one.
- If LTS is upgraded, Import Old-Edition Configuration is displayed. Import the old configuration or create one as required.
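For reference, the name rule above can be checked locally with a single regular expression. The following is a minimal sketch under the stated rule; the pattern and function name are illustrative, not part of LTS:

```python
import re

# Illustrative check mirroring the stated rule: 1 to 64 characters; only
# letters, digits, hyphens, underscores, and periods; must not start with a
# period or underscore; must not end with a period.
NAME_PATTERN = re.compile(r"^(?![._])[A-Za-z0-9._-]{1,64}(?<!\.)$")

def is_valid_config_name(name: str) -> bool:
    return bool(NAME_PATTERN.match(name))

print(is_valid_config_name("bms-app.access_log"))  # True
print(is_valid_config_name(".hidden-config"))      # False: starts with a period
print(is_valid_config_name("config."))             # False: ends with a period
```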
- Collection Paths: Add one or more host paths. LTS will collect logs from these paths. The rules for setting collection paths are as follows (a path-matching sketch is provided at the end of this list):
- Logs can be collected recursively. A double asterisk (**) can represent up to 5 directory levels in a path.
For example, /var/logs/**/a.log will match the following logs:
/var/logs/a.log
/var/logs/1/a.log
/var/logs/1/2/a.log
/var/logs/1/2/3/a.log
/var/logs/1/2/3/4/a.log
/var/logs/1/2/3/4/5/a.log
- /1/2/3/4/5/ indicates the five directory levels under the /var/logs directory. All a.log files found in these directories will be collected.
- Only one double asterisk (**) can be contained in a collection path. For example, /var/logs/**/a.log is acceptable but /opt/test/**/log/** is not.
- A collection path cannot begin with a double asterisk (**), such as /**/test, to avoid collecting system files.
- You can use an asterisk (*) as a wildcard for fuzzy match. The wildcard (*) can represent one or more characters of a directory or file name.
If a log collection path is similar to C:\windows\system32 but logs cannot be collected, enable Web Application Firewall (WAF) and configure the path again.
- Example 1: /var/logs/*/a.log will match all a.log files found in all directories under the /var/logs/ directory:
/var/logs/1/a.log
/var/logs/2/a.log
- Example 2: /var/logs/service-*/a.log will match files as follows:
/var/logs/service-1/a.log
/var/logs/service-2/a.log
- Example 3: /var/logs/service/a*.log will match files as follows:
/var/logs/service/a1.log
/var/logs/service/a2.log
- If the collection path is set to a directory (such as /var/logs/), only .log, .trace, and .out files in the directory are collected.
If the collection path is set to a file name, the corresponding file is collected. Only text files can be collected.
- Add Custom Wrapping Rule: ICAgent determines whether a file has been wrapped based on the file name rule. If your wrapping rule does not comply with the built-in rules, you can add a custom wrapping rule to prevent log loss or repeated collection when files are wrapped.
The built-in rules are {basename}{connector}{wrapping identifier}.{suffix} and {basename}.{suffix}{connector}{wrapping identifier}. The connector can be a hyphen (-), period (.), or underscore (_); the wrapping identifier is a non-letter symbol; and the suffix consists of letters.
A custom wrapping rule consists of {basename} and the feature regular expression of the wrapped file. Example: If your log file name is /opt/test.out.log, and the wrapped file names are test.2024-01-01.0.out.log and test.2024-01-01.1.out.log, the collection path is /opt/*.log and the wrapping rule is {basename}\.[-0-9\.].out.log.
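To illustrate how a custom wrapping rule distinguishes the active log file from its wrapped copies, here is a minimal sketch. The file names and the feature regular expression below are illustrative assumptions (a simplified variant of the example rule above), not ICAgent's internal logic:

```python
import re

# Illustrative: basename "test" plus a feature expression that matches wrapped
# names such as test.2024-01-01.0.out.log. Adjust the expression to your own
# wrapping scheme when you configure the rule in LTS.
basename = "test"
feature_expr = r"\.[0-9.-]+\.out\.log"      # assumption for this example
wrap_rule = re.compile(re.escape(basename) + feature_expr + "$")

for name in ("test.out.log", "test.2024-01-01.0.out.log", "test.2024-01-01.1.out.log"):
    kind = "wrapped file" if wrap_rule.fullmatch(name) else "active log file"
    print(f"{name}: {kind}")
# test.out.log: active log file
# test.2024-01-01.0.out.log: wrapped file
# test.2024-01-01.1.out.log: wrapped file
```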
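Putting the recursion and wildcard rules above together, the following minimal sketch reproduces the documented matching behavior for local testing. The paths, the helper names, and the translation into a regular expression are illustrative assumptions; they are not how ICAgent is implemented:

```python
import re

def to_regex(pattern: str, max_depth: int = 5) -> re.Pattern:
    # Translate an LTS-style collection path into a regex for local testing:
    # '*'  matches one or more characters within a single directory or file name;
    # '**' matches zero to max_depth directory levels (documented limit: 5).
    parts = pattern.split("**/")
    escaped = [re.escape(p).replace(r"\*", "[^/]+") for p in parts]
    return re.compile("^" + rf"(?:[^/]+/){{0,{max_depth}}}".join(escaped) + "$")

def is_valid_collection_path(pattern: str) -> bool:
    # Only one '**' is allowed, and a path must not begin with '**'.
    return pattern.count("**") <= 1 and not pattern.startswith("/**")

pattern = "/var/logs/**/a.log"                           # hypothetical collection path
print(is_valid_collection_path(pattern))                 # True
print(is_valid_collection_path("/opt/test/**/log/**"))   # False: two '**'

rx = to_regex(pattern)
for path in ("/var/logs/a.log",
             "/var/logs/1/2/3/4/5/a.log",
             "/var/logs/1/2/3/4/5/6/a.log"):
    print(path, "->", bool(rx.match(path)))
# /var/logs/a.log             -> True  ('**' may match zero levels)
# /var/logs/1/2/3/4/5/a.log   -> True  (five levels, the documented maximum)
# /var/logs/1/2/3/4/5/6/a.log -> False (more than five levels)
```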
- Allow Repeated File Collection: A file can be collected multiple times (not available on Windows).
After you enable Allow Repeated File Collection, one host log file can be collected to multiple log streams. This function is available only to certain ICAgent versions. For details, see Checking the ICAgent Version Description.
After you disable this function, each collection path must be unique. That is, the same log file on the same host cannot be collected to different log streams.
- Set Collection Filters: Blacklisted directories or files will not be collected.
Blacklist filters can be exact matches or wildcard pattern matches. For details, see Collection Paths.
- If you blacklist a file or directory that has been set as a collection path in the previous step, the blacklist settings will be used and the file or files in the directory will be filtered out.
- If a log has been added to the blacklist, it cannot be collected even if you create a log ingestion task. You can collect it again only after you delete the collection path from the blacklist.
- If you specify a directory, all files directly in that directory are filtered out, but log files in its subdirectories are still collected, as shown in the sketch below.
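The following minimal sketch illustrates this blacklist behavior with hypothetical paths and filters; the helper and its wildcard handling are assumptions for illustration, not the ICAgent implementation:

```python
import os
import re

def _wildcard_rx(pattern: str) -> re.Pattern:
    # '*' matches one or more characters within a single path segment,
    # as described in the collection path rules above.
    return re.compile("^" + re.escape(pattern).replace(r"\*", "[^/]+") + "$")

def is_blacklisted(path: str, file_filters: list[str], dir_filters: list[str]) -> bool:
    if any(_wildcard_rx(f).match(path) for f in file_filters):
        return True
    # A directory filter drops only files directly in that directory.
    parent = os.path.dirname(path)
    return any(_wildcard_rx(d.rstrip("/")).match(parent) for d in dir_filters)

dir_filters = ["/var/logs/debug"]           # hypothetical directory blacklist
file_filters = ["/var/logs/*/trace.log"]    # hypothetical file blacklist
print(is_blacklisted("/var/logs/debug/a.log", file_filters, dir_filters))      # True
print(is_blacklisted("/var/logs/debug/sub/a.log", file_filters, dir_filters))  # False: subdirectory
print(is_blacklisted("/var/logs/app/trace.log", file_filters, dir_filters))    # True
```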
- Collect Windows Event Logs: To collect logs from Windows hosts, enable this option and set the following parameters.
Table 1 Parameters for collecting Windows event logs
- Log Type: Log types include System, Application, Security, and Startup.
- First Collection Time Offset: If you set this parameter to 7, logs generated within the 7 days before the collection start time are collected. This offset takes effect only for the first collection to ensure that logs are not collected repeatedly. Max: 7 days.
- Event Level: You can filter and collect Windows events based on their severity (information, warning, error, critical, and verbose). This function is available only on Windows Vista or later.
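As a rough illustration of how the offset and level parameters narrow down what is collected, here is a minimal sketch over hypothetical event data; the data, cutoff calculation, and level set are assumptions, not ICAgent behavior:

```python
from datetime import datetime, timedelta

# Hypothetical Windows events: (generation time, level, message)
events = [
    (datetime(2024, 5, 1, 8, 0), "error", "disk failure"),
    (datetime(2024, 4, 20, 9, 0), "warning", "old event, outside the offset window"),
    (datetime(2024, 5, 2, 10, 0), "verbose", "level not selected"),
]

collection_start = datetime(2024, 5, 3, 0, 0)
first_collection_offset_days = 7          # "First Collection Time Offset", max 7
selected_levels = {"information", "warning", "error", "critical"}

# Only events generated within the offset window and matching a selected level pass.
cutoff = collection_start - timedelta(days=first_collection_offset_days)
collected = [e for e in events if e[0] >= cutoff and e[1] in selected_levels]
for ts, level, msg in collected:
    print(ts, level, msg)   # only the 2024-05-01 error event is collected
```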
- Enable structuring parsing. For details, see Setting ICAgent Structuring Parsing Rules.
LTS enables combined parsing, allowing you to create different structuring parsing rules for each collection configuration of a log stream.
If you have configured cloud structuring parsing, delete its configurations before configuring ICAgent structuring parsing.
Figure 1 ICAgent structuring parsing configuration
- Set other configurations.
- Configure the log format and time by referring to Table 3.
Table 3 Log collection settings
Log Format
- Single-line: Each log line is displayed as a single log event.
- Multi-line: Multiple lines of an exception log event (for example, a stack trace) are displayed as a single log event. This is helpful when you check logs to locate problems.
Log Time
System time: the log collection time, which is used by default and displayed at the beginning of each log event.
NOTE:
- Log collection time is the time when logs are collected and sent by ICAgent to LTS.
- Log printing time is the time when logs are printed. ICAgent collects and sends logs to LTS at an interval of 1 second.
- Restriction on log collection time: Logs are collected only within 24 hours before and after the system time.
Time wildcard: You can set a time wildcard so that ICAgent will look for the log printing time as the beginning of a log event.
- If the time format in a log event is 2019-01-01 23:59:59.011, set the time wildcard to YYYY-MM-DD hh:mm:ss.SSS.
- If the time format in a log event is 19-1-1 23:59:59.011, set the time wildcard to YY-M-D hh:mm:ss.SSS.
NOTE: If a log event does not contain year information, ICAgent regards it as printed in the current year.
Example:
YY - year (19)
YYYY - year (2019)
M - month (1)
MM - month (01)
D - day (1)
DD - day (01)
hh - hours (23)
mm - minutes (59)
ss - seconds (59)
SSS - millisecond (999)
hpm - hours (03PM)
h:mmpm - hours:minutes (03:04PM)
h:mm:sspm - hours:minutes:seconds (03:04:05PM)
hh:mm:ss ZZZZ (16:05:06 +0100)
hh:mm:ss ZZZ (16:05:06 CET)
hh:mm:ss ZZ (16:05:06 +01:00)
Log Segmentation
This parameter needs to be specified if Log Format is set to Multi-line. By generation time indicates that a time wildcard is used to detect log boundaries, whereas By regular expression indicates that a regular expression is used (see the sketch after this table).
Regular Expression
You can set a regular expression to look for a specific pattern to indicate the beginning of a log event. This parameter needs to be specified when you select Multi-line for Log Format and By regular expression for Log Segmentation.
The time wildcard and regular expression will look for the specified pattern right from the beginning of each log line. If no match is found, the system time, which may be different from the time in the log event, is used. In general cases, you are advised to select Single-line for Log Format and System time for Log Time.
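For reference, the following minimal sketch shows how a start-of-event regular expression can group a multi-line stack trace into single log events, and how a time wildcard such as YYYY-MM-DD hh:mm:ss.SSS corresponds to a concrete parsing format. The sample lines, the regular expression, and the strptime mapping are illustrative assumptions, not ICAgent internals:

```python
import re
from datetime import datetime

raw_lines = [
    "2019-01-01 23:59:59.011 ERROR Something failed",
    "  at com.example.Service.run(Service.java:42)",
    "  at com.example.Main.main(Main.java:7)",
    "2019-01-01 23:59:59.012 INFO Recovered",
]

# Start-of-event pattern corresponding to the time wildcard YYYY-MM-DD hh:mm:ss.SSS
event_start = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}")

events: list[list[str]] = []
for line in raw_lines:
    if event_start.match(line) or not events:
        events.append([line])           # a new log event begins here
    else:
        events[-1].append(line)         # continuation line of a multi-line event

for event in events:
    # YYYY-MM-DD hh:mm:ss.SSS roughly corresponds to this strptime format
    printed_at = datetime.strptime(event[0][:23], "%Y-%m-%d %H:%M:%S.%f")
    print(printed_at, "->", len(event), "line(s) in this event")
```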
Step 4: Configure Indexing
- (Optional) Configure indexing. For details, see Setting Indexes.
- Click Submit.
Step 5: Complete the Ingestion Configuration
- The created ingestion configuration is displayed on the Ingestion Rule tab page. Click its name to view its details.
- Click Edit in the Operation column to modify the ingestion configuration.
- Click Configure Tag in the Operation column to add a tag.
- Click More > Copy in the Operation column to copy the ingestion configuration.
- Click More > Delete in the Operation column to delete the ingestion configuration.
- Click More > ICAgent Collect Diagnosis in the Operation column of the ingestion configuration to monitor the exceptions, overall status, and collection status of ICAgent.
Setting Multiple Ingestion Configurations in a Batch
You can set multiple ingestion configurations for multiple scenarios in a batch, avoiding repetitive setups.
- On the Ingestion Management page, click Batch Ingestion to go to the details page. For details, see Table 4.
Table 4 Adding configurations in batches
Basic Settings
- Ingestion Type: Select BMS (Bare Metal Server).
- Configurations to Add: Enter the number of ingestion configurations in the text box and click Add. A maximum of 100 ingestion configurations can be added, including the one that already exists under Ingestion Settings by default. Therefore, you can add up to 99 more.
Ingestion Settings
- Configuration List:
  - The ingestion configurations are displayed on the left. You can add up to 99 more configurations.
  - The ingestion configuration details are displayed on the right. Set them by referring to Step 3: Configure the Collection.
  - After an ingestion configuration is complete, you can click Apply to Other Configurations to copy its settings to other configurations.
- Click Check Parameters. After the check is successful, click Submit.
The added ingestion configurations will be displayed on the Ingestion Rule tab page after the batch creation is successful.
- (Optional) Perform the following operations on ingestion configurations:
- Select multiple existing ingestion configurations and click Edit. On the displayed page, select an ingestion type to modify the corresponding ingestion configurations.
- Select multiple existing ingestion configurations and click Enable or Disable. If you toggle off the switch in the Status column of an ingestion configuration, logs will not be collected for this configuration.
- Select multiple existing ingestion configurations and click Delete.