Ingesting OBS Files to LTS (Beta)
Files in OBS buckets can be imported to LTS once or periodically. You can then search for, analyze, and process these files. Files in encrypted buckets cannot be imported to LTS. To import such files, delete the buckets' encryption configurations first. For details, see Deleting the Encryption Configuration of a Bucket.
This function is available only for whitelisted users in regions CN North-Beijing4 and CN South-Guangzhou. To use it, submit a service ticket.
Importing Files in a Single OBS Bucket to LTS
- Log in to the LTS console.
- Choose Log Ingestion > Ingestion Center in the navigation pane and click OBS (Object Storage Service).
Alternatively, choose Log Ingestion > Ingestion Management in the navigation pane, and click Ingest Log > OBS (Object Storage Service).
Alternatively, choose Log Management in the navigation pane and click the target log stream to access its details page. Click the icon in the upper right corner. On the displayed page, click the Log Ingestion tab and click Ingest Log. In the displayed dialog box, click OBS (Object Storage Service).
- Select a log group from the Log Group drop-down list. If there is no desired log group, click Create Log Group. For details, see Managing Log Groups.
- Select a log stream from the Log Stream drop-down list. If there is no desired log stream, click Create Log Stream. For details, see Managing Log Streams.
- Click Next: Configurations.
- On the Configurations page, set parameters by referring to Table 1.
Table 1 Collection configuration parameters
Parameter
Description
Basic Settings
Collection Configuration Name
Enter a name containing 1 to 64 characters. Only letters, digits, hyphens (-), underscores (_), and periods (.) are allowed. It cannot start with a period or underscore, or end with a period.
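The naming rule above can be captured in a single regular expression. The following is an illustrative sketch, not the exact check the LTS console performs:

```python
import re

# Illustrative pattern for the naming rule: 1 to 64 characters drawn from
# letters, digits, hyphens, underscores, and periods; the name must not
# start with a period or underscore, and must not end with a period.
NAME_PATTERN = re.compile(r"^(?![._])[A-Za-z0-9._-]{1,64}(?<!\.)$")

def is_valid_name(name: str) -> bool:
    return NAME_PATTERN.fullmatch(name) is not None

print(is_valid_name("obs-import_01"))  # True
print(is_valid_name(".obs"))           # False (starts with a period)
print(is_valid_name("obs."))           # False (ends with a period)
```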
Task Monitoring
This function is enabled by default.
It logs task execution statuses to log stream lts-system/lts-obs2lts-statistics, allowing you to view OBS file import data on LTS's Task Monitoring Center and configure alarm rules to promptly detect any import issues.
OBS Data Source Configuration
OBS Bucket
Select the OBS bucket from which log files are imported.
Folder Prefix
Enter the prefix (your_prefix/) or the full path (your_prefix/file.gz) of the OBS files to be imported. Only files whose original size is less than 5 GB can be imported.
Regular Expression for File Filtering
Enter a regular expression for filtering files, so that only files with names matching the regular expression will be imported. If no regular expression is specified, files are not filtered.
NOTE:
Assume that there are files aab and aba in the directory:
- To match only file aab, use regular expression aab, aa, ^aab, or ^aa.
- To match only file aba, use regular expression aba, ^aba, or ^ab. Do not use ab, which will also match aab.
- To match both aab and aba, use regular expression ab or a.*.
- If the file name contains regular expression metacharacters, they must be escaped. For example, {} must be escaped as \{\}.
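The matching behavior described in the note can be reproduced with an unanchored regex search. This sketch assumes LTS treats a match anywhere in the file name as a hit, which is consistent with the aab/aba examples above:

```python
import re

def matches_filter(filename: str, pattern: str) -> bool:
    """Return True if the filter regex matches anywhere in the file
    name (unanchored search), mirroring the examples in the note."""
    return re.search(pattern, filename) is not None

files = ["aab", "aba"]

print([f for f in files if matches_filter(f, "aa")])   # ['aab']
print([f for f in files if matches_filter(f, "^ab")])  # ['aba']
print([f for f in files if matches_filter(f, "ab")])   # ['aab', 'aba']
```

Note that `ab` matches both files because it occurs inside aab as well, which is why the note warns against using it to select only aba.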
Compressed Format
Automatic detection, no compression, and gzip, zip, and snappy compression are supported. If you select ZIP, only zip packages that contain a single file and no folders are supported.
Import Interval
- One-off: LTS imports files only once and does not detect new files.
- Custom interval: LTS automatically detects new files and imports them at a fixed interval.
If you enable Restore Archived Files, OBS files in the Archive storage class can be restored. This option must be enabled to import Archive files. Restoring Archive files takes some time (an expedited restore takes 1 to 5 minutes). For details, see Object Restore Option and Time Required. Clicking Preview in the lower right corner for the first time may time out; if it does, click Preview again.
NOTE:
- When a periodic task performs its first scan of files on OBS, it scans files whose last modification time falls within (first run time – fixed interval, first run time]. For example, if the periodic task first runs at 12:00:00 and the fixed interval of the OBS import task is 10 minutes, the first scan covers files last modified within (11:50:00, 12:00:00]. The second run at 12:10:00 covers files last modified within (12:00:00, 12:10:00].
- If a one-off task fails to process a file, it will not parse or report any other scanned files to LTS.
- If a custom-interval task is disabled and then enabled again, monitoring data continuity is retained for up to one day.
- If a custom-interval task fails to process files in an interval, it will not parse or report any other files scanned during that interval to LTS.
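The scan-window arithmetic in the first note above can be sketched as follows; the times and the 10-minute interval are the example values from the note, not fixed behavior:

```python
from datetime import datetime, timedelta

def scan_window(run_time: datetime, interval: timedelta):
    """Half-open window (run_time - interval, run_time] of last-modification
    times covered by one run of a periodic OBS import task."""
    return (run_time - interval, run_time)

interval = timedelta(minutes=10)
first_run = datetime(2024, 1, 1, 12, 0, 0)

# First run at 12:00:00 scans files modified in (11:50:00, 12:00:00];
# the second run at 12:10:00 scans (12:00:00, 12:10:00].
start, end = scan_window(first_run, interval)
print(start.time(), end.time())  # 11:50:00 12:00:00
```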
Data Format Configuration
Log File Code
The log file encoding format can be UTF-8 or GBK.
UTF-8 encoding is a variable-length encoding mode and represents Unicode character sets. GBK, an acronym for Chinese Internal Code Extension Specification, is a Chinese character encoding standard that extends both the ASCII and GB2312 encoding systems.
Extraction Mode
Select an extraction mode based on the log type. A maximum of 1 MB of an OBS log will be extracted. Any excess will be truncated and discarded.
NOTE:
- If a single-line log exceeds 1 MB, the excess of that line will be truncated and discarded.
- If a block in a multi-line log exceeds 1 MB, the excess of that block will be truncated and discarded.
- Logs in ORC and JSON formats are parsed as single-line logs. If a single line exceeds 1 MB, the log will be discarded.
- Single-line - full-text: collects the full text of single-line logs without structuring parsing. To perform structuring parsing on logs, complete OBS file import and then configure structuring parsing by referring to Setting Cloud Structuring Parsing.
- Multi-line - full-text: collects the full text of multi-line logs (such as stack logs) without structuring parsing. To perform structuring parsing on logs, complete OBS file import and then configure structuring parsing by referring to Setting Cloud Structuring Parsing.
- ORC: collects logs in ORC format.
If Custom Time is disabled, the time when logs are collected is used as the log time.
If Custom Time is enabled, you can specify a field to set the log time. Set the time field key name, value, and time format, and click the verification icon to verify your settings. If the imported data is written to Cloud Search Service (CSS), LTS does not allow you to set a log time more than two days in the past for ORC logs. For details about the custom time format, see Patterns for Formatting and Parsing on the Oracle official website.
- JSON: collects logs in JSON format.
If Custom Time is disabled, the time when logs are collected is used as the log time.
If Custom Time is enabled, you can specify a field to set the log time. Set the time field key name, value, and time format, and click the verification icon to verify your settings. If the imported data is written to CSS, LTS does not allow you to set a log time more than two days in the past for JSON logs. For details about the custom time format, see Patterns for Formatting and Parsing on the Oracle official website.
Set 1 to 4 JSON parsing layers. The value must be an integer and defaults to 1. This function expands the fields of a JSON log. For example, for the raw log {"key1":{"key2":"value"}}, parsing into 1 layer leaves it unchanged as {"key1":{"key2":"value"}}, while parsing into 2 layers produces {"key1.key2":"value"}.
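The layer expansion can be illustrated with a small flattening function. This is a sketch of the described behavior, not LTS's actual parser:

```python
import json

def flatten(obj: dict, max_depth: int, depth: int = 1, prefix: str = "") -> dict:
    """Expand nested JSON objects up to max_depth layers, joining keys
    with periods; values deeper than max_depth are kept as-is."""
    out = {}
    for key, value in obj.items():
        full_key = prefix + key
        if isinstance(value, dict) and depth < max_depth:
            out.update(flatten(value, max_depth, depth + 1, full_key + "."))
        else:
            out[full_key] = value
    return out

raw = json.loads('{"key1":{"key2":"value"}}')
print(flatten(raw, 1))  # {'key1': {'key2': 'value'}}  (unchanged at 1 layer)
print(flatten(raw, 2))  # {'key1.key2': 'value'}
```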
- After the setting is complete, click Preview in the lower right corner. The preview function scans and returns only the first 10 lines of the first file that meets the conditions.
- Check the result preview in the lower part. If the result is correct, click Submit.
- The created ingestion configuration will be displayed. During closed beta testing, up to 10 OBS ingestion configurations can be created.
- Click its name to view its details.
- Click Modify in the Operation column to modify the ingestion configuration. Ingestion configurations whose import interval is One-off cannot be modified.
- Click Configure Tag in the Operation column to add a tag.
- Click Copy in the Operation column to copy the ingestion configuration.
- Click Delete in the Operation column to delete the ingestion configuration.
NOTE:
Deleting an ingestion configuration may lead to log collection failures, potentially resulting in service exceptions related to user logs. In addition, the deleted ingestion configuration cannot be restored. Exercise caution when performing this operation.
- Click the log stream in the Log Stream column to go to its details page. Then you can search for and analyze logs ingested to LTS. For details, see Log Search and Analysis.