Function Overview

Log Tank Service
Log Tank Service (LTS) collects log data from hosts and cloud services. By processing massive amounts of logs efficiently, securely, and in real time, LTS provides useful insights for you to optimize the availability and performance of cloud services and applications. It also helps you make real-time decisions, perform device O&M, and analyze service trends efficiently.
- Real-time log ingestion: You can ingest logs from hosts and cloud services using ICAgent, APIs, or SDKs (see the sketch below).
- Log transfer: Log transfer creates copies of your logs in destination cloud services. You can transfer logs to Object Storage Service (OBS) or Data Ingestion Service (DIS) for long-term storage.
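For illustration only, the following minimal Python sketch shows what API-based ingestion could look like. The endpoint path, header, and payload fields are assumptions made for this example, not the actual LTS API; see the API Reference for the real request format.

    # Minimal sketch of API-based log ingestion. The endpoint and payload
    # fields below are assumed for illustration, not the actual LTS API.
    import time
    import requests

    LTS_ENDPOINT = "https://lts.example-region.example.com"  # hypothetical endpoint
    LOG_GROUP_ID = "my-log-group-id"    # placeholder
    LOG_STREAM_ID = "my-log-stream-id"  # placeholder
    TOKEN = "<IAM-token>"               # placeholder credential

    def send_log(message: str, labels: dict) -> None:
        """Send one structured log event to a log stream (illustrative only)."""
        payload = {
            "log_time_ns": time.time_ns(),  # event timestamp (assumed field name)
            "labels": labels,               # key-value metadata attached to the event
            "contents": [message],          # raw log lines
        }
        resp = requests.post(
            f"{LTS_ENDPOINT}/v2/groups/{LOG_GROUP_ID}/streams/{LOG_STREAM_ID}/logs",
            json=payload,
            headers={"X-Auth-Token": TOKEN},
            timeout=5,
        )
        resp.raise_for_status()

    send_log("GET /index.html 200 12ms", {"app": "web", "env": "prod"})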

Log groups
A log group is a collection of log streams that share the same log retention settings. Up to 100 log groups can be created for a single account.
Regions: Available in all regions

Log streams
A log stream is the basic unit for reading and writing logs. You can separate different types of logs (such as operation logs and access logs) into different log streams, making them easier to manage and quicker to find when you need them. Up to 100 log streams can be created in a log group.
Regions: Available in all regions

Host management
ICAgent is the log collection tool of LTS. You need to install ICAgent on the hosts from which you want to collect logs.
Regions: Available in all regions

Host groups
Host groups allow you to configure host log ingestion efficiently. You can add multiple hosts to a host group and associate the host group with log ingestion configurations. The configurations are applied to all hosts in the host group, saving you the trouble of configuring each host individually.
- When a new host is added to a host group, it automatically inherits the log ingestion configurations associated with that group.
- You can also use host groups to modify the log collection paths of multiple hosts in one go (see the conceptual sketch after this entry).
Regions: Available in all regions
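As a conceptual sketch only (not LTS internals or its API), the Python snippet below models how ingestion configurations associated with a host group fan out to every host in the group, including hosts added later. All class and field names are illustrative.

    # Conceptual model of host groups (illustrative only; not the LTS API).
    from dataclasses import dataclass, field

    @dataclass
    class IngestionConfig:
        log_stream: str         # destination log stream
        collection_paths: list  # file paths to collect, e.g. ["/var/log/app/*.log"]

    @dataclass
    class HostGroup:
        name: str
        hosts: set = field(default_factory=set)
        configs: list = field(default_factory=list)

        def add_host(self, host_ip: str) -> None:
            # A newly added host immediately inherits every associated configuration.
            self.hosts.add(host_ip)

        def effective_configs(self, host_ip: str) -> list:
            # Every host in the group shares the same ingestion configurations.
            return self.configs if host_ip in self.hosts else []

    group = HostGroup("web-servers")
    group.configs.append(IngestionConfig("access-logs", ["/var/log/nginx/access.log"]))
    group.add_host("192.168.0.10")
    group.add_host("192.168.0.11")  # inherits the same configuration automatically
    print(group.effective_configs("192.168.0.11"))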

Log collection
ICAgent collects logs from hosts based on the collection rules you specify, then packages and sends the collected log data to LTS on a log-stream basis. You can view the logs on the LTS console in real time.
Regions: Available in all regions
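As a rough sketch of the packaging behavior described above (not ICAgent's actual implementation), the snippet below groups collected log lines into one batch per log stream before shipping them, which is what sending "on a log-stream basis" amounts to. The function names and the send callback are hypothetical.

    # Rough sketch of per-log-stream batching (illustrative; not ICAgent's real code).
    from collections import defaultdict

    def package_by_stream(collected):
        """Group (log_stream, line) pairs into one batch per log stream."""
        batches = defaultdict(list)
        for stream, line in collected:
            batches[stream].append(line)
        return dict(batches)

    def ship(batches, send):
        # Each batch is delivered to LTS as a unit for its log stream.
        for stream, lines in batches.items():
            send(stream, lines)

    collected = [
        ("access-logs", "GET / 200"),
        ("error-logs", "timeout talking to db"),
        ("access-logs", "GET /health 200"),
    ]
    ship(package_by_stream(collected), lambda stream, lines: print(stream, lines))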

Log transfer
LTS retains the log data reported by hosts and cloud services for seven days by default. To store logs for a longer period, you can transfer them to other cloud services such as OBS or DIS.
Regions: Available in all regions