DSL Dashboard Template
LTS provides DSL processing for one-stop log processing. Using a domain-specific scripting language (DSL) and more than 200 built-in functions, you can implement end-to-end log processing tasks on the LTS console, such as log normalization, enrichment, transfer, anonymization, and filtering.
LTS provides the DSL processing task monitoring center dashboard template, which displays information such as the processing task ID and name and the numbers of input, output, filtered, and failed lines.
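Because every chart in this template reads the same per-task metric fields, those fields can also be combined in ad hoc queries. The statement below is a minimal sketch rather than part of the template: it assumes the metric fields "process.accept", "process.delivered", and "process.failed" used by the chart queries later in this section, and the console's implicit log-stream source, as in the template statements.
select round(sum("process.delivered") * 100.0 / sum("process.accept"), 2) as "Delivery rate (%)", sum("process.failed") as "Failed lines"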
Prerequisites
- A DSL processing task has been created.
- Logs have been structured. For details, see Setting Cloud Structuring Parsing.
DSL Processing Task Monitoring Center
- Log in to the LTS console. In the navigation pane, choose Dashboards.
- Choose DSL dashboard templates under Dashboard Templates and click DSL processing task monitoring center to view the chart details.
- Filter by processing task ID. The associated query and analysis statement is:
select distinct(task_id)
- Filter by processing task name. The associated query and analysis statement is:
select distinct(task_name)
- Input Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "input" < 1000 THEN concat(cast("input" AS VARCHAR), 'Lines')
    WHEN "input" < 1000 * 1000 THEN concat(cast(round("input" / 1000.0, 1) AS VARCHAR), 'Thousand lines')
    WHEN "input" < 1000000000 THEN concat(cast(round("input" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "input" / 1000.0 < 1000000000 THEN concat(cast(round("input" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("input" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("process.accept") AS "input")
- Output Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "delivered" < 1000 THEN concat(cast("delivered" AS VARCHAR), 'Lines')
    WHEN "delivered" < 1000 * 1000 THEN concat(cast(round("delivered" / 1000.0, 1) AS VARCHAR), 'Thousand lines')
    WHEN "delivered" < 1000000000 THEN concat(cast(round("delivered" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "delivered" / 1000.0 < 1000000000 THEN concat(cast(round("delivered" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("delivered" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("process.delivered") AS "delivered")
- Filtered Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "drop" < 1000 THEN concat(cast("drop" AS VARCHAR), 'Lines')
    WHEN "drop" < 1000 * 1000 THEN concat(cast(round("drop" / 1000.0, 1) AS VARCHAR), 'Thousand lines')
    WHEN "drop" < 1000000000 THEN concat(cast(round("drop" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "drop" / 1000.0 < 1000000000 THEN concat(cast(round("drop" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("drop" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("process.drop") AS "drop")
- Failed Lines. The associated query and analysis statement is:
SELECT CASE
    WHEN "failed" < 1000 THEN concat(cast("failed" AS VARCHAR), 'Lines')
    WHEN "failed" < 1000 * 1000 THEN concat(cast(round("failed" / 1000.0, 1) AS VARCHAR), 'Thousand lines')
    WHEN "failed" < 1000000000 THEN concat(cast(round("failed" / 1000000.0, 1) AS VARCHAR), 'Million lines')
    WHEN "failed" / 1000.0 < 1000000000 THEN concat(cast(round("failed" / 1000 / 1000000.0, 1) AS VARCHAR), 'Billion lines')
    ELSE concat(cast(round("failed" / 1000.0 / 1000 / 1000 / 1000, 1) AS VARCHAR), 'Trillion lines')
  END AS "total"
FROM (SELECT sum("process.failed") AS "failed")
- Execution Records. The associated query and analysis statement is as follows (a filtered variant of this query is sketched after this list):
select TIME_FORMAT(MILLIS_TO_TIMESTAMP("start"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Started",
  TIME_FORMAT(MILLIS_TO_TIMESTAMP("end"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Ended",
  "process.accept" as "Input Lines",
  "process.delivered" as "Output Lines",
  "process.drop" as "Filtered Lines",
  "process.failed" as "Failed Lines"
limit 1000
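The execution records query can be combined with the task ID filter above to troubleshoot a single task. The statement below is a minimal sketch rather than part of the template: it assumes the task_id field exposed by the filter, the same metric fields as the chart queries, and the console's implicit log-stream source; the task ID value is a hypothetical placeholder to replace with a real ID returned by the filter. It lists only runs of that task that filtered out or failed at least one line.
select TIME_FORMAT(MILLIS_TO_TIMESTAMP("start"), 'yyyy-MM-dd HH:mm:ss:SSS', '+08:00') as "Started",
  "process.accept" as "Input Lines",
  "process.drop" as "Filtered Lines",
  "process.failed" as "Failed Lines"
where task_id = 'your-task-id' -- hypothetical placeholder: use an ID from the task ID filter above
  and ("process.drop" > 0 or "process.failed" > 0)
limit 100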
Parent topic: Dashboard Templates