Processing Logs with SQL Scheduled Jobs
LTS supports SQL scheduled jobs, allowing you to periodically analyze log content. SQL scheduled jobs use standard SQL syntax to periodically analyze logs as configured in scheduling rules, and store analysis results to a target log stream.
Currently, this function is available only to whitelisted users in regions CN North-Beijing4, CN South-Guangzhou, and CN East-Shanghai1. To use it, submit a service ticket.
Prerequisites
- Logs have been collected.
- Raw logs have been structured. For details, see Setting Cloud Structuring Parsing.
Constraints
A maximum of 20 SQL scheduled jobs can be created.
Creating a SQL Scheduled Job
- Log in to the LTS console.
- Choose Log Jobs in the navigation pane and click SQL Scheduled Jobs > Create SQL Scheduled Job.
Alternatively, choose Log Management in the navigation pane and click the name of a log group or log stream. On the log stream details page, click the log processing icon. On the displayed page, click SQL Scheduled Jobs > Create SQL Scheduled Job.
NOTE:
- When creating a SQL scheduled job on the log stream details page, the source log group/stream cannot be changed.
- The created job is displayed in the SQL scheduled job list. To view details, click the job name.
- On the Create SQL Scheduled Job page, configure required parameters.
- In the Configure Computation step, complete the following settings and click Next.
Table 1 Computation parameters
- Job Name: Name of the SQL scheduled job. Enter 1 to 64 characters, including letters, digits, hyphens (-), and underscores (_). The name cannot start or end with a hyphen or underscore.
- Description: Description of the SQL scheduled job. Enter up to 1000 characters.
- Source Log Group/Stream: Select a log group/stream for which log structuring has been enabled. Log content in the source log group/stream is stored in the target log group/stream after being processed by the SQL scheduled job.
- Statistics: By SQL (uses the old SQL engine).
- SQL Code: Enter the query and analysis statement to be executed each time the job runs; a sample statement follows this table. Specify a time range to query logs from, and click Preview to preview the result.
- Target Log Group/Stream: Log group/stream that stores the SQL analysis results.
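For orientation, here is a minimal sketch of an analysis statement. It is illustrative only: status is a hypothetical structured field (substitute a field from your own structuring configuration), and depending on the SQL engine in use, the source log stream may be referenced implicitly rather than through a FROM clause.

```sql
-- Sketch only: count log entries per status code within each SQL window.
-- "status" is a hypothetical structured field; replace it with a field
-- defined in your structuring configuration. No FROM clause is shown
-- because the source log stream may be implicit in the LTS SQL engine.
SELECT status, COUNT(*) AS request_count
GROUP BY status
ORDER BY request_count DESC
LIMIT 100
```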
- Configure the following scheduling parameters.
Table 2 Scheduling parameters
- Interval: Frequency of executing the SQL scheduled job. An execution instance is generated each time the job runs, and the interval determines when each instance runs. Options:
  - Hourly: The job is executed every hour.
  - Daily: The job is executed at a fixed time every day.
  - Weekly: The job is executed at a fixed time on a fixed day of the week.
  - Custom interval: The job is executed at a specified interval.
  - CRON: The job is executed as scheduled by a cron expression. Cron expressions use the 24-hour format and are precise down to the minute. Examples:
    - 0/10 * * * *: The query starts from 00:00 and runs every 10 minutes at 00:00, 00:10, 00:20, 00:30, 00:40, 00:50, 01:00, and so on. For example, if the current time is 16:37, the next query runs at 16:40.
    - 0 0/5 * * *: The query starts from 00:00 and runs every 5 hours at 00:00, 05:00, 10:00, 15:00, 20:00, and so on. For example, if the current time is 16:37, the next query runs at 20:00.
    - 0 14 * * *: The query runs at 14:00 every day.
    - 0 0 10 * *: The query runs at 00:00 on the 10th day of every month.
- Scheduling Time Range: Period during which the job is scheduled. Options:
  - Start time: Scheduling starts from a specified time, with no end time to set. If the job is deleted, no new instances are generated.
  - Time range: The execution times defined by the interval must fall within this range. Otherwise, no new instances are generated.
- Start Time: The time when the job is first executed. Set the start time or the time range based on the scheduling time range option you selected.
- SQL Window: Time window of logs analyzed by each execution of the SQL scheduled job. Only logs within this window are analyzed. The time expression can be precise to the second (s: second; m: minute; h: hour), and the maximum window is 24 hours. Select 5 minutes, 15 minutes, 1 hour, or 1 day from the drop-down list, or select Custom and enter a time expression in the format [+/-{num}h+/-{num}m+/-{num}s@s,+/-{num}@h), where:
  - [: The endpoint is included.
  - ): The endpoint is excluded.
  - +: Offsets forward from the current time. For example, if the current time is 16:00, +1h means 17:00. The plus sign (+) is not recommended.
  - -: Offsets backward from the current time. For example, if the current time is 16:00, -1h means 15:00.
  - {num}: An integer. The time difference between the two endpoints (before and after the comma) cannot exceed 24 hours.
  - @: Rounds down (aligns). @h rounds down to the whole hour, discarding the minute and second; @m rounds down to the whole minute, discarding the second; @s rounds down to the whole second.
  Example time expressions (a worked example follows Table 2):
  - 5 minutes: [-5m@m,@m)
  - 15 minutes: [-15m@m,@m)
  - 1 hour: [-1h@h,@h)
  - 1 day: [-24h@h,@h)
  - Custom (no more than one day): [-65m@h,-5m@h)
  NOTE:
  The time range covered by a time expression cannot exceed five times the scheduling interval.
- Timeout: Controls automatic retries when SQL analysis fails. An execution instance ends in the Failed state once it has reached the specified timeout duration the specified number of times.
  - Timeout: 60–180 seconds
  - Timeout times: 1–10
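To make the window arithmetic concrete, consider a worked example using the rounding rules above: suppose an instance runs at 16:37 with the custom expression [-65m@h,-5m@h). Subtracting 65 minutes gives 15:32, which @h rounds down to 15:00; subtracting 5 minutes gives 16:32, which rounds down to 16:00. That instance therefore analyzes logs in [15:00, 16:00).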
- Click OK.
Viewing a SQL Scheduled Job
- On the SQL Scheduled Jobs tab page, click the name of the target job to view its details, including its basic information and execution instances.
NOTE:
A maximum of 100 logs can be reported for each instance.
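To spot-check what a job has written, you can also query the target log stream directly. The following sketch reuses the hypothetical status field and request_count alias from the earlier example; replace them with your actual result fields.

```sql
-- Sketch: inspect the results a job wrote to the target log stream.
-- "status" and "request_count" are the hypothetical field and alias from
-- the earlier example. No FROM clause is shown because the stream may be
-- implicit when the query is run from the target stream's search page.
SELECT status, request_count
ORDER BY request_count DESC
LIMIT 10
```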
Modifying a SQL Scheduled Job
- On the SQL Scheduled Jobs tab page, locate the target job and click Modify in the Operation column.
- On the displayed page, modify the settings by referring to Table 1 and Table 2.
NOTE:
The SQL scheduled job name and source log group/stream names cannot be modified.
Deleting a SQL Scheduled Job
- On the SQL Scheduled Jobs tab page, locate the target job and click Delete in the Operation column.
Alternatively, click the name of the target job. On the displayed page, click Delete in the upper right corner.
- Click OK.