Importing Logs of Self-built ELK to LTS
Solution Overview
ELK is an acronym for Elasticsearch, Logstash, and Kibana. Together, these three tools form one of the most widely used log analysis and visualization solutions in the industry.
- Elasticsearch is an open-source, distributed, and RESTful search and analysis engine based on Lucene.
- Logstash is an open-source data processing pipeline on the server side. It allows you to collect and transform data from multiple sources in real time, and then send the data to your repository. It is usually used to collect, filter, and forward logs.
- Kibana is an open-source data analysis and visualization platform that lets you create dashboards and search and query data. It is usually used together with Elasticsearch.
LTS outperforms the ELK solution in terms of functionality, cost, and performance. For an in-depth comparison, see What Are the Advantages of LTS Compared with Self-built ELK Stack? This section describes how to use custom Python scripts and ICAgent to migrate logs from Elasticsearch to LTS.
ICAgent can be installed on ECSs to collect their log files. Using Python scripts, you can flush Elasticsearch data to log files on an ECS, and then collect the flushed files to LTS through its log ingestion function.

Importing Logs of Self-built ELK to LTS
1. Log in to the LTS console.
2. Install ICAgent on the ECS.
3. Configure ECS log ingestion on the LTS console. For details, see Ingesting ECS Text Logs to LTS.
4. Prepare for script execution. The following examples are for reference only; replace the values with your actual information.
- If this is your first time using Python, install the Python environment.
- If this is your first time using Elasticsearch, install the Python client package matching your Elasticsearch version. Elasticsearch 7.10.1 is used in this example:
pip install elasticsearch==7.10.1
- The Elasticsearch cluster used in this example was created with Huawei Cloud Search Service (CSS).
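Before running the scripts, you can optionally verify that the ECS can reach the Elasticsearch cluster. The following is a minimal sketch, assuming the example endpoint http://127.0.0.1:9200 used throughout this walkthrough:
from elasticsearch import Elasticsearch

# Example endpoint from this walkthrough; replace with your cluster URL
es = Elasticsearch("http://127.0.0.1:9200")
# ping() returns True if the cluster is reachable; info() returns the cluster name and version
print(es.ping())
print(es.info())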
5. Run the Python script that constructs index data. If the index already contains data, skip this step and go to 6.
The script must be executed on the ECS and saved as xxx.py. The following is an example of constructing data:
Modify the following fields as required. In this example, 1,000 data records with the content This is a test log,Hello world!!!\n are inserted.
- index: name of the index to be created. It is test in this example.
- es: URL for accessing Elasticsearch. It is http://127.0.0.1:9200 in this example.
from elasticsearch import Elasticsearch

def createIndex(index):
    mappings = {
        "properties": {
            "content": {
                "type": "text"
            }
        }
    }
    # In the 7.x Python client, mappings are passed inside the request body
    es.indices.create(index=index, body={"mappings": mappings})

def reportLog(index):
    # Insert 1,000 test records one by one
    for _ in range(1000):
        body = {"content": "This is a test log,Hello world!!!\n"}
        es.index(index=index, body=body)

if __name__ == '__main__':
    # Index name
    index = 'test'
    # Link to Elasticsearch
    es = Elasticsearch("http://127.0.0.1:9200")
    createIndex(index)
    reportLog(index)
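If you need to insert a larger volume of test data, indexing records one by one is slow. As an optional alternative, the elasticsearch.helpers.bulk API sends documents in batches; this sketch assumes the same index name and endpoint as the example above:
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://127.0.0.1:9200")
# Generate 1,000 bulk actions for the example index 'test'
actions = (
    {"_index": "test", "_source": {"content": "This is a test log,Hello world!!!\n"}}
    for _ in range(1000)
)
helpers.bulk(es, actions)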
6. Create the Python read/write script that writes the Elasticsearch data to the disk. The output file path must be the same as the collection path configured in the log ingestion rule.
The script must be executed on the ECS and saved as xxx.py. The following is an example of the script for writing data to the disk:
Modify the following fields as required.
- index: index name. It is test in this example.
- pathFile: absolute path for writing data to the disk. It is /tmp/test.log in this example.
- scroll_size: number of hits returned by each scroll query. It is 100 in this example.
- es: URL for accessing Elasticsearch. It is http://127.0.0.1:9200 in this example.
from elasticsearch import Elasticsearch

def writeLog(res, pathFile):
    # Append the content field of each hit to the output file
    data = res.get('hits').get('hits')
    with open(pathFile, 'a', encoding='UTF-8') as file:
        for hit in data:
            file.write(hit.get('_source').get('content'))

if __name__ == '__main__':
    # Index name
    index = 'test'
    # Output file path
    pathFile = '/tmp/' + index + '.log'
    # Size of the scroll query. The default value is 100.
    scroll_size = 100
    # Link to Elasticsearch
    es = Elasticsearch("http://127.0.0.1:9200")
    init = True
    while True:
        if init:
            # The first request opens a scroll context kept alive for 1 minute
            res = es.search(index=index, scroll="1m", body={"size": scroll_size})
            init = False
        else:
            # Subsequent requests page through the results with the scroll ID
            scroll_id = res.get("_scroll_id")
            res = es.scroll(scroll="1m", scroll_id=scroll_id)
        if not res.get('hits').get('hits'):
            break
        writeLog(res, pathFile)
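When the export loop ends, the scroll context stays open until its one-minute timeout expires. Optionally, you can release it immediately; this sketch assumes the loop above has just finished and res still holds the last response:
# Release the scroll context instead of waiting for the 1m timeout
scroll_id = res.get("_scroll_id")
if scroll_id:
    es.clear_scroll(scroll_id=scroll_id)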
7. Ensure that Python has been installed, and run the following command on the ECS to write the Elasticsearch index data to the disk:
python xxx.py
8. Check whether the data was successfully queried and written to the disk.
In this example, the data is written to /tmp/test.log; replace it with your actual path. Run the following command to check whether the data has been written to the disk:
tail -f /tmp/test.log
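If the export succeeded, the output consists of the test records inserted in 5, for example:
This is a test log,Hello world!!!
This is a test log,Hello world!!!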
9. Log in to the LTS console. On the Log Management page, click the target log stream to go to its details page. If log data is displayed on the Log Search tab page, log collection is successful.