Using DIS to Import Local Data to Elasticsearch
You can use DIS to upload log data stored on a local Windows PC to a DIS stream, and then use CDM to migrate the data to Elasticsearch in CSS. In this way, you can efficiently manage and query logs through Elasticsearch. Data files can be in JSON or CSV format.
Figure 1 shows the data transmission process.
Procedure
- Log in to the DIS management console.
- Purchase a DIS stream.
For details, see "Creating a DIS Stream" in the Data Ingestion Service User Guide.
- Install and configure DIS Agent.
For details, see "Installing DIS Agent" and "Configuring DIS Agent" in Data Ingestion Service User Guide.
- Start DIS Agent and upload the collected local data to the DIS stream.
For details, see "Starting DIS Agent" in the Data Ingestion Service User Guide.
For example, upload the following data to a DIS stream using the DIS Agent:
{"logName":"aaa","date":"bbb"}
{"logName":"ccc","date":"ddd"}
{"logName":"eee","date":"fff"}
{"logName":"ggg","date":"hhh"}
{"logName":"mmm","date":"nnn"}
- Log in to the CSS management console.
- In the navigation pane on the left, choose Clusters > Elasticsearch to switch to the Clusters page.
- From the cluster list, locate the row that contains the cluster to which you want to import data, and click Access Kibana in the Operation column.
- In the Kibana navigation pane on the left, choose Dev Tools.
- On the Console page, run the following command to create an index for the data to be stored and specify a custom mapping to define the data type:
If an index is already available in the cluster where you want to import data, skip this step. If no index is available, create one by referring to the following sample code.
For example, on the Console page, run the following command to create an index named apache and specify a custom mapping to define the data type:
Versions earlier than 7.x
PUT /apache { "settings": { "number_of_shards": 1 }, "mappings": { "logs": { "properties": { "logName": { "type": "text", "analyzer": "ik_smart" }, "date": { "type": "keyword" } } } } }
Versions 7.x and later (mapping types were removed in Elasticsearch 7.0, so the mapping no longer contains the logs type)
PUT /apache { "settings": { "number_of_shards": 1 }, "mappings": { "properties": { "logName": { "type": "text", "analyzer": "ik_smart" }, "date": { "type": "keyword" } } } }
The command is executed successfully if the following information is displayed:
{ "acknowledged" : true, "shards_acknowledged" : true, "index" : "apache" }
- Log in to the CDM management console.
- Purchase a CDM cluster.
For details, see "Creating a Cluster" in the Cloud Data Migration User Guide.
- Create a link between CDM and CSS.
For details, see "Creating a Link" in the Cloud Data Migration User Guide.
- Create a link between CDM and DIS.
For details, see "Creating a Link" in the Cloud Data Migration User Guide.
- Create a job on the purchased CDM cluster and migrate the data from the DIS stream to the target cluster in CSS.
For details, see "Table/File Migration" in the Cloud Data Migration User Guide.
- On the Console page of Kibana, search for the imported data.
Run the following command to search for the data. If the returned data is consistent with the imported data, the data has been imported successfully.
GET apache/_search
The command is executed successfully if the following information is displayed:
{
  "took": 81,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 5,
    "max_score": 1,
    "hits": [
      {
        "_index": "apache",
        "_type": "logs",
        "_id": "txfbqnEBPuwwWJWL-qvP",
        "_score": 1,
        "_source": {
          "logName": "aaa",
          "date": "bbb"
        }
      },
      {
        "_index": "apache",
        "_type": "logs",
        "_id": "uBfbqnEBPuwwWJWL-qvP",
        "_score": 1,
        "_source": {
          "logName": "ccc",
          "date": "ddd"
        }
      },
      {
        "_index": "apache",
        "_type": "logs",
        "_id": "uRfbqnEBPuwwWJWL-qvP",
        "_score": 1,
        "_source": {
          "logName": "eee",
          "date": "fff"
        }
      },
      {
        "_index": "apache",
        "_type": "logs",
        "_id": "uhfbqnEBPuwwWJWL-qvP",
        "_score": 1,
        "_source": {
          "logName": "ggg",
          "date": "hhh"
        }
      },
      {
        "_index": "apache",
        "_type": "logs",
        "_id": "uxfbqnEBPuwwWJWL-qvP",
        "_score": 1,
        "_source": {
          "logName": "mmm",
          "date": "nnn"
        }
      }
    ]
  }
}
apache indicates the name of the index created in the preceding steps. Set this parameter based on site requirements.
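In addition to retrieving all documents, you can narrow the search with a standard Elasticsearch match query. For example, the following request (a generic query DSL example, not specific to this guide) returns only the documents whose logName field matches aaa:
GET apache/_search
{
  "query": {
    "match": {
      "logName": "aaa"
    }
  }
}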