Using Kibana or APIs to Import Data to Elasticsearch

Updated on 2023-06-20 GMT+08:00

You can import data in various formats, such as JSON and CSV, to Elasticsearch in CSS by using Kibana or APIs.
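The examples below work with JSON directly. For CSV data, one simple approach is to first convert the rows into the newline-delimited _bulk format. The following Python sketch is illustrative only (the index name and CSV field names are assumptions, not CSS requirements):

```python
import csv
import io
import json

def csv_to_bulk(csv_text, index_name):
    """Convert CSV rows into Elasticsearch bulk (NDJSON) request lines.

    Each document becomes an action line ({"index": ...}) followed by a
    source line, which is the layout the _bulk API expects.
    """
    lines = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        lines.append(json.dumps({"index": {"_index": index_name}}))
        lines.append(json.dumps(row))
    # The bulk API requires a newline after the last line.
    return "\n".join(lines) + "\n"
```

The returned string can be pasted into the Kibana Console after a POST /my_store/_bulk line, or saved to a file and sent with curl as shown later on this page.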

Importing Data Using Kibana

Before importing data, ensure that you can use Kibana to access the cluster. The following procedure shows how to import data using a POST _bulk request.
  1. Log in to the CSS management console.
  2. In the navigation pane on the left, choose Clusters > Elasticsearch to switch to the Clusters page.
  3. Locate the target cluster and click Access Kibana in the Operation column to log in to Kibana.
  4. Click Dev Tools in the navigation tree on the left.
  5. (Optional) On the Console page, run the related command to create an index for storing data and specify a custom mapping to define the data type.

    If there is an available index in the cluster where you want to import data, skip this step. If there is no available index, create an index by referring to the following sample code.

    For example, on the Console page of Kibana, run the following command to create an index named my_store and specify a user-defined mapping to define the data type:

    Versions earlier than 7.x
    PUT /my_store
    {
        "settings": {
            "number_of_shards": 1
        },
        "mappings": {
            "products": {
                "properties": {
                    "productName": {
                        "type": "text"
                    },
                    "size": {
                        "type": "keyword"
                    }
                }
            }
        }
    }

    Versions 7.x and later

    PUT /my_store
    {
        "settings": {
            "number_of_shards": 1
        },
        "mappings": {
            "properties": {
                "productName": {
                    "type": "text"
                },
                "size": {
                    "type": "keyword"
                }
            }
        }
    }
  6. Run commands to import data. For example, run the following command to import a single document:
    Versions earlier than 7.x
    POST /my_store/products/_bulk 
    {"index":{}} 
    {"productName":"Latest art shirts for women in 2017 autumn","size":"L"}

    Versions 7.x and later

    POST /my_store/_bulk  
    {"index":{}}  
    {"productName":"Latest art shirts for women in 2017 autumn","size":"L"}

    The command output is similar to that shown in Figure 1. If the value of the errors field in the result is false, the data is successfully imported.

    Figure 1 Response message
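A false errors field means every item succeeded; when it is true, the per-item results in the items array show which documents failed and why. A small helper for checking a parsed _bulk response (a sketch, not a CSS or Elasticsearch API):

```python
def failed_bulk_items(response):
    """Return (position, error) pairs for failed items in a _bulk response.

    `response` is the parsed JSON body returned by the _bulk API; when its
    top-level "errors" field is false, every item succeeded.
    """
    if not response.get("errors"):
        return []
    failures = []
    for pos, item in enumerate(response.get("items", [])):
        # Each item nests its result under the action name ("index" here).
        result = item.get("index", {})
        if "error" in result:
            failures.append((pos, result["error"]))
    return failures
```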

Importing Data Using APIs

You can call the bulk API using the cURL command to import a JSON data file.

NOTE:

You are advised to import a file smaller than 50 MB.
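One way to stay under this limit is to split a large bulk file into several smaller requests. The sketch below assumes the file follows the action/source pair layout shown in the examples on this page; the 50 MB default mirrors the note above:

```python
def split_bulk_lines(lines, max_bytes=50 * 1024 * 1024):
    """Split bulk NDJSON lines into chunks of at most max_bytes each.

    Lines come in action/source pairs ({"index": ...} then the document),
    and a pair is never split across two chunks.
    """
    chunks, current, size = [], [], 0
    for action, source in zip(lines[::2], lines[1::2]):
        pair_size = len(action.encode()) + len(source.encode()) + 2  # newlines
        if current and size + pair_size > max_bytes:
            chunks.append(current)
            current, size = [], 0
        current.extend([action, source])
        size += pair_size
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be written to its own file and imported with a separate curl call.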

  1. Log in to the ECS that you use to access the cluster.
  2. Run the following command to import JSON data:
    In the command, replace {Private network address and port number of the node} with the private network address and port number of a node in the cluster, in the IP address:port format. If that node is faulty, the command fails. If the cluster contains multiple nodes, you can use the private network address and port number of any available node instead; if the cluster contains only one node, restore the node and run the command again. test.json indicates the JSON file whose data is to be imported.
    curl -X PUT "http://{Private network address and port number of the node}/_bulk" -H 'Content-Type: application/json' --data-binary @test.json
    NOTE:

    In this command, -X specifies the request method (PUT), -H specifies a request header ('Content-Type: application/json'), and --data-binary specifies the file whose content is sent as the request body (@test.json). Do not add -k between a parameter and its value.

    Example: Assume that you need to import data in the testdata.json file to an Elasticsearch cluster, that communication encryption is disabled for the cluster, and that the private network address and port number of one node are 192.168.0.90 and 9200, respectively. The data in the testdata.json file is as follows:

    Versions earlier than 7.x

    {"index": {"_index":"my_store","_type":"products"}}
    {"productName":"Autumn new woman blouses 2019","size":"M"}
    {"index": {"_index":"my_store","_type":"products"}}
    {"productName":"Autumn new woman blouses 2019","size":"L"}

    Versions 7.x and later

    {"index": {"_index":"my_store"}}
    {"productName":"Autumn new woman blouse 2019","size":"M"}
    {"index": {"_index":"my_store"}}
    {"productName":"Autumn new woman blouse 2019","size":"L"}

    Perform the following steps to import the data:

    1. Run the following command to create an index named my_store:
      Versions earlier than 7.x
      curl -X PUT http://192.168.0.90:9200/my_store -H 'Content-Type: application/json' -d '
      {
          "settings": {
              "number_of_shards": 1
          },
          "mappings": {
              "products": {
                  "properties": {
                      "productName": {
                          "type": "text"
                      },
                      "size": {
                          "type": "keyword"
                      }
                  }
              }
          }
      }'

      Versions 7.x and later

      curl -X PUT http://192.168.0.90:9200/my_store -H 'Content-Type: application/json' -d '
      {
          "settings": {
              "number_of_shards": 1
          },
          "mappings": {
              "properties": {
                  "productName": {
                      "type": "text"
                  },
                  "size": {
                      "type": "keyword"
                  }
              }
          }
      }'
    2. Run the following command to import the data in the testdata.json file:
      curl -X PUT "http://192.168.0.90:9200/_bulk" -H 'Content-Type: application/json' --data-binary @testdata.json
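The curl command above is a plain HTTP request, so the same import can be scripted with Python's standard library. A sketch that only builds the request object (the address and file name follow the example above; actually sending it requires network access to the cluster):

```python
import urllib.request

def build_bulk_request(node_addr, data_path):
    """Build the HTTP request equivalent to the curl command above.

    node_addr is the node's private address and port (e.g. "192.168.0.90:9200");
    data_path is the bulk NDJSON file (testdata.json in the example).
    The request can then be sent with urllib.request.urlopen().
    """
    with open(data_path, "rb") as f:
        body = f.read()  # sent unmodified, like curl's --data-binary
    return urllib.request.Request(
        url=f"http://{node_addr}/_bulk",
        data=body,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
```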
