Is there any way to import a JSON file (containing 100 documents) into an Elasticsearch server?

As dadoonet already mentioned, the bulk API is probably the way to go. To transform your file for the bulk protocol, you can use jq.

Assuming the file contains just the documents themselves:

$ echo '{"foo":"bar"}{"baz":"qux"}' | 
jq -c '
{ index: { _index: "myindex", _type: "mytype" } },
. '

{"index":{"_index":"myindex","_type":"mytype"}}
{"foo":"bar"}
{"index":{"_index":"myindex","_type":"mytype"}}
{"baz":"qux"}

And if the file contains the documents in a top-level list, they have to be unwrapped first:

$ echo '[{"foo":"bar"},{"baz":"qux"}]' | 
jq -c '
.[] |
{ index: { _index: "myindex", _type: "mytype" } },
. '

{"index":{"_index":"myindex","_type":"mytype"}}
{"foo":"bar"}
{"index":{"_index":"myindex","_type":"mytype"}}
{"baz":"qux"}

jq's -c flag makes sure that each document is on a line by itself.

If you want to pipe straight to curl, you'll want to use --data-binary @-, and not just -d, otherwise curl will strip the newlines again.
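
Putting it together, the whole pipeline might look like this (assuming Elasticsearch is listening on localhost:9200; adjust the index and type names to match yours):

$ echo '[{"foo":"bar"},{"baz":"qux"}]' |
jq -c '.[] | { index: { _index: "myindex", _type: "mytype" } }, .' |
curl -s -XPOST localhost:9200/_bulk --data-binary @-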


You should use the Bulk API. Note that you will need to add an action/metadata line before each JSON document.

$ cat requests
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
$ curl -s -XPOST localhost:9200/_bulk --data-binary @requests; echo
{"took":7,"items":[{"create":{"_index":"test","_type":"type1","_id":"1","_version":1,"ok":true}}]}

I'm sure someone wants this so I'll make it easy to find.

FYI: this uses Node.js (essentially as a batch script) on the same server as the brand-new ES instance. I ran it on two files with 4,000 items each and it took only about 12 seconds on my shared virtual server. YMMV

var elasticsearch = require('elasticsearch'),
    fs = require('fs'),
    pubs = JSON.parse(fs.readFileSync(__dirname + '/pubs.json')), // name of my first file to parse
    forms = JSON.parse(fs.readFileSync(__dirname + '/forms.json')); // and the second set
var client = new elasticsearch.Client({  // default is fine for me, change as you see fit
  host: 'localhost:9200',
  log: 'trace'
});

for (var i = 0; i < pubs.length; i++ ) {
  client.create({
    index: "epubs", // name your index
    type: "pub", // describe the data thats getting created
    id: i, // increment ID every iteration - I already sorted mine but not a requirement
    body: pubs[i] // *** THIS ASSUMES YOUR DATA FILE IS FORMATTED LIKE SO: [{prop: val, prop2: val2}, {prop:...}, {prop:...}] - I converted mine from a CSV so pubs[i] is the current object {prop:..., prop2:...}
  }, function(error, response) {
    if (error) {
      console.error(error);
      return;
    }
    else {
      console.log(response);  //  I don't recommend this but I like having my console flooded with stuff.  It looks cool.  Like I'm compiling a kernel really fast.
    }
  });
}

for (var a = 0; a < forms.length; a++ ) {  // Same stuff here, just slight changes in type and variables
  client.create({
    index: "epubs",
    type: "form",
    id: a,
    body: forms[a]
  }, function(error, response) {
    if (error) {
      console.error(error);
      return;
    }
    else {
      console.log(response);
    }
  });
}
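
One note: this issues one create call per document, which is fine for a few thousand items but gets slow as the files grow. The same client also wraps the bulk endpoint; here's a minimal, untested sketch reusing the index and data from above (the body alternates action/metadata objects and document sources, just like the HTTP _bulk API):

var bulkBody = [];
for (var i = 0; i < pubs.length; i++) {
  bulkBody.push({ index: { _index: "epubs", _type: "pub", _id: i } }); // action/metadata line
  bulkBody.push(pubs[i]);                                             // document source
}
client.bulk({ body: bulkBody }, function(error, response) {
  if (error) {
    console.error(error);
  } else {
    console.log("Indexed " + response.items.length + " documents");
  }
});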

Hope this helps more than just me. It's not rocket science, but it may save someone 10 minutes.

Cheers