How to load very large CSV files in Node.js?

fs.readFile loads the entire file into memory, while fs.createReadStream reads it in chunks (64 KB by default, configurable via the highWaterMark option).

This keeps memory usage roughly constant, so the process won't run out of memory on large files.
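For illustration, a minimal sketch of chunked reading with an explicit chunk size (the path and the 64 KB value are placeholders, not requirements):

var fs = require('fs')

// Read the file in 64 KB chunks instead of loading it all at once.
var stream = fs.createReadStream('path/to/my/data.csv', {
  encoding: 'utf8',
  highWaterMark: 64 * 1024 // chunk size in bytes
})

stream.on('data', function (chunk) {
  // Each chunk is a string of at most 64 KB; process it and move on,
  // so the whole file never sits in memory at once.
})

stream.on('end', function () {
  console.log('Finished reading')
})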


You may want to stream the CSV instead of reading it all at once; a minimal sketch with csv-parse follows the list below:

  • csv-parse has streaming support: http://csv.adaltas.com/parse/
  • or take a look at csv-stream: https://www.npmjs.com/package/csv-stream
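For example, a minimal sketch of the csv-parse streaming API (assuming csv-parse v5+, where parse is a named export; the file path is a placeholder):

var fs = require('fs')
var parse = require('csv-parse').parse

fs.createReadStream('path/to/my/data.csv')
  .pipe(parse({ columns: true })) // map each row to an object keyed by the header line
  .on('data', function (row) {
    // handle one row at a time; the full file is never held in memory
  })
  .on('end', function () {
    console.log('Done parsing')
  })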

Streaming works perfectly; it took only 3-5 seconds:

var fs = require('fs')
var csv = require('csv-parser')
var data = []

fs.createReadStream('path/to/my/data.csv')
  .pipe(csv()) // parse each line into an object keyed by the header row
  .on('data', function (row) {
    data.push(row)
  })
  .on('end', function () {
    console.log('Data loaded')
  })

Tags:

csv

node.js