How to convert CSV to JSON in Node.JS

Reading a CSV file and parsing it to JSON is an easy-peasy job in Node.JS.

Something important to consider before implementing the solutions below is the size of your source file. If you have a huge CSV file with a lot of records (rows), you will probably have to use the solution with streams. Otherwise, you can go with the “normal” one: reading the whole file and, once everything is read, parsing the records.

As you may have already guessed, we will implement both solutions. I will use the npm package csvjson to help us parse the file. We could always implement a parser ourselves, but csvjson provides us with useful and easy-to-use methods for both approaches.
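To give you an idea of what a hand-rolled parser would involve, here is a minimal sketch (an illustration only, assuming a simple comma-only CSV with no quoted fields — exactly the kind of edge cases csvjson handles for us):

```javascript
// Minimal CSV-to-objects parser: the first line is the header,
// the remaining lines are the records.
// Note: this naive version breaks on quoted fields containing commas,
// which is one reason to prefer a dedicated library like csvjson.
function parseCsv(text) {
    const lines = text.trim().split('\n');
    const headers = lines[0].split(',').map((h) => h.trim());
    return lines.slice(1).map((line) => {
        const values = line.split(',').map((v) => v.trim());
        const record = {};
        headers.forEach((header, i) => {
            record[header] = values[i];
        });
        return record;
    });
}

console.log(parseCsv('id,name\n1,John Smith\n2,Johny Bravo'));
// [ { id: '1', name: 'John Smith' }, { id: '2', name: 'Johny Bravo' } ]
```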

Project preparation

First, we will create a new Node project and install the required dependency (yes, only one).

Navigate to the folder where you want to create your project and execute the following commands in the terminal:

npm init -y
npm install csvjson --save

The first command initializes a node project without asking any questions (-y tells npm to use the default init settings). The second one installs the csvjson package, which will help us with the parsing.

Now, your main project directory should contain the following files and folders: node_modules, package.json and package-lock.json.
There is only one more step and we are done with the preparation – to create the source CSV file which we will use to test our solutions.

Create a file with the name test-data.csv having the following content:

id,name,position
1,John Smith,Manager
2,Johny Bravo,Employee
3,Peter,N/A

This is example data that contains header information (the first line) and the actual data (the rest of the lines).

Whew … we are ready to focus on the essential part 🙂

Convert CSV to JSON using readFile

For this solution, we will create a file read-file.js in our main project folder.

Include the following code to this file:

const csvjson = require('csvjson');
const readFile = require('fs').readFile;

readFile('./test-data.csv', 'utf-8', (err, fileContent) => {
    if (err) {
        console.log(err); // Do something to handle the error or just throw it
        throw new Error(err);
    }

    const jsonObj = csvjson.toObject(fileContent);
    console.log(jsonObj);
});

This is the complete, working code. I will explain it line by line.

First, we import the required dependencies – the external csvjson package and the built-in file system (fs) module (we will be using readFile only, which is why we import just this method).

The next part of the code is the main logic. We pass three parameters to the readFile function: the first one is the location of our CSV source file, the second one is the encoding (it's important to provide it, otherwise we will receive a Buffer instead of a string as a response and will not be able to parse it) and the last one is the callback function. There, we expect to receive two parameters (an error, if any, and the actual file content).

In case there is an error while reading the file (one of the reasons can be that the file doesn't exist), we should be ready to catch it and do something with it. Proper error handling is outside the scope of this article.

In case there is no error, we are ready to pass the content to the csvjson library, which will parse it and return an array of objects, and then print the result to the console.

Run our code with:

node ./read-file.js

The actual result printed on your console should be:

[ { id: '1', name: 'John Smith', position: 'Manager' },
  { id: '2', name: 'Johny Bravo', position: 'Employee' },
  { id: '3', name: 'Peter', position: 'N/A' } ]

You should also be aware that the toObject method used above accepts a second parameter (of type Object) containing the parsing options. Without passing it (like we do), it uses the default options. To get familiar with them, you can check the library documentation.

Any questions? Feel free to ask in the comments 🙂

Let’s continue with the solution using streams.

Convert CSV to JSON using Streams

As we will be using a read stream to read the file, there is no sense in saving the data in a variable.

That's why we will also use a write stream to write the transformed data back to disk, in a file called test-data-output-stream.json.

Create a file read-stream.js in the main directory and include the following code:

const createReadStream = require('fs').createReadStream;
const createWriteStream = require('fs').createWriteStream;
const csvjson = require('csvjson');

const toObject = csvjson.stream.toObject();
const stringify = csvjson.stream.stringify();

createReadStream('./test-data.csv', 'utf-8')
    .pipe(toObject)
    .pipe(stringify)
    .pipe(createWriteStream('./test-data-output-stream.json'));

The first two lines import the built-in read/write stream functionality from the file system module(fs).

Then we take advantage of the csvjson library again. Instead of writing transformation streams ourselves, we will use the ones provided by the library. Behind the scenes, both toObject and stringify are Transform streams.

The first one helps us parse the plain text extracted from the source CSV, while the second one stringifies the result of the first, so we can write it to the JSON output file.

All of these functionalities are combined using the pipe method. In short, this method reads data from a readable stream as it becomes available and writes it to a destination writable stream.

Read more about pipe in the official Node.JS documentation.

Run the code with:

node ./read-stream.js

After the execution, you should see a test-data-output-stream.json file in your main directory with the following content:

[{"id":"1","name":"John Smith","position":"Manager"},
{"id":"2","name":"Johny Bravo","position":"Employee"},
{"id":"3","name":"Peter","position":"N/A"}]


You just learned two ways to convert CSV (comma-separated values) to JSON in Node.JS, using the lightweight but powerful csvjson library.

The full code is here –

Click here to read How to convert JSON to CSV in Node.JS

Good luck!