
This article is a continuation of How to convert CSV to JSON in Node.JS.
We will follow the same pattern used there and introduce two solutions to our problem.
The first solution uses the standard readFile method: read the whole JSON file, transform the data once everything is in memory, and write the result to a new CSV file. This approach is not a good fit for large files, because the entire file has to fit in memory.
The second one is a little more advanced and uses streams. The advantage of this solution is that we read the file part by part, parse the JSON, transform each chunk to CSV and push the transformed data into a writable stream. This way we can read gigabytes of data without worrying about memory limits.
We will use the csvjson npm package to help us with the transformation from JSON to CSV, and JSONStream for the streams solution, as csvjson does not yet support JSON to CSV transformation using streams (CSV to JSON is supported).
The read/write operations will be handled using the native fs module.
Project preparation
Before continuing with the implementation of both methods, we will do a quick preparation by creating a new project using npm.
Navigate to the folder where you want to create your project, open the terminal and execute the following commands:
npm init -y
npm install csvjson --save
npm install jsonstream --save
The init command initializes the project, and the -y argument tells npm to use the default settings. It's usually better to configure the project according to your needs, but for our testing purposes the defaults are fine.
The other two commands install the dependencies we will be using – csvjson and jsonstream.
When the execution is finished, our main project structure should look like this:
/node_modules
package.json
package-lock.json
The last thing before we start writing some code is to create a JSON file that will be used to test the solutions.
Create a file test-data.json and fill it with the following data:
[ { "id":"1", "name":"John Smith", "position":"Manager" }, { "id":"2", "name":"Johny Bravo", "position":"Employee" }, { "id":"3", "name":"Peter", "position":"N/A" } ]
This is a pretty simple, non-nested example. However, the transformation library we will be using is smart enough to handle nested data as well. Read its documentation for more info if that is your case.
Convert JSON to CSV using readFile
Create a file with name read-file.js in your main project directory.
Include the following code to the file:
const csvjson = require('csvjson');
const readFile = require('fs').readFile;
const writeFile = require('fs').writeFile;

readFile('./test-data.json', 'utf-8', (err, fileContent) => {
  if (err) {
    console.log(err); // Do something to handle the error or just throw it
    throw err;
  }

  const csvData = csvjson.toCSV(fileContent, { headers: 'key' });

  writeFile('./test-data.csv', csvData, (err) => {
    if (err) {
      console.log(err); // Do something to handle the error or just throw it
      throw err;
    }
    console.log('Success!');
  });
});
This is the final, working version. We will explore it line by line.
Importing the required dependencies at the beginning of your file is a common practice in Node.JS. In this example, we import the external library we will be using, plus the readFile and writeFile methods provided by the built-in fs module.
Then we read the JSON file using readFile. The first argument is the path to our test file, the second is the encoding (an important one – without specifying the encoding we receive a Buffer instead of a string), and the third is a callback function that receives two parameters: an error (if any) and the actual file content.
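As a quick illustration of the encoding point (a standalone sketch, not part of the article's code), this is what you get when the encoding is omitted:

const { readFile } = require('fs');

// Without an encoding argument, the callback receives a raw Buffer
readFile('./test-data.json', (err, content) => {
  if (err) throw err;
  console.log(Buffer.isBuffer(content)); // true
  console.log(content.toString('utf-8')); // convert to a string manually
});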
I'm sure you are aware of how important it is to catch errors and handle them properly. That is not the aim of this article, but keep it in mind.
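For example, one common pattern (a sketch, not the article's approach) is to handle the errors you expect and rethrow the rest, using the code property that Node.JS attaches to fs errors:

// Inside the readFile callback – handle expected errors, rethrow the rest
if (err) {
  if (err.code === 'ENOENT') {
    console.error('Input file not found: ' + err.path);
    return;
  }
  throw err;
}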
When there is no error and the content is available, csvjson helps us convert it to CSV via its toCSV method. We pass the JSON data as the first argument and an options object as the second. We use only a single option – headers: 'key' – which tells the library to use the object keys as headers.
Without headers: 'key':
[].id,[].name,[].position
1,John Smith,Manager
2,Johny Bravo,Employee
3,Peter,N/A
With headers: 'key':
id,name,position
1,John Smith,Manager
2,Johny Bravo,Employee
3,Peter,N/A
The second variant looks a little bit better to me 🙂
And now we can write the transformed data to a CSV file using the writeFile method. As with readFile, the first argument is the file path (the file will be created automatically), the second is the data and the last one is a callback.
However, this callback expects only one parameter – an error. If it is null, the operation was successful.
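As a side note, on Node.JS 10 or newer the same flow can be written with the promise-based fs API and async/await. This is just an alternative sketch, not the version we will be testing:

const csvjson = require('csvjson');
const { readFile, writeFile } = require('fs').promises;

async function convert() {
  const fileContent = await readFile('./test-data.json', 'utf-8');
  const csvData = csvjson.toCSV(fileContent, { headers: 'key' });
  await writeFile('./test-data.csv', csvData);
  console.log('Success!');
}

convert().catch(console.error);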
Let’s test it.
Execute this command in the terminal:
node ./read-file.js
Check your directory. A new file test-data.csv will be created.
Open it using your IDE or your favorite CSV viewer. All the data from your JSON file should be there 🙂
Any questions? Feel free to ask in the comments.
We will continue with the solution using streams.
Convert JSON to CSV using Streams
For this solution we will use the same test file created in the previous step – test-data.json. You can always add more data to it to see the real power of streams.
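If you want to generate a bigger input, a throwaway helper like this hypothetical generate-data.js (not part of the article's code) will do the job:

// generate-data.js – overwrites test-data.json with 50,000 rows
const { writeFileSync } = require('fs');

const rows = [];
for (let i = 1; i <= 50000; i++) {
  rows.push({ id: String(i), name: 'Person ' + i, position: 'Employee' });
}
writeFileSync('./test-data.json', JSON.stringify(rows));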
Create a file read-stream.js and include the following code:
const csvjson = require('csvjson');
const fs = require('fs');
const stream = require('stream');
const JSONStream = require('jsonstream');

const createReadStream = fs.createReadStream;
const createWriteStream = fs.createWriteStream;

let line = 0;
const transformJsonToCSV = new stream.Transform({
  transform: function transformer(chunk, encoding, callback) {
    line++;
    // Include the headers only for the first object we receive
    callback(null, csvjson.toCSV(chunk, { headers: line > 1 ? 'none' : 'key' }));
  },
  readableObjectMode: true,
  writableObjectMode: true,
});

createReadStream('./test-data.json', 'utf-8')
  .pipe(JSONStream.parse('*'))
  .pipe(transformJsonToCSV)
  .pipe(createWriteStream('./test-data-output-stream.csv'));
This is the complete solution for this example. As you can see, it is a little more complex than the previous one. One of the reasons is that csvjson provides a complete streaming solution for CSV to JSON, but not for JSON to CSV.
Basically, we implement our own solution with the help of the JSONStream package. You may wonder why we need it. When a readable stream is created for a JSON file, the stream sends chunks without caring whether the data can be parsed or whether a chunk contains a complete object.
It's absolutely normal to receive a chunk like this:
[ { "id":"1", "name":"John Smith", "position":"Manager" }, { "id":"2",
The csvjson method toCSV does a JSON.parse() on the received data behind the scenes and, of course, throws an error, since this is not valid JSON. JSONStream.parse('*') fixes that: it takes care of the chunks received from the read stream and provides the streaming data in a correct format by passing the objects to the transform method one by one.
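You can verify this behavior with a small experiment (a sketch using only the pieces we already installed) – each 'data' event delivers one complete, already-parsed array element:

const JSONStream = require('jsonstream');
const { createReadStream } = require('fs');

createReadStream('./test-data.json', 'utf-8')
  .pipe(JSONStream.parse('*'))
  .on('data', (obj) => console.log(obj)); // { id: '1', name: 'John Smith', ... }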
One more thing I didn't cover yet, which you can see in the code, is the small piece of logic around the line number. We pass the headers option as 'key' for the first line and 'none' for all the others. This includes the headers only on the first line of our CSV, rather than before every row.
I have tested this solution with a JSON array containing over 50,000 objects and I would say it works as expected. However, JSON streaming is a pretty new thing to me as well, and a more performant, optimized approach may exist.
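If you want to check the memory behavior yourself, one option is to cap V8's heap with the --max-old-space-size flag and run both scripts against a large input. The streaming version should still finish, while the readFile version will fail once the file no longer fits in memory:

node --max-old-space-size=64 ./read-stream.js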
If you just want to use a full solution from a single library, without parsing and transforming the data yourself, you can check out the streaming solution provided by json2csv.
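For reference, a minimal sketch of that approach, based on json2csv's documented streaming API, might look like this (double-check the current docs, as the API has changed between major versions; the output file name is my own choice):

const { Transform } = require('json2csv');
const { createReadStream, createWriteStream } = require('fs');

const fields = ['id', 'name', 'position'];
const input = createReadStream('./test-data.json', { encoding: 'utf-8' });
const output = createWriteStream('./test-data-json2csv.csv', { encoding: 'utf-8' });
const json2csv = new Transform({ fields });

input.pipe(json2csv).pipe(output);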
Please let me know in the comments section in case of questions or suggestions for improvement.
Conclusion
You just learned two ways to convert JSON to CSV (comma-separated values) using Node.JS and the lightweight but powerful libraries csvjson and jsonstream.
The full code is here – https://github.com/vikobg/first-class-js/tree/master/nodejs-json-to-csv
Click here to read How to convert CSV to JSON in Node.JS
Good luck!

Viktor Borisov is a full-stack JavaScript Developer and teaching enthusiast. His specialties are vanilla JS, Node.JS, AWS and Angular.