Convert a large CSV file to JSON using Python
I’ve been playing around with Python and the Google Ads API for some keyword planning automation. I ended up with a large CSV file that I needed to convert to a keyed JSON file, and wrote a quick Python script to do it that I figured I’d share with you.
Now, while I know it is very easy to work with CSV files directly in Python, I wanted to convert this file into a keyed JSON file instead.
Call me biased, but coming from the world of PHP and JavaScript, I personally think JSON files are easier on the eyes and can be used in places where CSV would be too complicated to work with: storing it in a database column (I do not recommend that), importing it into an HTML document, or even loading it into VueJS, React, etc. You get my point.
Anyway, I first tried a couple of those CSV-to-JSON converter sites that roam the internet, but quickly realized they all crash because the file is too big. So I figured, “if the mountain won’t come to Muhammad, then Muhammad must go to the mountain,” and wrote a quick Python script that you can run from the command line to convert a CSV file into a keyed JSON file.
import csv
import json
import os
import sys


def csv_to_json(csv_path, json_path):
    """
    Convert a CSV file to a JSON file.
    """
    # Read the CSV into a list of dicts, one per row, keyed by the header row
    with open(csv_path, 'r', newline='') as csv_file:
        reader = csv.DictReader(csv_file)
        data = list(reader)

    # Convert to a JSON string
    json_data = json.dumps(data, indent=4)

    # Create the parent directories of the output file if they don't exist
    json_dir = os.path.dirname(json_path)
    if json_dir:
        os.makedirs(json_dir, exist_ok=True)

    # Write the JSON to a file
    with open(json_path, 'w') as json_file:
        json_file.write(json_data)


def main():
    # Expect the input CSV path and the output JSON path as two arguments
    csv_to_json(sys.argv[1], sys.argv[2])


if __name__ == "__main__":
    main()
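To make the “keyed” part concrete, here is a minimal sketch of what the conversion produces: each CSV row becomes a JSON object keyed by the column headers. The file contents and column names below are made up purely for illustration.

```python
# Illustration only: a made-up two-row CSV and the JSON the script would write for it.
import csv
import io
import json

sample_csv = "keyword,volume\nrunning shoes,12000\ntrail shoes,4500\n"
rows = list(csv.DictReader(io.StringIO(sample_csv)))
print(json.dumps(rows, indent=4))
# [
#     {
#         "keyword": "running shoes",
#         "volume": "12000"
#     },
#     {
#         "keyword": "trail shoes",
#         "volume": "4500"
#     }
# ]
```

Note that every value comes out as a string, since that is what `csv.DictReader` hands back.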
How you use it
Copy the code section into a file like `csv_to_json.py`. Then run `python csv_to_json.py input_csv_file_path_and_name.csv output_file_path_and_name.json`.
You can use any folder depth for the output path; the script will automatically create all the folders needed.
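For example (the paths here are hypothetical), running `python csv_to_json.py exports/keywords.csv output/ads/2024/keywords.json` will create `output/ads/2024/` for you before writing the JSON file.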