Split a large JSON file into multiple smaller files

I have a large JSON file (about 5 million records, roughly 32GB) that I need to load into our Snowflake data warehouse. I need to break the file into chunks of about 200k records (roughly 1.25GB each). I'd like to do this in either Node.js or Python for deployment to an AWS Lambda function; unfortunately I haven't coded in either yet. I have C# and a lot of SQL experience, and learning both Node and Python is on my to-do list, so why not dive right in, right!?
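For what it's worth, here is a minimal Python sketch of the splitting step. It assumes the file is newline-delimited JSON (one record per line) so it can be streamed without loading 32GB into memory; the `split_ndjson` name, output directory, and chunk-file naming are all made up for illustration. If the file is instead one giant JSON array, you'd need a streaming parser (e.g. the `ijson` library) rather than line-by-line reads.

```python
import os

def split_ndjson(src_path, out_dir, records_per_file=200_000):
    """Stream a newline-delimited JSON file into numbered chunk files.

    Reads one line (one record) at a time, so memory use stays flat
    regardless of the source file's size. Returns the chunk file paths.
    """
    os.makedirs(out_dir, exist_ok=True)
    out = None
    chunk = 0
    count = 0  # records written so far
    written = []
    with open(src_path, "r", encoding="utf-8") as src:
        for line in src:
            if not line.strip():
                continue  # skip blank lines
            if count % records_per_file == 0:
                # Start a new chunk file every records_per_file records.
                if out:
                    out.close()
                chunk += 1
                path = os.path.join(out_dir, f"chunk_{chunk:04d}.json")
                written.append(path)
                out = open(path, "w", encoding="utf-8")
            out.write(line)
            count += 1
    if out:
        out.close()
    return written
```

With the defaults, a 5M-record file would yield 25 chunk files. Note that this copies lines verbatim and never parses the JSON, which keeps it fast; it only works because each record is self-contained on its own line.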