Hi, I've spent a lot of time trying to figure this out on my own, but without success. I'm trying to download a large zipped CSV file that I want to unzip and parse. The zipped file just barely fits into main memory, but the unzipped file causes a heap out-of-memory error. It's a Node.js backend, so writing to disk is no problem. I figure streaming and/or writing to and reading from disk is the solution.

Can I download the zipped file in chunks, unzip them as they arrive, convert them to strings, and write them to disk? I assume writing to disk is necessary because, since it's a CSV, I can only parse complete rows, so parsing the chunks directly won't work when a row is split between two chunks. Is it better to write the string to disk, or the unzipped Uint8Array directly? Is the order guaranteed, i.e. can I be sure that the header row remains the first row in the unzipped CSV file when it is processed in chunks? Is there a code snippet that solves a similar problem I could look at? Is there a better approach? Thanks a lot! :)
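For context, here is a minimal sketch of the kind of pipeline I have in mind (not a finished solution). It assumes the archive is gzip-compressed; a real `.zip` container would need a library such as `unzipper` instead of the built-in `zlib`. The URL and file paths are placeholders.

```js
// Sketch only: stream-download a gzipped CSV, decompress it to disk,
// then parse it line by line without loading the whole file into memory.
const fs = require("fs");
const https = require("https");
const zlib = require("zlib");
const readline = require("readline");
const { pipeline } = require("stream/promises");

const FILE_URL = "https://example.com/big-file.csv.gz"; // placeholder URL
const CSV_PATH = "./big-file.csv";                      // placeholder path

async function downloadAndUnzip() {
  // Pipe the HTTP response through gunzip straight into a file, so only
  // one chunk at a time is held in memory.
  await new Promise((resolve, reject) => {
    https
      .get(FILE_URL, (res) => {
        if (res.statusCode !== 200) {
          res.resume(); // drain the response so the socket is freed
          reject(new Error(`Unexpected status code ${res.statusCode}`));
          return;
        }
        pipeline(res, zlib.createGunzip(), fs.createWriteStream(CSV_PATH))
          .then(resolve, reject);
      })
      .on("error", reject);
  });
}

async function parseCsv() {
  // readline re-chunks the byte stream into complete lines, so a row that
  // was split across two network chunks is reassembled before we see it.
  const rl = readline.createInterface({
    input: fs.createReadStream(CSV_PATH),
    crlfDelay: Infinity,
  });

  let header;
  for await (const line of rl) {
    if (header === undefined) {
      header = line.split(","); // first line emitted is the header row
      continue;
    }
    const row = line.split(","); // naive split; quoted fields need a real CSV parser
    // ...process row here...
  }
}

downloadAndUnzip().then(parseCsv).catch(console.error);
```

As far as I understand, stream pipelines preserve chunk order, so the decompressed bytes land on disk in the same order they arrive and the header row stays first; and since `readline` reassembles rows split across chunk boundaries, the raw bytes can be written without decoding them to strings first.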
Replies: 1 comment
I eventually came up with my own solution: