-
Hi guys. In my Angular app I have a dropzone component where I load a large number of files on the client, which later need to be uploaded to the server. I have a working solution using JSZip + RxJS operators (I upload chunks sequentially to avoid a mess, then merge them and unzip the resulting file on the server). However, I am not satisfied with the performance: for a 2.2 GB file, zipping takes 9 minutes and uploading takes 5. Now I've decided to use fflate, but I am puzzled and overwhelmed by it — too much of everything here... Could you please advise me how to accomplish my goal in the best possible way (memory usage + speed)? Not ready code :), just schematically, because I don't understand the workflow in my case (e.g., do I need to fill a whole zip object with files, or can I start streaming the first chunk to the server once the zip file is filled with [chunkSizeInBytes] bytes of data, etc.)? Thank you very much.
Replies: 2 comments 2 replies
-
I ended up with the following code, but it did not speed things up compared to JSZip.
Besides, when unzipping at the back end, I got the exception "End of Central Directory record could not be found" (I used standard
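[Editor's note: the "End of Central Directory" record lives in the last bytes of a zip file, so this exception typically means the final chunk was dropped or the chunks were reassembled out of order on the server. A minimal sketch of order-preserving reassembly, assuming chunks arrive tagged with an index (the `mergeChunks` helper and the indexing scheme are illustrative, not from this thread):]

```typescript
// Reassemble uploaded zip chunks in index order before unzipping.
// If any chunk is missing or out of order, the trailing central
// directory is corrupted and unzipping fails with an EOCD error.
const mergeChunks = (chunks: Map<number, Uint8Array>): Uint8Array => {
  const ordered = [...chunks.entries()]
    .sort(([a], [b]) => a - b)
    .map(([, data]) => data);
  const total = ordered.reduce((n, c) => n + c.length, 0);
  const merged = new Uint8Array(total);
  let offset = 0;
  for (const c of ordered) {
    merged.set(c, offset);
    offset += c.length;
  }
  return merged;
};
```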
-
You should probably make use of the WHATWG streams API to maximize performance here. If `fetch` supported `ReadableStream` request bodies in all browsers, you could do the entire thing with WHATWG streams, but as it does not, you can do it as follows.

First you need a way to read the file into a zip entry. Since you're using level 0 compression, I've used a `ZipPassThrough` instead of `ZipDeflate` (it's a bit faster).

```typescript
import { Zip, ZipPassThrough } from 'fflate';

const addFileToZip = async (zip: Zip, file: File) => {
  const zipEntry = new ZipPassThrough(file.name.replace(/^\/+/, ''));
  zip.add(zipEntry);
  const reader = file.stream().getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    zipEntry.push(value);
  }
  // finish file
  zipEntry.push(new Uint8Array(0), true);
};
```

Now you create a `Zip` that uploads the chunks as it receives them, instead of waiting till the end. This will save you quite a bit of time while uploading, and likely some memory too.

```typescript
let prevUpload = Promise.resolve();
const zip = new Zip((err, chunk, final) => {
  if (err) {
    // handle as you see fit
  }
  // make an uploadChunkPromise that returns a promise instead of an RxJS observable
  prevUpload = prevUpload.then(() => uploadChunkPromise(chunk, ...));
  if (final) {
    promiseToObservable(prevUpload).subscribe(observer);
  }
});
```

Now just add your files and you're done.

```typescript
// assuming you have a files array like this:
// var files: File[];
for (const file of files) {
  await addFileToZip(zip, file);
}
zip.end();
```

Full code:

```typescript
import { Zip, ZipPassThrough } from 'fflate';

const addFileToZip = async (zip: Zip, file: File) => {
  const zipEntry = new ZipPassThrough(file.name.replace(/^\/+/, ''));
  zip.add(zipEntry);
  const reader = file.stream().getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    zipEntry.push(value);
  }
  // finish file
  zipEntry.push(new Uint8Array(0), true);
};

let prevUpload = Promise.resolve();
const zip = new Zip((err, chunk, final) => {
  if (err) {
    // handle as you see fit
  }
  // make an uploadChunkPromise that returns a promise instead of an RxJS observable
  prevUpload = prevUpload.then(() => uploadChunkPromise(chunk, ...));
  if (final) {
    promiseToObservable(prevUpload).subscribe(observer);
  }
});

for (const file of files) {
  await addFileToZip(zip, file);
}
zip.end();
```

The end-of-central-directory thing doesn't look like it should be an issue with your current code, but it definitely won't be with this. Let me know if you have any other questions!
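[Editor's note: the reply leaves `uploadChunkPromise` to the reader. A minimal sketch of why chaining each upload onto `prevUpload` keeps chunks ordered even with variable network latency; the simulated upload below is a stand-in with an assumed signature, not a real HTTP call:]

```typescript
const uploaded: number[] = [];

// Stand-in for a real uploadChunkPromise: resolves after a random
// delay to mimic per-chunk network latency.
const uploadChunkPromise = (index: number): Promise<void> =>
  new Promise((resolve) =>
    setTimeout(() => {
      uploaded.push(index);
      resolve();
    }, Math.random() * 20)
  );

// Chaining each upload onto the previous promise serializes them:
// chunk i is sent only after chunk i - 1 has finished, so the server
// receives the zip's bytes in order even though each delay differs.
let prevUpload: Promise<void> = Promise.resolve();
for (let i = 0; i < 5; i++) {
  prevUpload = prevUpload.then(() => uploadChunkPromise(i));
}
```

With a real HTTP call in place of the simulated one, the same chaining pattern is what prevents the out-of-order "mess" mentioned in the question.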