download/upload with streaming for files with fetch and s3 #16808

Open
dimonnwc3 opened this issue Jan 27, 2025 · 1 comment
Labels
bug Something isn't working

Comments


dimonnwc3 commented Jan 27, 2025

What version of Bun is running?

1.2.1

What platform is your computer?

Darwin 24.2.0 arm64 arm

What steps can reproduce the bug?

Here is code for uploading and downloading files using various methods in Bun.
Most of them buffer the whole file content in memory before sending.

import { file, s3, write } from "bun"

const src = "/path-to-1gb-file"
const bucket = "bucket-name"

async function uploadFileToUrl() {
  await fetch(s3.presign("key", { bucket, method: "PUT" }), {
    method: "PUT",
    body: file(src),
    verbose: true,
  })
}

async function uploadFileStreamToUrl() {
  await fetch(s3.presign("key", { bucket, method: "PUT" }), {
    method: "PUT",
    body: file(src).stream(),
    verbose: true,
  })
}

async function uploadFileToS3() {
  await write(s3.file("key", { bucket }), file(src))
}

async function downloadFileFromUrl() {
  await write(
    "downloadFileFromUrl",
    await fetch(s3.presign("key", { bucket }), {
      verbose: true,
    }),
  )
}

async function downloadFileFromS3() {
  await write(
    "downloadFileFromS3",
    s3.file("key", { bucket }),
  )
}

console.log(process.memoryUsage().rss / 1024 / 1024)

setInterval(() => {
  console.log(process.memoryUsage().rss / 1024 / 1024)
}, 300).unref()

await uploadFileToUrl()
await uploadFileStreamToUrl()
await uploadFileToS3()
await downloadFileFromUrl()
await downloadFileFromS3()

What is the expected behavior?

I intuitively expected that all operations with files and fetch/S3 would use streaming under the hood, without buffering the whole file in memory.

What do you see instead?

await uploadFileToUrl() <- buffers the whole file in memory
await uploadFileStreamToUrl() <- fails with HTTP status 411, because the content length is not known, even if I add the header
await uploadFileToS3() <- buffers the whole file in memory
await downloadFileFromUrl() <- buffers the whole file in memory
await downloadFileFromS3() <- streams as expected
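
For the upload side, a possible workaround (not verified on 1.2.1) might be Bun's S3 multipart writer, which is documented to upload in parts rather than as one body; the partSize and queueSize values below are illustrative:

import { file, s3 } from "bun"

const src = "/path-to-1gb-file"
const bucket = "bucket-name"

// Stream the local file into S3 in multipart chunks instead of one buffered body;
// partSize/queueSize bound how much data is in flight at once
const writer = s3.file("key", { bucket }).writer({
  partSize: 8 * 1024 * 1024, // 8 MiB per part
  queueSize: 4, // at most 4 parts queued concurrently
})

for await (const chunk of file(src).stream()) {
  writer.write(chunk)
  await writer.flush() // apply backpressure before reading the next chunk
}
await writer.end()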

Additional information

No response

dimonnwc3 added the bug and needs triage labels on Jan 27, 2025
cirospaciari (Member) commented:

Indeed, with the way streams are consumed today this can happen. Bun will free the memory as soon as it can, but it can buffer the whole file. This will be changed to another implementation that is more memory efficient.
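
In the meantime, one sketch of a manual download pump that should keep memory bounded, assuming FileSink's flush() applies backpressure as documented (the key and output path are placeholders):

import { file, s3 } from "bun"

const res = await fetch(s3.presign("key", { bucket: "bucket-name" }))
const sink = file("downloadFileFromUrl").writer()

// Consume the response body chunk by chunk instead of passing the whole
// Response to write(), so only one chunk is held in memory at a time
for await (const chunk of res.body!) {
  sink.write(chunk)
  await sink.flush()
}
await sink.end()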
