
HTTP 413 Payload Too Large on Cloud Run due to 32MB payload Limit. #220

Open
pcboy opened this issue Jan 13, 2025 · 0 comments
pcboy commented Jan 13, 2025

I'm trying to set up attic on Cloud Run, and I'm running into an annoying issue.
When pushing some packages, I get:

❌ 56akkaf0rcdiwd02hqj0bv30536r5z8b-ruby3.3-grpc-1.66.0: HTTP 413 Payload Too Large: 
<html><head>
<meta http-equiv="content-type" content="text/html;charset=utf-8">
<title>413 Request Entity Too Large</title>
</head>
<body text=#000000 bgcolor=#ffffff>
<h1>Error: Request Entity Too Large</h1>
<h2>Your client issued a request that was too large.
</h2>
<h2><script>
  (function() { var c=function(a,d,b){a=a+"=deleted; path="+d;b!=null&&(a+="; domain="+b);document.cookie=a+"; expires=Thu, 01 Jan 1970 00:00:00 GMT"};var g=function(a){var d=e,b=location.hostname;c(d,a,null);c(d,a,b);for(var f=0;;){f=b.indexOf(".",f+1);if(f<0)break;c(d,a,b.substring(f+1))}};var h;if(unescape(encodeURI(document.cookie)).length>4E3){for(var k=document.cookie.split(";"),l=[],m=0;m<k.length;m++){var n=k[m].match(/^\s*([^=]+)/);n&&l.push(n[1])}for(var p=0;p<l.length;p++){var e=l[p];g("/");for(var q=location.pathname,r=0;;){r=q.indexOf("/",r+1);if(r<0)break;var t=q.substring(0,r);g(t);g(t+"/")}q.charAt(q.length-1)!="/"&&(g(q),g(q+"/"))}h=!0}else h=!1;
h&&setTimeout(function(){if(history.replaceState){var a=location.href;history.replaceState(null,"","/");location.replace(a)}},1E3); })();

</script>
</h2>
</body></html>

This is due to Cloud Run's 32 MB request payload limit; see https://cloud.google.com/run/quotas#cloud_run_limits .
As far as I know, there are basically three ways to fix this problem:

  1. Switching the Cloud Run instance to HTTP/2, since there is no size limit for HTTP/2 requests. But attic does not currently seem to support HTTP/2 servers (at least curl -i --http2-prior-knowledge https://my-atticd.us-central1.run.app returns "upstream connect error or disconnect/reset before headers. reset reason: connection termination").
  2. Using signed urls with GCS.
  3. Splitting the upload into 32 MB chunks and doing a multipart upload?
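To make option 3 concrete, here is a minimal client-side sketch of the splitting step. This is purely illustrative: the multipart endpoint named in the comments is hypothetical (attic has no such API today), and only the chunking logic below is concrete.

```python
# Hypothetical sketch for option 3: split an upload into chunks that stay
# below Cloud Run's 32 MB HTTP/1 request limit. The reassembly endpoint
# mentioned at the bottom does not exist in attic; it is an assumption.

CHUNK_SIZE = 32 * 1024 * 1024 - 1024  # stay safely under the 32 MB limit


def iter_chunks(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Yield (index, chunk) pairs covering `data` in order, without overlap."""
    for index, offset in enumerate(range(0, len(data), chunk_size)):
        yield index, data[offset:offset + chunk_size]


# Each chunk would then be sent in its own request to a hypothetical
# endpoint, e.g.:
#   PUT /_api/v1/upload-path/{upload_id}/part/{index}
# followed by a final "complete" request that tells the server to
# reassemble the parts, mirroring how S3 multipart uploads work.
```

The 1 KB headroom in CHUNK_SIZE is arbitrary slack for request framing overhead; the exact margin needed, if any, would have to be verified against Cloud Run's accounting of the limit.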

What would be the easiest way to fix this?
