"413 Request Entity Too Large" in Nginx with "client_max_body_size" set

I am uploading a 26 GB file, but I am getting:

413 Request Entity Too Large

I know this is related to client_max_body_size, so I have set that parameter to 30000M:

  location /supercap {
    root  /media/ss/synology_office/server_Seq-Cap/;
    index index.html;
    proxy_pass  http://api/supercap;
  }

  location /supercap/pipe {
    client_max_body_size 30000M;
    client_body_buffer_size 200000k;
    proxy_pass  http://api/supercap/pipe;
    client_body_temp_path /media/ss/synology_office/server_Seq-Cap/tmp_nginx;
  }

But I still get this error after the whole file has been uploaded.


Modify NGINX Configuration File

sudo nano /etc/nginx/nginx.conf

Search for the client_max_body_size directive. If you find it, just increase its value to 100M, for example. If it doesn't exist, add it at the end of the http block:

client_max_body_size 100M;
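For instance, placed at the http level the directive applies to every server and location below it; a more specific setting in a server or location block overrides it. A sketch (the server name and paths are placeholders):

```nginx
http {
    # Applies to all servers/locations unless overridden lower down.
    client_max_body_size 100M;

    server {
        server_name example.com;  # placeholder

        location /upload {
            # More specific contexts win over the http-level value.
            client_max_body_size 30000M;
        }
    }
}
```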

Check the configuration for syntax errors, then restart nginx to apply the changes.

sudo nginx -t
sudo service nginx restart

Modify PHP.ini File for Upload Limits

It isn't needed on all configurations, but you may also have to raise PHP's upload limits to make sure PHP itself isn't rejecting the request.

If you are using PHP5-FPM, edit:

sudo nano /etc/php5/fpm/php.ini

If you are using PHP7.0-FPM, edit:

sudo nano /etc/php/7.0/fpm/php.ini

Now find the following directives:

upload_max_filesize
post_max_size

and increase their limits to 100M; by default, upload_max_filesize is 2M and post_max_size is 8M.

upload_max_filesize = 100M
post_max_size = 100M
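Note that post_max_size caps the entire POST body (the uploaded file plus any other form fields), so it should be at least as large as upload_max_filesize. A sketch with some headroom (values are examples):

```ini
upload_max_filesize = 100M
; The whole request body, including other form fields, must fit under this.
post_max_size = 110M
```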

Finally save it and restart PHP.

PHP5-FPM users use this,

sudo service php5-fpm restart

PHP7.0-FPM users use this,

sudo service php7.0-fpm restart

It should work fine now!


If you're uploading files of that size you should probably just disable the body size check altogether with:

client_max_body_size 0;
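A value of 0 disables the check entirely. Scoping it to the one location that takes the big uploads limits the exposure; a sketch using the location from the question:

```nginx
location /supercap/pipe {
    # 0 means "no limit": nginx will not return 413 for this location.
    client_max_body_size 0;
}
```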

With respect, I'm not sure why you're using HTTP to transfer that much data. I tend to do my large transfers over ssh, such as:

tar cjf - /path/to/stuff | ssh user@remote-host "cd /path/to/remote/stuff && tar xjf -"

...which gives me a bzip-compressed transfer. But if I needed to do a resumable transfer, I might use sftp, lftp, even rsync. Any of those (or their derivatives or siblings) is capable of

  1. employing an encrypted channel if desired,
  2. resuming an interrupted transfer and
  3. compressing the transfer

Only one of those would be an option to you when attempting to upload over HTTP (namely, #1, if you were on HTTPS).
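As a sketch of the rsync route (the host and paths are placeholders; substitute your own before running): --partial keeps a partially transferred file around so a rerun resumes it instead of starting over, and -z compresses the stream in transit.

```shell
# Build the transfer command; src, dest, and the remote host are placeholders.
src="/path/to/bigfile"
dest="user@remote-host:/path/to/remote/"
cmd="rsync -av --partial -z $src $dest"
echo "$cmd"
```

Run the printed command once the host and paths point at your own machines; going over ssh gives you the encrypted channel (#1) for free.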

I hope you'll look into any of the above or the several other alternatives.