Hi. I'm running a VServer with Debian, Apache 2.2.9, FCGI and SuExec. I'm hosting a website with a pay-for-download service which uses a PHP script to deliver the files. The code for that is basically (with some checks for a valid file/token/etc. beforehand):

PHP:
header('Content-Description: File Transfer');
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private", false);
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".basename($filepath)."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($filepath));
ob_clean();
flush();
readfile($filepath);

This works perfectly fine on one of my computers (Windows 7 x64) in all installed browsers (Chrome/Firefox/Internet Explorer/Opera), but it fails on a different computer (Windows XP SP3) as soon as I try to download bigger files (100-200 MB). The download just stops short of 100%. The Apache error.log shows entries like

Code:
[Thu Oct 21 02:15:40 2010] [warn] (104)Connection reset by peer: mod_fcgid: read data from fastcgi server error
[Thu Oct 21 02:15:40 2010] [warn] (104)Connection reset by peer: mod_fcgid: ap_pass_brigade failed in handle_request function

so it is obviously related to FCGI. My first idea was to play around with the fcgid parameters, but that didn't change anything, so I reset them to my previous settings:

Code:
<IfModule mod_fcgid.c>
    AddHandler fcgid-script .php
    AddType application/x-httpd-php .php
    IPCConnectTimeout 20
</IfModule>

The next thing I tried was increasing the PHP memory limit from 20 MB to 256 MB, but that didn't change anything either.

Any ideas? I really don't understand why this works on one computer but not on others. Help is much appreciated!

Thanks
tree8

PS: There are more machines where the download stops before reaching 100%, so this can't be a local problem.
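
Edit: one workaround I've been wondering about (haven't deployed it yet, so this is just a sketch) is replacing the single readfile() call with a loop that streams the file in fixed-size chunks and flushes after each one, so the response isn't pushed through FCGI in one long read. The function name and chunk size below are just placeholders:

```php
<?php
// Sketch of a chunked alternative to readfile() for large downloads.
// send_file_chunked() is a hypothetical helper, not my current code.
function send_file_chunked($filepath, $chunkSize = 1048576) // 1 MB per chunk
{
    $handle = fopen($filepath, 'rb');
    if ($handle === false) {
        return false; // caller should send an error response instead
    }
    while (!feof($handle)) {
        // Read and emit one chunk, then push it out to the client
        echo fread($handle, $chunkSize);
        flush();
    }
    fclose($handle);
    return true;
}
```

The headers would be sent exactly as before; only the readfile($filepath) line would be swapped for send_file_chunked($filepath). No idea yet whether that actually avoids the mod_fcgid resets, but it at least changes how the data flows through the FCGI pipe.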