Originally Posted by falko
I think the problem is that PHP on 32bit systems does not support files larger than 2GB.
Thanks Ben & Falko for your suggestions.
Currently I'm busy duplicating the setup (it's a production server, so alas I can't delete any files there).
So far I've tried replacing the zip binary in .../tools/zip/ (v2.3) with the newer v2.32 to rule out any issues with that.
The largest file is a 239MB image, and deleting content until the projected archive is less than 2GB does create one (albeit I still get "zip I/O error: File too large").
Falko's reply got me thinking along a related route, although the issue might be twofold. Apart from that, I find it hard to believe that this system is a rarity: a 32-bit server with a relatively large amount of content. I mean to say there should still be quite a few 32-bit community webservers out there.
So aside from the 32-bit PHP limit (as it now creates an archive), I believe the zip binary needs quite a lot of RAM as a buffer in order to compress files: there's now one 239MB file (the one that errors) alongside lots of small files, and the old server only has 512MB of RAM.
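For reference, the 2GB ceiling isn't arbitrary: on 32-bit builds without large-file support, file offsets are signed 32-bit integers, so nothing past 2^31 - 1 bytes can be addressed. A quick sanity check of that number:

```shell
# Signed 32-bit offset limit: 2 GiB minus one byte.
# Anything larger triggers errors like "File too large" on builds
# compiled without large-file (64-bit off_t) support.
echo $(( 2 * 1024 * 1024 * 1024 - 1 ))   # prints 2147483647
```

That 2147483647-byte boundary matches the point where the archive creation starts failing.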
Just out of curiosity I'll duplicate the setup on a 32-bit testbed with 2GB of RAM, to try to rule out some of my previous statements (kind of pointless actually, as the new system is 64-bit).
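When comparing the testbeds, this is a quick way to tell whether the userland (and, if installed, PHP) is 32- or 64-bit; on a 32-bit PHP, PHP_INT_SIZE is 4 and integer-based file functions run into the 2GB wall:

```shell
# Report the userland word size: prints 32 or 64.
getconf LONG_BIT
# If PHP is on the box, check its integer size too:
# 4 bytes on 32-bit builds, 8 bytes on 64-bit builds.
if command -v php >/dev/null 2>&1; then
    php -r 'echo PHP_INT_SIZE, "\n";'
fi
```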
I'll post my findings... but if it's indeed a zip buffer-related issue, why not use a plain tar call in ../shell/backup.php, which doesn't have those large memory requirements (or, for the sake of the Windows fanboys, the RAR/7zip libraries)?
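A minimal sketch of what that tar call could look like (the paths are illustrative stand-ins, not the actual ones backup.php uses): tar streams files one at a time and gzip compresses with a small fixed buffer, so memory use stays flat no matter how big the archive gets.

```shell
# Sketch: replace the memory-hungry zip call with streaming tar + gzip.
# SRC_DIR stands in for the site's web root; here we fabricate one so
# the example is self-contained.
SRC_DIR=$(mktemp -d)                  # hypothetical web root
DEST="$SRC_DIR.tar.gz"                # hypothetical backup target
echo "demo content" > "$SRC_DIR/index.html"

tar -czf "$DEST" -C "$SRC_DIR" .      # -C stores paths relative to SRC_DIR
tar -tzf "$DEST"                      # list contents to verify the archive
```

The same two tar invocations, pointed at the real web root and backup directory, would drop straight into a shell-out from the backup script.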