Backup Manager > zip I/O error: File too large

Discussion in 'Installation/Configuration' started by Djamu, Jun 2, 2008.

  1. Djamu

    Djamu New Member

Planning to move a webserver to new hardware, I tried the site Backup Manager -I usually back up with my own script.. if anybody's interested I'll post it-
1 small site went all right, but the other didn't back up... until I found this thread > How do I enable automatic backup?
    I set $go_info["server"]["do_automated_backups"] to 1 in /home/admispconfig/ispconfig/lib/config.inc.php
    and ran
    Code:
    /root/ispconfig/php/php /root/ispconfig/scripts/shell/backup.php
As soon as the zip file reaches 2GB I get this error
-kind of weird, as I have lots of zip / rar / tgz files much bigger than this-

    any suggestions ?
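For reference, a quick way to hunt for files above a size threshold before blaming the archiver. This is a self-contained sketch using a throwaway directory and a sparse file; on a real server you would point `find` at the web's directory instead:

```shell
#!/bin/sh
# Self-contained demo: create a throwaway dir with one sparse "big" file
# and one small file, then list everything above 2GB with find.
# (Sparse files occupy no real disk blocks, so this is cheap to run.)
dir=$(mktemp -d)
truncate -s 3G "$dir/big.img"      # apparent size 3GB, zero blocks on disk
truncate -s 10M "$dir/small.img"
big=$(find "$dir" -type f -size +2G)
echo "$big"                        # only big.img should show up
rm -rf "$dir"
```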
     
    Last edited: Jun 2, 2008
  2. Ben

    Ben ISPConfig Developer

Well, it seems to be a problem with zip itself... checking the code, there is just a shell call to zip.

So this error might not be about the zip archive being too big; maybe the file being compressed is too big?
Could you check for any large files in the web's dir, e.g. with du -hs?
Also don't forget about the logfiles in the web's dir...
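Ben's du suggestion, sketched on a throwaway directory so it's safe to paste. On the server you would run `du -hs` against the web's actual path instead of `$dir`:

```shell
#!/bin/sh
# Demo of spotting the space hog with du: a dummy "log" dir with a 5MB
# logfile next to a tiny "web" dir. On the real server, replace $dir
# with the web's directory.
dir=$(mktemp -d)
mkdir "$dir/log" "$dir/web"
head -c 5242880 /dev/zero > "$dir/log/error.log"    # 5MB dummy logfile
printf '<html></html>\n' > "$dir/web/index.html"
usage=$(du -sk "$dir"/* | sort -rn)   # biggest first, sizes in KB
echo "$usage"
rm -rf "$dir"
```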
     
    Last edited: Jun 2, 2008
  3. falko

    falko Super Moderator ISPConfig Developer

    I think the problem is that PHP on 32bit systems does not support files larger than 2GB.
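To make that point concrete: with signed 32-bit file offsets, the largest addressable file is 2^31 - 1 bytes, one byte short of 2GB, which matches exactly where the backup dies. A sparse file makes the limit easy to see without using any real disk space (a sketch, assuming GNU coreutils):

```shell
#!/bin/sh
# The signed 32-bit offset ceiling that bites zip 2.x and 32-bit PHP builds:
limit=2147483647                  # 2^31 - 1 bytes, just under 2GB
f=$(mktemp)
truncate -s 3G "$f"               # sparse: apparent size 3GB, no disk used
size=$(stat -c %s "$f")           # 3221225472 bytes
[ "$size" -gt "$limit" ] && echo "past the 32-bit limit"
rm -f "$f"
```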
     
  4. Djamu

    Djamu New Member

    Thanks Ben & Falko for your suggestions.

Currently I'm busy duplicating the setup -as it's a production server, alas I can't delete any files there-

    So far I tried replacing the zip binary in .../tools/zip/ ( v2.3 ) with the newer ( v2.32 ) to rule out any issues with that.

The largest file is a 239MB image, & deleting content until the projected archive is less than 2GB does create one ( albeit I still get a zip I/O error: File too large ).

Falko's reply made me think of a related route -although the issue might be twofold. Apart from that, I find it hard to believe that this system is a rarity: 32-bit with a relatively large amount of content. I mean to say there should still be quite a few 32-bit community webservers-

So aside from the 32-bit PHP limit ( as it now creates an archive ), I believe the zip binary needs quite a lot of RAM as a buffer in order to compress files -there's now one 239MB file ( 1 error ) among lots of small files, and the old server only has 512MB RAM-.

Just out of curiosity I'll duplicate the setup on a 32-bit testbed with 2GB RAM, to try to rule out some of my previous statements ( kind of pointless actually, as the new system is 64-bit ).

I'll post my findings... but if it's indeed a ZIP buffer related issue, why not use a normal TAR call -in ../shell/backup.php- which doesn't have those large memory requirements ( or, for the sake of those Windows fanboys, the RAR / 7zip libraries )?
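A minimal sketch of that tar route: gzip'd tar streams the data through a pipe instead of buffering a whole archive, and GNU tar has no 2GB format ceiling. The paths here are throwaway examples, not the ISPConfig ones:

```shell
#!/bin/sh
# Sketch of a tar-based backup in place of the zip shell call.
# A throwaway source dir stands in for the web's directory.
src=$(mktemp -d)
echo 'hello' > "$src/index.html"
archive="$src.tar.gz"
tar -czf "$archive" -C "$src" .   # -C: store paths relative to $src
listing=$(tar -tzf "$archive")    # verify the archive lists its contents
echo "$listing"
rm -rf "$src" "$archive"
```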

    :cool:
     
    Last edited: Jun 3, 2008
