I have automated backups running every night (the built-in ones in ISPConfig), which are then sent to an FTP server along with other things (database backups, configs and so on) by another script.
I was wondering why one of my sites could not be backed up, and it seems I've gone over 2 GB of compressed data for it.
The backup script uses zip, which doesn't handle files larger than 2 GB.
Is there an easy way to change that? (i.e. use bzip2 or tgz)
I'm not sure whether zip is used elsewhere; if not, I would just change its path in the config to point at bzip2 instead, and maybe adjust some CLI options in the backup script - would that be OK?
To finish, changing that backup format would probably be a good idea anyway, I suppose.
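For what it's worth, here is a minimal sketch of the kind of substitution I have in mind: swapping a zip call for tar + gzip, which has no 2 GB limit. The paths and file names below are made-up examples, not ISPConfig's actual config values.

```shell
#!/bin/sh
# Hypothetical example paths - not ISPConfig's real layout.
SITE_DIR="/tmp/example-site"
BACKUP="/tmp/site-backup.tar.gz"

# Old style (can fail past 2 GB with older zip builds):
#   zip -r backup.zip "$SITE_DIR"

# Create some demo content so the sketch is self-contained.
mkdir -p "$SITE_DIR"
echo "demo content" > "$SITE_DIR/index.html"

# tar + gzip handles archives larger than 2 GB:
tar -czf "$BACKUP" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"

# List the archive contents to confirm it worked.
tar -tzf "$BACKUP"
```

For bzip2 instead of gzip, the same tar call with `-cjf` (and a `.tar.bz2` name) should do; the rest of the script stays unchanged.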