I've since opted for redundant backup techniques, which is my usual practice.
I like the ISPConfig stuff since it's easy to tar up and has a sane layout.
To date/time-stamp the various databases and subdirectories, I simply worked up a script containing routines similar to the following for each subdirectory/database I want to back up:
# Backup Debianhelp.Org's /home/www/web7/web subdirectory
nice -n 17 tar -cjvvf /home/backups/nexo-www-debianhelp-org_home-www-web7-web-`date +%y%m%d`.tar.bz2 /home/www/web7/web
# Do debianHELP.org's PostNuke database
nice -n 17 mysqldump --user=web7_u3 --password=secret --add-drop-table web7_db1 > /home/backups/nexo-www-debianhelp-org-`date +%y%m%d`.sql
nice -n 17 bzip2 /home/backups/nexo-www-debianhelp-org-`date +%y%m%d`.sql
I nice the routines simply because they're not high priority. I use similar routines for each web site, for various system subdirectories, and for other databases. As you can see above, all the resulting files are collected in the /home/backups subdirectory.
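To avoid repeating the same two commands per site, the routines above can be factored into a couple of small shell functions. This is only a sketch: the labels, paths, and credentials below are the ones from the example, and the sample invocations are left commented out so you can adapt them first.

```shell
#!/bin/sh
# Sketch of the routine backup script, with the per-site commands
# factored into functions. Paths and credentials are illustrative.
STAMP=$(date +%y%m%d)
DEST=/home/backups

backup_dir() {
    # $1 = label for the archive name, $2 = directory to archive
    [ -d "$2" ] || return 0     # skip quietly if the directory is absent
    nice -n 17 tar -cjf "$DEST/$1-$STAMP.tar.bz2" "$2"
}

backup_db() {
    # $1 = label, $2 = db user, $3 = password, $4 = database name
    command -v mysqldump >/dev/null 2>&1 || return 0
    nice -n 17 mysqldump --user="$2" --password="$3" --add-drop-table "$4" \
        > "$DEST/$1-$STAMP.sql" \
    && nice -n 17 bzip2 "$DEST/$1-$STAMP.sql"
}

# Example invocations, matching the routines above:
# backup_dir nexo-www-debianhelp-org_home-www-web7-web /home/www/web7/web
# backup_db  nexo-www-debianhelp-org web7_u3 secret web7_db1
```

One pair of function calls per site keeps the script readable as the list of sites grows, while the generated filenames stay identical to the hand-written versions.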
The script above is invoked remotely simply by running it via ssh:
ssh firstname.lastname@example.org /home/user/bin/RoutineFileTarballBackup.sh
then copy the files off the remote server:
scp email@example.com:/home/backups/nexo*bz2 /pub/backups/websitebackups
and then I clean up /home/backups by simply doing a:
ssh firstname.lastname@example.org rm /home/backups/nexo*bz2
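Those three steps can themselves be collected into a small wrapper run from the local machine. A sketch, using the host and paths from the example (assumes key-based ssh authentication is already set up):

```shell
#!/bin/sh
# Wrapper chaining the three remote steps: run the backup script,
# pull the archives down, then remove them from the server.
REMOTE=firstname.lastname@example.org
LOCALDIR=/pub/backups/websitebackups

pull_backups() {
    ssh "$REMOTE" /home/user/bin/RoutineFileTarballBackup.sh \
    && scp "$REMOTE:/home/backups/nexo*bz2" "$LOCALDIR/" \
    && ssh "$REMOTE" 'rm /home/backups/nexo*bz2'
}

# pull_backups    # uncomment to run for real
```

Chaining the steps with && means the remote cleanup only happens if the copy succeeded, which is a small safety improvement over running the three commands by hand. Note the quoted remote glob: it must be expanded by the remote shell, not the local one.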
As described, the script is very simple -- just a series of in-line commands, no error checking, nothing fancy. But since this is one of three backup techniques I use, and since I keep files for many days, it doesn't have to be bulletproof.
Of course, as described above, you'll have to authenticate for each ssh/scp call. But by exchanging SSH keys this can easily be made password-prompt-less.
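The key exchange is a one-time setup on the local machine, roughly like this (a sketch; the ssh-copy-id step is left commented out since it contacts the remote host and prompts for the password one last time):

```shell
#!/bin/sh
# One-time SSH key setup for password-prompt-less ssh/scp.
KEY="$HOME/.ssh/id_ed25519"
mkdir -p "$HOME/.ssh"

# Generate a key pair only if one is not already present
# (-N "" gives an empty passphrase so scripts can use it unattended):
if command -v ssh-keygen >/dev/null 2>&1 && [ ! -f "$KEY" ]; then
    ssh-keygen -q -t ed25519 -N "" -f "$KEY"
fi

# Install the public key on the remote account:
# ssh-copy-id -i "$KEY.pub" firstname.lastname@example.org
```

After the public key is installed in the remote account's ~/.ssh/authorized_keys, all the ssh/scp/rm calls above run without a password prompt, which makes the whole sequence suitable for cron.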