Using Wget....

Discussion in 'Technical' started by shiidii, Feb 22, 2007.

  1. shiidii

    shiidii New Member

    Wget seems simple enough to use, and there are many options to make it do specific things... but I'm having trouble making it do the simplest thing.
    I just want to wget a folder of files and sub-folders (containing html) and copy it straight to the directory I'm in.

    I understand how to remove the domain name and use --cut-dirs, but no matter what I try, it still won't just copy the folder I want. It tries to be helpful and downloads every file linked from the HTML in the folders, but I don't want that. I just want to DOWNLOAD a folder of files — a simple task, but hard to achieve. Any help would be appreciated. :)
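For what it's worth, a command along these lines is usually how people limit wget to a single directory tree — a sketch only, with a placeholder URL and a --cut-dirs count you'd adjust to match the number of leading path components on your server:

```shell
# -r            recurse into the directory
# -np           never ascend to the parent directory (stay inside the folder)
# -nH           don't create a hostname directory locally
# --cut-dirs=2  drop the two leading path components ("path/to" here)
# -R "index.html*"  skip the auto-generated directory-listing pages
wget -r -np -nH --cut-dirs=2 -R "index.html*" http://example.com/path/to/folder/
```

Without -p or --convert-links, wget should only follow links inside the recursion, not pull in every page requisite.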
  2. falko

    falko Super Moderator ISPConfig Developer

    Are these files somehow linked in the HTML files? Because otherwise wget cannot know that these files exist.
  3. shiidii

    shiidii New Member

    Thanks for the reply.
    Yeah the files are linked

    Here is my situation: my FTP has gone down from unrelated complications, and while I'm fixing that I'm using SSH and wget to update a website. This involves uploading the website to a separate location and then downloading it with wget, so I'm copying folders full of HTML and image files. I thought wget would just copy the directory I specified, but it doesn't seem to want to do that.

    Any suggestions on how to copy a directory (full of HTML files) from another website? Or will I have to wget each file separately? Maybe the best bet is archiving, uploading the archive to this other website, wgetting it, then extracting. :(
  4. zcworld

    zcworld New Member

    i hope this helps

    If you mean you need to copy over all of your webroot folder, just zip up the files/folders (one big zip), upload it, and unzip it on the other end.
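    The archive round-trip could look something like this — the paths and URL are placeholders for your own setup, and tar is used here instead of zip since it's available on nearly every Linux box:

    ```shell
    # On the source server: pack the whole webroot into one archive.
    # -C changes into the directory first, so the archive holds relative paths.
    tar czf site.tar.gz -C /var/www/html .

    # Upload site.tar.gz somewhere web-accessible, then on the target server:
    wget http://example.com/site.tar.gz
    tar xzf site.tar.gz
    ```

    One fetch, one extract — no recursion flags to fight with.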
  5. shiidii

    shiidii New Member

    That sounds like the most plausible option, thanks!
