#1  
Old 22nd February 2007, 11:32
shiidii is offline
Member
 
Join Date: Jul 2006
Location: Melbourne, Australia
Posts: 39
Thanks: 0
Thanked 0 Times in 0 Posts
Question Using Wget....

Wget seems simple enough to use, and there are plenty of options to make it do specific things, but it's the simplest things I'm having trouble with.
I just want to wget a folder of files and sub-folders (containing HTML) and copy it straight into the directory I'm in.

I understand how to remove the domain name and use --cut-dirs, but no matter what I try, it still won't just copy the folder I want. It tries to be helpful and downloads every file associated with the HTML in those folders, but I don't want that. I just want to DOWNLOAD a folder of files. It should be a simple command, but it's proving hard to achieve. Any help would be appreciated.
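
Roughly, the kind of command I've been trying looks like this (the host and path are just placeholders, and the --cut-dirs count would have to match the real path depth):

Code:
# recurse into /some/folder/, dropping the hostname directory and the
# two leading path components so the files land in the current directory
wget -r -nH --cut-dirs=2 http://example.com/some/folder/

From the manual it sounds like adding -np (--no-parent) should at least stop the recursion from climbing back out of the folder.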
  #2  
Old 23rd February 2007, 16:05
falko is offline
Super Moderator
 
Join Date: Apr 2005
Location: Lüneburg, Germany
Posts: 41,701
Thanks: 1,900
Thanked 2,735 Times in 2,571 Posts
Default

Are these files somehow linked in the HTML files? Because otherwise wget cannot know that these files exist.
__________________
Falko
--
Download the ISPConfig 3 Manual! | Check out the ISPConfig 3 Billing Module!

FB: http://www.facebook.com/howtoforge

nginx-Webhosting: Timme Hosting
  #3  
Old 24th February 2007, 00:25
shiidii is offline
Member
 
Join Date: Jul 2006
Location: Melbourne, Australia
Posts: 39
Thanks: 0
Thanked 0 Times in 0 Posts
Default

Thanks for the reply.
Yeah, the files are linked.

Here's my situation: my FTP has gone down due to unrelated complications, so while I'm fixing that I'm using SSH and wget to update a website. This involves uploading the website to a separate location and then downloading it from there with wget, so I'm copying folders full of HTML and image files. I thought wget would just copy the directory I specified, but it doesn't seem to want to do that.

Any suggestions on how to copy a directory (full of HTML files) from another website? Or will I have to wget each file separately? The best bet might be archiving it, uploading the archive to this other website, wgetting that archive and then extracting it.
  #4  
Old 24th February 2007, 10:36
zcworld is offline
Senior Member
 
Join Date: Jul 2006
Location: South Australia
Posts: 329
Thanks: 2
Thanked 37 Times in 37 Posts
Default

I hope this helps.

If you mean you have to copy over your whole webroot folder, just zip up the files and folders (one big zip), upload it, and unzip it on the other end.
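
Something along these lines; the file and folder names here are only examples:

Code:
# on the machine that still has the site: make one big archive of the webroot
zip -r webroot.zip public_html/

# put webroot.zip somewhere wget can reach it, then on the server you are fixing:
wget http://example.com/webroot.zip
unzip webroot.zip

One download instead of a whole recursive crawl, so nothing gets left out.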
__________________
Shane Ebert :: Facebook
  #5  
Old 25th February 2007, 02:09
shiidii is offline
Member
 
Join Date: Jul 2006
Location: Melbourne, Australia
Posts: 39
Thanks: 0
Thanked 0 Times in 0 Posts
 
Default

That sounds like the most plausible option, thanks!