Hello,

There is a community forum (running on the vBulletin board system) with a
download section from which I would like to retrieve all the files. The only
reason I am doing this with wget rather than downloading them manually is
that the site enforces a minimum of one minute between downloads, meaning I
would have to sit there and manually download hundreds of files while
waiting a minute between each one. (Also worth noting: none of the files is
larger than 100 KB, so this will not abuse the site's bandwidth.) I tried to
retrieve the cookie file by using:

wget --spider --save-cookies=/home/cody/Desktop/cookies.txt \
     --keep-session-cookies \
     --post-data='vb_login_username=user&vb_login_password=pass' \
     'http://www.forurm.com/login.php?do=login'

All that creates is a file with only the header and timestamp lines inside
it; it does not save the cookie information used to log in to the forum. My
plan was to use that command to retrieve the cookie, then use the
--load-cookies option in order to log back in.

On a side note, how well does wget handle URLs of the form
downloads.php?do=file&id=###? When you visit such a URL in a browser, it
prompts you to open or save the file.
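For reference, here is roughly what I am attempting, as a two-step sketch.
The domain, cookie path, and form field names are placeholders based on a
typical vBulletin login form, and --content-disposition is my guess at how
to get sensible filenames out of the downloads.php URLs:

```shell
#!/bin/sh
# Sketch only: URL, credentials, and field names below are placeholders.
COOKIES="$HOME/cookies.txt"

# Step 1: log in and save the session cookie. Note: no --spider here,
# since the login response itself has to be fetched for the session
# cookie to be set and written out.
wget --save-cookies="$COOKIES" --keep-session-cookies \
     --post-data='vb_login_username=user&vb_login_password=pass&do=login' \
     -O /dev/null \
     'http://www.forum.example/login.php?do=login'

# Step 2: reuse the saved cookie for the downloads, waiting 60 seconds
# between retrievals to respect the one-download-per-minute limit.
# --content-disposition asks wget to honor the filename the server sends,
# rather than saving everything as "downloads.php?do=file&id=...".
wget --load-cookies="$COOKIES" --wait=60 --content-disposition \
     'http://www.forum.example/downloads.php?do=file&id=1' \
     'http://www.forum.example/downloads.php?do=file&id=2'
```

Does that look like the right general approach?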

Thank you very much