wget suggestion
There needs to be a way to tell wget to reject all domains EXCEPT those that are accepted. This should include subdomains; i.e., I just want to download www.mydomain.com and cache.mydomain.com. I thought the --domains option would work this way, but it doesn't.
Re: wget suggestion
From: Robert La Ferla

> There needs to be a way to tell wget to reject all domains EXCEPT those
> that are accepted. This should include subdomains; i.e., I just want to
> download www.mydomain.com and cache.mydomain.com. I thought the
> --domains option would work this way, but it doesn't.

Can you provide any evidence that it doesn't? Useful info might include the wget version, your OS and version, the command you used, and the results you got. Adding -d to the command often reveals more than not using it. A real example is usually more useful than a fictional example.

If you can't exhibit the actual failure and explain how to reproduce it, you might do better with a psychic hot-line, as most of us are not skilled in remote viewing.

Steven M. Schweda               [EMAIL PROTECTED]
382 South Warwick Street        (+1) 651-699-9818
Saint Paul  MN  55105-2547
Re: wget suggestion
GNU Wget 1.10.2

Capture this sub-site and not the rest of the site so that you can view it locally, i.e. just www.boston.com and cache.boston.com:

http://www.boston.com/ae/food/gallery/cheap_eats/

On May 3, 2007, at 10:34 PM, Steven M. Schweda wrote:

> From: Robert La Ferla
>
> > There needs to be a way to tell wget to reject all domains EXCEPT
> > those that are accepted. This should include subdomains; i.e., I just
> > want to download www.mydomain.com and cache.mydomain.com. I thought
> > the --domains option would work this way, but it doesn't.
>
> Can you provide any evidence that it doesn't? Useful info might include
> the wget version, your OS and version, the command you used, and the
> results you got. Adding -d to the command often reveals more than not
> using it. A real example is usually more useful than a fictional
> example.
>
> If you can't exhibit the actual failure and explain how to reproduce
> it, you might do better with a psychic hot-line, as most of us are not
> skilled in remote viewing.
Re: wget suggestion
From: Robert La Ferla

> GNU Wget 1.10.2

Ok. Running on what?

> Capture this sub-site and not the rest of the site so that you can view
> it locally, i.e. just www.boston.com and cache.boston.com:
>
> http://www.boston.com/ae/food/gallery/cheap_eats/

What is a "sub-site"? Do you mean this page, or this page and all the pages to which it links, excluding off-site pages, or what?

I have a better idea. Read this again:

> Can you provide any evidence that it doesn't? Useful info might include
> the wget version, your OS and version, the command you used, and the
> results you got. Adding -d to the command often reveals more than not
> using it. A real example is usually more useful than a fictional
> example.
>
> If you can't exhibit the actual failure and explain how to reproduce
> it, you might do better with a psychic hot-line, as most of us are not
> skilled in remote viewing.

You might also consider phrasing your demands as polite requests in future. Phrases like "I would like to learn how to" or "Can you explain how to" can be useful for this. Even better would be: "I tried this command [insert command here], and I got this result [insert result here], but I was expecting something more like this [insert expected result here], and I definitely didn't expect this [insert undesirable result here]."

Steven M. Schweda               [EMAIL PROTECTED]
382 South Warwick Street        (+1) 651-699-9818
Saint Paul  MN  55105-2547
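For what it's worth, restricting a recursive crawl to a couple of named hosts is roughly what -H together with --domains is meant for; a sketch of one way to attempt the boston.com example above (option behavior may differ across wget versions):

```shell
# Recursively fetch the gallery, allowing host spanning (-H) but only to
# the listed domains (-D/--domains); -np keeps wget from climbing above
# the start path, and -p/-k pull in and rewrite page requisites so the
# result is viewable locally.
wget -r -np -H -p -k \
     --domains=www.boston.com,cache.boston.com \
     http://www.boston.com/ae/food/gallery/cheap_eats/
```

Note that --domains matches domain suffixes, so --domains=mydomain.com would also admit any subdomain of mydomain.com, which may be why the option did not behave as the original poster expected.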
wget Suggestion: ability to scan ports BESIDE #80, (like 443) Anyway Thanks for WGET!
Re: wget Suggestion: ability to scan ports BESIDE #80, (like 443) Anyway Thanks for WGET!
----- Original Message -----
From: [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, December 07, 2003 8:04 AM
Subject: wget Suggestion: ability to scan ports BESIDE #80, (like 443) Anyway Thanks for WGET!

What's wrong with `wget https://www.somesite.com'?
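As the reply implies, no separate option is needed: wget takes the port from the URL scheme, or from an explicit :port in the URL. A sketch (the hostnames are placeholders):

```shell
# HTTPS goes to port 443 by default (requires a wget built with SSL support)
wget https://www.somesite.com/

# An explicit port can be given in the URL for any scheme
wget http://www.somesite.com:8080/index.html
```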
wget suggestion
Hi,

Just a suggestion. I'm using wget 1.6. When using FTP, add an option to download files with the same permissions they have on the server.

Cheers,
Michiel
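Wget 1.6 had no such switch, but later releases did grow one. A sketch, assuming a wget recent enough to support --preserve-permissions (the FTP host is a placeholder):

```shell
# Mirror an FTP directory, keeping the permission bits reported by the
# server instead of applying the local umask
wget -m --preserve-permissions ftp://ftp.example.com/pub/
```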
wget suggestion...
One small suggestion for a possible later release... a mask for all files, e.g.:

wget -m http://localhost/*.txt

Other than that, all's good =)

Regards,
Total K

http://www.oc32.cjb.net ~ OC32 Home
http://www.digiserv.cjb.net ~ Home of [Total K]
http://www.digitaldisorder.cjb.net/php/download.php?sec=pgpsub_sec=f=totalk.asc ~ PGP Key

---
If mini skirts get any higher, said the Fairy to the Gnome,
We'll have two more cheeks to powder, and a few more hairs to comb.
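Wget can already express something close to this via its accept list rather than a glob in the URL; a sketch:

```shell
# -A/--accept takes a comma-separated list of filename suffixes or glob
# patterns; combined with -r it keeps only the matching files
wget -r -np -A '*.txt' http://localhost/
```

One caveat: the accept/reject filters apply to filenames, so HTML pages are still fetched in order to be parsed for links (and are deleted afterwards when they don't match the pattern).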
Re: wget suggestion
[EMAIL PROTECTED] wrote:

> hiya! i'd like to have wget forking into background as default (via
> .wgetrc) but sometimes, e.g. in shell scripts, i need wget to stay in
> foreground, so the script knows when the file is completely downloaded
> (well, after wget exits =) is it possible to implement such a feature?
> thanks in advance, wget rocks! greets, alex

You can get wget running in background by adding `&' at the end, i.e.

wget http://somewhere/file.txt &

If you don't add `&', wget will run in foreground; then you can still `ctrl+z' and `bg' to send it to background, or simply close the terminal in which wget is running (that will also send wget into background, and will even redirect all messages to the `wget-log' log file)... well, all this is written somewhere in the docs, I'm sure :)

P! Vladi.
--
Vladi Belperchinov-Shabanski [EMAIL PROTECTED] [EMAIL PROTECTED]
Personal home page at http://www.biscom.net/~cade
DataMax Ltd. http://www.datamax.bg
Too many hopes and dreams won't see the light...
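The alternatives described above might look like this in practice (the file URL is a placeholder):

```shell
# Interactive use: let the shell background the job...
wget http://somewhere/file.txt &

# ...or use wget's own -b switch, which detaches and logs to wget-log
wget -b http://somewhere/file.txt

# In a script, run in the foreground so the exit status is meaningful
if wget -q http://somewhere/file.txt; then
    echo "download finished"
fi
```

If your wget supports it, the corresponding .wgetrc setting is `background = on', which makes detaching the default, the same as -b.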
wget suggestion
hiya! i'd like to have wget forking into background as default (via .wgetrc), but sometimes, e.g. in shell scripts, i need wget to stay in foreground, so the script knows when the file is completely downloaded (well, after wget exits =). is it possible to implement such a feature? thanks in advance, wget rocks!

greets, alex
WGET suggestion
Hello,

I'm using wget and prefer it to a number of GUI programs. It only seems to me that style sheets (CSS files) aren't downloaded. Is this true, or am I doing something wrong? If they really aren't, I would suggest that stylesheets should also be retrieved by wget.

Regards,
Michael
--
Michael Widowitz [EMAIL PROTECTED]
http://widowitz.com - last update 22.4.2001
http://astraxa.net
Re: WGET suggestion
Quoting Michael Widowitz ([EMAIL PROTECTED]):

> I'm using wget and prefer it to a number of GUI programs. It only seems
> to me that style sheets (CSS files) aren't downloaded. Is this true, or
> am I doing something wrong? If they really aren't, I would suggest that
> stylesheets should also be retrieved by wget.

Michael, which version of wget do you use? I guess (but maybe I'm mistaken) that versions 1.6 and upwards do download CSS when doing recursive traversal (or --page-requisites).

--
jan

+-- Jan Prikryl | vr|vis center for virtual reality and visualisation
[EMAIL PROTECTED] | http://www.vrvis.at
+--
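The option mentioned in the reply can be sketched like this (using the poster's own site as the example URL):

```shell
# --page-requisites (-p) fetches inlined images and referenced
# stylesheets; --convert-links (-k) rewrites links for local viewing
wget -p -k http://widowitz.com/
```

One caveat: fetching a linked .css file is not the same as following the url() references *inside* it; older wget versions download the stylesheet itself but may not retrieve resources the stylesheet refers to.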