Re: more of a question I guess..
doesn't work, cannot login...

P! Vladi.

T. Bharath wrote:
> replace the first @ with %40 and check
> Regards, Bharath
>
> Turgut Kalfaoglu wrote:
>> Hi. I love wget, but I have a stumper: how do you put a URL together
>> when the FTP site requires a password that has an '@' in it? Like:
>>
>>   wget ftp://userid:password@[EMAIL PROTECTED]:21/pub/incoming/blah.zip
>>
>> does not work, because the two '@'s confuse the parser. Thanks :)
>> -turgut
>>
>> --
>> Turgut Kalfaoglu: http://www.kalfaoglu.com
>> EgeNet Internet Services: http://www.egenet.com.tr

--
Vladi Belperchinov-Shabanski
[EMAIL PROTECTED] [EMAIL PROTECTED]
http://www.biscom.net/~cade
Now, sure as the sun will cross the sky, This lie is over
Lost, like the tears that used to tide me over...
Re: more of a question I guess..
I have got the same problem, and I found that my previous version of wget
could handle this (the version was 1.7.1, I guess):

  wget ftp://user:pass@[EMAIL PROTECTED]/etc

worked fine; the latest sources print an error like `no host in the url'.
Perhaps --ftp-user and --ftp-pass would be a good solution (just like HTTP
auth).

P! Vladi.

Turgut Kalfaoglu wrote:
> Hi. I love wget, but I have a stumper: how do you put a URL together
> when the FTP site requires a password that has an '@' in it? Like:
>
>   wget ftp://userid:password@[EMAIL PROTECTED]:21/pub/incoming/blah.zip
>
> does not work, because the two '@'s confuse the parser. Thanks :)
> -turgut
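For reference, the %40 trick is just percent-encoding of the userinfo part
of the URL; a minimal Python sketch of building such a URL on the client
side (the credentials and host below are made up):

```python
from urllib.parse import quote

def ftp_url(user, password, host, path):
    # Percent-encode everything reserved in userinfo, including '@' and ':',
    # so the URL parser sees only one '@' separating userinfo from the host.
    return "ftp://%s:%s@%s%s" % (quote(user, safe=""),
                                 quote(password, safe=""),
                                 host, path)

print(ftp_url("userid", "p@ss", "ftp.example.com", "/pub/incoming/blah.zip"))
# ftp://userid:p%40ss@ftp.example.com/pub/incoming/blah.zip
```

whether a given wget build decodes the %40 back before logging in is exactly
what the thread is about, so this only fixes the parsing side.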
WNP for cvs sources of Dec 9, 2001
hi! here is the latest wget-new-percentage patch:

  http://www.biscom.net/~cade/away/wget-new-percentage/wget-new-percentage-cvs-20011209.tar.gz

any feedback is welcome, thanx!

P! Vladi.

--
Vladi Belperchinov-Shabanski
[EMAIL PROTECTED] [EMAIL PROTECTED]
Personal home page at http://www.biscom.net/~cade
DataMax Ltd. http://www.datamax.bg
Too many hopes and dreams won't see the light...
Re: --random-wait
hi!

  --random-wait=20      between 0 and 20 secs
  --random-wait=20..30  between 20 and 30 (closed-closed) :)))

or of course: --random-wait=20-30

P! Vladi.

Herold Heiko wrote:
> I couldn't test that option yet, but it seems to use 0 .. 2*opt.wait for
> its random waiting. What if somebody wants to wait a longer time with a
> different random interval... say wait from 20..30 seconds? Possibly it
> would be better giving random-wait a numeric option, and using something
> like --wait=25 --random-wait=10:
>
>   waitsecs = opt.wait
>              + ( random() % (opt.randomwait + 1) )
>              - opt.randomwait / 2;
>   if ( waitsecs < 0 ) waitsecs = 0;
>
> Heiko
>
> --
> PREVINET S.p.A.            [EMAIL PROTECTED]
> Via Ferretto, 1            ph  x39-041-5907073
> I-31021 Mogliano V.to (TV) fax x39-041-5907087
> ITALY
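For what it's worth, Heiko's base-plus-jitter arithmetic and the lo..hi
syntax above pick from the same range; a small Python sketch (the function
names are mine, not wget's):

```python
import random

def wait_closed(lo, hi):
    """Pick a wait in the closed interval [lo, hi], as --random-wait=lo..hi would."""
    return lo + random.randrange(hi - lo + 1)

def wait_centered(wait, rand):
    """Heiko's variant: base wait plus jitter of about +/- rand/2, clamped at zero."""
    w = wait + random.randrange(rand + 1) - rand // 2
    return max(w, 0)

# --random-wait=20..30 and --wait=25 --random-wait=10 cover the same range
assert all(20 <= wait_closed(20, 30) <= 30 for _ in range(100))
assert all(20 <= wait_centered(25, 10) <= 30 for _ in range(100))
```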
Re: Patch: --range switch implemented
hi! Here is my opinion (in case someone is really interested :)): make all
ranges 0-based and support a few syntaxes:

  --range=0..1024    -- closed-closed
  --range=0-1024     -- closed-open
  --range=1024+2048  -- take the 2nd and 3rd K's :), i.e. get 2k starting
                        at pos 1024 (well, the last one could be written
                        --range=2048@1024 just for fun)

implementation of all cases is trivial, and I cannot see why not have them
all?

P! Vladi.

Hrvoje Niksic wrote:
> Herold Heiko [EMAIL PROTECTED] writes:
>> However, off the top of my head I can't remember many occasions where
>> 0-n means closed-open
>
> There are. (And note that it's n-m in the general case, not just 0-n.)
> Off the top of my head: Java string subscripts, Lisp array-related
> functions, Python slices, various Emacs functions, etc. The Python
> example is easy to demonstrate:
>
>   >>> range(0, 10)
>   [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>
> Also:
>
>   >>> a = [0, 1, 2, 3, 4, 5]
>   >>> a[2:4]
>   [2, 3]
>
> This makes perfect sense to me, but not everyone would agree. The nicest
> thing about it is that it allows this:
>
>   >>> a[0:3] + a[3:]
>   [0, 1, 2, 3, 4, 5]
>
> I.e. you can construct the original interval by appending the touching
> subintervals. This is nice for downloads because it allows you to
> download 0-2k, 2k-5k, etc., without the one overlapping byte.
>
> Common Lisp:
>
>   [1]> (setq a '(0 1 2 3 4 5))
>   (0 1 2 3 4 5)
>   [2]> (subseq a 2 4)
>   (2 3)
>
> Perl avoids the potential confusion by having its substr take offset
> (0-based) and length, which is clear to everyone.
>
>> while there are at least Pascal and Perl where 0..n
>
> The Pascal reference is to 1..n, not 0..n. Which is one point you seem
> to have missed: IMHO [start, end] makes more sense with intervals that
> start with 1, and [start, end) makes more sense with intervals that
> start with 0.
>
>> 0..10  # 11 bytes including first one, like Perl, Pascal
>> 1-10   # 10 bytes including first one
>
> 1-10 is what I meant by the Pascal way, because most Pascal programs use
> 1-based arrays. Again, assuming we want to download 16 bytes, the three
> options are, in my order of preference:
>
>   1 .. 16  # end-closed 1-based, Pascal-like
>   0 .. 15  # end-closed 0-based, Perl-like
>   0 .. 16  # end-open   0-based, Python-like
>
> Maybe we should support all 3, but document only one in --help? That way
> most users will not notice the complexity. Also, the first option could
> well be ignored since 1-based arrays are for wimps. :-)
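Hrvoje's no-overlap/no-gap property of closed-open ranges is easy to check
mechanically; a small Python sketch (the helper is hypothetical, not part
of any patch):

```python
def split_ranges(total, chunk):
    """Split [0, total) into closed-open (start, end) pairs of at most chunk bytes."""
    return [(i, min(i + chunk, total)) for i in range(0, total, chunk)]

data = bytes(range(10))
parts = split_ranges(len(data), 4)   # [(0, 4), (4, 8), (8, 10)]

# touching subintervals rebuild the original with no overlapping byte
assert b"".join(data[s:e] for s, e in parts) == data
```

with closed-closed ranges the same split would either overlap by one byte
at each boundary or need "+1" bookkeeping everywhere, which is the whole
argument for the Python-like option.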
Re: wget suggestion
[EMAIL PROTECTED] wrote:
> hiya! i'd like to have wget forking into the background by default (via
> .wgetrc), but sometimes, e.g. in shell scripts, i need wget to stay in
> the foreground, so the script knows when the file is completely
> downloaded (well, after wget exits =). is it possible to implement such
> a feature? thanks in advance, wget rocks! greets, alex

you can get wget running in the background by adding `&' at the end, i.e.

  wget http://somewhere/file.txt &

if you don't add `&', wget will run in the foreground. you can still press
`ctrl+z' and use `bg' to send it to the background, or simply close the
terminal in which wget is running (that will also send wget to the
background and will even redirect all messages to a `wget-log' log
file)... well, all this is written somewhere in the docs, I'm sure :)

P! Vladi.
utime
hi! I have a question: is there a way to make wget not update the file
(modification) time according to the remote one? i.e. not to touch the
downloaded file's timestamp. as far as I can understand there is no such
option; am I right, or am I missing something...

thanx!

P! Vladi.
referer question
hi! is it possible (well, I mean an easy way :)) to make wget pass the
referer automatically? I mean, for every url that wget tries to fetch, to
pass the hostname as referer. for example, for
http://www.somewhere.org/path/etc/page.html the referer should be
`http://www.somewhere.org/'. well, this is no problem for single url's,
but when I try to mirror pages that link to other hosts, and any of those
hosts requires all pages to have a referer from the same host, then the
mirror fails... I expect this to be a switch of course, not default
behaviour...

P! Vladi.
Re: referer question
hi! well, I think you misunderstood, perhaps... I mean not to give a fixed
referer (the --referer switch) but to add something like --auto-referer
(possibly in wgetrc even), and wget will set the referer for each url it
processes to the host part of that same url. i.e. I'll give the example
once again: I run:

  wget --auto-referer -m http://host1/path

wget sets the referer for all host1's urls to `http://host1/'; if any page
links to `http://host2/path', wget will set the referer to
`http://host2/', etc... I'm sorry, but I wouldn't be able to explain it
further, I'm afraid :))) thanx to you both for the reply!

P! Vladi.

Jan Hnila wrote:
> Hello! To be able to use the referer switch, you must have a new version
> of wget - I'm not sure if 1.6 is enough; 1.7 certainly is, and 1.5.3 is
> not. (Get more info from http://wget.sunsite.dk) The switch is
> --referer=URL. Try to use it with the -d (debug) switch to see that it
> works. For example:
>
>   wget -d --referer=http://wget.sunsite.dk/wgetdev.html http://www.gnu.org
>
> Of course, adjust it to suit your needs. (And you can put this setting
> in your wgetrc file if you want - just omit the --.)
>
> Kind regards, Jan Hnila
>
> P.S. The advice suggested by Jens is actually something different - it
> is the identity of your browser - it lets you pretend that you are not
> wget but, for example, Netscape:
>
>   --user-agent=AGENT   (AGENT is the string you would like to be identified as)
>
> For example:
>
>   --user-agent="Mozilla/4.72 [en] (X11; U; Linux 2.2.17 i686)"
>   --user-agent="Mozilla/4.0 (compatible; MSIE 4.01; Mac_PowerPC)"
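The proposed derivation (take the scheme and host of the url itself) is a
one-liner; a Python sketch, where the function name and switch are the
proposal's hypotheticals, not an existing wget feature:

```python
from urllib.parse import urlsplit, urlunsplit

def auto_referer(url):
    """Referer a hypothetical --auto-referer would send: the url's own scheme + host."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, "/", "", ""))

print(auto_referer("http://host1/path/page.html"))   # http://host1/
print(auto_referer("http://host2/path"))             # http://host2/
```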
Re: -c -continue download
hi! this is a known bug for http... on 01 Apr 2001 Hrvoje Niksic wrote:

---cut---
I think I've got this fixed now. The patch I'll submit also fixes this
TODO entry:

* If -c is used on a file that's already completely downloaded, don't
  re-download it (unless normal --timestamping processing would cause you
  to do so).

Except I'm not sure it handles timestamping the way this TODO entry would
like to.
---cut---

so this is perhaps fixed in 1.7 or the latest CVS version...

P! Vladi.

Yun Zheng Hu wrote:
> hello, when you download a file with wget -c and it's not complete, it
> will resume... that's good. But if the file was successfully downloaded
> and you download it again with wget -c, it will restart the download!!
> is it possible to let wget skip the download if it's already complete?
>
> Yun Zheng Hu
fancy logs
hi! this is a (crazy) idea, but it could be useful (more or less): make
wget add lines like

  start-time  end-time  size  status  url

to a `central' log after downloading each file... `status' can be used to
determine if it was ok, a timeout, a closed connection, etc. the `central'
log could be ~/.wget_log, for example. perhaps not very comfortable when
mirroring, but it could add a single line for each url being mirrored. as
I said, this feature is perhaps arguable, but I'd like to know your
opinion on it... thanx!

P! Vladi.

--
Vladi Belperchinov-Shabanski
[EMAIL PROTECTED] [EMAIL PROTECTED]
Personal home page at http://www.biscom.net/~cade
DataMax Ltd. http://www.datamax.bg
No tears to cry, no feelings left...
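A sketch of what one such central-log line could look like; the field
layout is my reading of the `start-time end-time size status url' proposal
above, not an existing wget feature:

```python
import time

def central_log_line(start, end, size, status, url):
    """Format one 'start-time end-time size status url' line for the central log."""
    fmt = "%Y-%m-%d %H:%M:%S"
    return "%s  %s  %10d  %-8s  %s" % (
        time.strftime(fmt, time.gmtime(start)),
        time.strftime(fmt, time.gmtime(end)),
        size, status, url)

line = central_log_line(0, 61, 10240, "ok", "http://somewhere/file.txt")
print(line)
```

fixed-width fields keep the log greppable and sortable; the url goes last
because it is the only field of unbounded length.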
-c question
hi! `wget -c file' starts to download the file from the beginning if the
file is already completely downloaded... why?! I expect wget to do nothing
in this case: I wanted it to download the file to the end (i.e. to
continue, -c), and if the file is already here, there is nothing left to
do. there is nothing about this case in the man page, so I'd like to know
whether there is an explanation for this behaviour; otherwise I think it
should be considered `a bug'.

thanx for the attention!

P! Vladi.