wget sorts IPs from lowest to highest when multiple IPs are linked to one hostname
Hi. With the recent release of Linux 2.4.28, which fixes some security issues, I quickly set out to grab a copy so that I could update my distribution. However, I was having a problem because the file isn't on all the mirrors yet. So I put wget in a simple while loop to get around the "if the file is not found, quit retrying" behaviour. It should be a safe thing to do, since ftp.us.kernel.org uses a round-robin DNS scheme, which means that every time you look up the host, it returns the list of IPs in a different order. However, it seems like wget sorts the IPs it gets from the resolver from lowest to highest. I looked through the manpage but didn't see a way to change this behaviour.

Very simple while loop:

    while [ ! -f linux-2.4.28.tar.bz2 ]; do
        wget -c ftp://ftp.us.kernel.org/pub/linux/kernel/v2.4/linux-2.4.28.tar.bz2
    done

    Resolving ftp.us.kernel.org... 128.46.156.117, 128.105.103.12, 128.118.2.96, ...
    Connecting to ftp.us.kernel.org[128.46.156.117]:21... connected.

    Resolving ftp.us.kernel.org... 66.230.217.253, 69.31.98.210, 128.30.2.36, ...
    Connecting to ftp.us.kernel.org[66.230.217.253]:21... connected.

While this seems to work fine after a while on Linux (maybe because my setup isn't caching DNS, or my DNS server works differently), I have a problem running this from work:

    Resolving ftp.us.kernel.org... 144.92.104.38, 144.174.32.40, 155.98.64.81, ...
    Connecting to ftp.us.kernel.org[144.92.104.38]:21... connected.

This just happens over and over; it won't connect to any IP other than 144.92.104.38. But when I use dig, the order is definitely different.

Something else to note: at work I'm trying this on OS X 10.3.6 with wget 1.9.1, which was provided by 'fink'. On my Linux box I'm also running wget 1.9.1.

Either way, why is wget sorting the list of IPs it gets from the resolver? I feel that if wget weren't sorting the IPs, this problem wouldn't happen. But maybe there is a reason for it?

Thanks,
-miah
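One way around the single-IP pinning, assuming `dig` is available, is to resolve the round-robin name yourself and try each returned address directly, so the retry loop is not stuck on whichever IP wget picks first. The helper function name and the back-off interval are made up for illustration; the host and path are the ones from the loop above, and the `dig`/`wget` calls assume network access.

```shell
# Hypothetical helper: try every address behind a round-robin name
# until the file exists locally.
fetch_from_each_mirror() {
    host=$1 path=$2 file=$3
    while [ ! -f "$file" ]; do
        for ip in $(dig +short "$host"); do
            wget -c "ftp://$ip$path" && return 0
        done
        sleep 30   # back off before re-resolving
    done
}

# Usage, with the URL from the post:
# fetch_from_each_mirror ftp.us.kernel.org \
#     /pub/linux/kernel/v2.4/linux-2.4.28.tar.bz2 linux-2.4.28.tar.bz2
```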
[[EMAIL PROTECTED]: Re: wget sorts IPs from lowest to highest when multiple IPs are linked to one hostname]
It's one of those days =) I'm not on the list, so when I replied to my own post, I was the only one that would get it. -miah

----- Forwarded message from miah [EMAIL PROTECTED] -----

Date: Wed, 17 Nov 2004 11:51:26 -0500
From: miah [EMAIL PROTECTED]
To: miah [EMAIL PROTECTED]
Subject: Re: wget sorts IPs from lowest to highest when multiple IPs are linked to one hostname

Maybe I'm a little quick to say this is a bug. Looking at the way round-robin DNS works a bit more, this is probably not a bug. Maybe the real problem here is that I was trying to get around what I felt was a problem: wget not being able to retry if a file isn't found and try the next host in the list.

I'm not sure if DNS servers themselves do some sorting on a zone when it's loaded, but looking at microsoft.com and yahoo.com (two other domains that use round-robin DNS), I can see that the list will always be the same; it just moves up and down. As to why I'm having the problem on my Mac, with it trying the same host over and over, I can only guess that the local resolver is caching.

-miah

On Wed, Nov 17, 2004 at 11:06:38AM -0500, miah wrote:
> [quoted original message snipped; see above]

----- End forwarded message -----
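The "list will always be the same, it just moves up and down" behaviour described above can be simulated: the record set stays fixed and only the starting offset rotates on each query. This is just an illustration of round-robin rotation using the first three addresses quoted in the original post, not real resolver output.

```shell
#!/bin/bash
# Simulate round-robin DNS rotation: the record set never changes;
# only the starting offset advances on each query. (IPs from the post.)
ips=(128.46.156.117 128.105.103.12 128.118.2.96)
n=${#ips[@]}
rotations=()
for ((start = 0; start < n; start++)); do
    line=""
    for ((i = 0; i < n; i++)); do
        line+="${ips[(start + i) % n]} "
    done
    rotations+=("${line% }")
    echo "query $((start + 1)): ${line% }"
done
```

Each "query" returns the same three addresses, shifted by one, which is why a client that always takes the first answer still spreads load across mirrors, unless a caching resolver keeps handing back the same rotation.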
Windows and 2 GB downloads
Wget (Windows) seems to have problems with downloads larger than 2 GB. --version: GNU Wget 1.9+cvs-dev-200404081407; tried 1.9.1(a) too. Screen (watch the length):

    D:\>wget ftp://ftp-stud.fht-esslingen.de/pub/Mirrors/fedora.redhat.com/linux/core/3/i386/iso/FC3-i386-DVD.iso
    --17:02:12--  ftp://ftp-stud.fht-esslingen.de/pub/Mirrors/fedora.redhat.com/linux/core/3/i386/iso/FC3-i386-DVD.iso
               => `FC3-i386-DVD.iso'
    Resolving ftp-stud.fht-esslingen.de... 129.143.116.10
    Connecting to ftp-stud.fht-esslingen.de|129.143.116.10|:21... connected.
    Logging in as anonymous ... Logged in!
    ==> SYST ... done.    ==> PWD ... done.
    ==> TYPE I ... done.  ==> CWD /pub/Mirrors/fedora.redhat.com/linux/core/3/i386/iso ... done.
    ==> PORT ... done.    ==> RETR FC3-i386-DVD.iso ... done.
    Length: -1,828,556,800 (unauthoritative)

    [ =            ] 195,840    163.18K/s

When using -c on the already completed download, wget restarts at -1,828,556,800 and indeed downloads/appends an additional 1.8 GB.

regards,
Joachim Otahal

--
KONTAKT: LW Datentechnik GmbH
Joachim Otahal, Technik
Pfarrberg 1, 70794 Filderstadt
TELEFON 0711/709590, 0711/7089098
MOBIL 0172/7728181
FAX 0711/7089099
MAIL otahal_a_t_lw-datentechnik.de
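The negative length is consistent with the file size being held in a signed 32-bit integer. Working backwards from the reported value, the true size would be 2,466,410,496 bytes (about 2.3 GiB, plausible for a DVD ISO; the exact figure is an inference from the report, not stated in the post): any size at or above 2^31 bytes wraps negative. A quick sketch of the two's-complement wraparound:

```shell
# Reproduce the broken length: a true size >= 2^31 bytes, squeezed into
# a signed 32-bit integer, comes out negative. 2466410496 is inferred
# from the reported value (true size = reported length + 2^32).
size=2466410496
if [ "$size" -ge $(( 1 << 31 )) ]; then
    signed=$(( size - (1 << 32) ))   # two's-complement wraparound
else
    signed=$size
fi
printf 'Length: %d\n' "$signed"
```

This also explains the -c behaviour: if the recorded length is negative, the "remaining bytes" arithmetic is nonsense, so wget happily appends past the real end of the file.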
RE: Newbie needs to start wget in background
Hi again, and thank you very much for your help with my little problem. I have succeeded in making wget run in the background using WinCron. nnCron actually looked pretty sweet, but WinCron was free, and I work for cheapskates ;)

I installed WinCron normally, and then as a service (per their docs), and wrote a job that calls wget from the command line at regular intervals to retrieve updates to our files. WinCron jobs can be started with service start/restart, and they can be passed a switch to suppress the console window when spawning a new process. Everything runs in the background; no console window ever appears. Problem solved.

Thanks again,
Mike

-----Original Message-----
From: Mathias Wittwer [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, November 10, 2004 2:38 PM
To: 'Herold Heiko'; 'Mike Andersen'; [EMAIL PROTECTED]
Subject: RE: Newbie needs to start wget in background

www.nncron.ru gives you the possibility to run any program as a service. That will help solve your issue! US$29.95 for a license.

-----Original Message-----
From: Herold Heiko [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, 10 November 2004 5:40 a.m.
To: 'Mike Andersen'; [EMAIL PROTECTED]
Subject: RE: Newbie needs to start wget in background

This is not a wget problem. Your task scheduler runs wget in the foreground, over any console application (the movie) you are currently running. Then wget immediately, and correctly, puts itself in the background; the window closes and your previous topmost application (the movie) is topmost again (although possibly without keyboard/mouse input focus!). You need to investigate how to run wget in a different way. With the old scheduler (Windows NT 4) you could configure the service accordingly; I don't know the impact of that on the IE5.5/W2K/XP Task Scheduler (which replaced the previous scheduler).

Heiko

--
PREVINET S.p.A. www.previnet.it
Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
+39-041-5907073 ph
+39-041-5907472 fax

-----Original Message-----
From: Mike Andersen [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, November 10, 2004 12:19 AM
To: [EMAIL PROTECTED]
Subject: Newbie needs to start wget in background

Hi,

Can anyone on the list tell me how to have wget start and run entirely in the background on Windows XP, so that no console window ever opens, not even briefly? I would like to use wget for automatically updating files on Windows XP client machines, but it needs to be completely invisible to the user. The client machines will be playing small movies, which will update from time to time, and wget seems like a natural solution for having the client machines pull down their own updates.

My strategy so far has been to use the Windows Task Scheduler to schedule, at regular intervals, a command-line call to wget, like this:

    C:\WINDOWS\wget.exe -b -q -m -np -nH -nd -l 1 -Pc:/test http://www.mydomain.com/movies/

As you can see, I'm using the -b (background) and -q (quiet) options, and this *almost* works. A Windows console appears for just a fraction of a second, but it appears on top of the movie, and I know my client will not accept that. I even tried putting the & on the end of the command, but no dice. I've also tried using the -o option, but that didn't help either.

I've looked through the list and noticed a thread or two about making wget run completely in the background, but this appears to involve patching wget; I'm not sure if that works on Windows, and I doubt I have the know-how to patch the application correctly. And that's assuming that a version of wget that runs entirely in the background actually exists.

I'd really appreciate it if anyone on the list could help me out with this. I'm not subscribed to the list, so please copy me in your responses: [EMAIL PROTECTED].

Many Thanks,
Mike