Re: apt-get via Windows with wget
It seems one cannot use the wget .exe without the DLLs, even if one only wants to connect to plain http sites and not to any https sites. So one cannot simply click on the wget .exe from inside Unzip's file list.
RE: apt-get via Windows with wget
Hi Heiko!

> > Until now, I linked to your main page.
> > Would you mind if people short-cut this?
> Linking to the directory is bad since people would download

Sorry, I meant linking directly to the "latest" zip. However, I personally prefer to read what the provider (in this case you) has to say about a download anyway.

> Do link to the complete url if you prefer to, although I like to keep
> some stats.

Understood.

> for example since start of the year
> there have been 7 referrals from www.jensroesner.de/wgetgui

Wow, that's massive... ...not! ;-)

> Since that is about 0.05% stats shouldn't be
> altered too much if you link directly to the archive ;-)

Thanks for pointing that out ;-}

> > What do you think about adding a "latest-ssl-libraries.zip"?
> I don't think so.
> If you get the "latest complete wget" archive those are included anyway
> and you are sure it will work.

Oh, I'm very sorry, I must have overread/misunderstood that. I thought the "latest" zip would not contain the SSL libraries. That's great.

> I'd prefer not to force an unneeded (admittedly small) download by bundling
> the ssl libraries in every package.

Very true. Especially as wget seems to be used by quite a few people on slow connections.

Kind regards
Jens
RE: apt-get via Windows with wget
> From: "Jens Rösner" [mailto:[EMAIL PROTECTED]]
> Subject: RE: apt-get via Windows with wget

> Great! Thank you very much, Heiko.
> I think I'll use it on my wgetgui page as well! :)
> But what would you prefer?
> Until now, I linked to your main page.
> Would you mind if people short-cut this?

Linking to the directory is bad since people would download any file without reading the descriptions and then send me hatemail at the first problem (that's why I added the index on the sunsite.dk server, pointing to the main index). Do link to the complete url if you prefer to, although I like to keep some stats.

It is rather interesting how the downloads fluctuate between releases, while the emails I get have a similar frequency but shifted by about a month - as if most people sending me email wait at least a month after the download before sending, or as if the peak of downloads after a release is due to people who don't send me email (neither "thanks" nor "there's a problem" mails), while the sending users wait a month after a release. I believe it is about half-and-half (based on some observations from the period when I compiled about a release a week).

Maybe it is stupid, but I like to keep some of those stats. Some of them are now visible since I can't get the data from my provider anymore and switched to Nedstat Basic (which is public) - for example, since the start of the year there have been 7 referrals from www.jensroesner.de/wgetgui, about in line with what I saw before. Since that is about 0.05%, the stats shouldn't be altered too much if you link directly to the archive ;-)

> What do you think about adding a "latest-ssl-libraries.zip"?

I don't think so. If you get the "latest complete wget" archive those are included anyway and you are sure it will work. If you get anything else you need to have the _correct_ libraries or it probably won't work. Those libraries don't change too often anyway; at least I try not to compile them unless needed (security fixes mainly).
So, as I see it, people who want to get "the latest working stuff" download a complete working package. People who often get the latest cvs binary know what they want and can get the correct libraries if they don't have them already (normally they do). People who want some previous binary wouldn't use a "latest ssl stuff zip" anyway, since they need to take a look at the description - the "latest" could be the wrong combination for their chosen binary. And I'd prefer not to force an unneeded (admittedly small) download by bundling the ssl libraries in every package.

Bye
Heiko

-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
RE: apt-get via Windows with wget
Hello Heiko!

> I added a wget-complete-stable.zip, if you want to link to a fixed url
> use that, I'll update it whenever needed. Currently it is the same archive
> as the wget-1.9.1b-complete.zip.

Great! Thank you very much, Heiko. I think I'll use it on my wgetgui page as well! :) But what would you prefer? Until now, I linked to your main page. Would you mind if people short-cut this?

[SSL-enabled / plain binaries]
> the ssl version". As long as the libraries are placed somewhere in the path
> OR simply kept in the same directory where wget is, the ssl version is
> fine for everything after all.

I agree. What do you think about adding a "latest-ssl-libraries.zip"?

Kind regards
Jens
RE: apt-get via Windows with wget
> From: "Jens Rösner" [mailto:[EMAIL PROTECTED]]

> Note:
> Mail redirected from bug to normal wget list.

> > H> ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip,
> > OK, but too bad there's no stable second link .../latest.zip so I
> > don't have to update my web page to follow the link.
> Yep, this would make things much easier for applications like yours.

Dan, Jens,

I added a wget-complete-stable.zip; if you want to link to a fixed url use that, I'll update it whenever needed. Currently it is the same archive as the wget-1.9.1b-complete.zip.

> > Furthermore, they don't need SSL, but I don't see any 'diet'
> > versions...
> Right, Heiko is so kind to compile the SSL-enabled wget binaries.
> If you need it without SSL, you would have to compile it yourself.
> But since you don't have windows...

For some time I provided a binary both with and without ssl, then I started to get >15 emails/month saying "I can't dl https sitez" (change to your preferred hax0r speak) and grew tired of answering "read the description in the index, you want the ssl version". As long as the libraries are placed somewhere in the path OR simply kept in the same directory where wget is, the ssl version is fine for everything after all.

Heiko

-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
Re: apt-get via Windows with wget
Note: Mail redirected from bug to normal wget list.

> H> For getting Wget you might want to link directly to
> H> ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip,
> OK, but too bad there's no stable second link .../latest.zip so I
> don't have to update my web page to follow the link.

Yep, this would make things much easier for applications like yours. However, I think the current wget version can do all you'll ever need for that purpose.

> Furthermore, they don't need SSL, but I don't see any 'diet'
> versions...

Right, Heiko is so kind to compile the SSL-enabled wget binaries. If you need it without SSL, you would have to compile it yourself. But since you don't have Windows...

> H> Oh, and the Windows users should preferably be ones who know how to
> H> run a command-line application, but I assume you've got that covered.
> Exactly not. I recall being able to get to a little window where one
> enters a command... Anyway, can you give an example of all the steps
> needed to do wget -x -i fetch_list.txt -B
> http://debian.linux.org.tw/debian/pool/main/

Hm, why don't you do the following: download wget and the ssl libraries and unzip them on a Windows box (I know, you don't have one, but someone you know on this planet should have one; I heard it is fairly widespread). Unzip them to a sensible directory like c:\wget\ and add a startupdate.bat file to that directory. This file should read something like

wget -x -i fetch_list.txt -B http://debian.linux.org.tw/debian/pool/main/

Now pack everything into a zip again, preserving full folder info. (I always use PowerArchiver 6.1, the last freeware version.) Then create a self-extracting archive from it.
Distribute the archive.exe to your "buddies". All they have to do is:
a) double-click on the archive
b) browse to c:\wget\ with Windows Explorer
c) double-click on startupdate.bat
d) afterwards, do the CD writing

Thinking about it, you could distribute wget with the SSL libraries and the startupdate.bat file unzipped on a 1.44 MB floppy disk.

CU
Jens
http://www.jensroesner.de/wgetgui/
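The startupdate.bat described above could look like the following minimal sketch. The @echo off and pause lines are additions of mine (so the console window stays open when the download finishes, instead of vanishing before the user can read any errors); it assumes wget.exe, the SSL DLLs and fetch_list.txt all sit next to the batch file in c:\wget\:

```bat
@echo off
rem Sketch of startupdate.bat - assumes wget.exe, the SSL DLLs and
rem fetch_list.txt are all in the same directory (e.g. c:\wget\).
rem -x recreates the directory hierarchy locally,
rem -i reads the URLs to fetch from the given file,
rem -B resolves relative entries in that file against the base URL.
wget -x -i fetch_list.txt -B http://debian.linux.org.tw/debian/pool/main/
rem Keep the window open so the user can see whether anything failed.
pause
```

After step c) above runs this file, the fetched tree under c:\wget\ is what gets burned to CD in step d).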
Re: apt-get via Windows with wget
H> For getting Wget you might want to link directly to
H> ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip,

OK, but too bad there's no stable second link .../latest.zip so I don't have to update my web page to follow the link.

Furthermore, they don't need SSL, but I don't see any 'diet' versions...

H> Oh, and the Windows users should preferably be ones who know how to
H> run a command-line application, but I assume you've got that covered.

Exactly not. I recall being able to get to a little window where one enters a command... Anyway, can you give an example of all the steps needed to do

wget -x -i fetch_list.txt -B http://debian.linux.org.tw/debian/pool/main/

You probably could add this example to the web page too (without the [] lines):

[Click on fetch_list.txt; save it to a file.]
Click on the ..wget...zip URL
Unzip it [yes, they can get this far, I remember]
then what
then what
wget [options]
nero [OK, they can handle that.]
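As an aside on what that command does: wget's -B (--base) option resolves relative entries in the -i input file against the given base URL, so fetch_list.txt can contain short relative paths instead of 500 full URLs. A hypothetical Python sketch of just that resolution step (an illustration only, not wget's actual code; resolve_fetch_list and the package file names are made up):

```python
from urllib.parse import urljoin

def resolve_fetch_list(lines, base):
    """Resolve each (possibly relative) URL in a fetch list against a
    base URL, roughly as wget's -B option does for -i input files."""
    return [urljoin(base, line.strip()) for line in lines if line.strip()]

base = "http://debian.linux.org.tw/debian/pool/main/"
# Made-up example entries, as they might appear in fetch_list.txt:
urls = resolve_fetch_list(["a/apt/apt_0.5.4.deb", "b/bash/bash_2.05.deb"], base)
# Each relative entry now carries the full base URL prefix.
```

Combined with -x, wget then recreates the directory hierarchy of those URLs locally, which is handy when the tree is burned to CD afterwards.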
Re: apt-get via Windows with wget
Dan Jacobson <[EMAIL PROTECTED]> writes:

> I suppose Windows users don't have a way to get more than one file at
> once, hence to have a Windows user download 500 files and burn them
> onto a CD, as in
> http://jidanni.org/comp/apt-offline/index_en.html
> so one needs wget?

Yes, either Wget or some other third-party application. I'd assume that there are many applications covering that particular niche, but the apparent popularity of Wget under Windows seems to speak otherwise. GetRight, a full-featured application, comes to mind, but I don't really know if it has the feature you need: getting a bunch of URLs from a large list. (I'd assume it does, but I can't guarantee it.)

> Any tips on the concept in my web page? I don't have Windows to try
> it. Certainly something will go wrong?

I don't see anything obvious that might go wrong. Nero will burn "long" file names with Joliet extensions, which any modern Linux will be happy to read.

For getting Wget you might want to link directly to ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip, so the user doesn't have to browse through Herold's page to see what it's about and what to get.

Oh, and the Windows users should preferably be ones who know how to run a command-line application, but I assume you've got that covered.