Mauro Tortonesi wrote:
On Tue, 2 Sep 2003, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Tue, 2 Sep 2003, Jeremy Reeve wrote:
I've written a trivial patch to implement the --disable-dns-cache
feature
as described in the TODO contained in the CVS tree. I need to write
niurui wrote:
Hi, all
I want to embed wget in my program. First my program will use wget to get
a file from the internet, then it will search this file for some words. So I
want to save that file in memory rather than on hard disk.
Can wget save the file in memory? How?
Wget is totally the
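A common workaround, sketched here in shell (this is not part of wget itself; the URL and the word `foo` are placeholders): run wget with `-O -` so the document goes to stdout and is searched in a pipeline, never touching disk.

```shell
# Count occurrences of a word in a document read from stdin.
count_word() {
    grep -o "$1" | wc -l
}

# wget -q -O - writes the downloaded file to stdout instead of a file,
# so the page can be searched entirely in the pipe:
#   wget -q -O - http://example.com/page.html | count_word foo
```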
Sherwood Botsford wrote:
I wish I had a file exclude option.
I'm behind a firewall that doesn't allow ftp, so I have to find
sites that use http for file transfer.
I'm currently trying to install cygwin on my local lan network.
To do that, I'm using wget to mirror the remote site locally,
David Balazic wrote:
As I got no response on [EMAIL PROTECTED], I am resending my report
here.
One forwards to the other. The problem is that the wget maintainer is
absent, and likely to continue to be so for several more months.
As a result, wget development is effectively stalled.
Max.
Kalin KOZHUHAROV wrote:
Dieter Kuntz wrote:
i will test with --http-user..
OK, I think you will not manage it this way.
What we are talking about here is form submission, not just a password.
Your password happens to be part of a form.
So first look at the html source of the page where
Nick Earle wrote:
Can someone let me know if and when this feature is going to be
implemented within WGet.
I am currently using WGet on a Windows platform and this is the only
feature that appears to be missing from this utility.
wget development seems to be at a standstill. AFAIK, there is
Nick Earle wrote:
Can someone let me know if and when this feature is going to be
implemented within WGet.
If and when someone decides to implement it. But there is almost certainly
not going to be another release until after Hrvoje Niksic has returned.
Max.
Aaron S. Hawley wrote:
On Fri, 14 Feb 2003, Max Bowsher wrote:
If and when someone decides to implement it. But there is almost
certainly not going to be another release until after Hrvoje Niksic
has returned.
Can someone at FSF do something? [EMAIL PROTECTED], [EMAIL PROTECTED
Praveen wrote:
Hi there,
I am trying to download an asp file. It gives me the HTML output of
the file. Is it possible to download the source code through wget?
No.
Max.
- Original Message -
From: Ken Senior [EMAIL PROTECTED]
There does not seem to be support to change disks when accessing a VMS
server via wget. Is this a bug or just a limitation?
Wget does plain old HTTP and FTP. I know nothing about VMS. Does it have
some strange syntax for discs?
deeps [EMAIL PROTECTED] wrote:
I just want to see the source files for this java chat application,
!?!?!?!
Java is a compiled language! The source files probably aren't even on the
webserver.
Max.
DennisBagley [EMAIL PROTECTED] wrote:
ok - am using wget to mirror an ftp site [duh]
and would like it not only to keep an up to
date copy of the files [which it does beautifully]
but also remove files that are no-longer on the ftp server
?? Is this possible ???
Use a perl script.
Max.
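The "perl script" approach can equally be sketched in shell, assuming the current remote filenames have already been extracted into a file, one name per line (for FTP mirrors, wget's .listing files are one possible source). The file names `remote.lst` and `local.lst` are placeholders.

```shell
# Print local files that no longer appear in the remote listing.
# $1 = file with current remote filenames, one per line
# $2 = file with current local filenames, one per line
stale_files() {
    sort "$1" > remote.sorted
    sort "$2" > local.sorted
    # comm -13 keeps only lines unique to the second (local) list
    comm -13 remote.sorted local.sorted
    rm -f remote.sorted local.sorted
}

# Then delete them, e.g.:
#   stale_files remote.lst local.lst | xargs -r rm --
```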
It's already fixed in CVS for 1.9
Max.
Ivan A. Bolsunov [EMAIL PROTECTED] wrote:
version: 1.8.1
in file: html-url.c
in function:
tag_handle_meta()
{
... skipped ...
char *p, *refresh = find_attr (tag, "content", &attrind);
int timeout = 0;
for (p = refresh; ISDIGIT
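For context, the value being parsed in that function is the content attribute of a meta refresh tag, which has the form "5; url=http://...". Roughly the same parsing, sketched in shell (parse_refresh is a made-up helper, not wget code):

```shell
# A meta refresh value has the form "<seconds>; url=<target>".
parse_refresh() {
    timeout=${1%%;*}     # the digits before the first ';'
    url=${1#*url=}       # everything after "url="
    printf '%s %s\n' "$timeout" "$url"
}
```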
ROMNEY Jean-Francois [EMAIL PROTECTED] wrote:
I can't download files with wget 1.8.1 by using wildcards.
Without wildcards, it works. The option --glob=on seems to have no
effect. The command is :
wget -d --glob==on -nc
ftp://--:---;ftpcontent.mediapps.com/francais/journal/eco/*.jpg
ROMNEY Jean-Francois [EMAIL PROTECTED] wrote:
There is the same error with only one =
What do you mean by Escape/quote?
Read the documentation for your shell.
wget never sees the * because the shell has already expanded it.
Also, please keep discussions on-list.
Max.
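To make the quoting point concrete (first_arg is a made-up helper, and the host and path are placeholders): a wildcard only reaches a command as a literal * when the shell is prevented from expanding it.

```shell
# Print the first argument exactly as the command received it.
first_arg() { printf '%s\n' "$1"; }

first_arg 'ftp://host/dir/*.jpg'    # quoted: the command sees the literal *
# So the invocation should be:
#   wget -nc --glob=on 'ftp://host/dir/*.jpg'     (note: a single '=')
```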
Yun MO wrote:
Dear Ma'am/sir,
I could not get all files with the "wget -r" command for the following
address. Would you help me?
Thank you in advance.
M.Y.
---
&lt;meta NAME="robots" CONTENT="noindex,nofollow"&gt;
Wget is obeying the robots instruction.
wget -e robots=off ...
Thomas Lussnig wrote:
this is a Windows-specific problem. Normally prn.html should be a valid
filename.
And as you can check, long filenames can contain ':'.
No they can't. And on NTFS, including a ':' in a filename causes the data to be
written into an invisible named stream.
But there is an
[EMAIL PROTECTED] wrote:
We're using the wget app to run our scheduled tasks. Each time it runs, a
copy of the file is created with a number added to the end of it. Is there
a way to turn this off? We tried adding --quiet to the bat file but it
still wrote the file.
-nc or -N depending on
Try the source I sent you.
Dominique wrote:
thank you Max,
np
Is it different than the one I CVS-ed yesterday? I mean, does it have
changes in creating filenames? Please note that I finally compiled it
and could run it.
No changes... I did run autoconf, so you could go straight to
Max Bowsher wrote:
The problem is that with a ?x=y, where y contains slashes, wget passes them
unchanged to the OS, causing directories to be created, but fails to adjust
relative links to account for the fact that the page is in a deeper directory
than it should be. The solution is to map
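One way to implement a mapping like this, sketched in shell (encode_query_slashes is a hypothetical helper): percent-encode the slashes in the query part of the URL, since %2F is the encoding of '/', so no directories get created for them.

```shell
# Replace '/' with %2F, but only after the first '?' in the URL.
encode_query_slashes() {
    case $1 in
        *\?*)
            path=${1%%\?*}      # everything before the first '?'
            query=${1#*\?}      # everything after it
            printf '%s?%s\n' "$path" "$(printf '%s' "$query" | sed 's|/|%2F|g')"
            ;;
        *)  printf '%s\n' "$1" ;;   # no query string: leave untouched
    esac
}
```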
As a dial-up user, I find it extremely useful to have access to the
full range of cvs functionality whilst offline. Some other projects
provide read-only rsync access to the CVS repository, which allows a
local copy of the repository to be made, not just a checkout of a
particular version.
Dominique wrote:
tryit_edit.asp?filename=tryhtml_basic&referer=http://www.w3schools.com/html/html_examples.asp
and just this one is truncated. I think some regexp or pattern or
explicit list of where_not_to_break_a_string characters would solve
the problem. Or maybe it is already possible,
-e robots=off
Jon W. Backstrom wrote:
Dear Gnu Developers,
We just ran into a situation where we had to spider a site of our
own on an outsourced service because the company was going out of
business. Because wget respects the robots.txt file, however, we
could not get an archive made
Thomas Lussnig wrote:
Why should there be URL encoding?
'/' is a legal character in a URL and in the GET string.
It's used, for example, for path-to-query translation.
The main problem is that wget needs to translate a URL to a
filesystem name.
Yes, you are right, I wasn't thinking clearly.
You don't give a whole lot of information. It's kind of impossible to help when
we don't know what the problem is.
Posting the URL of the problem site would be a good idea.
Max.
Dominique wrote:
Is it possible at all?
dominique
Dominique wrote:
Hi,
I have a problem trying to wget a
Christopher Stone wrote:
Thank you all.
Now the issue seems to be that it only gets the root
directory.
I ran 'wget -km -nd http://www.mywebsite.com
-r
Max.
Jens Rösner wrote:
Hi!
Max' hint is incorrect I think, as -m includes
-N (timestamps) and -r (recursive)
Ooops, you're right. I tend not to use -m much myself. I should pay more
attention!
Max.
Christopher Stone wrote:
When I ran wget and sucked the site to my local box,
it pulled all the pages down and the index page comes
up fine, but when I click on a link, it goes back to
the remote server.
What switch(s) do I use, so that when I pull the pages
to my box, that all of the
Vernon, Clayton wrote:
I'm having all kinds of difficulties downloading from servlets.
Here is a script for a URL that works in a browser to download a ZIP
file but doesn't work via Wget
http://oasis.caiso.com/servlet/SingleZip?nresultformat=5squeryname=SLD_LOAD
HTTP does not provide a dirlist command, so wget parses html to find other files
it should download. Note: HTML not XML. I suspect that is the problem.
Max.
Funk Gabor wrote:
I recently found that during a (wget) mirror, not all the files are
downloaded. (wget v1.8.2 / debian) For example:
Funk Gabor wrote:
HTTP does not provide a dirlist command, so wget parses html to find
other files it should download. Note: HTML not XML. I suspect that
is the problem.
If wget wouldn't download the rest, I'd say that too. But first the dir
gets created, the xml is downloaded (in some other
Hartwig, Thomas wrote:
I got an assert failure from wget in retr.c, in the function calc_rate,
because msecs is 0 or less than 0 (in rare cases).
I don't know why; perhaps because I have a fast line to the server or
the wrong OS. To work around this I patched retr.c, setting
msecs = 1 if equal