Is it enough to report it to [EMAIL PROTECTED]? How can I check
whether there is any development plan for fixing this bug?
Gang
Ray Rodriguez wrote:
I think it would be safe to call this a bug, but I seem to recall having
seen something about this before.
On Tue, 28 Mar 2006, gang wu wrote:
Thanks
* [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> wget -e robots=off -r -N -k -E -p -H http://www.gnu.org/software/wget/
>
> soon leads to non wget related links being downloaded, eg.
> http://www.gnu.org/graphics/agnuhead.html
In that particular case, I think --no-parent would solve the
problem.
Hrvoje Niksic wrote:
The regexp APIs found on today's Unix systems
might be usable, but unfortunately they are not available on Windows.
My personal idea on this is to enable regex on Unix and disable it on
Windows.
We all use Unix/Linux, and regex is really useful. I think not having
On Thursday 30 March 2006 13:42, Tony Lewis wrote:
> Perhaps --filter=path,i:/path/to/krs would work.
That looks to be the most elegant method. I do hope that the (?i:) and
(?-i:) constructs are supported, since I may not want the entire path/file to
be case (in)?sensitive =), but that will
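For what it's worth, scoped inline flags of exactly this shape do exist in some engines; a quick sketch in Python, whose re module accepts (?i:...) groups, used here purely for illustration (the /path/to/krs path is the hypothetical example from the thread):

```python
import re

# Scoped inline flags: only the group marked (?i:...) ignores case.
# Here only the final component "krs" is case-insensitive; the rest of
# the path must match exactly.
pattern = re.compile(r"/path/to/(?i:krs)/")

assert pattern.search("/path/to/KRS/file.txt") is not None
assert pattern.search("/path/to/krs/file.txt") is not None
# Outside the scoped group, matching stays case-sensitive:
assert pattern.search("/PATH/to/krs/file.txt") is None
```

Whether wget's chosen regex implementation supports the same syntax is a separate question, of course.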
Curtis Hatter wrote:
> Also any way to add modifiers to the regexs?
Perhaps --filter=path,i:/path/to/krs would work.
Tony
* Jim Wright <[EMAIL PROTECTED]> wrote:
> Suppose you want files from some.dom.com://*/foo/*.png. The
> part I'm thinking of here is "foo as last directory component,
> and png as filename extension." Can the individual rules be
> combined to express this?
Only one rule is needed for that pattern
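A sketch of such a combined rule, using Python's re purely to illustrate; the expression /foo/[^/]+\.png$ is my guess at the single rule, not actual wget --filter syntax:

```python
import re

# One rule expressing both constraints from the example:
# "foo" as the last directory component AND "png" as the extension.
# [^/]+ cannot cross a "/", so "foo" must be the directory that
# immediately contains the file.
rule = re.compile(r"/foo/[^/]+\.png$")

assert rule.search("http://some.dom.com/a/b/foo/image.png") is not None
assert rule.search("http://some.dom.com/foo/sub/image.png") is None  # foo not last dir
assert rule.search("http://some.dom.com/a/foo/image.jpg") is None    # wrong extension
```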
* Mauro Tortonesi <[EMAIL PROTECTED]> wrote:
> wget -r --filter=-domain:www-*.yoyodyne.com
This appears to match "www.yoyodyne.com", "www--.yoyodyne.com",
"www---.yoyodyne.com", and so on, if interpreted as a regex.
It would most likely also match "www---zyoyodyneXcom". Perhaps
you want glob matching instead.
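To make the difference concrete, a small Python sketch, with fnmatch standing in for glob semantics (the hostnames are the examples above):

```python
import fnmatch
import re

host = "www---zyoyodyneXcom"

# Read as a regex, "www-*.yoyodyne.com" means: "www", zero or more "-",
# any single character, "yoyodyne", any single character, "com" --
# far looser than the command line intends.
assert re.fullmatch(r"www-*.yoyodyne.com", host) is not None

# Read as a glob, "*" means "any run of characters" and "." is a
# literal dot, which is almost certainly what was meant.
assert not fnmatch.fnmatchcase(host, "www-*.yoyodyne.com")
assert fnmatch.fnmatchcase("www-static.yoyodyne.com", "www-*.yoyodyne.com")
```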
On Thursday 30 March 2006 11:49, you wrote:
> How many keywords do we need to provide maximum flexibility on the
> components of the URI? (I'm thinking we need five.)
>
> Consider http://www.example.com/path/to/script.cgi?foo=bar
>
> --filter=uri:regex could match against any part of the URI
> --fi
How many keywords do we need to provide maximum flexibility on the
components of the URI? (I'm thinking we need five.)
Consider http://www.example.com/path/to/script.cgi?foo=bar
--filter=uri:regex could match against any part of the URI
--filter=domain:regex could match against www.example.com
--
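A rough sketch of how such a five-way split could fall out of a URI, using Python's urllib.parse purely for illustration. The keyword names mirror the proposal; since the message is cut off after the second keyword, path, file, and query as the remaining three are my assumption:

```python
from urllib.parse import urlparse

def filter_parts(uri):
    # Split a URI into the five hypothetical --filter keywords.
    p = urlparse(uri)
    path, _, file = p.path.rpartition("/")
    return {
        "uri": uri,            # the whole URI
        "domain": p.netloc,    # www.example.com
        "path": path,          # directory portion: /path/to
        "file": file,          # last component: script.cgi
        "query": p.query,      # foo=bar
    }

parts = filter_parts("http://www.example.com/path/to/script.cgi?foo=bar")
assert parts["domain"] == "www.example.com"
assert parts["path"] == "/path/to"
assert parts["file"] == "script.cgi"
assert parts["query"] == "foo=bar"
```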
On Wednesday 29 March 2006 12:05, you wrote:
> we also have to reach consensus on the filtering algorithm. for
> instance, should we simply require that a url passes all the filtering
> rules to allow its download (just like the current -A/R behaviour), or
> should we instead adopt a short circuit
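The two candidate algorithms can be sketched like this; the (pattern, verdict) rule list is hypothetical, none of this is actual wget code:

```python
import re

# Hypothetical rules: (compiled pattern, verdict-if-matched).
rules = [
    (re.compile(r"/private/"), False),  # deny anything under /private/
    (re.compile(r"\.png$"), True),      # allow PNG files
]

def short_circuit(url, default=False):
    # ACL style: the first rule whose pattern matches decides;
    # later rules are never consulted.
    for pattern, verdict in rules:
        if pattern.search(url):
            return verdict
    return default

def must_pass_all(url):
    # -A/-R style: download only if no deny-rule matches and at
    # least one allow-rule matches.
    return (all(p.search(url) is None for p, v in rules if not v)
            and any(p.search(url) for p, v in rules if v))

assert short_circuit("http://x/private/a.png") is False  # first match wins
assert short_circuit("http://x/img/a.png") is True
assert must_pass_all("http://x/private/a.png") is False
assert must_pass_all("http://x/img/a.png") is True
```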
On Thu, 30 Mar 2006, Mauro Tortonesi wrote:
>
> > I do like the [file|path|domain]: approach. very nice and flexible.
> > (and would be a huge help to one specific need I have!) I suggest also
> > including an "any" option as a shortcut for putting the same pattern in
> > all three options.
Hi,
there is a mistake in the French translation of wget --help (on Linux
Red Hat).
In English:
wget --help | grep spider
--spider don't download anything
It was translated into French this way:
wget --help | grep spider
--spider ne pas télécharger n'
(literally "do not download n'" -- the translated string is cut off).
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
> > I agree. Just how often will there be problems in a single wget run
> > due to both some.domain.com and somedomain.com present (famous last
> > words...)
>
> Actually it would have to be somedomain.com -- a "." will not match
> the null string
Herold Heiko <[EMAIL PROTECTED]> writes:
>> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
>> I don't think such a thing is necessary in practice, though; remember
>> that even if you don't escape the dot, it still matches the (intended)
>> dot, along with other characters. So for quick&dirty usag
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
> I don't think such a thing is necessary in practice, though; remember
> that even if you don't escape the dot, it still matches the (intended)
> dot, along with other characters. So for quick&dirty usage not
> escaping dots will "just work", and th
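A quick Python illustration of that point (the hostnames are invented):

```python
import re

# An unescaped "." in a domain pattern still matches the intended
# literal dot, so quick & dirty patterns "just work"...
assert re.fullmatch(r"some.domain.com", "some.domain.com") is not None

# ...but "." also matches any other single character, so the pattern
# accepts hosts the user never intended:
assert re.fullmatch(r"some.domain.com", "someXdomainXcom") is not None

# Note "." cannot match the null string, so "somedomain.com"
# (one character short) is still rejected:
assert re.fullmatch(r"some.domain.com", "somedomain.com") is None
```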
Herold Heiko <[EMAIL PROTECTED]> writes:
> Get the best of both, use a syntax permitting a "first match-exits"
> ACL, single ACE permits several statements ANDed together. Cooking
> up a simple syntax for users without much regexp experience won't be
> easy.
I assume ACL stands for "access control list".
Herold Heiko <[EMAIL PROTECTED]> writes:
> BTW any comments about the dots? Requiring escaped dots in domains would
> become old really fast; reversing the behaviour (\. = any char) would go
> against the principle of least surprise, since every other regexp syntax
> uses the opposite convention.
Modifying t
[Imagination running freely; I do not have a lot of experience designing
syntax, but I suffer a lot in a helpdeskish way trying to explain syntax to
users. Hopefully this can be somehow useful]
> we also have to reach consensus on the filtering algorithm. for
> instance, should we simply require
Tested on: GNU Wget 1.9.1 (Win32)
Tested on: GNU Wget 1.10.2 (Win32)
Example: wget "http://Check.Your.CPU.Usage/con"
Or: wget "http://Check.Your.CPU.Usage/con.txt"
You can also use aux, prn, con, lpt1, lpt2, com1, com2, ...
Regards,
fRoGGz ([EMAIL PROTECTED])
SecuBox Labs - http://secubox.sh
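For background, the names above are reserved DOS device names on Windows, reserved even when an extension is appended, which is why saving such a URL to a local file misbehaves. A minimal sketch of the check involved (the helper is hypothetical, not wget code; the name list follows Windows conventions):

```python
# Reserved DOS device names: con, prn, aux, nul, com1..com9, lpt1..lpt9.
RESERVED = ({"con", "prn", "aux", "nul"}
            | {f"com{i}" for i in range(1, 10)}
            | {f"lpt{i}" for i in range(1, 10)})

def is_reserved_name(filename):
    # A Windows filename is reserved if its stem (the part before the
    # first dot) is a device name, case-insensitively -- so "con.txt"
    # is just as problematic as "con".
    stem = filename.split(".")[0].lower()
    return stem in RESERVED

assert is_reserved_name("con")
assert is_reserved_name("con.txt")
assert is_reserved_name("LPT2.doc")
assert not is_reserved_name("config.txt")
```

A downloader that sanitizes such stems before writing to disk would sidestep the problem.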
Jim Wright wrote:
What definition of regexp would you be following?
That's another degree of freedom. Hrvoje and I have chosen to integrate
the GNU regex implementation into wget, which allows choosing one of
these syntaxes:
RE_SYNTAX_EMACS
RE_SYNTAX_AWK
RE_SYNTAX_GNU_AWK