Consider this example, which happens to be how I realised this problem:
wget http://www.mxpx.com/ -r --base=.
Here, I want the entire site to be downloaded, with each link pointing
to the local file. This works for some links, but it does not take
references to the root directory into account.
So you would suggest handling it such that, when I use
wget --base=/some/serverdir http://server/serverdir/
any absolute link /.* will be interpreted as /some/.*, so a link like
/serverdir/ would point back to /some/serverdir, right?
I guess this would be OK. Just one question if there is a link back
On 7/14/07, Matthias Vill [EMAIL PROTECTED] wrote:
> So you would suggest handling it such that, when I use
> wget --base=/some/serverdir http://server/serverdir/
> any absolute link /.* will be interpreted as /some/.*, so a link like
> /serverdir/ would point back to /some/serverdir, right?
Correct.
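A rough sketch of the proposed rule in Python (purely illustrative; the function name, its parameters, and the `server_root_dir` argument are my own inventions, not anything in wget's code):

```python
def map_root_link(link, base, server_root_dir):
    """Hypothetical sketch of the rule discussed above: with
    --base=/some/serverdir mirroring http://server/serverdir/,
    a root-relative link "/x" is read as "/some/x", i.e. the part
    of --base above the mirrored directory is prepended."""
    assert link.startswith("/")
    prefix = base
    # Strip the trailing server directory from --base, if present,
    # so "/" on the server corresponds to the directory above it.
    if base.endswith("/" + server_root_dir):
        prefix = base[: -len("/" + server_root_dir)]
    return prefix + link

# A link to /serverdir/ would then point back to /some/serverdir/:
print(map_root_link("/serverdir/", "/some/serverdir", "serverdir"))
```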
I think I got your point:
All in all, this is still a matter of comparing the first URL against the
current URL and counting the common directories from the left side.
Then you compare that number (a) to the depth of the first URL (b) and add
b-a "../" so you get to the right position inside your base.
By that
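The counting scheme described above might look like this in Python (a hedged sketch only; the function name `updir_count` and the exact treatment of trailing filenames are my assumptions, not wget's implementation):

```python
from urllib.parse import urlparse

def updir_count(first_url, current_url):
    """Sketch of the scheme above: a = number of path components the
    two URLs share from the left, b = depth of the first URL; the
    rewritten link gets b - a leading "../" components."""
    first = [p for p in urlparse(first_url).path.split("/") if p]
    current = [p for p in urlparse(current_url).path.split("/") if p]
    a = 0
    for x, y in zip(first, current):
        if x != y:
            break
        a += 1  # count common dirs from the left side
    b = len(first)  # depth of the first URL
    return "../" * (b - a)

print(updir_count("http://server/a/b/c/", "http://server/a/x.html"))  # prints ../../
```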
On 7/14/07, Matthias Vill [EMAIL PROTECTED] wrote:
> I think I got your point:
Now I think this could result in different problems, like what should
happen with wget -r --base=/home/matthias/tmp
http://server/with/a/complicated/structure/and/to/many/dirs/a.php
If you now have a link to
Christian Roche wrote:
> Hi there,
Hi!
> please find attached two small patches that could be
> considered for wget (against revision 2276).
> patch-utils changes the file renaming mechanism when
> the -nc option is in effect. Instead of trying to
Josh Williams wrote:
> Consider this example, which happens to be how I realised this problem:
> wget http://www.mxpx.com/ -r --base=.
> Here, I want the entire site to be downloaded with each link pointing
> to the local file. This works for some