Josh Williams wrote:
> Consider this example, which happens to be how I realised this problem:
>
> wget http://www.mxpx.com/ -r --base=.
>
> Here, I want the entire site to be downloaded with each link pointing
> to the local file. This works for some links, but it does not take
> references to the root directory into account, su
On 7/14/07, Matthias Vill <[EMAIL PROTECTED]> wrote:
I think I got your point:
Now I think this could result in different problems, like what should
happen with "wget -r --base=/home/matthias/tmp
http://server/with/a/complicated/structure/and/to/many/dirs/a.php"
If you now have a link to "/inde
All in all this is still a matter of comparing the first against the
current URL and counting the common dirs from the left side.
Then you compare that number (a) to the depth of the first URL (b) and
add b-a "../" so you get to the right position inside your base.
By tha
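One way to read that counting scheme, sketched in Python (the function and
variable names are mine for illustration, not anything from wget's source):
split both paths into directories, count the common prefix (a), take the
containing page's directory depth (b), and prefix b-a "../" components.

```python
from urllib.parse import urlparse

def relativize(page_url, link_path):
    """Rewrite a root-relative link so it is relative to the page that
    contains it: count the directories the two paths share from the
    left (a), take the page's directory depth (b), and prefix b - a
    "../" components.  Illustrative sketch only, not wget's code."""
    page_dirs = urlparse(page_url).path.split('/')[1:-1]  # dirs of the page
    link_parts = link_path.split('/')[1:]                 # drop the leading ''
    link_dirs = link_parts[:-1]
    a = 0
    while a < min(len(page_dirs), len(link_dirs)) and page_dirs[a] == link_dirs[a]:
        a += 1
    ups = len(page_dirs) - a                              # b - a
    return '../' * ups + '/'.join(link_parts[a:])
```

For example, a link to "/images/logo.gif" on the page
http://www.mxpx.com/news/index.html would come out as "../images/logo.gif".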
On 7/14/07, Matthias Vill <[EMAIL PROTECTED]> wrote:
> So you would suggest handling in the way that when I use
> wget --base=/some/serverdir http://server/serverdir/
> "/.*" will be interpreted as "/some/.*" so if you have a link like
> "/serverdir/" it would go back to "/some/serverdir", right?
Corre
> So you would suggest handling in the way that when I use
> wget --base=/some/serverdir http://server/serverdir/
> "/.*" will be interpreted as "/some/.*" so if you have a link like
> "/serverdir/" it would go back to "/some/serverdir", right?
I guess this would be ok. Just one question if there is a Lin
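If I read that suggestion right, --base would fix where the server's root
lands locally: stripping the start URL's path from the end of the base
gives the local counterpart of "/". A tiny sketch of that interpretation
(hypothetical helpers, just to make the mapping above concrete):

```python
def local_root(base, start_path):
    """Where the server's "/" would land locally: strip the start
    URL's path from the end of --base.  With base=/some/serverdir
    and a start path of /serverdir/, that is /some.  Hypothetical
    helper illustrating the mapping discussed in the thread."""
    suffix = '/' + start_path.strip('/')
    return base[:-len(suffix)] if base.endswith(suffix) else base

def map_link(link_path, base, start_path):
    # A root-relative link "/X" is rewritten to "<local root>/X".
    return local_root(base, start_path) + link_path
```

So map_link('/serverdir/', '/some/serverdir', '/serverdir/') would give
'/some/serverdir/', matching the example above.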