I'm trying to figure out how to do a POST followed by a GET.
If I do something like:
wget http://www.somesite.com/post.cgi --post-data 'a=1b=2'
http://www.somesite.com/getme.html -d
I get the following behavior:
POST /post.cgi HTTP/1.0
snip
[POST data: a=1b=2]
snip
POST /getme.html HTTP/1.0
I use wget 1.8.2.
When I try a recursive download of site.com, where
the first page at site.com/ redirects to site.com/xxx.html, and the first link on
that page points back to site.com/,
then Wget downloads only xxx.html and stops.
The other links from xxx.html are not followed!
Does wget have any rules to convert a retrieved URL into a stored URL?
Or might it in the future?
For example:
Get - site.com/index.php?PHPSESSID=123124324
Filter - /PHPSESSID=[a-z0-9]+//i
Save as - site.com/index.php
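Wget has no such rewrite rule built in, but the idea above can be approximated today with a post-processing step. A minimal sketch, assuming GNU sed and a session parameter shaped like the example (the file name is hypothetical):

```shell
# Strip a PHPSESSID query parameter from a saved file name.
# (A post-processing sketch, not a built-in wget feature.)
name='site.com/index.php?PHPSESSID=123124324'
clean=$(printf '%s\n' "$name" | sed -E 's/[?&]PHPSESSID=[A-Za-z0-9]+//')
echo "$clean"   # prints "site.com/index.php"
```

A real run would loop over the files wget saved and `mv` each one to its cleaned name.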
Tony Lewis [EMAIL PROTECTED] writes:
I'm trying to figure out how to do a POST followed by a GET.
If I do something like:
wget http://www.somesite.com/post.cgi --post-data 'a=1b=2'
http://www.somesite.com/getme.html -d
Well... `--post-data' currently affects all the URLs in the Wget run.
Sergey Vasilevsky [EMAIL PROTECTED] writes:
I use wget 1.8.2. When I try a recursive download of site.com, where
the first page at site.com/ redirects to site.com/xxx.html, and the first
link on that page points back to site.com/, Wget downloads only xxx.html and
stops. The other links from xxx.html are not followed!
Sergey Vasilevsky [EMAIL PROTECTED] writes:
Does wget have any rules to convert a retrieved URL into a stored URL? Or
might it in the future?
For example:
Get - site.com/index.php?PHPSESSID=123124324
Filter - /PHPSESSID=[a-z0-9]+//i
Save as - site.com/index.php
The problem with this is that it would
Hrvoje Niksic wrote:
Maybe the right thing would be for `--post-data' to only apply to the
URL it precedes, as in:
wget --post-data=foo URL1 --post-data=bar URL2 URL3
snip
But I'm not at all sure that it's even possible to do this and keep
using getopt!
I'll start by saying that I
On Tue, 14 Oct 2003, Tony Lewis wrote:
It would be logically equivalent to the following three commands:
wget --user-agent='my robot' --post-data 'data=foo' POST URL1
wget --user-agent='my robot' --post-data 'data=bar' POST URL2
wget --user-agent='my robot' --referer=URL3 GET URL4
Hi,
Right now the wget code looks like this:
#ifdef ENABLE_IPV6
int ip_default_family = AF_INET6;
#else
int ip_default_family = AF_INET;
#endif
and then
./connect.c: sock = socket (ip_default_family, SOCK_STREAM, 0);
This assumes that a binary compiled with IPv6 support is always used on
Hello,
With this download you will get a segfault.
wget --passive-ftp --limit-rate 32k -r -nc -l 50 \
-X */binary-alpha,*/binary-powerpc,*/source,*/incoming \
-R alpha.deb,powerpc.deb,diff.gz,.dsc,.orig.tar.gz \
ftp://ftp.gwdg.de/pub/x11/kde/stable/3.1.4/Debian
Philip Stadermann [EMAIL
I like these suggestions. How about the following: for 1.9, document
that `--post-data' expects one URL and that its behavior for multiple
specified URLs might change in a future version.
Then, for 1.10 we can implement one of the alternative behaviors.
You're right -- that code was broken. Thanks for the patch; I've now
applied it to CVS with the following ChangeLog entry:
2003-10-15 Philip Stadermann [EMAIL PROTECTED]
* ftp.c (ftp_retrieve_glob): Correctly loop through the list whose
elements might have been deleted.
Hrvoje Niksic wrote:
I like these suggestions. How about the following: for 1.9, document
that `--post-data' expects one URL and that its behavior for multiple
specified URLs might change in a future version.
Then, for 1.10 we can implement one of the alternative behaviors.
That works for
Thanks for the report. I agree that the current code does not work
for many uses -- that's why IPv6 is still experimental. Mauro
Tortonesi is working on contributing IPv6 support that works better.
For the impending release, I think the workaround you posted makes
sense. Mauro, what do you