Re: [Bug-wget] unexpected behaviour of wget on some long links

2013-06-13 Thread Yiwei Yang
Hi Darshit: Sorry, I just ran it again and this time it works fine for me. Not sure how that 999 popped up. Thanks a lot for the help! Yiwei On Thu, Jun 13, 2013 at 11:51 AM, Darshit Shah wrote: > On Thu, Jun 13, 2013 at 10:09 PM, Yiwei Yang wrote: >> Thanks a lot. That blocking …

Re: [Bug-wget] unexpected behaviour of wget on some long links

2013-06-13 Thread Darshit Shah
On Thu, Jun 13, 2013 at 10:09 PM, Yiwei Yang wrote: > Thanks a lot. That blocking problem is because I didn't use quotes to > surround the URL parameter. But now I just get HTTP 403 errors, and some > links give me: > HTTP request sent, awaiting response... 999 Request denied > 2013-06-13 …

Re: [Bug-wget] unexpected behaviour of wget on some long links

2013-06-13 Thread Yiwei Yang
Thanks a lot. That blocking problem is because I didn't use quotes to surround the URL parameter. But now I just get HTTP 403 errors, and some links give me: HTTP request sent, awaiting response... 999 Request denied 2013-06-13 11:32:28 ERROR 999: Request denied. For the 403 I can understand …

Re: [Bug-wget] unexpected behaviour of wget on some long links

2013-06-13 Thread Darshit Shah
Bykov's suggestion is bang on. The issue you are facing is that the ampersand (&) is a special character in the Bash shell: it asks the shell to run the preceding command in the background and return control of the shell to the user. The shell is reading the & character in your URL and sending the …
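The shell behavior described above can be demonstrated without wget at all. A minimal sketch (the example.com URL is a made-up placeholder, and printf stands in for wget): an unquoted literal & ends the command and backgrounds it, so only the part before the & reaches the program, while quoting delivers the whole URL as a single argument.

```shell
#!/bin/sh
# Unquoted: '&' is a shell control operator, so the command is cut there
# and the part before it runs in the background. printf stands in for
# wget; the URL is a hypothetical placeholder.
sh -c 'printf "got: %s\n" http://example.com/page?a=1&b=2'
# prints: got: http://example.com/page?a=1
# (the trailing "b=2" is parsed as a separate, harmless variable assignment)

# Quoted: the entire string, ampersand included, is one argument.
sh -c 'printf "got: %s\n" "http://example.com/page?a=1&b=2"'
# prints: got: http://example.com/page?a=1&b=2
```

Either single or double quotes work here; single quotes are safest when the URL also contains `$` or backquotes.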

Re: [Bug-wget] unexpected behaviour of wget on some long links

2013-06-13 Thread Bykov Aleksey
Greetings, Yiwei Yang. Sorry for the stupid question, but did you try using quotes to escape the URL? wget -p -np -nc -nd --delete-after -t 1 -T 20 -P somefolder "" or wget -p -np -nc -nd --delete-after -t 1 -T 20 -P somefolder '' The shell can interpret an ampersand as a command separator... -- Best regards, Aleksey

[Bug-wget] unexpected behaviour of wget on some long links

2013-06-12 Thread Yiwei Yang
Hi, I wrote a C program that reads a list of URLs and feeds them into wget one by one with the following command: wget -p -np -nc -nd --delete-after -t 1 -T 20 -P somefolder However, with some long links, like: http://www.linkedin.com/nhome/nus-redirect?url=http%3A%2F%2Fwww%2Elinkedin%2Ecom%2Fpro…
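When driving wget from a list of URLs like this, the quoting fix discussed elsewhere in the thread can be folded into a small wrapper script instead of a C program. A minimal sketch, assuming the list lives in a file named urls.txt (the file name and target folder are illustrative, not from the original post):

```shell
#!/bin/sh
# Read URLs one per line from urls.txt (hypothetical file name) and
# hand each one to wget. Because "$url" is double-quoted, '&', '?',
# and other shell metacharacters in long query strings reach wget
# intact instead of being interpreted by the shell.
while IFS= read -r url; do
    wget -p -np -nc -nd --delete-after -t 1 -T 20 -P somefolder "$url"
done < urls.txt
```

The `IFS= read -r` idiom preserves leading whitespace and literal backslashes in each line, which matters for unusual URLs.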