On 11 Jan 2002 at 10:51, Picot Chappell wrote:

> Thanks for your response.  I tried the same command, using your URL, and it
> worked fine.  So I took a look at the site I was retrieving for the failed
> test.
> 
> It's an SSL site (didn't think about it before) and I noticed 2 things.  The
> Frame source pages were not downloaded (they were for www.mev.co.uk) and the
> links were converted to full URLs,
> i.e. <FRAME src="menulayer.cgi".....> became <FRAME
> src="https://www.someframed.page/menulayer.cgi" ...>
> 
> So the content was still reachable, but not really local (this is the original
> problem).  I tried it without --convert-links, and the frame source
> remained defined as "menulayer.cgi", but menulayer.cgi was not downloaded.
> 
> Do you think this might be an issue with framesets and SSL sites, or an issue
> with framesets and CGI source files?

Do you have SSL support compiled in?

Also, it is possible that the .cgi script on the server is checking
HTTP request headers and cookies, doesn't like what it sees, and is
returning an error. It is sometimes useful to lie to the server
about the HTTP user agent using the -U option, e.g.:

-U "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 4.0)"

or include something similar in the wgetrc file:

useragent = Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 4.0)

Some log entries would be useful, particularly with the -d option.
You can mask any sensitive bits of the log if you want.
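Putting the suggestions together, a debug run might look something like
this (the URL is just the placeholder from your message, and the
user-agent string is only an example; adjust both to your case):

```shell
# Hypothetical invocation: substitute the real failing framed site.
# -d enables debug output (request/response headers, cookies),
# -o sends the log to a file instead of the terminal,
# -r and --convert-links recreate the page locally,
# -U overrides the User-Agent header sent to the server.
wget -d -r --convert-links \
     -U "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 4.0)" \
     -o wget.log \
     https://www.someframed.page/
```

The wget.log file will show exactly what the server returned for
menulayer.cgi, which should tell us whether the frame source is being
refused or simply not followed.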
