I just want to de-lurk for a minute. I have been using wget on a regular basis
with various websites.
If JavaScript is responsible for writing the content, then you have a web page
that probably uses AJAX and is dynamically updateable. Since Ajax use is on
the rise, I wonder if anyone here can say how wget deals with sites that
use it?
Paul King
Petr Pisar wrote:
On 2008-06-29, Mishari Almishari [EMAIL PROTECTED] wrote:
Hi,
I want to download the website
www.2006election.net <http://www.2006election.net/>
but the downloaded page index.html has no content (except the body/head
tags), even though I can see the content when I use Internet Explorer.
This is not a bug, it's a feature. All the content you see in IE is
generated by JavaScript. See the source code of the web page in IE.
No, the command he gives literally yields a completely empty web page:
<html>
<body>
</body>
</html>
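This is easy to reproduce locally without any network access (a minimal
sketch; the file name and the hypothetical render.js are just placeholders,
not the actual site's files). wget saves the HTML exactly as the server
sends it and never executes scripts, so a body that is filled in
client-side stays empty in the saved file:

```shell
# A page whose visible content would be produced entirely by a script the
# browser fetches and runs. wget would save only this static shell, because
# wget does not execute JavaScript.
cat > index.html <<'EOF'
<html>
<body>
<script src="render.js"></script>
</body>
</html>
EOF

# The saved file contains the <script> tag...
grep -c '<script' index.html
# ...but none of the text the script would have written into the page
# (grep -c prints 0 and exits non-zero when there are no matches).
grep -c 'election' index.html || echo "no generated content in the static HTML"
```

The same reasoning explains the original report: viewing the saved
index.html in a text editor shows only the empty skeleton above.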
--
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer,
and GNU Wget Project Maintainer.
http://micah.cowan.name/