On Thu, Dec 30, 2010 at 11:24 PM, g4Ur4v <[email protected]> wrote:
> Is there a way to extract all the HTML pages from a website into a
> single HTML/PDF file? For example, the Dive into HTML5 website
> (www.diveintohtml5.org) contains the book in various HTML pages. Is
> there a way to combine them all into a single file?

You can download an entire website using wget - http://linuxreviews.org/quicktips/wget/

I found htmldoc to be a very good utility for creating a PDF from web pages:

    apt-get install htmldoc

You can pass URLs directly to htmldoc, so you can also create a single PDF out of several different web pages.

--
l...@iitd - http://tinyurl.com/ycueutm
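
For instance, a rough sketch of the two steps (mirror with wget, then feed the downloaded pages to htmldoc) might look like the following; the output filename and the page list are placeholders you would adapt to the actual site:

```shell
# Mirror the site locally, rewriting links so the pages work offline
wget --mirror --convert-links --adjust-extension --no-parent \
    http://www.diveintohtml5.org/

# Combine the downloaded HTML files into one PDF.
# --webpage treats each file as a continuous page rather than a
# structured book; list the files in reading order.
htmldoc --webpage -f book.pdf \
    www.diveintohtml5.org/index.html \
    www.diveintohtml5.org/*.html
```

htmldoc can also fetch URLs directly (e.g. `htmldoc --webpage -f out.pdf http://example.com/page.html`), which skips the wget step when you only need a handful of pages.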
