On Wednesday 28 July 2004 04:51, Vern wrote:
> I get:
>
> # wget -O -q http://www.comp-wiz.com/index.html | lpr -P hp1300n
> --16:49:59--  http://www.comp-wiz.com/index.html
>            => `-q'
> Resolving www.comp-wiz.com... done.
> Connecting to www.comp-wiz.com[207.234.154.95]:80... connected.
> HTTP request sent, awaiting response... 200 OK
> Length: 22,011 [text/html]
>
> 100%[======================================================================
>> ] 22,011       139.58K/s    ETA 00:00
>
> 16:49:59 (139.58 KB/s) - `-q' saved [22011/22011]
>
> lpr: stdin is empty, so no job has been sent.

It would be nice if you did a *little* bit of your own research. If you had a 
look at 'wget --help' or 'man wget', you would have found that the '-O' switch 
saves the contents of the URL to a file (in this case to a file literally 
named '-q', which is why lpr received nothing on stdin).
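
For the record, a corrected pipeline would look something like this (a sketch, 
assuming your printer queue is still 'hp1300n': '-q' silences wget's progress 
output, and '-O -' writes the document to stdout instead of a file):

    wget -q -O - http://www.comp-wiz.com/index.html | lpr -P hp1300n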

Anyway, this is getting completely OT. And as you have indicated in other 
posts, you want to print a copy of the URL "as rendered via a graphical 
browser". Neither wget nor lynx is suitable for that task.

> > http://marginalhacks.com/Hacks/html2jpg/

> Seems to me that this needs X Windows or something installed, am I incorrect?

You need to install X and probably a few other bits and pieces as well.
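
If you'd rather not run a full desktop session just for this, a virtual 
framebuffer may be enough. A rough sketch, assuming Xvfb is installed; the 
html2jpg arguments below are only placeholders, so check its own documentation 
for the real ones:

    Xvfb :1 -screen 0 1024x768x24 &
    DISPLAY=:1 ./html2jpg <url> <output-file>    # hypothetical arguments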

-- 
Jason Wong -> Gremlins Associates -> www.gremlins.biz
Open Source Software Systems Integrators
* Web Design & Hosting * Internet & Intranet Applications Development *
------------------------------------------
Search the list archives before you post
http://marc.theaimsgroup.com/?l=php-general
------------------------------------------
/*
A good marriage would be between a blind wife and deaf husband.
                -- Michel de Montaigne
*/
