--- Larry Garfield [EMAIL PROTECTED] wrote:

clip 1
that said it could take a week to figure out all the parameters. ;-)
/clip 1
...
/clip 2

That's why I included the switches I did. :-) I had to do something very
similar just last week.
...
-m means mirror. That is, [...]
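(For reference, roughly what those two switches do, per the wget manual; the
URL is just the placeholder used elsewhere in the thread:)

  wget -m -k http://www.yoursite.com/
  # -m (--mirror): recursive download of the whole site, with timestamping
  #     and no limit on recursion depth
  # -k (--convert-links): once the download finishes, rewrite the links in
  #     the saved pages so they point at the local copies instead of the
  #     live site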
On Tuesday 13 June 2006 07:22, Ryan A wrote:

Hey,
Thanks for the explanation of the switches.
One part that I don't really understand is: blah?foo=bar links
into [EMAIL PROTECTED]
Having a link such as [EMAIL PROTECTED] is not going to work to link to the
second document... right? (I [...]
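(The archive has obscured the converted link above, but the behaviour being
described is most likely wget's file-name escaping: when a page with a query
string is saved to disk, characters that are not legal in Windows/CD file
names, such as "?", are replaced, and -k then rewrites the links to those
local names so they still resolve. A sketch with placeholder names, not taken
from the thread:)

  # with Windows-safe file naming, a "?" in a saved file name becomes "@"
  wget -m -k --restrict-file-names=windows http://www.example.com/
  # a URL like   page.php?foo=bar
  # is saved as  page.php@foo=bar   on disk, and -k rewrites every link that
  # pointed at page.php?foo=bar so it points at that local file, so
  # page-to-page links keep working in the offline copy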
On Tuesday 13 June 2006 17:57, Ryan A wrote:

Hey Larry,
Thanks again, now I have around 3 different ways of doing this... I can
assure the client that all will be well; it all depends now on whether the
project is confirmed and given to us.
But the info you gave me will serve me even if this project does not go
through or I don't use wget for this.
Hey all,
Here's the short explanation of what I am supposed to do:
I need to render/convert the entire site to normal HTML pages so that it can
be loaded onto a CD and given out.
The good news is that the whole site has not yet been built, so I can start
from the ground up.
I have a few ideas on [...]
-Original Message-
I need to render/convert the entire site to normal HTML pages so that it can
be loaded onto a CD and given out.
Does any class program exist that can help me do this?

Save yourself a lot of work and use HTTrack.
http://www.httrack.com/

Brady
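(HTTrack also has a command-line front end; a minimal invocation looks
something like the following, where the URL and output directory are
placeholders and the exact options are documented on the HTTrack site:)

  # mirror the site into ./mysite-cd, staying within the one domain
  httrack "http://www.example.com/" -O ./mysite-cd "+www.example.com/*"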
-Original Message-
Save yourself a lot of work and use HTTrack.
http://www.httrack.com/

Very very interesting, thank you!
If you have tried this and have downloaded dynamic pages/sites (e.g. PHP
pages), please tell me if you had any link problems from one page to another.
wget -m -k http://www.yoursite.com/
Cheers. :-)
--
Larry Garfield
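(If the copy has to run straight off a CD in a browser, a couple of extra
switches are commonly added to that command; a sketch only, worth checking
against the wget manual:)

  wget -m -k -E -p http://www.yoursite.com/
  # -E saves anything served as text/html with an .html extension
  # -p also downloads page requisites (images, stylesheets, scripts)
  #    so each page displays properly offline
  # -k rewrites the links afterwards so they point at the local files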
Hi,
Thanks for the suggestion, I am not too familiar with wget but (correct me if
I am wrong) won't wget just get the output from the pages, ignoring the
links?
Thanks!
Ryan

--- Larry Garfield [EMAIL PROTECTED] wrote:
wget -m -k http://www.yoursite.com/
Quick question;
If the site is updated with new pages/links, is there any way of telling
HTTrack to get just the new pages, or does it get the whole site again?
Reason I ask is they are going to have a s**tload of pages... maybe 4k or so
pages.
Thanks!
Ryan
- Original Message -
From: Ryan A [EMAIL PROTECTED]
To: Brady Mitchell [EMAIL PROTECTED]; php php php-general@lists.php.net
Sent: Monday, June 12, 2006 8:09 PM
Subject: RE: [PHP] php-html rendering

Quick question;
If the site is updated with new pages/links, is there any way of
specifying [...]

[...], it would be the very real thing.

Satyam
You have just described what wget does...
You have just described what wget does...

Oookayyy, and that's the cue for Ryan old boy to start reading up on wget :-)
Never used wget before...
Will google for it; in the meantime, if anybody wants to send me links (even
RTFMs) I would appreciate it.
Thanks!
Ryan
--- Jochem Maas [EMAIL PROTECTED] wrote:

Ryan A wrote:
Hi,
Thanks for the suggestion, I am not too familiar with wget but (correct me if
I am wrong) won't wget just get the output from the pages, ignoring the
links?

that's the default behaviour - but wget has about a zillion parameters for
controlling its behaviour, it's quite easy [...]
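(Those "zillion parameters" are all listed in wget's own documentation:)

  wget --help    # one-line summary of every option
  man wget       # the full manual, including -m, -k, -E and -p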
On Monday 12 June 2006 17:08, Ryan A wrote:

that said it could take a week to figure out all the parameters. ;-)

Heck yeah... just been reading up on it... lots of stuff, who would think one
little four-letter word could do so much. Oops, now thinking of another
four-letter word...
-Original Message-
Quick question;
If the site is updated with new pages/links, is there any way of specifying
to HTTrack to get just the new pages, or does it get the whole site again?

Yes, there is an option to just update the downloaded site. I've never
actually used that option [...]
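(On the command line this appears to be HTTrack's update mode; a sketch
re-using the placeholder names from the earlier example, and worth verifying
against the HTTrack documentation:)

  # re-run against the existing mirror so only new or changed pages are
  # fetched instead of the whole site
  httrack "http://www.example.com/" -O ./mysite-cd --update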