Hello Bill,

I agree with Chris that I would use wget from Terminal to grab a web
site and its dependent pages, as you requested.  However, if you want
a GUI-based solution, you might try exploring a free app named
SiteSucker, available from the Mac App Store.  This question came up
on the mac-access list, and most of the responses were to use wget or
curl from the Terminal.  But the poster was asking on behalf of a
friend who didn't feel comfortable using Terminal.
You can read my post about SiteSucker and another app named Maria in
the Mail Archive post at:
• Re: download managers.
http://www.mail-archive.com/mac-access%40mac-access.net/msg12870.html

This post gives the links to SiteSucker, and a few comments about
trying it.  If you read two more posts down the thread (using
Control-N to go to the next two posts) you can read the statement
about Maria.
I'm afraid we didn't spend very much time on this topic, since the
person who posed the question had already suggested to his friend that
the best way was to use wget, and the other people responding about
alternatives basically agreed that wget was what they would recommend.
And since the friend in question was not a list member, and the
questions were being relayed at second hand, there was not much
incentive to investigate in more detail.
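For reference, the wget approach Chris describes below looks like this in
Terminal (just a sketch, assuming wget is already installed via MacPorts
or Homebrew; www.example.com is a placeholder for the site you want):

```shell
# Mirror a site for off-line viewing, logging progress to process.log:
#   -m  mirror (turns on timestamping and recursion)
#   -c  continue any partly-downloaded files
#   -r  recursive retrieval (already implied by -m, but harmless)
#   -p  fetch page requisites (CSS, images, etc.)
#   -k  convert links so the copy works locally
wget -mcrpk -o process.log http://www.example.com

# wget saves everything in a folder named after the site; open the
# top-level page in Safari from Terminal:
open www.example.com/index.html
```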

HTH.  Cheers,

Esther


On Jan 23, 12:33 pm, Chris Blouch <[email protected]> wrote:
> Well, if you want to attempt it I'd be happy to break things down a bit.
>
> CB
>
> On 1/23/13 5:16 PM, Bill Holton wrote:
>
> > Hi.
>
> > Thanks for the info, but I think you’re about five levels above my pay
> > grade, Mac-wise.
>
> > Thanks.
>
> > BILL HOLTON
>
> > Email: [email protected] <mailto:[email protected]>
>
> > P:386-624-6309 C: 386-624-3255
>
> > Home Office: 1520 Loughton ST
>
> > DeLand, FL 32720
>
> > *From:* [email protected]
> > [mailto:[email protected]] *On Behalf Of* Chris Blouch
> > *Sent:* Wednesday, January 23, 2013 5:13 PM
> > *To:* [email protected]
> > *Subject:* Re: Grabbing a web site?
>
> > I don't know of a Safari plugin, but if you have MacPorts installed you
> > can get the open-source wget tool and then use it in Terminal to grab a
> > bunch of pages and store them on your hard drive:
>
> > wget -mcrpk -o process.log http://www.abcd.com
>
> > Parameters:
>
> > -o - Output of the process is written to a file instead of the display.
> > In this case I logged everything to process.log
> > -m - Mirror - turns on timestamping and recursion
> > -c - Continues a partly-downloaded transfer. Probably not as big an
> > issue on a good connection
> > -r - Recursion, but this might not have been needed with -m. Figured
> > it didn't hurt.
> > -p - Download any page dependencies like CSS, images, etc.
> > -k - Convert all links to relative URLs so it doesn't keep trying to
> > link off to the original site or path
>
> > It will crawl every link of every page and store it all in your
> > current folder. Then you can webshare that folder to make your own
> > mini version of the site or just open the top level file in Safari.
> > The latter method may not always work, since modern sites use
> > JavaScript and Ajax to pull in pieces from a web server, which is
> > not what happens when you just open a file directly from your
> > hard drive.
>
> > CB
>
> > On 1/23/13 4:00 PM, Bill Holton wrote:
>
> > Hi.
>
> > Does anyone know of an accessible Safari plugin or stand-alone app
> > that will go to a web site and download everything for off-line viewing?
>
> > Thanks.
>
> > BILL HOLTON
>
> > Email: [email protected] <mailto:[email protected]>
>
> > P:386-624-6309 C: 386-624-3255
>
> > Home Office: 1520 Loughton ST
>
> > DeLand, FL 32720
>

-- 
You received this message because you are subscribed to the Google Groups 
"MacVisionaries" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/macvisionaries?hl=en.