hammer <[EMAIL PROTECTED]> wrote:
>
>
>The "package" looks big (ca. 285 KB as a ZIP file, and somewhere in the
>region of 2 MB unpacked), but the actual set-up takes less than 260 KB on
>disk(ette) - and it *does* run from a diskette, on any PC under DOS
>(from DOS 3.3 on); the ZIP package does contain all the docs and
>even the sources of some of the components that are part of it.
>It is up at my place as <www.inti.be/hammer/get-www.zip>
Hm, the idea is not bad; however, I have already been using such a program.
UKA_PPP also contains HTGET, plus a utility to view the downloaded file
while online.
I replaced that viewer first with HLIST, then with Bobcat (but you can use
any HTML viewer, like Arachne, or Browse - a tiny viewer belonging to the
YAN package). The real problem is that, as far as I can see, none of these
tools can be used as a "site grabber".
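For anyone wondering what such a "site grabber" would actually have to do: fetch the start page, pull out its links, and keep following the ones on the same host until a page limit is reached. None of the DOS tools mentioned above does this; the sketch below is just a minimal modern Python illustration of the idea, with the `fetch` function left as a parameter (any HTTP-getter, or even a canned dictionary, will do), and `example.org` as a made-up host.

```python
import re
from urllib.parse import urljoin, urlparse

def grab_site(start_url, fetch, max_pages=50):
    """Breadth-first 'site grabber' sketch: download start_url,
    extract href links, and follow only links on the same host,
    stopping after max_pages pages. Returns {url: html}."""
    host = urlparse(start_url).netloc
    queue, seen, pages = [start_url], {start_url}, {}
    while queue and len(pages) < max_pages:
        url = queue.pop(0)
        html = fetch(url)          # any callable: url -> HTML text
        pages[url] = html
        # crude link extraction; a real grabber would use an HTML parser
        for href in re.findall(r'href="([^"]+)"', html, re.IGNORECASE):
            link = urljoin(url, href)
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return pages
```

The same-host check is what keeps the grabber from wandering off across the whole Web; the `max_pages` cap is the other safety valve.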
>
>This is a growing problem. These endless click-alongs to reach the
>destination you are after serve only those who profit from them,
>namely 1. the telcos and Net connection providers (who share the
>interconnection rate), and 2. the get-rich-quick ISPs, milking 3. the
>advertisers. We pay double: as telco users, and as consumers who pay the
>publicity overhead baked into the prices.
>(And surely those four-line-long, Java-barricaded URLs have their
>function in *that* setup.)
I try to work around this problem by often using ACCMAIL methods just to
get all the links of a base URL. I know this is not a perfect solution,
but as long as nobody writes a site-grabber utility for DOS, it is surely
the least expensive one.
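Getting "all the links of a base URL" is the easy half of the job, which is why ACCMAIL servers can offer it. As a hedged illustration (not the code any ACCMAIL server actually runs), the core is just resolving every `href` against the page's own address; the function and the sample URLs below are my own invention:

```python
import re
from urllib.parse import urljoin

def list_links(base_url, html):
    """Return the absolute URLs of all href links in a page,
    in order of appearance, with duplicates removed."""
    links = []
    for href in re.findall(r'href="([^"]+)"', html, re.IGNORECASE):
        url = urljoin(base_url, href)  # resolve relative links
        if url not in links:
            links.append(url)
    return links
```

With a list like that in hand, you can pick out the few pages worth the connection time instead of clicking through them online.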
BTW, has anybody tried GETHTTP from Open World Navigator (OWN) yet?
Somehow I believe it could work recursively.
>
>// Heimo Claasen // <[EMAIL PROTECTED]> // Brussels 1999-09-
>HomePage of ReRead - and much to read ==> http://www.inti.be/hammer
--
Tibor Mocsar
To unsubscribe from SURVPC send a message to [EMAIL PROTECTED] with
unsubscribe SURVPC in the body of the message.
Also, trim this footer from any quoted replies.