On Friday 25 July 2003 02:05 am, Gordan wrote:
>> No, I don't disagree. I previously stated that images that are intended to
>> be used as active links should under no circumstances be zipped.
>OK, can you give an example of a Freesite (or create one that specifically 
>suffers the problem that ZIP files would aim to solve, for the purpose of the 
>demonstration) that would benefit from this approach in a demonstrable 
>fashion?

For images: imagine if the Freenet main page were inserted. All the aqua 
borders would take an extraordinarily long time to load. For HTML: sites like 
YoYo have several layered pages, but not much on each page. I'd be willing to 
bet that all the HTML on YoYo would compress to a few kB. So instead of:

Waiting 30 seconds for each page to load and then 30 seconds for the images on 
that page, then clicking a link and waiting 30 seconds for the page again, and 
then another 30 seconds for the images...

Assuming only a 1 kB/s transfer rate, you get this instead:

You wait 32 seconds for the main page. You wait 30 seconds for the images on 
that page. You click a link. It appears instantly. The images load in 30 
seconds. You go back, click another link, and it loads instantly, and so do 
the images.
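To make the arithmetic concrete, here is a rough sketch of that comparison. 
The 30-second latency and 1 kB/s rate come from the scenario above; the page 
count and sizes are purely illustrative assumptions, not measurements:

public class LatencySketch {
    static final double LATENCY_S = 30.0; // assumed per-request latency
    static final double RATE_KBS  = 1.0;  // assumed transfer rate, kB/s

    // Time to fetch one file of the given size over Freenet, under the
    // assumptions above.
    static double fetchSeconds(double sizeKb) {
        return LATENCY_S + sizeKb / RATE_KBS;
    }

    public static void main(String[] args) {
        // Without bundling: 5 pages of ~10 kB HTML each, fetched one at a time.
        double unbundled = 5 * fetchSeconds(10);

        // With bundling: the same 5 pages compress into one ~15 kB HTML ZIP,
        // so every page after the first comes out of the local copy instantly.
        double bundled = fetchSeconds(15);

        System.out.printf("5 pages fetched separately: ~%.0f s%n", unbundled);
        System.out.printf("5 pages in one HTML ZIP:    ~%.0f s%n", bundled);
    }
}

Even with generous guesses for the numbers, the per-page request latency 
dominates the total time, which is the whole point of bundling the HTML.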

> > Well, if you don't load those pages, you only get the images that are
> > shared between pages, which you are bound to see, and the HTML (very
> > small).
>
> And what happens to those of us who are interested only in textual rather
> than image content?

So what? With or without zipping, this works the same way. If your browser 
requests the images, you get them; if not, you don't.

> Additionally, images are not indexable. If there is somebody out there who
> is working on an automated indexing robot, then this robot would put more
> strain on the network than necessary if it starts retrieving ZIP files with
> images in them because those images would not be used by it.

Not if the images are never in the same ZIP as the HTML, which is what I said 
at the start of this thread.

> > If you don't zip active links this is a non issue.
>
> I think this should be extended at least to all images, rather than just
> active links, if we DO end up with having archives. HTML pages only, and
> limit the size to much smaller than 1 MB.

Not a problem, as long as HTML and images are in separate ZIPs. If you don't 
want an image ZIP, fine, but that is no reason to say they should be 
eliminated.

> Manual pre-caching would be using IFRAME tags to pre-load pages/files that
> the page links to, so that while you are reading the current page, the
> pages you are likely to go to are already pre-caching.
>
> Automated pre-caching would be using pre-caching software that
> automatically follows all links from the current page and puts them in your
> browser cache.
>
> The latter, although far less than ideal, would IMO still be better than
> implementing archives on the node level.

How is this better?!? It is guaranteed to waste bandwidth. Ideally, ZIPs will 
not use any more bandwidth, because you are only getting images together that 
you would have got anyway, and the HTML is compressed. And YES, THEY SHOULD BE 
IN SEPARATE ARCHIVES.
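To put a rough number on the difference, here is a small sketch. The link 
count, page sizes, and compressed archive size are all invented for the 
example:

public class PrecacheSketch {
    public static void main(String[] args) {
        int links = 10;    // links on the current page (assumed)
        int pageKb = 10;   // size of each linked page in kB (assumed)
        int followed = 1;  // pages the reader actually opens (assumed)

        // Automated pre-caching fetches every linked page speculatively,
        // whether or not the reader ever opens it.
        int precachedKb = links * pageKb;
        int wastedKb = (links - followed) * pageKb;

        // A compressed HTML archive carries the same pages once, compressed,
        // and only because the reader asked for the site in the first place.
        int bundleKb = 30; // assumed compressed size of all the HTML

        System.out.println("Pre-caching pulls " + precachedKb + " kB, of which "
                + wastedKb + " kB is never read.");
        System.out.println("One HTML ZIP pulls about " + bundleKb + " kB total.");
    }
}

Pre-caching pays full price for every link whether or not it is followed; the 
archive only carries the (compressed) HTML of the site the reader is already 
browsing.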

> OK, for the first time there is an aspect of this that I might actually be
> able to swallow as vaguely sensible. But again, the same problems are
> there:
>
> 1) There would be no way to automatically decide what should be in an
> archive. 

Go back and read what I said earlier. I proposed an automated way to do 
exactly this.

> 2) Allowing it to be done manually would allow for "abuse" of the
> network by bandwidth wasting

That is why I said "THIS MUST BE AUTOMATED", and then spelled out exactly how 
it could be done. To reiterate, I basically said: HTML and images should be 
separated. No archive should be larger than one megabyte. Insertion utilities 
should save a file list so that all files end up in the same ZIP each time. No 
more than one ZIP for images.

If you have ideas on how to improve this, say so. Shrinking the maximum size 
might be a good idea. Putting in restrictions like "you can't add to a ZIP any 
file bigger than X, or any file that has a negative compression ratio" would 
also be a good idea.
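As a sketch of what that automated policy could look like inside an insertion 
utility (the class name, thresholds, and compression estimates below are all 
invented for illustration; this is not part of any existing Freenet code):

import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class BundlePolicySketch {

    static final long MAX_ARCHIVE_BYTES = 1000000L; // proposed 1 MB cap per archive
    static final long MAX_FILE_BYTES    = 100000L;  // "bigger than X" cutoff (assumed value)

    static boolean isHtml(File f) {
        String n = f.getName().toLowerCase();
        return n.endsWith(".html") || n.endsWith(".htm");
    }

    // Whether a file may be added to an archive of the given kind.
    static boolean mayAdd(File f, boolean htmlArchive, long bytesInArchive,
                          double compressionRatio /* compressed/original, estimated */) {
        if (isHtml(f) != htmlArchive) return false;              // HTML and images stay in separate ZIPs
        if (f.length() > MAX_FILE_BYTES) return false;           // no single file bigger than X
        if (compressionRatio >= 1.0) return false;               // skip files that would grow when zipped
        return bytesInArchive + f.length() <= MAX_ARCHIVE_BYTES; // respect the 1 MB cap
    }

    // Split a site's files into one HTML bundle or (at most) one image bundle,
    // leaving everything else to be inserted individually.
    static List<File> selectBundle(List<File> siteFiles, boolean htmlArchive) {
        List<File> bundle = new ArrayList<>();
        long bytes = 0;
        for (File f : siteFiles) {
            double estimatedRatio = htmlArchive ? 0.3 : 0.95; // rough assumed ratios
            if (mayAdd(f, htmlArchive, bytes, estimatedRatio)) {
                bundle.add(f);
                bytes += f.length();
            }
        }
        return bundle;
    }
}

Saving the selected file lists between inserts, as proposed above, is what 
keeps the same files landing in the same ZIP on every insert.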

> 3) Doing it manually would likely be beyond most people's abilities to do
> properly. It is a bit like shooting yourself in the foot with a rocket
> launcher.

Yes, there should be no way of doing it manually, short of writing your own 
Freenet insertion utility.

> While this use (skins) would probably be OK, it would still be likely to
> create more problems than it solves, and good CSS is probably a better way
> to go than graphic heavy skins.

If you can use CSS and it is better, then you should. But most web designers 
don't know CSS, and it is very easy to rip an image theme off another site. 
There is no harm in supporting both systems.

> And besides, latency with parallel downloads is not really an issue, as the
> latency for each download is suffered simultaneously, not in series.

That causes more strain on the network.