On Fri, Nov 15, 2002 at 07:23:38AM -0800, [EMAIL PROTECTED] wrote:
> >> Insert files.
> >> For each file:
> >>  if fetchable on all nodes, continue to next file
> >>  for each extra node, (in parallel?):
> >>   try to fetch at current HTL
> >>   if RNF, retry with same options
> >>   if DNF, and HTL < max HTL, increase HTL and retry
> >>  done
> >>  repeat
> >> done
> >>
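For what it's worth, that per-file loop could be sketched roughly like this in Python. This is only a sketch: `fetch` is a hypothetical stand-in for a real FCP ClientGet against one node, and the HTL ceiling and retry cap are made-up numbers, not anything the node enforces.

```python
def heal_file(key, nodes, fetch, start_htl=10, max_htl=25, max_attempts=10):
    """Retry fetching `key` on every node until it succeeds everywhere.

    `fetch(node, key, htl)` is an assumed callable returning one of
    "ok", "RNF" (route not found) or "DNF" (data not found).
    Returns the list of nodes that still cannot fetch the key.
    """
    missing = list(nodes)
    rounds = 0
    while missing and rounds < max_attempts:
        rounds += 1
        still_missing = []
        for node in missing:
            htl = start_htl
            fetched = False
            for _ in range(max_attempts):
                result = fetch(node, key, htl)
                if result == "ok":
                    fetched = True
                    break
                if result == "DNF" and htl < max_htl:
                    htl += 1  # data not found: search deeper next try
                # on RNF, retry with the same options, per the loop above
            if not fetched:
                still_missing.append(node)
        missing = still_missing
    return missing
```

Raising HTL only on DNF matches the pseudocode: an RNF means routing failed, so the same request is simply retried, while a DNF means the search didn't reach the data, so the next attempt searches deeper.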
> 
> i hate you too but in a kind fluffy pink bunny way
> 
> how about going for speed instead:
> cache the crap and verify later,
> if one cares to
Um...
> 
> paging for actually large files over the 128 meg limit
> would be really nice, to not get buffer errors all the time
Sorry, I don't follow you. Insertions of 230M+ files work fine AFAIK
(that Enterprise site, for example).
> 
> 
> >> Eventually the files should become available... and the network will
> >> forge new links to make this happen. The only cost is time and load...
> >
> >so, I've started implementing some multi-homing into fishtools, but
> >how it works right now is much simpler than this
> >
> >
> >loop:
> >     insert file on random node
> >     attempt to fetch from different random node
> >     if(fetched):
> >             done, next file
> >     else:
> >             jump loop
> >
> >I realise that's not exactly what you were after, but I'll probably
> >get to adding the code to fetch from each node this weekend - the way
> >the code works requires some retooling to pull that part off
> >(basically, I modified openFcpPort() to open a random node from a
> >list, instead of always opening the same node.  I'll need to add
> >support to retrieveKey() to retrieve from a specified node.  but
> >anyhow...)
> >
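That simpler loop might look something like this in Python. To be clear, `insert` and `retrieve` here are hypothetical stand-ins for fishtools' FCP wrappers (the real code goes through openFcpPort() and retrieveKey()); this is a sketch of the idea, not the actual fishtools API.

```python
import random

def insert_and_verify(key, data, nodes, insert, retrieve, max_tries=5):
    """Insert on one random node, then verify by fetching from a
    different random node; repeat until the fetch succeeds.

    `insert(node, key, data)` and `retrieve(node, key)` are assumed
    callables; `retrieve` returns the data on success, else None.
    """
    for _ in range(max_tries):
        src = random.choice(nodes)
        insert(src, key, data)
        # verify from a node other than the one we inserted on
        dst = random.choice([n for n in nodes if n != src])
        if retrieve(dst, key) == data:
            return True  # fetched from a second node: done, next file
    return False
```

Fetching from a different node than the inserting one is the whole point of the verification step: a successful insert on `src` says nothing about whether the data is routable from elsewhere.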

-- 
Matthew Toseland
[EMAIL PROTECTED]
[EMAIL PROTECTED]
Freenet/Coldstore open source hacker.
Employed full time by Freenet Project Inc. from 11/9/02 to 11/1/03
http://freenetproject.org/
