On Wednesday 14 November 2001 15:22, you wrote:
> On Wed, Nov 14, 2001 at 03:40:00PM -0500, Gianni Johansson wrote:
> < >
>
> > It's important to keep in mind what you are stress testing.  The maximum
> > memory allocation of the encoder implementation is capped at 24M, which
> > it will hit as soon as you get to 16M of data (8M of check blocks for 50%
> > redundancy).   Beyond that, I do the encoding in multiple stripes, never
> > with more than 16M total.
>
> Is 50% really necessary? I don't know what effect the graphs have under
> this system, but it sounds like a lot (considering 10-20% is definitely
> enough using a real IDA).
This is probably too conservative.  The problem is that I don't really know 
what the per-request failure rate is, so I don't have any basis for deciding 
what is "good enough".
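
For what it's worth, here is the sort of calculation I have in mind.  This is 
purely illustrative (Python): it assumes independent block retrievals, an 
ideal FEC where any n of the n+k blocks reconstruct the segment, and a 
made-up figure of 64 data blocks per segment.

    from math import comb

    def segment_success(n, k, p):
        """P(at least n of the n+k inserted blocks can be retrieved)."""
        total = n + k
        return sum(comb(total, i) * p**i * (1 - p)**(total - i)
                   for i in range(n, total + 1))

    # compare 50% redundancy (32 check blocks) with ~20% (13 check blocks)
    # for a few guessed per-request success rates p
    for p in (0.9, 0.8, 0.7):
        print(p, segment_success(64, 32, p), segment_success(64, 13, p))

Until we have a measured value for p, choosing between 20% and 50% is 
guesswork.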

Remember, the biggest file size I can handle without segmenting is 128MB.  I 
wanted really high reliability because I need to be able to concatenate 
independent segments for things like divx movies (720MB ~= 5+ segments).
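
The concatenation is what drives that: every segment has to come back, so the 
per-segment success probabilities multiply.  A quick sketch, with assumed 
per-segment success rates:

    from math import ceil

    segments = ceil(720 / 128)       # a 720MB file splits into 6 segments
    for s in (0.99, 0.999, 0.9999):  # assumed per-segment success rates
        print(s, s ** segments)      # the whole file needs every segment

Even 99% per segment only gives about 94% for the whole file, which is why I 
erred on the side of too many check blocks.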

I have some calculations that I want to run by you to check, but I am going 
out the door right now.  Will send them later.

>
> > Once you get past the 16M data size, the only thing you are stressing is
> > Freenet's ability to handle a large number of inserts, which everyone
> > knows is pretty poor.
>
> Well, he did show that you were forgetting to use redirects when the
> splitfile index grows above 32 kB.
Got me there.   
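
For anyone following along, it does not take an enormous file to hit that 
limit.  A rough estimate below; the block size and per-entry key size are 
assumptions, not the real constants:

    BLOCK_SIZE    = 256 * 1024   # assumed block size
    BYTES_PER_KEY = 60           # assumed size of one key entry in the index

    def index_size(file_bytes, redundancy=0.5):
        data_blocks  = -(-file_bytes // BLOCK_SIZE)     # ceiling division
        total_blocks = int(data_blocks * (1 + redundancy))
        return total_blocks * BYTES_PER_KEY

    for mb in (16, 32, 64, 128):
        size = index_size(mb * 1024 * 1024)
        print(mb, "MB:", size, "byte index",
              "(needs a redirect)" if size > 32 * 1024 else "")

Under those guesses the index crosses 32 kB somewhere between 64MB and 128MB, 
so the redirect case is easy to reach.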

--gj

>
> <>

-- 
Freesites
(0.3) freenet:MSK@SSK@enI8YFo3gj8UVh-Au0HpKMftf6QQAgE/homepage//
(0.4) freenet:SSK@npfV5XQijFkF6sXZvuO0o~kG4wEPAgM/homepage//

_______________________________________________
Devl mailing list
Devl@freenetproject.org
http://lists.freenetproject.org/mailman/listinfo/devl