On Sat, Dec 25, 2004 at 09:36:17PM -0600, Paul Landers wrote:
> When requesting a large split-file via fproxy, it usually eventually
> fails with the following:
>
> Request failed gracefully: Next failed: Could only fetch 25 of 26
> blocks in segment 1 of 1: 14 failed, total available 39
>
> However, if I manually re-request the URI enough times, it will
> eventually download completely. I take all defaults when requesting:
>
> 0 Initial Hops to Live
> 4 Number of Retries for a Block that Failed
> 5 Increment HTL on retry by this amount
> Force the Browser to Save the File
> Don't Look for Blocks in Local Data Store
> Download Segments in Random Order
> 30 Number of Simultaneous Downloads
> Run Anonymity Filter on Download Completion (recommended)
> Make the Anonymity Filter Really Paranoid
> 100 % of Missing Data Blocks to Insert (Healing)
> 18 Hops-to-Live for the Healing Blocks
> Write directly to disk rather than sending to browser
>
> Is there a combination of settings for the request (or another method)
> to instruct fproxy to not give up until the file actually completes?
Increase the "Number of Retries for a Block that Failed". The maximum is 50; IIRC I made this configurable a while back, but a bug prevents the change from taking effect, so you're stuck with 50 for now.

--
Matthew J Toseland - [EMAIL PROTECTED]
Freenet Project Official Codemonkey - http://freenetproject.org/
ICTHUS - Nothing is impossible. Our Boss says so.
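[Editor's note: until the retry-limit bug is fixed, the "re-request the URI enough times" workaround described above can be scripted. A minimal sketch in Python — the `fetch_until_complete` helper and the demo `flaky_fetch` are illustrative assumptions, not part of fproxy; in practice `fetch` would be an HTTP request to your node's fproxy URL that raises on a partial download:]

```python
import time

def fetch_until_complete(fetch, max_attempts=20, delay=0.0):
    """Call fetch() repeatedly until it succeeds or max_attempts is used up.

    fetch should raise an exception on a failed/partial download
    (e.g. a "Could only fetch 25 of 26 blocks" response from fproxy)
    and return the data on success.
    """
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception as err:
            last_error = err
            time.sleep(delay)  # optional pause between re-requests
    raise RuntimeError("gave up after %d attempts" % max_attempts) from last_error

# Demo with a stand-in for an fproxy request that fails a few times
# before completing (simulating missing blocks becoming available).
attempts = {"n": 0}

def flaky_fetch():
    attempts["n"] += 1
    if attempts["n"] < 4:
        raise IOError("Could only fetch 25 of 26 blocks")
    return b"file contents"

data = fetch_until_complete(flaky_fetch)
```

[Each outer attempt still gets fproxy's own per-block retries, so even a modest `max_attempts` multiplies the effective retry count well past the hard-coded 50.]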
_______________________________________________
Support mailing list
Support@freenetproject.org
http://news.gmane.org/gmane.network.freenet.support
Unsubscribe at http://dodo.freenetproject.org/cgi-bin/mailman/listinfo/support
Or mailto:[EMAIL PROTECTED]