The Blue Meanie wrote:
> I have a limited number of large files shared that have low upload demand. 
> Short of adding a bunch of porn or music to my shared stuff, is there any way
> to test uploads "directly"?

You can request an upload with any HTTP-capable tool, using
http://$host:$port/get/0/$filename as the URL. The first request gets
a 503 and puts you in the queue, so you have to try twice.
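A quick way to exercise that from a shell, e.g. with curl. Host, port
and filename below are placeholders for your own node's settings; the
actual requests are left commented out so the sketch is harmless to run
as-is:

```shell
# Placeholders: substitute your node's listening address and a
# filename you actually share.
host=127.0.0.1
port=6346
filename="some-shared-file.iso"
url="http://$host:$port/get/0/$filename"

# First request is expected to return 503 (you get queued); a second
# attempt after a short wait may then be served. Uncomment to test
# against a live node:
# curl -s -o /dev/null -w '%{http_code}\n' "$url"   # expect 503
# sleep 5
# curl -s -O "$url"                                 # retry, may succeed

echo "$url"
```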

> Will upload failures emit warnings to the calling
> shell (I'm running GTKG from a terminal for now to watch the warnings)?

Depends on the type of failure. If sendfile() fails hard the first time,
there should be a warning in the terminal. If you suspect that sendfile()
is broken, verify the checksum as well, and try aborting and resuming a
transfer to make the test a little more realistic, since transfers of
complete files are rather exceptional. If you have some time, bandwidth
and disk space, it would be nice if you could try transferring something
larger than 4 GiB, just to verify that there are no large-file issues on
your setup.

> $ /usr/lib/gmsgfmt
> Usage: gmsgfmt [-D dir | --directory=dir] [-f | --use-fuzzy]
>                [-g] [-o outfile | --output-file=outfile]
>                [--strict] [-v | --verbose] files ...
> 
> and it won't report a version with any of the listed options.  A "strings" on
> the binary doesn't show ANYTHING resembling a version number.  I'm really not
> sure what to make of it.  Suggestions?

What does "gmsgfmt -V; echo $?" print? It might be possible to use that
as an indicator for falling back to more conservative options.
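As a sketch of that probe (gmsgfmt may not even be installed on a given
box, so the fallback branch matters; the option sets below are purely
illustrative):

```shell
# Probe: does this msgfmt variant understand -V?
# A tool that rejects the flag -- or is missing entirely -- exits
# non-zero, which we take as the cue for conservative options.
if gmsgfmt -V >/dev/null 2>&1; then
  msgfmt_opts="-v"    # hypothetical richer option set
else
  msgfmt_opts=""      # conservative fallback
fi
echo "using options: '$msgfmt_opts'"
```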

-- 
Christian
