Hi! I wanted to test Tahoe a bit and set up one introducer and 10 storage nodes, all on one machine on different ports; this seemed to work well.
As the next step, I tried to upload a file (32472771 bytes) through the
web frontend, which resulted in two issues:
* Uploading the file didn't work. I got the message "SDMF is limited
to one segment, and 32472771 > 3500000" displayed in the browser;
checking the directory URI afterwards showed only an empty
directory.
Since I have seen messages flying around discussing gigabyte-sized
files, I guess this is just a mistake on my side. Any hints on how
to tackle this?
* When I `ps axflwww'ed the processes' memory usage, I saw that the
Python instance that handled the connection over which I uploaded
the 31 MB file grew beyond 300 MB of VSZ. I'd better keep my brain
away from thinking about uploading a gigabyte-sized file on a
32-bit system...
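For reference, the size check behind that error message can be sketched roughly like this (a minimal illustration based only on the error text above, not Tahoe's actual code; the constant and function name are my own):

```python
# Minimal sketch of the one-segment size check implied by the error
# message "SDMF is limited to one segment, and 32472771 > 3500000".
# The limit value comes from that message; everything else here is
# illustrative, not Tahoe's actual implementation.
MAX_SDMF_SEGMENT_SIZE = 3500000  # bytes, per the error message

def check_sdmf_size(size):
    """Raise if the data would not fit into a single SDMF segment."""
    if size > MAX_SDMF_SEGMENT_SIZE:
        raise ValueError(
            "SDMF is limited to one segment, and %d > %d"
            % (size, MAX_SDMF_SEGMENT_SIZE)
        )

check_sdmf_size(1024)  # a small write passes

try:
    check_sdmf_size(32472771)  # the 31 MB upload from above
except ValueError as e:
    print(e)  # SDMF is limited to one segment, and 32472771 > 3500000
```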
Also, `create-client' and `create-introducer' require an empty
directory. It would be nice if "lost+found" were ignored while
looking up the directory contents...
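The emptiness check I have in mind could tolerate such filesystem artifacts, roughly like this (a hypothetical sketch, not Tahoe's code; the set of ignored names is my own guess):

```python
import os

# Hypothetical sketch: treat a base directory as "empty" even if it
# contains filesystem artifacts such as "lost+found" (created on the
# root of an ext2/ext3 mount). Not Tahoe's actual code.
IGNORED_ENTRIES = {"lost+found"}

def is_effectively_empty(path):
    """Return True if `path` contains nothing but ignorable entries."""
    return all(name in IGNORED_ENTRIES for name in os.listdir(path))

# Small demo: a directory holding only "lost+found" counts as empty.
import tempfile
d = tempfile.mkdtemp()
os.mkdir(os.path.join(d, "lost+found"))
print(is_effectively_empty(d))  # True
```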
Regards, JBG
--
Jan-Benedict Glaw [EMAIL PROTECTED] +49-172-7608481
Signature of: ...and when you think you can't go on,
the second : a little light appears from somewhere.
_______________________________________________
tahoe-dev mailing list
[email protected]
http://allmydata.org/cgi-bin/mailman/listinfo/tahoe-dev
