Hi
My application needs to store large files, roughly 500 MB to 1.5 GB.
I've changed the configuration to accept files that big, but since then I've run
into other limitations:
1. Uploading is too slow: the slowdown comes once the file has actually been
transferred and placed in the upload cache (I'm watching the network activity).
The copy from the cache into a repository node is too slow, and I think my
repository configuration is not good.
2. Because of a timeout I get an error message along the lines of
"Transaction failed", and sometimes the blob ends up missing, sometimes not
(see the transaction-timeout sketch after this list).

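I'm wondering whether the transaction that writes the blob is simply outliving the
default JTA timeout; if so, raising it should help. Here is a minimal sketch of what
I have in mind, assuming the JBoss 4.2 bundle that Nuxeo 5.1.x ships with (the file,
conf/jboss-service.xml, the MBean class and the 1800-second value are assumptions to
adapt to the actual installation):

    <!-- JBoss transaction manager service: raise the JTA timeout (in seconds) -->
    <mbean code="com.arjuna.ats.jbossatx.jta.TransactionManagerService"
           name="jboss:service=TransactionManager">
        <!-- the default is typically 300 s; a 1.5 GB import may need far more -->
        <attribute name="TransactionTimeout">1800</attribute>
    </mbean>
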
I'm using Nuxeo 5.1.6 with PostgreSQL as the SQL server. I set the externalBlobs
flag to true, but I don't think that is enough.
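
For reference, this is roughly what I mean by that flag in the Jackrabbit-backed
repository configuration (repository.xml); only a sketch, assuming the PostgreSQL
bundle persistence manager, where the parameter is spelled externalBLOBs, and with
placeholder values for the JDBC URL and schema prefix:

    <!-- Workspace persistence manager in repository.xml -->
    <PersistenceManager class="org.apache.jackrabbit.core.persistence.bundle.PostgreSQLPersistenceManager">
        <param name="url" value="jdbc:postgresql://localhost:5432/nuxeo"/>
        <param name="schemaObjectPrefix" value="${wsp.name}_"/>
        <!-- true = keep binaries on the local filesystem instead of inside PostgreSQL -->
        <param name="externalBLOBs" value="true"/>
    </PersistenceManager>

My understanding is that with externalBLOBs set to true the copy from the upload
cache into the repository becomes a plain file write instead of streaming the whole
file into PostgreSQL, so it should be faster.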
My Trinidad configuration is:

    <!-- Trinidad filter config for file upload -->
    <context-param>
        <!-- Maximum memory per request (in bytes) -->
        <param-name>org.apache.myfaces.trinidad.UPLOAD_MAX_MEMORY</param-name>
        <!-- Use 512K -->
        <param-value>512000</param-value>
    </context-param>
    <context-param>
        <!-- Maximum disk space per request (in bytes) -->
        <param-name>org.apache.myfaces.trinidad.UPLOAD_MAX_DISK_SPACE</param-name>
        <!-- Use 1.5G -->
        <param-value>1610612736</param-value>
    </context-param>
Thanks in advance,
Stefan Dimov