The limit is still 100 MB. I am willing to test, but I can't with the 100 MB
restriction.

  -- とある白い猫  (To Aru Shiroi Neko)


On Mon, May 7, 2012 at 9:06 PM, Federico Leva (Nemo) <[email protected]> wrote:

> Erik Moeller, 07/05/2012 19:13:
>
>> On Mon, May 7, 2012 at 2:54 AM, Federico Leva (Nemo) wrote:
>>
>>
>>  * especially when you add the file and start upload, it's very
>>> resource-intensive and made my browser stall for a short while: 400 MB of
>>> RAM used for several minutes, CPU usage hopping, a whole core waiting for
>>> disk (?).
>>>
>>
>> Yeah, UW was trying to extract JPEG metadata from video files. :-P
>> This should be fixed once https://gerrit.wikimedia.org/r/#/c/6722/ is
>> reviewed and deployed.
>>
>> The internal API error bug will probably be a bit harder to track
>> down; hazarding a guess it might be a timing issue with the copy
>> operation on very large files.
>>
>
> What about the automatic clearing of the stash? Wasn't that made more
> aggressive at some point?
>
>
>  Can you file a bug for the upload performance issue as well? That one
>> is new to me.
>>
>
> Done: https://bugzilla.wikimedia.org/show_bug.cgi?id=36599
>
> Nemo
>
>
> _______________________________________________
> Commons-l mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/commons-l
>
_______________________________________________
Commons-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/commons-l