Hi!

I think it would be a good idea to try to get access to real hardware.

For example, Boston (http://www.boston.co.uk) produces Calxeda-based
servers, and HP has experimental Calxeda- and X-Gene-based
cartridges for Moonshot servers (http://www.hp.com/moonshot).

Both provide remote access to their servers for trials.
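
Before committing to a trial, the latency concern from the quoted thread
is easy to sanity-check on paper. A minimal sketch (the 7x single-core
CoreMark ratio and the 10 s parse time are from the thread below; the
core counts are illustrative assumptions, not real server specs):

```python
# Per-core slowdown ratio and worst-case Xeon parse time, as quoted below.
SLOWDOWN = 7
XEON_PARSE_S = 10.0

# Latency of one parse on a single ARM core.
arm_parse_s = XEON_PARSE_S * SLOWDOWN
print(arm_parse_s)  # 70.0 -- the "not tolerable" figure

# Hypothetical core counts: even if the ARM box packs 7x more cores,
# matching aggregate throughput, each individual request still takes
# 7x longer, since a single parse cannot be spread across cores.
xeon_cores = 8
arm_cores = xeon_cores * SLOWDOWN
xeon_throughput = xeon_cores / XEON_PARSE_S   # parses per second
arm_throughput = arm_cores / arm_parse_s
print(xeon_throughput == arm_throughput)      # True
```

Real-hardware trials would tell us whether the CoreMark ratio actually
holds for the MediaWiki parse workload.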

Eugene.

On Mon, Jan 13, 2014 at 4:40 PM, Tim Starling <tstarl...@wikimedia.org> wrote:
> On 14/01/14 10:55, George Herbert wrote:
>> On Mon, Jan 13, 2014 at 3:33 PM, Tim Starling <tstarl...@wikimedia.org> wrote:
>>>
>>> In fact, it would slow down individual requests by a factor of 7,
>>> judging by the benchmarks of Calxeda and Xeon CPUs at
>>>
>>> http://www.eembc.org/coremark/index.php
>>>
>>> So instead of a 10s parse time, you would have 70s. Obviously that's
>>> not tolerable.
>>
>>
>> Question - is that 10s linear CPU core time for a parse, or 10s of average
>> response time given our workloads?
>
> Just an arbitrary number chosen to be within the range of CPU times
> for slower articles. On average, it is much faster than that.
>
> For actual data, you could look at:
>
> http://tstarling.com/stuff/featured-parse-boxplot.png
>
>> If it is the linear one-core parse processing time, how much of that is
>> dependencies on DB lookups and the like, externalities within the
>> infrastructure rather than the straight-line CPU time needed for the parse
>> itself?
>
> WikitextContent::getParserOutput() profiles at around 1.25s real and
> 1.17s CPU.
>
> -- Tim Starling
>
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
