Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-26 Thread Asher Feldman
On Thu, Mar 21, 2013 at 10:55 PM, Yuri Astrakhan yastrak...@wikimedia.org wrote: The API is fairly complex to measure and set performance targets for. If a bot requests 5000 pages in one call, together with all their links and categories, it might take a very long time (seconds if not tens of seconds). Comparing

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-26 Thread Yuri Astrakhan
Asher, I don't know the actual perf statistics just yet. With the API this has to be a balance - I would rather have fewer, slower calls than tons of very fast ones, as the latter consume much more bandwidth and resources (consider getting all items one at a time - each request is very quick, but the overall pattern is very inefficient). On
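A back-of-the-envelope model of the batching trade-off described here (a fixed per-request overhead plus a per-item cost; the numbers and the `total_time` helper are illustrative assumptions, not measured API figures):

```python
import math

def total_time(items, batch_size, overhead=0.05, per_item=0.002):
    """Estimated wall time (seconds) to fetch `items` results when each
    request pays a fixed overhead plus a per-item cost. The defaults are
    made-up illustrative numbers, not real MediaWiki API measurements."""
    requests = math.ceil(items / batch_size)
    return requests * overhead + items * per_item

# 5000 items one at a time: 5000 requests, so the fixed overhead dominates.
one_at_a_time = total_time(5000, 1)
# 5000 items in batches of 500: only 10 requests; each call is slower,
# but the total cost is far lower.
batched = total_time(5000, 500)
print(one_at_a_time, batched)
```

Under this toy model the per-item pattern spends roughly 25x longer than the batched one, which is the "very quick, but very inefficient" point: per-call latency targets alone would penalize exactly the calls that are cheapest for the cluster.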

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-26 Thread Asher Feldman
These are all good points, and we certainly do need better tooling for individual developers. There are a lot of things a developer can do on just a laptop in terms of profiling code that, if done consistently, could go a long way, even without the setup looking anything like production. Things like
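As a sketch of the kind of laptop-only profiling being suggested: MediaWiki itself is PHP (Xdebug or XHProf would be the analogous tools there), but the general pattern, shown here in Python with the standard-library profiler, is the same; `slow_join` is a made-up example hotspot:

```python
import cProfile
import io
import pstats

def slow_join(n):
    # Deliberately quadratic string building -- the kind of hotspot
    # that local profiling catches long before production does.
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_join(10000)
profiler.disable()

# Render the top entries by cumulative time into a string report.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
print(report)
```

Running this consistently on every change, even on a dev laptop, surfaces relative regressions (a function suddenly dominating cumulative time) without needing production-like data.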

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-26 Thread Ryan Lane
On Tue, Mar 26, 2013 at 3:58 PM, Asher Feldman afeld...@wikimedia.org wrote: These are all good points, and we certainly do need better tooling for individual developers. There are a lot of things a developer can do on just a laptop in terms of profiling code that, if done consistently, could

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-26 Thread George Herbert
On Tue, Mar 26, 2013 at 8:15 PM, Ryan Lane rlan...@gmail.com wrote: On Tue, Mar 26, 2013 at 3:58 PM, Asher Feldman afeld...@wikimedia.org wrote: These are all good points, and we certainly do need better tooling for individual developers. There are a lot of things a developer can do on just a

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Arthur Richards
Right now, I think many of us profile locally or in VMs, which can be useful for relative metrics or quickly identifying bottlenecks, but doesn't really get us the kind of information you're talking about from any sort of real-world setting, or in any way that would be consistent from engineer to

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread MZMcBride
Asher Feldman wrote: I'd like to push for a codified set of minimum performance standards that new mediawiki features must meet before they can be deployed to larger wikimedia sites such as English Wikipedia, or be considered complete. These would look like (numbers pulled out of a hat, not

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Steven Walling
On Thu, Mar 21, 2013 at 6:40 PM, Asher Feldman afeld...@wikimedia.org wrote: Right now, varying amounts of effort are made to highlight potential performance bottlenecks in code review, and engineers are encouraged to profile and optimize their own code. But beyond "is the site still up?" for

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Brian Wolff
been a way to reliably measure performance... The measure part is important. As it stands I have no way of measuring code in action (sure, I can set up profiling locally, and actually have, but it's not the same; OTOH I barely ever look at the local profiling I did set up...). People throw around

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Matthew Walker
People throw around words like graphite, but unless I'm mistaken, us non-staff folks do not have access to whatever that may be. Graphite refers to the cluster performance logger available at: http://graphite.wikimedia.org/ Anyone with a labs account can view it -- which as a committer you do
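For context on what feeds a Graphite instance like the one linked above: Graphite's Carbon daemon accepts metrics as plaintext `path value timestamp` lines. A minimal sketch of formatting such a line (the metric path is a made-up example, and the conventional port 2003 mentioned in the docstring is an assumption about a default setup, not Wikimedia's actual configuration):

```python
import time

def carbon_line(path, value, timestamp=None):
    """Format one metric in Carbon's plaintext protocol: "path value timestamp".
    In a real deployment this line would be sent over TCP to the Carbon
    listener (conventionally port 2003 -- check the actual setup)."""
    if timestamp is None:
        timestamp = int(time.time())
    return "%s %s %d\n" % (path, value, timestamp)

# Hypothetical metric name, purely for illustration.
line = carbon_line("mediawiki.api.query.p99_ms", 83.5, 1363996800)
print(line)
```

Graphite then stores and graphs these time series, which is why it keeps coming up as the place where per-feature latency numbers could live.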

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-22 Thread Brian Wolff
On 2013-03-22 6:46 PM, Matthew Walker mwal...@wikimedia.org wrote: People throw around words like graphite, but unless I'm mistaken, us non-staff folks do not have access to whatever that may be. Graphite refers to the cluster performance logger available at: http://graphite.wikimedia.org/

[Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Asher Feldman
I'd like to push for a codified set of minimum performance standards that new mediawiki features must meet before they can be deployed to larger wikimedia sites such as English Wikipedia, or be considered complete. These would look like (numbers pulled out of a hat, not actual suggestions): -
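Standards of the kind proposed here are typically phrased as percentile thresholds (e.g. "p99 latency under N ms"). A minimal sketch of checking such a target against sampled latencies, using the nearest-rank percentile definition; the `meets_target` helper and the 100 ms threshold are illustrative placeholders in the spirit of the "numbers pulled out of a hat" caveat:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the smallest sample such that at least
    pct% of all samples are at or below it."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(pct / 100.0 * len(ordered))
    return ordered[max(rank - 1, 0)]

def meets_target(samples, pct=99, threshold_ms=100):
    # The threshold is a placeholder, not an actual proposed standard.
    return percentile(samples, pct) <= threshold_ms

# One pathological outlier is enough to blow a p99 target,
# which is exactly why percentiles beat averages for standards like this.
latencies = [20, 25, 30, 35, 40, 45, 50, 55, 60, 650]
print(percentile(latencies, 50))  # 40
print(meets_target(latencies))    # False: p99 is 650 ms
```

The design point: a mean over these samples looks healthy (~101 ms), while the p99 exposes the worst-case requests that a deployment standard actually cares about.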

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Matthew Walker
Asher, Do we know what our numbers are now? That's probably a pretty good baseline to start with as a discussion. "p99 banner request latency of 80ms" - Fundraising banners? From start of page load, or is this specifically how fast our API requests run? On the topic of APIs: we should set similar

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Peter Gehres
From where would you propose measuring these data points? Obviously network latency will have a great impact on some of the metrics, and a consistent location would help define the pass/fail of each test. I do think another useful benchmark for Ops features would be a set of latency-to-datacenter values,

Re: [Wikitech-l] [RFC] performance standards for new mediawiki features

2013-03-21 Thread Yuri Astrakhan
The API is fairly complex to measure and set performance targets for. If a bot requests 5000 pages in one call, together with all their links and categories, it might take a very long time (seconds if not tens of seconds). Comparing that to another API request that gets an HTML section of a page, which takes a