RolandUnger created this task.
RolandUnger added projects: Wikidata, LuaSandbox, hardware-requests, MediaWiki-extensions-WikibaseClient.
Herald added a project: Operations.


If you look at the profiling data for the article "Halle (Saale)" on the German Wikivoyage, you will see that the total computing time is between 8 and 10 seconds. This is critical because the maximum Lua execution time is limited to 10 s, and I think it is already too much. A few weeks ago it took only about 4 seconds. I do not know what happened, because we made only minor changes to our scripts, and unfortunately we have no log to analyze the time consumption. The article on Halle (Saale) serves as a test case to learn what would happen if we used Wikidata data on a grand scale.

Most of the computing time is spent in the vCard template. On average it takes about 15 ms if no Wikidata call is made and about 65 ms with Wikidata calls.

There are 170 expensive Lua calls:

  • 50 mw.title.new( 'Media:' .. image ).exists calls, about 10 ms each
  • 120 mw.wikibase.getEntity( id ) calls, about 50 ms each
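For reference, the two expensive call patterns look roughly like this in a Scribunto module (a sketch only; `image` and `id` stand for the template parameters):

```lua
-- Sketch of the two expensive call patterns listed above.
-- 'image' and 'id' are hypothetical placeholders for template parameters.

-- ~10 ms each: checking whether a media file exists.
-- Accessing .exists counts against the expensive-function limit.
local title = mw.title.new( 'Media:' .. image )
local imageExists = title and title.exists

-- ~50 ms each: fetching a complete Wikidata entity.
local entity = mw.wikibase.getEntity( id )
```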

Of course this is a huge number of getEntity() calls, but the same can happen on the Wikipedias as well, for instance when fetching reference data.

As I said, I do not know why the computing time is so high. I can think of several possible reasons:

  • Wikivoyage is hosted on a low-performance server. In the last few days there were often long loading times for articles, edit mode, and watchlists. Maybe there are too many requests to Wikivoyage. The Alexa rank is now about 16,000, similar to Wikidata's (about 14,000).
  • Wikidata is hosted on a low-performance server.
  • We are trying to avoid mw.title.new( 'Media:' .. image ).exists calls. The 50 image checks mentioned above are for invisible images, which are shown only on the map; we need them for maintenance. We have already proposed several ways to overcome this (T179636, T189406), but for now we have no alternative. Eventually this will reduce the time consumption by about half a second.
  • A template without a module invocation takes about 5 ms, with a module invocation about 15 ms or more. Maybe there is room for optimization here. In the case of many template/module calls, parallel execution could reduce the waiting time (though of course not the computing time itself). It would also help if Lua data could be stored for the whole parsing process instead of being discarded after each module invocation.
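One existing mechanism in this direction is mw.loadData, which loads a module's data table once per page parse and shares it between all #invoke calls. A minimal sketch, with hypothetical module names:

```lua
-- Module:VCard/Data (hypothetical name): static lookup tables.
-- A module used with mw.loadData must return a table of plain data
-- (no functions, no metatables).
return {
    types = {
        hotel      = 'Hotel',
        restaurant = 'Restaurant',
    },
}
```

```lua
-- In the consuming module (e.g. a hypothetical Module:VCard):
-- the table is loaded and converted only once per page parse,
-- no matter how many times the template is invoked.
local data = mw.loadData( 'Module:VCard/Data' )
```

This only covers static data, however; results of computations or Wikidata fetches still cannot be shared across invocations.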
  • Of course the main optimization has to be undertaken at Wikidata itself. If we cannot reduce the Wikidata access time, Wikidata will become useless. We make only one mw.wikibase.getEntity( id ) call per template. Several lookup tables help us to avoid additional Wikidata calls. We have already checked all opportunities to reduce the Wikidata access time and made a proposal to reduce the computing time of mw.wikibase.getEntity( id ) calls (T179638).
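To keep it at one fetch per item even if the same id is requested more than once, the getEntity() call can be memoized in a module-level table. Note that in Scribunto each #invoke runs in a fresh environment, so such a cache only helps within a single module invocation, not across the whole page. A sketch:

```lua
-- Memoized wrapper around mw.wikibase.getEntity (sketch).
-- The cache only survives within one #invoke, since Scribunto
-- gives every invocation a fresh environment.
local entityCache = {}

local function getEntityCached( id )
    if entityCache[ id ] == nil then
        -- store false for missing entities so we never retry them
        entityCache[ id ] = mw.wikibase.getEntity( id ) or false
    end
    return entityCache[ id ] or nil
end
```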

I hope you can help us to reduce the computing time.


