Legoktm has uploaded a new change for review. ( https://gerrit.wikimedia.org/r/375063 )

Change subject: Use a prefetch generator to make getting the extension list faster
......................................................................

Use a prefetch generator to make getting the extension list faster

Getting the full extension list is slow because it requires making about
three network requests per extension. Since we pause starting new jobs
once we hit the limit of concurrent ones, we often spend a lot of time
blocked on those requests just to build the extension list. Moving the
generator into a background thread lets the requests continue while jobs
are running.

This adds a dependency on the "prefetch_generator" package being
installed on the host system.
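
For readers unfamiliar with the technique: a prefetch generator runs the
wrapped generator in a separate thread and buffers its output in a queue,
so the consumer only blocks when the buffer is empty. A minimal sketch of
the pattern using just the standard library (illustrative only; the real
prefetch_generator package provides its own background decorator):

import queue
import threading

def background(max_prefetch=1):
    # Illustrative stand-in for prefetch_generator.background: run the
    # decorated generator in a daemon thread, buffering up to
    # max_prefetch items so production overlaps with consumption.
    def decorator(gen_func):
        def wrapper(*args, **kwargs):
            q = queue.Queue(maxsize=max_prefetch)
            sentinel = object()  # marks the end of the stream

            def producer():
                for item in gen_func(*args, **kwargs):
                    q.put(item)  # blocks while the buffer is full
                q.put(sentinel)

            threading.Thread(target=producer, daemon=True).start()
            while True:
                item = q.get()
                if item is sentinel:
                    return
                yield item
        return wrapper
    return decorator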

Change-Id: I609b630832d5552a7aa63939037ec2c73691af2b
---
A requirements.txt
M run.py
2 files changed, 15 insertions(+), 11 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/labs/libraryupgrader refs/changes/63/375063/1

diff --git a/requirements.txt b/requirements.txt
new file mode 100644
index 0000000..1952b2e
--- /dev/null
+++ b/requirements.txt
@@ -0,0 +1 @@
+prefetch_generator
diff --git a/run.py b/run.py
index 31a4d7c..4a2666d 100755
--- a/run.py
+++ b/run.py
@@ -20,6 +20,7 @@
 from collections import defaultdict
 import json
 import os
+import prefetch_generator
 import requests
 import sys
 
@@ -39,9 +40,15 @@
 s = requests.session()
 
 
-def get_extension_list():
+@prefetch_generator.background()
+def get_extension_list(library):
     r = s.get('https://www.mediawiki.org/w/api.php?action=query&list=extdistrepos&formatversion=2&format=json')
-    yield from r.json()['query']['extdistrepos']['extensions']
+    for ext in r.json()['query']['extdistrepos']['extensions']:
+        phab = get_phab_file('mediawiki/extensions/' + ext, 'composer.json')
+        if phab:
+            version = phab.get('require-dev', {}).get(library)
+            if version:
+                yield {'ext': ext, 'version': version}
 
 
 def get_phab_file(gerrit_name, path):
@@ -94,15 +101,11 @@
     data = defaultdict(dict)
     for version in VERSIONS:
         cleanup = set()
-        for ext in get_extension_list():
-            cs = has_codesniffer(ext)
-            if cs:
-                # Save PHPCS version
-                data[ext]['PHPCS'] = cs
-                run(ext, version=version, mode='test')
-                cleanup.add(ext)
-            else:
-                print('Skipping ' + ext)
+        for info in get_extension_list('mediawiki/mediawiki-codesniffer'):
+            # Save PHPCS version
+            data[info['ext']]['PHPCS'] = info['version']
+            run(info['ext'], version=version, mode='test')
+            cleanup.add(info['ext'])
             # If more than max containers running, pause
             docker.wait_for_containers(count=CONCURRENT)
         # Wait for all containers to finish...
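
To see why this speeds up the loop above: with the decorator applied,
get_extension_list() keeps issuing its per-extension composer.json
requests in a background thread while the foreground loop is blocked in
docker.wait_for_containers(). A toy timing sketch (the sleeps and numbers
are illustrative, assuming prefetch_generator's background decorator and
its max_prefetch argument):

import time
import prefetch_generator

@prefetch_generator.background(max_prefetch=8)
def slow_items():
    for i in range(5):
        time.sleep(1)  # stands in for the network requests per extension
        yield i

start = time.time()
for item in slow_items():
    time.sleep(1)  # stands in for waiting on Docker containers
# Roughly 6 seconds instead of 10: after the first item, production and
# consumption overlap.
print('took %.1fs' % (time.time() - start))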

-- 
To view, visit https://gerrit.wikimedia.org/r/375063
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: I609b630832d5552a7aa63939037ec2c73691af2b
Gerrit-PatchSet: 1
Gerrit-Project: labs/libraryupgrader
Gerrit-Branch: master
Gerrit-Owner: Legoktm <lego...@member.fsf.org>

_______________________________________________
MediaWiki-commits mailing list
MediaWiki-commits@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-commits
