Well, I've tried. I fixed some problems, but one check fails on
something weird.

11:30:51 [chrome 120.0.6099.224 linux #0-0] Running: chrome (v120.0.6099.224) on linux
11:30:51 [chrome 120.0.6099.224 linux #0-0] Session ID: 61d1baf9420e4a5556fc9608d3ca3da2
11:30:51 [chrome 120.0.6099.224 linux #0-0]
11:30:51 [chrome 120.0.6099.224 linux #0-0] » tests/selenium/specs/SpecialGlobalWatchlist.js
11:30:51 [chrome 120.0.6099.224 linux #0-0] Special:GlobalWatchlist
11:30:51 [chrome 120.0.6099.224 linux #0-0]    ✖ works with normal display
11:30:51 [chrome 120.0.6099.224 linux #0-0]
11:30:51 [chrome 120.0.6099.224 linux #0-0] 1 failing (18.4s)
11:30:51 [chrome 120.0.6099.224 linux #0-0]
11:30:51 [chrome 120.0.6099.224 linux #0-0] 1) Special:GlobalWatchlist works with normal display
11:30:51 [chrome 120.0.6099.224 linux #0-0] element (".ext-globalwatchlist-site") still not existing after 10000ms
11:30:51 [chrome 120.0.6099.224 linux #0-0] Error: element (".ext-globalwatchlist-site") still not existing after 10000ms
11:30:51 [chrome 120.0.6099.224 linux #0-0]     at async Context.<anonymous> (file:///workspace/src/extensions/GlobalWatchlist/tests/selenium/specs/SpecialGlobalWatchlist.js:44:3)

(https://integration.wikimedia.org/ci/job/quibble-vendor-mysql-php83-selenium/5343/console)
I do not know what to do with it. Could the code you suggested take this
much time, or is it something different? Maybe some part of the code
can't get an answer because the CI environment isn't "a real wiki"?
Thank you.
Igal



On Fri, 19 Dec 2025 at 9:47, יגאל חיטרון <[email protected]> wrote:

> Thank you both, I'll try to debug it and return here with results.
> Igal
>
>
> On Fri, 19 Dec 2025 at 5:15, Brian Wolff <[email protected]> wrote:
>
>> > This is a very obscure corner of MediaWiki; it took me a while to get
>> this working. I'm not even sure what the division of responsibility is
>> between WikiMap and SiteLookup.
>>
>> I think WikiMap came first; Site was added later for Wikidata without
>> much consideration for what came before. I think the difference is that
>> Site is more closely tied to the sites DB table, while WikiMap is based
>> around $wgConf. In any case, I suspect the two classes should in theory
>> be merged, as both do basically the same thing.
>>
>> --
>> Brian
>>
>>
>> On Thursday, 18 December 2025, Bartosz Dziewoński <[email protected]>
>> wrote:
>>
>>> On 2025-12-19 00:08, Brian Wolff wrote:
>>>
>>>> Why can't you use the MediaWikiSite class to get a site reference,
>>>> call ->getLanguageCode(), then make a language object and use the
>>>> language object to get the direction?
>>>>
>>>
>>> Indeed, this is the best advice for this case. Just to expand on it,
>>> since there are several non-obvious steps:
>>>
>>>     use MediaWiki\MediaWikiServices;
>>>     use MediaWiki\WikiMap\WikiMap;
>>>
>>>     $services = MediaWikiServices::getInstance();
>>>
>>>     $wikiId = WikiMap::getWikiFromUrl( 'https://en.wikipedia.org/' );
>>>     // $wikiId is a string like 'enwiki' or false if the URL isn't known
>>>     $site = $services->getSiteLookup()->getSite( $wikiId );
>>>     $langCode = $site->getLanguageCode();
>>>     // $langCode is a string like 'en' or null if it's not known
>>>     $lang = $services->getLanguageFactory()->getLanguage( $langCode );
>>>     $dir = $lang->getDir();
>>>     // $dir is a string: 'ltr' or 'rtl'
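>>>
>>> Each of those lookups can come back empty when the wiki isn't known, so
>>> a guarded variant of the same calls might look like this (the 'ltr'
>>> fallback is just an assumption about what you want for unknown wikis):
>>>
>>>     $wikiId = WikiMap::getWikiFromUrl( 'https://en.wikipedia.org/' );
>>>     $site = $wikiId !== false
>>>         ? $services->getSiteLookup()->getSite( $wikiId )
>>>         : null;
>>>     $langCode = $site ? $site->getLanguageCode() : null;
>>>     // Assumed fallback: treat unknown wikis as left-to-right.
>>>     $dir = $langCode !== null
>>>         ? $services->getLanguageFactory()->getLanguage( $langCode )->getDir()
>>>         : 'ltr';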
>>>
>>> This is a very obscure corner of MediaWiki; it took me a while to get
>>> this working. I'm not even sure what the division of responsibility is
>>> between WikiMap and SiteLookup.
>>>
>>> Each wiki in a wiki farm knows a little bit about every other wiki, but
>>> only a little bit. You can use these classes to look up the language
>>> code or format links, but not much more than that. They don't know all
>>> of the other wiki's configuration, so if you ever need e.g. the list of
>>> namespaces on the other wiki, you'll have to fall back to making
>>> requests to the 'siteinfo' API or make up some configuration of your
>>> own.
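>>>
>>> A minimal sketch of that API fallback (the URL, query parameters, and
>>> error handling are just one way to do it):
>>>
>>>     $url = 'https://en.wikipedia.org/w/api.php?' . http_build_query( [
>>>         'action' => 'query',
>>>         'meta' => 'siteinfo',
>>>         'siprop' => 'namespaces',
>>>         'format' => 'json',
>>>     ] );
>>>     // HttpRequestFactory::get() returns the response body, or null on
>>>     // failure.
>>>     $json = $services->getHttpRequestFactory()->get( $url, [], __METHOD__ );
>>>     $namespaces = [];
>>>     if ( $json !== null ) {
>>>         $data = json_decode( $json, true );
>>>         $namespaces = $data['query']['namespaces'] ?? [];
>>>     }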
>>>
>>> --
>>> Bartosz Dziewoński
_______________________________________________
Wikitech-l mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
