[Wikitech-l] Templatestyles in refs

2024-05-07 Thread Strainu
Hi folks,

I'm trying to group 2 named references (which are identical as far as
humans are concerned) together. I'm doing it on rowp, but the code is very
close to enwp. One reference is generated from a template via a module
calling Module:Citation/CS1, the other is generated by the same code, but
the calling template is substituted.

The Lua code generating the reference tag is:

frame:extensionTag("ref", refText, { name = citationHash })

(refText is returned by Module:Citation/CS1)

I isolated the problem to the <templatestyles> tag added by
Modul:Citation/CS1 - moving it from inside the <ref> to outside of it
solves the issue. More precisely, the parser seems to generate a
different strip marker for each invocation, even if the <templatestyles>
content is identical. This problem seems related to the one outlined in
[1].

I have 2 questions related to this:

1. How should I change the Lua code to allow for templatestyles? According
to the Lua reference manual [2], extensionTag is equivalent to a call to
frame:callParserFunction()
<https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#frame:callParserFunction>
with function name '#tag', which suggests to me this should be the correct
invocation with respect to [1].
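
For illustration, here is the kind of change I was considering - an untested
sketch, where the strip-marker pattern, the CS1 entry point and the styles
page name are all assumptions on my part:

```
local p = {}

-- Hypothetical helper: drop the <templatestyles> strip marker from the
-- citation text. The pattern is an assumption about how strip markers look
-- (\127...UNIQ--templatestyles-...-QINU...\127) and would need checking.
local function stripTemplatestyles(s)
    return (s:gsub('\127[^\127]*UNIQ%-%-templatestyles%-%x+%-QINU[^\127]*\127', ''))
end

function p.ref(frame)
    -- refText is what Module:Citation/CS1 returns today; the 'citation'
    -- entry point name is an assumption.
    local refText = require('Module:Citation/CS1').citation(frame)

    -- Remove the strip marker so that two identical citations produce
    -- byte-identical <ref> content, then name the ref after a hash of it.
    local bareText = stripTemplatestyles(refText)
    local citationHash = mw.hash.hashValue('md5', bareText)
    local ref = frame:extensionTag('ref', bareText, { name = citationHash })

    -- Emit the styles once, outside the <ref>; the styles page name is the
    -- one used by the enwp module and may differ on rowp.
    local styles = frame:extensionTag('templatestyles', '', {
        src = 'Module:Citation/CS1/styles.css'
    })
    return ref .. styles
end

return p
```

Reimplementing strip-marker handling in the module feels fragile, though,
which is why I'd prefer a supported way of doing this.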

2. Will a ref with a call to {{citation}} (which is simply a pass-through
to the module) with the exact same parameters as in the module also work?
The use-case is to generate a ref containing the template on substitution
instead of the ton of metadata generated by the module.

Thanks,
   Strainu

[1]
https://www.mediawiki.org/wiki/Help:Cite#Substitution_and_embedded_parser_functions
[2]
https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#frame:extensionTag
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] "Known languages" or similar?

2024-01-07 Thread Strainu
Hi folks,

I'm trying to add a "translate" link to [[:ro:Template:Ill-wd]] (which
indicates a subject by it's wikidata id) and I need to determine the
original language. Is there a way to determine if the current user
prefers/knows some languages except the wiki's own language? I know I can
use the interface language, but for the vast majority of users that's
identical to the content language.

I vaguely remember that Content Translation asked me at some point about
what languages it should use to provide suggestions, but I can't find that
setting now. Does it still exist and is it available somehow from outside
CX?

Are there any other data sources I can use (and which don't ruin caching,
either)?

Thanks,
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: ORES To Lift Wing Migration

2023-09-23 Thread Strainu
Hi folks,

So glad to see the old and new ML teams have an open discussion about this
subject.

I understand that the team might prefer to have several tickets for
different issues, but the discussion about the general approach to the
different models is of interest to many people and is more easily digested
over email. I would suggest continuing to discuss the merits of the current
strategy (and not necessarily of one model or another) over email.

* One model per wiki or overall
This is a tough one. :) As a user, I remember how hard it was for Romanian
speakers to complete the training data for damaging/goodfaith and would
prefer to not have to do it again.

However, I'm also worried that some specificities of larger wikis would
creep into the output, leading to reverts that would normally not happen on
my wiki. For instance, articles about smaller settlements are not accepted
on enwp, while they are on rowp. I don't know how to test this myself, and I
haven't seen anything about it in the research.

Another problem I have is that I'm not sure how the revert-risk score should
be matched against custom damaging/goodfaith thresholds. Are there any
guidelines on this other than "test"?

* Multiple criteria vs. a single score
I think the discussion has been very much about reverts, but as Sj said,
each of these scores captures a slightly different facet. Is there data
available on the prevalence of other use-cases or is everyone just writing
revert bots?

In the long run, I believe a single model good enough for revert bots can be
developed. However, it would be great if there were some clear quality
criteria that the community could verify, with the old models maintained
for a wiki until we are sure the new model meets those criteria on that
wiki.

A change in hosting should not be the guiding force in any team's roadmap;
the needs of its users should be.

Have a good weekend,
 Strainu




On Saturday, 23 September 2023, Luca Toscano wrote:
>
>
> On Fri, Sep 22, 2023 at 11:34 PM Aaron Halfaker 
wrote:
>>
>> All fine points.  As you can see, I've filed some phab tasks where I saw
a clear opportunity to do so.
>
> Thanks a lot! We are going to review them next week and decide the next
steps, but we'd like to proceed anyway to migrate ores to ores-legacy on
Monday (this will allow us to free some old nodes that need to be decommed
etc..). Adding features later on to the models on Lift Wing should be
doable, and our goal is to transition away from ores-legacy in a few months
(to avoid maintaining too many systems). The timeline is not yet set in
stone, we'll update this mailing list when the time comes (and we'll follow
up with the remaining users of ores-legacy as well). To summarize: we start
with Ores -> Ores Legacy on Monday, and we'll do Ores Legacy -> Lift Wing
in a second step.
>>
>> >  as mentioned before all the models that currently run on ORES are
available in both ores-legacy and Lift Wing.
>>
>> I thought I read that damaging and goodfaith models are going to be
replaced.  Should I instead read that they are likely to remain available
for the foreseeable future?   When I asked about a community discussion
about the transition from damaging/goodfaith to revertrisk, I was imagining
that many people who use those predictions might have an opinion about them
going away.  E.g. people who use the relevant filters in RecentChanges.
Maybe I missed the discussions about that.
>
> This is a good point, I'll clarify the documentation on Wikitech. As long as
models are being used we'll not remove them from Lift Wing, but we'll propose
using Revert Risk where it is suited, since it is a model family in which we
decided to invest time and effort. Basic maintenance will be performed on
the goodfaith/damaging/articlequality/etc.. models on Lift Wing, but we
don't have (at the moment) any bandwidth to guarantee retraining or more
complex workflows on them. This is why we used the term "deprecated" on
Wikitech, but we need to specify what we mean to avoid confusion. Thanks
for the feedback :)
>
>>
>> I haven't seen a mention of the article quality or article topic models
in the docs.  Are those also going to remain available?  I have some user
scripts that use these models and are relatively widely used.  I didn't
notice anyone reaching out. ... So I checked and setting a User-Agent on my
user scripts doesn't actually change the User-Agent.  I've read that you
need to set "Api-User-Agent" instead, but that causes a CORS error when
querying ORES.  I'll file a bug.
>
> Will update the docs as well, as mentioned above we'll keep the current
ORES models available on Lift Wing. Eventually new models will be proposed
by Research and other teams (like Revert Risk), and at that point we (as ML
team) will decide what recommendation to give. Nothing will be removed from
Lift Wing if there are active users on it, but we'll certainly try to
reduce t

[Wikitech-l] Re: ORES To Lift Wing Migration

2023-08-04 Thread Strainu
Hi Chris & ML team,

Good to see LiftWing is finally becoming a reality. There are a few things
in the documentation that I would like to clarify.

1. In [1], the bot owner is encouraged to move to the revertrisk score.
However, in [2], it's explicitly mentioned that the model should not be
used for "Auto-removing edits that a user makes without another editor in
the loop". So, should bot owners currently reverting based on goodfaith and
damaging scores explore the new models? If so, do you have any suggestions
on how to automatically match thresholds between the old and new models?
2. I could not find any reference regarding the ORES scores exposed through
other APIs (specifically the RC API [3]). Will those be available going
forward? Under which names?
3. Will it still be possible to (re-)train existing and new models for a
specific wiki? How and when?

Thanks,
  Strainu

[1]
https://wikitech.wikimedia.org/wiki/ORES#Example:_migrating_a_Bot_from_ORES_to_Lift_Wing
[2]
https://meta.wikimedia.org/wiki/Machine_learning_models/Proposed/Language-agnostic_revert_risk#Users_and_uses
[3]
https://ro.wikipedia.org/w/api.php?action=query&format=json&list=recentchanges&rcnamespace=0%7C4%7C6%7C8%7C10&rcprop=title%7Ctimestamp%7Cids%7Coresscores%7Ctags%7Cpatrolled&rcshow=unpatrolled&rclimit=50&rctype=edit%7Cnew%7Ccategorize

On Thu, 3 Aug 2023 at 17:16, Chris Albon wrote:

> Hi everybody,
>
> TL;DR We would like users of ORES models to migrate to our new open source
> ML infrastructure, Lift Wing, within the next five months. We are available
> to help you do that, from advice to making code commits. It is important to
> note: All ML models currently accessible on ORES are also currently
> accessible on Lift Wing.
>
> As part of the Machine Learning Modernization Project (
> https://www.mediawiki.org/wiki/Machine_Learning/Modernization), the
> Machine Learning team has deployed a Wikimedia’s new machine learning
> inference infrastructure, called Lift Wing (
> https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing). Lift Wing
> brings a lot of new features such as support for GPU-based models, open
> source LLM hosting, auto-scaling, stability, and ability to host a larger
> number of models.
>
> With the creation of Lift Wing, the team is turning its attention to
> deprecating the current machine learning infrastructure, ORES. ORES served
> us really well over the years, it was a successful project but it came
> before radical changes in technology like Docker, Kubernetes and more
> recently MLOps. The servers that run ORES are at the end of their planned
> lifespan and so to save cost we are going to shut them down in early 2024.
>
> We have outlined a deprecation path on Wikitech (
> https://wikitech.wikimedia.org/wiki/ORES), please read the page if you
> are a maintainer of a tool or code that uses the ORES endpoint
> https://ores.wikimedia.org/). If you have any doubt or if you need
> assistance in migrating to Lift Wing, feel free to contact the ML team via:
>
> - Email: m...@wikimedia.org
> - Phabricator: #Machine-Learning-Team tag
> - IRC (Libera): #wikimedia-ml
>
> The Machine Learning team is available to help projects migrate, from
> offering advice to making code commits. We want to make this as easy as
> possible for folks.
>
> High Level timeline:
>
> **By September 30th 2023: *Infrastructure powering the ORES API endpoint
> will be migrated from ORES to Lift Wing. For users, the API endpoint will
> remain the same, and most users won’t notice any change. Rather just the
> backend services powering the endpoint will change.
>
> Details: We'd like to add a DNS CNAME that points ores.wikimedia.org to
> ores-legacy.wikimedia.org, a new endpoint that offers an almost complete
> replacement of the ORES API calling Lift Wing behind the scenes. In an
> ideal world we'd migrate all tools to Lift Wing before decommissioning the
> infrastructure behind ores.wikimedia.org, but it turned out to be really
> challenging so to avoid disrupting users we chose to implement a transition
> layer/API.
>
> To summarize, if you don't have time to migrate before September to Lift
> Wing, your code/tool should work just fine on ores-legacy.wikimedia.org
> and you'll not have to change a line in your code thanks to the DNS CNAME.
> The ores-legacy endpoint is not a 100% replacement for ores, we removed
> some very old and unused features, so we highly recommend at least testing
> the new endpoint for your use case to avoid surprises when we make the
> switch. In case you find anything weird, please report it to us using the
> aforementioned channels.
>
> **September to January: *We will be reaching out to every user of ORES we
> can identify and working with them to make the migration process as easy as
> possible.
>
> **By January 2024

[Wikitech-l] Maps as article image?

2023-05-26 Thread Strainu
Hey folks,

Maps and article images have been good additions to the multimedia
capabilities of Wikipedia in the last decade, and both are widely used on my
home wiki. For now, in Wikipedia the maps are also rendered as images and
become interactive only when clicked on. This makes them potential candidates
for being displayed as article images in articles without a photo.

I would like to find out how technically complicated it would be to make
the article image extension also capable of using map images. I know this
is probably nowhere on the roadmap; I'm only interested in the technical
part of the idea.

Thanks,
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: VisualEditor inserting `<br>`

2023-03-13 Thread Strainu
Hi Robert,

While waiting for VE devs to respond, there are a few things you could do
to narrow-down the issue:
1. Check what the diff window shows (click on Publish changes, then on the
popup "Review your changes" in the lower-left corner).
2. Check if there was actually one or more newline(s) (\n) inserted in
wikitext.In this case, the parser probably tries to simplify the wikitext
unless a hooman decided otherwise (this seems in line with the
"alienInline" you see).

Regards,
  Strainu


On Mon, 13 Mar 2023 at 09:37, Robert Vogel via Wikitech-l <
wikitech-l@lists.wikimedia.org> wrote:

> Hi everyone!
> Inspired by
> https://www.mediawiki.org/wiki/VisualEditor/Gadgets#Implementing_a_custom_command
> I was trying to add a `` into the VE using this command:
>
> ```
> ve.init.target.getSurface().getModel().getFragment().insertContent( [ {
> type: 'break' }, { type: '/break' } ] );
> ```
>
> While it actually inserted the line break in visual edit mode, there was
> no `<br>` in the wikitext after saving the page or switching to wikitext
> mode within the edit session.
> I also tried to implement the whole "command/tool" in an extension, but
> the behavior was the same. The odd thing is that a `<br>` inserted in
> wikitext mode survives the round trip. The linear data model shows it as
> an "alienInline" node then.
>
> Any ideas why the official example didn't work for me?
>
> Greetings,
> Robert
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Filtered lists with checkboxes

2022-10-26 Thread Strainu
On Tuesday, 25 October 2022, Bináris wrote:
>
>
> On Tue, 25 Oct 2022 at 19:20, Strainu wrote:
>>
>> If you're ok with editing a list of titles, petscan [1] is all you need.
>
> Thank you!
> Unfortunately, this does not run on my home wiki, so it lacks the
advantage of seeing the article preview when I hover my mouse over the
title, but anyway, it is useful!

Not sure about page pop-ups, but adding the Wikidata label and description to
the output seems like an easy improvement. Patches are welcome:
https://github.com/magnusmanske/petscan_rs/pulls?q=is%3Apr+is%3Aclosed

:)
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Filtered lists with checkboxes

2022-10-25 Thread Strainu
On Tuesday, 25 October 2022, Bináris wrote:
> Thank you! " editing the wikipage and deleting the unwanted titles" --
yes, this is just what I want, The list may be on a temporary page, and
deleting the unwanted is even better then first marking it. Then I can use
this page as a source for the bot.
> Is there a list of these tools? I am not familiar with Toolforge or with how
to find a tool by purpose.

If you're ok with editing a list of titles, petscan [1] is all you need.

If you've never used it, here is how I would do it:

1. Generate the initial list of articles: on the "Categories" tab, select
your wiki and your category. On the output tab, select Format plain text.
This will get you the list.

2. Prepare the workspace: After you refresh (in order to reset all fields),
go to the "Other sources" tab and input your list of titles in the "Manual
list" field, then just below add the wiki and click on "Do it!". A list of
results will appear, along with a number called psid, which uniquely
identifies the list. For example: "PSID is 23121189". Copy the link and
send it to your users.

3. Each user can then edit the article list, click on "Do it!" and give you
their psid.

4. (Optional) If you want to do more complex operations on the results, set
the output to PagePile and ask the users for the PagePile ID (or the url
they are redirected to when clicking Do it). Then use
https://pagepile.toolforge.org/?menu=filter to combine them.

Now, pywikibot does have some support for petscan, but I believe it does
not support psid. However, it's trivial to scrape the plain-text output.

Good luck!

Strainu

[1] https://petscan.wmflabs.org/


>
> On Tue, 25 Oct 2022 at 1:10, Strainu wrote:
>>
>> There are several tools working with PagePiles that can achieve the same
result, but they are all basically equivalent to editing the wikipage and
deleting the unwanted titles, which doesn't seem to be what you want.
>
>
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Filtered lists with checkboxes

2022-10-24 Thread Strainu
On Monday, 24 October 2022, Bináris wrote:
> Hi,
[...]
> By that time, do you know about such service on the tolserver? Or can I
do it myself somehow with Lua?

Hi Binaris,

For Commons files, there is https://pagepile-visual-filter.toolforge.org/
It shouldn't be too complicated to extend it to any list, but it's not
there yet. Maybe ask the maintainer for an extended version?

There are several tools working with PagePiles that can achieve the same
result, but they are all basically equivalent to editing the wikipage and
deleting the unwanted titles, which doesn't seem to be what you want.

HTH,
 Strainu


>
> --
> Bináris
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: TDF is looking for community representatives

2022-10-14 Thread Strainu
Erica,

There are a lot of emails on this list and Wikimedia-l starting with "you
can find translations of this announcement on meta". I think this is a very
effective way to indicate translations where needed, while keeping the
announcement in a single place. Sending us to a link feels like the catchy
press titles: "you won't believe what's happening! Click here to find out!"

Second, as much as we want to be multilingual, participating in the
technical community without some command of English is basically
impossible. In that respect, the technical audience is not the same as a
general Wikimedia audience.

My 2c,
 Strainu

On Monday, 10 October 2022, Erica Litrenta wrote:
> (Sorry to "hijack" the thread, I am not personally involved in TDF but
since I was the original "messenger", I'm interested in learning more about
Daniel's POV.
> The original email linked to
https://www.mediawiki.org/wiki/Technical_decision_making/Community_representation
.
> While I'm well aware that info a click away is not optimal, I'm
definitely more against walls of text that may be hard to understand for
non-native readers.
> That page was marked for translation instead, and among other things, it
offered exactly the process you are describing, and the second email asked
specifically for recommendations.
> We had /also/ asked a few dozen colleagues for recs, and none of the
people pinged gave their availability.
> Interested to hear what could have been done differently.)
> On Thu, Oct 6, 2022 at 1:06 PM Daniel Kinzler 
wrote:
>>
>> Am 06.10.2022 um 08:52 schrieb Linh Nguyen:
>>
>> Kunal,
>> I hear you but we only have 3 people who actually put the effort into
applying for the position.  We are appointing people who are at least
trying to help.  If you want to help in the process please feel free to put
your name on the list.
>>
>> The original mail doesn't really make it clear what impact one might
have by joining, or what would be expected of a member. Asking people to
click a link for details loses most of the audience already.
>>
>> One thing that has worked pretty well in the past when we were looking
for people to join TechCom was to ask for nominations, rather than
volunteers. We'd then reach out to the people who were nominated, and asked
them if they were interested.  Self-nominations were of course also fine.
>>
>> Another thing that might work is to directly approach active volunteer
contributors to production code. There really aren't so many really active
ones. Ten, maybe.
>>
>> --
>> Daniel Kinzler
>> Principal Software Engineer, Platform Engineering
>> Wikimedia Foundation
>>
>> ___
>> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
>> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>>
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
> --
>
>
> Erica Litrenta (she/her)
>
> Senior Manager, Community Relations Specialists (Product)
>
> Wikimedia Foundation
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] A random thank you to the Wikimedia tech community

2022-08-30 Thread Strainu
Hi all,

With the risk of being off-topic, I want to express my gratitude to
all the members of the Wikimedia tech community for being such a
supportive and helpful group! Not only here, but on all communication
channels.

It's been years since one of my questions (most of which could be
classified as obscure) has gone unanswered. Also, I recently had to go
through all my emails since June and I noticed that except for a few
announcements and obvious spam, all other threads had at least
one answer. For me, this is the sign of a great community to be in.

Thanks again and keep up the good work!
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


[Wikitech-l] Re: 3D models in Commons

2022-07-03 Thread Strainu
On Sunday, 3 July 2022, Derk-Jan Hartman wrote:
> You mean 3d models besides the type we already support ?
> https://www.mediawiki.org/wiki/Extension:3D

Yes, specifically the formats that support textures. There is a ticket list
in phab: https://phabricator.wikimedia.org/maniphest/query/ZgJlL4Jm.OCj/#R

Strainu
>
> On 3 Jul 2022, at 11:49, Strainu  wrote:
> Hey folks,
>
> I know it's a bit early in the fiscal year and that's probably why I
can't find anything on wiki, but I understood that the new yearly plan puts
a lot of emphasis on the multimedia features. Does that include making 3D
models usable in our wikis? If there are planned projects related to that
in this fiscal year, would it be possible to get a link to the project page?
>
> I'm asking because there are a lot of cool, freely licensed models out
there just waiting to be imported...
>
> Thank you,
>  Strainu ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] 3D models in Commons

2022-07-03 Thread Strainu
Hey folks,

I know it's a bit early in the fiscal year and that's probably why I can't
find anything on wiki, but I understood that the new yearly plan puts a lot
of emphasis on the multimedia features. Does that include making 3D models
usable in our wikis? If there are planned projects related to that in this
fiscal year, would it be possible to get a link to the project page?

I'm asking because there are a lot of cool, freely licensed models out
there just waiting to be imported...

Thank you,
 Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: namespace names vs interlanguage links

2022-06-05 Thread Strainu
Amir,

In Romanian, this kind of possible, but highly unlikely problem is called
"drob de sare" (salt stone). I'll let you use your language skills to find
out why :)

Let the community be and find their own ways to deal with the problem, if
it ever becomes a real one. If one really wants to link to the Sanskrit
Wikipedia, they can do so using [[:w:sa:...]]. Bugs in Pwb can be solved if
you log them - but does Tyap have any bots today?

No need to have users in a new language write in English just because some
problems might occur in certain very particular scenarios.

My 2c,
 Andrei

On Saturday, 4 June 2022, Amir E. Aharoni wrote:
> Hi,
> I've recently discovered that namespace names may have an ambiguity with
interlanguage links: If a namespace name is the same as a language code,
using it in wikitext poses all kinds of challenges.
> Actual example: In the Tyap language (code kcg), the Wikipedia in which
was created a few days ago, the Category namespace is called "Sa:", which
is also the language code and, hence, the interlanguage link code for
Sanskrit.
> So, "Sa" is usable in wikitext, but has all kinds of little issues. For
example, old-style non-Wikidata interlanguage links to Sanskrit from the
Tyap Wikipedia are probably impossible. They are not very likely to be
inserted into articles, but still, it's somewhat conceivable. I also
noticed that it confuses Pywikibot in some ways. And I can imagine other
subtle bugs that it will cause.
> I've asked Tyap speakers whether it's possible to change the word for
"Category" to something else. No—they want to use "Sa". It's legitimate not
to want to change the word for a technical reason.
> So what can be done?
> The editors there told me that it's OK for them to use "[[Category:" in
wikitext, but they would like to see "Sa:" in the title of category pages.
I'm not sure that it's possible: as far as I know, the namespace name
definition in MessagesKcg.php will be used for both things, and if Visual
editor is used to add categories, it will add "[[Sa:". Bots or gadgets can
be used to replace it with "Category", but it looks like an ugly hack.
> Does anyone have better ideas for a robust, comprehensive solution?
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Different cache invalidation rules for similar users?

2022-04-06 Thread Strainu
On Wed, 6 Apr 2022 at 22:55, Krinkle wrote:
>
> On Mon, 4 Apr 2022, at 10:12, Strainu wrote:
>
> Thank you for your responses folks. The script is a gadget [1], loaded
> and unloaded through the preferences.
>
> Regards,
>Strainu
>
> [1] https://ro.wikipedia.org/wiki/MediaWiki:Gadget-wikidata-description.js
>
>
> This page has a history of two revisions, both 25 Mar, about 10 minutes apart.
>
> Is the reported issue that its last edit [1] was seemingly not applied for 
> some editors? E.g. they kept getting the previous version with the 
> getElementByID error?

No, the issue is that the users would disable the gadget and it would
still be enabled after 12h.

Strainu

>
> -- Krinkle
>
> [1] 
> https://ro.wikipedia.org/w/index.php?title=MediaWiki%3AGadget-wikidata-description.js=revision=14854456=14854443
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Different cache invalidation rules for similar users?

2022-04-04 Thread Strainu
Thank you for your responses, folks. The script is a gadget [1], loaded
and unloaded through the preferences.

Regards,
   Strainu

[1] https://ro.wikipedia.org/wiki/MediaWiki:Gadget-wikidata-description.js

On Mon, 4 Apr 2022 at 04:20, Krinkle wrote:
>
> On Sun, 3 Apr 2022, at 17:57, Strainu wrote:
>
> Hi,
>
> I've recently seen some complaints from 2 users located in the same country 
> that it takes about half a day for the Javascript changes to propagate. Users 
> from different countries but similar user rights don't seem to have this 
> problem.
>
> Is it possible to have different cache invalidation rules for different 
> countries? If not, what else could cause this behavior?
>
>
> It depends on what kind of changes and to what piece of JavaScript code.
>
> My guess would be that this is a change not to deployed software or gadgets 
> or site scripts, but a user script. And that the user script is loaded by URL 
> via importScriptURI or mw.loader.load. And that the URL is non-standard (e.g. 
> not exactly /w/index.php?title=..&action=raw&ctype=text/javascript, but with 
> other parameters or different order or different encoding). This means that 
> it is not purged on edits.
>
> In that case, it will stay cached. It might then be that someone near one 
> data center is lucky that the URL is not used there before and sees no cache. 
> Or that near another data center the URL is not popular enough to stay in the 
> CDN and thus falls out before the 7 day expiry despite no observed edit or 
> purge.
>
> To know for sure, I would need to see the specific script edit and how the 
> script is loaded.
>
> -- Krinkle
>
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Different cache invalidation rules for similar users?

2022-04-03 Thread Strainu
Hi,

I've recently seen some complaints from 2 users located in the same country
that it takes about half a day for the Javascript changes to propagate.
Users from different countries but similar user rights don't seem to have
this problem.

Is it possible to have different cache invalidation rules for different
countries? If not, what else could cause this behavior?

Thanks,
  Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Is there still a maximum page size in effect?

2022-02-05 Thread Strainu
On Sat, 5 Feb 2022 at 19:53, Andre Klapper wrote:
>
> On Sat, 2022-02-05 at 18:43 +0200, Strainu wrote:
> > I am aware of the various limits in the NewPP report. I'm trying to
> > determine whether we currently have some max page size (before or after
> > processing).
> >
> > The documentation on mw.org and en.wp is a bit confusing on the
> > subject and personal experimentation shows that substituting
> > templates allows me to go past the 2MiB page size.
>
> What does "personal experimentation" mean exactly? There might be
> exceptions like https://phabricator.wikimedia.org/T188852 but generally
> speaking, as neither
> https://noc.wikimedia.org/conf/InitialiseSettings.php.txt nor
> https://noc.wikimedia.org/conf/CommonSettings.php.txt seem to change
> the MediaWiki software default setting defined in
> https://phabricator.wikimedia.org/source/mediawiki/browse/master/includes/DefaultSettings.php$2673
> I'd assume that we're at 2MiB.

Hey Andre,

Thanks for taking the time to respond to my curiosity during the weekend.

Here is what I experimented with:
* for post-parser size (which would have been my first guess given
that for templates we count the *Post‐expand include size*) I just
measured the size of the .mw-parser-output div. For this, I took the
output of [1] (which is a mix of included and substituted templates
and is displayed just fine) and it was well over 8MiB.
* for pre-parser size, I took the same page [1], added some more
templates and started saving while substituting them (e.g. I have 1.5
MiB of text and a few thousand templates which I substitute in one go).
One version which goes over the 2MiB limit is [2].

Now, because of the way those templates are written (pretty verbose
and with a lot of whitespace) the difference between the template and
the result of the substitution is small, so the page size is just over
the limit, but one can imagine that substituting a template which expands
to near the 2MiB limit a couple of times could take a page to ~4MiB without
much effort.

I understand from your message that these are bugs and the limit is
still enforced. However, I still don't understand the logic of having
a limit on the wikitext size of pages, but a post-expand limit (if I
understand correctly, that is after templates go through the parser) for
templates. Why not have a single limit set to something like 8, 12 or 16
MiB and counted after all the processing is done?

Thanks again,
   Strainu

[1] 
https://ro.wikipedia.org/w/index.php?title=Bunuri_mobile_din_domeniul_%C8%99tiin%C8%9Bele_naturii_clasate_%C3%AEn_patrimoniul_cultural_na%C8%9Bional_al_Rom%C3%A2niei_aflate_%C3%AEn_municipiul_Bucure%C8%99ti_(tezaur)=14712585
[2] 
https://ro.wikipedia.org/w/index.php?title=Utilizator:Strainu/2=14784282


>
> See also https://phabricator.wikimedia.org/T189108 about requesting an
> increase, and https://phabricator.wikimedia.org/T181907#3835654
> for some more background.
>
> Cheers,
> andre
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Is there still a maximum page size in effect?

2022-02-05 Thread Strainu
Hi,

I am aware of the various limits in the NewPP report. I'm trying to
determine whether we currently have some max page size (before or after
processing).

The documentation on mw.org and en.wp is a bit confusing on the subject and
personal experimentation shows that substituting templates allows me to go
past the 2MiB page size.

If we don't have such a limit, I'm not exactly sure why we need the
Post‐expand include size limit. Why is output generated by transclusion
harder on the parser than output directly in the page?

Thanks,
 Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: [Wikimedia-l] Re: Re: Uplifting the multimedia stack (was: Community Wishlist Survery)

2022-01-12 Thread Strainu
On Tue, 11 Jan 2022 at 08:01, Kunal Mehta wrote:
>
> So I think the status quo can be changed by just about anyone who is
> motivated to do so, not by trying to convince the WMF to change its
> prioritization, but just by doing the work. We should be empowering
> those people rather than continuing to further entrench a WMF technical
> monopoly.
>

Counterexample:
https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/message/G2QTRJFAUKLE45SFTFUHOOTOBR6G3DP3/
(this was the situation that I quoted in my first email on this thread
as the WMF refusing to even do reviews).

Maybe it's just the multimedia part that is in this desperate
situation, but I can totally see volunteer developers getting
discouraged quickly if their patches are outright ignored.

Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: [Wikimedia-l] Uplifting the multimedia stack (was: Community Wishlist Survery)

2021-12-30 Thread Strainu
> So where is the best current place to discuss scaling Commons, and all
that entails?

My impression is that we don't have one. All we hear is "it needs to be
planned", but there is no transparency on what that planning involves or
when it actually happens.

> I'd be surprised if the bottleneck were people or budget

The main problem I see is that we end up in this kind of situation. Scaling
and bug-fixing critical features should be part of the annual budget. Each
line of code deployed to production wikis should have an owner and an
associated maintenance budget each year. Without this, the team will not
even commit to reviews - see the thread on wikitech a few months back where a
volunteer programmer willing to work on Upload Wizard was basically told
"We will not review your code. Go fork."

> Some examples from recent discussions

Also improvements to the Upload Wizard. There are quite a few open items in
Phab on this.

I really hope you will have better luck than others with bringing this
issue up in the priority list for next year - multimedia support is growing
more outdated by the minute.

Strainu

On Thursday, 30 December 2021, Samuel Klein wrote:
> Separate thread.  I'm not sure which list is appropriate.
> ... but not all the way to sentience.
>
> The annual community wishlist survey (implemented by a small team,
possibly in isolation?) may not be the mechanism for prioritizing large
changes, but the latter also deserves a community-curated priority queue.
To complement the staff-maintained priorities in phab ~
> For core challenges (like Commons stability and capacity), I'd be
surprised if the bottleneck were people or budget.  We do need a shared
understanding of what issues are most important and most urgent, and how to
solve them. For instance, a way to turn Amir's recent email about the
problem (and related phab tickets) into a family of persistent,
implementable specs and proposals and their articulated obstacles.
> An issue tracker like phab is good for tracking the progress and
dependencies of agreed-upon tasks, but weak for discussing what is
important, what we know about it, how to address it. And weak for
discussing ecosystem-design issues that are important and need persistent
updating but don't have a simple checklist of steps.
> So where is the best current place to discuss scaling Commons, and all
that entails?  Some examples from recent discussions (most from the wm-l
thread below):
> - Uploads: Support for large file uploads / Keeping bulk upload tools
online
> - Video: Debugging + rolling out the videojs player
> - Formats: Adding support for CML and dozens of other common high-demand
file formats
> - Thumbs: Updating thumbor and librsvg
> - Search: WCQS still down, noauth option wanted for tools
> - General: Finish implementing redesign of the image table
>
> SJ
> On Wed, Dec 29, 2021 at 6:26 AM Amir Sarabadani 
wrote:
>>
>> I'm not debating your note. It is very valid that we lack proper support
for multimedia stack. I myself wrote a detailed rant on how broken it is
[1] but three notes:
>>  - Fixing something like this takes time, you need to assign the budget
for it (which means it has to be done during the annual planning) and if
gets approved, you need to start it with the fiscal year (meaning July
2022) and then hire (meaning, write JD, do recruitment, interview lots of
people, get them hired) which can take from several months to years. Once
they are hired, you need to onboard them and let them learn about our
technical infrastructure which takes at least two good months. Software
engineering is not magic, it takes time, blood and sweat. [2]
>>  - Making another team focus on multimedia requires changes in planning,
budget, OKR, etc. etc. Are we sure moving the focus of teams is a good
idea? Most teams are already focusing on vital parts of wikimedia and
changing the focus will turn this into a whack-a-mole game where we fix
multimedia but now we have critical issues in security or performance.
>>  - Voting Wishlist survey is a good band-aid in the meantime. To at
least address the worst parts for now.
>>
>> I don't understand your point tbh, either you think it's a good idea to
make requests for improvements in multimedia in the wishlist survey or you
think it's not. If you think it's not, then it's offtopic to this thread.
>> [1]
https://lists.wikimedia.org/hyperkitty/list/wikimedi...@lists.wikimedia.org/message/WMPZHMXSLQJ6GONAVTFLDFFMPNJDVORS/
>> [2] There is a classic book in this topic called "The Mythical Man-month"
>>
>> On Wed, Dec 29, 2021 at 11:41 AM Gnangarra  wrote:
>>>
>>> we have to vote for regular maintenance and support for
essential functions like uploading files which is the core mission of
Wikimedia Commons
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Limit to the number of images in a page?

2021-11-29 Thread Strainu
Hi,

I have some wikipages with a large number of images (1000+). Those
pages never load completely, as upload.wikimedia.org starts returning
429 Too many requests after a while.

This limit does not seem to be documented on mediawiki.org, so I would
like to know what the exact value is and whether there is a way to work
around it (other than splitting the pages).

Thanks,
   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


[Wikitech-l] Re: How do I make VE *not* save/recover my changes?

2021-10-27 Thread Strainu
Thanks David. I just tried this and it only seems to work if I go back to
the article via the tabs; no other tab or link works. I've logged
https://phabricator.wikimedia.org/T294463 so the team can check if this is
intended or not.

Have a good day,
Strainu

On Wednesday, 27 October 2021, David Lynch wrote:
> Leave the editing mode "cleanly" -- saving, navigating back to the
article via the tabs, following a sidebar link to another page, hitting
escape, whatever. If you stop editing in a way that we can tell is
intentional, you'll be asked if you want to discard your changes, and if
you say that you do then it'll all get cleaned up. (We can't tell the
difference between "I deliberately closed this tab because I want to get
rid of this" and "I accidentally closed the wrong tab and I'll be very
upset if my changes are lost", unfortunately...)
> If you want it to never autosave your changes, you could disable local
session storage for the wikis you use at the browser level. But that might
have side-effects outside of VE.
> We don't have any preferences that'd control the autosave, and I don't
think that we have any current plans to implement something like that.
> ~David
> On Wed, Oct 27, 2021 at 11:14 AM Strainu  wrote:
>>
>> Hi,
>>
>> I've been searching quite a bit on MediaWiki.org but I can't find how
>> to tell the VisualEditor to stop saving and (especially) recovering
>> changes that I haven't explicitly saved. Is there a method?
>>
>> Thanks,
>> Strainu
>> ___
>> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
>> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>>
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] How do I make VE *not* save/recover my changes?

2021-10-27 Thread Strainu
Hi,

I've been searching quite a bit on MediaWiki.org but I can't find how
to tell the VisualEditor to stop saving and (especially) recovering
changes that I haven't explicitly saved. Is there a method?

Thanks,
Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/


Re: [Wikitech-l] Maps Modernization plan - FYI

2021-03-22 Thread Strainu
Hi Erica,

Thanks for the announcement, I'm glad to see some love given to Maps. Could
you explain how this initiative interacts with the planned improvements
from WMDE [3]?

Thank you,
   Strainu

[3] https://meta.wikimedia.org/wiki/WMDE_Technical_Wishes/Geoinformation


On Mon, 22 Mar 2021 at 19:11, Erica Litrenta wrote:

> Greetings,
>
> This is a follow up from our last email some months ago (and a crosspost).
> You may already have seen today's announcement from Legal about the
> upcoming changes to the Maps Terms of Use. Here is an extra heads-up that 
> Wikimedia
> Maps are transitioning towards a more modern architecture. The first phase
> of this transition will be replacing Tilerator [0] with Tegola [1] as our
> vector tile server. This is a change in the Maps infrastructure, so there
> should be little to no impact to the end users’ experience.
>
> It is important that we are able to provide software that is sustainable
> to support, before we can guarantee a reliable user experience. Wikimedia
> Maps aim to provide Wikimedia users a consistent experience contributing to
> and learning about geoinformation. To achieve this goal, we will empower
> those engineers maintaining the Wikimedia Maps infrastructure to do so with
> ease and low effort.
>
> If you want to learn more, please head to mediawiki.org [2], where you
> will also find a Questions & Answers section.
>
> Thanks, and take care,
>
> Erica Litrenta (on behalf of the Product Infrastructure team)
>
> [0] https://wikitech.wikimedia.org/wiki/Maps/Tilerator
>
> [1] https://tegola.io/
> [2] https://www.mediawiki.org/wiki/Wikimedia_Maps/2021_modernization_plan
>
> --
>
> --
>
>
> Erica Litrenta (she/her)
>
> Manager, Community Relations Specialists
>
> Wikimedia Foundation <https://meta.wikimedia.org/wiki/User:Elitre_(WMF)>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The future of UploadWizard

2021-02-04 Thread Strainu
On Thu, 4 Feb 2021 at 21:31, Ostrzyciel Nożyczek wrote:
> The things that I have on mind are:
>
> Rework config handling to make it more consistent (now only campaign configs 
> are parsed, the main config is not) and robust (unit testing included!).
> Simplify the task of including more licenses in UW (message loading based on 
> config), add more built-in icons to make that even simpler for site admins.
> Change the tutorial from an image to wikitext, which should be much easier to 
> edit.
> Restructure documentation to be third-party centric, maybe make a brief 
> configuration guide (configuring UW now requires one to carefully study a 
> not-so-friendly PHP file).
> Add a few quick hacks to make the UI responsive, at least to some degree 
> (that is very much possible with just CSS). The solution can be polished 
> later.
> Remove Wikibase-related code and other Wikimedia-specific stuff that will 
> just make testing harder.
> Improve configurability for all fields in the wizard, ensure sensible default 
> settings.
> Add an option to use single-language fields. Multi-language fields are 
> unnecessary on most wikis.
> Look into how different stages of UW could be streamlined / improved to make 
> the upload process faster, especially on wikis requiring less detailed 
> information.
> Make all kinds of file description syntax configurable.
> (Maybe) prepare and package a few ready-to-use configuration sets, together 
> with the templates necessary to make it work. That would really simplify the 
> process of bringing UW to a wiki.

Just a quick note to say that out of the 11 items you list above, 8
would also improve the Wikimedia experience :)

Strainu

>
> ...and more! This may be a bit ambitious, but I think it's doable with just a 
> few people interested in the project and some spare time. I am certainly on 
> board. :P
>
>
> --
> Ostrzyciel (he/him)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The future of UploadWizard

2021-02-04 Thread Strainu
On Thu, 4 Feb 2021 at 16:54, Bartosz Dziewoński wrote:
>
> On 2021-02-03 23:33, Strainu wrote:
> > One thing that puzzles me in that ticket is this phrase from Mark
> > Traceur: "It might be better to look at something (slightly) more
> > modern, like the upload dialog in core". Does anyone know what that
> > dialog is? AFAIK the uploader in core (Special:Upload) hasn't changed in
> > decades, except maybe for the look of the buttons. Its usability is
> > rubbish compared to UW. Wikis used to (no, actually they still do)
> > customize it using the uselang param,which messes with the user's
> > settings. I can't really understand how that would be better...
>
> The upload dialog is this: https://www.mediawiki.org/wiki/Upload_dialog
>
> It's accessible from both the visual and wikitext editors (unless you
> disabled the toolbar), though their dialogs to insert image thumbnails.

Thanks to all who enlightened me. :)

We're basically talking about the cross-wiki uploader here in the Wikimedia
world (although I'm sure it can be used for other things). I agree
with Ostrzyciel's assessment that it lets anyone upload anything -
that's what prompted the request to disable cross-wiki uploads in the
first place. The UW, in collaboration with campaigns, remains the most
powerful web uploader the Wikimedia community currently has.

Strainu
>
> --
> Bartosz Dziewoński
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The future of UploadWizard

2021-02-03 Thread Strainu
As the deafening silence of this thread probably shows, a discussion is not
really possible. The WMF has had 0 interest in making uploads easier in the
last few years.

To be fair, faced with furious opposition from the Commons community to
even basic improvements such as allowing imports from sites other than
Flickr, and with requests to stop cross-wiki uploads, this decision does not
seem out of place.

As one of the few people that has enabled UW in another Wikimedia wiki, I
would like to encourage you to follow through on your plan to improve the
wizard as much as possible. Plans at the WMF change often and not necessarily
for the better. A responsive design would be awesome news for wikis that need to
guide their users through the mess that is freedom of panorama.

One thing that puzzles me in that ticket is this phrase from Mark Traceur: "It
might be better to look at something (slightly) more modern, like the
upload dialog in core". Does anyone know what that dialog is? AFAIK the
uploader in core (Special:Upload) hasn't changed in decades, except maybe
for the look of the buttons. Its usability is rubbish compared to UW. Wikis
used to (no, actually they still do) customize it using the uselang
param, which messes with the user's settings. I can't really understand how
that would be better...

Andrei

On Sunday, 31 January 2021, Ostrzyciel Nożyczek <
ostrzycielnozyc...@gmail.com> wrote:

> Hi,
>
> I would like to uhhh... start the discussion? ask for opinions? about the
> future of UploadWizard.
>
> It is a rather special extension, that was from the start made mostly for
> Commons' very specific needs and getting it to work anywhere else presents
> some challenges (some of which I attempt to tackle here
> ). Interestingly, it still is
> used by many third-party wikis
>  and although some
> of them don't need its full set of capabilities related to describing
> licenses, authors and sources, there are wikis that do need that. The wiki
> I maintain, Nonsensopedia, has a Commons-like file description system based
> on Semantic MediaWiki (see example here
> ) and
> UploadWizard has been a *blessing* for us, greatly simplifying the task
> of file moderation.
>
> Opinion time: Wikis should be *encouraged* to properly describe the
> authorship of files that they use, to meet the licensing requirements. IMO
> Wikimedia Foundation as the maintainer of MediaWiki and a foundation
> dedicated to dissemination of free culture should provide a usable tool
> for properly describing free multimedia. UploadWizard could be just that.
>
> At the same time, the extension has been basically unmaintained
>  since the Multimedia
> team was dissolved and I've been rather surprised to discover that patches
> improving third-party support were met with uhm... very limited
> enthusiasm?  There are
> a few obvious features lacking like mobile support (seriously, try opening
> https://commons.wikimedia.org/wiki/Special:UploadWizard on a narrow
> screen device, it's been like this since.. always) and configurability (you
> have to jump through some serious hoops
>  to just add a
> license; customizing the tutorial is similarly hard).
>
> I've been thinking of what to do with the above and I really wouldn't want
> to embark on something that will be rendered redundant or obsolete in a
> year, so my question is: are there any plans for UploadWizard? What makes
> me suspect that things may change is primarily Structured Data on Wikimedia
> Commons, which in the future will maybe (?) supersede the description
> system around the {{Information}} template. Are there any rough roadmaps or
> outlines of anything resembling a plan for that? If Commons was to
> implement full, structured file descriptions in the upload tool, that code
> would be probably hardly usable outside Commons, given that Wikibase is not
> something easy to install or maintain, it is also awfully overkill for the
> vast majority of applications. In such a situation, would it make sense to
> consider completely separating the "Wikimedia Commons Shiny Upload Tool"
> from a more general extension that would be usable for third
> parties, stripped of any Commons-specific code? A lot of things could be
> much simplified if the extension was to target just the needs of third
> parties and not Commons.
>
> I ask about this because I really don't see any sort of interest of the
> extension's *de facto* owner (and that is WMF) in developing it, there
> are also no public plans for it, as far as I know. Yes, I can make a fork
> anytime, but first I'd prefer to know if I'm not missing something. Well,
> actually, I already did make a fork of UW
> 

Re: [Wikitech-l] The watchlist queue does not work recently

2020-12-29 Thread Strainu
On Tue, 29 Dec 2020 at 23:46, Andre Klapper wrote:
>
> On Tue, 2020-12-29 at 18:37 +, Tom Doles via Wikitech-l wrote:
> Yeah, I also had to deal with such unhelpful responses before.
>
> I'm sorry to hear that. The process of gathering sufficient info in
> tickets can unfortunately sometimes be confusing, surprising, or
> frustrating, given the many technical Wikimedia areas and complexity.
>
> > Some advice to avoid this:
>
> Customer service rule no. 1: listen to the customer. It's not helpful
> to argue and I'd assume that's not what Andre's employer expects from
> him.
>
>
> There might be a misunderstanding:

Not a misunderstanding, more like a difference in the chosen meaning
of the term. I think what Tom was suggesting is that the bugwrangler,
just like anyone doing any commercial activity in this world, has
customers to which they provide a service - in this case, the services
described at [[:mw:Bugwrangler]] [0], and specifically the following
line: "Work with members of the community who report bugs to clarify
any ambiguity in the bug descriptions and get all the information
required to reproduce the bugs". For the purposes of this thread, your
customers are the members of the community who take time out of their
day to report bugs.

The "rules" described in Tom's email are good practices that can be
encountered, under different forms, in many companies' core values.
[1] While WMF does not rank them this highly, I would not dismiss them
as "not my job". My suggestion would be to ask for more constructive
feedback instead:
* Why is the "How to report a bug" page not helpful?
* Where do you need more info?
* How can the bugwrangler help more while keeping in mind he needs to
scale his methods to hundreds or thousands of bug reporters every
month?

[0] As a sidenote, I personally find that page to be comprehensive and
providing an appropriate level of detail
[1] https://builtin.com/company-culture/company-core-values-examples

>
> Phabricator is an issue tracker where people interact in their many
> different roles (readers, editors, developers, managers, translators,
> document writers, designers, etc etc etc). Anyone can report Wikimedia
> related technical issues there, by following
> https://www.mediawiki.org/wiki/How_to_report_a_bug
>
> If you are looking for a 'customer service' support venue, then you may
> want to check https://www.mediawiki.org/wiki/Project:Support_desk for
> MediaWiki, or https://meta.wikimedia.org/wiki/Tech for tech issues on
> Wikimedia wikis, or contact the OTRS mail queues.
>
>
> Regarding [part of] my work, https://www.mediawiki.org/wiki/Bugwrangler
> tries to outline some duties. (As I was explicitly mentioned.)
>
> Hope that helps a bit. :)

Same here :)

Strainu

>
> Cheers,
> andre
>
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The watchlist queue does not work recently

2020-12-29 Thread Strainu
On Tuesday, 29 December 2020, Andre Klapper wrote:

> On Tue, 2020-12-29 at 20:14 +0200, יגאל חיטרון wrote:
> > Sure. As you said, this very link requires full reproduction steps.
>
> No. It says "Full details of the issue, giving as much detail as possible."


That's just... Confusing. I can totally understand why someone would feel
discouraged from logging an issue. How about "Full details of the issue,
giving all the information you currently have. If that is insufficient you
will be asked for additional information along with guidance on how to
obtain it."?

An example of what "minimized steps" means might also be a good idea.

Strainu


> andre
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What is JSON (in JavaScript code)?

2020-10-30 Thread Strainu
Thanks Dan and Roy, apparently our local Twinkle (TW) copy was overriding JSON
for some reason. Fixed it.

Strainu

On Fri, 30 Oct 2020 at 17:20, Roy Smith wrote:
>
> JSON is JavaScript Object Notation.  It's a way of encoding structured data 
> as text strings which originated (as the name implies) in JavaScript, but is 
> now widely used as a data exchange format, with support in nearly every 
> programming language.  https://www.w3schools.com/js/js_json.asp
>
> But, in the context you're using it, it's a library of JSON parsing and 
> encoding functions built into the javascript implementation on most browsers. 
>  https://www.w3schools.com/Js/js_json_parse.asp
>
> If you've opened your browser's console, you should be able to type JSON at 
> it and get back something like:
>
> JSON
> JSON {Symbol(Symbol.toStringTag): "JSON", parse: ƒ, stringify: ƒ}
>
>
> If you get something like "JSON is not defined", you're probably running an 
> ancient browser.
>
>
>
>
> On Oct 30, 2020, at 11:05 AM, Strainu  wrote:
>
> Hi,
>
> I'm looking at solving the following console warning on ro.wp:
> "JQMIGRATE: jQuery.parseJSON is deprecated; use JSON.parse" which
> appears due to outdated Twinkle code. Just making the replacement does
> not work, since JSON is not defined. As a matter of fact, I cannot
> find it anywhere else in the code loading on a normal Romanian
> Wikipedia page.
>
> Alas, the generic name of that object makes searching on mw.org or
> Google rather useless. I can see some similar changes in Phabricator,
> but they seem to work.
>
> So, what is JSON and how can I use it in my code?
>
> Thanks,
>   Strainu
>
> P.S. Please don't suggest updating Twinkle...
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] What is JSON (in JavaScript code)?

2020-10-30 Thread Strainu
Hi,

I'm looking at solving the following console warning on ro.wp:
"JQMIGRATE: jQuery.parseJSON is deprecated; use JSON.parse" which
appears due to outdated Twinkle code. Just making the replacement does
not work, since JSON is not defined. As a matter of fact, I cannot
find it anywhere else in the code loading on a normal Romanian
Wikipedia page.

Alas, the generic name of that object makes searching on mw.org or
Google rather useless. I can see some similar changes in Phabricator,
but they seem to work.

So, what is JSON and how can I use it in my code?

Thanks,
   Strainu

P.S. Please don't suggest updating Twinkle...

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Wikimedia-l] Wikimedia Chat

2020-08-30 Thread Strainu
On Sun, 30 Aug 2020 at 03:00, Amir Sarabadani wrote:
>
> Hello,
> Due to the current situation, there are more and more collaborations
> happening online instead. and now you can see Wikimedia-related discussion
> groups in Slack, Discord, Telegram, Facebook, and many more. Besides being
> scattered and inaccessible to people who don't have accounts in those
> platforms (for privacy reasons for example), these platforms use
> proprietary and closed-source software, are outside Wikimedia
> infrastructure and some harvest our personal data for profit.

Hey Amir,

Please take this email as positive feedback, even if it might not
sound like it :)

As much as I value software freedom and my personal data, I've learned
during the years that the best conversations happen where people
converge naturally, not where one wants them to be. What you describe
below is an awesome list of features ... that already exist elsewhere.
I could give you an equally long list of things that are missing, but
individually, none of them matter. What matters is which platform most
people choose, based on which of the features are important for them.
And that platform might be different for different projects.

What that means is that Wikimedia Chat will be just another name in
that long list of apps that people choose to use or not. It's fine if
you want to maintain it, it's great if it will gain traction, but
don't be too upset if it ends up with the same usage as Wikimedia Spaces.

Strainu
>
> IRC on freenode is a good alternative but it lacks basic functionalities of
> a modern chat platform. So we created Wikimedia Chat, a mattermost instance
> in Wikimedia Cloud. Compared to IRC, you have:
> * Ability to scrollback and read messages when you were offline
> * Push notification and email notification
> * You don't need to get a cloak to hide your IP from others
> * Proper support for sharing media
> * Two factor authentication
> * A proper mobile app support
> * Ability to add custom emojis (yes, it's extremely important)
> * Profile pictures
> * Ability to ping everyone with @here
> * much much more.
>
> You can use Wikimedia Chat by going to https://chat.wmcloud.org, anyone can
> make an account. This is part of Wikimedia Social suite [1], the oher
> similar project is "Wikimedia Meet". [2]
>
> Some notes:
> * This is done in my volunteer capacity and has been maintained by a group
> of volunteers. If you're willing to join the team (either technical or
> enforcing CoC, kicking out spammers, other daily work), drop me a message.
> * Privacy policy of Wikimedia Cloud applies: https://w.wiki/aQW
> * As a result, all messages older than 90 days get automatically deleted.
> * As a Wikimedia Cloud project, all of discussions, private and public are
> covered by Code of conduct in technical spaces:  https://w.wiki/AK$
>
> Hope that would be useful for you, if you encounter any technical issues,
> file a bug in the phabricator.
>
> [1] https://meta.wikimedia.org/wiki/Wikimedia_Social_Suite
> [2] https://meta.wikimedia.org/wiki/Wikimedia_Meet
>
> Best
> --
> Amir (he/him)
> ___
> Wikimedia-l mailing list, guidelines at: 
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
> https://meta.wikimedia.org/wiki/Wikimedia-l
> New messages to: wikimedi...@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Change of translation for "attribution" in CC licenses

2020-07-20 Thread Strainu
Hi folks,

Sorry for cross-posting, not sure which list is the best venue for my problem.

I have an issue with regard to the translation of the word
"attribution" in "Creative Commons Attribution-Share-Alike". For
reasons (explained in [1]) which are not interesting for Wikimedia,
the CC-sanctioned Romanian translation has changed from "distribuire"
to "partajare" in the translation for version 4.0 *only*.

This becomes a problem for multilingual wikis (mw, m, c), which use
meta-templates and MediaWiki messages to translate the {{cc-by-sa-*}}
templates. What would be the easiest way to solve the problem without
affecting other languages?

Thanks,
   Strainu


[1] (in Romanian)
https://www.cyberculture.ro/2020/07/20/licente-creative-commons-versiunea-4-romana/

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Video on Wikipedia

2020-05-23 Thread Strainu
On Saturday, 23 May 2020, Andre Klapper wrote:

> On Sat, 2020-05-23 at 22:29 +0300, Strainu wrote:
> > Is there a page where I can find a matrix or some other executive
> > summary of what players we're using on what browsers?
>
> https://www.mediawiki.org/wiki/Extension:TimedMediaHandler#Client_support
> in theory. In practice that seems to be about the Kaltura player but
> not VideoJS? Ah, that page does not even provide a link to
> https://www.mediawiki.org/wiki/Extension:TimedMediaHandler/VideoJS_Player
> Keeping documentation up-to-date: Not always the main focus it seems.
>
> Probably also depends on whether you are logged in and if you have
> enabled "New video player" under "Preferences > Beta features".


Yup, that was causing videojs to appear. Thanks!


> Probably also depends on whether your web browser (version) supports
> HTML5: https://phabricator.wikimedia.org/T100106
>
> Probably also depends on which exact website you are on:
> https://phabricator.wikimedia.org/T248418
>
> > I've tried to push for more video to be used in Infoboxes just to see
> > examples of why that is a terrible idea [1].
> > [1] https://ro.wikipedia.org/wiki/Fi%C8%99ier:IMG_20200523_183135.jpg
>
> I don't know what "is a terrible idea" in the image. Please be explicit
> in what you expect and what happens instead so we don't have to guess.


The movie is left-aligned with a huge padding on the right which makes the
Infobox too wide.

As I said, I suspect that the issue is the Infobox CSS, not the video
player, but unfortunately on my machines it still looks good.

Strainu


> Cheers,
> andre
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Video on Wikipedia

2020-05-23 Thread Strainu
Hi folks,

Is there a page where I can find a matrix or some other executive summary
of what players we're using on what browsers?

Also, how can one force the use of a certain player? I'm asking because
I've tried to push for more video to be used in Infoboxes just to see
examples of why that is a terrible idea [1]. Now, I strongly suspect that
the issue is the Infobox code rather than the player, but I'm not sure how
to debug it, since I see the videojs on all my browsers.

Strainu

[1] https://ro.wikipedia.org/wiki/Fi%C8%99ier:IMG_20200523_183135.jpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Maps on wikidata

2020-05-10 Thread Strainu
Hi,

Thanks for the responses.

On Saturday, 9 May 2020, Marius Hoch wrote:

> Hi Strainu,
>
> as Michael already pointed out, the (currently hard-coded) zoom value can
> be found in CachingKartographerEmbeddingHandler::getWikiText.
>
> I guess we could try to derive the zoom from the precision of the
> coordinate (GlobeCoordinateValue::getPrecision), but other than that, I
> don't have a (nice and easy to implement) idea for improving this.


Other things that I have seen done in modules throughout the ecosystem are
using the bbox limits (P1333) and the surface area. I am also considering
using the geo data from Commons, although that's in pretty bad shape (pun
intended). The precision is indeed another solution, although I would leave
it as the last option, since it depends more often on the data source than
on the subject itself. I hope some of those can be reused in PHP as well.
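
To make the bbox idea a bit more concrete, this is roughly the heuristic I
have in mind - a sketch in Python, purely illustrative; the tile math, the
zoom bounds and the Romania numbers below are my own assumptions, not
anything Kartographer or Wikibase currently do:

import math

def zoom_from_bbox(north, south, east, west, min_zoom=1, max_zoom=13):
    """Pick a slippy-map zoom so the bounding box roughly fits one tile.

    At zoom z a tile covers 360 / 2**z degrees of longitude, so take the
    largest z whose tile still contains the longer side of the box."""
    extent = max(abs(north - south), abs(east - west))
    if extent <= 0:
        return max_zoom  # effectively a point
    zoom = int(math.floor(math.log2(360.0 / extent)))
    return max(min_zoom, min(max_zoom, zoom))

# Romania spans roughly 43.6-48.3 N and 20.2-29.7 E, which gives zoom 5 here.
print(zoom_from_bbox(48.3, 43.6, 29.7, 20.2))

The same function could take the precision-derived extent as a fallback when
no bbox statements exist.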

Any suggestions on how to prioritize those and/or other data sources in Lua
are welcome. I asked about the php code as I thought you might have some
"instance of"-based algorithm I could replicate.

Strainu


> Cheers
> Marius
>
> On 5/9/20 12:42 AM, Michael Holloway wrote:
>
>> Hi Strainu,
>>
>> It's probably best if a Wikibase dev confirms, but I think this is what
>> you're looking for:
>> https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/ext
>> ensions/Wikibase/+/master/lib/includes/Formatters/CachingKar
>> tographerEmbeddingHandler.php#198
>>
>> -mdh
>>
>> On Fri, May 8, 2020 at 12:44 PM Strainu  wrote:
>>
>> Hey folks,
>>>
>>> Can someone point me to the code that decides which zoom to use for
>>> the maps that are displayed in the items with coordinates?
>>>
>>> Thanks,
>>> Strainu
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Maps on wikidata

2020-05-08 Thread Strainu
Hey folks,

Can someone point me to the code that decides which zoom to use for
the maps that are displayed in the items with coordinates?

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SQL for reverted and changes

2019-09-09 Thread Strainu
On Sun, 8 Sept 2019 at 13:03, Federico Leva (Nemo) wrote:
>
> This is not a trivial query to perform. There are various strategies and
> possible data sources:
> https://meta.wikimedia.org/wiki/Research:Revert

Thanks Federico, that Python code is very close to what (I believe) I need.
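
To make the identity-revert idea concrete, here is a minimal sketch of it
(Python 3 with the requests library; the wiki URL, the page title and the
function name are just illustrative examples, not code from the tools or
the Research:Revert page):

import requests

API = "https://ro.wikipedia.org/w/api.php"  # any wiki's api.php works here

def identity_reverted_revisions(title, limit=500):
    """Return revision IDs undone by a later revision with an identical
    SHA-1 (the simple "identity revert" definition)."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|sha1",
        "rvlimit": limit,
        "rvdir": "newer",  # oldest first
        "format": "json",
        "formatversion": 2,
    }
    revs = requests.get(API, params=params).json()["query"]["pages"][0]["revisions"]
    reverted = []
    seen = {}  # sha1 -> index of the earliest revision with that content
    for i, rev in enumerate(revs):
        sha1 = rev.get("sha1")
        if not sha1:
            continue  # revision-deleted content exposes no hash
        if sha1 in seen:
            # everything between the earlier copy and this one was reverted
            reverted.extend(r["revid"] for r in revs[seen[sha1] + 1:i])
        else:
            seen[sha1] = i
    return sorted(set(reverted))

print(identity_reverted_revisions("Exemplu"))

It obviously ignores partial reverts, but for a quick metric it matches the
identity-revert definition mentioned below.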

Strainu

>
> If you only care about identity reverts, working with rev_sha1 might be
> enough for you.
>
> Federico
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] SQL for reverted and changes

2019-09-07 Thread Strainu
Hi,

Can someone help me with an SQL snippet for reverted changes? I know some
tools have this metric but I am not able to isolate the relevant query in
their code.

Thanks,
  Strainu
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Introducing WikiContrib 1.0!

2019-09-07 Thread Strainu
On Saturday, 7 September 2019, יגאל חיטרון wrote:

> Hi. So, it's not just me. I tried multiple times, on desktop and tablet,
> and it never succeeded.
> Igal


After carefully reading the docs I finally figured it out: you need to fill
*all* the fields. This was not obvious to me, especially since I don't use
my full name anywhere in the wikiverse.

The tool now works and I concur it is pretty cool, on par with what other
platforms offer. GG!

Strainu

>
>
> On Friday, 6 September 2019, 23:31, Andre Klapper <
> aklap...@wikimedia.org> wrote:
>
> > On Fri, 2019-09-06 at 21:38 +0300, Strainu wrote:
> > > Not sure if it isn't just my internet connection, but I can't seem to
> > make
> > > the service work on my phone (Android 9 with Opera). It always seems to
> > get
> > > stuck after I hit search.
> > >
> > > Is it supposed to work on mobile?
> >
> > Maybe related to current network issues that Wikimedia is facing.
> >
> > WikiContrib works on my mobile phone. I get results after clicking the
> > magnifier icon (but I'm neither using Opera nor Android).
> >
> > andre
> > --
> > Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> > https://blogs.gnome.org/aklapper/
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Introducing WikiContrib 1.0!

2019-09-06 Thread Strainu
Hey Rammanoj,

Not sure if it isn't just my internet connection, but I can't seem to make
the service work on my phone (Android 9 with Opera). It always seems to get
stuck after I hit search.

Is it supposed to work on mobile?

Thanks,
  Strainu

On Thursday, 5 September 2019, Rammanoj Potla wrote:

> Hello folks,
>
>
> I am happy to introduce you to the first version of WikiContrib
> <https://tools.wmflabs.org/wikicontrib/>. WikiContrib is a developer
> metrics tool which can be used to view a developer’s contributions on
> Phabricator and Gerrit. This tool was initially designed keeping a
> Wikimedia Hackathon scholarship committee in mind and with the hope that
> the tool will make it easier for them to decide on a candidate’s
> application. All community members can also use the tool to learn more
> about the contributions of fellow Wikimedians or discover their own!
>
>
> I developed the WikiContrib tool as part of my Google Summer of Code
> project with guidance and support from my mentors Suchakra Sharma
> <https://phabricator.wikimedia.org/p/Tuxology/> and Srishti Sethi
> <https://phabricator.wikimedia.org/p/srishakatux/>.
>
>
> Here are some relevant links:
>
>-
>
>Tool is hosted on Toolforge https://tools.wmflabs.org/wikicontrib/
>-
>
>Source code is available on GitHub
>https://github.com/wikimedia/WikiContrib/
>-
>
>Link to my Phabricator proposal https://phabricator.wikimedia.
> org/T220254
>-
>
>Learn how to use the tool
>https://wikicontrib.readthedocs.io/en/latest/Usage.html
>
>
> Try the tool, and if you encounter any bugs or have any feature requests,
> please file them in the GitHub repository! For anything else, you can
> comment on the Phabricator proposal.
>
>
> Looking forward to your response!
>
>
> Thanks,
> Rammanoj potla
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] extension1 (x1) db primary master failover (read-only required) 3rd July 06:00 AM UTC

2019-07-03 Thread Strainu
Manuel,

Thank you for the notifications. May I suggest that in the future you
also include a list of services/wikis affected? I personally have no
idea what "extension1" means.

Thank you,
   Strainu

On Wed, 3 Jul 2019 at 09:10, Manuel Arostegui wrote:
>
> This was done.
> Read only start: 06:00:36 UTC
> Read only stop: 06:01:56 UTC
> Total read only time: 01:20 min
>
> On Wed, Jul 3, 2019 at 7:00 AM Manuel Arostegui 
> wrote:
>
> >
> > On Mon, Jul 1, 2019 at 3:48 PM Manuel Arostegui 
> > wrote:
> >
> >> Hello,
> >>
> >> We need to switchover x1 primary master from db1069 to db1120
> >> https://phabricator.wikimedia.org/T226358
> >>
> >> db1069 is a very old host that has hardware issues, it is out of warranty
> >> and needs to be decommissioned.
> >> Given that x1 cannot be set on read-only on a MW level, we will need to
> >> go read-only at a MySQL level.
> >>
> >> We are going to do this on Wednesday 3rd July at 06:00AM UTC. We expect
> >> around 1 minute of read-only time if everything goes as expected.
> >>
> >> Impact: Writes will be blocked. Reads will remain unaffected.
> >>
> >> Communication will happen on #wikimedia-operations
> >> If you are around at that time and want to help with the monitoring,
> >> please join us!
> >>
> >> Thanks
> >> Manuel.
> >>
> >
> > Hello,
> >
> > This will start in 1 hour.
> >
> > Manuel.
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-18 Thread Strainu
On Sun, 17 Mar 2019 at 23:22, Gergo Tisza wrote:

> On Sat, Mar 16, 2019 at 8:23 AM Strainu  wrote:
>
> > A large backlog by itself is not alarming. A growing one for
> > components deployed to WMF sites is. It indicates insufficient
> > attention is given to ongoing maintenance of projects after they are
> > no longer "actively developed", which in turn creates resentment with
> > the reporters.
> >
>
> It really doesn't. The backlog is the contact surface between stuff that
> exists and stuff that doesn't; all the things we don't have but which seem
> realistically within reach. As functionality expands, that surface expands
> too. It is a normal process.

Except that functionality doesn't expand for products that are not actively
developed, while the backlog does.

> (We do have projects which are basically unmaintained. Those are not
> typically the ones producing lots of new tasks though, since most relevant
> issues have been filed already. And realistically the choice is between
> having poorly maintained components and having far less components. Would
> undeploying UploadWizard, for example, reduce resentment? I don't think so.)

It's all relative: if UW were undeployed in favor of a different
component that covered some of the stuff lacking from UW, then I
don't think we'd see much resentment. I would personally love to see
regular code stewardship reviews for every deployed component which
hasn't had one in 2-3 years. After a couple of such iterations, I'm
pretty sure we'd have a non-negligible number of extensions
undeployed. Would that lead to resentment? Sure, but I don't think the
level would be comparable. The main problem I see is there is no good
way to decide how important something is beyond usage metrics.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-16 Thread Strainu
On Sat, 16 Mar 2019 at 15:55, David Barratt wrote:
>
> Perhaps a better example would be the Drupal community who has a total of
> ~1,071,600 issues and ~282,350 of those are open
> https://www.drupal.org/project/issues and they have several organizations
> https://www.drupal.org/organizations working on the software.

Maybe, maybe not - I'm not familiar with Drupal development, but
precisely because of the fragmented contributions, chances are some
bugs fall between teams. As discussed previously in the thread, MW
development is much more centralized, so better coordination is to be
expected.

That being said, their org stats are pretty awesome; is there any way
to obtain similar stats from Phabricator/Gerrit (at least by email
domain if nothing else)?

>
> I do not understand how a large backlog is a problem. It is not an
> indication of anything.

A large backlog by itself is not alarming. A growing one for
components deployed to WMF sites is. It indicates insufficient
attention is given to ongoing maintenance of projects after they are
no longer "actively developed", which in turn creates resentment with
the reporters.

I've checked the burnup graphs Andre referred to for some of the
extensions with high editor visibility (UW, VE, CX) and they all have
a similar pattern - huge increase in the first ~12 months after being
widely deployed, then a much reduced, but visible, growing rate, with
some sharp decreases which correspond to a peak in activity (new team
culling the backlog? volunteer developer solving a few bugs?). I tried
to compare that with the overall pattern, but Phabricator timed out -
if somebody could obtain and publish the overall burnup rate data
somewhere, that would be great.

I guess the question is what's an acceptable backlog growth rate (key
secondary question: for whom?) and whether it differs between projects.
I don't know how to respond to that.

> On Fri, Mar 15, 2019 at 12:25 PM Strainu  wrote:
>
> > On Thu, 14 Mar 2019 at 22:23, Gergő Tisza wrote:
> > > About backlogs in general, Chromium is probably the biggest
> > > open-source Google repo; that has currently 940K tickets, 60K of which
> > are
> > > open, and another 50K have been auto-archived after a year of inactivity.
> > > (As others have pointed out, having a huge backlog and ruthlessly closing
> > > tasks that do not get on the roadmap are the only two realistic options,
> > > and the latter does have its advantages, no one here seems to be in favor
> > > of it.) We have 220K tasks in total, 40K of which are open, so that's in
> > > the same ballpark
> >
> > That's an overstatement: 18% (not counting bugs closed as declined) is
> > almost double to 11%. If you're going this route, we're doing much
> > worse than Chromium.
> >
> > >
> > > On Wed, Mar 13, 2019 at 3:02 PM Strainu  wrote:
> > >
> > > > The main problem I see with the community wishlist is that it's a
> > > > process beside the normal process, not part of it. The dedicated team
> > > > takes 10 bugs and other developers another ~10. I think we would be
> > > > much better off if each team at the WMF would also take the top ranked
> > > > bug on their turf and solve it and bump the priority of all the other
> > > > bugs by one (e.g. low->medium). One bug per year per team means at
> > > > least 18 bugs (at least if [1] is up to date) or something similar.
> > > >
> > >
> > > Community Tech is seven people and they do ten wishlist requests a year.
> > > (Granted, they do other things too, but the wishlist is their main
> > focus.)
> > > So you are proposing to reallocate on average 1-2 months per year for
> > every
> > > team to work on wishlist wishes. That's about two million dollars of
> > donor
> > > money. How confident are you that the wishlist is actually a good way of
> > > estimating the impact of tasks, outside of the narrow field where editors
> > > have personal experience (ie. editing tools)?
> >
> > I'm 99.9% sure the wishlist is relevant in at least half the
> > categories (Admins, Bots, Editing, Notifications,
> > Programs, Watchlists, Wikidata, Wikisource, Wiktionary) and
> > very likely (80%) also for Anti-harassment, Categories and Maps.
> >
> > I'm not sure how you arrived at the $2M figure (even 36 months of dev
> > time - 18 teams, 2 man-months/team - only add up to ~$400K, unless
> > Glasdoor is waaay off on the salaries there [2]), but presumably going
> > down on the list will also surface bugs and not only features, which
> > will take less time to solve. Investing an additional 1% of the

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-15 Thread Strainu
On Thu, 14 Mar 2019 at 22:23, Gergő Tisza wrote:
> About backlogs in general, Chromium is probably the biggest
> open-source Google repo; that has currently 940K tickets, 60K of which are
> open, and another 50K have been auto-archived after a year of inactivity.
> (As others have pointed out, having a huge backlog and ruthlessly closing
> tasks that do not get on the roadmap are the only two realistic options,
> and the latter does have its advantages, no one here seems to be in favor
> of it.) We have 220K tasks in total, 40K of which are open, so that's in
> the same ballpark

That's an overstatement: 18% (not counting bugs closed as declined) is
almost double 11%. If you're going this route, we're doing much
worse than Chromium.

>
> On Wed, Mar 13, 2019 at 3:02 PM Strainu  wrote:
>
> > The main problem I see with the community wishlist is that it's a
> > process beside the normal process, not part of it. The dedicated team
> > takes 10 bugs and other developers another ~10. I think we would be
> > much better off if each team at the WMF would also take the top ranked
> > bug on their turf and solve it and bump the priority of all the other
> > bugs by one (e.g. low->medium). One bug per year per team means at
> > least 18 bugs (at least if [1] is up to date) or something similar.
> >
>
> Community Tech is seven people and they do ten wishlist requests a year.
> (Granted, they do other things too, but the wishlist is their main focus.)
> So you are proposing to reallocate on average 1-2 months per year for every
> team to work on wishlist wishes. That's about two million dollars of donor
> money. How confident are you that the wishlist is actually a good way of
> estimating the impact of tasks, outside of the narrow field where editors
> have personal experience (ie. editing tools)?

I'm 99.9% sure the wishlist is relevant in at least half the
categories (Admins, Bots, Editing, Notifications,
Programs, Watchlists, Wikidata, Wikisource, Wiktionary) and
very likely (80%) also for Anti-harassment, Categories and Maps.

I'm not sure how you arrived at the $2M figure (even 36 months of dev
time - 18 teams, 2 man-months/team - only add up to ~$400K, unless
Glassdoor is waaay off on the salaries there [2]), but presumably going
down on the list will also surface bugs and not only features, which
will take less time to solve. Investing an additional 1% of the
revenue into this seems reasonable to me.

[2] https://www.glassdoor.com/Salary/Wikimedia-Foundation-Salaries-E38331.htm

>
> UploadWizard is not in active development currently.

I did not claim (or ask) that it was. What I said is that it is an
important part of the infrastructure and that it should be maintained
properly. I also said I will try to come up with a more detailed
critique later on and see if it has any result.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-14 Thread Strainu
On Thu, 14 Mar 2019 at 01:02, Amir Sarabadani wrote:
>
> On Wed, Mar 13, 2019 at 11:02 PM Strainu  wrote:
>
> > - ContentTranslation v1 (obsolete now, has been unmaintained for 2
> > years while in production)
> > - UploadWizard (2 with high priority, 40 with normal, a few dozens
> > low, hundreds more untriaged): this is the project that got us out of
> > the "overloading the lang parameter for customizing the uploader" era,
> > the project that is used by millions of people every year, including
> > during every photo contest
> >
> There's something called code stewardship [0] and there is a process called
> code stewardship review for projects that are under-, un- or unclear
> maintained [1] which basically a piece of code either gets undeployed from
> WMF infra or we find maintainer(s) to fix the bugs. You can find the list
> of current and past reviews in [2].
>
> If you think a project doesn't have enough maintainer, you can start the
> review process. If there's an active maintainer [3] but they are not fixing
> bugs, most importantly critical bugs, you can raise the issue probably here
> but with **concrete examples.**

I'll rant about UW in a separate thread; right now I just want to
mention that [3] presents 3 possible maintainers for it, **none of
whom did any work on UW in the last 6 months** (and presumably much
longer) according to Phab timelines. I know documentation is hard, but
this feels a lot like a wild goose chase.

Strainu

>
> [0]: https://www.mediawiki.org/wiki/Code_Stewardship
> [1]: https://www.mediawiki.org/wiki/Code_stewardship_reviews
> [2]: https://phabricator.wikimedia.org/project/board/3144/query/all/
> [3]: https://www.mediawiki.org/wiki/Developers/Maintainers
> --
> Amir
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-13 Thread Strainu
> [...] you think they should be spending. Even if management doesn't
> agree with your proposal, it would at least be specific enough to debate.

I have no way of finding that, do I? If there is one, I would be very
curious to learn of it.

On Mon, 11 Mar 2019 at 16:58, Bartosz Dziewoński wrote:
> In my experience WMF teams usually have a way to distinguish "bugs we're
> going to work on soon" and "bugs we're not planning to work on, but we'd
> accept patches". This is usually public in Phabricator, but not really
> documented.

Yes, and also not uniform between teams and prone to change on each
team reorg. Having a list would be great, but I'm not sure how
maintainable it is. Would you like to start one? ;)

On Tue, 12 Mar 2019 at 01:29, John Erling Blad wrote:
>
> It seems like some projects simply put everything coming from external
> sources into deep freezer or add "need volunteer". If they respond at
> all. In some cases it could be that the projects are defunc.

No deployed project should be considered defunct. But I agree with
others - naming them would help. Let me start:
- ContentTranslation v1 (obsolete now, has been unmaintained for 2
years while in production)
- UploadWizard (2 with high priority, 40 with normal, a few dozen
low, hundreds more untriaged): this is the project that got us out of
the "overloading the lang parameter for customizing the uploader" era,
the project that is used by millions of people every year, including
during every photo contest

I encourage you all to add more examples.

Thanks to all who chimed in on the subject.
Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Question to WMF: Backlog on bugs

2019-03-09 Thread Strainu
Dan,

Thank you for your response. I appreciate far more someone disagreeing with
me than someone ignoring me :)

Let me start with a simple question, to put the references to the WMF into
context. You keep talking below about volunteer developers and how they can
take over any project. While that's true, how many fully-volunteer teams
are there? How does that number compare to the number of WMF teams? Am I
right to assume the ratio is hugely in favor of WMF teams? Note: teams,
not developers, since decisions on project management are usually made at
team level.

On Saturday, 9 March 2019, Dan Garry (Deskana) wrote:

> On Sat, 9 Mar 2019 at 11:26, Strainu  wrote:
>
> > How many successful commercial projects leave customer issues unresolved
> > for years because they're working on something else now?
> >
>
> Almost all of them, they just keep it secret. Companies pay millions of
> dollars each year for support packages, even after having paid for software
> in the first place, specifically because otherwise their support issues may
> not be answered in a timely fashion, or even answered at all. I don't think
> comparing us to commercial products makes much sense in this context.


In my experience with B2B contracts they don't keep it a secret; they usually
have SLAs that they respect. But OK, let's leave it at that.


> > There were a
> > number of proposals on how to track such issues so that reporters have a
> > clear image of the status of the bugs. Have any of them been tried by at
> > least one of the teams at wmf? If so, is there a way to share the results
> > with other teams? If not, how can we convince the wmf to give them a
> > chance?
> >
>
> I don't agree with shifting responsibility onto the Wikimedia Foundation.


Responsibility for what? Developing and hosting  MediaWiki? Helping
communities concentrate on creating and attracting content without having
to work around bugs? I'm sorry, but that's precisely one of the
responsibilities of the wmf and this is what's discussed here.



> There's an anti-pattern here: we all have a big mailing list discussion,
> agree there's a problem, agree that the Foundation should solve the
> problem, then ask again in a year what they did even though they didn't
> actually say they'd do anything about it. That's not a healthy dynamic.


This is one thing that we agree on: nobody committed to anything. Ever.
That's why I asked above: what does it take to have someone (anyone) at the
WMF act upon these discussions?

My role in the Wikimedia tech community is tech ambassador above all else,
so I'm caught in the middle here: I have to explain new features and
technical decisions to people who don't care about PHP, JS or server
performance, but I also feel obligated to relay their requirements, as I
see them, to the development team. This second process does not happen as
smoothly as it should.

It's not healthy to ignore discussion after discussion and claim it's a
community issue. It's not. It's a governance issue and it's growing every
day.




>
> The technical space is owned by all of us, so if we, as a technical
> community, decide this is important to us, then we can look at the problem
> and try to tackle it, and then figure out how the Wikimedia Foundation
> could catalyse that.


The projects belong to the community at large, not just the technical
subcommunity. They are the ones affected by the bugs and also the
ones that need our support. So why should they be ignored when taking this
decision?

My proposal is to begin the discussion here: how can we better relay issues
that are more important to communities than new features? How can we have a
"community whishlist for bugs"?

Cheers and a great weekend to everyone,
  Strainu



> Dan
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-09 Thread Strainu
On Friday, 8 March 2019, bawolff wrote:

> "tracked" does not mean someone is planning to work on it. This could be
> for a lot of reasons, maybe the bug is unclear, maybe its not obvious what
> a good way to fix is, maybe nobody cares (This sounds harsh, but the simple
> truth is, different things have different people caring about them, and
> some parts just don't have anyone).
>
> This is not really a paid vs unpaid thing. Volunteer projects have a big
> backlog of bugs. Commercial projects also have a backlog or things they
> just don't intend to fix (although usually big commercial projects keep the
> list of bug reports secret).


How many successful commercial projects leave customer issues unresolved
for years because they're working on something else now? I can name a few
which used to do that because they were monopolies, but even those improved
eventually, pressured by the market. There are companies that require
weekly reports of progress on customer issues, others that don't release
until all bugs are closed one way or another, etc.

The discussion at https://lists.gt.net/wiki/wikitech/889489 is relevant, I
believe. The request there was to not decline low-priority issues that
might be resolved by volunteers and this clearly increases the number of
open bugs (as I said, there are good reasons for that :) ). There were a
number of proposals on how to track such issues so that reporters have a
clear image of the status of the bugs. Have any of them been tried by at
least one of the teams at the WMF? If so, is there a way to share the results
with other teams? If not, how can we convince the WMF to give them a
chance?

Strainu


> I really think its no different from Wikipedia.
> https://en.wikipedia.org/wiki/Wikipedia:Backlog isn't getting any smaller.
> That's just the natural way of things. Its a bit easier to yell {{sofixit}}
> on wiki than it is to yell it about technical tasks, as technical stuff by
> their very nature require specialized knowledge (Although i would argue
> that lots of tasks on wiki also require specialized knowledge). At the end
> of the day, to get a task fixed, someone who knows how to do it (Or is
> willing to learn how to do it) needs to be interested in doing it.
>
> --
> Brian
>
> On Fri, Mar 8, 2019 at 12:31 PM John Erling Blad  wrote:
>
> > The backlog for bugs are pretty large (that is an understatement),
> > even for bugs with know fixes and available patches. Is there any real
> > plan to start fixing them? Shall I keep telling the community the bugs
> > are "tracked"?
> >
> > /jeblad
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question to WMF: Backlog on bugs

2019-03-08 Thread Strainu
On Friday, 8 March 2019, Amir Sarabadani wrote:

> Hey,
> I'm not WMF so I'm not the best one to answer the question but I think your
> statement is overgeneralizing. Some teams have more resource constraints
> than the other ones and treating all of WMF as a big monolith doesn't seem
> to be a good approach. I think you should be more precise and give a more
> clear statement on what do you think is wrong.


Several things:
* the bug backlog has been steadily increasing in all Phabricator reports I
have seen (I don't read them all, so some decreases might have occurred
occasionally, but the trend is there)
* feature development is prioritized over bug fixes (read: once a feature
goes into maintenance, good luck getting a fix without bribing someone)
* after Andre stopped being bug wrangler, nobody else took the job of
clarifying user requests, closing obvious duplicates etc.

I'm sure there are legitimate reasons for these problems, the question is
what can be done to improve the situation?


> Two other things to note:
> 1- As a developer who loves to fix bugs, the reason I can't sometimes fix a
> bug is that it's not clearly defined, and/or there's no proper instruction
> to reproduce. Don't always blame the other party.


This is bound to happen when 99% of your users are non-technical.  It would
be great if you (globally) could take some time to ask for clarifications
when you feel they are required.


>

2- Everything is open-source and as non-profit, there's always resource
> constraint. If it's really important to you, feel free to make a patch and
> the team would be always more than happy to review.


No, not always. There are over 4600 open reviews, some 5 years old. There
are some reasons why one would want to keep a review open, but very few
IMHO. You also need to consider the fact that open reviews might discourage
people from proposing new fixes. And again, 99% of MW users are
non-technical.

All the best,
  Strainu


>
> Best
>
> On Fri, Mar 8, 2019 at 1:31 PM John Erling Blad  wrote:
>
> > The backlog for bugs are pretty large (that is an understatement),
> > even for bugs with know fixes and available patches. Is there any real
> > plan to start fixing them? Shall I keep telling the community the bugs
> > are "tracked"?
> >
> > /jeblad
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Amir
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help with mw.org content!

2018-12-29 Thread Strainu
On Sat, 29 Dec 2018 at 23:33, Amir Sarabadani wrote:
>
> I blocked the user for 31 hours, tell me if the user continued. Is it okay
> if I nuke all of the translations?

Thanks! Yes, it's OK to nuke.

>
> Best
>
> On Sat, Dec 29, 2018 at 10:31 PM Strainu  wrote:
>
> > Hi,
> >
> > A certain user, blocked on Romanian Wikipedia for his original
> > research, is using mediawiki.org to push some vere "creative"
> > translations [1]. Since the pages are using the Translate extension,
> > i'm not sure how to revert them. I am also not sure where to ask for
> > help on wiki, since the support desk  [2] seems dedicated to mediawiki
> > questions.
> >
> > To be precise, I need a revert to version [3].
> >
> > Thanks,
> >   Strainu
> >
> >
> > [1] https://www.mediawiki.org/wiki/Special:Contributions/BAICAN_XXX
> > [2] https://www.mediawiki.org/wiki/Project:Support_desk
> > [3]
> > https://www.mediawiki.org/w/index.php?title=Help:Navigation/ro=2979409
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Amir
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Help with mw.org content!

2018-12-29 Thread Strainu
Hi,

A certain user, blocked on Romanian Wikipedia for his original
research, is using mediawiki.org to push some very "creative"
translations [1]. Since the pages are using the Translate extension,
I'm not sure how to revert them. I am also not sure where to ask for
help on wiki, since the support desk [2] seems dedicated to MediaWiki
questions.

To be precise, I need a revert to version [3].

Thanks,
  Strainu


[1] https://www.mediawiki.org/wiki/Special:Contributions/BAICAN_XXX
[2] https://www.mediawiki.org/wiki/Project:Support_desk
[3] https://www.mediawiki.org/w/index.php?title=Help:Navigation/ro=2979409

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Change coming to how certain templates will appear on the mobile web

2018-11-13 Thread Strainu
Chris, what does "a few weeks" mean exactly? We will need some time to fix
the templates and having a precise timeline helps.

Also, please make sure the change announcement makes it into the Tech News
newsletter.

Thank you,
 Strainu

On Tuesday, 13 November 2018, Chris Koerner wrote:

> Hello,
> In a few weeks the Readers web team will be changing how some
> templates look on the mobile web site. We will make these templates
> more noticeable when viewing the article. We ask for your help in
> updating any templates that don't look correct.
>
> What kind of templates? Specifically templates that notify readers and
> contributors about issues with the content of an article – the text
> and information in the article. Examples like Template:Unreferenced or
> Template:More citations needed. [0] [1] Right now these notifications
> are hidden behind a link under the title of an article. We will format
> templates like these (mostly those that use Template:Ambox or message
> box templates in general) to show a short summary under the page
> title. You can tap on the "Learn more" link to get more information.
> [2]
>
> For template editors we have some recommendations on how to make
> templates that are mobile-friendly and also further documentation on
> our work so far. [3] [4]
>
> If you have questions about formatting templates for mobile, please
> leave a note on the project talk page or file a task in Phabricator
> and we will help you. [5] [6]
>
> 
>
> [0] https://www.wikidata.org/wiki/Q5962027
> [1] https://www.wikidata.org/wiki/Q5619503
> [2] https://meta.wikimedia.org/wiki/File:Page_issues_-_
> mobile_banner_example.jpg
> [3]  https://www.mediawiki.org/wiki/Recommendations_for_
> mobile_friendly_articles_on_Wikimedia_wikis#Making_page_
> issues_(ambox_templates)_mobile_friendly
> [4] https://www.mediawiki.org/wiki/Reading/Web/Projects/Mobile_Page_Issues
> [5] https://www.mediawiki.org/wiki/Talk:Reading/Web/
> Projects/Mobile_Page_Issues
> [6] https://phabricator.wikimedia.org/maniphest/task/edit/form/
> 1/?projects=Readers-Web-Backlog
>
> Yours,
> Chris Koerner
> Community Relations Specialist
> Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Article geotags missing in geo_tags table for some WP languages?

2018-11-08 Thread Strainu
On Thursday, 8 November 2018, Martin Dittus wrote:

> Thank you Strainu and Bartosz, this was very useful.
>
> As far as I can tell, idwiki editors simply don't use the GeoData
> geotagging conventions. Instead, people specify article location as
> infobox latd/longd properties, which on enwiki (and likely others) is
> being deprecated in favour of GeoData tags. While this older method
> allows a map to be displayed on the page, the coordinates are not
> imported by the GeoData extension, and as a result none of these
> geotagged pages show up in API geo lookups, or in the data dumps.


I think you're confusing the coord template with the #coordinates: parser
function - unintuitively, they're both in brackets. The first is a generic
way of displaying some coordinates. It can be built by code from
individual parameters, like on id.wp, or it can be passed already built,
like on en.wp. What it does inside varies, and some wikis, in addition to
displaying the coordinates, also call the parser function.

The parser function, on the other hand, simply saves the data in the
database, without displaying anything. It is almost always called from
templates or modules and very rarely directly from pages. Adding it to
Template:coord (or Module:Coordinates, for most wikis nowadays) makes the
coordinates magically appear in the database after a while. For instance,
these are the changes I made to use the parser function on ro.wp:
https://ro.wikipedia.org/w/index.php?title=Modul%3ACoordonate=revision=8110189=8099705
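
If anyone wants to run the same check on another wiki, here is a rough
sketch that fetches a template or module and looks for the parser function
(Python with requests; the wiki and the module title are just guesses on my
part and will differ per project):

import requests

API = "https://id.wikipedia.org/w/api.php"  # the wiki being checked

def calls_coordinates(title):
    """Fetch the current wikitext/Lua of a page and report whether it
    contains a call to the #coordinates parser function."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "content",
        "rvslots": "main",
        "format": "json",
        "formatversion": 2,
    }
    page = requests.get(API, params=params).json()["query"]["pages"][0]
    text = page["revisions"][0]["slots"]["main"]["content"]
    # naive check: a module could also build the call dynamically
    return "#coordinates" in text

print(calls_coordinates("Modul:Coord"))  # module name is an assumption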

Strainu

>
> See also:
> https://en.wikipedia.org/wiki/Wikipedia:Coordinates_in_infoboxes
>
> I'm now pondering if there's a quick way to asses for which wikis this
> is the case... I may report back if I find a simple approach, beyond
> simply manually checking every wiki.
>
> Thanks again!
>
> m.
> On Wed, Nov 7, 2018 at 10:31 PM Bartosz Dziewoński 
> wrote:
> >
> > The coordinates template/module needs to use the {{#coordinates:…}}
> > parser function for the page to be geotagged.
> >
> > --
> > Bartosz Dziewoński
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Article geotags missing in geo_tags table for some WP languages?

2018-11-07 Thread Strainu
The extension responsible for adding the data is
https://www.mediawiki.org/wiki/Extension:GeoData

The template ({{coord}}) and module (Module:Coord) on id.wp seem to be taken
as-is from en.wp and claim to add the data, although I can't find the
specific code on my phone (which means absolutely nothing).

I would check whether the template or module calls the parser function first.
You can also check if this is a dump problem by asking for the data through
the API - something like the sketch below.
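
For the API check, a small sketch (Python with requests; idwiki and the
London article are just the examples from this thread) that asks the GeoData
extension directly what it has stored for a page:

import requests

API = "https://id.wikipedia.org/w/api.php"

def stored_coordinates(title):
    """Return the coordinates GeoData has recorded for a page; an empty
    list means #coordinates was never called, so the dump cannot have it."""
    params = {
        "action": "query",
        "prop": "coordinates",
        "titles": title,
        "coprimary": "all",
        "coprop": "type|name|dim|country|region",
        "format": "json",
        "formatversion": 2,
    }
    page = requests.get(API, params=params).json()["query"]["pages"][0]
    return page.get("coordinates", [])

print(stored_coordinates("London"))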

HTH
Strainu

On Wednesday, 7 November 2018, Martin Dittus wrote:

> (I sent this to xmldatadumps-l yesterday but just realised that this
> might be a more suitable place.)
>
> Hallo,
>
> I'm looking at the data dumps for all Wikipedia languages and noticed
> that for some larger wikis, the geo_tags.sql.gz dump file does not
> include any geotags found in articles. Is it possible to determine why
> this is, and for which languages this is the case?
>
> For example, the geotags dump file for Indonesian (a wiki with
> >400,000 articles) is only 7kb large, and all geotags in it are from
> user pages, file uploads, or file templates, but not from articles:
> https://dumps.wikimedia.org/idwiki/20181020/
>
> Yet it doesn't take much effort to find pages that are geotagged, such
> as this one (see the infobox): https://id.wikipedia.org/wiki/London
>
> I realise that there are a number of alternative geotagging
> conventions. Does idwiki possibly use a geotagging scheme that is not
> supported by some part of this data ingestion/export process? Which
> other wikis/languages may fall in this category?
>
> I tried to find the script(s) that populate the geo_tags table from
> page content but so far had no luck, as I'm not sufficiently familiar
> with WP's software architecture; if someone can point me in the right
> direction I'd be happy to investigate myself.
>
> Many thanks!
>
> m.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] problematic use of "Declined" in Phabricator

2018-10-03 Thread Strainu
În mie., 3 oct. 2018 la 19:08, Mathieu Lovato Stumpf Guntz
 a scris:
> On the other hand, I discovered in the process that for some other people in 
> the community phabricator is perceived as an hostile place, out of what they 
> feel as part of "their" community. Actually, to the point that starting a 
> proposal on phabricator might be  interpreted as an attempt to enforce ideas 
> without and against the consent of the community, rather than a call to give 
> feedback and make evolve ideas together, and thus despite an immediate 
> communication on the ticket creation.

That's a totally orthogonal problem that has existed since the Bugzilla
days. Some people consider bug tracking a "dev" activity that they
don't know anything about, others have difficulty communicating in
English, and finally some just consider the WMF evil and want nothing
to do with it. Using Phabricator to track bugs and features in a
certain way (or at all) doesn't seem to have a lot to do with this
(unless there is some proof to the contrary that I'm not aware of).

The problem Amir brought up mainly affects people that already use Phabricator.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] problematic use of "Declined" in Phabricator

2018-10-02 Thread Strainu
Pe marți, 2 octombrie 2018, Amir E. Aharoni 
a scris:

> Hi,
>
> I sometimes see WMF developers and product managers marking tasks as
> "Declined" with comments such as these:
> * "No resources for it in (team name)"
> * "We won't have the resources to work on this anytime soon."
> * "I do not plan to work on this any time soon."
>
> Can we perhaps agree that the "Declined" status shouldn't be used like
> this?
>
> "Declined" should be valid when:
> * The component is no longer maintained (this is often done as
> mass-declining).
> * A product manager, a developer, or any other sensible stakeholder thinks
> that doing the task as proposed is a bad idea. There are also variants of
> this:
> * The person who filed the tasks misunderstood what the software component
> is supposed to do and had wrong expectations.
> * The person who filed the tasks identified a real problem, but another
> task proposes a better solution.
>
> It's quite possible that some people will disagree with the decision to
> mark a particular task as "Declined", but the reasons above are legitimate
> explanations.
>
> However, if the task suggests a valid idea, but the reason for declining is
> that a team or a person doesn't plan to work on it because of lack of
> resources or different near-term priorities, it's quite problematic to mark
> it as Declined.
>
> It's possible to reopen tasks, of course, but nevertheless "Declined" gives
> a somewhat permanent feeling, and may cause good ideas to get lost.
>
> So can we perhaps decide that such tasks should just remain Open? Maybe
> with a Lowest priority, maybe in something like a "Freezer" or "Long term"
> or "Volunteer needed" column on a project workboard, but nevertheless Open?


Yes, please!

I usually reopen such tasks, but it is a little weird.

Strainu

>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Roadmap for CX?

2018-09-06 Thread Strainu
Hello,

More than a year has passed since the email below and subjectively,
editors are complaining just as much about not being able to save
changes and other nuisances. Now, I know that wikipedians are not shy
about expressing their discontent, but I also cannot overstate the
impact CX can have for small and medium-sized communities.

Since the pages mentioned in Amir's email have not seen much action, I
would like to ask for another update from the engineering team. Is CX
still being developed? Are small bug reports being handled, or are you
still waiting for some big feature?

Thank you,
   Strainu


2017-05-02 11:42 GMT+03:00 Amir E. Aharoni :
> 2017-04-27 8:55 GMT+03:00 Strainu :
>
>> Following the recent outage, we've had a new series of complaints
>> about the lack of improvements in CX, especially related to
>> server-side activities like saving/publishing pages.
>>
>> Now, I know the team is involved in a long-term effort to merge the
>> editor with the VE, but is there an end in sight for that effort? Can
>> I tell people who ask "look, 6 more months then we'll have a much
>> better translation tool"?
>>
>> Is there a publicly available roadmap for this project and more
>> generally, for CX?
>>
>>
> Hi,
>
> Thanks again for bringing this up.
>
> Currently the Language team is indeed working on transitioning the editing
> component to VE. At the moment we are completing the rewrite of the
> frontend internals using OOjs UI and so using VE's special handling of edge
> cases. This is more than a refactoring—this will also improve the stability
> of several features such as saving and loading, paragraph alignment, and
> table handling.
>
> We hope to complete the transition of the translation editing interface to
> VE in July–September 2017. This will not only change the interface itself,
> but will also bring in some of the most often requested CX features, such
> as the ability to add new categories, templates, and references using VE's
> existing tools rather than just adapt them, and to edit the translation
> using wiki syntax.
>
> The next part to develop would be another round of improvement of template
> support. The previous iteration was done in the latter half of 2016, and
> allowed adapting a much wider array of templates, including infoboxes.
> However, one important kind of template that is not yet supported well
> enough is ones inside references (a.k.a. citations or footnotes), and this
> will be the focus of the next iteration. We also plan to improve CX’s
> template editor itself by allowing machine translation of template
> parameter values, and by fixing several outstanding bugs in it.
>
> After finishing these two major projects, in early 2018 we expect to work
> on fixing various remaining bugs, after which we plan to start declaring
> Content Translation as non-beta in some languages. We are figuring out
> which bugs exactly will these be; the current list is at
> https://phabricator.wikimedia.org/project/view/2030/ , but it will likely
> change somewhat before we get there. (Suggestions about what should go
> there are welcome at any time.)
>
> Finally, two further future directions that we are thinking about
> longer-term are:
> 1. Translation List: Shared and personal lists of articles awaiting
> translation ( https://phabricator.wikimedia.org/T96147 ). We already have
> designs for it, but the implementation will have to wait until we fix the
> more urgent issues above.
> 2. Better support on mobile devices. This is complicated, but much-needed.
> Some early thoughts about this can be found at
> https://www.mediawiki.org/wiki/Content_translation/Product_Definition/Mobile_exploration
> , but there will need to be much more design and development around this.
>
> You can see a more formal document about this here, although the content is
> largely the same:
> https://www.mediawiki.org/wiki/Content_translation/Roadmap/2017%E2%80%932018
>
> The Language team already had this more or less figured out a couple of
> months ago, but the publishing was delayed because of the higher-level
> planning process (
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Product
> ).
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikimedia-l] C-team Statement on the Code of Conduct

2018-08-15 Thread Strainu
Hello,

I also find this email disappointing. The CoC itself hasn't been seriously
questioned as a whole, only specific parts of the interpretation given by
the committee, which have led to the changes proposed on mw.org.

I appreciate the encouragement given to the committee members, but objectively,
your mail sounds very much like a tactic to discourage open discussion
about the shortcomings of the Code.

Strainu

Pe marți, 14 august 2018, Victoria Coleman  a scris:

> Hello everyone,
>
> The executive leadership team, on behalf of the Foundation, would like to
> issue a statement of unequivocal support for the Code of Conduct[1] and the
> community-led Code of Conduct Committee. We believe that the development
> and implementation of the Code are vital in ensuring the healthy
> functioning of our technical communities and spaces. The Code of Conduct
> was created to address obstacles and occasionally very problematic personal
> communications that limit participation and cause real harm to community
> members and staff. In engaging in this work we are setting the tone for the
> ways we collaborate in tech. We are saying that treating others badly is
> not welcome in our communities. And we are joining an important movement in
> the tech industry to address these problems in a way that supports
> self-governance consistent with our values.
>
> This initiative is critical in continuing the amazing work of our projects
> and ensuring that they continue to flourish in delivering on the critical
> vision of being the essential infrastructure of free knowledge now and
> forever.
>
> Toby, Maggie, Eileen, Heather, Lisa, Katherine, Jaime, Joady, and Victoria
>
>
> https://www.mediawiki.org/wiki/Code_of_Conduct <https://www.mediawiki.org/
> wiki/Code_of_Conduct>
>
>
>
>
> ___
> Wikimedia-l mailing list, guidelines at: https://meta.wikimedia.org/
> wiki/Mailing_lists/Guidelines and https://meta.wikimedia.org/
> wiki/Wikimedia-l
> New messages to: wikimedi...@lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> <mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-09 Thread Strainu
2018-08-09 15:12 GMT+03:00 Lucie Kaffee :
> I understand the wish for a more transparent process. (What a good thing
> there is the possibility to suggest amendments to the CoC!)
> But I would like you to consider the following: Someone, who was warned, or
> even blocked, might change their behavior. Should we still keep a public
> list of all people that ever had contact with the CoC committee? It seems
> to me that this could easily be used as a shaming and blaming list. If the
> block is over and the person wants to change their behavior, it might be
> hard for them to start with a clean sheet if we keep a backlog public of
> everyone. I'd see it not only as a privacy issue for the people reporting,
> but also the reported.

So basically you're saying that the wiki way of doing things, where
blocks and bans are public and often contain the offending diff, is
bad and should not be followed. Is the CoC committee really the venue
where such a decision should be made? Shouldn't the wiki way be the
default *unless* the community decides otherwise?

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-09 Thread Strainu
2018-08-08 23:29 GMT+03:00 Amir Ladsgroup :
> Taking my coc hat off, I'm not representing the committee at all.
> Several things have been misunderstood imo. I want to address them.
> 1) The use of profanity is not prohibited by the COC, using them against
> others or for unconstructive reasons is. If you see the whole discussion,
> you could clearly see the comment is not made to move discussion forward.
> These are clear case of disruptive actions.
> 1.1) the response to these violations depends on the user, very similar to
> what Wikipedia does. If it was the first case reported about Mz, they
> wouldn't get this ban.
> 2) the duration of block which is for one week was determined and
> communicated in the email. You can check the email as it's public now.

Unless you're talking about a different mail from the one published by
MZMcBride, you did not mention the duration. I'm assuming this was an
omission on your part (AGF), but you should consider having email
templates or some other means of avoiding such mistakes in the future.

> 3) not being able to discuss cases clearly also bothers me too as I can't
> clarify points. But these secrecy is there for a reason. We have cases of
> sexual harassment in Wikimedia events, do you want us to communicate those
> too? And if not, where and who supposed to draw the line between public and
> non-public cases? I'm very much for more transparency but if we don't iron
> things out before implementing them, it will end up as a disaster.

There is a clear line that can be established: public comment
(wiki/phabricator/etc) => public case. Also, you don't have to go into
details; just mentioning that someone was banned from Wikimedia
events for sexual harassment seems enough to me.

Conversely, if you don't publish this data, how are other event
organizers going to enforce the ban? When Austria organized the
Wikimedia hackathon, we had several pre-hackathons organized in
several CEE countries. If these were held today, they would be
bound by the CoC, but the organizers would have no way to determine if
a user should be banned or not.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-08 Thread Strainu
2018-08-08 17:44 GMT+03:00 Dan Garry :
> On 8 August 2018 at 14:29, Alex Monk  wrote:
>
>> Are you trying to ban people discussing CoC committee decisions publicly?
>> Not that it even looks like he wrote grievances.
>
>
> Hardly. I have absolutely nothing to do with the administration of this
> list, nor the authority to set what is discussed on this list, nor any
> involvement in the Code of Conduct, all of which you are well aware.
>

Then what was the purpose of your original email, if you don't
mind me asking? How was that a positive contribution to the
discussion?

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-08 Thread Strainu
2018-08-08 18:53 GMT+03:00 Bináris :
> This happens when American culture and behavioral standard is extended to
> an international community.

FWIW, the CoC itself is quite neutral and contains (at least in my
view) nothing specifically American, only general principles that most
developers can identify with. Also, I would note that the majority of
the current committee members are *not* US-based (from what I can
tell) and that there is a good gender balance, so it's hard to argue
it could get more diverse than that. That, together with the history
of MZMcBride, should make us give credit to the committee (and question
some of our own stereotypes ;))

Nevertheless, this case has shown a few issues with the way the CoC is
implemented. I strongly believe secrecy and open source don't go well
together and that the committee's decisions should be opened to
scrutiny by the community. That implies that (at the very least) bans
should be publicly logged, together with the duration of the ban, the
intervention in question (if still public) and the part of the CoC
that was breached. Ideally, the justification should also be public,
but I realize that might not always be possible or desirable.

Another question is how such discussions will be included in the CoC
or the committee's process?  I don't think a blacklist of forbidden
words would be a constructive or realistic solution, but such email
threads should not remain without follow-up, or we risk repeating the
same mistakes in the future.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Old categorization data

2018-07-16 Thread Strainu
Hi,

What I would do is use the categorylinks table [1] to extract a
timestamp that can then be used to search for a particular revision.
Filtering by user ID is then trivial.
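
If a database replica is not handy, the same timestamp should also be
reachable through the API's categorymembers list; something along these lines
(Category:X is a placeholder):

api.php?action=query&list=categorymembers&cmtitle=Category:X&cmprop=title|timestamp&cmsort=timestamp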

Strainu

[1]
https://m.mediawiki.org/wiki/Special:MyLanguage/Manual:categorylinks_table


Pe vineri, 13 iulie 2018, יגאל חיטרון  a scris:

> Hi. Is there a way to get categorization edits data from more than a month ago?
> I found the last month in recentchanges SQL table, but that's all. I need
> to find pages that were added by specific user to specific category,
> doesn't matter when. Thank you.
> Igal (User:IKhitron)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] An update on map internationalization

2018-04-23 Thread Strainu
Hi Joe,

That's great news! I especially like the promise of replication within
minutes from OSM, it sounds wonderful. I see the feature is to be
deployed to mapframe-using wikis first. Do you have a timeline for
maplink-only wikis? If a wiki decides to enable mapframe after the
deployment, will it have to ask explicitly for this feature to be
enabled or will it be done automatically?

Regarding resources, not many free tile servers that I'm aware of can
provide this multilanguage capability. Have you considered the
possibility that some 3rd party might want a "free ride" and use
Wikimedia's tile servers, creating an increase in usage? Are there
usage caps or other similar methods in place to prevent abuse?

Also, regarding fallback languages, could you clarify what "languages
that use the same alphabet" means? For instance, Romanian does not have
a fallback language; will the labels be displayed in English before,
say, Chinese?

Thanks,
  Strainu

2018-04-21 2:00 GMT+03:00 Joe Matazzoni <jmatazz...@wikimedia.org>:
> This is to let you know that Collaboration Team is planning to release map 
> internationalization next week for testing on testwiki [1]. When it’s ready, 
> we’ll post a note to confirm.
>
> Meanwhile, you might like to check out the detailed post I added last night 
> to the Map Improvements 2018 project board: Special Update on Map 
> Internationalization[2]. It includes a lot of information on the feature's 
> status, how the it will work, how we imagine it might be useful, what the 
> known limitations are, etc.  I’m looking forward to getting your input on 
> this challenging but important feature (the best place to leave your ideas 
> and questions is on the project talk page [3]).
>
> Yours,
> Joe
>
> [1] https://phabricator.wikimedia.org/T112948 
> <https://phabricator.wikimedia.org/T112948>
> [2] 
> https://www.mediawiki.org/wiki/Map_improvements_2018#April_18,_2018,_Special_Update_on_Map_Internationalization
>  
> <https://www.mediawiki.org/wiki/Map_improvements_2018#April_18,_2018,_Special_Update_on_Map_Internationalization>
> [3] https://www.mediawiki.org/wiki/Talk:Map_improvements_2018 
> <https://www.mediawiki.org/wiki/Talk:Map_improvements_2018>
> _
>
> Joe Matazzoni
> Product Manager, Collaboration
> Wikimedia Foundation, San Francisco
>
> "Imagine a world in which every single human being can freely share in the 
> sum of all knowledge."
>
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mobile MonoBook

2018-04-04 Thread Strainu
2018-04-04 23:41 GMT+03:00 Brad Jorsch (Anomie) <bjor...@wikimedia.org>:
> On Wed, Apr 4, 2018 at 3:18 PM, Strainu <strain...@gmail.com> wrote:
>
>> Skipping navboxes is a decision that was taken by the WMF team
>> responsible for the mobile site (whatever it was called at the time)
>> and can be solved cleanly only at the skin level, but I don't expect
>> this to happen as long as it will break the mobile site.
>>
>> Your proposal would be the ideal argument for reversing the current
>> "solution", yes, but realistically, it's not going to happen
>> throughout the hundreds of wikis that implemented navboxes.
>>
>
> Here we're talking about Isarra's very interesting project to make the
> Monobook skin responsive, not whatever decisions WMF's mobile teams may
> have made in the past for their mobile-only skin. She is not obligated to
> follow their lead just because they led, and I'd recommend she doesn't in
> this case.

Maybe I wasn't clear, but that's exactly what I meant as well: I'm
very curious to see how *Mobile MonoBook* handles that particular
issue. I most certainly hope Isarra *doesn't* follow Minerva's
(mis)behavior. If the solution is acceptable, perhaps it can then be
adapted to Minerva. However, I haven't found any navboxes in the demo
wiki.

Strainu

>
> Note that's my personal recommendation and not any sort of official WMF
> position. I likely won't even be involved in the code review for her patch.
>
>
> --
> Brad Jorsch (Anomie)
> Senior Software Engineer
> Wikimedia Foundation
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mobile MonoBook

2018-04-04 Thread Strainu
2018-04-04 18:40 GMT+03:00 Brad Jorsch (Anomie) <bjor...@wikimedia.org>:
> On Tue, Apr 3, 2018 at 6:52 PM, Strainu <strain...@gmail.com> wrote:
>
>> but what I would particularly like to see is how it
>> handles navboxes. Traditionally, they have been hidden on the
>> Wikipedia mobile site, prompting people to do all kinds of sick
>> workarounds that kind of work, but not really. If anyone can come up
>> with a decent solution to that it's probably you :)
>>
>
> The solution is probably for the on-wiki editors to make navboxes
> responsive (e.g. using TemplateStyles[1]), rather than expecting the skin
> to deal with it.
>
> [1]: https://www.mediawiki.org/wiki/Extension:TemplateStyles

Skipping navboxes is a decision that was taken by the WMF team
responsible for the mobile site (whatever it was called at the time).
It can only be solved cleanly at the skin level, but I don't expect
that to happen as long as it would break the mobile site.

Your proposal would be the ideal argument for reversing the current
"solution", yes, but realistically, it's not going to happen
throughout the hundreds of wikis that implemented navboxes.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mobile MonoBook

2018-04-03 Thread Strainu
2018-04-02 6:13 GMT+03:00 Isarra Yos <zhoris...@gmail.com>:
> I have made a patch for a responsive version of MonoBook with mobile
> support. See: https://gerrit.wikimedia.org/r/c/421199/ or if you just want a
> live demo, https://wiki.zaori.org/wiki/Page_title?useskin=monobook - load it
> on a phone or make your browser window narrow or something and you can see
> what it looks like. This is a prototype nojs version; I intend to make an
> even sillier js layout with popovers and stuff as a followup patch.
>
> A potential issue has already been raised with the icons: I don't really
> know how to make text strings actually, well, reliably fit on mobile
> devices, but a lack of support for no-image usage could also be a real
> problem in MonoBook. Feedback on that or whatever, as well as other testing
> and reviews, would be greatly appreciated.

Well, it looks a bit dated, but in a cute and geekish way. I love the
way you took the menu out of the way without hiding it under a
hamburger button, but what I would particularly like to see is how it
handles navboxes. Traditionally, they have been hidden on the
Wikipedia mobile site, prompting people to do all kinds of sick
workarounds that kind of work, but not really. If anyone can come up
with a decent solution to that it's probably you :)

Strainu

>
> There is also a rather more immediate problem at present. As you can see on
> the patch, jenkins has -1ed it for style problems. Unfortunately I have no
> idea what the style problems are because the output of that test is totally
> useless (https://phabricator.wikimedia.org/T190072) - can anyone tell me
> what the problem(s) are so I can fix them? And/or just resolve T190072?
> Please? It's starting to get a bit annoying, frankly, as it's been coming up
> across several of these patches.
>
> -I
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Any way to find the used template from the template itself?

2018-03-10 Thread Strainu
Thanks Gergo
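
For the record, a minimal sketch of the parameter-passing workaround Gergo
describes below (parameter and function names are made up): in {{Infobox B}},
call {{Infobox}} with an extra argument such as caller=Infobox B, and then
read it from the parent frame in the module:

local p = {}

function p.main( frame )
    -- the parent frame is the {{Infobox}} call itself; its arguments are
    -- whatever {{Infobox B}} passed along, including the explicit caller name
    local caller = frame:getParent().args.caller or 'unknown'
    -- use it however the module needs, e.g. for tracking categories
    return caller
end

return p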

2018-03-09 9:51 GMT+02:00 Gergo Tisza <gti...@wikimedia.org>:
> On Thu, Mar 1, 2018 at 2:52 PM, Strainu <strain...@gmail.com> wrote:
>
>> Say we have an article which includes {{Infobox A}}, which redirects
>> to {{Infobox B}}, which in turn transcludes {{Infobox}}, which is
>> implemented using [[Module:Infobox]]. Is there a way to know from the
>> module (or the {{Infobox}} template) which infobox was actually used
>> in the article?
>>
>
> Not unless you pass it along as a parameter.
> mw.getCurrentFrame().getParent() only goes one level up (so you can tell
> Module:Infobox was called from {{Infobox}} but not beyond that).
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Any way to find the used template from the template itself?

2018-03-08 Thread Strainu
2018-03-02 0:52 GMT+02:00 Strainu <strain...@gmail.com>:
> Say we have an article which includes {{Infobox A}}, which redirects
> to {{Infobox B}}, which in turn transcludes {{Infobox}}, which is
> implemented using [[Module:Infobox]]. Is there a way to know from the
> module (or the {{Infobox}} template) which infobox was actually used
> in the article?


No ideas? :(

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Any way to find the used template from the template itself?

2018-03-01 Thread Strainu
Say we have an article which includes {{Infobox A}}, which redirects
to {{Infobox B}}, which in turn transcludes {{Infobox}}, which is
implemented using [[Module:Infobox]]. Is there a way to know from the
module (or the {{Infobox}} template) which infobox was actually used
in the article?

Thanks,
  Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thumbor @ Wikimedia

2017-12-12 Thread Strainu
Hi Gilles,

Thank you for the very interesting read and generally for a job well
done, as most users will never know that a critical part of the
infrastructure changed. :)

Where can we find more information about the thumb caching policy
change that you mention in one of the posts?

Also, based on the bugs that you found in production and the solutions
you implemented in order to prevent them, I was wondering if it would
be feasible to reuse those in production: for instance, when the
package is upgraded, try to regenerate all the thumbnails failed with
the previous version; also, if a thumbnail that worked stops working,
automatically notify the maintainers to look into the regression.


Strainu

2017-12-12 12:10 GMT+02:00 Gilles Dubuc <gil...@wikimedia.org>:
> I wrote a blog series that just got picked up on the main Wikimedia blog
> about my work migrating Wikimedia wikis to Thumbor for media thumbnailing:
>
> Part 1: https://blog.wikimedia.org/2017/12/09/thumbor-journey-rationale/
> Part 2: https://blog.wikimedia.org/2017/12/09/thumbor-journey-
> thumbnailing-architecture/
> Part 3: https://blog.wikimedia.org/2017/12/09/thumbor-journey-
> development-deployment-strategy/
>
> There's also a set of wiki pages with all the gory technical details of the
> setup: https://wikitech.wikimedia.org/wiki/Thumbor
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Start from random image in slideshow gallery

2017-11-03 Thread Strainu
Hi there,

Is there a way to start displaying a slideshow gallery from a random
image without editing the site JavaScript, except for building a
template that randomizes the content of the <gallery> tag?

If not, is there a phab item for this feature?

Thanks,
Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal regarding the handling of imported usernames

2017-11-02 Thread Strainu
 from being used for an actual
>local account.
>5. Some less consistent combination of the "all the existing rows" and
>"when a new user is created" options from #2–4.
>
> Of these options, this proposal seems like the best one.
>
> [1]: https://phabricator.wikimedia.org/T9240
> [2]: https://phabricator.wikimedia.org/T179246
> [3]: https://gerrit.wikimedia.org/r/#/c/386625/
> [4]: https://phabricator.wikimedia.org/T111605
> [5]: ">" was chosen rather than the more typical ":" because the former is
> already invalid in all usernames (and page titles). While a colon is *now*
> disallowed in new usernames, existing names created before that restriction
> was added can continue to be used (and there are over 12000 such usernames
> in WMF's SUL) and we decided it'd be better not to suddenly break them.
> [6]: https://phabricator.wikimedia.org/T167246

Strainu

>
> --
> Brad Jorsch (Anomie)
> Senior Software Engineer
> Wikimedia Foundation
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Last Call: PostgreSQL schema change for consistency with MySQL

2017-08-17 Thread Strainu
2017-08-17 16:50 GMT+03:00 Brian Wolff <bawo...@gmail.com>:
> Does it actually make sense to notify third parties using postgres about
> the change? Its not like we are dropping support for it, and im doubtful
> that random user using postgres with mediawiki has opinions on
> implementation details such as what specific features are used in the
> schema or would even understand what all this is about.

Will the upgrade be seamless regardless of how they upgrade their
version? Will they be able to recover their website after a disaster
from a backup with the old layout? If the answer is anything other than
a definite yes (and I can imagine at least one scenario where MW is
upgraded without Postgres), then yes, it makes sense to try and reduce
the surprise effect. Even if they don't have an opinion about it,
knowing about the change will help them upgrade.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikiscan statistics tool for Wikimedia projects

2017-08-14 Thread Strainu
Hi Vira,

I'm not 100% sure, but if I understand correctly, these are basically the
same stats from the official reports, only at day level:

- users/day: how many authenticated users were active (aka made
edits), on average, in a day of that year
- ip/day: how many anonymous users were active on average, in a day of
that year.
- user edits/day: how many edits were made by authenticated users in a
day, on average
... and so on.

HTH,
  Strainu

2017-08-14 14:15 GMT+03:00 Vira Motorko <vira.moto...@gmail.com>:
> Hi,
>
> I have a question here. Looking at Stegosaurus-looking graphs on
> http://uk.wikiscan.org I have a hard time understanding what they show.
> Users/IPs/bots who got registered? who edit? something else?
>
> I'm sorry for being dumb today.
>
> *--*
> *Vira Motorko // Віра Моторко*
> Wikimedia Ukraine <https://ua.wikimedia.org/> nonprofit organisation // ГО
> «Вікімедіа Україна»
> mobile: +380667740499 | facebook: vira.motorko
> <https://www.facebook.com/vira.motorko> | wikipedia: Ата
> <https://meta.wikimedia.org/wiki/User:Ата>
>
> If this email is about your daily job and it reaches you outside of the
> working hours, please, feel free to answer when it's appropriate! // Якщо
> це робочий лист і Ви отримали його не в робочий час, будь ласка,
> відповідайте, коли вважаєте за потрібне!
> Якщо маєте електронну скриньку на зразок _@_.ru, задумайтесь, будь ласка,
> над її зміною. Дякую!
>
> 2017-07-30 22:17 GMT+03:00 Pine W <wiki.p...@gmail.com>:
>
>> Wikiscan is an interesting tool for statistics fans. I suggest briefly
>> reading this IEG page
>> <https://meta.wikimedia.org/wiki/Grants:IEG/Wikiscan_multi-wiki>, then
>> playing with the tool on https://wikiscan.org/
>>
>> Pine
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML5 sections update

2017-08-03 Thread Strainu
Thanks for letting us know Max! Just a couple of questions from me at
this point:

1. Will this be included in the tech news?
2. How will the behavior of anchorencode change? Should it be removed
from templates after the new IDs become default?

Strainu

2017-08-03 3:46 GMT+03:00 Max Semenik <maxsem.w...@gmail.com>:
> Hey, yesterday the patch implementing human-readable section IDs [0] was
> merged (thanks, Tim!). The new feature has already been enabled on beta
> cluster and you can try it yourselves, e.g. on [1] - some pages might still
> have old HTML cached though and require a null edit to update.
>
> What's next? We can probably flip it on testwiki after Wikimania, but
> further deployments depend on Reading Web, Apps and Parsoid teams. We,
> however, can already encourage editors to check it out in staging.
>
> Unanswered question: do we really need to percent-encode the IDs? There is
> extensive discussion of that in the aforementioned task, concluding that
> percent-encoding is probably more "correct". However, not escaping it gives
> much better browser compatibility (close to 100%). We can change this at
> any time because no links will be broken due to the way browsers handle
> percent-encoded fragments.
>
> What's the impact for end users? When the commit above goes live, there
> will be no user-visible changes, and almost no HTML changes at all (only
> some interface IDs/classes generated from MediaWiki messages might slightly
> change, but no links will be broken). When we migrate we will initially
> enable new IDs as a fallback. After 1 month, the new IDs will become
> default while old ones will be used as a fallback. This way, old links will
> continue to work, and we have no plans to disable the fallbacks in the
> foreseeable future.
>
> What's the impact for developers? Sanitizer::escapeId() has been
> deprecated, all code should migrate
> to escapeIdForAttribute(), escapeIdForLink()
> or escapeIdForExternalInterwiki(). Warning: unlike escapeId(), these
> functions' output is not guaranteed to be HTML-safe so it must be escaped.
> Our security guidelines say that everything should be HTML-safe anyway, so
> even escapeId() should be properly escaped. The same deprecation happened
> in JavaScript, mw.util.escapeId() is also deprecated.
>
> 
> [0] https://phabricator.wikimedia.org/T152540
> [1]
> https://ru.wikipedia.beta.wmflabs.org/wiki/%D0%92%D0%B8%D0%BA%D0%B8%D0%BF%D0%B5%D0%B4%D0%B8%D1%8F
>
> --
> Best regards,
> Max Semenik ([[User:MaxSem]])
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New scoring models deployed

2017-07-21 Thread Strainu
2017-07-19 3:56 GMT+03:00 Adam Wight <awi...@wikimedia.org>:
>- Romanian Wikipedia: new models for damaging and goodfaith.

According to https://ores.wikimedia.org/v3/scores/ , ro.wiki also has
a model for reverted. Could you confirm that?

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop in mainpage pageviews?

2017-07-16 Thread Strainu
2017-07-15 14:34 GMT+03:00 יגאל חיטרון <khit...@gmail.com>:
> Hello, Strainu.
> 1. Try in place of "all-access" use different platforms. You'll see, as I
> expected reading your letter, that the effect you recognized appears in
> "mobile-web" mostly.
> 2. I do not remember the exact date, but a couple of months ago the way for
> pageviews counting was changed, using cookies, affecting the mobile web
> views. This can be the cause.
> Igal (User IKhitron)

Thank you Igal, that must be the reason, I'll look for the change
announcement just to understand what it means exactly.

Strainu

>
> On Jul 15, 2017 13:47, "Strainu" <strain...@gmail.com> wrote:
>
>> Hi,
>>
>> Starting from an unrelated discussion on meta, I noticed a significant
>> drop in main page views for several wikis starting from April this
>> year. Is there anything we (or Google) did at that time to justify
>> this drop?
>>
>> ro.wiki: http://tools.wmflabs.org/pageviews/?project=ro.
>> wikipedia.org=all-access=user=2016-
>> 01=2017-06=Pagina_principal%C4%83
>>
>> hu.wiki: http://tools.wmflabs.org/pageviews/?project=hu.
>> wikipedia.org=all-access=user=2016-
>> 01=2017-06=Kezd%C5%91lap
>>
>> fr.wiki: http://tools.wmflabs.org/pageviews/?project=fr.
>> wikipedia.org=all-access=user=2016-
>> 01=2017-06=Wikip%C3%A9dia:Accueil_principal
>>
>> de.wiki: http://tools.wmflabs.org/pageviews/?project=de.
>> wikipedia.org=all-access=user=2016-
>> 01=2017-06=Wikipedia:Hauptseite
>>
>> en.wiki: http://tools.wmflabs.org/pageviews/?project=en.
>> wikipedia.org=all-access=user=2016-
>> 01=2017-06=Main_Page
>> (slightly different pattern)
>>
>> The same cannot be said for other projects, for instance uk.wiki:
>> http://tools.wmflabs.org/pageviews/?project=uk.wikipedia.org=all-
>> access=user=2016-01=2017-06=%D0%93%
>> D0%BE%D0%BB%D0%BE%D0%B2%D0%BD%D0%B0_%D1%81%D1%82%D0%BE%D1%
>> 80%D1%96%D0%BD%D0%BA%D0%B0
>>
>> Thanks,
>>Strainu
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Drop in mainpage pageviews?

2017-07-15 Thread Strainu
Hi,

Starting from an unrelated discussion on meta, I noticed a significant
drop in main page views for several wikis starting from April this
year. Is there anything we (or Google) did at that time that might explain
this drop?

ro.wiki: 
http://tools.wmflabs.org/pageviews/?project=ro.wikipedia.org&platform=all-access&agent=user&start=2016-01&end=2017-06&pages=Pagina_principal%C4%83

hu.wiki: 
http://tools.wmflabs.org/pageviews/?project=hu.wikipedia.org&platform=all-access&agent=user&start=2016-01&end=2017-06&pages=Kezd%C5%91lap

fr.wiki: 
http://tools.wmflabs.org/pageviews/?project=fr.wikipedia.org&platform=all-access&agent=user&start=2016-01&end=2017-06&pages=Wikip%C3%A9dia:Accueil_principal

de.wiki: 
http://tools.wmflabs.org/pageviews/?project=de.wikipedia.org&platform=all-access&agent=user&start=2016-01&end=2017-06&pages=Wikipedia:Hauptseite

en.wiki: 
http://tools.wmflabs.org/pageviews/?project=en.wikipedia.org&platform=all-access&agent=user&start=2016-01&end=2017-06&pages=Main_Page
(slightly different pattern)

The same cannot be said for other projects, for instance uk.wiki:
http://tools.wmflabs.org/pageviews/?project=uk.wikipedia.org&platform=all-access&agent=user&start=2016-01&end=2017-06&pages=%D0%93%D0%BE%D0%BB%D0%BE%D0%B2%D0%BD%D0%B0_%D1%81%D1%82%D0%BE%D1%80%D1%96%D0%BD%D0%BA%D0%B0

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Roadmap for CX?

2017-05-02 Thread Strainu
2017-05-02 11:42 GMT+03:00 Amir E. Aharoni <amir.ahar...@mail.huji.ac.il>:
> 2017-04-27 8:55 GMT+03:00 Strainu <strain...@gmail.com>:
>
>> Following the recent outage, we've had a new series of complaints
>> about the lack of improvements in CX, especially related to
>> server-side activities like saving/publishing pages.
>>
>> Now, I know the team is involved in a long-term effort to merge the
>> editor with the VE, but is there an end in sight for that effort? Can
>> I tell people who ask "look, 6 more months then we'll have a much
>> better translation tool"?
>>
>> Is there a publicly available roadmap for this project and more
>> generally, for CX?
>>
>>
> Hi,
>
> Thanks again for bringing this up.
>
> Currently the Language team is indeed working on transitioning the editing
> component to VE. At the moment we are completing the rewrite of the
> frontend internals using OOjs UI and so using VE's special handling of edge
> cases. This is more than a refactoring—this will also improve the stability
> of several features such as saving and loading, paragraph alignment, and
> table handling.
>
> We hope to complete the transition of the translation editing interface to
> VE in July–September 2017. This will not only change the interface itself,
> but will also bring in some of the most often requested CX features, such
> as the ability to add new categories, templates, and references using VE's
> existing tools rather than just adapt them, and to edit the translation
> using wiki syntax.
>
> The next part to develop would be another round of improvement of template
> support. The previous iteration was done in the latter half of 2016, and
> allowed adapting a much wider array of templates, including infoboxes.
> However, one important kind of template that is not yet supported well
> enough is ones inside references (a.k.a. citations or footnotes), and this
> will be the focus of the next iteration. We also plan to improve CX’s
> template editor itself by allowing machine translation of template
> parameter values, and by fixing several outstanding bugs in it.
>
> After finishing these two major projects, in early 2018 we expect to work
> on fixing various remaining bugs, after which we plan to start declaring
> Content Translation as non-beta in some languages. We are figuring out
> which bugs exactly will these be; the current list is at
> https://phabricator.wikimedia.org/project/view/2030/ , but it will likely
> change somewhat before we get there. (Suggestions about what should go
> there are welcome at any time.)



Thanks Amir!

Will these features be available all at once, or will they be deployed
gradually?

Thanks,
   Strainu

>
> Finally, two further future directions that we are thinking about
> longer-term are:
> 1. Translation List: Shared and personal lists of articles awaiting
> translation ( https://phabricator.wikimedia.org/T96147 ). We already have
> designs for it, but the implementation will have to wait until we fix the
> more urgent issues above.
> 2. Better support on mobile devices. This is complicated, but much-needed.
> Some early thoughts about this can be found at
> https://www.mediawiki.org/wiki/Content_translation/Product_Definition/Mobile_exploration
> , but there will need to be much more design and development around this.
>
> You can see a more formal document about this here, although the content is
> largely the same:
> https://www.mediawiki.org/wiki/Content_translation/Roadmap/2017%E2%80%932018
>
> The Language team already had this more or less figured out a couple of
> months ago, but the publishing was delayed because of the higher-level
> planning process (
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Product
> ).
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Roadmap for CX?

2017-04-26 Thread Strainu
Following the recent outage, we've had a new series of complaints
about the lack of improvements in CX, especially related to
server-side activities like saving/publishing pages.

Now, I know the team is involved in a long-term effort to merge the
editor with the VE, but is there an end in sight for that effort? Can
I tell people who ask "look, 6 more months then we'll have a much
better translation tool"?

Is there a publicly available roadmap for this project and more
generally, for CX?

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] A few Reading Web announcements: new header, page previews rollout update, lead section restricted PageImages

2017-04-06 Thread Strainu
Hi Olga,

2017-04-05 15:07 GMT+03:00 Olga Vasileva <ovasil...@wikimedia.org>:
> - PageImages restricted to the lead section [5]

This sounds like a setback for tools and robots that use this feature
to easily retrieve images associated with articles (the hard way being
parsing the source), although based on the examples from the bug I
agree that many of them were not really relevant. What are the next
steps to further improve PageImages? Any plans to implement T95026
and/or T91683 in the near future?

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SHA-1 hash officially broken

2017-02-25 Thread Strainu
2017-02-24 21:29 GMT+02:00 Florian Schmidt :
> About the git part: There's an interesting statement from Linus [1] with 
> which I can totally agree (and additionally this one[2]). So probably even 
> this is not a "high priority, do nothing else" task :P

Just FYI, this was confirmed in [3]

>
> [1] http://marc.info/?l=git=148787047422954
> [2] 
> https://public-inbox.org/git/pine.lnx.4.58.0504291221250.18...@ppc970.osdl.org/
[3] https://plus.google.com/+LinusTorvalds/posts/7tp2gYWQugL

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Multilingual maps: how to pick each feature language?

2017-02-10 Thread Strainu
2017-02-10 3:31 GMT+02:00 Yuri Astrakhan <yuriastrak...@gmail.com>:
> TLDR: if browsing a map for French wiki, and a city only has a Russian and
> Chinese name, which one should be shown? Should the city name have
> different rules from a store or a street name? ...

Where exactly are those names stored? OSM, Commons, the local Wiki? If
in OSM, you should always have a fallback showing key:name, whatever
it is, instead of name:fr.

Strainu
>
> I have been hacking to add unlimited multilingual support to Wikipedia
> maps, and have language fallback question:  given a list of arbitrary
> languages for each map feature, what is the best choice for a given
> language?
>
> I know Mediawiki has language fallbacks, but they are very simple (e.g. for
> "ru", if "ru" is not there, try "en").
>
> Some things to consider:
> * Unlike Wikipedia, where readers go for "meaning", in maps we mostly need
> "readability".
> * Alphabets: Latin alphabet is probably the most universally understood,
> followed by...? Per target language?
> * Politics: places like Crimea tend to have both Russian and Ukrainian
> names defined, but if drawing map in Ukrainian, and some feature has
> Russian and English names, but not Ukrainian, should it be shown with the
> Russian or Ukrainian name?
> * When viewing a map of China in English, should Chinese (local) name be
> shown together with the English name? Should it be shown for all types of
> features (city name, street name, name of the church, ...?)
>
> Thanks!
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changes in colors of user interface

2016-12-12 Thread Strainu
2016-12-12 10:21 GMT+02:00 Quim Gil <q...@wikimedia.org>:
> Hi, let me check this incident under the light of the Technical
> Collaboration Guideline
> <https://www.mediawiki.org/wiki/Technical_Collaboration_Guideline> (draft
> under review, feedback welcome in the related discussion pages).
>
> https://www.mediawiki.org/wiki/Technical_Collaboration_Guideline/Milestone_communication
> defines when and where are communications expected.

Thank you for the pragmatic approach, Quim. I launched 2 discussions
there, referring to changes that require action from the communities
[1] and changes affecting a large number of pages [2].
find a middle ground on at least some of the subjects.

Strainu

[1] https://www.mediawiki.org/wiki/Topic:Th1vs3h97d96ajaf
[2] https://www.mediawiki.org/wiki/Topic:Th1wc4pu1qplo4k8

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changes in colors of user interface

2016-12-11 Thread Strainu
2016-12-11 12:13 GMT+02:00 Amir Ladsgroup <ladsgr...@gmail.com>:
> On Sat, Dec 10, 2016 at 3:19 PM Strainu <strain...@gmail.com> wrote:
>
> For 3 reasons:
> 1. While MW is open source, what gets deployed on the WMF servers is
> the legal and moral responsability of the Foundation.
>
> They have responsibility but it's limited.
>
> 2. The WMF has an 8-person "Community Liaisons" team that is dedicated
> to "inform the communities during the whole process of development of
> said software, and facilitate its adoption." [1] For me, that means
> that they should be the ones that make sure that changes that impact
> million of pages don't get left out, even if the developer forgets to
> notify anyone.
>
> It's not correct. community liaisons can't check every patch to see if it's
> going to impact users (and most patches have effect on users, even
> indirect). They built a protocol and told developers to inform them when
> they think the change is going to impact users *significantly*.

Then perhaps we need to define "significantly" to include changes that
affect every single article on every single wiki we have. If that's
not significant, I don't know what is. People claimed in this thread
that your change was "almost invisible". But if just 1:1,000,000
pageviews causes someone to notice the change, in just a month, your
change will have affected over 15,000 people. That's significant,
IMHO.

Where is this protocol you talk about laid out?

>
> 3. The average wikipedian does not seem to make the difference between
> volunteer developers and employees of the WMF (this is a personal
> opinion and I might be wrong).
>
> It's horrifyingly wrong. Lots of stuff is being done by either volunteers
> or staff in their volunteer capacity. You need to correct this view. not to
> blame WMF, right?

Wrong. It's just not realistic to expect 70,000+ editors distributed
in hundreds or thousands of wikis to understand free software
development, deployments and versioning. For them, it's "the
developers" who "break the site" and that's all they want (and need)
to know. The details need to be abstracted by someone, and the WMF is
the best positioned entity to do that. I think this is already
addressed, though, since that's the role of the community liaisons.
It's just a question of how much those people can do - I ask more from
them, others want less.

Since we got to this point, I would also like to address Tyler's email
from yesterday, because it shows a similar lack of understanding of
how non-technical communities use MW. He talks about red tape
slowing down development, but forgets (or does not realise) that
constant tweaking of the website is in itself red tape for smaller
communities, as it takes away valuable resources from other tasks:
most Wikipedias simply do not have a web developer with admin rights.
Also, throwing the responsibility of maintaining Common.css up to date
solely on the communities without providing a minimum of heads-up and
guidance ("I don't even think this change needs to be announced")
might look like a smart thing to do, but in the long run it turns
pitilessly against the developers, as it results in even more
technical debt that they need to address when developing new features.
I think the team working on global gadgets might have some interesting
stories on this process :)


>
> Best
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changes in colors of user interface

2016-12-10 Thread Strainu
2016-12-10 12:48 GMT+02:00 Amir Ladsgroup <ladsgr...@gmail.com>:
> I just want you to stop there
>
> On Sat, Dec 10, 2016 at 2:07 PM Strainu <strain...@gmail.com> wrote:
>
>> That's one way to put it. I would rather say that we reacted to yet
>> another slip-up in communication from the Foundation.  Why is it so
>> hard for you guys to push the information to wikis?
>>
>> Why is this related to WMF?

For 3 reasons:
1. While MW is open source, what gets deployed on the WMF servers is
the legal and moral responsibility of the Foundation.
2. The WMF has an 8-person "Community Liaisons" team that is dedicated
to "inform the communities during the whole process of development of
said software, and facilitate its adoption." [1] For me, that means
that they should be the ones that make sure that changes that impact
million of pages don't get left out, even if the developer forgets to
notify anyone.
3. The average wikipedian does not seem to make the difference between
volunteer developers and employees of the WMF (this is a personal
opinion and I might be wrong).

> Do you really want to compare this to something like
> MediaViewer rollout?

MediaViewer, VisualEditor and many others, yes. But not in the sense
that this was as bad as those, rather that the WMF missed another good
opportunity to establish trust and prepare for the next big feature.

Small, almost invisible changes are the best time to practice and
experiment with notifications and to gauge the community response: how
many communities actually made changes to Common.css? How many needed
to make changes? For the ones that did not make the changes, was it
because they did not have the knowledge or because they missed the
memo? Etc, etc, etc... This way, we (the "tech abassadors"), you
(developers) and them (community liaisons) can all be better prepared
for the next big deployment.

Strainu

[1] https://meta.wikimedia.org/wiki/Community_Liaisons

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changes in colors of user interface

2016-12-10 Thread Strainu
2016-12-10 11:07 GMT+02:00 Gergo Tisza <gti...@wikimedia.org>:
> For reference, these are the changes being discussed:
> https://phabricator.wikimedia.org/F5022813
>
>
> 1) Significantly larger changes than this are happening all the time (the
> OOUI-ification of old forms, for example), without anyone noticing,

Agreed. However, these changes do not apply to millions of articles.
It's a question of scale and visibility.

> it's pretty clear people are reacting to the announcement here and not the
> actual change.

That's one way to put it. I would rather say that we reacted to yet
another slip-up in communication from the Foundation.  Why is it so
hard for you guys to push the information to wikis?


>
> There is nothing wrong with not paying attention to something well outside
> your work area, and people should not be excluded from a discussion topic
> just because they are new (or casual) to it, but please consider how it
> creates an unhealthy community dynamic when people are criticized for
> announcing changes which would otherwise go unnoticed.

I don't think an RFC was needed, but:
1. an announcement on this list with a Phabricator task number would have been nice
2. an announcement *on wiki, before the deployment* was mandatory.

The rest of your mail does not seem related to this particular change,
so I'll respond separately.


2016-12-10 1:43 GMT+02:00 Andre Klapper <aklap...@wikimedia.org>:
> Regarding a heads-up, would it have helped to have these changes listed
> in https://meta.wikimedia.org/wiki/Tech/News/2016/49 ?

Yes, with a follow-up in /50. Do note that the text in /50 is
insufficient; you might want to add something like what Amir said: "in
order to keep consistency between all elements of a wiki page, change
such usages in your Mediawiki:Common.css (for example for infoboxes)."
That's because not all Tech News readers can deduce action items by
themselves.

HTH,
Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changes in colors of user interface

2016-12-09 Thread Strainu
2016-12-09 10:53 GMT+02:00 Amir Ladsgroup <ladsgr...@gmail.com>:
> But these changes are too small to notice and even smaller to dislike.

Amir, I believe you've been a Wikimedian for too long to really
believe that. No change is too small to be disliked by one or more
people!

>> It seems to me that Wikimedians should be given
>> plenty of notice that color changes like these are proposed, and should be
>> given ample opportunity to comment on them before they are rolled out, but
>> this is the first that I recall hearing of these changes. I would go so far
>> as to say that there should be an RfC before making changes like this to
>> community wikis.

Totally agree. It's not a question of how small/large such a change
is, it's a question of collaboration and mutual respect. Which was
lacking, yet again :(

Some communities will miss these changes completely just because they
had already overridden the old values in Common.css, most likely for
historical reasons. A heads-up would have allowed them to decide on
whether to go with the upstream changes or stick with the current
layout.

I call again on the staff and volunteers who work on the frontend to
announce changes well ahead of time and as widely as possible.

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Google Code-in 2016 just started and we need your help!

2016-12-03 Thread Strainu
2016-12-03 13:25 GMT+02:00 John Mark Vandenberg <jay...@gmail.com>:
> Is there any warning when the 36 hour limit is approaching?
>
> Are Wikimedia org admins watching this limit somehow?
> Is there some process in place?
> e.g. 24 hr "this is getting worrying" status, where we find another
> mentor / code reviewer?

John,

There was an email sent to mentors which says:

"Mentors do have 24h of time to review and reply to questions until
Google will send a reminder email, and a maximum of 36h in total.
However, if you plan to be off for a weekend or take holidays, please
either do tell us (admins could also add more co-mentors to a task if
you have somebody in mind), or when creating tasks simply put some
"[DONT PUBLISH BEFORE 20161231]" prefix or such into the task summary."

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-30 Thread Strainu
Hi,

After extensive discussions with my fellow organizers, we decided to
go with an OAuth consumer on wikitech, and I submitted a new consumer.
It would be great if Bryan or another OAuth admin could take a look at
it (we're waaay overdue with that already).

Thank you,
   Strainu

2016-10-22 17:06 GMT+03:00 Strainu <strain...@gmail.com>:
> Thank you all for your suggestions. I'll discuss with the people who
> implemented the original application and decide on the best approach
> given the limited resources and time.
>
> Strainu
>
> 2016-10-22 3:23 GMT+03:00 Gergo Tisza <gti...@wikimedia.org>:
>> On Fri, Oct 21, 2016 at 3:38 PM, Strainu <strain...@gmail.com> wrote:
>>
>>> I'm simply trying to make it easy for the users. In the current
>>> version of the tool, they login with the github account and the rest
>>> happens "magically": the tool retrieves their pull requests and scores
>>> them according to a predefined set of criteria - no need for user
>>> input of any kind. I just want the same workflow for patches submitted
>>> to gerrit and I needed a way to authenticate the users and match the
>>> information I have from the OAuth endpoint with reviews from gerrit.
>>
>>
>> In that case, I would require them to prove account ownership by sending an
>> email to the gerrit email address with a verification link.
>>
>> Or you could require that that email address is present in Github (it does
>> not have to be the primary address, and this is a good practice anyway as
>> it will ensure that the clone repo on Github attributes the patch to them
>> correctly once it gets merged - although in theory the Gerrit owner,
>> committer and author email address could be three different things, but
>> that's unlikely to happen) and then verify that somehow. You can probably
>> just upload a gist with that address and check whether Github attributes it
>> to them.
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-22 Thread Strainu
Thank you all for your suggestions. I'll discuss with the people who
implemented the original application and decide on the best approach
given the limited resources and time.

Strainu

2016-10-22 3:23 GMT+03:00 Gergo Tisza <gti...@wikimedia.org>:
> On Fri, Oct 21, 2016 at 3:38 PM, Strainu <strain...@gmail.com> wrote:
>
>> I'm simply trying to make it easy for the users. In the current
>> version of the tool, they login with the github account and the rest
>> happens "magically": the tool retrieves their pull requests and scores
>> them according to a predefined set of criteria - no need for user
>> input of any kind. I just want the same workflow for patches submitted
>> to gerrit and I needed a way to authenticate the users and match the
>> information I have from the OAuth endpoint with reviews from gerrit.
>
>
> In that case, I would require them to prove account ownership by sending an
> email to the gerrit email address with a verification link.
>
> Or you could require that that email address is present in Github (it does
> not have to be the primary address, and this is a good practice anyway as
> it will ensure that the clone repo on Github attributes the patch to them
> correctly once it gets merged - although in theory the Gerrit owner,
> committer and author email address could be three different things, but
> that's unlikely to happen) and then verify that somehow. You can probably
> just upload a gist with that address and check whether Github attributes it
> to them.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-21 Thread Strainu
2016-10-22 1:16 GMT+03:00 Gergo Tisza :
> Are you worried that the users
> are going to give positive reviews to themselves to bias the scores?

Authentication is used only to ensure they don't claim somebody else's
submissions (say, Gerrit Patch Uploader's :) ). Yes, this could
probably be detected manually, but we're trying to go with an
automated workflow where manual interventions are at a minimum.

> Can you better explain what you are after?

I'm simply trying to make it easy for the users. In the current
version of the tool, they login with the github account and the rest
happens "magically": the tool retrieves their pull requests and scores
them according to a predefined set of criteria - no need for user
input of any kind. I just want the same workflow for patches submitted
to gerrit and I needed a way to authenticate the users and match the
information I have from the OAuth endpoint with reviews from gerrit.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-21 Thread Strainu
2016-10-21 19:03 GMT+03:00 Alex Monk <kren...@gmail.com>:
> On 21 October 2016 at 10:13, Strainu <strain...@gmail.com> wrote:
>
>> 1. Gerrit does not seem to support oauth authentication. I vaguely
>> remember that the gerrit account used to be linked to the mw.org
>> account. Is there any way I could use the mw.org auth to retrieve the
>> gerrit account and/or authenticate to gerrit with it? The gerrit
>> uploader seems to only use the mw account to put the username in the
>> committer field and then uploads the change as itself.
>
>
> Gerrit uses LDAP authentication, which is controlled by
> wikitech.wikimedia.org and matches accounts there. This is separate to
> Wikimedia SUL which controls accounts on wikis like mediawiki.org.
> See https://phabricator.wikimedia.org/T148048 and the bottom of
> https://meta.wikimedia.org/wiki/Community_Tech/Tool_Labs_support/Tool_Labs_vision#Project_roadmap


Thanks Alex, that's very nice! I have some follow-up questions:
1. What's the best way to match users between wiki and gerrit? I
suspect the answer is the username, since on gerrit one can register
multiple emails and I can't find a full/real name on wiki, but what
happens if a user is renamed? Also, does the shell username have
anything to do with gerrit?
2. When requesting a new OAuth consumer, what should I choose as the
Type of grants being requested?
  - Authentication only, no API access.
  - Authentication only with access to real name and email address via
Special:OAuth/identify, no API access.
  - Request authorization for specific permissions.

Thank you,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-21 Thread Strainu
2016-10-21 16:08 GMT+03:00 Marielle Volz <mv...@wikimedia.org>:
> You can add multiple e-mails both to gerrit [0] and github [1]. As long as
> the e-mail address you are making commits with is added to both accounts,
> you can likely use your preexisting software directly on the mirrored
> github repos[2]. For example, my contributions to the citoid repo, all of
> which were made on gerrit, are also automatically* associated with my
> github account [3]. You could add a throwaway email to both both gerrit and
> github and set this as your git email [4] and then your e-mail will not be
> publicly exposed anywhere.


Hi Marielle,

Thank you for your response, it was really informative. Your solution
seems basically equivalent to skipping gerrit entirely, right? The big
downside of that is that we can't evaluate changes that were not
merged. We also can't score commits based on parameters from the
review (such as how many patch sets were uploaded, etc.).

Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-21 Thread Strainu
Hi everyone,

I'm organizing a contest for people in Romania willing to contribute
to Wikimedia code. [1] In order to automatically grade the
contributions, we're using a tool already developed by our partners,
ROSEdu, which reviews changes made on github [2][3].

The current (github-based) workflow is:
1. The admins add a number of repositories that qualify for the contest
2. The participants log in with their github account (using OAuth)
3. The software retrieves all the pull requests they made to the
relevant projects.
4. A number of points is assigned to each pull request using a
predefined formula (based on the number of touched lines, whether the
change was merged etc.; can be customized, see the rough sketch below)

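To give an idea of the grading, here is a rough sketch of such a
formula in Python (the weights below are made up for illustration,
they are not the actual contest values):

# Sketch of a per-pull-request scoring formula. The weights are
# illustrative only; the real formula is configurable per contest.
def score_pull_request(lines_added, lines_removed, merged, comments):
    touched = lines_added + lines_removed
    points = min(touched, 200) * 0.1   # cap so huge diffs don't dominate
    if merged:
        points *= 2                    # merged changes count double
    points += min(comments, 10) * 0.5  # reward engaging with reviewers
    return round(points, 1)

print(score_pull_request(lines_added=40, lines_removed=5, merged=True,
                         comments=3))
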
I need some guidance on how to replicate this workflow for Wikimedia's gerrit.

I've read the API docs [4] and looked at the gerrit uploader [5] and
it seems that retrieving the reviews is fairly straightforward, since
all the reviews seem to be available through unauthenticated access.
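
For example, something along these lines should list all the changes
owned by a participant, assuming we already know their gerrit email
address (untested sketch; it needs the Python "requests" library):

import json
import requests  # external dependency, assumed to be installed

GERRIT = "https://gerrit.wikimedia.org/r"

def changes_for(email):
    # Anonymous query for all changes owned by the given account.
    resp = requests.get(GERRIT + "/changes/", params={"q": "owner:" + email})
    resp.raise_for_status()
    # Gerrit prepends a magic ")]}'" line to JSON responses (XSSI
    # protection), so strip the first line before parsing.
    return json.loads(resp.text.split("\n", 1)[1])

for change in changes_for("someone@example.org"):
    print(change["status"], change["subject"])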

The real issue is how to match the user in the tool with the reviews
without user intervention. Any ideas or advice are appreciated, but
here are my thoughts on the issue:

1. Gerrit does not seem to support oauth authentication. I vaguely
remember that the gerrit account used to be linked to the mw.org
account. Is there any way I could use the mw.org auth to retrieve the
gerrit account and/or authenticate to gerrit with it? The gerrit
uploader seems to only use the mw account to put the username in the
committer field and then uploads the change as itself.

2. The simplest (although not so secure) solution would be to ask
people to submit their changes using the same email address used for
their github account. This will only work if the user is willing to
make their github address public (I'm not doing that, for instance).

3. Another idea would be to match the gerrit account with the github
account. This sounds even less reliable.

4. Give up and ask the users to submit the email/user used for gerrit
and check for cheaters manually (this should work as long as the
number of contributors is small)

Thanks,
  Strainu

[1] https://www.mediawiki.org/wiki/Wikimedia_Challenge_powered_by_ROSEdu
[2] http://challenge.rosedu.org/
[3] https://github.com/rosedu/challenge
[4] https://gerrit.wikimedia.org/r/Documentation/rest-api.html
[5] https://github.com/valhallasw/gerrit-patch-uploader

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to check that the page contents have changed from the API

2016-09-30 Thread Strainu
2016-09-30 10:26 GMT+03:00 Strainu <strain...@gmail.com>:
> Thank you very much Daniel, this is exactly what I was looking for.
> Just to be sure, is that timestamp also updated when a template in the
> page is changed? I would expect so.

Sorry, just saw that you already answered this one in the original email.

>
> Regards,
>Strainu
>
> 2016-09-30 10:12 GMT+03:00 Daniel Kinzler <daniel.kinz...@wikimedia.de>:
>> You can check recentchanges for the page. Relevant changes to wikidata items
>> used on a page are injected into the recentchanges stream, so they show up in
>> the watchlist of people watching the page on wikipedia. You may need to set
>> rctype=edit|external in your API query though.
>>
>> Note that this does not (yet) work for the page history, just recentchanges.
>>
>> You can also look at the "touched" timestamp as returned by the API:
>> <https://en.wikipedia.org/w/api.php?action=query&prop=info&titles=Main%20Page>
>>
>> This will be updated whenever the page is purged/rerendered, e.g. when a
>> template changed - or a wikidata item.
>>
>> HTH
>> Daniel
>>
>> Am 30.09.2016 um 01:59 schrieb Strainu:
>>> With the advent of Wikidata-based infoboxes, the page contents can
>>> change without the local text being changed, so without a new
>>> revision. Is there any way tho find out when this happens from the
>>> API? I know I can always do 2 API calls, one for the page and one for
>>> the item, but that's time consuming.
>>>
>>> Thanks,
>>>   Strainu
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>>
>>
>> --
>> Daniel Kinzler
>> Senior Software Developer
>>
>> Wikimedia Deutschland
>> Gesellschaft zur Förderung Freien Wissens e.V.
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to check that the page contents have changed from the API

2016-09-30 Thread Strainu
Thank you very much Daniel, this is exactly what I was looking for.
Just to be sure, is that timestamp also updated when a template in the
page is changed? I would expect so.
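
Just for reference, this is roughly how I plan to read that timestamp
(untested sketch; the wiki and the page title are only examples):

import requests  # external dependency, assumed to be installed

API = "https://ro.wikipedia.org/w/api.php"

def touched(title):
    # action=query&prop=info returns the "touched" timestamp, which is
    # updated whenever the page is purged/rerendered.
    params = {
        "action": "query",
        "prop": "info",
        "titles": title,
        "format": "json",
        "formatversion": 2,
    }
    data = requests.get(API, params=params).json()
    return data["query"]["pages"][0]["touched"]

print(touched("Pagina principală"))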

Regards,
   Strainu

2016-09-30 10:12 GMT+03:00 Daniel Kinzler <daniel.kinz...@wikimedia.de>:
> You can check recentchanges for the page. Relevant changes to wikidata items
> used on a page are injected into the recentchanges stream, so they show up in
> the watchlist of people watching the page on wikipedia. You may need to set
> rctype=edit|external in your API query though.
>
> Note that this does not (yet) work for the page history, just recentchanges.
>
> You can also look at the "touched" timestamp as returned by the API:
> <https://en.wikipedia.org/w/api.php?action=query&prop=info&titles=Main%20Page>
>
> This will be updated whenever the page is purged/rerendered, e.g. when a
> template changed - or a wikidata item.
>
> HTH
> Daniel
>
> Am 30.09.2016 um 01:59 schrieb Strainu:
>> With the advent of Wikidata-based infoboxes, the page contents can
>> change without the local text being changed, so without a new
>> revision. Is there any way tho find out when this happens from the
>> API? I know I can always do 2 API calls, one for the page and one for
>> the item, but that's time consuming.
>>
>> Thanks,
>>   Strainu
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
