[Wikitech-l] Re: inquiry on method to compress PNG imagefiles stored by WikiMedia

2024-04-23 Thread Brian Wolff
Generally disk space is cheap enough that doing these sorts of things
isn't worth the extra complexity, especially when we just have to convert
it back to png in order to serve it to the user at the end of the day.

LZMA is a more advanced compression algorithm than DEFLATE (which is what
png uses), so it's not entirely surprising that it might give better results
for some images.
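
To make that concrete, here is a rough stdlib sketch (synthetic data, nothing PNG-specific). One structural difference is that DEFLATE can only reference matches within a 32 KiB window, while LZMA's dictionary is orders of magnitude larger, so redundancy spaced further apart than 32 KiB is invisible to DEFLATE but cheap for LZMA:

```python
import lzma
import os
import zlib

# A 64 KiB block of incompressible noise, repeated: all the redundancy
# lies 64 KiB apart -- beyond DEFLATE's 32 KiB window, well within LZMA's.
block = os.urandom(64 * 1024)
data = block * 16  # 1 MiB total

deflated = zlib.compress(data, level=9)  # DEFLATE, the algorithm inside PNG
lzmaed = lzma.compress(data, preset=9)   # LZMA, what mkdwarfs -l9 and xz use

print(f"raw {len(data)}, deflate {len(deflated)}, lzma {len(lzmaed)}")
```

On this input DEFLATE barely shrinks the data at all, while LZMA reduces it to roughly the size of a single block. Real images fall somewhere in between, which is consistent with XPM + LZMA beating PNG's built-in compression on some inputs.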

--
Bawolff
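
P.S. On the question further down about how to verify that the .xpm and the original .png are still the same picture: compare decoded pixel data rather than file bytes. A sketch, assuming ImageMagick 7's `magick` front-end is installed (the `%#` format escape prints a hash computed over the decoded pixel values, independent of the container format):

```python
import subprocess

def pixel_signature(path: str) -> str:
    """Hash of the decoded pixel data (ImageMagick's %# escape),
    the same for any container format holding identical pixels."""
    result = subprocess.run(
        ["magick", "identify", "-format", "%#", path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def differing_pixels(a: str, b: str) -> int:
    """Absolute-error metric: how many pixels differ between two images."""
    result = subprocess.run(
        ["magick", "compare", "-metric", "AE", a, b, "null:"],
        capture_output=True, text=True,
    )
    # `compare` prints the metric on stderr.
    return int(float(result.stderr.strip().split()[0]))

# differing_pixels("image.png", "xpm/image.xpm") == 0 would mean the
# conversion was pixel-for-pixel lossless.
```

One caveat: XPM has no partial transparency, so PNGs with an alpha channel may not round-trip exactly; `differing_pixels` makes that measurable rather than a matter of guessing.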

On Tuesday 23 April 2024, TealSynapse--- via Wikitech-l <
wikitech-l@lists.wikimedia.org> wrote:

>
> I've never contributed to any mailing lists, or participated much, so I
> don't know how to start this off...
>
>
> Hi! I've realized I can compress a dataset of lossless .png files down to
> between a third and a fourth of its initial size on disk.
>
>
> That said: in compressing my own backups I find that lossless .png files
> converted to .xpm (not .ppm, not .bmp) before being compressed come out 3
> to 4 times smaller on disk than the initial .png with its own compression,
> regardless of any 'PNG optimization' done beforehand: the resulting .xpm
> files remain identical in size and compress to precisely the same size.
>
> The following commands¹ can surely be tinkered with to greater effect:
>
> ---start of shell commands---
> mkdir -p xpm
> magick convert image.png xpm/image.xpm
> mkdwarfs -i xpm/ -o compressed.dfs -l9
>
> ---end of shell commands---
> Now you have a compressed image, three to four times smaller in size on
> disk, to inspect.
>
> Here I openly wonder how a comparison would be carried out, to assert
> whether the resulting .xpm file and the lossless .png are indeed still the
> same picture.
>
> Likewise I see this quality of compression extends to the .xgm and .xbm
> formats.
>
>
> In sharing this I wish to bring up the above observation to the best of my
> ability. Insights are welcome as to why this happens and whether the image
> resulting from a lossless .png to .xpm conversion is indeed the same,
> i.e. whether this compressed bitmap outperforms the PNG compression
> significantly without compromising image integrity. (Do reply to say if
> this is irrelevant information or presented inadequately in any way: I
> don't wish to bring red herrings to this mailing list.)
>
>
> Ultimately this is about whether a large portion of Wikimedia image data
> can indeed be compressed further by this process, in a 'Pareto
> improvement' kind of way.
>
> -Ivy, 25
>
>
>
> For interest: sources for the programs referenced, and my brief notes on
> these. [1]
>
> magick: https://imagemagick.org/index.php
> mkdwarfs, part of the inappropriately named dwarfs toolset:
>
> https://github.com/mhx/dwarfs/blob/main/doc/mkdwarfs.md
>
> Note that the -l flag is given the value 9 in the command; this means
> LZMA is used in this program.
>
> Also note that while I use 'mkdwarfs' with LZMA here, I see the same
> result with any other program using LZMA or XZ, like the following 'dar',
> which is better suited to extracting single files from an archive and
> analyzing individual file compression ratios en masse.
>
> dar: http://dar.linux.free.fr/doc/man/dar.html
> Unfortunately the project's website doesn't use HTTPS, so here is a
> Wayback Machine link with HTTPS:
>
> https://web.archive.org/web/20240423233825/http://dar.linux.free.fr/doc/man/dar.html
>
> ---start of referenced dar command---
> dar -c output -zxz9 -R input/
> ---end of referenced dar command---
>
> Lastly, two clarifications:
>
> 'A lossless .png file' here means an image file which has never been
> converted in a lossy way. A .jpg file converted to a .png and used in this
> procedure produces an output taking more space on disk.
>
> 'The procedure' here refers to converting a lossless .png to the .xpm
> image file format and then compressing it with either LZMA or XZ in any
> program.
>
>
> Thank you for reading.
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: ORES To Lift Wing Migration

2023-08-04 Thread Brian Wolff
> Our understanding is that renaming extensions in MediaWiki is a long and
complicated process, so we'll likely not be able to rename it in the
foreseeable future.

Why?

Renaming is usually a bad thing because it often confuses the hell out of
users, but from a technical perspective it is pretty trivial.

--
Bawolff

On Friday, August 4, 2023, Luca Toscano  wrote:

> Hi Amir!
>
> Answering inline:
>
> On Thu, Aug 3, 2023 at 10:11 PM Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>>
>> The email says that "All ML models currently accessible on ORES are also
>> currently accessible on Lift Wing", and if I understand correctly, this
>> means that this feature in Recent Changes will keep working. Do I
>> understand correctly? :)
>>
>
> Definitely yes, we are working on migrating the ORES extension to Lift
> Wing, without any change required for users. The tracking task is
> https://phabricator.wikimedia.org/T319170. At the moment all wikis with
> the ORES extension enabled, except fi/en/wikidata, are already using models
> from Lift Wing.
>
>
>> In addition, I have some followup questions:
>>
>> 1. The MediaWiki extension that implements the frontend in Recent Changes
>> is itself named "ORES". It's an internal name that isn't seen much by wiki
>> editors except if they go to Special:Version or to translatewiki.
>> Nevertheless, as the time goes by, seeing the old name may start getting
>> weird. So what's the plan about it? Will this extension remain as is? Will
>> it be renamed? Will it be replaced with a new frontend extension in the
>> foreseeable future?
>>
>
> This is a good question and we don't have a definitive answer at the
> moment. Our understanding is that renaming extensions in MediaWiki is a
> long and complicated process, so we'll likely not be able to rename it in
> the foreseeable future. We would definitely like to add more models to RC
> Filters, for example Revert Risk (for the curious, see
> https://meta.wikimedia.org/wiki/Machine_learning_models/Proposed/Language-agnostic_revert_risk),
> but we are not sure yet whether it is worth creating a new extension or
> just expanding the ORES one. We'll get back to this list
> as soon as we have a better plan :)
>
>
>> 2. Back when ORES was originally developed and deployed around 2017,
>> several wiki editors' communities participated in the development by
>> adapting the product to the needs of their wikis and languages by
>> translating the ORES extension's user interface and, more importantly, by
>> labelling a sample of several thousands of diffs from their wiki using the
>> Wikilabels tool. The communities that did that whole process were, more or
>> less, the communities to which this Recent Changes enhancement was
>> deployed. Will anything like that have to be done again along with the move
>> away from ORES?
>>
>
> The first goal of Lift Wing is to provide a more modern and easy-to-use
> infrastructure to host models at the WMF, for internal teams and for the
> community. The focus of the Machine Learning team is to provide
> infrastructure to run models on, so other teams and the community will be
> able to propose what to host and we'll vet what is possible and what not
> (following strict criteria like security of data and PII, model
> architecture feasibility, etc..). Once a model is deployed on Lift Wing,
> there will be a team or a community group owning it, namely responsible for
> its development in terms of features etc.. (more info in
> https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing#Hosting_a_model).
> To summarize:
> * All the work done so far with ORES models will be preserved; it is
> already available on Lift Wing and anybody can use it. We hope that it is
> now easier to play with model servers and improve them (for WMF and the
> community), but we are open to any suggestion and feedback about how to
> improve it. For the curious, more details in the Lift Wing Wikitech page (
> https://wikitech.wikimedia.org/wiki/Machine_Learning/LiftWing).
> * The future work will be split into two main areas (as I see it):
> ** The ML team will keep working on improving the infrastructure,
> documentation, performance, etc.. of Lift Wing, to provide better tools and
> data access for any new idea related to models and their usage. We'll
> maintain the infrastructure with monitoring/alarms/etc.., so the day-to-day
> ops will not fall on the model owners (WMF and community), so that they
> will be able to concentrate only on the models and their future
> steps.
> ** Other WMF teams like Research will propose and work on new models that
> the community needs, but we'll also focus on improving what is currently
> being used. For example, most of the ORES traffic is for the goodfaith and
> damaging models that worked very well over the years but they rely on old
> training data and architectures. The Revert Risk models (for example,
> https://meta.wikimedia.org/wiki/Machine_learning_models/Proposed/
> 

[Wikitech-l] Re: [action requested] Technical Decision-Making Retrospective Survey

2023-07-18 Thread Brian Wolff
Just FYI, question 17 requires you to answer even though the instructions
say not to answer unless applicable.

On Tuesday, July 18, 2023, Moriel Schottlender 
wrote:

> Hello all,
>
> The Technical Decision-Making Forum Retrospective team
>  invites you to
> complete a survey about Wikimedia's technical decision-making processes.
> While there will be more ways to participate, this is the first and most
> important step in our data collection. It aims to gather information
> about your experience, thoughts, and needs regarding the process of making
> technical decisions across the Wikimedia technical spaces.
>
> This survey will be used for gathering information about the process and
> the needs around technical decision-making that touches our production
> systems.
>
> You can find the survey link here:
> https://wikimediafoundation.limesurvey.net/885471?lang=en
>
> Who should take this survey?
>
> People who do technical work that relies on software maintained by the
> Wikimedia Foundation (WMF)  or affiliates.  If you contribute code to
> MediaWiki or extensions used by Wikimedia, or you maintain gadgets or tools
> that rely on WMF infrastructure, this survey is for you.
>
> What is the deadline?
>
> *August 7th, 2023 *
>
> What will the Retrospective team do with the information?
>
> The retrospective team will synthesize the collected data and publish an
> anonymized analysis that will help leadership make decisions about the
> future of the process.
>
> We will collect anonymized information that we will analyze in two main
> ways:
>
>-
>
>Sentiments based on demographic information: these will tell us
>whether there are different needs and desires from different groups of
>people.
>-
>
>General needs and perceptions about decision-making in our technical
>spaces: This will help us understand what kind of decisions happen in
>the spaces, who is involved, and how to adjust our processes accordingly.
>
>
> Is the survey the only way to participate?
>
> The survey is the most important way for us to gather information because
> it helps us gather input in a structured manner. But it will not be the
> only way you can share your thoughts with us - we will have more
> information soon about upcoming listening sessions where you can talk with
> us live. In the meantime, you are always welcome to leave feedback on the
> talk page: https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Decision-Making_Process_Retrospective_and_Consultation_plan
>
>
> Where can I see more information?
>
> There are several places where you can find more information about the
> Technical Decision-Making Process Retrospective:
>
>
>-
>
>The original announcement about the retrospective from Tajh Taylor:
>https://lists.wikimedia.org/hyperkitty/list/wikitech-l@lists.wikimedia.org/thread/IF3OJARL4WYKJYQAS7NTP4FT72R626EG/
>
> 
>
>-
>
>The Technical Decision-Making Process general information page:
>https://www.mediawiki.org/wiki/Technical_decision_making
>
>-
>
>The Technical Decision-Making Process Retrospective on MediaWiki:
>https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision-Making_Process_Retrospective_and_Consultation_plan
>
> 
>
>-
>
>Phabricator ticket: https://phabricator.wikimedia.org/T333235
>
>
> How to contact the retrospective core team:
>
>
>-
>
>Write to the core team mailing list: tdf-retro-2...@lists.wikimedia.org
>
>-
>
>The Technical Decision-Making Process Retrospective on MediaWiki talk
>page: https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Technical_Decision-Making_Process_Retrospective_and_Consultation_plan
>
> 
>
>
>
> Thank you,
>
> Moriel, on behalf of the TDMP Retro Core Group
>
> Core group:
>
>-
>
>Moriel Schottlender (chair)
>-
>
>Daniel Kinzler
>-
>
>Chris Danis
>-
>
>Kosta Harlan
>-
>
>Temilola Adeleye
>
>
>
>
>
> --
> Moriel Schottlender (she/her )
> Principal Software Engineer
> Wikimedia Foundation https://wikimediafoundation.org/
>

[Wikitech-l] Re: Gerrit replica downtime (30 minutes) tomorrow Tue, 16 May 13:00-15:00 UTC

2023-05-15 Thread Brian Wolff
Does this include other uses of gerrit replica? Should extension
distributor be switched to main gerrit?

--
Brian

On Tuesday, May 16, 2023, Daniel Zahn  wrote:

> > This means codesearch will be affected (and won't get updated) and
> possibly even will be down during that time.
>
> We, at least in my team, would like to switch codesearch (and other
> clients) back to just use gerrit.wikimedia.org and not the replica
> directly.
>
> Just today we agreed to make a new ticket for specifically this,
> because soon we have to reimage the replica to bullseye and add more
> downtime.
>
> The reason we did the split in the past was to reduce load on the main
> gerrit server, but meanwhile the issue has been fixed in newer Gerrit
> versions, and just a few days ago we also switched to brand-new
> hardware.
>
> So now, if anything, it should be beefier than before, and even without
> that the split already seemed a thing of the past.
>
> And we pay for this with the replica becoming a second production
> system, with the need for downtimes. It also complicates fail-over
> scenarios, and in a way means there is never a passive host when we do a
> DC switch-over.
>
> So yea, I suggest we change the config of codesearch now to use the
> main gerrit unless you have concerns about that.
>
> On Mon, May 15, 2023 at 1:18 PM Amir Sarabadani 
> wrote:
> >
> > This means codesearch will be affected (and won't get updated) and
> possibly even will be down during that time.
> >
> > Best
> >
> > Am Mo., 15. Mai 2023 um 22:03 Uhr schrieb Tyler Cipriani <
> tcipri...@wikimedia.org>:
> >>
> >> Hello
> >>
> >> The read-only Gerrit replica[0] will be down for 30 minutes tomorrow
> (Tue, 16 May 2023) between 13:00–15:00 UTC[1] due to network switch
> upgrades in codfw row D[2].
> >>
> >> During this window, git reads from the replica will not work.
> >>
> >> To my knowledge, this affects bots which rely on the replica for git
> read operations.
> >>
> >> Apologies for any inconvenience.
> >>
> >> Tyler Cipriani (he/him)
> >> Engineering Manager, Release Engineering
> >> Wikimedia Foundation
> >>
> >> [0]: 
> >> [1]: 
> >> [2]: 
> >
> >
> >
> > --
> > Amir (he/him)
> >
>
>
>
> --
> Daniel Zahn 
> Site Reliability Engineer

[Wikitech-l] Re: Continuing development of mediawiki Extension AutoCreatePage

2023-05-03 Thread Brian Wolff
Hi,

Did you try making pull requests against
https://github.com/Universal-Omega/AutoCreatePage ? I think Universal Omega
took over the extension at some point.


Generally speaking, if the original author doesn't respond after a
reasonable time, the original extension does not work with modern
MediaWiki, and nobody else objects, it's considered acceptable to take over
the name of the extension. You should definitely post on the talk page of
the extension's page on mediawiki.org, try to email all the authors, and
file GitHub issues first, before doing so, in order to be sure the original
author has been notified.

--
Brian

On Wednesday, May 3, 2023, Oliver Tautz 
wrote:

> Hello,
>
>
> I use the extension AutoCreatePage
> (https://www.mediawiki.org/wiki/Extension:AutoCreatePage) in a project at
> my University. But it does not work
>
> in current MediaWiki versions as it was last updated . So I am trying to
> adapt it to work in MediaWiki 1.39.3. It almost works now and I'm
> considering releasing it but don't know how to proceed.
>
>
> I forked the github project
> (https://github.com/mkroetzsch/AutoCreatePage/pull/16) but got no
> response for now. There are also older pull requests not answered.
>
> Should I somehow replace the existing extension with my new one, or would
> it be better to make my fork a new extension with a different name? That
> does not feel right, because I use a lot of the code from the old
> extension.
>
>
> I hope this is the correct mailing list for this topic. Otherwise I would
> be thankful if you could direct me to the correct place to ask about this.
>
>
> Kind Regards,
>
> Oliver Tautz
>

[Wikitech-l] Re: Technical Decision Making Retro

2023-04-14 Thread Brian Wolff
Hi,

> The retro itself is not a governance process, and it won't involve making
decisions of consequence.

This seems kind of at odds with what is written on wiki. To quote the opening
paragraph of
https://www.mediawiki.org/wiki/Technical_decision_making/Technical_Decision-Making_Process_Retrospective_and_Consultation_plan
: "The general goal of this process is to revise the Wikimedia Technical
decision-making process...The process will run in two phases: consultation,
and design and implementation."

The on-wiki page seems to imply the primary output of this process will be
to decide what needs to be changed, and to make those changes. It doesn't
get much more consequential than that.

I think this is something that should be clarified at the get-go to align
expectations. It's really important to know what the expected
outputs are when designing a process like this.

--
Brian

On Friday, April 14, 2023, Tajh Taylor  wrote:

> Hi Kunal,
>
> Thank you for the feedback. To address your question about the
> relationship between this work and the Movement Strategy recommendation to
> establish a Technology Council, I will say that I do not believe they are
> addressing the same set of needs, but I can see how the work of each body
> will be related.  What is proposed as the purpose for the Technology
> Council is "to oversee the process of *developing new technical features*"
> (emphasis theirs).  Though the words in the recommendation do not describe
> it this way, I see this as a product development and management function.
> My perspective on the purpose of the TDMP is somewhat different: in the
> pursuit of ongoing product development or operations support, we need to
> make decisions in an engineering capacity, and those decisions have
> consequences for system architecture, operational outcomes, and paths of
> software development.  It isn't a way to make product decisions, and it
> does have to reflect and consider the operational responsibilities that
> reside with Foundation staff. That said, I can see a possible scenario
> where many of the same volunteers are interested in both functions, though
> I do not think that will be true of all volunteers, and it is likely not
> the case that Foundation staff participants would be the same in both
> cases, as our roles are more sharply defined. I will be watching to see how
> the Movement Strategy recommendation is moved forward, to continue to
> verify that my understanding remains accurate, and to see what impact that
> may have on our work.
>
> On the matter of the "Core team", I have given the task of planning and
> running the retro to staff members whose work I can direct.  The retro has
> been planned from the beginning to be inclusive of as wide a range of
> volunteer voices and perspectives as we can manage within the scope of the
> work.  In this way, it is not really different from any other operational
> process that the Foundation runs, where volunteer voices may be included
> but volunteers don't have the responsibility of managing the processes. The
> retro itself is not a governance process, and it won't involve making
> decisions of consequence. It is an information gathering process, to build
> a broad and comprehensive perspective on the TDMP.  Reflection on the
> information we have gathered should inform decisions we can make,
> in particular decisions to change or adjust the TDMP itself, but those
> decisions are not themselves part of the retro.
>
> I'm looking forward to your participation in the retro activities!
>
> Tajh Taylor (he/him/his)
>
> VP, Data Science & Engineering
>
> Wikimedia Foundation 
>
>
>
> On Thu, Apr 13, 2023 at 2:56 AM Kunal Mehta  wrote:
>
>> Hi Tajh,
>>
>> On 3/30/23 09:18, Tajh Taylor wrote:
>> > For quite some time now, we have experienced issues with the Technical
>> > Decision Making Process (TDMP). Volunteer contributors and staff have
>> > asked if we are still operating the Technical Decision Forum (TDF, the
>> > member body that participates in the TDMP). Communication about it from
>> > the Foundation has been inconsistent, and interest from the volunteer
>> > community in joining has been low. Some of our most senior engineers on
>> > Foundation staff have expressed that the process is flawed, doesn’t
>> > create room for discussion about the technical issues surrounding a
>> > decision, and doesn’t ensure participation by all stakeholders who may
>> > be affected by the decision. Suffice it to say, the current state of
>> > affairs leaves many participants wanting more.
>>
>> I'm glad to see this stated in the open, I think your summary is a
decent starting point for why the TDF never worked. I do think a retro or
>> post-mortem of the TDF from this perspective is needed.
>>
>>  From the talk page:
>>  > The intention of the retrospective is to understand the pain points
>> and the areas to improve the current process.
>>
>> What from the TDF is worth salvaging to the 

[Wikitech-l] Reflecting on my listening tour

2023-04-14 Thread Brian Wolff
Perhaps I am hyperfocused on technical debt in the sense of improving the
abstractions used in MediaWiki. The phrasing around sustainability
especially leads me in that direction. However, technical debt is certainly
a broad concept and can mean a lot of things.

The common thread in the examples you cited seems to be things that have
fallen through the ownership cracks. I'm not sure it's the case that
engineers aren't incentivized to fix these, so much as there are no natural
engineers to be incentivized due to team structure (Scribunto is an
exception, but I would disagree with that task's inclusion for reasons that
get off topic). On the other hand, perhaps a different incentivization
structure would encourage people to branch out more.

I think it is especially telling that 3 (or even 4) of these tasks are
multimedia related, given that the WMF hasn't had a multimedia team in a
very long time [SDC does not count], and definitely not one focused on the
backend. There are quite a few multimedia related areas in desperate need
of love (it is 2023 and video uploads are limited to 4 GB with the flakiest
upload process known to man).


It was also pointed out to me on irc, that many critical workflows in the
community depend on toolforge tools that have very limited volunteer
maintainership. A sort of https://xkcd.com/2347/ situation. Just because
they're not "production" we often pretend they don't exist. Regardless of
how we label them, in the end it doesn't make a difference to the end user,
and the fragility of that ecosystem is a form of technical debt that is
often overlooked.


So I guess it all depends on what is meant by "technical debt".
--
Brian

On Friday, April 14, 2023, Andre Klapper  wrote:

> On Thu, 2023-04-13 at 20:06 -0700, bawolff wrote:
> > > "I think there are lots of promising opportunities to incentivise
> > > people to pay off technical debt and make our existing stack more
> > > sustainable. Right now there are no incentives for engineers in
> > > this regard."
> >
> > Interesting. Personally to me, it can sometimes feel like we never
> > stop talking about technical debt. While I think paying off technical
> > debt is important, at times I feel like we've swung in the opposite
> > direction where we are essentially rewriting things for the sake of
> > rewriting things.
>
> "Technical debt" spontaneously brings the following items to my little
> mind. They should not be about rewriting but rather "maintenance":
>
>  * librsvg for SVG rendering is a five year old version:
>https://phabricator.wikimedia.org/T193352 /
>https://phabricator.wikimedia.org/T265549
>  * Graph extension on old Vega version 2.6.3: see subtasks of
>https://phabricator.wikimedia.org/T292341
>  * Scribunto extension on old Lua version 5.1 (last 5.1.x release was
>in 2012): https://phabricator.wikimedia.org/T178146
>  * 3D extension on a five year old three.js library in
>https://phabricator.wikimedia.org/T277930#7636129
>  * Removing the OpenStackManager extension from wikitech.wm.org:
>https://phabricator.wikimedia.org/T161553
>  * Removing WVUI from MediaWiki
>core: https://phabricator.wikimedia.org/T310244
>  * Replacing jsduck with JSDoc3 across all Wikimedia code bases:
>https://phabricator.wikimedia.org/T138401
>  * Undeploy VipsScaler from Wikimedia wikis:
>https://phabricator.wikimedia.org/T290759
>  * https://phabricator.wikimedia.org/T151393 (a non-public task)
>
> This is not a complete list. Plus there are also separate "waiting for
> someone to make a decision" and "improving communicating & documenting
> already-made decisions" categories which would be different lists.
>
> Of course there might be valid reasons not to look into some of this
> technical debt (other higher priorities, high risk, complexity, etc).
>
> Cheers,
> andre
>
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/

[Wikitech-l] Re: About "Final view structure for view `page_view`"

2023-03-24 Thread Brian Wolff
As far as I know, MediaWiki core has never used views. It is possible some
extension made it, but it would be an unusual thing to do (for example, the
UserSnoop extension makes a view, but not one named that). Or maybe a
sysadmin made it for their own purposes at some point.
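
For anyone unfamiliar with views, here is a minimal sketch of the column-restricting kind described below, using SQLite via Python's stdlib (the table and view here are cut-down stand-ins for illustration, not MediaWiki's real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Cut-down stand-in for MediaWiki's page table.
    CREATE TABLE page (
        page_id      INTEGER PRIMARY KEY,
        page_title   TEXT,
        page_counter INTEGER,
        page_random  REAL
    );
    -- A view exposing only two columns, e.g. for a restricted account.
    CREATE VIEW page_view AS
        SELECT page_title, page_counter FROM page;
""")
conn.execute("INSERT INTO page VALUES (1, 'Main_Page', 42, 0.5)")

rows = conn.execute("SELECT * FROM page_view").fetchall()
print(rows)  # [('Main_Page', 42)] -- title and counter, nothing else
```

On MariaDB itself, `SHOW CREATE VIEW page_view` prints the stored definition, which can help trace where a stray view came from.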

--
Brian

On Friday, March 24, 2023, [[kgh]]  wrote:

> Hi Taavi,
>
> Thanks for the pointer. I got this automatically when dumping the database
> of a rather old version of MediaWiki. Was something like this in pre-1.23
> MediaWiki? It would be a surprise, but one never knows.
>
> Cheers, Karsten
>
>
> Am 24.03.23 um 18:11 schrieb Taavi Väänänen:
>
>> That's a MariaDB view[0] definition. Looks like that one might have been
>> used to give someone access to the `page`.`page_title` and
>> `page`.`page_counter` fields without giving them access to the rest of the
>> page data. Hard to say anything more specific without knowing where you've
>> got that from.
>>
>> [0]: https://mariadb.com/kb/en/views/
>>

[Wikitech-l] Re: Use case for sha1 in revisions

2023-03-23 Thread Brian Wolff
This kind of sounds like a non-answer, but it's mostly useful if you want a
hash of the revision. I don't think MediaWiki core really uses it, but it
can be useful for quickly detecting duplicate revisions. I think it is
primarily for external users.
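
For those external users: if I recall correctly, the stored value is the SHA-1 of the revision text, base-36 encoded and zero-padded to 31 characters (worth double-checking against Manual:Revision_table). Roughly:

```python
import hashlib

BASE36 = "0123456789abcdefghijklmnopqrstuvwxyz"

def rev_sha1(text: str) -> str:
    """SHA-1 of the revision text, base-36 encoded and zero-padded to
    31 characters -- the rev_sha1 format, as I recall it."""
    n = int(hashlib.sha1(text.encode("utf-8")).hexdigest(), 16)
    digits = ""
    while n:
        n, rem = divmod(n, 36)
        digits = BASE36[rem] + digits
    return digits.rjust(31, "0")

# Identical wikitext -> identical hash, which is what makes the field
# handy for spotting duplicate revisions without fetching full content.
```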

--
Brian

On Wednesday, March 22, 2023, Bináris  wrote:

> Hi,
> can someone please tell me a use case for sha1 in revisions?
> https://www.mediawiki.org/wiki/API:Revisions
>
> (Of course, I tried first on talk page.)
> --
> Bináris
>

[Wikitech-l] Re: March 2023 Datacenter Switchover

2023-03-01 Thread Brian Wolff
You should probably just file a bug.

It's certainly plausible it had something to do with the data center
switch, but it could just as easily be unrelated. It requires someone to
investigate which part of the system failed (job queue? Varnish? Swift?),
which would then lead to a root cause. It's pretty much impossible to say
without further investigation, and speculating on-list is probably not
helpful.

--
Bawolff

On Wednesday, March 1, 2023, Igal Khitron  wrote:

> Hello.
> Is the new files bug happening because of the datacenter switch? And if it
> is, will you fix it?
> The story: if you reupload a file, local or on Commons, the new version
> does not get a new thumbnail, and the old one is shown in articles. The
> only way to see the new version is opening the file in the Media:
> namespace. I waited for hours. So you can't edit files any more. I read
> the Commons reupload log a bit, and it looks like it's not just my
> problem. The regular things (clear cache, purge, null edit) do not help.
> I think there should be a phab task created, but I'd like to know your
> answer first.
> Thank you.
> Igal (user:IKhitron)
>
>
> בתאריך יום ד׳, 1 במרץ 2023, 19:26, מאת Giuseppe Lavagetto ‏<
> glavage...@wikimedia.org>:
>
>> Specifically:
>>
>> if we measure read-only time as "an editor can't start an edit because
>> wikis are read only", then the read-only time is 119s;
>> if we measure it by the last timestamp of an edit being saved, it is 94
>> seconds.
>>
>> As Amir explained, we leave some room for propagation of the MediaWiki
>> read-only mode (about 10-15 seconds) and for in-flight edits (another 10
>> seconds) before we set the databases to read-only as well.
>>
>> I think 2 minutes of read-only for such a complex operation is a good
>> balance between reasonable change safety and reduction of impact; we could
>> reduce the read-only time by another 10-20 seconds with some more
>> aggressive moves (like clearing the DNS recursor caches) but I don't think
>> there's a big value there at this point.
>>
>> I'll add: if anyone is interested in knowing more and they're coming to
>> the hackathon, I'll be happy to make an impromptu session about how we
>> handle this procedure.
>>
>> Cheers,
>>
>> Giuseppe
>>
>> On Wed, Mar 1, 2023 at 6:16 PM Amir Sarabadani 
>> wrote:
>>
>>> It's a bit complicated.
>>> When SRE sets the read-only mark, they start counting from that time, and
>>> it starts propagating, which takes a while to actually reach all users;
>>> some users might still see the RO error while some actual writes are
>>> happening somewhere else, because the cache is not invalidated yet (I think
>>> it has a TTL of 5 seconds, but I need to double-check). We still consider
>>> that as RO time because it's affecting users regardless.
>>>
>>> HTH
>>>
>>>
>>>
>>> Am Mi., 1. März 2023 um 18:06 Uhr schrieb Dušan Kreheľ <
>>> dusankre...@gmail.com>:
>>>
 Clément Goubert and everybody,

 I analyzed https://stream.wikimedia.org/v2/stream/recentchange and I
 got different results.

 Last change (before migration): 2023-03-01T14:00:30
 First change (after migration): 2023-03-01T14:02:05
 Result: the downtime (14:00:31 to 14:02:05) is 94s.

 I think this analysis is more authoritative. I think it measures based
 on something like REQUEST_TIME in PHP.

 Dušan Kreheľ


 2023-03-01 16:30 GMT+01:00, Clément Goubert :
 > Dear Wikitechians,
 >
 > Dear colleagues,
 >
 > The switchover process requires a *brief read-only period for all
 > Foundation-hosted wikis*, which started at *14:00 UTC on Wednesday
 March
 > 1st*, and lasted *119 seconds*. All our public and private wikis
 continued
 > to be available for reading as usual. Users saw a notification of the
 > upcoming maintenance, and anyone still editing was asked to try again
 in a
 > few minutes.
 >
 > As a side note, with other SREs we have been trying to discern the
 effect
 > of the Switchover in many of the graphs we have to monitor the
 > infrastructure in https://grafana.wikimedia.org during Switchover.
 In many,
 > it's impossible to tell the event. The most discernible graph we have
 is of
 > the edit rate, which can be viewed here: Grafana
 > .
 > Can you spot it? See the attached picture to help:
 >
 > I am extending thanks to everyone that was also present on IRC,
 helping out
 > in any way that they could. Thanks as well to Community Relations who
 > notified communities of the read-only window ahead of time. And
 thanks to
 > everyone that contributed to MultiDC
 > ,
 > especially Performance for pushing forward with the last parts of it,
 > allowing us to perform this Switchover faster and with more
 confidence 

[Wikitech-l] Re: [Proposal] Disable setting the "Lowest" Priority value in Phabricator

2023-02-24 Thread Brian Wolff
On Friday, February 24, 2023, Dan Garry (Deskana)  wrote:

> On Fri, 24 Feb 2023 at 11:20, Andre Klapper 
> wrote:
>
>>  * Personally I also assume Lowest priority is sometimes used instead
>>of honestly declining a task (means: "this is not a good idea"[5]).
>>But of course that is rather hard to prove.
>>
>
> This is anecdotal, but when I was a product manager at the WMF, I did this
> sometimes. As is true in any company or project, there will always be
> tickets that contain valid bugs or requests, but the reward per unit effort
> makes them not worth fixing. I often tried to close these as declined to
> reflect that reality, but not infrequently someone outside the team would
> reopen the ticket. The path of least resistance to stop team workboards
> being filled with tickets that would never be actioned was to mark them as
> lowest priority, and then use a filter to remove those tickets from our
> views of the board.
>
> I don't particularly have a view one way or the other on removal of the
> "lowest" priority. The workflow I described above wouldn't really change;
> "low" could just become the new "lowest", but people wouldn't find it
> demoralising, I suppose?
>
> Dan
>

I feel this is a problem with Phabricator being used as both a project
management tool for teams and as a bug tracker, when those are different
things. Certainly it's reasonable for teams to have bugs they don't think
are worth fixing, and a way for them to ignore said bugs, but declining them
hinders people who are trying to track what the known bugs are, resulting in
duplicated effort. Not to mention external contributors may want to fix
these bugs, and declining them hinders this. I personally think the best
solution is to have separate workboards for team management vs. MediaWiki
components.

My personal definition of lowest vs declined:
Declined: if you wrote a patch for this it would be rejected. Fixing this
is a bad idea.
Lowest: nobody cares and nobody is going to work on it. However, if you
care, feel free to write a patch and it will get reviewed.

Generally I find the priority field an endless source of drama. The way
priority and severity get conflated is problematic; the way it is global
instead of per-workboard is problematic. The way non-devs think it is
prescriptive instead of descriptive is very problematic. Sometimes I wonder
if it should be removed entirely, although maybe that is too far.

If I could change the priority field, I would probably reduce it to:
* unbreak now
* work on next (or "urgent")
* normal
* no intention to fix (aka lowest)

I think those are the only useful statuses. I would prefer statuses to be
more descriptive. Nobody knows what "high" means; everyone knows what
"unbreak now" means. I would also suggest being really blunt with the names.
It's much more demotivating to give people false hope.

--
Bawolff

[Wikitech-l] Re: Can't claim on Phabricator

2023-02-13 Thread Brian Wolff
I also have a memory of there previously being a one-click button in the
sidebar to claim a task. Maybe it was a feature that got removed, or perhaps
it's just a false memory.

--
Bawolff

On Monday, February 13, 2023, Andre Klapper  wrote:

> On Thu, 2023-02-02 at 17:43 +0100, Bináris wrote:
> >
> > Jaime Crespo ezt írta (időpont: 2023. febr. 2., Cs, 17:10):
> > > to claim or assign a task, you have to go to a particular task,
> > > then at the end click on the "Add action..." list, then "Assign /
> > > Claim" and write your account name or someone else's (it will
> > > autocomplete).
> >
> > Oh, that's pretty hidden in hell! As far as I remember, last time I
> > was looking for it, it was on the right sidebar. At least claiming.
>
> There are two ways to assign a task in Phab: either via "Edit Task" in
> the sidebar, or via the "Add Action" dropdown above the comment field.
>
> See also https://www.mediawiki.org/wiki/Phabricator/Help
>
> Cheers,
> andre
> --
> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
> https://blogs.gnome.org/aklapper/

[Wikitech-l] Re: Contributing through block bases programming to the Wikimedia Codebase

2023-02-08 Thread Brian Wolff
I don't think this is suitable for the MediaWiki codebase, but it sounds
like something Wikifunctions would be interested in. See
https://meta.wikimedia.org/wiki/Abstract_Wikipedia

--
Bawolff

On Wednesday, February 8, 2023,  wrote:

> Hello,
>
> I am interested in programming and I asked a question about potential use
> cases for block based programming within the Wikimedia projects in the last
> year on this Mailinglist. In the last months I tried different things to
> offer alternative ways of creating source code.
>
> I like, for example, the programming language COBOL, as it is possible to
> write source code with only a few brackets, and it is quite similar to short
> English sentences. I have experience with the programming languages R and
> COBOL. Do you think it is possible to collect the most used code snippets
> and create blocks for them? I wrote a script in the
> programming language R that enables the conversion of blocks to code.
> https://gitlab.wikimedia.org/hogue/block-to-code/-/tree/main/ There are
> two folders and I am still experimenting and changes are possible to make
> it more useful for code generation. From my point of view it could enable
> more people to contribute to the Wikimedia Codebase if it is possible to
> create programs through combining blocks in a visual programming language
> like Snap! or Scratch. I also tried it the other way, converting source
> code to blocks in the visual programming language Snap!, and wrote a script
> to convert spreadsheet functions to source code in the programming language
> R. Do you think the use of visual programming languages could
> enable more people to contribute to the Wikimedia codebase? I am interested
> in extending the existing script to help people contribute in such a
> way. Another advantage is the possibility to translate the blocks. This
> makes it possible to contribute even without an understanding
> of English.
>
> Please tell me what you think about my thoughts and if and how I could do
> it from your point of view.
>
> Hogü-456

[Wikitech-l] Re: Announcing: Patch review board

2023-02-02 Thread Brian Wolff
On Thursday, February 2, 2023, Brett Cornwall 
wrote:

> On 2023-02-02 11:33, Zoran Dori wrote:
>
>> Amazing to hear about this, I'm hoping that this will improve the number
>> of
>> merged patches which are stuck on waiting for review.
>>
>
> It also might be important to keep contributors motivated - nothing is
> more demoralizing than having a patch sitting for years.
>

Indeed. I've certainly felt that way at times.

--
Bawolff

[Wikitech-l] Announcing: Patch review board

2023-02-01 Thread Brian Wolff
Yesterday there was a conversation about code review on IRC, and among other
things, how sometimes patches can get "stuck".

I had an idea for a way to improve things. I'm not sure if it is a good
idea, but there's only one way to find out.

So without further ado, announcing the Code Review Patch Board:
https://www.mediawiki.org/wiki/Code_review/patch_board

In short: each person is allowed to list one of their patches on the board
that they would really like to see reviewed. You can only list one patch at
a time, and it should be a patch that you have been unable to get reviewed
for at least a week through normal means. See the page for the full
list of guidelines.

I encourage people to give it a try. Add a patch you wrote that you cannot
get a review for. Or if you have +2 rights, try giving some love to these
underloved patches.

I would also love to hear feedback on the general idea as well as the
current guidelines.

To repeat, the url is:
https://www.mediawiki.org/wiki/Code_review/patch_board

Thanks,
bawolff

[Wikitech-l] Re: Avoid invisible characters in page titles

2023-01-17 Thread Brian Wolff
Even in English, you still have emoji that use ZWJ (zero-width joiner)
characters, i.e. sequences held together by "invisible" characters. There
are all sorts of control characters in Unicode that usually do nothing but
sometimes do something.

There may be some invisible characters that make sense to strip or
normalize. Indeed, we already do so for some. But one has to be very careful
about these things.
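To make the "be careful" point concrete, here is a small Python sketch (my own illustration, not MediaWiki's actual title normalization) that reports invisible format characters instead of blindly stripping them, plus a heuristic along the lines of the exemptions mentioned later in this thread (leading, trailing, or doubled invisibles):

```python
import unicodedata

def invisible_chars(title: str):
    # Unicode general category "Cf" (format) covers most "invisible"
    # characters: U+200D ZERO WIDTH JOINER glues emoji sequences, while
    # U+200C ZERO WIDTH NON-JOINER is required Persian orthography.
    return [
        (i, f"U+{ord(c):04X}", unicodedata.name(c, "<unnamed>"))
        for i, c in enumerate(title)
        if unicodedata.category(c) == "Cf"
    ]

def looks_suspicious(title: str) -> bool:
    # Flag (rather than strip) the clearly wrong shapes: an invisible
    # character at the start or end, or two invisibles in a row.
    if not title:
        return False
    cf = [unicodedata.category(c) == "Cf" for c in title]
    return cf[0] or cf[-1] or any(a and b for a, b in zip(cf, cf[1:]))
```

Note that a legitimate Persian word like "می‌خواهم" (with a ZWNJ in the middle) triggers the report but not the suspicion heuristic, which is exactly the distinction a cleanup script would need.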

-
Bawolff

On Tuesday, January 17, 2023, Amir Sarabadani  wrote:

> Disallowing invisible characters or cleaning them is a bad idea.
>
> Invisible characters are actually heavily used in many languages including
> Persian (and part of the official manual of style of the language taught in
> schools) it is downright wrong to check and fix those in many wikis in
> those languages.
>
> Also many wikis have titles in other languages such as wiktionaries or
> redirects in different languages (for example:
> https://en.wikipedia.org/w/index.php?title=%D8%AA%D9%87%D8%B1%D8%A7%D9%86=no),
> which means removing ZWNJ or similar characters would be unacceptable in
> English Wiktionary or English Wikipedia as well.
>
> There are some exemptions though: two invisible characters in a row are
> wrong, as is an invisible character at the beginning or end. But all of
> these are rules for the Persian language, and another language might
> actually allow them as well.
>
> Best
>
> Am Di., 17. Jan. 2023 um 12:48 Uhr schrieb Martin Domdey <
> dr.ta...@gmail.com>:
>
>> Thank you,
>>
>> it seems that there's nobody working on it anymore, right?
>>
>> Kind regards,
>> Martin ...
>>
>>
>>
>> Am Di., 17. Jan. 2023 um 12:28 Uhr schrieb Andre Klapper <
>> aklap...@wikimedia.org>:
>>
>>> Hi,
>>>
>>> On Tue, 2023-01-17 at 12:03 +0100, Martin Domdey wrote:
>>> > isn't it better to avoid invisible characters in page titles
>>> > while creating the pages?
>>> >
>>> > Please look here, there has been problems with invisible characters
>>> > working with it when parsing or page linking those page titles with
>>> > invisible unicode
>>> > characters: https://de.wikipedia.org/wiki/Benutzer_Diskussion:Wurgl#L
>>> > iste_der_Biografien/Ci
>>>
>>> See also
>>> https://en.wikipedia.org/wiki/Wikipedia:Village_pump_%
>>> 28technical%29#UTF-8_ZERO_WIDTH_SPACE_in_page_title
>>>
>>> > Instead of this there will never be a problem when invisible
>>> > characters within the page title name will be deleted when
>>> > creating the page.
>>> >
>>> > What do you think about it and what technical approaches do
>>> > already exist? How are LTR and RTL marks dealt if creating pages with
>>> > them?
>>>
>>> See https://phabricator.wikimedia.org/maniphest/query/GDxAs4QdEDTG/#R
>>> for related bugs, and a ticket about improving cleanupTitles.php.
>>>
>>> Cheers,
>>> andre
>>> --
>>> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
>>> https://blogs.gnome.org/aklapper/
>
>
>
> --
> Amir (he/him)
>
>

[Wikitech-l] Re: Paid collaboration for a custom MediaWiki Skin

2022-11-21 Thread Brian Wolff
Hi,

This list isn't for job postings. There is a list of people who do that
sort of thing at
https://www.mediawiki.org/wiki/Professional_development_and_consulting

--
Brian

On Monday, November 21, 2022, Busayo  wrote:

> Hello,
>
>
>
> We want to create a custom skin for our game's MediaWiki site
> . The final look should be
> more reflective of a game. Something akin to EveSkin
> . Is there anyone available to
> help create this? We have no idea how much it will cost, so we are open to
> quotes.
>
> We are ready to start immediately after we finish the detail, as we hope
> to have the skin implemented by the 1st of December.
>
> Thanks.
>
> Live long and prosper,
>
> Busayo Ajao
>
> Marketing & Communications Lead
>

[Wikitech-l] Re: Cookie-based CSS toggles for logged-out readers?

2022-10-10 Thread Brian Wolff
Actually, logged-in users are much more expensive to serve than logged-out
ones, so it depends what percentage of users we are talking about.

In any case, a cache-busting cookie is just as costly as being logged in
(since being logged in also involves a cache-busting cookie).

On Saturday, October 1, 2022, Samuel Klein  wrote:

> How about a one-click 'account creation + customization' option, which
> updates the skin, rerenders the page, and asks the reader to finish setting
> username + password to save the pref?
>
> That seems cache cheap and a fine tradeoff: more readers with accounts
> would be good on a lot of levels
>
> 
>
> On Fri., Sep. 30, 2022, 4:59 p.m. Brian Wolff,  wrote:
>
>> Basically, if you are OK with it only taking effect after page load (so
>> users first see the original page, then the new version), it is trivial.
>> However, a flash of wrongly styled content is a really bad user experience.
>>
>> To do it without the flash of the wrong version of the toggle is difficult
>> due to the way our infrastructure is currently set up. Nothing
>> insurmountable in principle, but high effort, and it involves some
>> tradeoffs that seem not worth it in context.
>>
>> --
>> Brian
>>
>> p.s. for the avoidance of doubt, this is my personal opinion and not an
>> "official" answer in any capacity.
>>
>> On Friday, September 30, 2022, Samuel Klein  wrote:
>>
>>> Dear WT,
>>>
>>> The perennial discussion about ways to provide logged-out users with
>>> persistent customization of their reading experience in the browser has
>>> cropped up again in the context of the pending deployment of Vector 2022.
>>>
>>> Can a cookie-based width toggle
>>> <https://en.wikipedia.org/wiki/Wikipedia:Requests_for_comment/Deployment_of_Vector_(2022)#Alt_proposal:_gradual_changeover_+_width_toggle>
>>> be offered without splitting the cache or otherwise making the toggler
>>> regret their tog?
>>>
>>> The answers currently range from "*shouldn't be very complicated or
>>> hacky, just toggling a class*..." and "*can be delivered to logged-out
>>> users on top of the cached parser output*.."  to "*impossible*".
>>>
>>> Could someone clarify the challenges and costs of trying to toggle css
>>> classes in this way?
>>>
>>> Warmly, SJ
>>>
>>> --
>>> Samuel Klein  @metasj   w:user:sj  +1 617 529
>>> 4266
>>>
>
>

[Wikitech-l] Re: TDF is looking for community representatives

2022-10-10 Thread Brian Wolff
So I think you would want a compromise approach with the email. Having the
email be 20 pages long would be bad, but it's still possible to have more of
a "hook" without turning it into a massive wall of text.

That said, I think the lack of detail in the email is not the main problem
here.

To attract candidates you basically have to answer the following questions:

* What exactly would candidates be doing, practically speaking? The process
is very opaque from the outside, and most people aren't going to sign up
for it blind.
* Why does the committee want community members? What value does it hope to
get from us?
** More specifically, to put it bluntly: are we being used as tokens to give
the appearance of community consultation, or are people legitimately
interested in what we have to say? Quite frankly, the current appointment
of an anon, or listing Fandom as a "volunteer organization", does feel a bit
dehumanizing, as if the committee doesn't care who the community
representatives are, as long as they can check that box.
* What do committee members get out of the process? Just because we are
volunteers doesn't mean we do things purely out of the goodness of our
hearts.


These questions weren't really answered anywhere, which I expect reduced
interest.

For me personally, some other factors that came to mind:

* This is a committee of something like 40 people. I've never seen a group
of more than 5 people that is actually useful. It's unclear why you would
want a committee of half the world instead of just an open comment period.
(Actually, I suspect the answer is to force WMF staff to actually
participate by having a directly responsible individual, but this approach
doesn't mesh well with the community.)
* Opinions of people currently on the committee seem quite mixed. Some are
positive, but some are very negative. When WMF staff behind closed doors
describe their experience on the committee in such negative terms, it
really puts a damper on enthusiasm.
* The opaqueness of the committee makes it unclear how feedback is actually
taken and used. There is an underlying fear that the committee is a
long-winded process where decisions that have already been made get a rubber
stamp of approval, regardless of any feedback brought up.
* Inconsistent updates and responses to feedback on-wiki make the committee
look dead. This puts a damper on interest.


--
Brian

On Monday, October 10, 2022, Erica Litrenta  wrote:

> (Sorry to "hijack" the thread, I am not personally involved in TDF but
> since I was the original "messenger", I'm interested in learning more about
> Daniel's POV.
> The original email linked to https://www.mediawiki.org/
> wiki/Technical_decision_making/Community_representation .
> While I'm well aware that info a click away is not optimal, I'm definitely
> more against walls of text that may be hard to understand for non-native
> readers.
> That page was marked for translation instead, and among other things, it
> offered exactly the process you are describing, and the second email asked
> specifically for recommendations.
> We had /also/ asked for recs from a few dozen colleagues, and none of the
> people pinged gave their availability.
> Interested to hear what could have been done differently.)
>
> On Thu, Oct 6, 2022 at 1:06 PM Daniel Kinzler 
> wrote:
>
>> Am 06.10.2022 um 08:52 schrieb Linh Nguyen:
>>
>> Kunal,
>> I hear you but we only have 3 people who actually put the effort into
>> applying for the position.  We are appointing people who are at least
>> trying to help.  If you want to help in the process please feel free to put
>> your name on the list.
>>
>> The original mail doesn't really make it clear what impact one might have
>> by joining, or what would be expected of a member. Asking people to click a
>> link for details loses most of the audience already.
>>
>> One thing that has worked pretty well in the past when we were looking
>> for people to join TechCom was to ask for nominations, rather than
>> volunteers. We'd then reach out to the people who were nominated, and asked
>> them if they were interested.  Self-nominations were of course also fine.
>>
>> Another thing that might work is to directly approach active volunteer
>> contributors to production code. There really aren't so many really active
>> ones. Ten, maybe.
>>
>> --
>> Daniel Kinzler
>> Principal Software Engineer, Platform Engineering
>> Wikimedia Foundation
>>
>
>
>
> --
>
> --
>
>
> Erica Litrenta (she/her)
>
> Senior Manager, Community Relations Specialists (Product)
>
> Wikimedia Foundation 
>
>

[Wikitech-l] Re: TDF is looking for community representatives

2022-10-05 Thread Brian Wolff
So I see that User:Grunny, User:Rtnf and Dessalegn Yehuala were appointed.

Respectfully, given that the criterion was someone with +2 rights, I would
question the decision to appoint Dessalegn Yehuala, who not only has never
contributed to MediaWiki, but also appears never to have edited any
Wikimedia project, or even to have an account.

Brian


On Thu, Oct 6, 2022 at 1:58 AM Technical Decision Forum Support <
tdfsupp...@wikimedia.org> wrote:

> Hi Brian,
>
> We have our representatives but we are always looking for some more as we
> rotate throughout the year.  My apologies for the late updates to the
> forum.  The lists are here
> <https://www.mediawiki.org/wiki/Technical_decision_making/Forum>.  The
> community representatives have completed one feedback to the IP Masking
> Proposal and next week we should have "Modern user interfaces for all
> users" proposal out to the representatives for their feedback.  You can see
> it on the TDF <https://phabricator.wikimedia.org/project/board/5179/>
> board.
>
> Thank you,
> TDF
>
>
> On Wed, Oct 5, 2022, 5:46 PM Brian Wolff  wrote:
>
>> Out of curiosity, what happened? Aug 12 has come and gone. More to the
>> point, I was wondering if TDF is even still a thing?
>>
>> --
>> Brian
>>
>> On Tuesday, August 2, 2022, Erica Litrenta 
>> wrote:
>>
>>> Hi everyone.
>>>
>>> This is a reminder that the search for community representatives for the
>>> Technical Decision Forum is still OPEN.
>>>
>>> There are a few weeks left to apply and/or recommend someone who would
>>> be a great fit for the role (learn more at
>>> https://www.mediawiki.org/wiki/Technical_decision_making/Community_representation
>>> ).
>>>
>>> Please let us know if there are any questions we can answer before you
>>> submit your application, or a friend's name! Email us at
>>> tdfsupp...@wikimedia.org by Aug 12, 2022.
>>>
>>> Looking forward to hearing from you soon…
>>>
>>> The Technical Decision Forum.
>>>
>>> On Tue, Jul 12, 2022 at 11:17 AM Erica Litrenta 
>>> wrote:
>>>
>>>> *(This email may not be written in your native language, and/or may be
>>>> a crosspost: our apologies.)*
>>>>
>>>>
>>>> Greetings, everyone.
>>>>
>>>> The Technical Decision Forum, or TDF (1), is seeking community
>>>> representatives. Please visit the TDF Community Representation (2)
>>>> page for more information, and email tdfsupp...@wikimedia.org by Aug
>>>> 12, 2022 to be considered for selection.
>>>>
>>>>
>>>> Don't worry about the requirements listed on wiki, they are not set in
>>>> stone; just provide your name, name of the group you "represent" if any,
>>>> and a short sentence or two explaining your interest.
>>>>
>>>>
>>>> For any questions, please post at *MediaWiki.org (3)*, or on the
>>>> related Movement Strategy Forum topic (4), or simply email
>>>> tdfsupp...@wikimedia.org.
>>>>
>>>> Thanks,
>>>>
>>>> The Technical Decision Forum.
>>>>
>>>> (1) https://www.mediawiki.org/wiki/Technical_decision_making/Forum
>>>> (2)
>>>> https://www.mediawiki.org/wiki/Technical_decision_making/Community_representation
>>>> (3) https://www.mediawiki.org/wiki/Talk:Technical_decision_making/Forum
>>>> (4)
>>>> https://forum.movement-strategy.org/t/technical-decision-forum-tdf-is-looking-for-community-representatives/1082
>>>>
>>>> --
>>>>
>>>> --
>>>>
>>>>
>>>> Erica Litrenta (she/her)
>>>>
>>>> Senior Manager, Community Relations Specialists (Product)
>>>>
>>>> Wikimedia Foundation
>>>> <https://meta.wikimedia.org/wiki/User:Elitre_(WMF)>
>>>>
>>>>
>>>
>>> --
>>>
>>> --
>>>
>>>
>>> Erica Litrenta (she/her)
>>>
>>> Senior Manager, Community Relations Specialists (Product)
>>>
>>> Wikimedia Foundation <https://meta.wikimedia.org/wiki/User:Elitre_(WMF)>
>>>
>>>
>>>

[Wikitech-l] Re: Cookie-based CSS toggles for logged-out readers?

2022-09-30 Thread Brian Wolff
Basically, if you are OK with it only taking effect after page load (so
users first see the original page, then the new version), it is trivial.
However, a flash of wrongly styled content is a really bad user experience.

To do it without the flash of the wrong version of the toggle is difficult
due to the way our infrastructure is currently set up. Nothing
insurmountable in principle, but high effort, and it involves some tradeoffs
that seem not worth it in context.

--
Brian

p.s. for the avoidance of doubt, this is my personal opinion and not an
"official" answer in any capacity.

On Friday, September 30, 2022, Samuel Klein  wrote:

> Dear WT,
>
> The perennial discussion about ways to provide logged-out users with
> persistent customization of their reading experience in the browser has
> cropped up again in the context of the pending deployment of Vector 2022.
>
> Can a cookie-based width toggle
> 
> be offered without splitting the cache or otherwise making the toggler
> regret their tog?
>
> The answers currently range from "*shouldn't be very complicated or
> hacky, just toggling a class*..." and "*can be delivered to logged-out
> users on top of the cached parser output*.."  to "*impossible*".
>
> Could someone clarify the challenges and costs of trying to toggle css
> classes in this way?
>
> Warmly, SJ
>
> --
> Samuel Klein  @metasj   w:user:sj  +1 617 529 4266
>

[Wikitech-l] Reminder about WMF board elections - People who develop mediawiki extensions are eligible to vote

2022-08-29 Thread Brian Wolff
I haven't seen too much discussion about this, but I just wanted to remind
everyone that people who participate in MediaWiki development (including
many third-party extensions) are eligible to vote in the current Wikimedia
Foundation Board of Trustees election, even if you don't edit any Wikimedia
wiki. You just have to have made one commit. Commits to third-party
MediaWiki extensions not used by Wikimedia wikis count (including some
hosted at GitHub). The Wikimedia Foundation is the largest contributor to
MediaWiki core, and the trustees control the direction of the Wikimedia
Foundation. Hence your vote (indirectly) determines the direction of the
MediaWiki software!

There is an unofficial list of eligible developers at
https://i-want-to-vote.toolforge.org . I encourage anyone who participates
in development of software in the MediaWiki ecosystem to check if they are
on the list. Even if you are not on that list, you may still be eligible.

You can learn more about the candidates at
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_elections/2022/Candidates
. In particular, many of you might already know legoktm, who is one of the
candidates and is very active in the MediaWiki developer community. You can
vote now up until September 6 (however, some eligible people must register
to vote before September 2). The link to vote is
https://meta.wikimedia.org/wiki/Special:SecurePoll/vote/393 . People who
are eligible to vote due to software development must pre-register via
email before that link works for them, and they must do so 4 days before
the end of the election. See https://i-want-to-vote.toolforge.org for details.

The full official list of eligibility criteria is at
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_elections/2022/Voter_eligibility_guidelines#Developers
. The key point for software developers is that you only need to have made
one commit between January 5 and July 5, to any Gerrit repo, any Toolforge
tool repo (including those outside Gerrit), or any MediaWiki extension/skin
on MWStake's list of MediaWiki extensions/skins developed outside of Gerrit.
If the above doesn't describe you, there are also other criteria that can
make you eligible, like if you have made lots of edits to
https://mediawiki.org or https://translatewiki.net, maintain or contribute
to "tools, bots, user scripts, gadgets, and Lua modules on Wikimedia
wikis", or "substantially engaged in the design and/or review processes of
technical development related to Wikimedia." or are employed by the
Wikimedia foundation or an affiliate of Wikimedia. Check the official rules
for the full list.

Anyway, I encourage everyone who is eligible to vote. People who are voting
by virtue of being a software developer MUST register 4 days before the end
of the election (i.e. September 2), so if that's you, be sure to do it in
time. See the official rules or https://i-want-to-vote.toolforge.org for details.

--
Bawolff

p.s. Just to clarify, this isn't an official email; I just wanted to ensure
that the wider MediaWiki developer community was aware of this, as I suspect
many eligible developers may be unaware they are allowed to vote.
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Invitation to join the Movement Strategy Forum

2022-08-21 Thread Brian Wolff
So it seems like the killer feature here is integrated machine translation.

As a naive monolingual English speaker, this surprised me – I always assumed
separate external translation would be basically equivalent, but I guess
that's wrong.

Anyway, assuming this is accurate, we should bring something equivalent to
MediaWiki. We already integrate machine translation into ContentTranslation
(CX), so we already have the partnerships with external translation services
in place.

--
Bawolff

On Friday, August 19, 2022, Mahuton Possoupe  wrote:

> Hi everyone,
>
> The Movement Strategy Forum  (MS
> Forum) is a multilingual collaborative space for all conversations about
> Movement Strategy implementation. We are inviting all Movement
> participants to collaborate on the MS Forum. The goal of the forum is to
> build community collaboration using an inclusive multilingual platform.
>
> The Movement Strategy
>  is
> a collaborative effort to imagine and build the future of the Wikimedia
> Movement. Anyone can contribute to the Movement Strategy, from a comment to
> a full-time project.
>
> Join this forum with your Wikimedia account, engage in conversations, and
> ask questions in your language.
>
> The Movement Strategy and Governance team (MSG) launched the proposal for
> this MS Forum in May. After a 2-month review period, we have just published
> the Community Review Report
> .
>  It
> includes a summary of the discussions, metrics, and information about the
> next steps.
>
> We look forward to seeing you at the MS Forum!
>
> Best regards,
>
> --
>
> Software developer | Python mentor @OpenCLassrooms | E-commerce specialist
> | Founder and Tech lead at CAURIS DEV: https://www.cauris-dev.com/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Don't forgot, only 4 more days to become eligible to vote in Board elections

2022-07-01 Thread Brian Wolff
Out of curiosity, who decides what constitutes "hav[ing] substantially
engaged in the design and/or review processes of technical development
related to Wikimedia"?

--
Brian

On Friday, July 1, 2022, K. Peachey  wrote:

> Hi all,
>
> Just a small reminder for any developers that they only have 4 more
> days to become eligible to vote in the 2022 WMF Board of Trustees
> election.
>
> Developers can confirm their eligibility by reviewing the below list:
>
> * are Wikimedia server administrators with shell access
> * or have made at least one merged commit to any Wikimedia repos on
> Gerrit, between 5 January 2022 and 5 July 2022.
> * or have made at least one merged commit to any repo in
> nonwmf-extensions or nonwmf-skins, between 5 January 2022 and 5 July
> 2022.
> * or have made at least one merged commit to any Wikimedia tool repo
> (for example magnustools) between 5 January 2022 and 5 July 2022.
> * or have made at least 300 edits before 5 July 2022, and 20 edits
> between 5 January 2022 and 5 July 2022, on Translatewiki.
> * or maintainers/contributors of any tools, bots, user scripts,
> gadgets, and Lua modules on Wikimedia wikis.
> * or have substantially engaged in the design and/or review processes
> of technical development related to Wikimedia.
>
> This list was copied from
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_elections/2022/Voter_eligibility_guidelines#Developers
>
> If you need help getting a patch (or patches) merged in the next few days,
> don't hesitate to reach out to the community on the mailing lists or in
> other locations such as the various IRC channels or Phabricator.
>
> Regards,
> p858snake/Peachey88
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.
> lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Looking for opinions on PHPSessionHandler::gc behavior and Timezones

2022-05-06 Thread Brian Wolff
That does sound like a bug. I would suggest filing a task about it at
https://phabricator.wikimedia.org where it can be discussed further (feel
free to add me as a subscriber). Optionally, if you are feeling ambitious,
you can also submit a patch in gerrit.

--
Bawolff

On Friday, May 6, 2022, David Raison  wrote:

> Hi everyone,
>
>
> I just realized I probably sent my original message (see below) to the
> wrong list. So here goes again, this time with a deeper understanding of
> what is going on, but also still much doubt and confusion.
>
> tl;dr on the original post: Users are being randomly disconnected from our
> mediawiki instances.
>
>
> I would like to check with you if what I found is a bug or not and if it's
> not, then I'm looking to understand why it behaves this way. I am also
> looking for information on what triggers garbage collection of PHP sessions.
>
>
> Since I originally posted, I have found out what is deleting sessions from
> the objectcache table in the database. I have been able to reproduce this
> both on a remote server and locally. Both times, my set-up is running in a
> docker environment.
>
>
> The call-stack that does this is the following:
>
> #0  SqlBagOStuff->deleteServerObjectsExpiringBefore() called at
> [/var/www/html/includes/objectcache/SqlBagOStuff.php:668]
> #1  SqlBagOStuff->deleteObjectsExpiringBefore() called at
> [/var/www/html/includes/libs/objectcache/CachedBagOStuff.php:148]
> #2  CachedBagOStuff->deleteObjectsExpiringBefore() called at
> [/var/www/html/includes/session/PHPSessionHandler.php:376]
> #3  MediaWiki\Session\PHPSessionHandler->gc()
> #4  session_start() called at [/var/www/html/includes/Setup.php:757]
> #5  require_once(/var/www/html/includes/Setup.php) called at
> [/var/www/html/includes/WebStart.php:89]
> #6  require(/var/www/html/includes/WebStart.php) called at
> [/var/www/html/index.php:44]
>
> Usually, SqlBagOStuff::deleteServerObjectsExpiringBefore is called from
> MWCallableUpdate->doUpdate() and passed a Unix timestamp. This timestamp
> is a UTC value and handled as such.
>
> However, sometimes, as can be seen from the backtrace above, it is called
> from the Garbage Collector in PHPSessionHandler:
>
> public function gc( $maxlifetime ) {
> if ( self::$instance !== $this ) {
> throw new \UnexpectedValueException( __METHOD__ . ': Wrong
> instance called!' );
> }
> $before = date( 'YmdHis', time() );
> $this->store->deleteObjectsExpiringBefore( $before );
> return true;
> }
>
> The problem here is that – probably for historical reasons - the function
> is passed a date string here instead of a Unix timestamp. And *the actual
> problem with that* is that the date() function applies timezone
> information (if $wgLocaltimezone is set). The resulting timestamp will
> thus be in the future (or be local time, interpreted as UTC). With a value
> of something equivalent to GMT+2, gc() will cause all current sessions to
> be deleted since they have a standard expiry time of 1h, stored as UTC
> value in the database.
>
> If this happens, and the session is not declared persistent (
> $session->isPersistent(), "Keep me logged in"), then all users are
> immediately logged out of mediawiki.
>
>
> Thus, my first question: Is this a bug? Since the $timestamp parameter is
> passed to ConvertibleTimestamp::convert( TS_UNIX, $timestamp ) in
> SqlBagOStuff::deleteServerObjectsExpiringBefore anyway, I do not see any
> reason to pass a date string. My suggestion for a fix would be to change
> the gc method as follows:
>
> public function gc( $maxlifetime ) {
> if ( self::$instance !== $this ) {
> throw new \UnexpectedValueException( __METHOD__ . ': Wrong
> instance called!' );
> }
> $this->store->deleteObjectsExpiringBefore( microtime(true) );//
> time() would be sufficient
> return true;
> }
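To make the failure mode concrete, here is a small Python sketch (not MediaWiki code) of the arithmetic described above: formatting local wall-clock time under a GMT+2 zone and then reinterpreting the string as UTC yields a cutoff two hours in the future, which sweeps away sessions that have the standard 1h expiry.

```python
from datetime import datetime, timedelta, timezone

def gc_cutoff_from_local_string(now_utc, local_offset_hours):
    # Mimic PHP's date('YmdHis', time()) under $wgLocaltimezone: format
    # the *local* wall-clock time, then reinterpret the resulting string
    # as if it were UTC, which is effectively what happens downstream.
    local = now_utc + timedelta(hours=local_offset_hours)
    stamp = local.strftime("%Y%m%d%H%M%S")
    return datetime.strptime(stamp, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)

now = datetime(2022, 5, 6, 16, 0, tzinfo=timezone.utc)
cutoff = gc_cutoff_from_local_string(now, local_offset_hours=2)  # GMT+2

# A session created right now, with the standard 1h expiry, already
# falls before the miscomputed cutoff and so gets deleted.
session_expiry = now + timedelta(hours=1)
print(cutoff > session_expiry)  # True
```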
>
>
> My second question would be: What controls when PHPSessionHandler::gc is
> called? Is it indeed the session configuration for the PHP process (i.e.
> session.gc_probability, session.gc_divisor and session.gc_maxlifetime)?
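For reference, stock PHP decides at session_start() whether to run session GC with probability session.gc_probability / session.gc_divisor per request (per the documented php.ini semantics). A tiny Python simulation of that coin flip, as an illustration rather than PHP's actual code:

```python
import random

def maybe_run_gc(gc_probability=1, gc_divisor=100, rng=random.random):
    # PHP rolls this dice once per session_start(); with the default
    # settings (1/100), roughly 1% of requests trigger a GC pass.
    return rng() < gc_probability / gc_divisor

trials = 100_000
rng = random.Random(42).random  # seeded for reproducibility
hits = sum(maybe_run_gc(rng=rng) for _ in range(trials))
print(hits / trials)  # close to 0.01
```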
>
>
> Thanks,
> David
>
>
>
>
> Hi all,
>
>
> I'm writing to the list in the hope of receiving some feedback, questions,
> maybe answers to help us solve a conundrum with sessions we have on our
> Wikibase/Mediawiki installation.
>
>
> We're running two different docker-based wikibase installations (one
> staging, one production) on two separate virtual machines. Both use the
> SimpleSaml extension to connect to a SAML implementation, but we have found
> that the random session deletion happens both with and without the SAML
> extension used.
>
>
> The symptoms that we've seen are that out of the blue, users are
> disconnected from mediawiki. Since we use SAML, it's enough to click the
> Login link to again be connected.
>
>
> The duration or number of requests until a session is deleted seems
> random. It does appear though, that the more (or more frequent) requests
> are made (and thus 

[Wikitech-l] Re: [Wikitech-ambassadors] Wikimedia Hackathon: Call for Sessions

2022-04-28 Thread Brian Wolff
While I don't know the specifics of the situation you encountered, a common
view (not the only view) is that priority should be set by the people who
are going to do the work to fix the bug (or in consultation with them),
except for "unbreak now". This may or may not be WMF staff.

The problem is that almost every bug is important to someone, so if
everybody sets priorities just for the things they care about (and people
tend to only increase them, never decrease them), every single bug ends up
being set as "high", even the ones that people generally think are useless
and no one intends to work on. The end result is that the priority field
gets ignored, because it no longer reflects the real priority of the bug.

--
Brian

On Thursday, April 28, 2022, Roy Smith  wrote:

> The last time I tried to set a phab ticket's priority, I was basically
> told that volunteers shouldn't be doing that, i.e. changing priorities of
> phab tickets was reserved to WMF staff.  Some clarity around this would be
> appreciated :-)
>
> On Apr 28, 2022, at 8:47 AM, Derk-Jan Hartman <
> d.j.hartman+wmf...@gmail.com> wrote:
> >
> > Would anyone be interest in a session about triaging tickets ?
> > ...
> > * Set a priority
>
>
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.
> lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Different cache invalidation rules for similar users?

2022-04-03 Thread Brian Wolff
It's possible that different cache clusters have different versions of a
cached file.

Different ways of loading JS have different cache characteristics (e.g.
some have much less purging), so it can depend a bit on the method. It's
also possible that one datacenter missed a cache purge that the others
received; however, nowadays purging is a much more reliable process, so
that's unlikely.

--
Bawolff

On Sunday, April 3, 2022, Strainu  wrote:

> Hi,
>
> I've recently seen some complaints from 2 users located in the same
> country that it takes about half a day for the Javascript changes to
> propagate. Users from different countries but similar user rights don't
> seem to have this problem.
>
> Is it possible to have different cache invalidation rules for different
> countries? If not, what else could cause this behavior?
>
> Thanks,
>   Strainu
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: [Wikimedia-l] Uplifting the multimedia stack (was: Community Wishlist Survery)

2022-01-28 Thread Brian Wolff
While I'm sure there is always more that can be done in terms of attracting
new devs, I don't think that's really where we have the problem. We have
lots of interested people. Enwiki is basically a huge captive audience of
interested people.

I think the problem lies more in the developer experience after people
join. It is politically difficult to get stuff merged, with long lag times.
That's frustrating to people who want to fix bugs as opposed to playing
politics, and it's really difficult for outsiders who don't have the
internal social connections, not to mention the time availability to always
be responsive, as opposed to just being able to contribute a couple of
hours on the weekend.

--
Bawolff

On Friday, January 28, 2022, Jay prakash <0freerunn...@gmail.com> wrote:

> Hi everyone,
>
> Andre, It is really good to see a major transformation in recent years.
> Since I am part of this transformation, I can say that it is already really
> going in a very good direction. But few things to note:
>
> 1. *Lack of staff*: WMF's Developer Advocacy has 6 staff for the globe. I
> had a recent experience with Google India's Developer Relations team. They
> had 5-8 members, just for a single country. I know the comparison should
> account for other factors as well, but the gap is too wide. Even Mozilla is
> outnumbering us. Another thing I noticed is that some WMF staff are part of
> multiple teams.
>
> 2. *No Wikimedia outreach program*: As far as I know WMF does not run its
> own outreach program. We are/were the only participant organization in
> various programs like Outreachy, Google Summer of Code, and Google Code-In.
> Wikimedia itself is a big name so we can attract many students with
> Wikimedia internship certificates without any stipend expenses.
>
> 3. *No partnership with external organizations*: As far as I know, WMF
> has not collaborated with external organizations on technical
> partnerships and campaigns. Last year, two for-profit organizations, Intel
> and DigitalOcean, organized Hacktoberfest, an amazing month of open source
> love. They got 294,451 accepted pull requests for open source projects.
> For-profit organizations are running campaigns for open source software,
> but we aren't. (I know they have an advertising motive, but the point is
> the open source campaigns.) Why can't we?
>
> I am not blaming anything on the current Developer Advocacy team. I worked
> with many members very closely as a volunteer. They are already working to
> their full capacity. But upper management has to break the glass.
> Otherwise, we will just fix/mingle with current problems within a single
> bubble.
>
> Regards,
>
>
> Jay Prakash,
> Volunteer Developer
>
> On Fri, Jan 28, 2022 at 11:46 PM Andre Klapper 
> wrote:
>
>> On Tue, 2022-01-11 at 13:30 +, Inductiveload wrote:
>> > [5] Developer Advocacy is a team that exists, but at least in my
>> > personal experience, I have never actually encountered it, except for
>> > bug wrangling.
>>
>> If you're interested in what that (my) team is doing, I recommend
>> checking out https://www.mediawiki.org/wiki/Developer_Advocacy
>>
>> Spoiler: Improving Small Wiki Toolkits, working on a Developer Portal
>> and improving related technical docs, organizing Outreach Programs,
>> maintaining Community Metrics, sorting out the next Hackathon, helping
>> with the Coolest Tool Award (which happened two weeks ago), etc...
>>
>> Hope that provides a bit of an impression? :)
>>
>> Cheers,
>> andre
>> --
>> Andre Klapper (he/him) | Bugwrangler / Developer Advocate
>> https://blogs.gnome.org/aklapper/
>> ___
>> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
>> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
>> https://lists.wikimedia.org/postorius/lists/wikitech-l.
>> lists.wikimedia.org/
>
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Wikitext, Document Models, and HTML5 Output

2022-01-11 Thread Brian Wolff
Have you seen the HTML structure of Parsoid?

E.g.
https://en.wikipedia.org/api/rest_v1/page/html/Dog

--
Bawolff
On Monday, January 10, 2022, Adam Sobieski  wrote:

> Wikitech-l,
>
>
>
> Hello. I have a question about the HTML output of wiki parsers. I wonder
> how simple or complex it would be for a wiki parser to output, instead of
> a flat document structure inside of a <body> element, a <body> element
> containing nested <section> elements?
>
>
>
> Recently, in the Community Wishlist Survey Sandbox, the
> speech synthesis of Wikipedia articles
> was broached. The proposer of these ideas indicated that, for best results,
> some content, e.g., “See also” sections, should not be synthesized.
>
>
>
> In response to these interesting ideas, I mentioned some ideas from EPUB,
> referencing pronunciation lexicons from HTML and SSML
> attributes in HTML,
> the CSS Speech Module, and that
> output HTML content could be styled using the CSS Speech Module’s speak
> property.
>
>
>
> In these regards, I started thinking about how one might extend wikitext
> syntax to be able to style sections, e.g.,:
>
>
>
> == See also == {style="speak:never"}
>
>
>
> Next, I inspected the HTML of some Wikipedia articles and realized that,
> due to the structure of the output HTML documents, it isn’t simple to style
> or to add attributes to sections. There are only <h2>, <h3>, <p> (et
> cetera) elements inside of a containing <body> element; sections are not
> yet structured elements.
>
>
>
> The gist is that, instead of outputting HTML like:
>
>
>
> <body>
>
>   <h2>Heading</h2>
>
>   <p>Paragraph 1</p>
>
>   <p>Paragraph 2</p>
>
>   <h3>Subheading</h3>
>
>   <p>Paragraph 3</p>
>
>   <p>Paragraph 4</p>
>
> </body>
>
>
>
> could a wiki parser output HTML5 like:
>
>
>
> <body>
>
>   <section>
>
>     <header><h2>Heading</h2></header>
>
>     <p>Paragraph 1</p>
>
>     <p>Paragraph 2</p>
>
>     <section>
>
>       <header><h3>Subheading</h3></header>
>
>       <p>Paragraph 3</p>
>
>       <p>Paragraph 4</p>
>
>     </section>
>
>   </section>
>
> </body>
>
>
>
> Initial thoughts regarding the latter HTML5 include that it is better
> structured, more semantic, more styleable, and potentially more accessible.
> If there is any interest, I could write up some lengthier discussion about
> one versus the other, why one might be better – and more useful – than the
> other.
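For what it's worth, deriving the nested form from the flat heading sequence is a small stack-based transform. The sketch below is a hypothetical illustration (plain dicts standing in for DOM nodes; the `nest_sections` name and block tuples are mine), not how any existing wiki parser does it:

```python
def nest_sections(blocks):
    # Fold a flat block list - roughly the current parser output shape -
    # into nested section dicts mirroring the proposed <section> markup.
    root = {"level": 0, "heading": None, "children": []}
    stack = [root]
    for block in blocks:
        if block[0] == "h":
            _, level, text = block
            while stack[-1]["level"] >= level:   # close same/deeper sections
                stack.pop()
            section = {"level": level, "heading": text, "children": []}
            stack[-1]["children"].append(section)
            stack.append(section)
        else:                                    # a paragraph
            stack[-1]["children"].append(block[1])
    return root

flat = [("h", 2, "Heading"), ("p", "Paragraph 1"), ("p", "Paragraph 2"),
        ("h", 3, "Subheading"), ("p", "Paragraph 3"), ("p", "Paragraph 4")]
tree = nest_sections(flat)
top = tree["children"][0]
print(top["heading"])                 # Heading
print(top["children"][2]["heading"])  # Subheading
```

The stack closes any open section at the same or deeper level before opening a new one, which is exactly what the heading levels in the flat output imply.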
>
>
>
> Is this the correct mailing list to discuss any of these wiki technology,
> wiki parsing, wikitext, document model, and HTML5 output topics?
>
>
>
>
>
> Best regards,
>
> Adam
>
>
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Limit to the number of images in a page?

2021-11-29 Thread Brian Wolff
This isn't a per-page limit but a limit on the number of new thumbnails per
unit time. Wait a little bit and revisit the page, and more pictures should
load. Eventually all of them should. As long as nobody purges the image
pages, once an image loads once it should always load again in the future.

The current limit is 70 per 30 seconds, unless it's a certain "common" size,
in which case it's 700 per 30 seconds. This is defined in:
https://noc.wikimedia.org/conf/InitialiseSettings.php.txt

'renderfile' => [
	// 1400 new thumbnails per minute
	'ip' => [ 700, 30 ],
	'user' => [ 700, 30 ],
],
'renderfile-nonstandard' => [
	// 140 new thumbnails per minute
	'ip' => [ 70, 30 ],
	'user' => [ 70, 30 ],
],



In practice, I don't think people use those "standard" sizes all that much
more often than other sizes.
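When a bulk page load does hit that limit, a client can retry throttled thumbnail requests with exponential backoff instead of waiting to revisit the page. The sketch below is a hypothetical client-side helper (the `fetch_with_backoff` name and simulated responses are mine, not anything in MediaWiki or Wikimedia infrastructure):

```python
import time

def fetch_with_backoff(fetch, url, max_tries=5, base_delay=1.0, sleep=time.sleep):
    # Retry a thumbnail request when the server answers 429 (rate limited).
    # `fetch` is any callable returning (status_code, body); real code
    # would wrap urllib or requests here. Delays double on each retry.
    for attempt in range(max_tries):
        status, body = fetch(url)
        if status != 429:
            return status, body
        sleep(base_delay * (2 ** attempt))
    return status, body

# Simulated server: first two requests are throttled, third succeeds.
responses = iter([(429, b""), (429, b""), (200, b"png-bytes")])
status, body = fetch_with_backoff(lambda url: next(responses),
                                  "https://upload.wikimedia.org/...",
                                  sleep=lambda s: None)
print(status)  # 200
```

Injecting the `fetch` and `sleep` callables keeps the retry logic deterministic to test while letting real code pass in an actual HTTP call.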

On Monday, November 29, 2021, Strainu  wrote:

> Hi,
>
> I have some wikipages with a large number of images (1000+). Those
> pages never load completely, as upload.wikimedia.org starts returning
> 429 Too many requests after a while.
>
> This limit does not seem to be documented on mediawiki.org, so I would
> like to know what it the exact value and if there is a way to work
> around it (except for splitting the pages).
>
> Thanks,
>Strainu
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.
> lists.wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Proposal: new structure for MediaWiki RELEASE-NOTES

2021-07-19 Thread Brian Wolff
> We could write a custom merge driver for RELEASE-NOTES which always puts
> ‘ours’ before ’theirs’, but I’m not sure the result will justify the
> investment.

Didn't someone already do that? There used to be a Ruby script floating
around for that.

On Monday, July 19, 2021, Petr Pchelko  wrote:

> Hello.
>
> TLDR: In MW core, RELEASE-NOTES is a cause of the majority of merge
> conflicts.
> I propose to restructure it in a way that could let us do less manual
> conflict resolutions in Gerrit.
>
> *The problem*
>
> When working on MediaWiki core, all of us often need to update
> RELEASE-NOTES-XX file
> in case some deprecations or breaking changes were made. While the change
> is under review,
> other changes touching RELEASE-NOTES-XX are made, causing merge conflicts
> that need
> to be resolved manually. A non-insignificant portion of manual conflict
> resolutions is needed
> exclusively due to RELEASE-NOTES.
>
> This is not a surprise due to the structure of the notes file. It has
> several sections, with every
> section being a list of individual notes. Since most of us only append
> lines to the end of the list,
> it creates a contention.
>
> *The proposal*
>
> I propose to reorganize RELEASE-NOTES. We keep the sections, but within
> each section
> we start the note with the name of the class that’s been touched, if it
> makes sense.
>
> For example, instead of writing
>
> * In AbstractBlock, the getTargetAndType() and getTarget() methods are
>   hard deprecated. Instead use getTargetName() and getTargetUserIdentity()
>   together with getType().
>
> We would write
>
> * AbstractBlock: the following methods were hard-deprecated:
>   - getTarget() - use getTargetName() or getTargetUserIdentity()
>   - getTargetAndType() - use getType() with getTargetUserIdentity() or
> getTargetName()
>
> After switching to a more standardized format, we sort individual notes
> alphabetically by class
> name. Method names within messages need to be sorted alphabetically as
> well. This will make
> an insertion point of new release notes more random, hopefully reducing
> the contention and
> thus reducing the probability of a merge conflict. Additionally, I
> personally think this will look nicer
> and will be easier to read for extension developers.
>
> Not all the release notes can follow this pattern, and it’s ok. The point
> is not to be perfect, the
> point is to try to be better.
>
> *Alternative solutions*
>
> We could write a custom merge driver for RELEASE-NOTES which always puts
> ‘ours’ before ’theirs’,
> but I’m not sure the result will justify the investment.
>
> *Feedback*
>
> Please tell me what you think about this.
> - Do you think it is a real problem?
> - Do you think simply restructuring the notes with help?
> - Do you think writing a custom merge driver is a way to go?
> - Will you remember to follow  the new structure if we agree on it even
> though it’s not really possible
> to enforce with linters since not all the notes can follow this structure?
> - Extension developers - do you care? Do you think alphabetically sorted
> RELEASE-NOTES will be
> any easier for you to read?
>
> Thank you.
> Petr Pchelko.
> Staff Software Engineer.
> WMF Platform Engineering Team.
>
>
>
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Upstream shutdowns

2021-06-01 Thread Brian Wolff
Also HHVM and Blazegraph.

--
Brian

On Tuesday, June 1, 2021, Kunal Mehta  wrote:

> On 5/29/21 5:59 PM, Brian Wolff wrote:
>
>> It sounds like phabricator upstream is going away:
>> https://admin.phacility.com/phame/post/view/11/phacility_is_
>> winding_down_operations/ <https://admin.phacility.com/p
>> hame/post/view/11/phacility_is_winding_down_operations/>
>>
>
> On the general topic, it's worrying to me that this is the third major
> platform/software that Wikimedia relies on that is gone, with little
> notice. OTRS went EOL 2 days before Christmas last year[1], and Freenode,
> well, switched hands and imploded.
>
> In all 3 cases though it's been pretty clear beforehand that something was
> wrong or unsustainable. OTRS didn't release their community edition on
> time[2], Freenode has had questionable governance for a while now, the
> "Freenode Limited" company was created in 2017, and Phabricator/Phacility
> has been maintained by a single person (who's done great work, no doubt) in
> a way that did not really encourage upstream contributions to build a
> sustainable community of contributors (my impression, correct me if I'm
> wrong).
>
> And with COVID-19, plenty of IRL businesses/shops/etc. have gone under or
> just closed down so it's unsurprising that we're seeing the effect in
> tech/FOSS too.
>
> I think it would be a good idea to check in on our upstreams[2], make sure
> they have sustainable governance, identify whether they need
> assistance/support, and then figure out how to provide what's needed.
>
> To start with, I think we have a unique opportunity to influence the
> governance of Libera Chat given they're just starting off and appear to be
> interested in getting our feedback[3].
>
> [1] https://phabricator.wikimedia.org/T275294
> [2] https://www.mediawiki.org/wiki/Upstream_projects
> [3] https://meta.wikimedia.org/wiki/Wikimedia_Forum#Libera_Chat_governance
>
> -- Legoktm
> ___
> Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
> To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
> https://lists.wikimedia.org/postorius/lists/wikitech-l.lists
> .wikimedia.org/
>
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Phabricator upstream shutdown

2021-05-29 Thread Brian Wolff
It sounds like phabricator upstream is going away:
https://admin.phacility.com/phame/post/view/11/phacility_is_winding_down_operations/

Just curious, are we planning to continue using it long term or move to
something else?

--
Brian
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

Re: [Wikitech-l] Dynamically declaring parser functions?

2021-04-19 Thread Brian Wolff
Why not just use the normal magic i18n file? You could just not load it
when you don't want it.

--
Brian

On Sunday, April 18, 2021, FreedomFighterSparrow <
freedomfighterspar...@gmail.com> wrote:

> My use case is this:
>
> I have three alternative access points to the wiki, where some things are
> not allowed - e.g. videos. Each access point has different extensions
> disabled, and then tags and parser functions show "as is" on screen, which
> looks bad.
>
> My goal is to hide those, obviously.
>
> I'm upgrading MW from 1.29 to 1.35; My previous solution is here:
> https://github.com/kolzchut/mediawiki-extensions-NoopTags
>
> Basically I hooked ParserFirstCallInitHook and LanguageGetMagic to
> dynamically declare empty stubs for those missing parser functions and
> tags, using the global variables $wgNoopTagsFunctionBlacklist and
> $wgNoopTagsBlacklist.
>
>
> This doesn't work in MW 1.35, because LanguageGetMagic was removed. I
> tried bypassing the issue by hooking GetMagicVariableIDsHook, but
> apparently that's only for "variables" ({{variable}}), and not parser
> functions.
>
> Is there a way to achieve my goal? Either by fixing my extension or doing
> something completely different which I haven't thought about?
>
> Thanks in advance
> Dror
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] 1st CfP: SEMANTiCS 2021 EU || Sep 6 - 9, 2021 || Amsterdam, The Netherlands

2020-12-26 Thread Brian Wolff
Normally I avoid +1 posts, but since Max is getting some criticism, I
wanted to say that I also think this particular conference is off-topic for
this mailing list and should be posted in a more appropriate venue.

--
Brian

On Friday, December 25, 2020, Dan Garry (Deskana)  wrote:

> I agree with Max. This conference is pretty much irrelevant for this list,
> as it has almost nothing to do with the topic of wikis generally. The only
> connection this conference has to this list is the existence of the
> Semantic MediaWiki extension; this list could quickly be overrun with
> conference emails if they were allowed on the basis of the existence of a
> MediaWiki extension related to the topic area.
>
> Sebastian, there are a few Semantic MediaWiki email lists
> ,
> why don't you try posting this there? You might get a lot more interest, as
> the audience there will be very interested in the Semantic Web. :-)
>
> Dan
>
> On Fri, 25 Dec 2020 at 16:18, Andreas Papacharalampous 
> wrote:
>
>> i don't mind a crosspost, but probably not a good idea to vent your
>> frustration against a crosspost on a public list IMO.
>>
>> --andreas
>> he/him
>>
>>
>>
>> On Fri, Dec 25, 2020 at 10:53 AM Andrew Krizhanovsky <
>> andrew.krizhanov...@gmail.com> wrote:
>>
>>> Sorry Max, but this information was interesting for me :)
>>>
>>> Best regards,
>>> Andrew Krizhanovsky.
>>>
>>>
>>> On Fri, 25 Dec 2020 at 13:32, Max Semenik  wrote:
>>>
 On Fri, Dec 25, 2020 at 1:21 PM Sebastian Hellmann <
 pr-a...@informatik.uni-leipzig.de> wrote:

> Apologies for cross-posting


 Apologies not accepted. Don't know about others on this list, but I'm
 tired of your spam of conferences very remotely related to MediaWiki
 development. MediaWiki is not Semantic MediaWiki, 99% of people here don't
 care about your stuff. Please stop.

 --
 Best regards,
 Max Semenik
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Hard deprecation of the Revision class

2020-07-06 Thread Brian Wolff
Hard deprecated = generates warnings if $wgEnableDevelopmentWarnings is on
(which in turn causes unit tests to fail, as warnings are considered errors
in a test). The code is still there and working for the moment.
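As a rough analogy (my own sketch in Python, not MediaWiki code): a hard-deprecated function keeps working for normal callers, while a test runner that escalates warnings to errors fails on it.

```python
import warnings

# A "hard deprecated" function: it warns but still does its job.
def old_api():
    warnings.warn("old_api is deprecated; use new_api", DeprecationWarning)
    return 42

# Normal callers are unaffected; the call still returns its value.
print(old_api())  # 42

# A test suite that treats warnings as errors (like PHPUnit with
# $wgEnableDevelopmentWarnings) fails as soon as the deprecated path runs.
with warnings.catch_warnings():
    warnings.simplefilter("error")
    try:
        old_api()
        failed = False
    except DeprecationWarning:
        failed = True
print(failed)  # True: the "CI" run breaks, production keeps working
```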

--
Brian

On Monday, July 6, 2020, Jay R. Ashworth  wrote:

> 
> I think you might be better calling that "removed"; the entire *point*
> of the jargon usage of 'deprecated' is "it's still there and working,
> but start moving off it soon, cause we're gonna remove it".
> 
>
> - Original Message -
> > From: "Petr Pchelko" 
> > To: "Wikimedia developers" 
> > Sent: Monday, July 6, 2020 4:18:34 PM
> > Subject: [Wikitech-l] Hard deprecation of the Revision class
>
> > Hey all,
> >
> > TLDR: your extension CI might break due to hard deprecation of Revision
> > class.
> >
> > Today a Gerrit change[1] has been merged, hard deprecating MediaWiki core
> > Revision class.
> > Loads of work has been done to prepare for this moment, mostly by a
> > volunteer developer DannyS712 with code review
> > support from almost every corner of the MediaWiki developer universe. You
> > can judge the amount of work by the number
> > of subtasks in the tracking ticket[2]
> >
> > A lot of effort has been done to update all the WMF deployed extensions
> and
> > some non-deployed ones to the new code,
> > In accordance with stable interface policy deprecation section[3].
> However
> > due to the gravity of a change, we might have
> > missed some corner cases. Missed usages will not cause problems in
> > production since the only consequence of using hard
> > deprecated code is a log message, but some CI tests might start failing.
> > If you find your tests failing, please replace all the usages of the
> > Revision class according to 1.35 release notes,
> > and your tests should start passing again. In case you’re having troubles
> > doing it, please reach out to core platform team
> > and we will try to help.
> >
> > This is a huge milestone in decoupling MediaWiki core! We understand that
> > this might cause some inconveniences, apologize
> > for them and are hoping we can help with resolving any issues.
> >
> > Cheers.
> > Petr Pchelko
> >
> > 1. https://gerrit.wikimedia.org/r/c/mediawiki/core/+/608845
> > 2. https://phabricator.wikimedia.org/T246284
> > 3. https://www.mediawiki.org/wiki/Stable_interface_policy#Deprecation
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> --
> Jay R. Ashworth  Baylink
> j...@baylink.com
> Designer The Things I Think   RFC
> 2100
> Ashworth & Associates   http://www.bcp38.info  2000 Land
> Rover DII
> St Petersburg FL USA  BCP38: Ask For It By Name!   +1 727 647
> 1274
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SelectQueryBuilder

2020-05-21 Thread Brian Wolff
That's really cool.

It would be interesting if we could use this as an excuse to move away from
what I would consider antipatterns in our SQL layer, e.g. having no
high-level way of specifying WHERE comparisons such as x > 'foo' (currently
we call addQuotes() ourselves rather than it being automatic) or
`field1` = `field2`. Similarly, the whole thing where field names only get
addIdentifierQuotes() automatically if they don't look like SQL has always
seemed sketchy. This might provide us a path forward to address those while
still maintaining backwards compatibility.
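The fluent style under discussion can be sketched in miniature (my own illustration in Python, not MediaWiki's actual SelectQueryBuilder) to show how a builder can keep values separate from the SQL text, so callers never quote by hand:

```python
# Toy fluent SELECT builder: conditions take (field, op, value) and values
# are always bound via placeholders, never interpolated into the SQL string.
class SelectQueryBuilder:
    def __init__(self):
        self._fields, self._table, self._conds, self._params = [], None, [], []

    def select(self, field):
        self._fields.append(field)
        return self

    def from_(self, table):
        self._table = table
        return self

    def where(self, field, op, value):
        self._conds.append(f"{field} {op} ?")  # placeholder, no manual quoting
        self._params.append(value)
        return self

    def build(self):
        sql = f"SELECT {', '.join(self._fields)} FROM {self._table}"
        if self._conds:
            sql += " WHERE " + " AND ".join(self._conds)
        return sql, self._params

sql, params = (SelectQueryBuilder()
               .select("page_id").from_("page")
               .where("page_namespace", "=", 0)
               .where("page_title", ">", "Foo")
               .build())
print(sql)     # SELECT page_id FROM page WHERE page_namespace = ? AND page_title > ?
print(params)  # [0, 'Foo']
```

Because comparisons are expressed structurally, a `>` condition needs no addQuotes() call and identifier quoting could be applied uniformly rather than by guessing whether a string "looks like SQL".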

--
Brian

P.S. Now all I need to dream about is a fluent version of the Html class.

On Thursday, May 21, 2020, Tim Starling  wrote:

> SelectQueryBuilder is a new fluent interface for constructing database
> queries, which has been merged to master for release in MediaWiki
> 1.35. Please consider using it in new code.
>
> SELECT page_id FROM page
> WHERE page_namespace=$namespace AND page_title=$title
>
> becomes
>
> $id = $db->newSelectQueryBuilder()
>->select( 'page_id' )
>->from( 'page' )
>->where( [
>   'page_namespace' => $namespace,
>   'page_title' => $title,
>] )
>->fetchField();
>
> As explained on the design task T243051, SelectQueryBuilder was
> loosely based on the query builder in Doctrine, but I made an effort
> to respect existing MediaWiki conventions, to make migration easy.
>
> SelectQueryBuilder is easy to use for simple cases, but has the most
> impact on readability when it is used for complex queries. That's why
> I chose to migrate the showIndirectLinks query in
> Special:WhatLinksHere as a pilot -- it was one of the gnarliest
> queries in core.
>
> SelectQueryBuilder excels at building joins, including parenthesized
> (nested) joins and joins on subqueries.
>
> SelectQueryBuilder can be used as a structured alternative to the
> "query info" pattern, in which the parameters to Database::select()
> are stored in an associative array. It can convert to and from such
> arrays. As a pilot of this functionality, I converted ApiQueryBase to
> use a SelectQueryBuilder to store accumulated query info.
>
> Check it out!
>
> -- Tim Starling
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Render with a slow process

2020-04-28 Thread Brian Wolff
Can you be a bit more concrete about what you are thinking of?


In general, I think most users would find it unacceptable if part of the
page only became available after some slow delay (even under a minute)
after the page rendered.

The closest thing I can think of to this is that image rendering (and
especially video scaling) is asynchronous. Page rendering itself also works
kind of like this: if someone edits a page, and PoolCounter decides too
many people are trying to render it at once, the next viewer will get an
old version of the page.

--
Brian

On Saturday, April 25, 2020, John Erling Blad  wrote:

> Slow process, fast rendering
>
> Imagine someone edits a page, and that editing hits a very slow
> tag-function of some kind. You want to respond fast with something
> readable, some kind of temporary page until the slow process has finished.
> Do you chose to reuse what you had from the last revision, if it the
> content of the function hasn't changed, or do you respond with a note that
> you are still processing? The temporary page could then update missing
> content through an API.
>
> I assume that a plain rerender of the page will occur after the actual
> content is updated and available, i.e. a rerender will only happen some
> time after the edit and the slow process would then be done. A last step of
> the process could then be to purge the temporary page.
>
> This work almost everywhere, but not for collections, they don't know about
> temporary pages. Is there some mechanism implemented that does this or
> something similar? It feels like something that should have a general
> solution.
>
> There are at least two different use cases; one where some existing
> information gets augmented with external data (without really changing the
> content), and one where some external data (could be content) gets added to
> existing content. An example of the first could be verification of
> references, wile the later could be natural language generation.
>
> /jeblad
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-05 Thread Brian Wolff
It is for when a page can have multiple content handlers associated with it.

The most common example is Commons file pages: they have a normal wikitext
page, but they also have a slot containing Wikidata-style structured data.

Whether or not you have to worry about this depends on what you're doing.
For most things, probably not, but more things might have multiple slots in
the future.
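For the question upthread, a minimal sketch of the fix (my own illustration; the endpoint and title here are placeholders): passing rvslots=* silences the legacy-format warning, and revision content in the response is then keyed by slot name (e.g. "main").

```python
from urllib.parse import urlencode

# Build a revisions query that names slots explicitly instead of relying
# on the deprecated legacy output format.
params = {
    "action": "query",
    "format": "json",
    "formatversion": "2",
    "prop": "revisions",
    "rvprop": "content",
    "rvslots": "*",        # request all slots; "*" is URL-encoded as %2A
    "titles": "Example",   # placeholder title
}
url = "https://commons.wikimedia.org/w/api.php?" + urlencode(params)
print(url)
```

With this shape of request, the content lands under `revisions[0]["slots"]["main"]["content"]` rather than directly on the revision object, which is the "new format" the warning refers to.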

--
Bawolff

On Sunday, April 5, 2020, Petr Bena  wrote:

> Hello,
>
> https://www.mediawiki.org/wiki/API:Revisions mentions "revision slots"
> multiple times, but fails to explain what it is?
>
> I noticed that some queries that were running fine in the past now
> have warning: "Because "rvslots" was not specified, a legacy format
> has been used for the output. This format is deprecated, and in the
> future the new format will always be used."
>
> Which is interesting, but not very descriptive. What are revision
> slots? Why are they needed? What is difference between "legacy format"
> and "new format"? If rvslots are so important, why they aren't part of
> some of the examples?
>
> So far I googled this: https://www.mediawiki.org/wiki/Manual:Slot it
> seems like some new "work-in-progress" feature, that isn't very useful
> yet, as there is only 1 slot in existence.
>
> Is it necessary to pay attention to this? Can I simply silence this
> warning by providing parameter rvslots=*? Or should I pay it some
> deeper attention?
>
> Thanks
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikiEdDiff] Review wanted

2020-02-29 Thread Brian Wolff
Umm, what is with the sgndrp.online tracker URLs?

--
Bawolff

On Saturday, February 29, 2020, Aron Demian  wrote:

> On Sat, 29 Feb 2020 at 21:50, Zoran Dori  wrote:
>
> > Hmm, URL works only in old UI, not in new.
>
>
> Didn't work for me even in the old one. Please submit a working link next
> time.
> https://gerrit.wikimedia.org/r/q/project:mediawiki%
> 252Fextensions%252FWikEdDiff+status:open
>  3A%2F%2Fgerrit.wikimedia.org%2Fr%2Fq%2Fproject%3Amediawiki%
> 25252Fextensions%25252FWikEdDiff%2Bstatus%3Aopen=1583010965492&
> linkName=https://gerrit.wikimedia.org/r/q/project:
> mediawiki%252Fextensions%252FWikEdDiff+status:open>
> First patch in chain:
> https://gerrit.wikimedia.org/r/c/mediawiki/extensions/WikEdDiff/+/571093
>  3A%2F%2Fgerrit.wikimedia.org%2Fr%2Fc%2Fmediawiki%
> 2Fextensions%2FWikEdDiff%2F%2B%2F571093=1583010965492&
> linkName=https://gerrit.wikimedia.org/r/c/mediawiki/
> extensions/WikEdDiff/+/571093>
>
>
> Demian
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tech Talk livestream chat transcript (was Re: [Wikimedia Technical Talks] Data and Decision Science at Wikimedia with Kate Zimmerman, 26 February 2020 @ 6PM UTC)

2020-02-26 Thread Brian Wolff
Note: the list of who has access (both WMF and otherwise) is at
https://github.com/wikimedia/puppet/blob/3f7e6a775321e993c001a85ede736d2db087d3e0/modules/admin/data/data.yaml#L270
(there are also some other groups that are probably relevant). This of
course wouldn't include people who got specific information as a result of a
collaboration but didn't have actual access to the cluster. You can look at
the file's history to see historically who had access.

--
Bawolff

On Wednesday, February 26, 2020, James Salsman  wrote:

> I note now that the full chat transcript has been restored; thank you.
>
> I am still interested in the answer to the question.
>
> Sincerely,
> Jim
>
> On Wed, Feb 26, 2020 at 10:46 AM James Salsman  wrote:
> >
> > Just now I asked the following question on the Technical Talk
> > livestream at https://www.youtube.com/watch?v=J-CRsiwYM9w
> >
> > 10:19 AM: Page 20 of Robert West's 2016 Stanford thesis, "Human
> > Navigation of Information Networks" says, "We have access to
> > Wikimedia’s full server logs, containing all HTTP requests to
> > Wikimedia projects."
> > 10:19 AM: The text is at
> > http://infolab.stanford.edu/~west1/pubs/West_Dissertation-2016.pdf
> > 10:19 AM: Page 19 indicates that this information includes the "IP
> > address, proxy information, and user agent."
> > 10:20 AM: This is confirmed by West at
> https://www.youtube.com/watch?v=jQ0NPhT-fsE&t=25m40s
> > 10:20 AM: Does the Foundation still share that identifying information
> > with research affiliates? If so, how many are them world-wide; if not,
> > when did sharing this information stop?
> > 10:25 AM: MediaWiki @James Salsman I see your question, but donfly.
> > Can we reach out to you after the talk?
> > 10:25 AM: MediaWiki: sorry hit enter too soon!
> > 10:26 AM: MediaWiki: I don't have the full-context of the thesis to
> > ask kate to answer the question on the fly. Can we reach out to you
> > after?
> > 10:26 AM: James Salsman: With whom am I corresponding?
> > 10:27 AM: James Salsman: Sarah?
> > 10:28 AM: MediaWiki: Yes! That's me!
> > 10:28 AM: James Salsman: Would it be easier to ask, "how many research
> > affiliates does the Foundation share server logs with IP addresses?"
> > 10:29 AM: MediaWiki: Yes, I can ask that.
> > 10:30 AM: James Salsman: Thank you.
> >
> > At 10:34, the messages with the the URLs I posted were removed from
> > the chat log.
> >
> > At 10:36, the chat stream was removed from the video, which was
> > replaced with the text, "Chat is disabled for this live stream," and
> > now is completely missing.
> >
> > I am still interested in getting an answer to this question, but
> > disturbed by the removal of links to sources. Could I please have an
> > explanation?
> >
> > Sincerely,
> > Jim
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] "Chain verification error" when trying to log-in to Beta cluster

2020-01-30 Thread Brian Wolff
The SSL tester site is reporting that OCSP stapling is misconfigured on
that domain; maybe that's causing the issue(?)

One of the biggest differences is that beta uses Let's Encrypt. If for some
reason the platform you are testing on didn't have the Let's Encrypt
certificates installed, that could cause this. However, Let's Encrypt at
this point is very widely supported.
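A rough sketch (my own, using Python's ssl module rather than the app's OkHttp code) of the two trust-store situations behind a "Chain validation failed" error: a default context can validate chains against the platform trust store, while a context with no CAs loaded, like a device missing the Let's Encrypt roots, cannot validate any chain.

```python
import ssl

# A default client context loads the platform trust store and requires
# the server's chain to validate.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: chains must validate

# A verifying context with no CA certificates loaded: every handshake
# would fail chain validation, analogous to a missing root certificate.
bare = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
print(len(bare.get_ca_certs()))  # 0
```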

--
Brian

On Thursday, January 30, 2020, Kaartic Sivaraam <
kaarticsivaraam91...@gmail.com> wrote:

> Hi all,
>
> I'm writing this email about an issue we recently faced in the
> Wikimedia Commons Android app. The beta flavour[1] of the Commons app
> connects to the Wikimedia Commons beta cluster[2]. When trying to
> log-in to the beta cluster using valid credentials we get the
> following error[3]:
>
> javax.net.ssl.SSLHandshakeException: Chain validation failed
>
> We get this error when trying to do the following API call:
>
> https://commons.wikimedia.beta.wmflabs.org/w/api.php?
> format=json&formatversion=2&errorformat=plaintext&action=query&meta=tokens&type=login
>
> This seems to be a problem only when making the API call to get the
> login token from the app (IIUC, we use OkHttp to make the API call).
> The same API call succeeds without issues when done using the browser.
> We're not sure what's causing this issue. We've stopgapped the
> issue[4] for the mean time but we would like to identify the actual
> problem and fix it. It would be nice if someone could help us with
> identifying the problem. If there's a better place or person to
> contact about this issue please let us know.
>
> As the stopgap fix has been merged, the beta version built from the
> latest source in 'master'[5] would not have the issue described above.
> The source that has the issue can be found at [6].
>
> Notes and references
> [1]: It's different from the beta version released in Play store.
> [2]: https://commons.wikimedia.beta.wmflabs.org
> [3]: Unable to login #3320 -
> https://github.com/commons-app/apps-android-commons/issues/3320
> [4]: https://github.com/commons-app/apps-android-commons/pull/3349 and
> https://github.com/commons-app/apps-android-commons/pull/3350
> [5]: https://github.com/commons-app/apps-android-commons/tree/master
> [6]: https://github.com/commons-app/apps-android-commons/tree/
> fe56cefdbca21125e9202b30c408b3736dc3421d
>
> Thanks,
> Sivaraam
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Community Tech: New Format for 2020 Wishlist Survey

2019-10-13 Thread Brian Wolff
One concern I'd have if this approach is scaled up is what happens to
maintenance of these projects after they are done. Not that this isn't a
problem with traditional teams too, but with the wishlist approach it seems
like it would be all the more problematic, especially for larger projects.

--
Brian

On Sunday, October 13, 2019, Amir E. Aharoni 
wrote:

> I don't know if restricting the wishlist only for other projects is so
> terrible. Non-Wikipedia projects have complained for years that they are
> not getting enough attention, so perhaps giving them extra love at the
> expense of Wikipedia makes sense.
>
> I would look at a wider problem, though. Over the years that the community
> wishlist vote has been working, a lot of excellent projects were completed
> by the Community Tech team: global preferences, the pageviews tool,
> TemplateWizard, wiki syntax highlighting, edit summary improvements, and
> many more. However, several projects that received a very large number of
> votes were declined, not because they were undesirable but because they
> were too big for this team, which by its nature is oriented at small,
> timeboxed projects. That, by itself, is understandable.
>
> The real trouble is that even though there is demand for these things
> (given the vote results), and even though no-one seems to think that the
> ideas are invalid or undesirable, they ended up not being done. Some
> notable examples:
> * Cross-wiki watchlist (#4 in 2015, declined:
> https://en.wikipedia.org/wiki/Wikipedia_talk:Global,_cross-
> wiki,_integrated_or_stacked_watchlists#Declined
> )
> * Global gadgets (#1 in 2016; I couldn't find the reason for declining)
> * Global templates (#3 in 2015; marked as "in development by Parsing team",
> but not actually done)
>
> All of these things are still very much in demand. All of them happen to be
> particularly beneficial also to non-Wikipedia projects, but if they will be
> proposed for this year's wishlist vote and get a lot of votes, will they
> again be declined because they are too big for the Community Tech team or
> will they be escalated to another team (or teams) that can execute them?
>
> If there is no commitment to such escalation from higher management, we'll
> stay stuck in a ridiculous situation in which the most needed projects are
> also those that cannot be carried out.
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
>
>
> ‫בתאריך שבת, 5 באוק׳ 2019 ב-2:44 מאת ‪Yuri Astrakhan‬‏ <‪
> yuriastrak...@gmail.com‬‏>:‬
>
> > Ilana, restricting wishlist to non-Wikipedia this year is a very sad
> news.
> >
> > For many years, wishlist survey was the best way for the community to
> talk
> > back to the foundation, and to try to influence its direction. WMF mostly
> > ignored these wishes, yet it was still a place to express, discuss,
> > aggregate and vote on what community needed. Big thank-you is due to the
> > tiny community tech team that tackled the top 10 items, but that's just
> ~3%
> > of the foundation's employees.
> >
> > WMF has been steadily separating itself from the community and loosing
> > credibility as a guiding force.  Take a look at the last election --
> almost
> > every candidate has said "no" to the question if WMF is capable of
> > deciding/delivering on the direction [1].  In **every** single
> conversation
> > I had with the community members, people expressed doubts with the
> movement
> > strategy project, in some cases even treating it as a joke.
> >
> > This is a huge problem, and restricting wishlist kills the last effective
> > feedback mechanism community had.  Now WMF is fully in control of itself,
> > with nearly no checks & balances from the people who created it.
> >
> > I still believe that if WMF makes it a priority to align most of its
> > quarterly/yearly goals with the community wishlist (not just top 10
> > positions), we could return to the effective community-governance.
> > Otherwise WMF is risking to mirror Red Cross Haiti story [2] -- hundreds
> of
> > millions of $$ donated, and very few buildings actually built.
> >
> > With great respect to all the people who made Wikis what they are today,
> > --[[User:Yurik]]
> >
> > [1]
> >
> > https://meta.wikimedia.org/wiki/Affiliate-selected_Board_
> seats/2019/Questions#Do_you_believe_the_Wikimedia_
> Foundation_in_its_present_form_is_the_right_vehicle_for_
> the_delivery_of_the_strategic_direction?_If_so_why,_and_if_
> not,_what_might_replace_it
> > ?
> >
> > [2]
> >
> > https://en.wikipedia.org/wiki/American_Red_Cross#Disaster_
> preparedness_and_response
> >
> > On Fri, Oct 4, 2019 at 5:18 PM Ilana Fried  wrote:
> >
> > > Hello, everyone!
> > >
> > > My name is Ilana, and I'm the product manager for the Community Tech
> > team.
> > > We’re excited to share an update on the Community Tech 2020 Wishlist
> > Survey
> > > 

Re: [Wikitech-l] WebTestCase

2019-10-02 Thread Brian Wolff
I think Selenium is often used for that use case:

https://www.mediawiki.org/wiki/Selenium/Node.js/Simple

--
Bawolff

On Tuesday, October 1, 2019, Jeroen De Dauw  wrote:

> Hey,
>
> Does MediaWiki have something similar to Symfony's WebTestCase? (
> https://symfony.com/doc/current/testing.html#functional-tests)
>
> I want to write some integration tests in the form of "does web page
> /wiki/MyWikiPage contain HTML snippet XYZ".
>
> Cheers
>
> --
> Jeroen De Dauw | www.EntropyWins.wtf  |
> www.Professional.Wiki 
> Entrepreneur | Software Crafter | Speaker | Open Souce and Wikimedia
> contributor
> ~=[,,_,,]:3
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Outreachy Round 19] Contribution period is now open!

2019-10-01 Thread Brian Wolff
Just in case anyone had the same question as I did: the campaign project is
in the context of the Wiki Education Dashboard (I was originally assuming
UploadWizard campaigns).

Good luck to all the participants.

--
Brian

On Tuesday, October 1, 2019, Srishti Sethi  wrote:

> Hello everyone,
>
> Wikimedia has listed seven projects in Outreachy Round 19:
> https://www.outreachy.org/apply/project-selection/#wikimedia :)
>
> One of these projects is documentation-related, one related to quality
> assurance and remaining five are coding projects:
>
>1. A system for releasing data dumps from a classifier detecting
>unsourced sentences in Wikipedia, mentored by *Sam Walton, Miriam Redi
>and Guilherme Gonçalves*
>2. Convert Campaign pages to React, mentored by *Khyati Soneji and Sage
>Ross*
>3. Create command-line runner for MediaWiki maintenance tasks, mentored
>by *Will Doran*
>4. Create regression automated tests for Special:Homepage functionality
>testing, mentored by *Elena Tonkovidova*
>5. Documentation improvements to the ~20 top 100 most viewed MediaWiki
>Action API pages on-wiki, mentored by *Jerop Brenda*
>6. Improve MediaWiki Action API Integration Tests, mentored by *Kate
>Chapman, Clara Andrew-Wani and Daniel Kinzler*
>7. Improvements and User Testing of Wiki Education Dashboard Android
>App, mentored by *Ujjwal Agrawal and Sage Ross*
>
> Only applicants with an account on the Outreachy site
>  will be able to see the details of these projects.
> You can learn more about the timeline, roles, and responsibilities of
> participants and mentors, and ways to get in touch with the coordinators
> here: https://www.mediawiki.org/wiki/Outreachy/Round_19. You can also view
> these projects under "Featured projects" column here on Phabricator:
> https://phabricator.wikimedia.org/project/view/4265/.
>
> Spread the news among your friends both in the Wikimedia world and outside
> and encourage them to apply :)
>
> If you have any questions, please reach out to me!
>
> Cheers,
> Srishti
>
> *Srishti Sethi*
> Developer Advocate
> Wikimedia Foundation 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PLURAL in mw.msg

2019-09-29 Thread Brian Wolff
I think you need to load the mediawiki.jqueryMsg module; otherwise these
interfaces do totally different things.

Personally, I'm not really a fan of this, as I don't like the idea of
methods randomly changing meaning based on which modules are loaded, but it
is what it is.

--
Bawolff


On Sunday, September 29, 2019, Jeroen De Dauw 
wrote:

> Hey,
>
> I have this message: Removed $1 {{PLURAL:$1|shape|shapes}}
>
> This message is used in JavaScript via mw.msg() to create an edit summary.
> Unfortunately this summary is showing up without the PLURAL function being
> rendered:
> https://user-images.githubusercontent.com/146040/
> 65845180-6cd43b80-e339-11e9-9fb3-12642a835aa0.png
>
> The docs state that PLURAL is supported in JS:
> https://www.mediawiki.org/wiki/Manual:Messages_API#
> Feature_support_in_JavaScript.
> I've also tried mw.message('my-key').parse() and some variants of that,
> though these all gave me the same result.
>
> Why is this not working and how can I fix it?
>
> Cheers
>
> --
> Jeroen De Dauw | www.EntropyWins.wtf  |
> www.Professional.Wiki 
> Entrepreneur | Software Crafter | Speaker | Open Souce and Wikimedia
> contributor
> ~=[,,_,,]:3
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LoginSignupSpecialPage.php form won’t let me add fields

2019-09-21 Thread Brian Wolff
If you haven't seen it, the docs at
https://www.mediawiki.org/wiki/Manual:SessionManager_and_AuthManager/Adding_fields_to_the_registration_form
may be helpful.

--
Brian

On Saturday, September 21, 2019, John Shepherd  wrote:

> Hello all,
>
> On my own mediawiki install, I am trying to add another checkbox field to
> the Special:CreateAccount page. I have found the code responsible for the
> form, but for some reason the checkbox does not show up. As a test, I then
> went and tried copying and pasting one of the existing text boxes (with its
> IDs etc changed of course) to see if that would work. Nothing shows up
> other than the fields already present.
>
> Does anyone have any ideas what could be blocking it and/or what I am
> missing? Below is the diff of the change that doesn’t show.
>
> https://github.com/TheSandDoctor/misc-code-bits/commit/
> 4f2f6221c64095777622219c6c04c174eb197597
>
> Thanks!
> TheSandDoctor
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Statistics for CoC Cases Term May 2018 to May 2019

2019-09-05 Thread Brian Wolff
One thing im kind of curious about: Were any of these cases appealed? If so
how many and what is the "success" rate of the appeals process?

--
Brian

On Thursday, September 5, 2019, Amir Sarabadani  wrote:

> Hello,
> The committee received several reports for cases that were out of our
> jurisdiction (usually due to mediawiki.org links followed from wikis that
> are not part of Wikimedia). Those are not included in the following
> numbers:
>
> In total, the committee dealt with and decided on 17 cases. Nine of the
> cases had originated or had links to comments on Wikimedia Phabricator. The
> most frequent action taken there by the committee for these was to delete
> comments that violate the CoC. An email was also sent in all cases to the
> person engaging in those rhetorics.
>
> Among the rest, two of the cases had events happening on Wikimedia Gerrit,
> four on the wikitech-l mailing list, and two had links to the userspace and
> talk pages on mediawiki.org.
>
> Given the low number of offline cases, we won't disclose any details about
> those in order to protect confidentiality of those.
>
> We saw at least 3 occasions of repeat violations from offenders. A harsher
> decision was taken in at least one case among these.
>
> The committee came to a conclusion of no-further action at least 5 times
> while processing the aforementioned cases. However, we ended up at least
> sending an email to notify both parties involved in the report in all
> cases.
>
> Actions taken varied from deleting comments, asking for apologies, and user
> bans. The committee also made sure to redirect reporters with genuine cases
> of harassment to law enforcement.
>
> Due to the limited number of cases, we decided to not go into more detail
> where the cases were reported and what actions were taken and how often.
>
> The current committee wishes the next committee a successful term!
>
> On behalf of the CoC Committee,
> Amir
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Category aliasing proposals

2019-08-06 Thread Brian Wolff
Re other attempts: have you seen
https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/65176/ ?

Beyond that, some people have talked about using Wikidata instead of
categories, but I'm not sure how seriously.

--
Brian

On Monday, August 5, 2019, Adam Wight  wrote:

> Friends,
>
> As part of an investigation into category aliasing (think “theater
> directors” vs. “theater directors”), we’ve identified two potential
> technical implementations and I’m hoping to get feedback to help us choose
> between the proposals, or change course entirely.
>
> For a summary of the problem and solutions, please see
> https://meta.wikimedia.org/wiki/WMDE_Technical_Wishes/Gendered_Categories
> and join the talk page if you wish.
>
> Some questions on my mind are:
> * Will the proposed hard-redirect category behavior break any MediaWiki or
> third-party software assumptions about hard redirects or categories?
> * Is “non-page” category aliasing really the mountain of tech debt I
> imagine it to be?
> * Have there been other attempts to solve this problem?
>
> Kind regards,
> Adam
> from the Technical Wishes Team at Wikimedia Germany

Re: [Wikitech-l] Coolest Tool Award 2019: Call for Nominations

2019-07-16 Thread Brian Wolff
To clarify, does this mean tool in the sense of "Tool"server (a.k.a. the
coolest thing hosted using cloud services), or is it more general, including
gadgets, standalone apps, or any other piece of technology in the Wikimedia
ecosystem that's "cool"?

--
Bawolff

On Tuesday, July 16, 2019, Birgit Mueller  wrote:

> Dear all,
>
>
>
> we’re really happy to announce the first ever Coolest Tool Award!
>
>
>
> Tools play an essential role at Wikimedia, and so do the many volunteer
> developers who experiment with new ideas, develop & maintain local & global
> solutions and bridge workflow gaps for the Wikimedia communities.
>
>
>
> There are incredibly many great tools out there. It’s time to celebrate
> this & to make the work volunteer developers do more visible to everyone
> :-)
>
>
>
> The Coolest Tool Award ceremony will take place at Wikimania, as part of
> the official program:
>
> https://wikimania.wikimedia.org/wiki/2019:Technology_outreach_%26_innovation/Coolest_Tool_Award_2019
>
>
>
> The award is organized & selected by this year's Coolest Tool Academy. The
> plan is to award the greatest tools in a variety of categories: For
> example: “best Commons tool” to recognize the value of a tool for a
> specific project, or “best newcomer” to award the work of a fairly new
> developer.
>
>
>
> As no one can possibly know all the cool tools out there, we’re looking for
> some help & inspiration: Please point us to the tools that you think are
> great - for any reason you can think of!
>
>
>
> Please use this form:
> https://docs.google.com/forms/d/1Ip5Sb_CDvgO6IN2f51V3WjkVYU9Sa-nneX5PoY0sjoo/viewform
>
> to recommend tools by July 29, 2019.
>
> You can nominate as many tools as you want by filling out the form multiple
> times.
>
>
>
> This survey will be conducted via a third-party service, which may subject
> it to additional terms. For more information on privacy and data-handling,
> see the survey privacy statement:
> https://foundation.wikimedia.org/wiki/Coolest_Tool_Award_2019_Survey_Privacy_Statement
>
>
>
> Thank you very much for your ideas & recommendation(s)!
>
>
>
> We will continue to spread the word over the next 1-2 days, but if you get
> the chance, please feel welcome to share this information with others too!
>
>
>
> If you have any questions, please contact us at
> https://meta.wikimedia.org/wiki/Coolest_Tool_Award.
>
>
>
> Thanks :-)
>
> Birgit, for the Coolest Tool Academy 2019
>
>
>
> --
> Birgit Müller (she/her)
> Director of Technical Engagement
>
> Wikimedia Foundation 

Re: [Wikitech-l] Dealing with composer dependencies in early MediaWiki initialization

2019-06-26 Thread Brian Wolff
Another option is just removing the $wgServer back compat value.

The installer will automatically set $wgServer in LocalSettings.php. The
default value in DefaultSettings.php is mostly for compat with really old
installs before 1.16.

Allowing autodetection is a security vulnerability - albeit mostly
difficult to exploit. The primary method is via cache poisoning and then
either redirecting or otherwise tricking users about the fake domain. See
the original ticket https://phabricator.wikimedia.org/T30798 . Another
possibility is putting unsafe values in the Host header to try to get an
XSS (followed by cache poisoning so it's not just self-XSS). I'm unsure off
the top of my head what validation, if any, is done (I'm pretty sure it's
less strict than legal domains), so I'm not sure how practical that is.
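For illustration, here is a rough sketch of the kind of strict Host-header
validation that blunts this class of attack. This is hypothetical Python, not
MediaWiki's actual WebRequest code; the function name, regex, and whitelist
are made up for the example:

```python
import re

# Hypothetical sketch (NOT MediaWiki's actual validation): accept only
# Host headers shaped like a legal hostname, optionally with a port.
# Malformed values are rejected rather than echoed into $wgServer-style
# output, which blunts cache-poisoning / header-based XSS attempts.
_HOST_RE = re.compile(
    r"^[a-z0-9]([a-z0-9-]{0,62}[a-z0-9])?"      # first DNS label
    r"(\.[a-z0-9]([a-z0-9-]{0,62}[a-z0-9])?)*"  # further labels
    r"(:\d{1,5})?$",                            # optional port
    re.IGNORECASE,
)

def detect_server(host_header, allowed_hosts):
    """Return 'https://host' only for well-formed, whitelisted hosts."""
    if not host_header or not _HOST_RE.match(host_header):
        return None  # reject anything that isn't a plausible hostname
    host = host_header.split(":")[0].lower()
    if host not in allowed_hosts:
        return None  # never trust an unexpected Host header
    return "https://" + host_header.lower()
```

The syntax check alone is not enough - an attacker-controlled but
well-formed hostname is still poison in a shared cache - so the whitelist
comparison is the part doing the real security work.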

Anyway, 1.16 was a long time ago; put my vote down as: we should make a
breaking change and just throw an exception if $wgServer is not set in
LocalSettings.php.

--
Brian

P.S. People with access to security tasks may also find the Phab comment at
https://phabricator.wikimedia.org/T157426#3192740 interesting, where some of
the implications of $wgServer were discussed (note the task was primarily
about something else and is unfortunately still secret).


On Tuesday, June 25, 2019, Kunal Mehta  wrote:

> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA512
>
> Hi,
>
> I (with Reedy's help) recently started work on librarizing MediaWiki's
> IP class into a separate composer package (wikimedia/ip-utils[1]). The
> main motivation was so that the Parsoid PHP port could use it[2].
>
> However, I ran into an unexpected hitch[3], as it seems we're using
> the IP class before the composer autoloader is even intialized. Here's
> the basic initialization in Setup.php:
>
> - - AutoLoader.php (MediaWiki's)
> - - Defines.php
> - - DefaultSettings.php
>   - $wgServer = WebRequest::detectServer()
> - Calls IP::splitHostAndPort()
> - - GlobalFunctions.php
> - - vendor/autoload.php (composer's)
>
> My understanding is that composer's autoloader runs late so extensions
> registering themselves using it can add their stuff to the necessary
> globals.
>
> And we call WebRequest::detectServer() in DefaultSettings.php so that
> in LocalSettings.php people can use the value of $wgServer for other
> stuff.
>
> I see 3 main ways to move forward:
>
> 1. Move vendor/autoload.php earlier in Setup.php, potentially breaking
> extensions that still rely on composer autoloading for initialization.
> 2. Set $wgServer = false or something in DefaultSettings.php, and then
> fill it in later in Setup.php *after* the composer autoloader has been
> loaded, potentially breaking anyone relying on the value of $wgServer
> in LocalSettings.php.
> 3. (status quo) not librarize code that runs before composer
> autoloader initialization. :(
>
> Advice/input welcome.
>
> [1] https://packagist.org/packages/wikimedia/ip-utils
> [2]
> https://gerrit.wikimedia.org/g/mediawiki/services/parsoid/+/77064cfff7176493a2828bb4f95f397dfce7d659/src/Utils/Title.php#46
> [3] https://gerrit.wikimedia.org/r/c/mediawiki/core/+/519089/
>
> - -- Legoktm
> -BEGIN PGP SIGNATURE-
>
> iQIzBAEBCgAdFiEE2MtZ8F27ngU4xIGd8QX4EBsFJpsFAl0S1oQACgkQ8QX4EBsF
> Jpufrg/+J9RUUxRAtgJLEkyACE6GREis0eyEIZnWmMr3s9YpFPoqtWocFrUk6Wsn
> W7d9Oda/8CW0/d894gGMn8LWIj9oWq2gMPWzCVFpg8uu3r4967qxBp+ba29uMOJw
> Qpw6DhXtPvVAeUCy8P38Y5vM7TGmV+J1T5jDY21zimT1dRrJsI1KD+u/Ue3nYy/y
> B1ic3i7vJfhYErdhHgN98ETXfXOaDx4rgd2N7PLjVNx3IYCC8LNiR8wSLuydfdbk
> PLTT1bA2qi0h2wgcEr7Qtq9YstVotq8899rgKLtGDBwQi3qGNcdOgQGEMFDVfjfO
> CsiWocj6s4oc3ScVj+Eb9xtvIqhNx+oRbWE1vKd4TmtSdyzpv6xadV60tq5qNFEY
> I0cBDOWU5UFNHbvbyjK4dqIDEVhJ6LiEgLVBOj81U27s8mR4Dv/yFB3eac0ROk7p
> gaEeOjfhtVU558XfpEsmu1H05VJT3kXNxK8y0UQOjy11SErzsXv6vDzyzLDJM/W7
> WF0I4nyjeqVsBjLBN9li+5AnU3cAKVOCfZ+/aRYyg89Du//nJRjm+4lxnuPrGlaG
> ES/nVUnkDZ9Yc/xA1yacm3Ytx9hpoY1mIZgxxxveyeU1KsNXAZ2BOGA2T7kU4yUw
> Uyg+byYwI+1uVOjAVd3BInGV2R2/GmeIn9FOpthBaw8wcz0Y/8c=
> =tU4+
> -END PGP SIGNATURE-
>

Re: [Wikitech-l] 今週あなたを幸せにするものは何ですか? / What's making you happy this week? (Week of 16 June 2019)

2019-06-19 Thread Brian Wolff
I personally think these threads are fine (within reason) if they are
thankful about technical things and/or stuff connected to mediawiki.
Generic things about the wikimedia movement should stay on wikimedia-l; we
have multiple mailing lists with different scopes for a reason. In that
light, I consider the majority of your posts on this topic to be offtopic.

I would prefer that subject lines stay English. People use subject lines to
sort their mail into what's interesting to them. Random-language subjects
interfere with that.

On Wednesday, June 19, 2019, Pine W  wrote:

> Hi Andre,
>
> I think that whether these threads are welcome by Wikitech-l subscribers in
> general is a reasonable question, although I disagree with a number of
> elements of your description of these threads. I took a somewhat bold step
> in February 2018 of expanding the posting of these threads from Wikimedia-l
> to both Wikimedia-l and Wikitech-l, and I expressed a willingness to listen
> to objections; see
> https://lists.wikimedia.org/pipermail/wikitech-l/2018-February/089495.html
> .
> At that time, no one objected.
>
> Of the occasional feedback that I have received in public (more on
> Wikimedia-l than Wikitech-l) and private, 100% has been positive until now.
> However, I recognize that people's opinions may change over time. If a
> consensus emerges from Wikitech-l subscribers that these threads are not
> currently welcome on Wikitech-l then I will respect that and I will stop
> cross-posting the threads to Wikitech-l. However, if no such consensus
> emerges, then I suggest that you ignore these threads in the future.
>
> The purpose of these emails is certainly not to annoy people, and I regret
> if people find them to be more annoying than interesting, useful, or
> encouraging.
>
> I hope that people will contribute their own posts to these threads in
> response to the prompt, "What’s making you happy this week? You are welcome
> to comment in any language." Also, I would be happy to see a new person
> start these threads each week, and to see multiple people contribute their
> thoughts each week. However, if a new consensus emerges in against having
> these threads on Wikitech-l, then I will abide by that consensus.
>
> Pine
> ( https://meta.wikimedia.org/wiki/User:Pine )
>
>
> On Wed, Jun 19, 2019 at 11:45 AM Andre Klapper 
> wrote:
>
> > On Wed, 2019-06-19 at 07:17 +, Pine W wrote:
> > > While browsing Japanese Wikipedia, I found this
> >
> > As "What's making you happy this week" messages seem to contain no
> > questions, no expected interaction or followup on specific technical
> > topics, no announcements about specific technical topics, and often no
> > connection to the technical scope of wikitech-l@ (in the previous
> > message, 1 out of 4 items covered a technical aspect), but seem to be a
> > random link list of collected personal interest items:
> > Has it been considered to post such messages to a personal blog (or
> > such) instead, which might be a more suitable venue?
> >
> > andre
> > --
> > Andre Klapper | Bugwrangler / Developer Advocate
> > https://blogs.gnome.org/aklapper/
> >
> >
> >

Re: [Wikitech-l] My views on the proposed projects for GSoC '19

2019-01-27 Thread Brian Wolff
Hi Kaushik,

You are welcome to suggest your own projects for gsoc. If your interest is
in data science you may want to read up on some of the stuff that the
analytics team does.

Different areas have different programming language requirements. For
example, SRE often uses Python. I think the analytics stuff is OK with
Python, but I'm less familiar with that area personally.

Good luck,
Bawolff

On Sunday, January 27, 2019, K. Kaushik Reddy 
wrote:

> Hi people,
>
> This is Kaushik again. I have gone through the proposed GSoC'19 project
> ideas, they are all good.
> https://m.mediawiki.org/wiki/Google_Summer_of_Code/2019
>
>
> But , I am looking for something which can be done with python. I also
> think the above projects can be done with python befriended with  java
> script.( Just suggesting, no offense).
> I am really optimistic that wikimedia would propose some projects dealing
> with Data science or python for GSoC'19.
> I actually have many ideas regarding the above mentioned topics.
> I would bring them in a good form and would like to propose before you
> people soon.
>
>
> with regards,
> Kaushik.

Re: [Wikitech-l] Old tiles.wfmlabs.org maps services need maintainers to keep them alive after december 18th

2018-12-14 Thread Brian Wolff
That is the process for getting shell access on production [i.e. servers
directly running Wikipedia], which is rather different (and much harder)
than getting access to an abandoned tool.

--
brian

On Friday, December 14, 2018, Alexander Vassilevski <
alexan...@vassilevski.com> wrote:
> I created a task in phab to be added to NDA ldap, so I can sign the NDA
and get shell access:
> https://phabricator.wikimedia.org/T211996
> One of the subtasks (from
https://wikitech.wikimedia.org/wiki/Volunteer_NDA#Privileged_LDAP_or_shell_access
) on this is to get
>
> [ ] At least one comment of support from a Wikimedia Foundation employee,
explaining why it is a good idea to accept your request
> [ ] A comment of approval from one Wikimedia Foundation manager (usually
the manager of an employee supporting you).
> [ ] (Have someone with access double-check which mediawiki.org account
that the manager's Phabricator account is linked to, where the SUL account
was created, and how it was created on that wiki.)
>
> Can someone do this? If you need to see some qualifications I can try to
get some code of  mine to you from some old puppet stuff I've done and I
can show you my github, where you can see some of my code.
> If I'm making a rookie mistake here and this type of access is not
needed, let me know. I'm also sasheto +i on IRC #wikimedia-tech, so you can
message me there too, I check it from time to time.
> alexan...@vassilevski.com
>
> ‐‐‐ Original Message ‐‐‐
> On Thursday, 13 December 2018 12:33, Alex Monk  wrote:
>
>> You'll basically need to create a wikitech account (if you don't already
have one) and convince one of the existing project admins (listed at
https://tools.wmflabs.org/openstack-browser/project/maps) to add you.
>> Please use Puppet instead of manually setting up servers by hand.
>>
>> On Thu, 13 Dec 2018 at 20:21, Alexander Vassilevski <
alexan...@vassilevski.com> wrote:
>>
>>> I have some free time this December and I could manually create the new
vm's and replicate the configs/installs by hand on newly created Ubuntu 18
vm's ( after figuring out how the old ones are installed and configured and
if given access, of course .. ).
>>>
>>> Keep in mind that I'm new to wikimedia and don't know much about the
infrastructure or procedures, so what do I need to do to be given access
and volunteer to do this?
>>>
>>> alexan...@vassilevski.com
>>>
>>> ‐‐‐ Original Message ‐‐‐
>>> On Thursday, 13 December 2018 12:09, Johan Jönsson <
jjons...@wikimedia.org> wrote:
>>>
 On Thu, Dec 13, 2018 at 9:36 AM Johan Jönsson jjons...@wikimedia.org
 wrote:

 > On Thu, Dec 13, 2018 at 9:24 AM Derk-Jan Hartman <
 > [d.j.hartman+wmf...@gmail.com](mailto:d.j.hartman%2bwmf...@gmail.com)>
wrote:
 >
 > > Yeah might be wise to at least reach out to en.wp and de.wp. Or
maybe Tech
 > > News even ?
 >
 > "Hey folks, this might not work in the future" could be reason
enough to
 > include it in Tech News, yes.

 Included inhttps://meta.wikimedia.org/wiki/Tech/News/2018/51 which is
 going out to the wikis on Monday.

 //Johan Jönsson


>>>

Re: [Wikitech-l] +2 nomination for Alangi Derick (D3r1ck01) in mediawiki/*

2018-11-25 Thread Brian Wolff
I have closed this as successful.

Congratulations D3r1ck!

--
Brian
On Sat, Nov 17, 2018 at 11:36 PM Brian Wolff  wrote:
>
> I'd like to nominate D3r1ck01 for +2 rights in mediawiki/*. See
> https://phabricator.wikimedia.org/T209775 for details.
>
>
> Copying the contents of the ticket below:
>
> I'd like to nominate Alangi Derick (@D3r1ck01) for +2 on mediawiki/*.
> He's been around for a while now (First merged commit was to Echo in
> sept 2015), and has contributed to a large number of mediawiki related
> repositories. He has also been very active mentoring people as part of
> GCI. In particular, I think him having code review rights will be very
> helpful with his mentoring students in mentorship programs. I think
> this contributor has shown good judgment, and I trust him to know
> which things he is capable of reviewing and giving good reviews for
> those things.
>
> List of merged commits in gerrit:
> https://gerrit.wikimedia.org/r/#/q/owner:%22D3r1ck01+%253Calangiderick%2540gmail.com%253E%22+status:merged,25
> List of reviews:
> https://gerrit.wikimedia.org/r/#/q/reviewedby:%22D3r1ck01+%253Calangiderick%2540gmail.com%253E%22,50
>
>
> ---
> Brian


[Wikitech-l] +2 nomination for Alangi Derick (D3r1ck01) in mediawiki/*

2018-11-17 Thread Brian Wolff
I'd like to nominate D3r1ck01 for +2 rights in mediawiki/*. See
https://phabricator.wikimedia.org/T209775 for details.


Copying the contents of the ticket below:

I'd like to nominate Alangi Derick (@D3r1ck01) for +2 on mediawiki/*.
He's been around for a while now (First merged commit was to Echo in
sept 2015), and has contributed to a large number of mediawiki related
repositories. He has also been very active mentoring people as part of
GCI. In particular, I think him having code review rights will be very
helpful with his mentoring students in mentorship programs. I think
this contributor has shown good judgment, and I trust him to know
which things he is capable of reviewing and giving good reviews for
those things.

List of merged commits in gerrit:
https://gerrit.wikimedia.org/r/#/q/owner:%22D3r1ck01+%253Calangiderick%2540gmail.com%253E%22+status:merged,25
List of reviews:
https://gerrit.wikimedia.org/r/#/q/reviewedby:%22D3r1ck01+%253Calangiderick%2540gmail.com%253E%22,50


---
Brian


Re: [Wikitech-l] Looking for maintainers: RelatedSites

2018-11-05 Thread Brian Wolff
Lots of extensions are maintainerless. I don't see why that implies it will
eventually have to be archived. Perhaps the day will come when it no longer
works with modern MediaWiki, but until that day comes, why not leave it be?

--
brian

On Monday, November 5, 2018, Max Semenik  wrote:
> As RelatedSites extension has been undeployed from Wikimedia projects
> today[1], a question arises: who will maintain it in the future? Unless
> someone volunteers to maintain it, we will probably eventually have to
> archive the repository. According to WikiApiary[2], besides WMF it's used
> only on vojnaenciklopedija.com (ping Zoranzoki21).
>
> 
> [1] https://phabricator.wikimedia.org/T128326
> [2] https://wikiapiary.com/wiki/Extension:RelatedSites
> --
> Best regards,
> Max Semenik ([[User:MaxSem]])

Re: [Wikitech-l] problematic use of "Declined" in Phabricator

2018-10-02 Thread Brian Wolff
Declined = WONTFIX (e.g. if some talented developer wrote a patch, and the
patch was perfect, you would still -2 it because the functionality is not
wanted/stupid/etc)

Invalid = not a real bug. That should include things like spam, or stuff
where the reporter is mistaken (can't reproduce, or if someone, say, reports
a SharePoint bug)

I think the defining difference is its possible to write a patch for a
declined bug, even though it would be rejected, where an invalid bug by
definition is unfixable.

Just my 2 cents, others may have different definitions.

--
Brian

P.S. I've never liked the "need volunteer" term for lowest priority - I have
always felt it had offensive implications, as if volunteer time isn't as
important, so they get the low-priority bugs.

On Tuesday, October 2, 2018, Joe Matazzoni  wrote:
> I agree with Amir’s understanding. "Declined” is basically for ideas
whose proper timing is never.  Valid ideas that we just aren’t going to
work on any time soon should go in a backlog or freezer or some such, where
they can wait until some future project or other development makes them
relevant (at least theoretically).
>
> All of which does raise a slightly different question: I am much less
clear on what the exact difference is between “Invalid” and “Declined.”
Thoughts?
>
> Best,
> Joe
> _
>
> Joe Matazzoni
> Product Manager, Collaboration
> Wikimedia Foundation, San Francisco
> mobile 202.744.7910
> jmatazz...@wikimedia.org
>
> "Imagine a world in which every single human being can freely share in
the sum of all knowledge."
>
>
>
>
>> On Oct 2, 2018, at 9:31 AM, Amir E. Aharoni 
wrote:
>>
>> Hi,
>>
>> I sometimes see WMF developers and product managers marking tasks as
>> "Declined" with comments such as these:
>> * "No resources for it in (team name)"
>> * "We won't have the resources to work on this anytime soon."
>> * "I do not plan to work on this any time soon."
>>
>> Can we perhaps agree that the "Declined" status shouldn't be used like
this?
>>
>> "Declined" should be valid when:
>> * The component is no longer maintained (this is often done as
>> mass-declining).
>> * A product manager, a developer, or any other sensible stakeholder
thinks
>> that doing the task as proposed is a bad idea. There are also variants of
>> this:
>> * The person who filed the tasks misunderstood what the software
component
>> is supposed to do and had wrong expectations.
>> * The person who filed the tasks identified a real problem, but another
>> task proposes a better solution.
>>
>> It's quite possible that some people will disagree with the decision to
>> mark a particular task as "Declined", but the reasons above are
legitimate
>> explanations.
>>
>> However, if the task suggests a valid idea, but the reason for declining
is
>> that a team or a person doesn't plan to work on it because of lack of
>> resources or different near-term priorities, it's quite problematic to
mark
>> it as Declined.
>>
>> It's possible to reopen tasks, of course, but nevertheless "Declined"
gives
>> a somewhat permanent feeling, and may cause good ideas to get lost.
>>
>> So can we perhaps decide that such tasks should just remain Open? Maybe
>> with a Lowest priority, maybe in something like a "Freezer" or "Long
term"
>> or "Volunteer needed" column on a project workboard, but nevertheless
Open?
>>
>> --
>> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
>> http://aharoni.wordpress.com
>> ‪“We're living in pieces,
>> I want to live in peace.” – T. Moore‬

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-14 Thread Brian Wolff
So we are going to magically assume that somehow this block is going to
change McBride's behaviour, when it took a 100-message email thread before
he even found out the reason he was blocked (which differs from the implied
reason in the email, which was sent to an email account he usually doesn't
check)?

--
bawolff


On Tuesday, August 14, 2018, Amir Ladsgroup  wrote:
> Brian, that's actually exactly how Wikipedia operates, as an admin in
> Wikipedia serving for more than 9.5 years. The only difference is that
it's
> not punitive, and I don't think this ban was also punitive either. The ban
> is made to prevent further damage.
>
> Best
>
> On Tue, Aug 14, 2018, 22:23 Brian Wolff  wrote:
>
>> Given that many of our users are from Wikipedia, and as far as I understand
>> (I am not a Wikipedian), on Wikipedia, using increasing-length blocks as
>> a punitive punishment for rule infractions isn't allowed, I would guess
>> many of our community don't see it as valid to block people temporarily just
>> because the warnings aren't working out.
>>
>>
>> --
>> bawolff
>> On Tuesday, August 14, 2018, Amir Ladsgroup  wrote:
>> > That's very valid but you don't see the CoCC bans anyone who makes an
>> > unconstructive or angry comment. The problem here happens when it
happens
>> > too often from one person. When a pattern emerges. Do you agree that
when
>> > it's a norm for one person and warnings are not working out, the option
>> is
>> > to ban to show this sort of behavior is not tolerated?
>> >
>> > One hard part of these cases is that people see tip of an iceberg, they
>> > don't see number of reports, previous reports and number of people who
>> the
>> > user made uncomfortable so much that they bothered to write a report
>> about
>> > the user for different comments and actions. That's one thing that
shows
>> > the committee that it's a pattern and not a one-time thing.
>> >
>> > Best
>> >
>> >
>> >
>> > On Tue, Aug 14, 2018, 21:49 Isarra Yos  wrote:
>> >
>> >> Expecting every single comment to specifically move things forward
>> >> seems... a bit excessive, frankly. Not everyone is going to have the
>> >> vocabulary to properly express themselves, let alone the skill to
fully
>> >> explain exactly what the issues are, why they are, how to move
forward,
>> >> or whatever. And even then, I would argue that having input that isn't
>> >> directly doing any of this can still be useful in indicating to others
>> >> who can, that such might indeed be in order, that there is indeed
>> >> sufficient interest to merit the effort, or sufficient confusion that
>> >> there might be more issue than immediately met the eye.
>> >>
>> >> A wtf from one person can help to get others involved to actually
>> >> clarify, or ask followup questions, or what have you. It's not off
>> topic.
>> >>
>> >> -I
>> >>
>> >> On 14/08/18 19:41, Amir Ladsgroup wrote:
>> >> > Hey Petr,
>> >> > We have discussed this before in the thread and I and several other
>> >> people
>> >> > said it's a straw man.
>> >> >
>> >> > The problem is not the WTF or "What the fuck" and as I said before
the
>> >> mere
>> >> > use of profanity is not forbidden by the CoC. What's forbidden is
>> >> "Harming
>> >> > the discussion or community with methods such as sustained
disruption,
>> >> > interruption, or blocking of community collaboration (i.e.
>> trolling).".
>> >> > [1]  When someone does something in phabricator and you *just*
comment
>> >> > "WTF", you're not moving the discussion forward, you're not adding
any
>> >> > value, you're not saying what exactly is wrong or try to reach a
>> >> consensus.
>> >> > Compare this with later comments made, for example:
>> >> > https://phabricator.wikimedia.org/T200742#4502463
>> >> >
>> >> > I hope all of this helps for understanding what's wrong here.
>> >> >
>> >> > [1]: https://www.mediawiki.org/wiki/Code_of_Conduct
>> >> > Best
>> >> >
>> >> > On Tue, Aug 14, 2018 at 9:29 PM Petr Bena 
wrote:
>> >> >
>> >> >> I am OK if people who are attacking others are somehow informed
that
>> >> >> this is not acceptable and 

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-14 Thread Brian Wolff
Given that many of our users are from Wikipedia, and as far as I understand
(I am not a Wikipedian), on Wikipedia, using increasing-length blocks as
a punitive punishment for rule infractions isn't allowed, I would guess
many of our community don't see it as valid to block people temporarily just
because the warnings aren't working out.


--
bawolff
On Tuesday, August 14, 2018, Amir Ladsgroup  wrote:
> That's very valid but you don't see the CoCC bans anyone who makes an
> unconstructive or angry comment. The problem here happens when it happens
> too often from one person. When a pattern emerges. Do you agree that when
> it's a norm for one person and warnings are not working out, the option is
> to ban to show this sort of behavior is not tolerated?
>
> One hard part of these cases is that people see tip of an iceberg, they
> don't see number of reports, previous reports and number of people who the
> user made uncomfortable so much that they bothered to write a report about
> the user for different comments and actions. That's one thing that shows
> the committee that it's a pattern and not a one-time thing.
>
> Best
>
>
>
> On Tue, Aug 14, 2018, 21:49 Isarra Yos  wrote:
>
>> Expecting every single comment to specifically move things forward
>> seems... a bit excessive, frankly. Not everyone is going to have the
>> vocabulary to properly express themselves, let alone the skill to fully
>> explain exactly what the issues are, why they are, how to move forward,
>> or whatever. And even then, I would argue that having input that isn't
>> directly doing any of this can still be useful to indicating to others
>> that can that such might indeed be in order, that there is indeed
>> sufficient interest to merit the effort, or sufficient confusion that
>> there might be more issue than immediately met the eye.
>>
>> A wtf from one person can help to get others involved to actually
>> clarify, or ask followup questions, or what have you. It's not off topic.
>>
>> -I
>>
>> On 14/08/18 19:41, Amir Ladsgroup wrote:
>> > Hey Petr,
>> > We have discussed this before in the thread and I and several other
>> people
>> > said it's a straw man.
>> >
>> > The problem is not the WTF or "What the fuck" and as I said before the
>> mere
>> > use of profanity is not forbidden by the CoC. What's forbidden is
>> "Harming
>> > the discussion or community with methods such as sustained disruption,
>> > interruption, or blocking of community collaboration (i.e. trolling).".
>> > [1]  When someone does something in phabricator and you *just* comment
>> > "WTF", you're not moving the discussion forward, you're not adding any
>> > value, you're not saying what exactly is wrong or try to reach a
>> consensus.
>> > Compare this with later comments made, for example:
>> > https://phabricator.wikimedia.org/T200742#4502463
>> >
>> > I hope all of this helps for understanding what's wrong here.
>> >
>> > [1]: https://www.mediawiki.org/wiki/Code_of_Conduct
>> > Best
>> >
>> > On Tue, Aug 14, 2018 at 9:29 PM Petr Bena  wrote:
>> >
>> >> I am OK if people who are attacking others are somehow informed that
>> >> this is not acceptable and taught how to properly behave, and if they
>> >> continue that, maybe some "preventive" actions could be taken, but is
>> >> that what really happened?
>> >>
>> >> The comment by MZMcBride was censored, so almost nobody can really see
>> >> what it was and from almost all mails mentioning the content here it
>> >> appears he said "what the fuck" or WTF. I can't really think of any
>> >> language construct where this is so offensive it merits instant ban +
>> >> removal of content.
>> >>
>> >> I don't think we need /any/ language policy in a bug tracker. If
>> >> someone says "this bug sucks old donkey's " it may sounds a bit
>> >> silly, but there isn't really any harm done. If you say "Jimbo, you
>> >> are a f retard, and all your code stinks" then that's a problem,
>> >> but I have serious doubts that's what happened. And the problem is not
>> >> a language, but personal attack itself.
>> >>
>> >> If someone is causing problems LET THEM KNOW and talk to them. Banning
>> >> someone instantly is worst possible thing you can do. You may think
>> >> our community is large enough already so that we can set up this kind
>> >> of strict and annoying policies and rules, but I guarantee you, it's
>> >> not. We have so many open bugs in phabricator that every user could
>> >> take hundreds of them... We don't need to drive active developers away
>> >> by giving them bans that are hardly justified.
>> >>
>> >> P.S. if someone saying "WTF" is really giving you creeps, I seriously
>> >> recommend you to try to develop a bit thicker skin, even if we build
>> >> an "Utopia" as someone mentioned here, it's gonna be practical for
>> >> interactions in real world, which is not always friendly and nice. And
>> >> randomly banning people just for saying WTF, with some cryptic
>> >> explanation, seems more 1984 style 

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-14 Thread Brian Wolff
Thanks, Amir, for clarifying this. This is the first I remember hearing that
this was the reason for the sanctions against MZMcBride, and it differs
significantly from what I had assumed the reason was.

However, MZMcBride has also claimed his comment was made in exasperation
after facing the same breach of the CoC you have cited, from varnent. Given
that there is a narrative going around that the CoC is unfairly biased in
favour of staff, would you mind sharing what deliberations took place that
resulted in sanctions against only one of the participants in a dispute
where both participants are alleged to have committed the same fault? To be
clear, I'm not necessarily suggesting (nor am I necessarily suggesting the
converse) that the CoC is wrong in this - only that full disclosure of the
rationale seems like the only way to heal this rift that has opened up.

Thank you
--
Bawolff

On Tuesday, August 14, 2018, Amir Ladsgroup  wrote:
> Hey Petr,
> We have discussed this before in the thread and I and several other people
> said it's a straw man.
>
> The problem is not the WTF or "What the fuck" and as I said before the
mere
> use of profanity is not forbidden by the CoC. What's forbidden is "Harming
> the discussion or community with methods such as sustained disruption,
> interruption, or blocking of community collaboration (i.e. trolling).".
> [1]  When someone does something in phabricator and you *just* comment
> "WTF", you're not moving the discussion forward, you're not adding any
> value, you're not saying what exactly is wrong or try to reach a
consensus.
> Compare this with later comments made, for example:
> https://phabricator.wikimedia.org/T200742#4502463
>
> I hope all of this helps for understanding what's wrong here.
>
> [1]: https://www.mediawiki.org/wiki/Code_of_Conduct
> Best
>
> On Tue, Aug 14, 2018 at 9:29 PM Petr Bena  wrote:
>
>> I am OK if people who are attacking others are somehow informed that
>> this is not acceptable and taught how to properly behave, and if they
>> continue that, maybe some "preventive" actions could be taken, but is
>> that what really happened?
>>
>> The comment by MZMcBride was censored, so almost nobody can really see
>> what it was and from almost all mails mentioning the content here it
>> appears he said "what the fuck" or WTF. I can't really think of any
>> language construct where this is so offensive it merits instant ban +
>> removal of content.
>>
>> I don't think we need /any/ language policy in a bug tracker. If
>> someone says "this bug sucks old donkey's " it may sounds a bit
>> silly, but there isn't really any harm done. If you say "Jimbo, you
>> are a f retard, and all your code stinks" then that's a problem,
>> but I have serious doubts that's what happened. And the problem is not
>> a language, but personal attack itself.
>>
>> If someone is causing problems LET THEM KNOW and talk to them. Banning
>> someone instantly is worst possible thing you can do. You may think
>> our community is large enough already so that we can set up this kind
>> of strict and annoying policies and rules, but I guarantee you, it's
>> not. We have so many open bugs in phabricator that every user could
>> take hundreds of them... We don't need to drive active developers away
>> by giving them bans that are hardly justified.
>>
>> P.S. if someone saying "WTF" is really giving you creeps, I seriously
>> recommend you to try to develop a bit thicker skin, even if we build
>> an "Utopia" as someone mentioned here, it's gonna be practical for
>> interactions in real world, which is not always friendly and nice. And
>> randomly banning people just for saying WTF, with some cryptic
>> explanation, seems more 1984 style Dystopia to me...
>>
>> On Tue, Aug 14, 2018 at 4:08 PM, David Barratt 
>> wrote:
>> >>
>> >> Again, this isn't enwiki, but there would be a large mob gathering at
>> the
>> >> administrators' doorstep on enwiki for a block without that context
and
>> >> backstory.
>> >>
>> >
>> > That seems like really toxic behavior.
>> >
>> > On Tue, Aug 14, 2018 at 6:27 AM George Herbert <
george.herb...@gmail.com
>> >
>> > wrote:
>> >
>> >> I keep seeing "abusers" and I still haven't seen the evidence of the
>> >> alleged long term abuse pattern.
>> >>
>> >> Again, this isn't enwiki, but there would be a large mob gathering at
>> the
>> >> administrators' doorstep on enwiki for a block without that context
and
>> >> backstory.  That's not exactly the standard here, but ... would
someone
>> >> just answer the question?  What happened leading up to this to justify
>> the
>> >> block?  If it's that well known, you can document it.
>> >>
>> >> On Tue, Aug 14, 2018 at 12:18 AM, Adam Wight 
>> wrote:
>> >>
>> >> > Hi Petr,
>> >> >
>> >> > Nobody is language policing, this is about preventing abusive
behavior
>> >> and
>> >> > creating an inviting environment where volunteers and staff don't
>> have to
>> >> > waste time with emotional processing of traumatic interactions.
>> >> 

Re: [Wikitech-l] Making two factor auth less annoying

2018-08-13 Thread Brian Wolff
Since there are two people in this thread complaining, I suspect it's not
that obscure - but this is also the first I have ever heard of it.
Definitely something we need to track down.

--
Brian
On Monday, August 13, 2018, Daniel Kinzler 
wrote:
> Am 13.08.2018 um 07:34 schrieb Gergo Tisza:
>> Two-factor authentication does not affect how the session works. If you
>> check "Remember me", the login will last for 180 days, whether you use
>> two-factor authentication or not.
>
> Yea, works fine for me - and this is the first time I hear people
complain that
> they constantly have to log in again with 2fa. This certainly isn't
intentional.
> Sounds like a bug that only affacts a few people... or are people so used
to
> pain and suffering that so few complain about it?
>
> -- dnaiel
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-08 Thread Brian Wolff
Wikimedia-l is not a technical mailing list.

That said, I personally think that any sort of effective CoC would have to
take actions in other spaces into account when it is about a matter that is
in CoC jurisdiction (otherwise harassment would just move off-wiki). The
more concerning part to me is that the rationale email to MZMcBride did not
mention this. If this was indeed part of the reason, then the user should be
told that.

More concerning still, it seems there have been multiple contradictory
opinions in this thread on what MZMcBride's offense was. How can he fix his
faults if nobody seems to agree on what they are? How can other users avoid
falling into the same trap if apparently our norms of behaviour are so
underspecified that, even when provided with the email statement from the
CoC committee about why he was blocked, people have differing opinions on
what he actually did wrong?

--
brian
On Wednesday, August 8, 2018, Max Semenik  wrote:
> On Wed, Aug 8, 2018 at 5:08 PM, MZMcBride  wrote:
>>
>> The wikimedia-l mailing list is very specifically not within the purview
>> of the mediawiki.org "Code of Conduct" or its associated committee.
>
>
> CoC very explicitly states that it applies to "technical mailing lists
> <
https://meta.wikimedia.org/wiki/Mailing_lists/Overview#MediaWiki_and_technical
>
> ".
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] My Phabricator account has been disabled

2018-08-08 Thread Brian Wolff
Well, what are we arguing about then? I think I have lost track.

--
brian

On Wednesday, August 8, 2018, Ryan Kaldari  wrote:
> It's possible to highlight abuses while still being respectful and
> collegial. No one is seriously arguing that criticism should be banned (or
> the word "fuck").
>
> On Wed, Aug 8, 2018 at 3:50 PM MZMcBride  wrote:
>
>> Ori Livneh wrote:
>> >MZMcBride has a very good ear for grievances, and he knows how to use
his
>> >considerable social clout to draw attention to them, and then use words
>> >as a kind of lightning-rod for stoking outrage and focusing it on
>> >particular targets.
>>
>> I'm going to paraphrase what you're writing here: I spend some of my
>> accumulated social capital calling out or highlighting abuses by and
>> corruption within Wikimedia Foundation Inc. And you think that's _bad_?
>>
>> MZMcBride
>>
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Creation of new mediawiki/extensions/** group in gerrit

2018-07-24 Thread Brian Wolff
To clarify, would the proposed new group be for all extensions, or just
non-WMF-deployed extensions (and, I suppose, allow others to opt out as well
if they so desire)?

--
brian

On Tuesday, July 24, 2018, Jay prakash <0freerunn...@gmail.com> wrote:
> Hello Everyone,
>
> This is Jay. I have some questions and want to know that it is possible to
> apply on the ground.
>
> == Problems ==
> * I work on the very trivial tasks. and even after working on the very
> trivial tasks, sometimes my patch does not merge even after 6 months has
> passed. Recently I have Abandoned my some patches due to be not merged
> after 6 months has passed.[1][2]
> * Recently there are many tasks created on phab to Archive the
> extensions. Those
> extensions have not maintained in the last 5 years.
> * And also We have lack of peoples for reviewing those extension's
patches,
> which have not deployed on Wikimedia or any other big wiki.
>
> == Potential Solution ==
> The potential solution is to create a new group on the Gerrit with +2
> rights. This group will manage those extensions which have not maintained
> by the owner in last 1 year (if not, Maybe 2 years). This will save the
> extension from being unmaintained. There are many users who have good
> knowledge and They review patches like Mainframe98,[3] MGChecker,  and
> others.
>
> So my question is why does we not create this type of group? so that users
> can managed those extensions which have not maintained in last 1 year.
>
> [1]
>
https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/LanguageTag/+/406053/
> [2]
https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/Arrays/+/405057/
> [3]
>
https://gerrit.wikimedia.org/r/#/q/reviewedby:%22Mainframe98+%253Ck.s.werf%2540hotmail.com%253E%22
>
> Thanks
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] +2 nomination for MarcoAurelio in mediawiki/*

2018-06-20 Thread Brian Wolff
Wow.

That is kind of a sad statistic :(

--
brian

On Wednesday, June 20, 2018, Kunal Mehta  wrote:
> I've closed the request as successful. Congrats to MarcoAurelio for
> being the first new volunteer +2'er since 2016!
>
> - -- Legoktm
>
> On 06/13/2018 10:25 AM, Kunal Mehta wrote:
>> Hi,
>>
>> I've filed , nominating
>> MarcoAurelio for +2 access in mediawiki/.
>>
>> -- Legoktm
>>
>> ___ Wikitech-l mailing
>> list Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Multimedia] Video output changing to WebM VP9/Opus soon

2018-06-17 Thread Brian Wolff
Woo!

Good to see us moving forward to newer codecs.

What is the expected impact on transcode time/memory usage? (As in, will
large videos still transcode fine or would they potentially hit limits
sooner?)

--
brian

On Sunday, June 17, 2018, Brion Vibber  wrote:
> In the next couple weeks I'm planning to start switching our video
transcode output from WebM VP8/Vorbis to the newer WebM VP9/Opus profile,
which saves us about 38% on file size and bandwidth while retaining the
same quality.
> This will not affect what kinds of files you upload; only the scaled
transcoded output files used for playback will change. All modern browsers
that support VP8 support VP9 as well, and our player shim for Safari and IE
will continue to work.
> All the details:
https://www.mediawiki.org/wiki/Extension:TimedMediaHandler/VP9_transition
> Comments and questions welcome!
> -- brion

Re: [Wikitech-l] Gerrit as a shared community space

2018-06-09 Thread Brian Wolff
Taking a step back here...

I agree with you in principle...but

Shared spaces imply that disputes will occasionally arise as to what belongs
in a repo. If we don't have a fair method of resolving such disputes (/my
way or the highway/ is not fair), then this model is not going to work.

--
Brian

On Saturday, June 9, 2018, Brion Vibber  wrote:
> I'd just like to apologize for dragging the other thread into this one and
> being overly personal and failing to assume good faith.
>
> That was a failing on my part, and not good practice.
>
> Please if you respond further to this thread, treat only the narrow issue
> of ownership / maintainership expectations and where/how we should be more
> clear on it.
>
> Further discussion on people's motives about the code of conduct will
> likely not be productive for anyone on this thread.
>
> My apologies to all; I should do better. You all deserve better.
>
> -- brion
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Can/should my extensions be deleted from the Wikimedia Git repository?

2018-06-08 Thread Brian Wolff
On Friday, June 8, 2018, Chris Koerner wrote:
> [snip]
> There are voices not present in this very public conversation. I have
> been approached by a few that do not feel comfortable participating
> here. I don't want to see anyone's contributions deleted. I also don't
> want to see an exception made in this particular case because we as a
> community haven't written it down somewhere.
>
[snip]

Fwiw, I also had a discussion last night with someone who hasn't
participated in this discussion as of yet, but who stated that incidents
like this make them want to host their extensions elsewhere - except that
they are not willing to pass up translatewiki integration. So the "people
not commenting" objection goes both ways.

I don't think we should consider hearsay (particularly the type where we
don't even specify the source) in discussions of these types. For one,
regardless of what view you hold, you can probably always find someone on
the internet who agrees with you. Second, if the people don't participate,
we cannot evaluate their arguments on their merits or count them. If it's
not for consensus seeking (evaluating arguments on merit) or for counting
(voting), what's the point?

--
brian

Re: [Wikitech-l] Can/should my extensions be deleted from the Wikimedia Git repository?

2018-06-08 Thread Brian Wolff
On Friday, June 8, 2018, Chris Koerner  wrote:
>> I for one think that requiring a specific filesystem structure or notice
in
>> a git repo is quite far afield from the sorts of things that CoC is
>> designed to deal with.
>
> I agree. I do think that as a community of practice we have many
> unwritten rules and numerous expectations of how we work together. We
> don't explicitly define the expectation of a README.MD file in repos
> either.[0] It's a best practice and cultural expectation in our spaces
> to include one. The code works the same with or with out it.
>
> Yeah, sure a coc.md isn’t “the same”, but both are expected as
> something we do as a community. If we need to write that down
> somewhere so there's no repeat confusion on if it's expected or not,
> that seems like a good compromise. However, I'd like to think we don't
> have to define everything, but ¯\_(ツ)_/¯ .
>
> [0] I'm waiting for someone to contradict me on this risky comparison.
> :) I could not find anything explicit in
> https://www.mediawiki.org/wiki/Gerrit/New_repositories or
> https://www.mediawiki.org/wiki/Manual:Coding_conventions
>
> Yours,
> Chris K.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

The issue is, it seems like this is not something we "do" as a community:

* There was a previous discussion about requiring CoC.md. There was a lot
of arguing and no clear "winner", but a very significant portion of the
opinions held that CoC.md was highly recommended but not required if the
extension maintainer didn't want it - thus supporting Yaron's position.
https://phabricator.wikimedia.org/T165540#3358929
* There is a general community norm that overriding a -2 by a maintainer of
a component is an extraordinary action, even more so when the person doing
it is not a maintainer of the extension. This situation is nowhere near
clear-cut enough to justify that without discussion.
* Generally speaking, it's usually considered poor form to have an argument
about something, lose the argument (or at least not win it), wait a year
until people forget about it, and then try to do the exact same thing.

--
brian

Re: [Wikitech-l] Can/should my extensions be deleted from the Wikimedia Git repository?

2018-06-07 Thread Brian Wolff
I for one think that requiring a specific filesystem structure or notice in
a git repo is quite far afield from the sorts of things the CoC is designed
to deal with.

--
Brian

On Thursday, June 7, 2018, Chris Koerner  wrote:
> “Please just assume for the sake of this discussion that (a) I'm willing
to abide by the rules of the Code of Conduct, and (b) I don't want the
CODE_OF_CONDUCT.md file in my extensions.”
> Ok, hear me out here. What if I told you those two things are
incompatible? That abiding by the community agreements requires the file as
an explicit declaration of said agreement. That is to say, if we had a
discussion about amending the CoC to be explicit about this expectation you
wouldn’t have issues with including it? Or at least you’d be OK with it?
>
>
>
> Yours,
> Chris K.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC discussion: the future of rev_parent_id

2018-06-04 Thread Brian Wolff
On Monday, June 4, 2018, Daniel Kinzler  wrote:
> Am 04.06.2018 um 20:54 schrieb Leon Ziemba:
>>> ... the size differences shown on the history and contributions pages,
>> which is the only thing that rev_parent_id is used for
>>
>> This may be true in MediaWiki but not so much for external tools. I just
>> wanted to preemptively say this. I'll be joining the IRC discussion to
>> share more :)
>
> Excellent, thank you! It would be particularly interesting to know what
> assumptions you make about the semantics of rev_parent_id. E.g. there are
three
> revisions, A, B, and C, and revision B gets romoved - what should
revision C's
> parent be?
>
> Similarly, when revision X gets imported and inserted between A and B,
what
> should revision B's parent be?
>
> --
> Daniel Kinzler
> Principal Platform Engineer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I think in such a case, revision C's parent should be null or 0. Similarly
for revision X.

I don't view the parent as the revision that came first, but as the revision
that was edited to make this revision (i.e. where the current revision was
"forked" off from).
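A minimal sketch of that semantic (illustrative Python only - these names
are not MediaWiki's actual schema or code): when a revision is deleted, any
revision that was forked off it gets parent 0 ("unknown/none") rather than
being re-linked to an earlier revision.

```python
def delete_revision(parents, deleted_id):
    """Return a new child -> parent map after `deleted_id` is removed.

    Revisions whose parent was the deleted one get parent 0, since the
    revision they were actually edited from no longer exists.
    """
    return {
        rev: (0 if parent == deleted_id else parent)
        for rev, parent in parents.items()
        if rev != deleted_id
    }

# Revisions A <- B <- C, represented as child -> parent.
parents = {"A": 0, "B": "A", "C": "B"}

# Deleting B: C's parent becomes 0, not A.
print(delete_revision(parents, "B"))  # {'A': 0, 'C': 0}
```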
--
Brian

Re: [Wikitech-l] [nominations needed] Wikimedia Technical Conference

2018-06-01 Thread Brian Wolff
The survey cannot be opened (it gives a permission error - presumably it
only works if you are logged in to a WMF Google account).

Additionally, I wish there was more information about the basis on which we
are supposed to be nominating people. It's not even clear what category of
person we are nominating for: people with skills (what type of skills?) who
will be "selected by the committee", or are we nominating for the 17
volunteers? (What type of volunteers? Wikipedians (aka users)? Volunteer
devs? How do third parties fit into that?)

--
Brian

On Friday, June 1, 2018, Rachel Farrand  wrote:
> Hello!
>
> As per the email below we are now opening up the nomination process for
the
> Wikimedia Technical Conference (WMTechConf), to be held in Portland, OR,
> USA on October 22-25, 2018.
>
> *Please fill out the survey using this link to nominate yourself or
someone
> else to attend: *https://goo.gl/forms/MKF682BE1OVI0pe63 This nomination
> form will remain open between June 1 and June 14, 2018.
>
> This survey is conducted via a third-party service, which may make it
> subject to additional terms. For more information on privacy and
> data-handling, see this survey privacy statement: https://
>
wikimediafoundation.org/wiki/Wikimedia_Technical_Conference_Survey_Privacy_
> Statement
>
> *If you have any questions, please post them on the event's talk page
> .
*
> Thanks!
>
> The Wikimedia Technical Conference Team
>
>
> On Mon, May 21, 2018 at 3:47 PM, Deborah Tankersley <
> dtankers...@wikimedia.org> wrote:
>
>> *Hello,We recently announced the new Wikimedia Technical Conference
>> (TechConf) during the closing session of the Barcelona Hackathon on May
20,
>> 2018. We are sending this email to give an update on the planning and
>> organization, and to also let everyone know how the nomination process
will
>> work for those interested in attending. The Wikimedia Technical
Conference
>> will take place in Portland, OR, USA on October 22-25, 2018. And, as
>> mentioned in previous emails [1][2] and on the wiki page [3], this
>> conference will be focused on the cross-departmental program called
>> Platform Evolution. We will be providing more information and context as
we
>> go along in the process.For this conference, we are looking for diverse
>> stakeholders, perspectives, and experiences that will help us to make
>> informed decisions for the future evolution of the platform. We need
people
>> who can create and architect solutions, as well as those who actually
make
>> decisions on funding and prioritization for the projects.Later this week,
>> we will send out a form to provide more detailed information on the
>> nomination process and how to nominate people (or it can be yourself) to
>> attend this conference, along with the skills, experiences, and/or
>> backgrounds that we are looking for. Due to the time needed for visa
>> application and other constraints, the deadline for nominations will be
>> June 8th. Please make sure that you don’t miss the deadline!If you have
any
>> questions, please post them on the talk page [4][1]
>> https://lists.wikimedia.org/pipermail/mediawiki-l/2018-April/047367.html
>> 
>> [2] https://lists.wikimedia.org/pipermail/wikitech-l/2018-
>> April/089738.html
>> 
>> [3] https://mediawiki.org/wiki/Wikimedia_Technical_Conference/2018
>>  [4]
>> https://www.mediawiki.org/wiki/Talk:Wikimedia_Technical_Conference/2018
>> 
>> *
>>
>> *Cheers from the Program Committee:*
>> *Kate, Corey, Joaquin, Greg, Birgit and TheDJ*
>>
>> --
>>
>> deb tankersley
>>
>> Program Manager, Engineering
>>
>> Wikimedia Foundation
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
>
> --
> Rachel Farrand
> Events Program Manager
> Technical Collaboration Team
> Wikimedia Foundation
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Recent Account hijacking activities

2018-05-16 Thread Brian Wolff
Forcing people to change passwords regularly does tend to reduce overall
password security, because people get tired of the process and start to pick
really poor passwords. That doesn't mean you should never change your
password - changing it from time to time, as long as it's a strong password
that you do not use anywhere else, does improve your security. That said, in
terms of user security, the most important factor is not using the same
password on multiple websites. After that comes using a strong password (for
example, using a password manager so your password is randomly generated).
Then comes 2FA, if available for your account. I would rank changing your
password from time to time a distant fourth - still a good idea, but the
first two things are much more important in my mind.

(I can't comment on ongoing investigations so this should only be taken as
a comment about general password security and not about this particular
incident)
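(As a purely illustrative aside, the kind of randomly generated password a
password manager produces can be sketched with Python's standard library;
the length and character set here are arbitrary, not a policy
recommendation.)

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation,
    using the cryptographically secure `secrets` module."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(len(generate_password()))  # 20
```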
--
Brian
Wikimedia Security Team

On Wednesday, May 16, 2018, Leon Ziemba  wrote:
> I'm no security expert, so bear with me! Just looking for some
> clarification.
>
>> regularly changing your passwords
>
> It was my understanding studies have shown regularly changing passwords
can
> be adverse, no? [1][2] Not sure if we have a stance on that, because this
> is the first time I've heard it come up.
>
> I don't know if this is relevant to this particular incident of account
> hijacking, but I've also been told it's important to ensure your
> password is unique
> to Wikimedia, and to turn on two-factor authentication, if possible and
you
> are willing to do so.[3][4]
>
> [1]
>
https://www.ftc.gov/news-events/blogs/techftc/2016/03/time-rethink-mandatory-password-changes
> [2]
>
https://www.ncsc.gov.uk/guidance/password-guidance-simplifying-your-approach
> [3]
>
https://meta.wikimedia.org/wiki/Password_strength_requirements/en#So_that's_it,_my_account_is_secure
> ?
> [4] https://office.wikimedia.org/wiki/Security_Basics#Passwords (staff
only)
>
> ~Leon
>
> On Wed, May 16, 2018 at 8:10 AM John Bennett 
wrote:
>
>> *On 8 May 2018, account hijacking activities were discovered on
Wikiviajes
>> - Spanish Wikivoyage (es.wikivoyage.org ). It
>> was
>> identified by community stewards and communicated to the Trust and
Safety,
>> Legal, and Security teams who responded to the event.  At this time the
>> event is still under investigation and we are unable to share more about
>> what is being done without risking additional hijacking of accounts.
>> However, we feel it is important to share what details we can and inform
>> the community of what happened.  Similar to past security incidents, we
>> continue to encourage everyone to take some routine steps to maintain a
>> secure computer and account - including regularly changing your
passwords,
>> actively running antivirus software on your systems, and keeping your
>> system software up to date. The Wikimedia Foundation's Security team and
>> others are investigating this incident as well as potential improvements
to
>> prevent future incidents. We are also working with our colleagues in
other
>> departments to develop plans for how to best share future status updates
on
>> each of these incidents. However, we are currently focused on resolving
the
>> issues identified. If you have any questions, please contact the Trust
and
>> Safety team (ca{{@}}wikimedia.org ). John
>> BennettDirector of Security, Wikimedia Foundation*
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Proposal: Add security researchers to CREDITS file & [[Special:Version/credits]]

2018-05-01 Thread Brian Wolff
Hi everyone,

Currently we only credit people who report security vulnerabilities at
https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Thanks (which
basically nobody reads or knows exists) and sometimes in the commit message
and release announcements. Given such people are instrumental in keeping
MediaWiki secure, I think we should also credit them in the CREDITS file. I
propose adding another section to the file - "Vulnerability Reporters",
listing the names of everyone who has reported a security vulnerability in
either MediaWiki or a bundled extension.

Thoughts?

--
Brian
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Introduction : Tanvi Dadu ( GSoC 2018)

2018-04-24 Thread Brian Wolff
Welcome Tanvi!

--
Brian

On Tuesday, April 24, 2018, Tanvi Dadu  wrote:
> Greetings !
>
> I am Tanvi Dadu, a GSoC 2018 participant. As part of my internship under
> Josephine Lim and Vivek Maskara, I will be implementing a Feedback Sharing
> Module and quiz in the Commons App, for users who have a high upload
> revert rate. I have a keen interest in Android development, and
> participating in GSoC is a great opportunity to get acquainted with the
> latest technology and to work with highly skilled professionals. This is
> my first time being part of a FOSS internship and I am quite excited about
> it.
>
> Regards
> Tanvi Dadu
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File upload file in HTMLForm with ooui

2018-04-15 Thread Brian Wolff
Ok, I looked into this a little closer.

type => file is missing from HTMLForm::typeMapping. The class parameter
overrides this, and HTMLTextField has some code where the type parameter is
simply copied into the HTML type attribute, so the field can be reused for
multiple input types.

In the OOUI case, however, OOUI\TextInputWidget::getSaneType doesn't allow
file types (since OOUI probably doesn't support them), so the type falls
back to text, which breaks this.

So basically no, it's not currently possible without implementing your own
HTMLFormField subclass. The fact that it worked in the old system may have
been accidental, I'm not sure. There is probably a bug here somewhere
though: either the two modes should be consistent in their support for file
types, or HTMLForm should fall back to non-OOUI rendering in that case
(imho)
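As a rough illustration of the workaround, a minimal subclass could bypass
the OOUI type sanitization by emitting the file input directly. This is an
untested sketch: HTMLRawFileField is my own name, not an existing MediaWiki
class, though Html::input and the HTMLFormField base class are real core
APIs.

```php
// Hypothetical sketch: a field that always renders <input type="file">,
// sidestepping OOUI\TextInputWidget::getSaneType. Not an existing core class.
class HTMLRawFileField extends HTMLFormField {
	public function getInputHTML( $value ) {
		// Html::input( $name, $value, $type, $attribs ) builds a plain input tag.
		return Html::input( $this->mName, '', 'file', [ 'id' => $this->mID ] );
	}
}

// Referenced from the form descriptor via 'class' instead of 'type':
// 'fileupload' => [
//     'section' => 'upload',
//     'label'   => 'Upload file',
//     'class'   => 'HTMLRawFileField',
// ],
```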

Sorry,
Brian

On Sunday, April 15, 2018, Toni Hermoso Pulido <toni...@cau.cat> wrote:
> Hi Brian,
>
> if I only specify one I get this exception:
>
> ...
>
> Descriptor with no class for fileupload: Array
> (
> [section] => upload
> [label] => Upload file
> [type] => file
> [class] =>
> )
>
> ...
>
> Not using ooui, both are needed and it works...
>
>
> On 15/04/2018 19:40, Brian Wolff wrote:
>>
>> Im not sure, but having the class specified as HTMLTextField looks wrong
>> (type & class are mutually exclusive. You should only specify one).
>>
>> --
>> Brian
>>
>> On Sunday, April 15, 2018, Toni Hermoso Pulido <toni...@cau.cat> wrote:
>>>
>>> Hello,
>>>
>>> I'm trying to migrate a HTMLForm to use ooui
>>>
>>> by following https://www.mediawiki.org/wiki/OOUI/Using_OOUI_in_MediaWiki
>>>
>>> htmlForm = HTMLForm::factory( 'ooui', $formDescriptor, 'myform' );
>>>
>>> I noticed that a file upload form field that I had defined is not
>>
>> rendered correctly when using ooui.
>>>
>>>  'fileupload' => array(
>>>  'section' => 'upload',
>>>  'label' => 'Upload file',
>>>  'class' => 'HTMLTextField',
>>>  'type' => 'file'
>>>  ),
>>>
>>> This is shown as a text field but not as a file upload, as I would have
>>
>> expected...
>>>
>>> Is there any way to handle this and using OOUI at the same time?
>>>
>>> Thanks!
>>>
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] File upload file in HTMLForm with ooui

2018-04-15 Thread Brian Wolff
I'm not sure, but having the class specified as HTMLTextField looks wrong
(type and class are mutually exclusive; you should only specify one).

--
Brian

On Sunday, April 15, 2018, Toni Hermoso Pulido  wrote:
> Hello,
>
> I'm trying to migrate a HTMLForm to use ooui
>
> by following https://www.mediawiki.org/wiki/OOUI/Using_OOUI_in_MediaWiki
>
> htmlForm = HTMLForm::factory( 'ooui', $formDescriptor, 'myform' );
>
> I noticed that a file upload form field that I had defined is not
rendered correctly when using ooui.
>
> 'fileupload' => array(
> 'section' => 'upload',
> 'label' => 'Upload file',
> 'class' => 'HTMLTextField',
> 'type' => 'file'
> ),
>
> This is shown as a text field but not as a file upload, as I would have
expected...
>
>
> Is there any way to handle this and using OOUI at the same time?
>
> Thanks!
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Incoming and outgoing links enquiry

2018-03-18 Thread Brian Wolff
Hi,

You can run longer queries by getting access to toolforge (
https://wikitech.wikimedia.org/wiki/Portal:Toolforge) and running from the
command line.

However, the query in question might still take an excessively long time
(if you are running it over all of Wikipedia). I would expect that query to
produce about 150 MB of data and possibly take days to complete.

You can also break it down into batches by adding a clause such as
WHERE page_title >= 'a' AND page_title < 'b'
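Concretely, the batched version of the query might look something like this
(a sketch only, untested against the live replicas, using the MediaWiki
table and column names of the time):

```sql
-- Count outgoing links per article (namespace 0), for titles in ['A', 'B').
-- pagelinks.pl_from holds the page_id of the linking page.
SELECT page_title,
       COUNT(*) AS outgoing_links
FROM page
JOIN pagelinks ON pl_from = page_id
WHERE page_namespace = 0
  AND page_title >= 'A' AND page_title < 'B'
GROUP BY page_title;
```

Running one title range at a time keeps each query within the execution
limits, at the cost of stitching the result files together afterwards.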

Note, also of interest: full dumps of all the links are available at
https://dumps.wikimedia.org/enwiki/20180301/enwiki-20180301-pagelinks.sql.gz
(you would also need
https://dumps.wikimedia.org/enwiki/20180301/enwiki-20180301-page.sql.gz to
convert page IDs to page names)
--
Brian
On Sunday, March 18, 2018, Nick Bell  wrote:
> Hi there,
>
> I'm a final year Mathematics student at the University of Bristol, and I'm
> studying Wikipedia as a graph for my project.
>
> I'd like to get data regarding the number of outgoing links on each page,
> and the number of pages with links to each page. I have already
> inquired about this with the Analytics Team mailing list, who gave me a
few
> suggestions.
>
> One of these was to run the code at this link https://quarry.wmflabs.org/
> query/25400
> with these instructions:
>
> "You will have to fork it and remove the "LIMIT 10" to get it to run on
> all the English Wikipedia articles. It may take too long or produce
> too much data, in which case please ask on this list for someone who
> can run it for you."
>
> I ran the code as instructed, but the query was killed as it took longer
> than 30 minutes to run. I asked if anyone on the mailing list could run it
> for me, but no one replied saying they could. The guy who wrote the code
> suggested I try this mailing list to see if anyone can help.
>
> I'm a beginner in programming and coding etc., so any and all help you can
> give me would be greatly appreciated.
>
> Many thanks,
> Nick Bell
> University of Bristol
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What ways are there to include user-edited JavaScript in a wiki page? (threat model: crypto miners)

2018-03-14 Thread Brian Wolff
On Wednesday, March 14, 2018, David Gerard <dger...@gmail.com> wrote:
> What ways are there to include user-edited JavaScript in a wiki page?
>
> I ask because someone put this revision in (which is now deleted):
>
>
https://fa.wikipedia.org/w/index.php?title=%D9%85%D8%AF%DB%8C%D8%A7%D9%88%DB%8C%DA%A9%DB%8C:Common.js=next=22367460=en
>
> You can't see it now, but it was someone including a JavaScript
> cryptocurrency miner in common.js!
>
> Obviously this is not going to be a common thing, and common.js is
> closely watched. (The above edit was reverted in 7 minutes, and the
> user banned.)
>
> But what are the ways to get user-edited JavaScript running on a
> MediaWiki, outside one's own personal usage? And what permissions are
> needed? I ask with threats like this in mind.
>
>
> - d.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

You need editinterface, edituserjs, or some of the CentralNotice-related
rights (or the steward-related rights to grant yourself these rights).

Any method that does not involve editinterface or a related right that is
normally restricted to administrators (or a higher group) should be
considered a serious security issue in MediaWiki and reported immediately.

--
Brian Wolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] TechCom Radar, 2018-02-14

2018-02-20 Thread Brian Wolff
Umm, why would you want that over just looking at the page views for a page?

If it's a matter of people navigating to a page internally vs. externally,
it would probably be better to convince Analytics to break down the
page-view data by local vs. external referrer, instead of relying on a URL
shortener to separate hits.
--
bawolff

On Tuesday, February 20, 2018, Pine W  wrote:
> Regarding "There seems to be interest in reviving the WMF hosted URL
> shortener service", +1 from me, particularly if there will be public logs
> of how many people (i.e. not bots) clicked a particular short link in a
> certain period of time that is long enough to provide some anonymity but
> short enough to be useful for analysis (such as 24 hours).
>
> Thanks!
>
> Pine
> ( https://meta.wikimedia.org/wiki/User:Pine )
> 
>
> On Tue, Feb 20, 2018 at 2:59 AM, Daniel Kinzler <
daniel.kinz...@wikimedia.de
>> wrote:
>
>> Hello all!
>>
>> Here are the minutes from this week's meeting:
>>
>>
>>
>> * There seems to be interest in reviving the WMF hosted URL shortener
>> service
>>
>> * Ongoing work on preparing the move of WMF infrastructure to PHP7
>>
>> * No IRC meeting on February 21
>>
>> * RFC under discussion: MediaWiki support for Composer equivalent for
>> JavaScript
>> packages 
>>
>> * Parsing team looking into changing HTML generated for embedded media.
>> Related
>> RFC: 
>>
>> * RFC ready for IRC meeting, probably on February 28: Normalize change
tag
>> schema 
>>
>> * Last call closing on February 21, last chance to raise issues: Stop
>> logging
>> autopatrol actions 
>>
>>
>> You can also find our meeting minutes at
>> 
>>
>> See also the TechCom RFC board
>> .
>>
>> --
>> Daniel Kinzler
>> Principal Platform Engineer
>>
>> Wikimedia Deutschland
>> Gesellschaft zur Förderung Freien Wissens e.V.
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Security question re password resets and Spectre

2018-01-05 Thread Brian Wolff
[This is getting somewhat far afield of MediaWiki, but...] Spectre can
potentially be used to read your private (bitcoin) keys, so BitAuth is just
as vulnerable to it as anything else (assuming the keys are on your
computer and not in some hardware-token setup). The only benefit I see is
that BitAuth would probably run in a separate process, and the
cross-process variants of Spectre look more difficult to pull off.

As far as different/exotic authentication technologies go, I think U2F
would be the way to go. But it's all pretty irrelevant to this attack: if
you had an unpatched browser and someone ran this attack against you, they
would probably target your session cookie. (Assuming it's available in the
process; I don't know enough about different browser architectures to say
whether that's always true.)

--
bawolff

On Friday, January 5, 2018, Dan Bolser  wrote:
> My favorite solution to the password problem is BitAuth2017. I believe
> that Spectre / Meltdown can't beat PoW, but I'm not 100% sure of the
> details.
>
> On 4 January 2018 at 17:29, Denny Vrandečić  wrote:
>
>> I often get emails that someone is trying to get into my accounts. I
guess
>> there are just some trolls, trying to login into my Wikipedia account. So
>> far, these have been unsuccessful.
>>
>> Now I got an email that someone asked for a temporary password for my
>> account.
>>
>> So far so good. What I am wondering is whether that password reset trial
is
>> actually even more dangerous now given Spectre / Meltdown?
>>
>> Thoughts?
>>
>> (I probably should set up 2FA right now. Have been too lazy so far)
>>
>> Happy new year,
>> Denny
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Security question re password resets and Spectre

2018-01-04 Thread Brian Wolff
Labs and production machines are separate machines. An attack on Labs
would, in the worst case, only be able to attack other Labs users.

As Cyken said, one of the very scary scenarios is JS getting access to data
it should not have access to (e.g. if you're entering your password in one
tab while a malicious site is open in another tab). The Spectre paper has a
proof of concept the authors say worked to extract private memory from a
(now outdated) version of Google Chrome.

All this is to say: you should update your browser ASAP, or ensure that
auto-updates are enabled. Similarly for your OS, as updates become
available.

--
bawolff


On Thursday, January 4, 2018, Denny Vrandečić  wrote:
> Ah, that sounds good. I was thinking of a scenario where someone runs code
> in, say labs, and gains access to memory while that machine generates my
> temporary code to send it to me, and thus gains access to that code.
>
> Or, alternatively, just attack my browser through a compromised site
> running a JS exploit and gaining access to anything in my memory. But
> that's on my side to fix (or, rather, on the browser developers).
>
> One way or the other, I have set up 2FA for now.
>
> Use more lynx!
>
>
>
> On Thu, Jan 4, 2018 at 10:18 AM Cyken Zeraux 
wrote:
>
>> Spectre can be exploited from JavaScript alone.
>>
>>
>>
>>
https://blog.mozilla.org/security/2018/01/03/mitigations-landing-new-class-timing-attack/
>>
>> Browsers are making changes to mitigate this.
>>
>>
>>
http://www.tomshardware.com/news/meltdown-spectre-exploit-browser-javascript,36221.html
>>
>> The actual extents of the attack that are realistically possible in this
>> scenario, I do not know. But as stated in the article google suggests:
>> "Where possible, prevent cookies from entering the renderer process'
memory
>> by using the SameSite and HTTPOnly cookie attributes, and by avoiding
>> reading from document.cookie."
>>
>> I would take that to mean that cookies could be accessed, at the least.
>>
>> On Thu, Jan 4, 2018 at 12:16 PM, Stas Malyshev 
>> wrote:
>>
>> > Hi!
>> >
>> > > So far so good. What I am wondering is whether that password reset
>> trial
>> > is
>> > > actually even more dangerous now given Spectre / Meltdown?
>> >
>> > I think for those you need local code execution access? In which case,
>> > if somebody gained one on MW servers, they could just change your
>> > password I think. Spectre/Meltdown from what I read are local privilege
>> > escalation attacks (local user -> root or local user -> another local
>> > user) but I haven't heard anything about crossing the server access
>> > barrier.
>> >
>> > > (I probably should set up 2FA right now. Have been too lazy so far)
>> >
>> > Might be a good idea anyway :)
>> >
>> > --
>> > Stas Malyshev
>> > smalys...@wikimedia.org
>> >
>> > ___
>> > Wikitech-l mailing list
>> > Wikitech-l@lists.wikimedia.org
>> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> >
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Security question re password resets and Spectre

2018-01-04 Thread Brian Wolff
No, Spectre/Meltdown wouldn't apply to that situation.

The Meltdown/Spectre vulnerabilities are all about computer programs
gaining access to data they should not have. To exploit them, the attacker
must be able to run code on the victim's computer.

--
brian

On Thursday, January 4, 2018, Denny Vrandečić  wrote:
> I often get emails that someone is trying to get into my accounts. I guess
> there are just some trolls, trying to login into my Wikipedia account. So
> far, these have been unsuccessful.
>
> Now I got an email that someone asked for a temporary password for my
> account.
>
> So far so good. What I am wondering is whether that password reset trial
is
> actually even more dangerous now given Spectre / Meltdown?
>
> Thoughts?
>
> (I probably should set up 2FA right now. Have been too lazy so far)
>
> Happy new year,
> Denny
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Deprecation + phan issue

2017-12-19 Thread Brian Wolff
I agree with this. Splitting the deprecation phan issues out to a separate
non-voting job sounds like a good idea (and should be easily supported by
phan with its issue whitelist/blacklist config directive)

Ideally we would have a phan plugin that generates warnings for hard
deprecations (wfDeprecated()), so that those could be voting.
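To sketch what the split might look like: the config keys below come from
phan's documented configuration, but the exact issue names to filter would
need checking against the phan version in use, so treat this as an
assumption-laden example.

```php
// .phan/config.php for the voting job: ignore deprecation issues entirely.
return [
    'suppress_issue_types' => [
        'PhanDeprecatedFunction',
        'PhanDeprecatedClass',
    ],
];

// A second config for the non-voting job could instead use
// 'whitelist_issue_types' with the same list, so that job reports
// nothing *but* deprecations.
```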

--
brian

On Tuesday, December 19, 2017, James Hare  wrote:
> I’m not entirely familiar with how we use Phan, but if it’s like the
other tests we run with Gerrit, what we could do is have a test for
deprecated APIs that passes or fails as you might expect, but have it be a
non-voting test. This way a failure is a signal to check something out, but
it doesn’t stop the build altogether.
>
>> On Dec 19, 2017, at 10:01 AM, Stas Malyshev 
wrote:
>>
>> Hi!
>>
>> I've noticed a certain problem in our workflow that I'd like to discuss.
>>
>> From time to time, we deprecate certain APIs in core or extensions. The
>> idea of deprecation is that we have gradual transition - existing code
>> keeps working, and we gradually switch to the new API over time, instead
>> of removing old API at one sweep and breaking all the existing code.
>> This is the theory, and it is solid.
>>
>> Also, we have phan checks on the CI builds, which prevent us from using
>> deprecated APIs. Right now if phan detects deprecated API, it fails the
>> build.
>>
>> Now, combining this produces a problem - if somebody deprecates an API
>> that I use anywhere in my extension (and some extensions can be rather
>> big and use a lot of different APIs), all patches to that extension
>> immediately have their builds broken - including all those patches that
>> have nothing to do with the changed API. Of course, the easy way is just
>> to add @suppress and phan shuts up - but that means, the deprecated
>> function is completely off the radar, and nobody probably would remember
>> to look at it again ever.
>>
>> I see several issues here with this workflow:
>> 1. Deprecation breaks unrelated patches, so I have no choice but to shut
>> it up if I want to continue my work.
>> 2. It trains me that the reaction to phan issues should be to add
>> @suppress - which is the opposite of what we want to happen.
>> 3. The process kind of violates the spirit of what deprecation is about
>> - existing code doesn't keep working without change, at least as far as
>> the build is concerned, and constantly @suppress-ing phan diminishes the
>> value of having those checks in the first place.
>>
>> I am not sure what would be the best way to solve this, so I'd like to
>> hear some thoughts on this from people. Do you also think it is a
>> problem or it's just me? What would be the best way to improve it?
>>
>> Thanks,
>> --
>> Stas Malyshev
>> smalys...@wikimedia.org
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Many spurious RefreshLinksDynamic jobs in Queue

2017-12-19 Thread Brian Wolff
refreshLinksDynamic jobs are meant for cases where links vary pretty much
randomly (e.g. something on the page is causing a low cache time). The idea
is that code like [[category:Foo {{CURRENTHOUR}}]] should have its
categories refreshed regularly, not just on edit.

The refreshLinksDynamic jobs aren't critical for MediaWiki to run properly;
you can disable them (I think) by setting
$wgJobClasses['refreshLinksDynamic'] = 'NullJob';
This would of course mean pages with dynamic wikitext do not get their
links updated except when edited.

--
brian

On Tuesday, December 19, 2017, Michael Erdmann  wrote:
> Hello fellow MW developers,
>
> in a private Wiki I have strange issues with refreshLinksDynamic Jobs,
that pop up apparently without reason. In particular I even have a loop
between 2 pages. After page 1 is refreshed MW thinks it needs to refresh
page 2. And after MW refreshed it, I get page 1 in the queue again [WTF].
>
> MW's behaviour is even more strange:
>
> I run my MW jobs via CRON and have
> $wgJobRunRate = 0;
> $wgRunJobsAsync = false;
> in LocalSettings.
>
> When running one job at a time the job queue never depletes. When running
100 jobs at once, via
>   php runJobs.php --maxjobs=100
> the queue eventually gets processed completely.
>
> I saw Daniel's page about JobQueue issues [1] but apparently no solution
exists yet. Is this still the current state of understanding?
>
> BTW. I am running the latest MW 1.27 and I must stay on the LTS track of
MW. With 1.23 I did not have issues like that at all.
>
> Any feedback will be appreciated.
>
> thx,
>   michael
>
> [1] https://www.mediawiki.org/wiki/User:Daniel_Kinzler_(WMDE)/Job_Queue
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing a new security testing tool for MediaWiki extensions "phan-taint-check-plugin"

2017-12-14 Thread Brian Wolff
The 7.0 requirement is due to phan 0.8. You could try changing the version
of phan to a higher one (I used phan 0.8 originally because that's what
Wikimedia used in its continuous-integration setup, which in retrospect
really didn't matter). I have not tried it with higher versions of phan,
and I have no idea how stable the phan plugin API is, so it could well work
with them. That is really something I should test.

You should be able to install both versions of PHP side by side, with the
PHP 7.0 binary named php7.0 instead of php. On Macs, Homebrew will let you
do this, and I assume other installation methods will too.

Thanks,
Brian

On Thursday, December 14, 2017, Tom Bishop, Wenlin Institute <
tan...@wenlin.com> wrote:
>
>
>> On Dec 11, 2017, at 4:09 PM, Brian Wolff <bwo...@wikimedia.org> wrote:
>>
>> ...
>> Note: the tool has a requirement of php 7.0 (neither higher nor lower)
>> see
https://www.mediawiki.org/wiki/Continuous_integration/Phan#Dependencies
>> for how to install php 7.0 if your system doesn't have it.
>
> I'm interested in trying it. However, I'm on macOS with php 7.1.1 and
reluctant to downgrade to php 7.0 or set up a virtual machine just for
this. Has anybody tried it wih macOS and/or php 7.1.1?
>
> Thanks!
>
> Tom
>
> Wenlin Institute, Inc. SPC (a Social Purpose Corporation)
> 文林研究所社会目的公司
> Software for Learning Chinese
> E-mail: wen...@wenlin.com Web: http://www.wenlin.com
> Telephone: 1-877-4-WENLIN (1-877-493-6546)
> ☯
>
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Announcing a new security testing tool for MediaWiki extensions "phan-taint-check-plugin"

2017-12-11 Thread Brian Wolff
Hello everyone,

For the last little while I have been working on a new tool
to automatically detect common security issues in MediaWiki
extensions.

The tool can detect a number of issues, including:
* XSS
** We include here using wfMessage( 'foo' )->text()
   when you should have used ->escaped() or ->parse().
* Sql injection
* Shell injection
* PHP deserialization vulnerabilities (A little buggy on this one)

In the future, it will likely also detect double escaping issues.
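For instance, the message-escaping case above boils down to patterns like
the following (an illustrative sketch; 'some-message' is a placeholder
message key, while the Message methods shown are real MediaWiki APIs):

```php
// Flagged as XSS: ->text() returns the raw message content,
// which may contain unescaped markup.
$out->addHTML( wfMessage( 'some-message' )->text() );

// Safe alternatives:
$out->addHTML( wfMessage( 'some-message' )->escaped() ); // HTML-escaped
$out->addHTML( wfMessage( 'some-message' )->parse() );   // parsed & sanitized
```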

Of course, as with any static analysis tool, there will be instances
of false positives, as well as things it cannot detect.

I've now reached the stage where I feel the tool is useful,
and would really like people to test it out and give feedback.

Note: the tool has a requirement of php 7.0 (neither higher nor lower)
see https://www.mediawiki.org/wiki/Continuous_integration/Phan#Dependencies
for how to install php 7.0 if your system doesn't have it.

To test with your extension, simply do:

$ composer require --dev mediawiki/phan-taint-check-plugin

and then merge into the scripts directive of composer.json
  "scripts": {
    "seccheck": "seccheck-mwext",
    "seccheck-fast": "seccheck-fast-mwext"
  }
and simply run
composer seccheck
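Put together, the relevant parts of the extension's composer.json would
look roughly like this (the version constraint is a placeholder; pick an
appropriate release):

```json
{
    "require-dev": {
        "mediawiki/phan-taint-check-plugin": "*"
    },
    "scripts": {
        "seccheck": "seccheck-mwext",
        "seccheck-fast": "seccheck-fast-mwext"
    }
}
```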

seccheck will take about 3 minutes and use lots of ram (~2 GB),
seccheck-fast won't test certain things involving hooks,
but will work in about 27 seconds and use much less ram.
This assumes that your extension is installed in the extensions/
subdirectory of MediaWiki.

In the future we may make this into a non-voting jenkins job.

If you are not making a MediaWiki extension, there is also
a "seccheck-generic" script you can use, which should work
with any PHP project. It is also possible to customize the script
for other projects that have custom escaping methods.
Generic mode is not well tested yet.

See the README for more information about the tool:
https://github.com/wikimedia/Phan-Taint-Check-Plugin/blob/master/README.md

Anyways, I hope this is useful, and am very eager to
hear feedback. I also hope that this will not only be useful
for Wikimedia, but also helpful to the third party extension
development community. Please test it and let me know what
you think.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FLIF for Wikimedia

2017-12-10 Thread Brian Wolff
Maybe not a hard no, but I would rate the probability at somewhere around
1%.

If you really wanted to push this (with the understanding that it's
probably not going anywhere), I would say write a report making a solid
case, with a solid implementation plan, including:

* What is the fallback plan for non-JS users and users with old browsers?
* What would the bandwidth saving be in typical usage on typical Wikipedia
pages?
* What is the server-side latency on an uncached hit, where we have to
generate a thumbnail for the request, compared to existing formats?
* What is the client-side latency of rendering with the polyfill compared
to a native format? What happens during rendering? What about people using
old-generation cell phones with lackluster CPUs? Does it run in a separate
worker thread, or does it block the main JS thread? What is the overall
effect on the user while the polyfill loads?
* Combining server-side latency, client-side latency, the bandwidth
difference, etc., what is the overall difference in loading time for the
user on a typical Wikipedia page? And what is it for a client-side cached
hit, vs. a server-side cached hit (i.e. the thumbnail is already rendered),
vs. a totally uncached hit where the thumbnail has to be converted on the
fly?

I think that would be the minimum information required to convince people
to do this, and I doubt even that would be enough unless the numbers are
super good.

--
bawolff

On Sunday, December 10, 2017, Ruben Kelevra  wrote:
> So... to break the current discussion down, there is no hard "we
> don't want this format" yet shown up.
>
> What's the next step guys? Generate a feature request for MediaWiki
> which enables FLIF as possible input-format and ships the poly-flif
> JavaScript library for browsers which does not natively support FLIF?
>
> We talk here about the chicken-egg problem, so yeah, this javascript
> crap is hard to swallow, but we must start somewhere, right?
>
> Best regards
>
> Ruben
>
> On 08.12.2017 22:00, Ruben Wisniewski wrote:
>> Hey Thiemo,
>>
>> On 06.12.2017 14:27, Thiemo Kreuz wrote:
 Point was more: Get rid of this bloody Download a JPG, do some Stuff &
>>> Upload a 3 Times locally saved JPG again […]
>>>
>>> I'm afraid I did not made my point clear enough. With all respect to
your
>>> enthusiasm, but the scenario you describe is exactly what your
suggestion
>>> will not improve. How could it? We can not control what people do on
their
>>> local computers.I'm sure we can't, but we can follow our primary goal,
>> to spread information and educate users to handle their contributed data
>> and time better. JPG has huge generation loss, that's why I always
>> choose PNG for my private files or use a program which can handle raw.
>>
>> We don't educate the users currently about the right choice of file
>> formats, so they upload their files in the format which is available for
>> them.
>>
>> Taking JPG directly from a camera which does not support raw is fine,
>> but we should take the step and convert this initial JPG to a lossless
>> format because:
>>
>> Every user which edit those file will take the JPG and save a "new
>> version" in the same format because they want to preserve the filename.
>>
>> If we would convert the JPG to PNG or FLIF, this step would be lossless
>> while we don't control anything on the user side.
>>
>> This was my point.
>>> Even worse: FLIF is not even needed for the scenario you describe. We
>> could
>>> just convert all JPEG to PNG. But this will not happen for the reasons
>>> collected in this thread.
>>
>> FLIF is better instead of PNG because it supports animations, is faster
>> to decode, use less disk space. Also saving it interlaced does not
>> increase the file size significantly and we just need to save one file
>> instead of at least 3 different versions of the same file: the
>> thumbnail, the zoomed version, and the original file.
>>
>> With FLIF this file would be always the same while the browser would
>> limit the amount of data required for the display size.
>>
>>> Sure. Go and encourage people to upload RAW. That's very much welcome.
>>> But converting their JPEG after they made them will make many scenarios
>>> worse.
>> Well, yeah, I tried several times in the past ... and no, my RAW-Format
>> is still not supported:
>>
>> "Bei der Übertragung gab es einen Fehler
>> Auf diesem Wiki sind Dateien mit der Endung „.NEF“ nicht zulässig."
>>
>> Means .NEF is not supported.
>>
>> "Bei der Übertragung gab es einen Fehler
>> Auf diesem Wiki sind Dateien mit der Endung „.DNG“ nicht zulässig."
>>
>> Means, .DNG is not supported.
>>
>> With FLIF we could simply accept every format that has a free decoder
>> and convert it to FLIF. Which would 'free' the file format. :)
>>
>>> Even including the one you aim for: What you want is to allow people
>>> to download an image, open, edit and save it without ever thinking
>>> about the file format. FLIF can not do that, yet.
>> Since we could easily convert the FLIF on export to PNG and any 

Re: [Wikitech-l] Fwd: Re: [Wikidata] Imperative programming in Lua, do we really want it?

2017-12-07 Thread Brian Wolff
Lua is used for many purposes - some of which I think are well suited to an
imperative approach (or at least would be totally fine to do in an
imperative style). There are of course many cases where an imperative
approach would be a poor choice.

There are no rules that apply to all programming things equally - context
of the programming task matters.

--
bawolff

On Thursday, December 7, 2017, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:
> Following your message Jeroen, there it also is on Wikitech-l now.
>
>  Forwarded message 
> Subject: Re: [Wikidata] Imperative programming in Lua, do we
really want it?
> Date: Wed, 6 Dec 2017 23:53:17 +0100
> From: Jeroen De Dauw 
> Reply-To: Discussion list for the Wikidata project. <
wikid...@lists.wikimedia.org>
> To: Discussion list for the Wikidata project. <
wikid...@lists.wikimedia.org>
>
>
>
> Hey,
>
> While I am not up to speed with the Lua surrounding Wikidata or
MediaWiki, I support the call for avoiding overly imperative code where
possible.
>
> Most Lua code I have seen in the past (which has nothing to do with
MediaWiki) was very imperative, procedural and stateful. Those are things
you want to avoid if you want your code to be maintainable, easy to
understand and testable. Since Lua supports OO and functional styles, the
language is not an excuse for throwing well-established software
development practices out of the window.
>
> If the code is currently procedural, I would recommend establishing that
new code should not be procedural and have automated tests, unless there is
a very good reason to make an exception. If some of this code is written by
people not familiar with software development, it is also important to
create good examples for them and provide guidance so they do not
unknowingly copy and adopt poor practices/styles.
>
> John, perhaps you can link the code that caused you to start this thread
so that there is something more concrete to discuss?
>
> (This is just my personal opinion, not some official statement from
Wikimedia Deutschland)
>
> PS: I just noticed this is the Wikidata mailing list and not the
Wikidata-tech one :(
>
> Cheers
>
> --
> Jeroen De Dauw | https://entropywins.wtf |https://keybase.io/jeroendedauw
> Software craftsmanship advocate | Developer at Wikimedia Germany
> ~=[,,_,,]:3
>
> On 6 December 2017 at 23:31, John Erling Blad  wrote:
>
>With the current Lua environment we have ended up with an imperative
>programming style in the modules. That invites stateful objects,
>which do not make for easily testable libraries.
>
>Do we have some ideas on how to avoid this, or is it simply the way
>things are in Lua? I would really like functional programming with
>chainable calls, but others might want something different?
>
>John
>
>___
>Wikidata mailing list
>wikid...@lists.wikimedia.org 
>https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FLIF for Wikimedia

2017-12-04 Thread Brian Wolff
An encode latency of 7 seconds and a decode latency of 1 second aren't what
I would call "very fast". (The decode latency measurement probably isn't
realistic, as decoding and re-encoding to PNG is different from just
displaying in a browser. Then again, all these measurements are going to
vary by file size, not to mention low-power devices like phones.)

If it indeed takes 1 second to decode, that's probably more time than the
space savings would gain, time-wise, on a moderate-speed internet connection.
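As a rough sanity check on that claim (a sketch only: the 708 KByte saving is the figure Ruben reports for one medium-sized image later in this message, and the 10 Mbit/s connection speed is an assumption):

```shell
# Back-of-envelope sketch: transfer time saved by a 708 KB smaller file
# on an assumed 10 Mbit/s connection (both numbers are illustrative).
saved_s=$(awk 'BEGIN { printf "%.2f", (708 * 1024 * 8) / (10 * 1000 * 1000) }')
echo "transfer time saved: ${saved_s}s"
```

Under those assumptions the transfer saving is roughly 0.58 seconds, so a 1-second decode cost already outweighs it, and faster connections only widen the gap.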

And time is only one metric. Often image encoding is limited by memory.


Anyways, I'd be opposed to this unless the bandwidth saving was extreme.
Wikipedia is not the place to be a testing ground for experimental image
formats that browsers aren't even supporting.


--
bawolff

P.S. Lossless rotation/cropping of JPEGs is actually very common due to
Rotatebot.

On Monday, December 4, 2017, Ruben Kelevra  wrote:
> Hey Thiemo,
>
> On 04.12.2017 10:43, Thiemo Kreuz wrote:
>> I consider myself an image file format nerd, so thanks a lot for
>> sharing this! FLIF was new to me.
> Don't mind it! :)
>
>> I would like to share two important notes:
>>
>> 1. Unfortunately the flif.info website does not say a word about the
>> CPU resources their current implementation burns when converting a,
>> let's say, PNG to FLIF.
> Well, currently it's single-core and not optimized at all. You should
> know CABAC encoding from x264/x265, so it's slow, but not dead slow.
>
> The website doesn't mention it because it highly depends on the subject,
> as well as on the encoding setting named "effort".
>
> Currently, "effort" defaults to 60. I tried 100 a lot, but there's nearly
> nothing to gain, so I assume we always want to use the good default. :)
>
> PNG encoding of the current featured picture of the day[1] at a medium
> web image size, 1280×868 pixels, opened in GIMP, converted to sRGB and
> exported at maximum compression with no checkboxes enabled ... takes
> GIMP 3 seconds to write it to the hard disk.
>
> Transcoding this PNG to FLIF takes on my machine (i3-2330M@2.20GHz; 12
> GB RAM):
> real 0m7,405s
> user 0m7,320s
> sys 0m0,053s
>
> Decoding the file back to PNG on the same machine (with FLIF):
> real 0m1,077s
> user 0m1,067s
> sys 0m0,007s
>
> For comparison, we save 708 KB relative to PNG in this case, and the PNG
> exported by FLIF is just 14 KB bigger than the one created by GIMP.
>
>> It's pretty important to realize that CPU
>> resources are even more valuable than storage space and network
>> bandwidth. Sure, It's not really possible to come up with an exact
>> threshold. But if, let's say, converting a PNG to FLIF saves 100 KiB,
>> but takes a minute, this minute will never pay off.
> So my point was more: get rid of this bloody "download a JPG, do some
> stuff & upload a 3-times-locally-saved JPG again, calling it an
> improvement" workflow.
>
>> If you follow the discussions on Wikimedia Commons, you will find this
>> argument quite often: Downloading PNGs, optimizing them, and uploading
>> them again is practically never worth it. Think of all the resources
>> that are burned to do this: CPU time, download and upload, database
>> storage and time, disk storage for the new revision, and not to forget
>> the user doing all this.
> Yep, but enabling FLIF for new files would do nothing to the old data at
> all. I don't want to start a discussion about converting petabytes of
> data, but all new revisions should be done in FLIF, if you ask me.
>> Sure, your suggestion avoids a lot of this. But the CPUs involved will
>> experience heavy load, both on the server as well as clients that need
>> to recode FLIF files via a JavaScript library first.
> FLIF is very fast to decode via JavaScript, and in general, as the
> example above shows, it takes just 1 second to decode and re-encode a
> medium-size image as PNG with just one thread on a pretty outdated
> notebook with an unoptimized decoder and encoder. :)
>
> Try adding a FLIF to a website and test whether the site loads any
> slower with the FLIF ... at the small image sizes you get in articles,
> the performance impact is negligible and comparable to loading a font
> file in the browser.
>> 2. Lossy file formats like JPEG should never be converted to lossless
>> formats. This will actually decrease quality (over time). The problem
>> is that the image data will still contain the exact same JPEG
>> artifacts, but the fact that it was a JPEG (and how it was encoded) is
>> lost.
> Yes, but those images should never be saved as JPG in the first place.
> Even on Android, RAW photography is going to be a thing. FLIF just
> started to get rudimentary RAW capabilities, which means you can just
> convert the special RAW file to a FLIF and upload it without any loss in
> quality.
>
>> No tool specialized in squeezing the maximum quality out of
>> lossy JPEGs can work anymore. And there are a lot of super-awesome
>> tools like this. Not only tools like JPEGCROP and such that 

Re: [Wikitech-l] Proposal for a developer support channel

2017-11-18 Thread Brian Wolff
I was wrong about Support_desk being permanently semi-protected. It
was temporarily semi-protected for 3 days, but that's been lifted now.

Regardless of spam concerns, the point still stands that it seems bad form
to semi-protect the venue where newbies are supposed to ask for help.
Not that I have any better solution to the spam issue.

--
bawolff

On Sun, Nov 19, 2017 at 3:33 AM, Brian Wolff <bawo...@gmail.com> wrote:
> Neither project:support_desk nor project:current_issues is really meant for
> that purpose - support desk is mainly for user and (external) sysadmin
> support. And current_issues is the village pump of mediawiki.org (the
> website not the software)
>
> Honestly, I kind of think that LQT was better than Flow for the support
> desk. As much as LQT sucked, at least search sort of worked.
>
> Although the bigger problem probably is that Project:Support desk is
> protected, so new users aren't allowed to ask questions(!)
>
> On the subject of search, I do think that
> https://lists.wikimedia.org/robots.txt is ridiculous. At least for technical
> lists we should let Google in, and the privacy concern is silly as there are
> mirrors that are indexed.
>
> --
> bawolff
>
>
> On Saturday, November 18, 2017, Sam Wilson <s...@samwilson.id.au> wrote:
>> Hear hear to being able to properly search past conversations.
>>
>> I know it's not the fashionably geek thing to say, but I must admit that
>> I always find mailing lists to be incredibly annoying, compared to
>> forums. Not only is searching completely separate from reading, even
>> browsing old topics is another interface again (assuming one hasn't been
>> subscribed forever and kept every old message). Then, when you do manage
>> to find an old message, there's no way to reply to it (short of copying
>> and pasting and losing context).
>>
>> Maybe https://www.mediawiki.org/wiki/Project:Support_desk (and its
>> sibling https://www.mediawiki.org/wiki/Project:Current_issues ?) is the
>> best place to ask questions about the software, its development, and
>> other things. If so, let's make that fact much more well advertised!
>> (Although, I think Flow is brilliant, when it's for discussing a wiki
>> page — because the topic is already set (effectively by the title of the
>> page its attached to). When it's trying to be a host to multiple
>> unrelated topics, it becomes pretty annoying to use.)
>>
>> On Sun, 19 Nov 2017, at 05:57 AM, Niharika Kohli wrote:
>>> I'd like to add that having Discourse will provide the one thing IRC
>>> channels and mailing lists fail to provide - search capabilities. If
>>> you hang out on the #mediawiki IRC channel, you have probably noticed
>>> that we get a lot of repeat questions all the time. This would save
>>> everyone time and effort.
>>>
>>> Not to mention ease of use. Discourse is way more usable than IRC or
>>> mailing lists. Usability is the main reason there are so many questions
>>> about MediaWiki asked on Stackoverflow instead:
>>> https://stackoverflow.com/unanswered/tagged/mediawiki
>>> <https://stackoverflow.com/unanswered/tagged/mediawiki>,
>>> https://stackoverflow.com/questions/tagged/mediawiki-api?sort=newest
>>> <https://stackoverflow.com/questions/tagged/mediawiki-api?sort=newest>,
>>> https://stackoverflow.com/questions/tagged/mediawiki-extensions...
>>>
>>> I'd personally hope we can stop asking developers to go to IRC or mailing
>>> lists eventually and use Discourse/something else as a discussion forum
>>> for
>>> support.
>>>
>>> On Sat, Nov 18, 2017 at 1:31 PM, Quim Gil <q...@wikimedia.org> wrote:
>>>
>>> > Hi, I have expanded
>>> > https://www.mediawiki.org/wiki/Discourse#One_place_to_
>>> > seek_developer_support
>>> >
>>> > https://www.mediawiki.org/wiki/Project:Support_desk is the only channel
>>> > whose main purpose is to provide support. The volunteers maintaining it
>>> > are
>>> > the ones to decide about its future. There is no rush for any decisions
>>> > there. First we need to run a successful pilot.
>>> >
>>> > The rest of channels (like this mailing list) were created for
>>> > something
>>> > else. If these channels stop receiving questions from new developers,
>>> > they
>>> > will continue doing whatever they do now.
>>> >
>>> > > I'd like to understand how adding a venue will improve matters.
>>> >
>>> > For new developers arriving to our shores, bein

Re: [Wikitech-l] Proposal for a developer support channel

2017-11-18 Thread Brian Wolff
Neither project:support_desk nor project:current_issues is really meant for
that purpose - support desk is mainly for user and (external) sysadmin
support. And current_issues is the village pump of mediawiki.org (the
website not the software)

Honestly, I kind of think that LQT was better than Flow for the support desk.
As much as LQT sucked, at least search sort of worked.

Although the bigger problem probably is that Project:Support desk is
protected, so new users aren't allowed to ask questions(!)

On the subject of search, I do think that
https://lists.wikimedia.org/robots.txt is ridiculous. At least for
technical lists we should let Google in, and the privacy concern is silly
as there are mirrors that are indexed.

--
bawolff

On Saturday, November 18, 2017, Sam Wilson <s...@samwilson.id.au> wrote:
> Hear hear to being able to properly search past conversations.
>
> I know it's not the fashionably geek thing to say, but I must admit that
> I always find mailing lists to be incredibly annoying, compared to
> forums. Not only is searching completely separate from reading, even
> browsing old topics is another interface again (assuming one hasn't been
> subscribed forever and kept every old message). Then, when you do manage
> to find an old message, there's no way to reply to it (short of copying
> and pasting and losing context).
>
> Maybe https://www.mediawiki.org/wiki/Project:Support_desk (and its
> sibling https://www.mediawiki.org/wiki/Project:Current_issues ?) is the
> best place to ask questions about the software, its development, and
> other things. If so, let's make that fact much more well advertised!
> (Although, I think Flow is brilliant, when it's for discussing a wiki
> page — because the topic is already set (effectively by the title of the
> page its attached to). When it's trying to be a host to multiple
> unrelated topics, it becomes pretty annoying to use.)
>
> On Sun, 19 Nov 2017, at 05:57 AM, Niharika Kohli wrote:
>> I'd like to add that having Discourse will provide the one thing IRC
>> channels and mailing lists fail to provide - search capabilities. If you
>> hang out on the #mediawiki IRC channel, you have probably noticed that
>> we get a lot of repeat questions all the time. This would save everyone
>> time and effort.
>>
>> Not to mention ease of use. Discourse is way more usable than IRC or
>> mailing lists. Usability is the main reason there are so many questions
>> about MediaWiki asked on Stackoverflow instead:
>> https://stackoverflow.com/unanswered/tagged/mediawiki
>> <https://stackoverflow.com/unanswered/tagged/mediawiki>,
>> https://stackoverflow.com/questions/tagged/mediawiki-api?sort=newest
>> <https://stackoverflow.com/questions/tagged/mediawiki-api?sort=newest>,
>> https://stackoverflow.com/questions/tagged/mediawiki-extensions...
>>
>> I'd personally hope we can stop asking developers to go to IRC or mailing
>> lists eventually and use Discourse/something else as a discussion forum
>> for
>> support.
>>
>> On Sat, Nov 18, 2017 at 1:31 PM, Quim Gil <q...@wikimedia.org> wrote:
>>
>> > Hi, I have expanded
>> > https://www.mediawiki.org/wiki/Discourse#One_place_to_
>> > seek_developer_support
>> >
>> > https://www.mediawiki.org/wiki/Project:Support_desk is the only channel
>> > whose main purpose is to provide support. The volunteers maintaining it
>> > are
>> > the ones to decide about its future. There is no rush for any decisions
>> > there. First we need to run a successful pilot.
>> >
>> > The rest of channels (like this mailing list) were created for
>> > something else. If these channels stop receiving questions from new
>> > developers, they
>> > will continue doing whatever they do now.
>> >
>> > > I'd like to understand how adding a venue will improve matters.
>> >
>> > For new developers arriving to our shores, being able to ask a first
>> > question about any topic in one place with a familiar UI is a big
>> > improvement over having to figure out a disseminated landscape of wiki
>> > Talk pages, mailing lists and IRC channels (especially if they are not
>> > used to any of these environments). The reason to propose this new
>> > space is them, not us.
>> >
>> > On Sat, Nov 18, 2017 at 5:01 PM, MZMcBride <z...@mzmcbride.com> wrote:
>> >
>> > > Brian Wolff wrote:
>> > > >On Friday, November 17, 2017, Quim Gil <q...@wikimedia.org> wrote:
>> > > >> The Technical Collaboration team proposes the creation of a
developer
>> > > >> support channel focusing on newcomers, as part 

Re: [Wikitech-l] Proposal for a developer support channel

2017-11-17 Thread Brian Wolff
On Friday, November 17, 2017, Quim Gil  wrote:
> Hi,
>
> The Technical Collaboration team proposes the creation of a developer
> support channel focusing on newcomers, as part of our Onboarding New
> Developer program. We are proposing to create a site based on Discourse
> (starting with a pilot in discourse-mediawiki.wmflabs.org) and to point
> the many existing scattered channels there.
>
> This is an initial proposal for a pilot. If the pilot is successful, we
> will move it to production. For that to happen we still need to sync well
> with Wikimedia Cloud Services, Operations and the Wikimedia technical
> community.
>
> Please check https://www.mediawiki.org/wiki/Discourse and share your
> feedback.
>
> --
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

What does "point existing channels to Discourse" mean exactly? Are you
planning to shut down any existing channels? If so, which ones?
--
bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Security release: 1.29.2 / 1.28.3 / 1.27.4

2017-11-15 Thread Brian Wolff
Hi,

I believe they are coming.

-bawolff

On Wednesday, November 15, 2017, Seb35  wrote:
> Hi!
>
> There are no corresponding Git tags 1.29.2, 1.28.3, 1.27.4; could someone
> issue them?
>
> I guess they are respectively ee7f9fe, 5b85506, a806476.
>
> Thanks!
> ~ Seb35
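For reference, cutting a tag is mechanical once the commits are known. A sketch in a scratch repository (the version number is from the release above, the `ee7f9fe` id is only Seb35's guess, and real release tags would normally also be signed with `git tag -s`):

```shell
# Scratch-repo demo: create a lightweight tag and verify what it points at.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name "tag demo"
git commit -q --allow-empty -m "release commit"
git tag 1.29.2 HEAD    # in mediawiki/core this would be: git tag 1.29.2 ee7f9fe
tagged=$(git rev-parse 1.29.2)
head_sha=$(git rev-parse HEAD)
echo "1.29.2 -> $tagged"
```

Pushing the tag (`git push origin 1.29.2`) is what actually publishes it for downstream users.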
>
> On 15/11/2017 00:37, Sam Reed wrote:
>> I would like to announce the release of MediaWiki 1.29.2, 1.28.3 and
>> 1.27.4!
>>
>> These releases fix nine security issues in core and one related issue in
>> the vendor
>> folder. Download links are given at the end of this email.
>>
>> Patches will be pushed to gerrit after this email is sent, and will land
>> into the relevant
>> branches as fast as our CI infrastructure allows. Git tags will follow
>> soon after. All related
>> tasks will be made public in phabricator too in the following few hours.
>>
>> Please note that this month is the End-Of-Life date for MediaWiki 1.28.
>> This means that MediaWiki 1.28.3 will be the last security release for that
>> version, barring any unforeseen issues. We would strongly encourage
>> users of
>> MediaWiki 1.28 to upgrade to MediaWiki 1.29, released in July 2017, or a
>> yet newer version as soon as possible. MediaWiki 1.29 will be supported until
>> July
>> 2018. See  for more
>> information.
>>
>> This release also serves as a maintenance release for these branches.
>>
>> == Security fixes ==
>> * (T128209) Reflected File Download from api.php. Reported by Abdullah
>> Hussam. (CVE-2017-8809)
>> * (T165846) BotPasswords doesn't throttle login attempts.
>> * (T134100) On private wikis, login form shouldn't distinguish between
>> login failure
>>   due to bad username and bad password. (CVE-2017-8810)
>> * (T178451) XSS when $wgShowExceptionDetails = false and browser sends
>>   non-standard url escaping. (CVE-2017-8808)
>> * (T176247) It's possible to mangle HTML via raw message parameter
>> expansion.
>>   (CVE-2017-8811)
>> * (T125163) id attribute on headlines allow raw >. (CVE-2017-8812)
>> * (T124404) language converter can be tricked into replacing text inside
>> tags by
>>   adding a lot of junk after the rule definition. (CVE-2017-8814)
>> * (T119158) Language converter: unsafe attribute injection via glossary
>> rules (CVE-2017-8815)
>>
>> The following only affects 1.29:
>> * (T180488) (T125177) "api.log contains passwords in plaintext" wasn't
>> correctly fixed in all
>>   branches in the previous security release. (CVE-2017-0361)
>>
>> The following only affects 1.27 and 1.28:
>> * (T180231) composer.json has require-dev versions of PHPUnit with known
>> security
>>   issues. Reported by Tom Hutchison. (CVE-2017-9841)
>>
>> It is recommended to run `composer update --no-dev` after upgrading to MW
>> 1.27.4 or
>> 1.28.3 if you installed MediaWiki via git. If you are using the tarball,
>> you are not affected,
>> and you do not need to run this command. This will remove developer
>> dependencies that
>> production wikis do not require. If you require developer dependencies,
>> run `composer update`, which will update to a version of PHPUnit without
>> known RCE.
>>
>> If you cannot run `composer update` for any reason, it is recommended
>> that you delete the offending file yourself, as a minimum, using the
>> following command:
>>
>> `rm -rf vendor/phpunit/phpunit/src/Util/PHP/eval-stdin.php`
>>
>> == Links to all mentioned tasks ==
>> https://phabricator.wikimedia.org/T128209
>> https://phabricator.wikimedia.org/T165846
>> https://phabricator.wikimedia.org/T134100
>> https://phabricator.wikimedia.org/T178451
>> https://phabricator.wikimedia.org/T176247
>> https://phabricator.wikimedia.org/T125163
>> https://phabricator.wikimedia.org/T180231
>> https://phabricator.wikimedia.org/T125163
>> https://phabricator.wikimedia.org/T124404
>> https://phabricator.wikimedia.org/T119158
>> https://phabricator.wikimedia.org/T180488
>> https://phabricator.wikimedia.org/T125177
>>
>> == Release notes ==
>>
>> Full release notes for 1.27.4:
>> https://phabricator.wikimedia.org/diffusion/MW/browse/REL1_27/RELEASE-NOTES-1.27
>> https://www.mediawiki.org/wiki/Release_notes/1.27
>>
>> Full release notes for 1.28.3:
>> https://phabricator.wikimedia.org/diffusion/MW/browse/REL1_28/RELEASE-NOTES-1.28
>> https://www.mediawiki.org/wiki/Release_notes/1.28
>>
>> Full release notes for 1.29.2:
>> https://phabricator.wikimedia.org/diffusion/MW/browse/REL1_29/RELEASE-NOTES-1.29
>> https://www.mediawiki.org/wiki/Release_notes/1.29
>>
>> For information about how to upgrade, see
>> 
>>
>> **
>> Download:
>> https://releases.wikimedia.org/mediawiki/1.27/mediawiki-1.27.4.tar.gz
>>
>> Download without bundled extensions:
>> https://releases.wikimedia.org/mediawiki/1.27/mediawiki-core-1.27.4.tar.gz
>>
>> Patch to previous version (1.27.3):
>> 

Re: [Wikitech-l] PHPUnit error: Cannot access the database: Unknown database

2017-10-25 Thread Brian Wolff
Try passing

--wiki wiki

as a command-line argument to the script. (Assuming "wiki" is the DB name;
otherwise pass --wiki dbname_here.)
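Put together, and assuming a default MediaWiki-Vagrant layout (core at /vagrant/mediawiki, as the stack trace below suggests, and a database actually named "wiki"; both are assumptions to adjust), the invocation would look something like:

```shell
# Hypothetical invocation; MediaWiki-Vagrant keeps core at /vagrant/mediawiki
# and names its default database "wiki". Adjust both to your own setup.
cd /vagrant/mediawiki || exit 0   # bail out quietly where the path does not exist
php tests/phpunit/phpunit.php --wiki wiki extensions/Wikispeech/tests
```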

--
bawolff

On Wednesday, October 25, 2017, Sebastian Berlin <sebastian.ber...@wikimedia.se> wrote:
> I'm having trouble running PHPUnit for the extension I'm developing
> (Wikispeech), in Vagrant. I followed the instructions on
>
> https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Writing_unit_tests_for_extensions#Run_your_tests,
> i.e. `tests/phpunit/phpunit.php extensions/Wikispeech/tests`, and got the
> error:
>
>> Fatal error: Uncaught exception 'Wikimedia\Rdbms\DBConnectionError' with
>> message 'Cannot access the database: Unknown database
>> 'extensions/Wikispeech/tests/phpunit/ApiWikispeechTest.php' (127.0.0.1)'
>> in /vagrant/mediawiki/includes/libs/rdbms/database/Database.php:800
>>
> For full log, see https://phabricator.wikimedia.org/P6176.
>
> Does anyone know what's causing this?
>
> Previously I've run the tests with `phpunit extensions/Wikispeech/tests`
> which I realize isn't the correct way, after reading the Wiki-page again.
> However, this didn't work after I reinstalled Vagrant (due to various
> other problems).
>
> All the best,
>
> *Sebastian Berlin*
> Utvecklare/*Developer*
> Wikimedia Sverige (WMSE)
>
> E-post/*E-Mail*: sebastian.ber...@wikimedia.se
> Telefon/*Phone*: (+46) 0707 - 92 03 84
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Roadmap for continued compatibility with MySQL

2017-10-19 Thread Brian Wolff
On Thursday, October 19, 2017, Hogan, Michael C  wrote:
> How many years into the future is it reasonable to expect that MySQL will
> remain a recommended database server for "production use" along with
> MariaDB, now that the Foundation has switched to using MariaDB internally?
> I know that MariaDB claims to be drop-in compatible with MySQL, but the
> list of variances seems to be slowly growing. References...
> * Compatibility - [[mw:Compatibility]]
> * MariaDB support? - [[mw:Topic:R1hqki3kaytylml4]]
> * https://mariadb.com/kb/en/library/mariadb-vs-mysql-compatibility/
>
> Other than [[mw:Compatibility]] on mediawiki.org, is there anywhere else
> (wiki pages, Phabricator tasks/tags, etc.) that I should check for
> technical roadmap information before emailing wikitech-l?
>
>
> Thank you!
> Michael
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

MediaWiki usually doesn't use exotic new DB features. I would be shocked if
we dropped support for MySQL at any point in the foreseeable future, unless
MySQL becomes totally abandoned.

--
brian
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wmfall] New Developers Quarterly Report's first edition

2017-10-18 Thread Brian Wolff
On Tue, Oct 17, 2017 at 7:03 PM, Srishti Sethi  wrote:
> Hello everyone,
>
>
> I would like to share the first edition of the New Developers Quarterly
> Report that the Developer Relations team has produced. This report covers
> metrics, survey analysis and lessons learned from new developers focused
> activities in the previous quarter (July-September 2017).
>
>
> If you have questions and feedback that you would like to share with us,
> please add them on the discussion page.
>
>
> To receive a notification when a new report is published, subscribe here.
>
>
> We plan to release a report every quarter and take action items identified
> from the key findings for improving our existing methods and processes. The
> next release will be in January 2018.
>
>
> If you have any questions, comments, and concerns, we will be more than
> happy to hear them!
>
>
> Thanks,
>
> Srishti
>
>

From the report:

> Percentage of volunteers active one year (± 3 months) after their first
> contribution, out of all new volunteers attracted one year ago (between
> April–June 2016). (Source: Calculation on data)
>
>QoQ: -26.5%. YoY: -60.0%

That's kind of scary

--
bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Person of interest to blame

2017-10-03 Thread Brian Wolff
Realistically, it's probably in the EnhancedChangesList base class
somewhere, and not the watchlist itself.

--
bawolff

On Tuesday, October 3, 2017, C. Scott Ananian  wrote:
> git log includes/specials/SpecialWatchlist.php suggests:
>
> T172757 / I27bdaa1401323fa3143f79a57dc5b9773e48fd1d (sep 27)
> T176857 / I90c79ef9c4819a34060515b863277fd185828ed9 (sep 27)
> T176348 / If6280ad6fbad65909e1d0b2a48344e24d485aca2 (sep 21)
> T173533 / I4138fde22bdd8cc55c65846b91184c3ad3057244 (sep 19)
> T176155 / Idc4e450c9fa03fc78a012fb5eefa60174bb1245a (sep 18)
> T175965 / I14ba792f1cd435f89b2e09067b0a0e894a0a2557 (sep 14)
> T168376 / I3c75f9f2f6287414bf330f116d959d078250392d (aug 23)
> T174725 / I1cb402559edb242b40a77f509480560f0636264d (sep 1)
> T172030 / I65c50d75b21b3e5d7b69fdecb79c032f5d1d8853 (aug 31)
> -  / I1c50b67b0a4aaddb561e79fdacd9849ffe0ded8c (aug 31)
> and a bunch more in late august related to "WLFilters".
>
> As Sebastien suggests, the easiest way to identify the problem is to run a
> local copy and use git-bisect to identify exactly where the problematic
> change occurred.
>  --scott
>
>
> On Tue, Oct 3, 2017 at 2:37 PM, יגאל חיטרון  wrote:
>
>> Hello. There was a patch, or a bug, or a bug fix in the last month that
>> I'm trying to find. Something changed the Special:Watchlist view and
>> broke a couple of gadgets. I wanted to find what was the gerrit patch,
>> or at least some explanation of what exactly was done, but nobody knows.
>> (Thanks a lot to Quiddity, who spent a lot of time trying to help; you
>> are awesome.)
>> The problem is a different HTML structure of Special:Watchlist, at
>> least when "Group changes by page in recent changes and watchlist" in
>> preferences is on. I can't even describe the difference, because it
>> just "looks different". I read the roadmap for the last 50 days and
>> can't find anything. A couple of people I asked can't help. One of the
>> symptoms is different CSS inside the revision groups, bold vs. unbold.
>> Maybe somebody can help to find a gerrit to blame? Thank you,
>> Igal (User:IKhitron)
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
>
> --
> (http://cscott.net)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Writing unit tests for extensions that use dbr calls

2017-09-27 Thread Brian Wolff
Try adding

$this->tablesUsed[] = 'cacao_annotation';

to the setUp() method of your tests.

--
brian

On Wednesday, September 27, 2017, Jim Hu  wrote:
> I’m trying to improve my coding practices for our MW extensions, and I’m
> very confused by whether my problem with PHPUnit is a problem with the
> test I’m writing or with the extension object class I’m trying to test.
>
> My extension uses some custom database tables to store the properties of
> objects that are created during execution. In older versions of MW, I could
just create an object and test that the methods of the object class
returned appropriate values for known data that was already in the
database. Now, instantiating an object is creating temporary tables that
don’t exist and causing fatal errors. Example:
>
> 7) CacaoModelAnnotationTest::testNewCacaoModelAnnotationSuccess
> Wikimedia\Rdbms\DBQueryError: A database query error has occurred. Did you
> forget to run your application's database schema updater after upgrading?
> Query: SELECT annotation_id,row_id,original_row_data,annotation_timestamp,user_id,team_id,session_id,annotation_inning FROM `unittest_cacao_annotation` INNER JOIN `unittest_cacao_user` ON ((annotation_user = cacao_user.id)) WHERE annotation_id = "6057"
> Function: CacaoModelAnnotation::load
> Error: 1054 Unknown column 'cacao_user.id' in 'on clause' (localhost)
>
> This is the load method from class CacaoModelAnnotation:
>
> public function load() {
>     try {
>         wfProfileIn( __METHOD__ );
>
>         if ( $this->loaded ) {
>             return false;
>         }
>
>         MWDebug::log( __METHOD__ . ' called for annotation_id: ' . $this->id );
>
>         $dbr = wfGetDB( DB_SLAVE );
>
>         // query to get annotation attributes
>         $result = $dbr->select(
>             array( 'cacao_annotation', 'cacao_user' ),
>             array(
>                 'annotation_id', 'row_id', 'original_row_data',
>                 'annotation_timestamp', 'user_id', 'team_id',
>                 'session_id', 'annotation_inning'
>             ),
>             'annotation_id = "' . $this->id . '"',
>             __METHOD__,
>             array(),
>             array(
>                 'cacao_user' => array( 'INNER JOIN', 'annotation_user = cacao_user.id' ),
>             )
>         );
>         if ( $dbr->numRows( $result ) == 0 ) {
>             throw new CacaoException(
>                 wfMessage( 'cacao-annotation-load-returned-zero-rows', $this->id )->text()
>             );
>         }
>         $resultRow = $dbr->fetchObject( $result );
>         $this->loadFromResultRow( $resultRow );
>
>         wfProfileOut( __METHOD__ );
>     } catch ( CacaoException $e ) {
>         if ( !isset( $thrown ) ) {
>             throw $e;
>         }
>         $thrown = true;
>     }
> }
>
> Is there a good example of an extension that uses custom database tables
> that I can use as a role model?
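
One more sketch that may help with the "known data already in the database" part of the question: the cloned unittest_ tables start out empty, so with MediaWikiTestCase fixture rows are typically inserted in an addDBData() override. The column names below are guessed from the query in the error above, not taken from the real schema:

```php
class CacaoModelAnnotationTest extends MediaWikiTestCase {
	// Called after the cloned unittest_ tables exist; they begin empty,
	// so the "known data" each test relies on is inserted here.
	public function addDBData() {
		$dbw = wfGetDB( DB_MASTER );
		// Hypothetical fixture rows; adapt field names to the schema.
		$dbw->insert( 'cacao_user',
			array( 'id' => 1 ),
			__METHOD__ );
		$dbw->insert( 'cacao_annotation',
			array( 'annotation_id' => 6057, 'annotation_user' => 1 ),
			__METHOD__ );
	}
}
```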
