Re: [Wikitech-l] Recently proposed patchsets by new contributors awaiting code review - organization?

2016-10-02 Thread Marcin Cieslak
I just checked this one:

> http://korma.wmflabs.org/browser/scr-backlog.html

how does one recognize or define "Organization" affected by the backlog?
(it's the first time I see this)

Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC] Giving actual CSRF tokens to not logged in users (T40417)

2016-09-29 Thread Marcin Cieslak
On 29.09.2016, Max Semenik wrote:

>> Note it will affect scripts and API clients that expect to see "+\" as the
>> token as a sign that they're logged out, or worse assume that's the token
>> and don't bother to fetch it.
>
>
> We had breaking API/frontend infrastructure changes before, this one seems
> less invasive and will break only badly written clients. In any case, most
> clients are intended for logged in users.

Before this thread I believed that the "+\" token was some kind of a bug
(but it worked anyway, so I didn't bother digging further).
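For illustration, here is a minimal Python sketch of the fragile client-side check the quoted paragraph warns about. The helper name is mine, not from any real client; only the two-character "+\" token value comes from the thread.

```python
# MediaWiki historically hands anonymous (logged-out) users the literal
# two-character token "+\" instead of a real CSRF token.
ANON_TOKEN = "+\\"

def looks_logged_out(token: str) -> bool:
    # Fragile heuristic some clients use: exact match against "+\".
    # It breaks as soon as the server starts issuing real-looking tokens
    # to anonymous users, which is what T40417 proposes.
    return token == ANON_TOKEN
```

A real logged-in token is an opaque string that merely ends with "+\", so the exact comparison is the only thing distinguishing the two cases.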

Saper



Re: [Wikitech-l] Wikimedia Developer Summit 2017 discussion

2016-09-29 Thread Marcin Cieslak
On 28.09.2016, Quim Gil wrote:

> Summit sessions are considered tasks themselves, not just a conversation
> happening in a room and eventually documented in a wiki page.

I think this captures the opinions expressed here very well
(if it had to be one sentence).

a. Some folk prefer Phabricator because it provides workflow, tracking,
   boards, hierarchy (task->substask), explicit (scrollable) history,
   short URLs.

b. Others prefer MediaWiki because it is easier to cooperate in-place,
   visual editor, better search, good linking, readable titles and being loyal
   to our own tools.

I'd say (a) set of requirements is better for typical (classic) project
management, for something bound in time and resources that needs to be
managed swiftly.

Set (b) of requirements provides better community involvement, transparency
and it is easier to maintain things that are perpetual work in progress
(never need to be really "done").

(a) is better for "fast moving consumer goods" of sorts,
(b) is better for long-term stuff. 

But a wiki is not a "final resting place" for documentation polished
elsewhere. Things should not merely become "eventually documented
in a wiki page".

In my other note I wrote how CCC is using the pentabarf submission and
conference scheduling tool for (a) and MediaWiki for (b), probably
for the same reasons we have here.

I think I kind of share both points of view: my event organiser's brain
is with (a) but my volunteer heart is with (b).


One nice solution would be to teach Phabricator to treat links
to wiki items as first-class objects that can be tagged, prioritized,
given deadlines, assigned, traced, etc. I could imagine having MediaWiki
pages as items on the project board and some correspondence between
categories and Phabricator tasks and boards. A casual look at Phabricator
does not reveal a task for that (but it may be that I simply could not
find it).


There is one more thing that may explain why we are having this discussion
right now: Phabricator, filled with the content we have at the moment, is
very difficult to search. I literally have to remember the titles of tasks
to have any chance of finding them again. I have a feeling (which might be
our fault and not the software's) that it is filled with temporary junk
which was there only for the purpose of some workflow or tracking some
time ago. I somehow feel our Phabricator instance is overloaded with
those shooting stars (events, short-lived action items, etc.).


This started to annoy me some time ago (especially given
my ad-hoc and seasonal interest in MediaWiki development), but
it never boiled over enough for me to say something about it;
I just sighed and moved on.


I think many participants in this thread feel something similar,
and the thread just got hijacked to express something
a bit broader than its original purpose.


Saper

sent from a desktop device. please excuse my verbosity.



Re: [Wikitech-l] Wikimedia Developer Summit 2017 discussion

2016-09-29 Thread Marcin Cieslak
On 28.09.2016, Yaron Koren wrote:
> Hi Quim,
>
> Most relevantly, the Chaos Communications Congress wiki uses the Semantic
> Forms [1] extension to handle submissions - speakers use a form to enter
> their talk proposals. I don't know how exactly talks are approved, or
> whether the form is used for approving/rejecting talks too - or, for that
> matter, whether they have any sort of real screening process. But the basic
> mechanics are that forms are used for entering all the relevant information
> about each talk, including tags (among many other fields).

Yaron, this is interesting: 

I know that CCC is using MediaWiki extensively to show the programme and enable
cooperation between participants (incl. sharing a ride or dating),
but when I had the privilege to speak at the Congress in 2014 they used
a homegrown tool called pentabarf to collect and evaluate submissions.

By the way, I am amazed how well their MediaWiki usually performs under
very heavy load during the event.

Saper



Re: [Wikitech-l] Gerrit screen size

2016-09-25 Thread Marcin Cieslak
On 25.09.2016, Tim Starling wrote:
> On 25/09/16 21:09, Bináris wrote:
>> Hi,
>> 
>> I try to familiarize myself with Gerrit which is not a good example for
>> user-friendly interface.
>> I noticed a letter B in the upper right corner of the screen, and I
>> suspected it could be a portion of my login name. So I looked at it in HTML
>> source, and it was. I pushed my mouse on it and I got another half window
>> as attached.
>> 
>> So did somebody perhaps wire the size of a 25" monitor into page rendering?
>> My computer is a Samsung notebook.
>
> In T38471 I complained that the old version was too wide at 1163px
> (for my dashboard on a random day). Now the new version is 1520px.

This is strange; on a Windows box with FF 49 I only need 1075px
when not logged in and 1090px when logged in to avoid a scroll bar.
(Keep in mind that "saper" is short.)

On FreeBSD's oldish Firefox, the username part is almost always hanging off.
About 1400px is enough (if only the screen were wide enough).

Something with fonts?

Saper



Re: [Wikitech-l] How to login with API?

2016-09-17 Thread Marcin Cieslak
On 06.09.2016, Gergo Tisza wrote:
> On Mon, Sep 5, 2016 at 8:08 PM, Bináris  wrote:
>
>> I found bot passwords, but they offer very limited user rights.
>>
>
> They offer rights for anything which is defined in $wgGrantPermissions [1],
> which should be include everything a bot might need. Feel free to file a
> bug if that's not the case, but maybe you are just overlooking something?
>
> So the best thing is to try without API?

Bináris, did you manage to solve it?

I am using "compat" all the time and I have no login issues.

Just tried with a fresh clone, an empty "login-data", and
no user-config.py, and "python login.py" worked just fine.

Saper



Re: [Wikitech-l] Outreachy-13 mentors needed

2016-09-17 Thread Marcin Cieslak
On 17.09.2016, Sumit Asthana wrote:
>>
>> I have been contacted on IRC by a candidate willing to work on graphs
>> and Graphoid.  I can offer my generic knowledge of MediaWiki
>> as well as substantial node experience (incl. writing extensions).
>>
> Yes, we need mentors. Are you having any specific project related to graphs
> in mind? We're maintaining the list of all possible projects at
> possible-tech-projects
> and
> featured Outreachy projects at Outreachy board
>.

Thank you. If a new project emerges, it will be proposed on
Phabricator.

>> I have added myself to the bench of potential mentors on the wiki;
>> is there anything else I should do, except for signing up as a potential
>> mentor on opw.gnome.org?
>>
>  Yes, if the the project you'd like to mentor exists on the above mentioned
> board, add yourself as a mentor, if not create a new task and we'll look
> into featuring it. Please go through the possible-tech-projects board(
> especially the "*Could have*" column ) to see if you like any task to
> mentor and ping on it.

I have just signed up on the Outreachy website, giving a pointer
to my mediawiki.org user page.

I am reviewing both boards from time to time as well, thank you!

Saper



Re: [Wikitech-l] Outreachy-13 mentors needed

2016-09-17 Thread Marcin Cieslak
On 12.09.2016, Sumit Asthana wrote:
> Hello Wikimedians,
>
> Gnome's Outreachy-13  is round the corner
> and Wikimedia is again participating in yet another internship season ( Dec
> 6 to March 6 ). The application deadline is *Oct - 17*.
>
> Its interesting to see that candidates have already started to ping on
> projects and show their interest!

I have been contacted on IRC by a candidate willing to work on graphs
and Graphoid.  I can offer my generic knowledge of MediaWiki
as well as substantial node experience (incl. writing extensions).

I have added myself to the bench of potential mentors on the wiki;
is there anything else I should do, except for signing up as a potential
mentor on opw.gnome.org?

Saper



Re: [Wikitech-l] Gerrit 2.12.2 test instance - PLEASE TEST

2016-07-20 Thread Marcin Cieslak
On 15.07.2016, Isarra Yos wrote:
>
>   * Diffs - using icons for back to change, previous file, next file is
> very unclear and hard to find; new users will not see them and have
> any idea what to do with them. The current textual 'up to change',
> previous/next file is immediately clear and would be great to
> maintain. (The side-by-side and unified diff icons are also very
> strange, like fuzzy rgb blobs, but at least they're a bit clearer.
> Sort of.)

Those icons are extremely ugly and unreadable. Just plain black
arrows would be much better.

>   * Diff preferences - white text on black background is very bad,
> especially when it suddenly appears on top of an interface that uses
> black text on white.

In general, the diff screen looks like it was taken from a completely
different app.

>   * The very narrow scrollbars are weird and unnecessary, and hard to
> use. 

Those wimpy scrollbars are barely visible and very hard to click on.

In short, the new diff screen is a disaster, but I never really use
it (only for comments, if ever) - I prefer reviewing on the command line
anyway.

Also, I suffer badly (maybe because of my outdated browser) from
horizontal scrollitis:

my user ID and part of the search box extend past the right margin
of the screen, so I have to scroll to log in (the old version has this
problem too).

This is strange, since the interface generally tries very hard to adjust
to the current screen width.

The "Same Topic" gadget is also scrolling off the screen.
Actually, after clicking through a few changes it seems to be delivered
to the screen later, so suddenly the change metadata box resizes
to make a bit more room for the "Same Topic" feature.

The "Same Topic" thing is potentially great, but it feels like a bolt-on
right now (not as bad as the new diff screen, though). (It's a plugin,
right?)

good things:

"Conflicts With" is GREAT!!! I barely noticed it was there
(scrolled off), but this is very much needed.
It should be an invitation to completely new
collaboration scenarios.

The patch list/downloads removed from the main screen - good.
I was looking for them for a moment (as others already complained),
but after giving it a bit of thought I am happy they are hidden.
Finally I can copy the URL easily with a simple X11 PRIMARY
selection.

Generally it feels more compact; unlike Isarra, I am not sure we need
more padding.

Since it seems we have lost the contributors after the Gerrit migration
anyway, I think we should upgrade - things seem to be a bit better this
time.

Saper




Re: [Wikitech-l] RFC meeting: Minimum PHP version

2015-11-28 Thread Marcin Cieslak
On 2015-11-24, Rob Lanphier  wrote:
> Hi folks,
>
> This week's RFC review meeting is scheduled for Wednesday, November 25
> at 2pm PST (22:00 UTC).  Event particulars can be found at
>
>
> The main task this week is to plan out what we will define the minimum
> PHP version to be for MediaWiki 1.27 (the next LTS version).  The
> viable choices seem to be:
> *  PHP 5.3 (the status quo) - this version is no longer supported
> upstream, and doesn't have widespread support even in conservatively
> updated Linux distros.
> *  PHP 5.4 - this version is no longer supported by The PHP Group, but
> is still part of older supported Linux distros (e.g. Debian Wheezy)
> *  PHP 5.5 - this is the lowest version with reliable LTS support in

In one enterprise environment I am familiar with, one is stuck
with SLES 11.3/11.4 and its own enterprise repository, and the
newest PHP I've seen there was 5.3.something.

Saper




Re: [Wikitech-l] [MediaWiki-announce] MediaWiki 1.26 Now Available

2015-11-26 Thread Marcin Cieslak
On 2015-11-25, Chad  wrote:
> Hello everyone,
>
> https://phabricator.wikimedia.org/diffusion/MW/browse/REL1_26/RELEASE-NOTES-1.26

> ResourceLoader now loads all scripts asynchronously. The top-queue and 
> startup modules are no longer synchronously loaded.

so we ship this now with https://phabricator.wikimedia.org/T115692 open? ouch...

saper



Re: [Wikitech-l] Outreachy round 11 results out

2015-11-17 Thread Marcin Cieslak
On 2015-11-17, Tony Thomas <01tonytho...@gmail.com> wrote:
> Hello all,
>
> Glad to inform that the results of Outreachy round 11 are out, and one of
> our candidate - Josephine Lim made it to the program for her proposal
> on Easier categorization of pictures in Upload to Commons Android app -
> https://phabricator.wikimedia.org/T115101. Congratulations to the candidate
> and the mentors involved. The select should follow up with the mentors to
> finish the 'Community bonding period' as per [1].

Congratulations, Josephine!

Saper



Re: [Wikitech-l] Yandex?

2015-11-14 Thread Marcin Cieslak
On 2015-11-13, Runa Bhattacharjee  wrote:

> Specifics about this can be seen at:
>
> https://www.mediawiki.org/wiki/Content_translation/Machine_Translation/Yandex#Summary_of_terms_of_Yandex_agreement
>

Thanks! This is very helpful.

The link was already in the FAQ (although a bit buried in the text);
I gave it a bit more exposure:

https://www.mediawiki.org/w/index.php?title=Content_translation%2FDocumentation%2FFAQ&type=revision&diff=1939107&oldid=1936677

Thanks a lot!

Saper



Re: [Wikitech-l] Git for idiots

2015-11-14 Thread Marcin Cieslak
On 2015-11-13, Purodha Blissenbach  wrote:
> Hi,
>
>> git clone 
>> ssh://review.openstack.org:29418/openstack-infra/git-review.git
>
> fatal: Could not read from remote repository.
>
> Please make sure you have the correct access rights
> and the repository exists.
>
> Greetings -- Purodha

You can use

 git clone https://review.openstack.org/openstack-infra/git-review.git

if you don't have an account on review.openstack.org with your SSH key

Saper



Re: [Wikitech-l] Yandex?

2015-11-12 Thread Marcin Cieslak
On 2015-11-11, Luis Villa  wrote:
> I haven't been directly involved in a while, but we were certainly not
> paying Yandex last time I looked at the arrangement. Appropriate legal
> steps were also taken to protect the licensing of the content, and
> appropriate technical steps to protect PII of editors, among other things.

It looks like the Yandex translation API is free for up to 1 million codepoints
a day and 10 million a month. They also say the service is made
available for personal, non-commercial use only.

Their terms and conditions also require that the software description,
the help pages, and the pages which use the translation results display
the hyperlinked notice
<a href="http://translate.yandex.ru/">Переведено сервисом
«Яндекс.Переводчик»</a> ("Translated by the Yandex.Translate service").

(How) are we going to meet this requirement?

Saper

https://yandex.ru/legal/translate_api/ API ToS
https://yandex.ru/legal/translate_termsofuse/ General Yandex Translate ToS
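As a back-of-the-envelope illustration of staying under those limits: the numbers come from the paragraph above, but the class and its interface are made up for this sketch and are not part of any Yandex or MediaWiki API.

```python
# Limits quoted in the thread: 1 million codepoints/day, 10 million/month.
DAILY_LIMIT = 1_000_000
MONTHLY_LIMIT = 10_000_000

class QuotaTracker:
    """Hypothetical helper: track codepoints sent to a translation API."""

    def __init__(self) -> None:
        self.day_used = 0
        self.month_used = 0

    def can_send(self, text: str) -> bool:
        # Python's len() on str counts codepoints, matching the quota unit.
        n = len(text)
        return (self.day_used + n <= DAILY_LIMIT
                and self.month_used + n <= MONTHLY_LIMIT)

    def record(self, text: str) -> None:
        n = len(text)
        self.day_used += n
        self.month_used += n
```

A real client would also need to reset the counters at day/month boundaries; that bookkeeping is omitted here.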



Re: [Wikitech-l] Summit proposal: Turning the Table of Contents into a discrete object

2015-11-10 Thread Marcin Cieslak
On 2015-11-10, Isarra Yos  wrote:
> Hi! I would like to turn the mw ToC into a discrete object within the 
> codebase. Write a ToC class and pull all the random building parts out 
> of the parser and five levels of pageoutput, and make it stop messing up 
> the page caching and stuff. Make this class a thing, separate from the 
> content itself, that can appear on the page or be toggled or messed with 
> or added to or moved or whatever by extensions.
>
> I have a proposal about this for the developers summit which is about as 
> specific: https://phabricator.wikimedia.org/T114057

Wow, very good - I would no longer need
https://www.mediawiki.org/wiki/Extension:DeToc

There are probably lots of nuances, though.

Saper



Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`

2015-11-06 Thread Marcin Cieslak
On 2015-11-05, Ryan Lane  wrote:
> Is this simply to support hosted providers? npm is one of the worst package
> managers around. This really seems like a case where thin docker images and
> docker-compose really shines. It's easy to handle from the packer side,
> it's incredibly simple from the user side, and it doesn't require
> reinventing the world to distribute things.

I got heavily involved in the Node world recently and I fully share your
opinion about npm; npm@3 takes the disaster to the next level.

Are we using any native npm modules in our stack? *That* is hard
to support.

> If this is the kind of stuff we're doing to support hosted providers, it
> seems it's really time to stop supporting hosted providers. It's $5/month
> to have a proper VM on digital ocean. There's even cheaper solutions
> around. Hosted providers at this point aren't cheaper. At best they're
> slightly easier to use, but MediaWiki is seriously handicapping itself to
> support this use-case.

I feel very strongly there is a need for a quick setup for people who
have their LAMP stack already working and feel familiar with that
environment. The problem is that a full-stack MediaWiki is no longer a
LAMP application. Those people aren't going away any time soon, nor are
they about to join the coolest game in town.

I have already written scripts to keep code, vendor and core skins in sync
from git. I am beginning to write even more scripts to quickly deploy/destroy MW
instances. (My platform does not do Docker, btw.).

Maybe the right strategic move would be to implement MediaWiki phase
four in server-side JavaScript. Then the npm way is probably the only way
forward.

Saper



Re: [Wikitech-l] [Reading] October updates

2015-11-04 Thread Marcin Cieslak
On 2015-11-03, MZMcBride  wrote:
> Moushira Elamrawy wrote:
>>Heads up that we have updates available for Reading team's work during
>>September and October. One highlight is the new "*Read more*
>>" feature
>>that will be tested on mobile and desktop web beta, soon.
>
> Hi.
>
> How are "related articles" suggested?
>
> The bottom of articles often have manually created lists of related
> articles already, in the form of navigation boxes, category links, "in
> this series" links (for people who have won an award, for example), etc.
>
> It seems pretty sketchy for the "Reading" team to be inserting
> programmatically generated related articles into the content area, which
> is primarily within the control and discretion of human editors.

I wonder how {{#related:Something}} is different from an ordinary link.
I think that embedded and linked lists, as well as "Special:WhatLinksHere",
build a fantastic network of knowledge that can be browsed for hours
(I love that!).

On the other hand, when editing (especially when creating a small
article), the number of bureaucratic tasks involved in categorizing,
creating inline and listed links, and creating links
pointing towards the newly created article is sometimes overwhelming.
If that could be machine-assisted, it would be really, really great.
Does this look like a task for the "Writing" department?

Off-topic:

As I wrote this an old communist-times story from
the https://en.wikipedia.org/wiki/Radio_Yerevan_jokes
series springs up:

"- Why are police patrols in our country always
   composed of two policemen and a dog?
 - One policeman can read, the other can write
   and the dog has recently graduated."

~Saper



Re: [Wikitech-l] documentation files in git and .{media, }wiki extension (bikeshedding!)

2015-11-04 Thread Marcin Cieslak
On 2015-11-03, S Page  wrote:
> I noticed two conventions for the extension of wikitext files in git:
> .mediawiki and .wiki, e.g. tests/browser/README.mediawiki and
> extensions/Wikibase/docs/lua.wiki. GitHub will render both kinds as
> MediaWiki wikitext (of course it only implements a subset); it seems
> Phabricator doesn't recognize either extension. I think we should be
> consistent. .mediawiki is more precise (it's not MoinMoin or IkeWiki
> markup) while .wiki is shorter. Maybe there are other arguments for one
> extension over the other.

If both "README" and "README.mediawiki" were present I'd tend to think
that this is some imported piece of software and README is the upstream
README file and README.mediawiki contains MW-specific bits (like
import/upstreaming instructions, whatever). Something we might start
to see in vendor/ eventually.

I avoid extensions for text files without source code at all,
but maybe because I don't use Windows a lot.

Somehow funny we cannot use our own markup where we need to :)

Saper



Re: [Wikitech-l] MW 1.25 extension registration - PHP constants redux

2015-11-03 Thread Marcin Cieslak
On 2015-11-02, Jason Ji  wrote:
> Hi all,
>
> Sorry to be spamming this list due to my own incompetence. I'm basically
> doing an upgrade-a-thon of our various extensions to the new extension
> format, and therefore running into issues as I go.
>
> In a previous email I discussed a problem I was having with the fact that
> the new extension registration doesn't support PHP constants, which I was
> using for explicit dependencies. Now I'm working on upgrading another
> extension where I use PHP constants to allow users to configure options for
> my extension.
>
> My extension is called CommentStreams. In the old registration approach,
> inside of CommentStreams.php, I had three constants defined:
>
> define('NS_COMMENTSTREAMS', 1000);
> define('CS_COMMENTS_EXPANDED', 0);
> define('CS_COMMENTS_COLLAPSED', 1);

You might want to see how ProofreadPage solved related problems:

https://phabricator.wikimedia.org/T39483

and the fix in 98bb44055a9a44429ab40d9ff71375eb79379d24
by yours truly (although at the time it didn't use NS_xxx anymore).

~saper



Re: [Wikitech-l] Random rant

2015-10-28 Thread Marcin Cieslak
On 2015-10-28, Brian Wolff  wrote:
> I'm not sure how necessary that all is, especially for apps with only
> normal edit rights, or less. If an app maintainer tries to pull
> anything silly, we can just block it. Users can already be tricked
> into giving their password to someone malicious, at least this way we
> can easily keep track of what's going on.

I think the point of the approval process is that I don't ship an OAuth
app key inside an application like Vicuña (https://github.com/yarl/vicuna).

There is no clear consensus on what to do with apps like this. One idea
would be that every user registers it for themselves, but that
of course wouldn't work with the permission queue.

~saper



[Wikitech-l] Organizing project in the Phabricator - advice needed

2015-10-28 Thread Marcin Cieslak
Hi,

I recently did some review of the MediaWiki-Installer
project, fixing some minor bugs right away:

https://gerrit.wikimedia.org/r/#/q/project:mediawiki/core+branch:master+topic:installer,n,z

https://phabricator.wikimedia.org/tag/mediawiki-installer/

As far as I understand the workboard, I cannot (and shouldn't)
have too many columns - they are there to illustrate the progress of work.

Since we have so many issues open there (129 open in the backlog,
which is one per 100 lines of code counted with "wc -l"),
I'd like to group them into those that can be tested and even
probably fixed at the same time:

* Restricted environments (currently a column on the workboard).
* Users and passwords
* Environment checks

etc.

Those things need to be pretty fluid (I like to re-think
and re-organize as I go).

How can I do that? I understand I need permission to create
a project, which is too bureaucratic for this task.

Probably the good way would be to define sprints (but without
dates, since it's volunteer work :)), but a Sprint is also
a project, so I need permission for that, too.

Any ideas how to do it?

~Saper



Re: [Wikitech-l] 1.26 Release

2015-10-28 Thread Marcin Cieslak
On 2015-10-27, Chad  wrote:
> Hi all,
>
> We're rapidly approaching the 1.26 release. General guilting e-mail
> to get some extra eyes on the things currently tagged against the
> release.
>
> https://phabricator.wikimedia.org/maniphest/query/jcSXdUecbcLp/#R
>
> Buggy releases -> sad times. Let's wrap this one up :)

What is the correct use of MW-1.26-release and MW-1.26-release-notes
tags in ze Phab?

1) Should I add RELEASE NOTES entry in every change that closes a bug?
2) When should I tag MW-1.26-release-notes ?
3) How should I tag changes that are in master but should be backported?

examples:
https://phabricator.wikimedia.org/T116374
https://phabricator.wikimedia.org/T116375
https://phabricator.wikimedia.org/T75031

~saper



Re: [Wikitech-l] Organizing project in the Phabricator - advice needed

2015-10-28 Thread Marcin Cieslak
On 2015-10-28, Brad Jorsch (Anomie)  wrote:
> On Wed, Oct 28, 2015 at 12:26 PM, Kevin Smith  wrote:
>
>> My main trick for moving items between columns on a board that has a lot of
>> columns is to zoom my browser out so far that the text is microscopic (and
>> all the columns are visible). Then use the browser find feature (Ctrl-F) to
>> highlight the task you want to move, so you can see it. Then drag it. For
>> me, that's easier than coordinating the "scroll while dragging" operation.
>>
>
> My main trick is a Greasemonkey script to narrow the columns to fit the
> screen. The non-boilerplate bit is
>
> GM_addStyle(".aphront-multi-column-fixed .phui-workpanel-view,
> .phui-workpanel-view .phui-header-shell { width: auto; }");

Amazing how people deal with such an inconvenience! IMHO simple
keyboard shortcuts (h j k l) could do wonders!

I subscribed to https://secure.phabricator.com/T5240 to watch progress.

Thanks everyone for your input. I'll request the Create Project
permission and experiment with Sprints/mini sub-projects.

~saper



Re: [Wikitech-l] MediaWiki core tests failing? why?

2015-10-26 Thread Marcin Cieslak
On 2015-10-26, Brad Jorsch (Anomie) <bjor...@wikimedia.org> wrote:

Brad - thank you very much for having a look at this.

> On Sat, Oct 24, 2015 at 7:14 PM, Marcin Cieslak <sa...@saper.info> wrote:
>
>> I am getting few interesting failures:
>>
>> - floating point format problems
>>
>
> Why do you have serialize_precision set to 100? Once you go over the
> default of 17, you're into the range of rounding errors in an IEEE double,
> which in turn is what PHP typically is using for its float type.
>
> This probably doesn't break anything besides unit tests, though.

That is probably inherited from the vendor's (Gentoo) defaults.
This should probably be fixed in the test - by using a fixed precision,
enforced by ini_set() in the worst case.

Is there any requirement for those serializations to be consistent
in the API? If yes, then the API value formatting should probably be
controlled more tightly.

The test passes after setting serialize_precision to 14.
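Brad's point about 17 digits can be demonstrated without PHP; the same IEEE-754 double behaviour is visible from Python (a standalone sketch, not MediaWiki code):

```python
x = 0.1

# 17 significant digits are enough to round-trip any IEEE-754 double exactly:
assert float(f"{x:.17g}") == x

# Asking for far more digits (as serialize_precision=100 effectively does
# in PHP) only exposes the binary rounding noise of the nearest
# representable double:
print(f"{x:.30g}")  # begins with 0.100000000000000005551115...
```

This is why tests comparing serialized floats break once the precision setting exceeds 17: the extra digits are rounding noise, not meaningful data.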

>> - various XMP XML metadata issues
>>
>
> This seems to cover most of your errors. In particular, the metadata seems
> to be completely missing. What might be happening there is that
> XMPReader::isSupported() may be returning false on your system due to
> missing dependencies, and all these tests should be made to skip in that
> case.

A quick test with eval.php gives me

> var_dump(XMPReader::isSupported())
bool(true)

Will investigate.

>> and few others.
>>
>
> The only "other" looks like failures 3-6, which seems to boil down to some
> issue with JavaScriptContent::getRedirectTarget() recognizing redirects in
> general. If you can manage to figure out one of them (e.g. failure 4), it
> will probably figure out all of them. My guess offhand is that something in
> your setup is making the preg_match() in that function fail to match.

I think that 'wgScriptPath' set to '/w/index.php' might be incorrect
(I think they meant 'wgScript'). 

I have $wgScriptPath = "/~saper/ybabel"; in my LocalSettings.php
and this confuses the test.

https://gerrit.wikimedia.org/r/249004 fixes this for me.

Marcin



Re: [Wikitech-l] MediaWiki core tests failing? why?

2015-10-26 Thread Marcin Cieslak
On 2015-10-26, Brad Jorsch (Anomie) <bjor...@wikimedia.org> wrote:
> On Sat, Oct 24, 2015 at 7:14 PM, Marcin Cieslak <sa...@saper.info> wrote:
>
>> I am getting few interesting failures:
>>
>> - floating point format problems
>>
>
> Why do you have serialize_precision set to 100? Once you go over the
> default of 17, you're into the range of rounding errors in an IEEE double,
> which in turn is what PHP typically is using for its float type.
>
> This probably doesn't break anything besides unit tests, though.

serialize_precision too high:

filed bug https://phabricator.wikimedia.org/T116683

gerrit change https://gerrit.wikimedia.org/r/#/c/249018/ that does ini_set()


>> - various XMP XML metadata issues
>>
>
> This seems to cover most of your errors. In particular, the metadata seems
> to be completely missing. What might be happening there is that
> XMPReader::isSupported() may be returning false on your system due to
> missing dependencies, and all these tests should be made to skip in that
> case.

Two problems: 

- XMP XML metadata reading needs allow_url_fopen enabled:

filed bug https://phabricator.wikimedia.org/T116701

also filed https://phabricator.wikimedia.org/T116704 to make sure
CI is running without allow_url_fopen (got bitten before as well)

- Some charset/iconv problem for testIPTCParseForcedUTFButInvalid()

filed bug https://phabricator.wikimedia.org/T116705
no patch/idea why it fails yet

>
>> and few others.
>>
>
> The only "other" looks like failures 3-6, which seems to boil down to some
> issue with JavaScriptContent::getRedirectTarget() recognizing redirects in
> general. If you can manage to figure out one of them (e.g. failure 4), it
> will probably figure out all of them. My guess offhand is that something in
> your setup is making the preg_match() in that function fail to match.

reason: mixed up $wgScript / $wgScriptPath

gerrit change https://gerrit.wikimedia.org/r/#/c/249004/
merged by legoktm, thanks!


Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] XSS warning for an image download

2015-10-25 Thread Marcin Cieslak
On 2015-10-25, Pine W  wrote:
> When I right-click on the image download link for
> File:Commodore_Grace_M._Hopper,_USN_(covered).jpg the download I get is
> only 269 bytes and it contains a 404 error in plaintext even though it's a
> jpg file.
>
> When I click on the image preview that's 480x600 pixels, I get an XSS
> warning from Noscript.

upload.wikimedia.org is 2620:0:862:ed1a::2:b for me:

https://upload.wikimedia.org/wikipedia/commons/thumb/a/ad/Commodore_Grace_M._Hopper%2C_USN_%28covered%29.jpg/480px-Commodore_Grace_M._Hopper%2C_USN_%28covered%29.jpg

This gives me a 55471 byte JPEG file:

MD5 (480px-Commodore_Grace_M._Hopper,_USN_(covered).jpg) = 
edc91a604c7538e869b3be3d122d1fbe

Some caching problem somewhere?

~Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki core tests failing? why?

2015-10-24 Thread Marcin Cieslak
Hello,

I have tried to run our current MediaWiki tests
using PHP 5.4 on a Gentoo machine against +/- git master:

+ git -C . log --oneline -1
5c63cce Merge "ApiQueryAllRevisions: Actually use 'start' and 'end'"
+ git -C vendor log --oneline -1
5efd7d7 Update OOjs UI to v0.12.12
+ git -C skins/Vector log --oneline -1
9f5f333 Localisation updates from https://translatewiki.net.


I have posted my php.ini to  https://phabricator.wikimedia.org/P2226
I am running PHPUnit 4.3.1; 3.x series crashes
at some point.

I am getting a few interesting failures:

- floating point format problems
- various XMP XML metadata issues

and a few others.
 
This looks to me like an environment problem
(some library too old, etc.).

Any hints where those come from?
If those are env problems indeed, I'd like to
try to add some checks to the installers.

~saper


env LC_MESSAGES=C LANG=C LC_TIME=C  php5.4 -c ${HOME}/php54.ini 
tests/phpunit/phpunit.php --configuration tests/phpunit/suite.xml 
--exclude-group Broken,Stub,Dump,ParserFuzz --log-junit 
"${HOME}/tests/log/postgres-log.xml"
Script started on Sun Oct 25 00:47:01 2015
#!/usr/bin/env php
Using PHPUnit from /usr/share/php/phpunit/phpunit.phar
PHPUnit 4.3.1 by Sebastian Bergmann.

Configuration read from 
/usr/home/saper/public_html/ybabel/tests/phpunit/suite.xml

(...)

Time: 12.45 minutes, Memory: 1301.75Mb

There were 35 failures:

1) ApiFormatPhpTest::testGeneralEncoding with data set #7 (array(1.0E+42), 
'a:1:{i:0;d:1.0E+42;}', array(1))
Failed asserting that two strings are identical.
--- Expected
+++ Actual
@@ @@
-a:1:{i:0;d:1.0E+42;}
+a:1:{i:0;d:144885712678075916785549312;}

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/api/format/ApiFormatTestBase.php:61
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

2) ApiFormatPhpTest::testGeneralEncoding with data set #30 (array(1.0E+42), 
'a:1:{i:0;d:1.0E+42;}', array(2))
Failed asserting that two strings are identical.
--- Expected
+++ Actual
@@ @@
-a:1:{i:0;d:1.0E+42;}
+a:1:{i:0;d:144885712678075916785549312;}

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/api/format/ApiFormatTestBase.php:61
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

3) JavaScriptContentTest::testUpdateRedirect with data set #1 ('/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=MediaWiki:MonoBook.js\\u0026action=raw\\u0026ctype=text/javascript");',
 '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=TestUpdateRedirect_target\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that two strings are equal.
--- Expected
+++ Actual
@@ @@
-'/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=TestUpdateRedirect_target\u0026action=raw\u0026ctype=text/javascript");'
+'/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=MediaWiki:MonoBook.js\u0026action=raw\u0026ctype=text/javascript");'

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:268
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

4) JavaScriptContentTest::testGetRedirectTarget with data set #0 
('MediaWiki:MonoBook.js', '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=MediaWiki:MonoBook.js\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that null matches expected 'MediaWiki:MonoBook.js'.

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:324
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

5) JavaScriptContentTest::testGetRedirectTarget with data set #1 
('User:FooBar/common.js', '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=User:FooBar/common.js\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that null matches expected 'User:FooBar/common.js'.

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:324
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

6) JavaScriptContentTest::testGetRedirectTarget with data set #2 
('Gadget:FooBaz.js', '/* #REDIRECT 
*/mw.loader.load("//example.org/w/index.php?title=Gadget:FooBaz.js\\u0026action=raw\\u0026ctype=text/javascript");')
Failed asserting that null matches expected 'Gadget:FooBaz.js'.

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/content/JavaScriptContentTest.php:324
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

7) BitmapMetadataHandlerTest::testMultilingualCascade
'right(iptc)' does not match expected type "array".

/usr/home/saper/public_html/ybabel/tests/phpunit/includes/media/BitmapMetadataHandlerTest.php:43
/usr/home/saper/public_html/ybabel/tests/phpunit/MediaWikiTestCase.php:137

8) BitmapMetadataHandlerTest::testPNGXMP
Failed asserting that two arrays are equal.
--- Expected
+++ Actual
@@ @@
 Array (
 'frameCount' => 0
 'loopCount' => 1
-

[Wikitech-l] Working around composer? (Fatal error: Class 'Cdb\Reader' not found)

2015-01-13 Thread Marcin Cieslak
I am kind of late to the party, but I have upgraded one of
my throwaway development wikis with the usual
git remote update / git merge / php maintenance/update.php process,
and after the above succeeded I was nevertheless greeted by:

Fatal error:  Class 'Cdb\Reader' not found 

exception coming out of includes/cache/LocalisationCache.php on line 1263

It seems that I just forgot to update the vendor directory
(I am somewhat reluctant to run composer due to the
allow_url_fopen=1 requirement).

Would it be reasonable to add some basic external library
checks to update.php to remind users to update those core
components prior to accessing the wiki?

Btw. I think UPGRADE doc does not (yet) mention the new process.

//Marcin


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Abandoning -1 code reviews automatically?

2014-04-13 Thread Marcin Cieslak
Tim Landscheidt <t...@tim-landscheidt.de> wrote:

> I'm all for abandoning changes when the author doesn't react
> and the patch doesn't apply anymore (not in a technical
> sense, but the patch's concept cannot be rebased to the
> current HEAD).  But forcing work on many just so that a metric
> can be easier calculated by one is putting the burden on the
> wrong side.

As somebody who contributes something in development
surges (like a week per three months or so), I think
that cleaning up statistics to make our code review
process look nicer is not the way to go.


What about automatically accepting the change
that has not been reviewed by anybody for two weeks
or so instead?


I agree that -1 is practically a death penalty for a change.
But that's not a positive development, because
even a mild -1 completely discourages anybody from posting
a positive review (I wonder how many +1 or neutral
comments were posted *after* some of the WMF reviewers
posted a -1).

Some examples from my own dashboard:

1) https://gerrit.wikimedia.org/r/#/c/99068/

practically dead although I completely disagree
with the -1 reviewer, as reflected in the comment afterwards.

2) https://gerrit.wikimedia.org/r/#/c/11562/

My favourite -1 here is "needs rebase".

In general our review process somewhat discourages
incremental updating of patches (do we know
how many non-original-submitters posted follow-up patchsets,
not comments?).

This kind of review discourages refactoring or other
non-trivial changes. See "seems too complex" in example
#2 above.

Regarding Openstack policies: I'd say we should not follow them.

I used to be the #2 git-review contributor according to Launchpad
until recently.
to propose some larger change to this relatively simple
script. For a nice example of this, please see

https://review.openstack.org/#/c/5720/

I gave up contributing to this project some time
after this; I have no time to play politics to submit
a set of tiny changes and play the rebase game depending
on the random order in which they might get reviewed.

The next time I find time to improve the "Johnny the casual
developer" experience with gerrit I will just rewrite
git-review from scratch. The amount of red tape
openstack-infra has built around their projects is
simply not justifiable for such a simple utility
as git-review. Time will tell if gerrit-based
projects generally fare better than others.


//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mailing list etiquette and trolling [Bugzilla etiquette]

2013-12-22 Thread Marcin Cieslak
Chad <innocentkil...@gmail.com> wrote:
> On Wed, Dec 11, 2013 at 3:50 PM, Isarra Yos <zhoris...@gmail.com> wrote:
>
>> On 11/12/13 23:28, Petr Bena wrote:
>>
>>> I think we need to use less rules and more common sense.
>>
>> This.
>
> Rules are silly. Common sense for all :)

Yeah, and at this very moment we are getting a Bugzilla etiquette[1]
instead of improving a plain-text explanation to bug
submitters of how our process works.


//Saper

[1] https://www.mediawiki.org/wiki/Bug_management/Bugzilla_etiquette



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Uninstalling hooks for tests?

2013-12-05 Thread Marcin Cieslak
I am not very happy about this, but we came to a case
where it might be useful to explicitly uninstall some
hook(s) for our unit tests.
You might want to checkout MediaWikiTestCase::uninstallHook

https://gerrit.wikimedia.org/r/#/c/99349/

I am not happy about blurring differences between unit
and integration testing, but breaking core with extensions
and vice versa is sometimes useful.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Pywikipediabot] Using the content of a file as input for articles

2013-12-01 Thread Marcin Cieslak
Mathieu Stumpf <psychosl...@culture-libre.org> wrote:
> Hello,
>
> I want to add esperanto words to fr.wiktionary using as input a file
> where each line have the format word:the fine definition. So I copied
> the basic.py, and started hacking it to achieve my goal.
>
> Now, it's seems like the -file argument expect a file where each line is
> formated as [[Article name]]. Of course I can just create a second
> input file, and read both in parallel, so I feed the genFactory with the
> further, and use the second to build the wiktionary entry. But maybe you
> could give me a hint on how can I write a generator that can feed a
> pagegenerators.GeneratorFactory() without creating a miror file and
> without loading the whole file in the main memory.

I think that the secret sauce to making a working generator is
Python's yield keyword. I will try to provide a working example later.
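
As a minimal sketch of the idea (hypothetical names), a generator built around yield reads lazily, one line at a time:

```python
import io

def read_entries(lines):
    # Lazily yield (word, definition) pairs; nothing is read ahead,
    # so arbitrarily large inputs use constant memory.
    for line in lines:
        word, _, definition = line.rstrip("\n").partition(":")
        yield word, definition

# Works with any iterable of lines: an open file, a list, a StringIO...
entries = read_entries(io.StringIO("kato:cat\nhundo:dog\n"))
print(next(entries))  # ('kato', 'cat')
```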

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Pywikipediabot] Using the content of a file as input for articles

2013-12-01 Thread Marcin Cieslak
Mathieu Stumpf <psychosl...@culture-libre.org> wrote:
> Hello,
>
> I want to add esperanto words to fr.wiktionary using as input a file
> where each line have the format word:the fine definition. So I copied
> the basic.py, and started hacking it to achieve my goal.
>
> Now, it's seems like the -file argument expect a file where each line is
> formated as [[Article name]]. Of course I can just create a second
> input file, and read both in parallel, so I feed the genFactory with the
> further, and use the second to build the wiktionary entry. But maybe you
> could give me a hint on how can I write a generator that can feed a
> pagegenerators.GeneratorFactory() without creating a miror file and
> without loading the whole file in the main memory.

All pagegenerators return only a series of Page objects and nothing else;
they are useful to create just a list of pages to work on.

I wrote a very simple mini-bot using a different kind of generator
that feeds the bot with both pagename and the content.

You can download the code from Gerrit:

https://gerrit.wikimedia.org/r/98457

You should run it like this:

python onelinecontent.py -simulate -contentfile:somecontent

where somecontent contains:

A:Test one line
B:Second line
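
The shape of that generator (a sketch of the idea only, not the actual code in Gerrit change 98457) is roughly:

```python
import os
import tempfile

def content_file_generator(path):
    # Unlike the standard pagegenerators, which yield bare Page objects,
    # this yields (title, text) pairs, read lazily line by line.
    with open(path, encoding="utf-8") as f:
        for line in f:
            title, _, text = line.rstrip("\n").partition(":")
            if title:
                yield title, text

# Simulated run over the sample input shown above.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False,
                                 encoding="utf-8") as tmp:
    tmp.write("A:Test one line\nB:Second line\n")

for title, text in content_file_generator(tmp.name):
    print("would save [[%s]] with: %s" % (title, text))
os.remove(tmp.name)
```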

Hope that provides some starting point for you,

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Operations buy in on Architecture of mwlib Replacement

2013-11-13 Thread Marcin Cieslak
 Matthew Walker mwal...@wikimedia.org wrote:
 [1 ]https://www.mediawiki.org/wiki/PDF_rendering/Architecture

I think requirement number one is that Jimmy the casual MediaWiki
user would be able to install his own renderer without replicating
WMF infrastructure:

https://www.mediawiki.org/wiki/Talk:PDF_rendering/Architecture#Simple_set_up_for_casual_MediaWiki_users_35545

//Saper



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New Bugzilla users have restricted accounts

2013-11-13 Thread Marcin Cieslak
Andre Klapper <aklap...@wikimedia.org> wrote:
> I don't know your specific usecase - maybe the shared saved search named
> My CC'd Bugs might work (or not) which you could enable on
> https://bugzilla.wikimedia.org/userprefs.cgi?tab=saved-searches (see
> http://blogs.gnome.org/aklapper/2013/07/12/bugzillatips-saved-searches/
> for general info on saved searches and sharing them with other users).

I've been using an i-am-on-cc (now shared) filter similar to this one
with great success to find stuff I am working on/interested in.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New Bugzilla users have restricted accounts

2013-11-07 Thread Marcin Cieslak
> Let's look at the github model -- there's no assignment at all.  I just
> file a bug, maybe make some comments on it to say I'm working on it, and
> some time later I submit a pull request referencing the bug and saying, I
> fixed it.  That seems to work fine for collaboration, and offers no
> roadblocks.

GitHub issues are owned by whoever submitted them (and the project
owner). You can't, for example, convert an issue to a pull request
if you are a third party.

But you can always reference an issue in a commit or a comment though.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Subclassing User?

2013-10-28 Thread Marcin Cieslak
Brion Vibber <bvib...@wikimedia.org> wrote:
> I think a better way to go is to add a hook point in
> RecentChange::checkIPaddress()... I don't like mixing more session-related
> stuff into User like a getUserIP method.

Brion and everyone,

thanks for your time and your insight. I will try to hold my object oriented
horses next time :)

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Subclassing User?

2013-10-27 Thread Marcin Cieslak
Hi,

given that there are some extensions which perform edits/actions
automatically (not directly as a result of a user request),

I was wondering: has anyone attempted, or succeeded at, subclassing
User?

There are some places where name of this class is hardcoded.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Subclassing User?

2013-10-27 Thread Marcin Cieslak
Brion Vibber <bvib...@wikimedia.org> wrote:
> Generally I would not recommend subclassing User; while you can certainly
> create such a subclass it will have limited utility as you can't really
> control how they get created easily.
>
> Like the rest of MediaWiki, the User class is intended to be customized
> through extension hooks... What sort of behavior customization are you
> thinking of doing?

Some example:

https://gerrit.wikimedia.org/r/#/c/92252/

needs https://gerrit.wikimedia.org/r/#/c/92179/ in core,
that gives some method to override.

Surprisingly, it even works (rc_ip will be set to "" on
AbuseFilter blocks).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Subclassing User?

2013-10-27 Thread Marcin Cieslak
Daniel Friesen <dan...@nadir-seen-fire.com> wrote:
> On 2013-10-27 2:45 PM, Marcin Cieslak wrote:
>> Some example:
>>
>> https://gerrit.wikimedia.org/r/#/c/92252/
>>
>> needs https://gerrit.wikimedia.org/r/#/c/92179/ in core,
>> that gives some method to override.
>>
>> Surprisinly, it even works (rc_ip will be set to  on
>> AbuseFilter blocks).
>>
>> //Saper
>
> Could you explain why a whole subclass of user is needed. From what I'm
> seeing there's little need for an actual class. And a whole lot of what
> looks methods copied from core and then perhaps only slightly modified
> (ie: non-DRY).

newFrom* and friends needed to be copied over because they create
instances of User and not of the derivative class (to fix this,
a factory method would be needed to replace new User in those
methods).

The only (experimental) reason for now is the little getUserIP.
This is just a proof of concept, to see if such subclassing has
any chance of working.
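
The root problem - static constructors hard-coding the class they instantiate - is easy to see in Python (the PHP analogue of the fix would be new static() / late static binding; the names here are illustrative only):

```python
class User:
    def __init__(self, name):
        self.name = name

    @staticmethod
    def new_from_name_hardcoded(name):
        # Like User::newFrom*() doing "new User(...)": always builds the
        # base class, even when called through a subclass.
        return User(name)

    @classmethod
    def new_from_name(cls, name):
        # Factory method: cls is whatever class the caller used, so a
        # subclass gets instances of itself without copying the method.
        return cls(name)

class SubUser(User):
    def get_user_ip(self):
        return ""

print(type(SubUser.new_from_name_hardcoded("x")).__name__)  # User
print(type(SubUser.new_from_name("x")).__name__)            # SubUser
```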

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [IRC] hiring more wm-bot operators

2013-10-11 Thread Marcin Cieslak
Petr Bena <benap...@gmail.com> wrote:
> Hey all,
>
> Some of you may know our belowed robot, which is working as a slave in
> some of our dev channels. Unfortunately, freenode as well as wikimedia
> labs is a bit unstable, when it comes to network connectivity. So both
> freenode servers as well as internet connectivity of labs are
> occasionally down. This is causing some troubles to wm-bot who isn't
> really able to reconnect, given to laziness of its developers as well
> as complexity of multiple bouncers it is using.

Petr,

I was running a couple of recentchanges minibots (based on the
UDP logging - urdrec.c - Python IRC module) pretty reliably.

The code is here https://bitbucket.org/plwiki/bot/src/ (irc module)
but of course I am happy to help with hosting/reducing complexity
and getting my hands finally on C# if needed.

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Exceptions, return false/null, and other error handling possibilities.

2013-10-09 Thread Marcin Cieslak
Erik Bernhardson <ebernhard...@wikimedia.org> wrote:
> Moving forward, the Flow team is considering using a php implementation
> that follows the ideas of the haskell Maybe monad(
> https://github.com/schmittjoh/php-option ).  This is, in concept, rather
> similar to the Status class the wikitext parser returns. We would like to
> use this library as a way to more explicitly handle error situations and
> reduce the occurrences of forgetting to check false/null.  This particular
> pattern is very common in Functional languages.

I don't think exceptions are evil; they are more structured gotos, and
goto can be used properly to handle errors.

The Status class has a similar problem to unchecked exceptions - you never
know which exception might come, and similarly, you never know
what kind of Status you might receive from the code called.

However, exceptions can form a hierarchy, which allows the
caller to react selectively to the class of problems we have.

I was recently struggling with this piece of MediaWiki
code, which interprets the values of internalAttemptSave:

(from EditPage.php)

1195  // FIXME: once the interface for internalAttemptSave() is made nicer,
this should use the message in $status
1196  if ( $status->value == self::AS_SUCCESS_UPDATE || $status->value ==
self::AS_SUCCESS_NEW_ARTICLE ) {
1197  $this->didSave = true;
1198  if ( !$resultDetails['nullEdit'] ) {
1199  $this->setPostEditCookie();
1200  }
1201  }
1202 
1203  switch ( $status->value ) {
1204  case self::AS_HOOK_ERROR_EXPECTED:
1205  case self::AS_CONTENT_TOO_BIG:
1206  case self::AS_ARTICLE_WAS_DELETED:
1207  case self::AS_CONFLICT_DETECTED:
1208  case self::AS_SUMMARY_NEEDED:
1209  case self::AS_TEXTBOX_EMPTY:
1210  case self::AS_MAX_ARTICLE_SIZE_EXCEEDED:
1211  case self::AS_END:
1212  return true;
1213 
1214  case self::AS_HOOK_ERROR:
1215  return false;
1216 
1217  case self::AS_PARSE_ERROR:
1218  $wgOut->addWikiText( '<div class="error">' .
$status->getWikiText() . '</div>' );
1219  return true;
1220 
1221  case self::AS_SUCCESS_NEW_ARTICLE:
1222  $query = $resultDetails['redirect'] ? 'redirect=no' : '';
1223  $anchor = isset( $resultDetails['sectionanchor'] ) ?
$resultDetails['sectionanchor'] : '';
1224  $wgOut->redirect( $this->mTitle->getFullURL( $query ) .
$anchor );
1225  return false;
1226 

This code is more or less replicated in ApiEditPage.php, but in another way.

I wanted to re-use this logic in the Collection extension
(which sometimes creates pages in bulk on behalf
of the user) and I really wished error reporting
were done with exceptions. At the top level, they
could be handled the EditPage.php way, the API would
return exception objects instead, and other
extensions could selectively handle some values
and ignore others - for example, I would be happy
to get the standard EditPage.php behaviour
for most of the errors I am not particularly
interested in.
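
Sketched in Python (hypothetical names, since the point is language-independent): with an exception hierarchy, a caller handles only the subtree it cares about and lets everything else bubble up to the default handler:

```python
class EditError(Exception):
    """Base of the family - callers can catch all save errors at once."""

class ParseError(EditError):
    pass

class ConflictDetected(EditError):
    pass

def internal_attempt_save(text):
    # Stand-in for EditPage::internalAttemptSave(): raise instead of
    # returning an AS_* status code.
    if text == "conflict":
        raise ConflictDetected("edit conflict")
    if text.startswith("{{bad"):
        raise ParseError("cannot parse")
    return True

# A bulk-page-creating extension handles only conflicts specially and
# falls back to generic handling for every other EditError subclass.
def save_with_policy(text):
    try:
        return internal_attempt_save(text)
    except ConflictDetected:
        return "retry later"
    except EditError:
        return "show standard error page"

print(save_with_policy("hello"))     # True
print(save_with_policy("conflict"))  # retry later
```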

Regarding the use of php-option:

php-option seems to be a handy way to provide substitute
objects in case of errors; I think this case
does not come up very often and is arguably better
handled by the use of factory methods, i.e.
a method is responsible for delivering a foreign
instance of some class; factory methods can
be defined once for a particular orElse/getOrElse
situation; it is also easier to make sure
that particular instances will deliver
the same (or a similar) interface.
And factory methods can be overridden
if some particular implementation
needs different creation of fallback objects.

Instead of having:

  return $this->findSomeEntity()->getOrElse(new Entity());

One could have:

  interface StuffNeededFromEntity {
/* what Entity() really provides */
  }
  
  class Entity implements StuffNeededFromEntity {
  }

  class FallbackEntity implements StuffNeededFromEntity {
  }
  

  /*...*/

  /* In the code trying to findSomeEntity */

  /* @return StuffNeededFromEntity */
  protected function entityFactory() {
      $entity = $this->findSomeEntity();
      if (null === $entity) {
         return new FallbackEntity();
      }
      return $entity;
  }

  /* replace return $this->findSomeEntity()->getOrElse(new Entity()); */
  return $this->entityFactory();

[The use of interface/implements above is for clarity only,
 one does not actually need to use those features]

I agree with the php-option author that if ($x===null) boilerplate
should be avoided, but in my opinion the solution is
to do the logic once and abstract it properly for re-use,
not to hide it in some syntactic sugar instead.
Imagine what happens if we need some constructor
parameters for new Entity() - instead of updating
entityFactory once, one needs to go through all
those orElse cases again.
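
The same contrast in executable form (Python for brevity; the class names mirror the PHP sketch above): the None-check lives once in the factory, so changing how the fallback is built touches one method instead of every getOrElse call site:

```python
class Entity:
    def describe(self):
        return "real entity"

class FallbackEntity(Entity):
    def describe(self):
        return "fallback entity"

class Repository:
    def __init__(self, found):
        self._found = found

    def find_some_entity(self):
        # May fail; returning None is the case php-option wraps.
        return Entity() if self._found else None

    def entity_factory(self):
        # The single place that knows how to build the fallback;
        # subclasses may override it to build a different one.
        entity = self.find_some_entity()
        return FallbackEntity() if entity is None else entity

print(Repository(True).entity_factory().describe())   # real entity
print(Repository(False).entity_factory().describe())  # fallback entity
```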

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org

Re: [Wikitech-l] Identifying pages that are slow to render

2013-03-08 Thread Marcin Cieslak
Antoine Musso <hashar+...@free.fr> wrote:
> On 06/03/13 22:05, Robert Rohde wrote:
>> On enwiki we've already made Lua conversions with most of the string
>> templates, several formatting templates (e.g. {{rnd}}, {{precision}}),
>> {{coord}}, and a number of others.  And there is work underway on a
>> number of the more complex overhauls (e.g. {{cite}}, {{convert}}).
>> However, it would be nice to identify problematic templates that may
>> be less obvious.
>
> You can get in touch with Brad Jorsch and Tim Starling. They most
> probably have a list of templates that should quickly converted to LUA
> modules.
>
> If we got {{cite}} out, that will be already a nice improvement :-]

Not really, given https://bugzilla.wikimedia.org/show_bug.cgi?id=45861

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How do MS SQL users install MediaWiki?

2013-03-07 Thread Marcin Cieslak
Mark A. Hershberger <m...@everybody.org> wrote:
> On 03/04/2013 01:34 AM, Chad wrote:
>> However, we do
>> have people who want/use MSSQL, so I think taking the effort to
>> keep it working is worthwhile--if someone's willing to commit.
>
> Since Danny Bauch has been using MSSQL and modifying MW for his needs,
> I'll work with him to get the necessary changes committed.
>
> Danny, if you could commit your changes into Gerrit, I'd be happy to
> test them.

I'll be happy to come back to my PostgreSQL work and I'd be happy to
talk to other RDBMS people to coordinate some stuff (like getting
unit tests to work or getting some abstractions right - transactions,
schema management, etc.).

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Research on newcomer experience - do we want to take part?

2012-11-14 Thread Marcin Cieslak

Hello,

Kevin Carillo[1] from the University of Wellington is going to research
"Newcomer experience and contributor behavior in FOSS communities"[2].
So far Debian, GNOME, Gentoo, KDE, Mozilla, Ubuntu, NetBSD and OpenSUSE
will be taken into account; FreeBSD recently joined[3], and
there is still some possibility for other large FOSS projects to join.

I think it could fit nicely into our recent efforts directed
at newcomer experience after the Git migration. And MediaWiki is
a bit different from the above projects.

Are we interested
in including MediaWiki in that research?

As Kevin explains in his post, he tried to avoid spamming mailing
lists to look for interested projects, so I am doing this for him :-)
//Saper

[1] http://kevincarillo.org/about/, http://twitter.com/kevouze
[2] http://kevincarillo.org/survey-invitation/
[3] http://kevincarillo.org/2012/11/15/welcome-freebsd/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] IPv6 routing problem?

2012-10-15 Thread Marcin Cieslak
Faidon Liambotis <fai...@wikimedia.org> wrote:
> Hi,
>
> Thanks for forwarding the report. I've chatted with the user via IRC on
> Sunday and subsequently via e-mail, so we're on it. For what it's worth,
> the underlying issue is still there, although restoring European traffic
> via the esams (Amsterdam) cluster has significantly reduced the impact.

Could 
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/64555 be 
the same problem?

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-11 Thread Marcin Cieslak
Marcin Cieslak <sa...@saper.info> wrote:
> Daniel Kinzler <dan...@brightbyte.de> wrote:
>> I'll try to look into this today, but I need to find help from someone
>> knowledgable about postrtges (and especially about the postgres
>> updater. it's... different).

Thank you everyone - the working fix is now in Gerrit 
https://gerrit.wikimedia.org/r/#/c/27413/

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
Hello,

I updated one of my wikis today from f2138b1 to 
9299bab032a85c1a421436da04a595b79f2b9d6c (git master as I write this) and after 
running update.php
I get this:

A database error has occurred. Did you forget to run maintenance/update.php 
after upgrading? See: 
https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
Query: SELECT page_id,page_len,page_is_redirect,page_latest,page_content_model 
FROM page WHERE page_namespace = '0' AND page_title = 'Test11' LIMIT 1 
Function: LinkCache::addLinkObj
Error: 42703 ERROR: column page_content_model does not exist
LINE 1: ...*/ page_id,page_len,page_is_redirect,page_latest,page_conte...
^

Backtrace:

#0 /usr/home/saper/public_html/pg/w/includes/db/DatabasePostgres.php(477):
DatabaseBase->reportQueryError('ERROR: column ...', '42703', 'SELECT
page_id...', 'LinkCache::addL...', false)
#1 /usr/home/saper/public_html/pg/w/includes/db/Database.php(942):
DatabasePostgres->reportQueryError('ERROR: column ...', '42703', 'SELECT
page_id...', 'LinkCache::addL...', false)
#2 /usr/home/saper/public_html/pg/w/includes/db/Database.php(1367):
DatabaseBase->query('SELECT page_id...', 'LinkCache::addL...')
#3 /usr/home/saper/public_html/pg/w/includes/db/Database.php(1458):
DatabaseBase->select('page', Array, Array, 'LinkCache::addL...', Array, Array)
#4 /usr/home/saper/public_html/pg/w/includes/cache/LinkCache.php(222):
DatabaseBase->selectRow('page', Array, Array, 'LinkCache::addL...', Array)
#5 /usr/home/saper/public_html/pg/w/includes/Title.php(2895):
LinkCache->addLinkObj(Object(Title))
#6 /usr/home/saper/public_html/pg/w/includes/Title.php(4320):
Title->getArticleID()
#7 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(416): Title->exists()
#8 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(465):
WikiPage->exists()
#9 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(204):
WikiPage->getContentModel()
#10 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(190):
WikiPage->getContentHandler()
#11 /usr/home/saper/public_html/pg/w/includes/Action.php(92):
WikiPage->getActionOverrides()
#12 /usr/home/saper/public_html/pg/w/includes/Action.php(139):
Action::factory('view', Object(WikiPage))
#13 /usr/home/saper/public_html/pg/w/includes/Wiki.php(144):
Action::getActionName(Object(RequestContext))
#14 /usr/home/saper/public_html/pg/w/includes/Wiki.php(528):
MediaWiki->getAction()
#15 /usr/home/saper/public_html/pg/w/includes/Wiki.php(447): MediaWiki->main()
#16 /usr/home/saper/public_html/pg/w/index.php(59): MediaWiki->run()
#17 {main}

Looks like LinkCache.

Can this be quickly fixed or do we need to revert this?

//Saper




Re: [Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
 Platonides platoni...@gmail.com wrote:
 On 10/10/12 09:02, Marcin Cieslak wrote:
 Hello,
 
 I updated one of my wikis today from f2138b1 to 
 9299bab032a85c1a421436da04a595b79f2b9d6c (git master as I write this) and 
 after running update.php
 I get this:
 
 A database error has occurred. Did you forget to run maintenance/update.php 
 after upgrading? See: 
 https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
 Query: SELECT 
 page_id,page_len,page_is_redirect,page_latest,page_content_model FROM page 
 WHERE page_namespace = '0' AND page_title = 'Test11' LIMIT 1 
 Function: LinkCache::addLinkObj
 Error: 42703 ERROR: column page_content_model does not exist
 LINE 1: ...*/ page_id,page_len,page_is_redirect,page_latest,page_conte...
 ^
 (...)
 Look like LinkCache.
 
 Can this be quickly fixed or do we need to revert this?
 
 //Saper

 Seems the files of maintenance/archives/patch-*content* could need to be
 copied to maintenance/postgres/archives

 It works for me on mysql, but it is inconsistent in that the db still
 uses a varbinary but the description uses an integer.

Yes, the updater needs to be fixed (it is normally enough
to update includes/installer/DatabaseUpdater.php for smaller changes).

I remember mentioning it in code review somewhere, finding the right
data type was an issue to be addressed by developers...

//Saper




Re: [Wikitech-l] master broken on PostgreSQL - ContentHandler?

2012-10-10 Thread Marcin Cieslak
 Daniel Kinzler dan...@brightbyte.de wrote:
 I'll try to look into this today, but I need to find help from someone
 knowledgeable about Postgres (and especially about the Postgres
 updater. It's... different).

Sure, feel free to ask, I will be travelling starting tomorrow
but today we can try to fix it. To add simple fields you don't
usually need to create a patch file, a simple
array ('addPgField', ... ) should do.
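For readers unfamiliar with the updater: an addPgField entry essentially boils down to an idempotent "add the column if it is missing" step, safe to re-run. A minimal sketch of that idea (Python with sqlite3 standing in for PostgreSQL so the example is self-contained; add_field is an illustrative name, not MediaWiki's API):

```python
import sqlite3

# Rough sketch of what an updater entry like
#   array( 'addPgField', 'page', 'page_content_model', 'TEXT' )
# amounts to: check whether the column already exists and, if not,
# ALTER TABLE to add it.  Running it twice must be a no-op.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE page (page_id INTEGER PRIMARY KEY, page_title TEXT)")

def add_field(table, field, field_type):
    # sqlite exposes the schema via PRAGMA table_info; the real updater
    # would query the PostgreSQL system catalogs instead.
    cols = [row[1] for row in cur.execute("PRAGMA table_info(%s)" % table)]
    if field not in cols:
        cur.execute("ALTER TABLE %s ADD COLUMN %s %s"
                    % (table, field, field_type))

add_field("page", "page_content_model", "TEXT")
add_field("page", "page_content_model", "TEXT")  # second run is a no-op
```

The real updater checks the PostgreSQL catalogs rather than PRAGMA table_info, but the shape of the operation is the same.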

Thanks!

//Saper




Re: [Wikitech-l] Github replication

2012-10-04 Thread Marcin Cieslak
 Chad innocentkil...@gmail.com wrote:
 Yeah, that sounds sane. Anyone who wants to volunteer to keep an eye
 on Github and make sure patches get into Gerrit, let me know and I'll add
 you to the group on Github.

+1

I'm github.com/saper




Re: [Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-10-04 Thread Marcin Cieslak
 Marcin Cieslak sa...@saper.info wrote:
 More information on the setup:

  https://www.mediawiki.org/wiki/Git/Workshop

 I am already available on SIP as well as on IRC
 (#git-gerrit on Freenode) if you would like
 to test your setup.

Thank you everyone for joining, it was fun 
although it took a bit longer than expected.

We've had 8 people on the teleconference, most
of them managed to join the hands-on part
of the tutorial.

Please feel free to send me (or Sumanah)
your feedback.

I think we should restructure this into
a series of smaller trainings, separating
git basic/advanced and gerrit basic/advanced
stuff.

//Marcin





Re: [Wikitech-l] Welcome Željko Filipin, QA Engineer

2012-10-04 Thread Marcin Cieslak
 Zeljko Filipin zfili...@wikimedia.org wrote:
 On Tue, Oct 2, 2012 at 9:21 PM, Antoine Musso hashar+...@free.fr wrote:
 I am in CET timezone myself and
 working on continuous integration.  Ping hashar on freenode :-]

 Will do. I will probably need help with Jenkins.

Welcome :) Now I know why you joined us yesterday on the Git+Gerrit
session!

//Marcin




[Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-10-03 Thread Marcin Cieslak
Hello, 

Our scheduled Git+Gerrit session starts in about 40 minutes.

Everything will happen via SIP audioconference and SSH connection.

Please make sure your SIP and SSH clients work!

More information on the setup:

 https://www.mediawiki.org/wiki/Git/Workshop

I am already available on SIP as well as on IRC
(#git-gerrit on Freenode) if you would like
to test your setup.

See you soon!

Marcin Cieślak
(saper)



Re: [Wikitech-l] Can we kill DBO_TRX? It seems evil!

2012-09-28 Thread Marcin Cieslak
 Brad Jorsch b-jor...@alum.northwestern.edu wrote:
 On Thu, Sep 27, 2012 at 08:40:13PM +, Marcin Cieslak wrote:
 
 From the PostgreSQL side I see one problem with nesting - we are already
 using savepoints to emulate MySQL's INSERT IGNORE and friends.
 It might be difficult to abuse that feature for something more than this.
 There is a class SavepointPostgres which is used for that.

 As long as the savepoints are properly nested and multiple levels of
 nesting don't try to reuse the same name, things should be fine. And
 since this use is just SAVEPOINT, INSERT, RELEASE SAVEPOINT,
 there's no opportunity for things to not be properly nested, and
 avoiding name collision would not be hard.

All is fine as long as something like SqlBagOStuff or the
LocalisationCache or something else working in parallel does not do
something to your so-called transaction
(https://bugzilla.wikimedia.org/show_bug.cgi?id=35357 or
https://bugzilla.wikimedia.org/show_bug.cgi?id=27283).

//Marcin




Re: [Wikitech-l] Can we kill DBO_TRX? It seems evil!

2012-09-27 Thread Marcin Cieslak
 Daniel Kinzler dan...@brightbyte.de wrote:
 So, my current proposal is a more expressive high-level api for transaction
 control consisting of start/finish/flush (and perhaps abort) on top of the low
 level interface consisting of begin/commit/rollback. Documentation needs to be
 very clear on how they behave and how they relate to each other.

I worked on an application (in Zope) where correctness was more important
than locks, and it was running on PostgreSQL, so we never had this problem.
Zope collects all transactions from different sources (multiple database
connections, for example) and handles them transparently (like automatic
rollback on error/exception in the code). In the MediaWiki context that would
be equivalent to keeping transactions controlled at the WebRequest
level. I know too little about InnoDB transactions to comment; as I
understand it, MySQL is very different.

 For the short term, I suggest to suppress warnings about nested transactions
 under some conditions, see my previous response to aaron.

In includes/db/DatabasePostgres.php there is a PostgresTransactionState
monitor, which is very nice for debugging all problems with implicit/explicit
transactions. It can easily be abstracted (to Database.php or somewhere
else) and maybe there are functions to monitor InnoDB transaction
status as well.

From the PostgreSQL side I see one problem with nesting - we are already
using savepoints to emulate MySQL's INSERT IGNORE and friends.
It might be difficult to abuse that feature for anything more than this.
There is a class SavepointPostgres which is used for that.
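The SAVEPOINT / INSERT / RELEASE sequence used for that emulation can be sketched as follows (a self-contained demo using Python's sqlite3, which accepts the same savepoint statements; insert_ignore is an illustrative name, and the real code of course talks to PostgreSQL through the Database layer):

```python
import sqlite3

# Emulating "INSERT IGNORE" with savepoints: wrap each insert in a
# savepoint, and on a duplicate-key error roll back only that insert
# instead of aborting the whole enclosing transaction.
conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions by hand
cur = conn.cursor()
cur.execute("CREATE TABLE page (title TEXT PRIMARY KEY)")
cur.execute("BEGIN")

def insert_ignore(title):
    cur.execute("SAVEPOINT mw")                            # SAVEPOINT
    try:
        cur.execute("INSERT INTO page VALUES (?)", (title,))  # INSERT
    except sqlite3.IntegrityError:
        cur.execute("ROLLBACK TO mw")     # duplicate: undo just this insert
    cur.execute("RELEASE SAVEPOINT mw")                    # RELEASE

insert_ignore("Test11")
insert_ignore("Test11")  # duplicate is silently ignored
cur.execute("COMMIT")
```

Note that on failure a ROLLBACK TO is issued before the RELEASE; as long as each savepoint gets a fresh, non-colliding name, the outer transaction is untouched.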

//Marcin




Re: [Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-09-27 Thread Marcin Cieslak
 Sumana Harihareswara suma...@wikimedia.org wrote:
 Git, Gerrit, and You! A Tutorial

 Where:IRC/SIP/SSH

 We want all our developers to feel comfortable with Git, git-review, and
 Gerrit. So saper is leading a hands-on online training:
 https://www.mediawiki.org/wiki/Project:WikiProject_Extensions/MediaWiki_Workshops/Git_/_Gerrit_Tutorial
 . Check [[Git/Workshop]] for testing access to the conference  lab setup.

 Saper will be available for 3 hours, and there'll be a break in the
 middle. Absolute beginners with Git might want to stay for the whole
 three hours; people with some experience won't need as long.

Save the date: 3 October 2012, 17:30 UTC:

http://timeanddate.com/worldclock/fixedtime.html?msg=Git%2BGerrit+with+saper+live&iso=20121003T1730&ah=3

In case you want to participate and haven't signed up yet, please
sign up here:

http://www.doodle.com/zhn7buksgrg8e8rx

Thank you!

//Saper




[Wikitech-l] Bugzilla workflow: keywords

2012-08-27 Thread Marcin Cieslak
Hello,

Recently I noticed that keywords in bugzilla get
updated more and more often, mostly with keywords
like patch, patch-need-review, etc.

I am wondering what to do in the following situations
(like https://bugzilla.wikimedia.org/show_bug.cgi?id=39635
for example):

- user A posts a patch
- the bug gets patch, patch-need-review
- user B posts a patch that is different and says
  he does not like A's patch
- user B submits change to gerrit

When should patch-need-review be removed? What are the replacements,
if any? What if I believe that the core ideas behind the
patch are wrong? What if I just think the implementation
should be improved? What if it's more or less okay?
I see only patch-reviewed in the keywords - which can be
both negative and positive.

Before I open a whole can of worms by asking the question
"how do I relate those keywords to the Gerrit workflow
we have", maybe the current bugmeisters could explain
how they use those keywords and how we can help?

//Saper





Re: [Wikitech-l] Let's talk about arc: On Arcanist docs

2012-08-10 Thread Marcin Cieslak
 Evan Priestley epriest...@phacility.com wrote:
 I sent out a diff to fix the error message 
 (https://secure.phabricator.com/D3231), the new one reads:

 This patch is for the 'phabricator' project, but the working copy does
 not have an '.arcconfig' file to identify which project it belongs to.
 Still try to apply the patch?

 We're trying to catch the case where you're attempting to apply a
 patch to the wrong project -- I'm guessing revision D3 was made in
 a working copy with a .gitignored .arcconfig file that associates
 it with E3Experiments. If you check in the .arcconfig file, arc
 will be able to recognize the working copy's project and will stop
 complaining.

I realize my problems were related to E3Experiments not having
an .arcconfig.

I started suspecting E3Experiments was kind of unconfigured when
arc git-hook-pre-receive hinted at this:

Usage Exception: You have installed a git pre-receive hook in a remote without 
an .arcconfig.)

 What could we improve about the user guide?

First, I found two entry points:

* Arcanist User Guide
  http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide.html

Frankly, I don't remember how I came to this one.

The problem here is that there is no full table of contents of the User Guide
(btw. Defined: src/docs/userguide/arcanist.diviner:1 seems useless to the
casual onlooker).

I could find this:

 
http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_arc_diff.html

But not much more. While writing this email I discovered 

 
http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_Configuring_a_New_Project.html

hidden in the arc_diff guide, which was like tl;dr to me - I didn't know I
needed to learn about the arc diff command to find out about .arcconfig and stuff.

Suggestions for improvement:

1) In the Overview there should be some links to basic installation and 
configuration
(the .arcconfig thing).

2) The "Arcanist allows you to do things like" section should explain more about
the commands - descriptions are too short. There are no links to explanations
of particular commands (certainly arc diff has one).

Coming from gerrit I kind of looked for an equivalent of
git fetch ... refs/changes/CD/ABCD/1 and git push ... refs/for/master.
From the terse description there I could sense that arc diff does
something to push the changes for review and arc patch fetches the
change from the repo (although arc export sounds nice, too).
Unfortunately, arc download/upload do something different :)

* Arcanist Something - Table of Contents 
  http://www.phabricator.com/docs/arcanist/

The good thing is that the Phabricator installation has links to this document
at https://phabricator.wmflabs.org/apps/. This is a big plus.

The Arcanist Something guide is confusing because it contains 95%
developer API documentation. I hoped to find info about .arcconfig
in ArcanistConfiguration or ArcanistSettings but both were
disappointing.

Now I see I should go into ArcanistOverview but somehow I missed that.
It links to Arcanist_User_Guide_Configuring_a_New_Project
which I missed so badly yesterday. 

1) Probably ArcanistOverview should be *the* front page for the documentation
and the User Guide, with a full Table of Contents of all docs. Maybe the
TOC should be on all User Guide pages.

API pages should be clearly marked. Use a different skin if possible
(red instead of blue :) or clearly mark links to API and UserGuide
articles differently (consistent title names? we can't rely on colors or
icons). Javadoc output might be ugly, but at least I know immediately
"uh oh, I ended up in the developer documentation".


There is some problem with
http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_Customizing_Lint,_Unit_Tests_and_Workflows.html
- it sounds like the gentle introduction to the whole API stuff.
Not sure yet how it fits the casual user.

And then, there is

http://www.phabricator.com/docs/phabricator/article/Arcanist_User_Guide_Repository_Hooks.html

deals only with SVN. arc git-hook-pre-receive sounds promising
but I have no idea where to find out more about it.

Unfortunately, Phabricator docs use "workflow" as a slang description
of some piece of code, so I could not find out "how a typical
workflow with arc looks" or "how installing a git hook
changes my workflow".


In general: the docs seem to be aimed either at the advanced person looking
to write workflows or classes for linting/whatever,
or at the user of an already pre-configured repository.
I would review this again from the view of the lone wolf developer
who has some repository (maybe her own) and tries to
set up this thing. I didn't look at the rest of the Phabricator
docs yet but I'd be happy to find guides like
"How do I switch to Phabricator with a github/sourceforge/whatever project".


//Saper



Re: [Wikitech-l] Let's talk about arc: Phabricator replacement for git-review and more

2012-08-10 Thread Marcin Cieslak
 Daniel Friesen li...@nadir-seen-fire.com wrote:

 Why is arc written in PHP? That seems to be a horrible software decision  
 to me. Core extensions not enabled by default can be hard to install on  
 some OS. And imho the packaging setup is not as good. Frankly I gave up  
 trying to get mcrypt installed on either version of php installed on my  
 Mac.

It could be improved to check for curl and bcmath (the ones I found
out are needed) on startup, not during some other command
after another has succeeded (unless of course the extension is needed
only for some specific operation not applicable to the general public).

This one I find interesting:


 arc looks as if it works completely with patches on its own rather than
 doing anything with git.

 I can't see how phabricator can have commit workflow support any better  
 than gerrit when it appears to take the repo completely out of the  
 question.

Erik also wrote this earlier:

 As I understood it, the big gotchas for Phabricator adoption are that
 Phabricator doesn't manage repositories - it knows how to poll a Git
 repo, but it doesn't have per-repo access controls or even more than a
 shallow awareness of what a repository is; it literally shells out to
 git to perform its operations, e.g. poll for changes - and would still
 need some work to efficiently deal with hundreds of repositories,
 long-lived remote branches, and some of the other fun characteristics
 of Wikimedia's repos. Full repo management is on the roadmap, without
 an exact date, and Evan is very open to making tweaks and changes
 as needed, especially if it serves a potential flagship user like
 Wikimedia.

After Gerrit, I think it might actually be a GoodThing(tm)
to detach the code review tool from managing the repository.

Git at its core is a tool to quickly snapshot directories.
Blobs are its first-class objects, not patches or diffs
(this is, I think, pretty innovative compared to the traditional
version control systems).

I think there is a reason why Linus settled on an email
patch workflow that is even included in the git user
interface.

Keeping patches and commits separate starts making
sense to me - otherwise one ends up in rebase hell.

//Saper




Re: [Wikitech-l] Let's talk about arc: Phabricator replacement for git-review and more

2012-08-10 Thread Marcin Cieslak
 Evan Priestley epriest...@phacility.com wrote:

 On Aug 10, 2012, at 12:52 AM, Marcin Cieslak wrote:

 It could be improved to check for curl and bcmath (the ones I found
 out are needed) on startup, not during some other command
 after other succeded (unless of course the extension is needed
 only for some specific operation not applicable to general public).

 We should be checking for curl on startup (and a few other things
 -- namely JSON and a reasonable PHP version). Was this not your
 experience? 

I already had php-curl installed and I only noticed I needed
it because of the need to configure the list of acceptable CAs; so I *knew*
php-curl was needed.

PHP without bcmath failed on me badly with PHP fatal error.

//Marcin




[Wikitech-l] Let's talk about arc: Phabricator replacement for git-review and more

2012-08-09 Thread Marcin Cieslak
I just wrote a very rough and quick walkthrough of how to get that tool running:

https://www.mediawiki.org/wiki/Arcanist

My first impression is very good. The UI is very nice (it guides you
when it needs to, it just does the job if all is fine).

The user's guide is unfortunately poor.

I don't know yet how to avoid this warning:

This diff is for the 'E3Experiments' project but the working copy belongs to 
the '' project. 

I see that arc can also be installed as a git pre-receive hook,
but it needs some project configuration for that. Interesting.

Anyway, I managed to download one change:

$ git branch -vv
* arcpatch-D3 be7cfd9 Merge branch 'master' of 
ssh://gerrit.wikimedia.org:29418/mediawiki/extensions/E3Experiments into 
munaf/pef2
  master  e40fce0 [origin/master] Rename events.js -> communityClicks.js

//Saper




Re: [Wikitech-l] About outreach and tech events (as suggested by Sumana!)

2012-08-05 Thread Marcin Cieslak
 Julien Dorra juliendo...@juliendorra.com wrote:

 «Testing Wikipedia» could be a nice catchy name for a series for events in
 various cities around TDD, with experienced dev mentoring less experienced
 community members, etc. Even if the experts come and go, everybody learn,
 some test and process get done, and the community grow and learn.

Maybe we should not be afraid and not only offer buzzwords but also point
out real technical issues we are facing here.

I would say that at least some parts of the core code are hardly testable.
The situation improves as we go, but there are still lots of problems
with our almost-object-oriented coding.

On the other hand, the number of integration issues we are facing
(talking to databases, caches etc.) plus the high level of optimization
of code suitable for a huge site does not make TDD
enthusiasts happy; we need multiple levels of testing: unit,
integration, and user interface, with the last two very important.

And there are also infrastructure issues, like

https://bugzilla.wikimedia.org/show_bug.cgi?id=37702
Cloned tables for unitests do not have references and constraints

discovered when trying to write a unit test for the very core
MediaWiki functionality. (The fact we didn't find it earlier
says something about our test coverage.)

So maybe one of the approaches would be to have a mini-bugathon
to review some (or the most typical? most annoying? site-breaking?)
bugs and try to evaluate how a TDD approach could help us
improve.

We might even have a nice cultural clash as a result:)

//Saper



Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-18 Thread Marcin Cieslak
 Rob Lanphier ro...@wikimedia.org wrote:
 It would appear from reading this page that the only alternative to
 Gerrit that has a serious following is GitHub.  Is that the case?
 Personally, it seems like Phabricator or Barkeep has the best chance
 of dislodging Gerrit, but those won't probably get serious
 consideration without a champion pushing them.

Maybe this is because of the discussion format, which was framed
as "for and against Gerrit".

Is it possible to set up a copy of Phabricator in labs? What
is needed to help with this?

//Saper




Re: [Wikitech-l] Serious alternatives to Gerrit

2012-07-18 Thread Marcin Cieslak
 Daniel Friesen li...@nadir-seen-fire.com wrote:
 On Wed, 18 Jul 2012 11:41:06 -0700, Ryan Lane rlan...@gmail.com wrote:

 On Wed, Jul 18, 2012 at 1:18 PM, Petr Bena benap...@gmail.com wrote:
 What about changing gerrit to our needs? It's open source, I suppose.


 That's of course the route we're going right now. The biggest hurdle
 to doing so is that it's written in Java, and Prolog. Neither of those
 languages are amazingly difficult, though, so I don't really see that
 as a complete blocker.

 - Ryan

 The blocker for me was not the language, but the codebase. I wanted to  
 make a small tweak to Gerrit so I started looking through the code. And I  
 had absolutely no clue where to find what I was looking for. I couldn't  
 make sense of what most of what I found was even supposed to do. And  
 people have pointed out a number of issues with Gerrit like the way it  
 handles output and css which feel much more like fundamental (ie:  
 unfixable without practically rewriting) issues with Gerrit.

I got used to it. It's completely different Java if one is used
to old-school Java programming. Components are decoupled with the
use of the Guice framework (for dependency injection -
http://c2.com/cgi/wiki?DependencyInjection),
plus there is Google Web Toolkit programming, again a very
special beast.

Another component is the ORM mapper, gwtorm.

Other than that it's pretty fine, with the usual Java problem
that I need to cut through a gazillion classes and interfaces
before I get to the core of things.

For example, to fix https://bugzilla.wikimedia.org/show_bug.cgi?id=38114
a few lines need to be added before line 296 of

https://gerrit.googlesource.com/gerrit/+/05d942c324d7a17285873a468f3605cbc190b8d5/gerrit-gwtui/src/main/java/com/google/gerrit/client/changes/ChangeTable2.java

(not sure it's a good idea but here it is)

I have attached a Jython interpreter to Gerrit to
play a bit with the code:

https://gerrit-review.googlesource.com/#/c/34670/

You can play live with the ORM mapper for example
and retrieve Java objects from the database (not just
rows).

//Saper




Re: [Wikitech-l] Maintaining (and thus editing) SVGs' text like wiki-text pages

2012-07-15 Thread Marcin Cieslak
 Achim Flammenkamp ac...@uni-bielefeld.de wrote:

 My aim is to have on wikimedia/wikipedia (/wiki-whatever sounds appropriate)
 1) a version-control environment (as we have for article-, talk-, user-,
  category-, template- ... etc (text-)namespaces), because it is IMHO appropriate
  for each (human-readable) textual namespace.
 SVG has a (I guess historical) exception, because it was new and was sorted in
 like (bitmap-) graphics (JPEG/PNG or whatever exists only on wikipedia)
 (badly classified IMHO).

I don't think that the version control we offer with article revisions
is a proper one for any kind of XML, including SVGs. For git fanbois:
yes, git does not solve that, either.

The problem is that you have to think in terms of nodes, their attributes
and contents and not their textual form. I pretty often add/remove
spaces and newlines when editing XML by hand for clarity; that should
not be versioned as this does not change the substance.

I am editing SVG files by hand pretty often (using vi for simple things
and XMLmind's xxe for more complex stuff) to fix various problems
reported by users, like missing namespaces, wrong CSS, etc.

But I wouldn't really want to do that within some textarea
interface in MediaWiki. Maybe, for the purpose of educating
users, there should be some way to pretty-print the XML source
of the SVG file - but unless there is a decent XML node editor
I don't think this is something we need right now.

//Saper




[Wikitech-l] [[Template:WikimediaDownload]]

2012-07-15 Thread Marcin Cieslak
Due to various requests popping up on IRC I sometimes visit
some random MediaWiki extensions. Today's pick was

https://www.mediawiki.org/wiki/Extension:WikiForum

The infobox says it is available on Git - I remember
the old version was giving pointers to SVN and Git.

Git links there give 404, as the extension still
rests in

http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/WikiForum/

Now that ExtensionDistributor works nicely with git,
shouldn't we have {{WikimediaGit}} and {{WikimediaSVN}} templates,
or let {{WikimediaDownload}} point to SVN where necessary (older
releases or something)?

I don't fully follow what's happening there and what's the plan...
anyone?

//Saper




Re: [Wikitech-l] potential network issue due to packet losses

2012-07-03 Thread Marcin Cieslak
 Leslie Carr lc...@wikimedia.org wrote:
 When in a firewall filter, packets are rejected (which sends an ICMP
 rejected notice), the routing engine can receive too many of these
 requests, causing the routing engine to choke on its backlog of
 requests. 

Leslie, thanks for the excellent update! Was it something similar to an ICMP
storm caused by unreachables (similar to the problems caused by
subnet-directed packets in the old days) that even ICMP rate limiting
didn't help?

//Saper




Re: [Wikitech-l] Upstream keyword in bugzilla

2012-07-02 Thread Marcin Cieslak
 Niklas Laxström niklas.laxst...@gmail.com wrote:
 On 2 July 2012 17:55, Chad innocentkil...@gmail.com wrote:
 On Mon, Jul 2, 2012 at 10:45 AM, Niklas Laxström
 niklas.laxst...@gmail.com wrote:
 I think the upstream keyword in bugzilla is useless. Can we replace it
 with a field which takes an url to the upstream bug report?


 When I've filed an upstream bug, I usually put it in the URL
 field. Does that not work?

 If you can keep the keyword and URL in sync. Quick search [1] confirms
 my concerns. There seems to be only few cases where URL is used to
 indicate where the bug happens, so conflicts on there are rare. But
 the problem remains.

I just used that keyword on https://bugzilla.wikimedia.org/show_bug.cgi?id=38114
to say "can be fixed upstream, no bug filed yet" and, in this case,
"not a local customization or configuration change".

The real bug URL should be added when the bug is filed.

There is another small problem with the See also field: it currently
accepts a pretty limited set of bugtrackers.

See the discussion under https://bugzilla.mozilla.org/show_bug.cgi?id=577847
and https://bugzilla.mozilla.org/show_bug.cgi?id=735196, for example, for
how this is handled within current Bugzilla development.

//Saper



[Wikitech-l] Today's git master

2012-06-29 Thread Marcin Cieslak
$ git reset --hard
HEAD is now at de13c31 Actually we have many contributors
$

Chad, you made my day:)

//Saper





[Wikitech-l] Barkeep code review tool

2012-06-29 Thread Marcin Cieslak
As seen on IRC:

https://github.com/ooyala/barkeep/wiki/Comparing-Barkeep-to-other-code-review-tools

//Saper




Re: [Wikitech-l] bot activity in #mediawiki on freenode

2012-06-22 Thread Marcin Cieslak
 Brandon Harris bhar...@wikimedia.org wrote:


   Please move the bots out.  

I like bots. I've taken care of some bugs or CR only because
I've seen them on IRC.

+1 for flood protection and thanks for silencing l10n

//Saper




Re: [Wikitech-l] Planning for the future: prepare high resolution icons and images in your code

2012-06-20 Thread Marcin Cieslak
 Brion Vibber bvib...@wikimedia.org wrote:

 * INCLUDE THE SVG IN SOURCE CONTROL!

(...)

 We'll develop some best practices about how to switch in high-res versions
 and whether it's better to use the SVGs or double-sized PNG rasterizations.
 You don't need to use them now, just make sure the images are there when
 we're ready to use them.

One of the possible solutions (certainly not sufficient) is to ask browsers
to send us image/svg+xml in their HTTP Accept: line.
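A minimal sketch of what such negotiation could look like on the server side (illustrative names only, not MediaWiki code; a real implementation would also have to honour q-values and wildcards like */*):

```python
# Serve the SVG original only to clients that explicitly list
# image/svg+xml in their Accept header; fall back to a PNG
# rasterization otherwise.  pick_image is a hypothetical helper.
def pick_image(accept_header: str) -> str:
    # Strip q-value parameters and collect the bare media types.
    offered = {part.split(";")[0].strip() for part in accept_header.split(",")}
    return "logo.svg" if "image/svg+xml" in offered else "logo.png"

print(pick_image("text/html,image/svg+xml,*/*;q=0.8"))  # -> logo.svg
print(pick_image("image/png,image/*;q=0.8"))            # -> logo.png
```

Since a plain */* technically accepts SVG too, this explicit-entry check is deliberately conservative, which is part of why Accept-based negotiation alone is not sufficient.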

Relevant Mozilla bug:

https://bugzilla.mozilla.org/show_bug.cgi?id=240493

got WONTFIXed in January 2012 after 8 years of discussion (good read!)

//Saper




Re: [Wikitech-l] Some old proposed changes in Gerrit waiting merge, after a code review.

2012-06-14 Thread Marcin Cieslak
 Sébastien Santoro dereck...@espace-win.org wrote:
 Hi,

 I saw this morning those reviewed but not merged code changes in gerrit:

 Parser issue for HTML definition list
 Bug 11748: Handle optionally-closed HTML tags without tidy
   2012-04-17
   Owner: GWicke
   Review: +1 by saper
   https://bugzilla.wikimedia.org/11748
   https://gerrit.wikimedia.org/r/#/c/5174/

I have just rebased this one to the current master.

https://gerrit.wikimedia.org/r/#/c/5174/

git fetch https://gerrit.wikimedia.org/r/mediawiki/core refs/changes/74/5174/3 && git checkout FETCH_HEAD

//Saper



Re: [Wikitech-l] Update on IPv6

2012-06-10 Thread Marcin Cieslak
 Anthony wikim...@inbox.org wrote:
 On Sat, Jun 9, 2012 at 4:29 PM, Anthony wikim...@inbox.org wrote:
 On Fri, Jun 8, 2012 at 9:59 AM, Strainu strain...@gmail.com wrote:
 2012/6/8 Anthony wikim...@inbox.org:
 No one has to break the loop.  The loop will break itself.  Either
 enough people will get sick of NAT to cause demand for IPv6, or they
 won't.

 That one way of seeing things, but I fear it's a bit simplistic and
 naive. People won't get sick of NAT, since most of them don't know
 what NAT is anyway. They'll just notice that the speed sucks or that
 they can't edit Wikipedia because their public IP was blocked. But
 they won't know IPv6 is (part of) the solution unless someone tells
 them to, by events like the IPv6 day.

 Or by the ISP which provides IPv6 advertising those faster speeds or
 decreased privacy.

 Here at BestISP, we assign you a unique number that you can never
 change!  We attach this unique number to all your Internet
 communications, so that every time you go back to a website, that
 website knows they're dealing with the same person.

 Switch to BestISP!  1% faster communications, and the increased
 ability for websites to track you!

There are numerous reasons to have fixed IPv6 addresses per
connection. For example, I have right now around 6 devices supporting
IPv6 at home and I connect between them internally (for example, one
of them is a printer - my laptop prints on it no matter whether it
is at home or somewhere else, provided it has IPv6). You *DON'T* want to
renumber your whole home network every time your ISP changes your IPv6
prefix.

Just because some people got away with the stuff they do on the Internet
because their ISP changed their IPv4 address every now and then does
not mean that a dynamic IPv4 address provides *any* privacy.

I could argue that the current scheme with dynamic IPv4 provides less
privacy in the long term for the user. One of the reasons is that it is
difficult to run your own infrastructure (like a mail server or web server)
on one's own residential connection, so you have to rely on external
(today called cloud) providers, with the obvious privacy
consequences.

The whole point of IPv6 is to give the choice not to use external
providers - you become part of the cloud, not just a dumb consumer.

//Saper



Re: [Wikitech-l] Give create gerrit repo right to all WMF engineers

2012-06-09 Thread Marcin Cieslak
 Lars Aronsson l...@aronsson.se wrote:
 On 2012-06-06 00:19, Diederik van Liere wrote:
 A workflow where engineers have to bug a Gerrit admin to do something 
 is a broken workflow:

 As something of an outsider/newcomer, I hear two very different
 stories. The first is the story of all the good reasons why
 Linus Torvalds created git, how it is fully decentralized and
 asynchronous, and how bad it was to work with SVN. The other
 story is gerrit, and how everything must now go through this
 bottleneck of new centralization. There's a conflict here, that
 needs to be sorted out. Does Linus Torvalds really use gerrit?

No, he does not. He uses email workflow to manage patches.

Gerrit tries to do something a bit contrary to the original
git philosophy - it tries to manage commits (trees of files)
as patches (changes to the code); it also encourages
developers to work on one perfect commit at a time instead
of a feature branch.

I am not saying it's a bad or impossible workflow, but
it seems to be a bitter disappointment for people coming
from a different background (say, GitHub-like pull requests).

I would say gerrit puts a cap on a typical git workflow.
Hey, it's even difficult to review and approve changes
off-line.

//Saper




Re: [Wikitech-l] Gerrit question: pushing to another branch

2012-06-05 Thread Marcin Cieslak
 Bergi a.d.be...@web.de wrote:
 Doesn't Git encourage us to create as many branches as we can, to share 
 our work and collaborate? Or should I publish my branch(es) somewhere 
 else, maybe without gerrit at all?

Sorry to say this and many people here might disagree:

 Forget 80% of git versatility when working with gerrit.  No git merge
 (other than fast forward), no git cherry-pick (only -n) and branches
 are rarely useful for more than a local marker for commit.

You are supposed to push one perfect commit for review; you might
also push some with dependencies between them but then things get 
nasty.

//Saper




Re: [Wikitech-l] feedback from a gerrit newbie: gerrit questions; improvement of the Git/Tutorial;

2012-06-05 Thread Marcin Cieslak
 Thomas Gries m...@tgries.de wrote:

 In the Tutorial, git review -R is suggested to be used.

 Hashar showed me git review -f
 (documented on
 https://labsconsole.wikimedia.org/wiki/Git-review#Full_feature_branch_workflow_with_git-review
 )

 I suggest that the tutorial uses -f (instead of -R). Please can you
 experts think about which is better suited, or whether both should be
 covered in the Tutorial?

Everything is fine given that you don't want to --amend the commit later
for some reason.

The more we get into this the more I regret we recommended using git-review
in the first place ;)

//Saper




Re: [Wikitech-l] git review version update, May 2012

2012-05-30 Thread Marcin Cieslak
 Raimond Spekking raimond.spekk...@gmail.com wrote:
 os.chmod(target_file, os.path.stat.S_IREAD | os.path.stat.S_IEXEC)
 WindowsError: [Error 2] Das System kann die angegebene Datei nicht
 finden: '.git\\hooks\\commit-msg'


 (the last sentence means: The system cannot find the file
 '.git\\hooks\\commit-msg')

 Any ideas how to fix this?

Maybe it is related to the fact that if git submodules are used,
a relatively new git is used, and the moon is waning, there might
no longer be a .git subdirectory in the submodule (i.e. the extension)
but only one two levels below.

I think aaronsw fixed this upstream with this change:

https://review.openstack.org/#/c/7166/

(it got merged, and should be included in 1.17, the latest release of git-review).

//Saper




Re: [Wikitech-l] Gerrit question: pushing to another branch

2012-05-30 Thread Marcin Cieslak
 Ryan Kaldari rkald...@wikimedia.org wrote:
 How do you create the new branch on gerrit?

In Gerrit Web UI:

Admin - Projects - (choose project) - Branches - Create branch

//Saper




Re: [Wikitech-l] git review version update, May 2012

2012-05-27 Thread Marcin Cieslak
 Platonides platoni...@gmail.com wrote:
 On 26/05/12 20:02, Amir E. Aharoni wrote:
 `git review' says that a new version of git-review is available on PyPI.
 
 The last update created some unwanted surprises, so I decided to avoid
 updating it for now. What do our Git experts suggest?
 
 Thank you,

 It has been telling me that for a long time.
 Yet the proposed command to update (pip install -U git-review)
 dies with exec(open("git-review", "r"))
 TypeError: exec() arg 1 must be a string, bytes or code object

 So well, I just ignore it :)

You can try to download it by hand from

http://pypi.python.org/pypi/git-review#downloads

it's just one python script.

//Saper




Re: [Wikitech-l] Cry for help from translatewiki.net

2012-05-21 Thread Marcin Cieslak
 Some extension commits go past Gerrit code review. This means that it
 is impossible to even get notifications on those extensions. Some of
 those extensions are in use at translatewiki.net and given the
 numerous breakages related to those extensions lately, I am seriously
 considering removing those extensions from translatewiki.net until
 this issue is solved. That is bye bye to maps showing our registered
 translators around the world.

I think that having commits to extensions unreviewed is a good thing
for their maintainers. Auto-approval looks like a workaround.

I have a question here since I am not sure I fully understand the problem:
how are you getting notifications that i18n files have changed?
From gerrit or git? I think it should be the latter (some git hook).

//Saper




Re: [Wikitech-l] Reminder about online Git/Gerrit tutorial - May 8 (today) @ 19:00 UTC

2012-05-08 Thread Marcin Cieslak
 Gregory Varnum gregory.var...@gmail.com wrote:
 Reminder to folks about the tutorial that saper is doing for Git / Gerrit 
 later today @ 19:00 UTC.

 Below is more info, but here is a link with the highlights:
 https://www.mediawiki.org/wiki/Project:WikiProject_Extensions/MediaWiki_Workshops#Upcoming:_Git_.2F_Gerrit_Tutorial

 Remember to RSVP!
 http://doodle.com/qnrgibpqxyamqhzb

 I hope to see folks there!

 This is at least the third Git related training and more are coming.  In 
 other words - have no fear - help is on the way.  :)

I have posted some details regarding SIP/IRC/SSH setup:

  https://www.mediawiki.org/wiki/Git/Workshop

Please test your SIP connection beforehand!

See you there...

//Marcin





Re: [Wikitech-l] The most efficient approach to patchset dependencies

2012-05-08 Thread Marcin Cieslak
 Beau b...@adres.pl wrote:
 I know gerrit can use dependencies, so I can make a chain of dependent
 changes: c1 -> c2 -> c3 -> c4. However if c2 and c3 got a positive
 review and c1 needs some rework, c2 and c3 need to be reviewed again
 after I submit c1. Sometimes another, unrelated change may be merged, so
 the whole chain needs to be rebased against master.

Refactoring with code review is hard... If you refactor code,
the whole merging magic usually does more harm than good.

What I would suggest:

* Try to keep refactoring changes separate to functional changes
* You can try to separate independent changes and have a common parent
  for them

Ideally, the commit tree could look like:

            P
          / | \
        r1  r2  c1
       /  \   \
      c2  c3   c4

r1 - refactoring change 1
r2 - independent refactoring change 2
c1 - independent functional change
c2,c3 - functional changes after refactoring 1
c4 - functional change after refactoring 2 

You can achieve this by going back to the parent every time
you are finished with some other change:

 git checkout parent

or 

 git reset --soft parent (if you already have changes pending)

Make sure c2 and c3 have separate Change-Ids, otherwise
they might end up as two patchsets of the same change.
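
The layout above can be sketched as a runnable script (branch names are illustrative; this builds a throwaway repository with empty commits, showing that both changes end up sharing the common parent):

```shell
set -e
cd "$(mktemp -d)" && git init -q
ci() { git -c user.email=a@b -c user.name=demo commit -q --allow-empty "$@"; }

ci -m 'P'                      # the common parent
git branch parent              # mark it with a branch

git checkout -q -b r1          # refactoring change 1, on top of P
ci -m 'r1'

git checkout -q parent         # go back to the common parent...
git checkout -q -b c1          # ...and start the independent change there
ci -m 'c1'

git merge-base r1 c1           # prints the hash of P: both changes share it
```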

//Saper




Re: [Wikitech-l] Reminder about online Git/Gerrit tutorial - May 8 (today) @ 19:00 UTC

2012-05-08 Thread Marcin Cieslak
 Beau b...@adres.pl wrote:
 On 05/08/12 17:23, Marcin Cieslak wrote:
 See you there...

 Saper, thanks for your time! Some mysterious git commands actually make
 sense for me now :-).

There were 11 people on the call who managed to overcome issues with
SIP networking. Big thanks to alfa, hotel, india, oscar, sierra, tango,
whiskey, xray and zulu for staying for so long and your patience :)

It took almost two hours - but I hope you liked the somewhat experimental
format of the exercise. Please feel free to send your feedback to me
or Sumanah - it was the first time in this format and we need to work
to make it better.

//Saper




Re: [Wikitech-l] Lua: return versus print

2012-04-15 Thread Marcin Cieslak
 Trevor Parscal tpars...@wikimedia.org wrote:
 +1 to all the points for using return values.

Zope has a nice solution here:

   print "Asdsds"

prints actually to the internal magic variable printed
which has to be returned later with

   return printed

if it's going to end up as the function result.

Not sure if this is possible in Lua.
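
The same idiom can be emulated in plain Python (a sketch only, not Zope's actual mechanism): capture everything printed into a buffer and return it at the end.

```python
import io
import contextlib

def render():
    buf = io.StringIO()
    # Everything printed inside this block goes to the buffer,
    # playing the role of Zope's implicit `printed` variable.
    with contextlib.redirect_stdout(buf):
        print("Asdsds")
    return buf.getvalue()  # the `return printed` step

print(render(), end="")  # → Asdsds
```

A Lua equivalent would accumulate strings in a table and `table.concat` them in the return value.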

//Saper




Re: [Wikitech-l] git-review wanting to submit lot of changes

2012-04-15 Thread Marcin Cieslak
 Daniel Friesen li...@nadir-seen-fire.com wrote:
 What about the users who clone from their own GitHub fork as origin and  
 push side-project branches there before merging and pushing finished  
 projects to gerrit?

A proper fix currently in the works is to not need the .gitreview file at all
if one of the remotes (origin, gerrit or whatever) is pointing
to a valid Gerrit instance. Actually we don't need a new remote at all,
but that's the behaviour git-review insisted on.

//Saper




Re: [Wikitech-l] Save to userspace

2012-04-11 Thread Marcin Cieslak
 Petr Bena benap...@gmail.com wrote:
 It isn't stable; maybe someone should take over the work on it... If
 it was finished it would be a nice feature to have, if it was ever
 deployed of course.

Can you describe (maybe on a talk page or maybe better in bugzilla)
what's wrong with this extension (why it isn't working) and also
how would you like to have it changed? It's much easier to start
working on the obsolete codebase than from scratch. 

//Saper




Re: [Wikitech-l] selenium browser testing proposal and prototype

2012-04-11 Thread Marcin Cieslak
 Markus Glaser gla...@hallowelt.biz wrote:
 Some time ago some people from the test framework team started working on a 
 Selenium Framework for MediaWiki [1], in PHP and with Selenium 1.0. One of 
 the reasons the project discontinued was that there was no clear case of when 
 Selenium would be useful as opposed to unit tests, esp. using QUnit and 
 TestSwarm for UI testing. I still see some use cases, though:

 * This also might be useful when filing bugs (make them reproducible)

One example I would like to have scripted is the following:

- have a sysop account to watch the non-existing page name
- create that page with some content
- have a sysop to delete this page

A very good test case for DB transaction related problems.

I suppose doing such tests should be possible in the new framework?

//Saper




[Wikitech-l] @since

2012-04-11 Thread Marcin Cieslak
Hello,

I have a problem similar to the one of RELEASE-NOTES.

After great pains (broken merges, unknown dependencies, etc.)
I have pushed f74ed02ef744138a8d2a87322f81023ddc143a5f where
I have marked some methods @since 1.19 since I really hope
to have this backported to 1.19 and maybe even 1.18.

How should we handle the @since stuff? What if it never
gets merged to the release branch? If not, which one?

Should REL1_18 say @since 1.18.3 and REL1_19 say @since 1.19,
or should all consistently mention the lowest version (although
1.18.4 may be younger than, say, 1.19)?

Or, once something @since 1.20 gets merged into REL1_19,
should I modify @since to 1.19 in master and update
master's RELEASE-NOTES-1.19 as well?

The answer I got on IRC is that it is usually up to the
developer to plan for which release it should go to
in advance;  but I can understand that my changes
won't be allowed into, say, REL1_18 for this reason or another
(they are not security fixes, for example).

//Saper




Re: [Wikitech-l] GSOC proposal: Native application uploader

2012-04-04 Thread Marcin Cieslak
 Platonides platoni...@gmail.com wrote:
 Hello all,
 I'm presenting a GSOC proposal for a native desktop
 application designed for mass uploading files on
 upload campaigns.

 Opinions? Improvements? Sexy names? Mentors?

 All of them are welcome!

 1-
 http://lists.wikimedia.org/pipermail/wikilovesmonuments/2012-March/002538.html

Odder already mentions Commonist in his email, so let me expand on this
(as I was fixing older Java versions to work with a more modern MediaWikis):

You have the last Java version in our SVN:

http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/commonist-java/

as well as the next generation Commonist in Scala:

http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/commonist/

Scala version solved some architectural issues with the Java version.

I would definitely recommend building on Commonist; I actually like the
tool very much (I was still using the old Java version until recently). It
has a simple UI that meets *most* of the requirements.

Actually providing some sensible defaults (or an even simpler UI) should be
enough for WLM people. Commonist is actually quite customizable
(a lot can be done using property files and templates).

The only thing which I really don't like in Commonist is that the actual
upload phase is done together with metadata editing. Metadata wasn't
saved (at least in the older versions I have used) together with the
images (or anywhere else - images can be on a read-only medium),
so you would lose it if the tool was closed.

So probably there should be three phases:

(1) metadata management/editing (that includes some defaults for WLM
folk) 

(2) actual upload/sync (Commonist has ability to re-upload).  

(3) obtaining upload results and letting users to decide what to do with
problems (force re-upload etc.)

Some users with very limited upstream bandwidth reported quite good
results with Commonist when needing to upload lots of images and having
to leave computer working overnight to actually transfer them.

And there is one feature that a huge majority of people actually liked
- Commonist can be launched from a webpage as a Java Webstart
application, so - from the user's perspective - you don't really need
to install it on your computer. I've even talked to some users who didn't
really realize it was a separate application; it just magically worked
for them out of the browser. Huge advantage.

So from my POV - +1 for taking Commonist to the next level, even
if this means learning Scala. 

//Saper



Re: [Wikitech-l] correct way to import SQL dumps into MySQL database in terms of character encoding

2012-04-01 Thread Marcin Cieslak
 Piotr Jagielski piotr.jagiel...@op.pl wrote:
 Hello,

 set my data source URL to the following in my Java code:
 jdbc:mysql://localhost/plwiki?useUnicode=true&characterEncoding=UTF-8

Please note you have plwiki here and you imported into wiki.
Assuming your .my.cnf is not making things difficult I ran a small
Jython script to test:

$ jython
Jython 2.5.2 (Release_2_5_2:7206, Mar 2 2011, 23:12:06) 
[OpenJDK 64-Bit Server VM (Sun Microsystems Inc.)] on java1.6.0
Type "help", "copyright", "credits" or "license" for more information.
>>> from com.ziclix.python.sql import zxJDBC
>>> d, u, p, v = "jdbc:mysql://localhost/wiki", "root", None, "org.gjt.mm.mysql.Driver"
>>> db = zxJDBC.connect(d, u, p, v, CHARSET="utf8")
>>> c = db.cursor()
>>> c.execute("select cl_from, cl_to from categorylinks where cl_from=61 limit 10")
>>> c.fetchone()
(61, array('b', [65, 110, 100, 111, 114, 97]))
>>> (a, b) = c.fetchone()
>>> print b
array('b', [67, 122, -59, -126, 111, 110, 107, 111, 119, 105, 101, 95, 79, 114, 103, 97, 110, 105, 122, 97, 99, 106, 105, 95, 78, 97, 114, 111, 100, -61, -77, 119, 95, 90, 106, 101, 100, 110, 111, 99, 122, 111, 110, 121, 99, 104])
>>> for x in b:
...     try:
...         print chr(x),
...     except ValueError:
...         print "%02x" % x,
... 
C z -3b -7e o n k o w i e _ O r g a n i z a c j i _ N a r o d -3d -4d w _ Z j e d n o c z o n y c h

array('b', [ ... ]) in Jython means that the SQL driver returns an array of bytes.

It seems to me that this array of bytes contains raw UTF-8, so you need to
decode it into the proper Unicode that Java uses in strings.
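
Concretely, the signed Java bytes from the transcript above just need masking to the 0-255 range and a UTF-8 decode; a minimal illustration in Python 3:

```python
# The first bytes of cl_to from the transcript above, as signed Java bytes.
data = [67, 122, -59, -126, 111, 110, 107, 111, 119, 105, 101]

# Mask each signed byte to its unsigned 0..255 value, then decode as UTF-8.
raw = bytes(b & 0xFF for b in data)
print(raw.decode("utf-8"))  # → Członkowie
```

The pair (-59, -126) masks to (0xC5, 0x82), which is the UTF-8 encoding of "ł".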

I think this behaviour is described in

http://bugs.mysql.com/bug.php?id=25528

Probably you need to play with getBytes() on a result object
to get what you want.

//Saper




Re: [Wikitech-l] Changeset differ (was Re: Committing followups: please no --amend)

2012-03-29 Thread Marcin Cieslak
 Tim Starling tstarl...@wikimedia.org wrote:
 I wrote:

 Also, I'm concluding based on Roan's objections that I'm going to have
 a hard time convincing people to stop amending their commits. So I
 wrote this script that provides changeset diffs for reviewers:

 http://tstarling.com/gerrit-differ.php

 It fetches both commits from a local mirror into a temporary branch,
 rebases the branches to a common ancestor, and then diffs them.

There is an interesting failure mode of the whole plan :)
As you noticed on IRC comparing patchset 4 and 5 makes no sense.
Patchset 5 actually does not change anything except the commit
message relative to the patchset 4, but that's not the issue.

Side note:
 I have a small tool (https://github.com/saper/gerrit-fetch-all)
 to fetch all changesets from gerrit and branch them as
 change/3841/4. Then you can diff your changesets locally,
 in the git repository.

Here's why: patchsets 1 till 4 are based off revision

$ git log --pretty=oneline change/3841/4 ^change/3841/4^^
fcc05dee9b93e080b13fc4b0d5b83a1c75d34362 (bug 5445) remove autoblocks when user 
is unblocked
8824515e571eadd4a63b09e1331f35309315603f Fix Bug 30681 - Wrong escaping for 
inexistent messages.

While patchset 5 has been rebased to a newer changeset
from master:

$ git log --pretty=oneline change/3841/5 ^change/3841/5^^
4f2ff743ff1bc93d922ab9b5b3135786df5c7b69 (bug 5445) remove autoblocks when user 
is unblocked
571e63cd2c2bac9a033e1816f5ad8b6a14b4f42b Merge Use local context to get 
messages
95c35e52113b9a98accc1e9b0e9fffc15b1661a8 Use local context to get messages

$ git branch -vv |grep change/3841/
  change/3841/1  89daac5 Remove autoblocks when original block 
goes (Bug #5445) 
  change/3841/2  b9090b3 (bug 5445) remove autoblocks when user 
is unblocked
  change/3841/3  96692fb (bug 5445) remove autoblocks when user 
is unblocked
  change/3841/4  fcc05de (bug 5445) remove autoblocks when user 
is unblocked
* change/3841/5  4f2ff74 (bug 5445) remove autoblocks when user 
is unblocked

So here's how patchsets 4 and 5 differ according to git:

$ git log --pretty=oneline change/3841/4...change/3841/5
* 4f2ff743ff1bc93d922ab9b5b3135786df5c7b69 (bug 5445) remove autoblocks when 
user is unblocked
*   571e63cd2c2bac9a033e1816f5ad8b6a14b4f42b Merge Use local context to get 
messages
|\  
| * 95c35e52113b9a98accc1e9b0e9fffc15b1661a8 Use local context to get messages
* |   681a170f290ca0a7b0d771155ddc59f091a5576d Merge Add phpunit testcases for 
Bug 30681
|\ \  
| * | b91ffd7b09b445224cdef27a3a40bc9ded1fb8c7 Add phpunit testcases for Bug 
30681
|  /  
* | dde3821ac130486a24a7f7a97eaf0eb6d67e55d2 (bug 35541) ns gender aliases for 
Croatian (hr)
|/  
* cc2f70df0d106f84877591113d3973214bcfd36a gitignore mwsql script history file
* fcc05dee9b93e080b13fc4b0d5b83a1c75d34362 (bug 5445) remove autoblocks when 
user is unblocked

The common revision for both changesets is the next one:

$ git merge-base change/3841/4 change/3841/5   
8824515e571eadd4a63b09e1331f35309315603f

(this is the parent of change/3841/1..4)

So it is clear that diff between them will include all the changes merged to
master in between. 

My git-fu is limited, so I don't know how to compare such revisions.
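
For what it's worth, later git releases (2.19 and newer) grew `git range-diff`, which compares two versions of a series commit by commit, each relative to its own base, sidestepping the merge-base problem described above. A self-contained sketch with made-up branch names:

```shell
set -e
cd "$(mktemp -d)" && git init -q
ci() { git -c user.email=a@b -c user.name=demo commit -q "$@"; }

ci --allow-empty -m base
git branch trunk                      # stand-in for master

git checkout -q -b v1 trunk           # first version of the change
echo one > f && git add f && ci -m 'the change'

git checkout -q trunk                 # meanwhile, master moves on
ci --allow-empty -m 'unrelated work'

git checkout -q -b v2 v1              # second version: rebased and amended
git -c user.email=a@b -c user.name=demo rebase -q trunk
echo two > f && git add f && ci --amend -m 'the change'

# Compare the two versions of the series, each against its own base:
git range-diff trunk v1 v2
```

The output pairs up corresponding commits and shows only what changed between the two patchsets, not the unrelated commits merged to trunk in between.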

I think we generally run into a git architectural assumption
- git is meant to store trees of files, and diffs
or changes are not objects in the git world at all.

So I think that rebasing the patchset onto the current master is a bad idea
for review. However, not rebasing increases the likelihood of a merge conflict
once the change is approved. Maybe we should have a workflow like this:

patchset 1 - proposed change
patchset 1 - review, negative
patchset 2 - updated change
patchset 2 - review, negative
patchset 3 - updated change
patchset 3 - review, positive - change approved
patchset 4 - patchset 3 rebased to the master branch
patchset 4 - merged, closed (if ok)

Would that work in practice?

//Saper







Re: [Wikitech-l] Changeset differ (was Re: Committing followups: please no --amend)

2012-03-28 Thread Marcin Cieslak
 Tim Starling tstarl...@wikimedia.org wrote:
 I wrote:
 It doesn't work, I'm afraid. Because of the implicit rebase on push,
 usually subsequent changesets have a different parent. So when you
 diff between the two branches, you get all of the intervening commits
 which were merged to the master.

 I was hoping that someone was going to say you're wrong, making those
 diffs is easy, here's how. But I take it by the silence that I'm not
 wrong, and it really is hard.


I just tried to push a second commit to 
https://gerrit.wikimedia.org/r/#change,3841
patchset three.

If you don't start from scratch i.e. base your commit on the parent:

8824515e571eadd4a63b09e1331f35309315603f

(now I have

$ git log HEAD ^HEAD^^^
commit e67af5bbd843db3062cc0082254b69aae3d1241b
Author: saper sa...@saper.info
Date:   Wed Mar 28 22:06:17 2012 +0200

An example how a foreign key should be added to the table

Change-Id: I0da5b25f4b4499facac6c410fa7ab74250935288

commit 96692fb23c00cb726144290b108623896cf24834
Author: Marc A. Pelletier m...@uberbox.org
Date:   Tue Mar 27 22:44:32 2012 -0400

(bug 5445) remove autoblocks when user is unblocked

(...comment truncated...)

Change-Id: I4aa820ae9bbd962a12d0b48b6c638a1b6ff4efc9

This is the current HEAD:

commit 8824515e571eadd4a63b09e1331f35309315603f
Author: Santhosh Thottingal santhosh.thottin...@gmail.com
Date:   Wed Mar 28 11:25:45 2012 +0530


Trying to commit e67af5bbd843db3062cc0082254b69aae3d1241b
makes gerrit say:

 ! [remote rejected] HEAD -> refs/changes/3841 (squash commits first)

It does not matter if I use the same change ID or not. It knows
exactly where it should go but it still refuses it.

I have managed to work around this by creating a branch, doing
lots of commits there, merging it, and pushing the merge to gerrit.

But then it uploads lots of unrelated changesets:

https://gerrit.wikimedia.org/r/#change,3706
https://gerrit.wikimedia.org/r/#change,3707
https://gerrit.wikimedia.org/r/#change,3708 (but this was outside of the branch)
https://gerrit.wikimedia.org/r/#change,3709

The commit tree looked like:

 private branch:        3706 --- 3707
                       /           \
   62562768cf8f2696 --+--- 3708 --- 3709 (merge)


As you can see, although there were so many changesets,
they all have dependencies set properly.

Is this a better way? I don't know...

I wonder why in this case gerrit does not complain
with its usual  (squash commits first)

Having private branches with other people
would certainly help to work together on issues.

I tried to submit an improvement to 
https://gerrit.wikimedia.org/r/#change,3841 and
it seems I can't do this any other way than by
rebasing my changes onto the parent of the changeset
(*not* master). Not sure how to make a branch
out of it (maybe I should merge it with the parent
commit?)


//Saper






Re: [Wikitech-l] Committing followups: please no --amend

2012-03-27 Thread Marcin Cieslak
 Tim Starling tstarl...@wikimedia.org wrote:

 It doesn't work, I'm afraid. Because of the implicit rebase on push,
 usually subsequent changesets have a different parent. 

How does the implicit rebase on push work? Do you mean git-review?

I don't know whether git push remote HEAD:refs/for/master/topic
rebases anything. I still prefer a plain git push (where I can also
say HEAD:refs/changes/XXX) to git-review magic.

//Saper




Re: [Wikitech-l] Test suite for dumping MediaWikis using xmldumps-backup

2012-03-17 Thread Marcin Cieslak
 Christian Aistleitner christ...@quelltextlich.at wrote:


 Hello everyone,

 in the past weeks, I put together xmldumps-test---a test suite for
 Ariel's xmldumps-backup software. xmldumps-test tries to assure that
 the MySQL database, MediaWiki, and xmldumps-backup play nicely
 together.

 xmldumps-test injects data into the database (so do not use it on a
 live database), starts xmldumps-backup, and compares the generated XML
 dumps against pre-verified data.
 Using xmldumps-test I hope to catch problems caused by modifications
 to MediaWiki or xmldumps-backup /before/ they hit Wikimedia's
 production servers dumping enwiki, ...

 The code is up for review at
   https://gerrit.wikimedia.org/r/p/operations/dumps/test.git
 README serves as general point of entry to the documentation.
 README.installation shows you how to set up xmldumps-test.

One question:

in the https://gerrit.wikimedia.org/r/p/operations/dumps/test.git
repository there are two branches, master and ariel,
and the README says we should use ariel.

master, however, seems to be also attached to a gerrit project

I was able to check it out using 

ssh://sa...@gerrit.wikimedia.org/operations/dumps/test.git 

(port 29418)

Which shall we use? It seems that I can propose patches
using gerrit only to master, while ariel seems
to be a bit more active.


Second thing - I was recently fixing a few nuts and bolts
for seamless PostgreSQL support, so I'd love to have
that covered for PostgreSQL too. Once I sort out outstanding
installer/updater issues I am willing to help, of course.

We already ran a PostgreSQL testsuite on jenkins and
I think we should check dumps too. 

//Saper





Re: [Wikitech-l] Mark Hershberger departing Wikimedia Foundation in May

2012-03-17 Thread Marcin Cieslak
 Petr Bena benap...@gmail.com wrote:
 Hi all
 I thought that his role must have been one of the most boring, like walking
 through a lot of various reports and trying to make lazy developers
 fix them ;-) but I was surprised, when I started to work with him, that
 the work we did was almost the most entertaining experience I have ever
 had on Wikimedia. Seriously, I always felt he was underappreciated for the
 hard work he did. Thanks to Mark I got in contact with a lot of awesome
 people, including himself, and found that being a bugmeister is not boring
 work but an important part of the development process. Thanks for your hard
 work! I hope you're not leaving the project for good and will stay at least
 as a volunteer who can help us, even if not so actively.

I fully agree. Mark turned this uninteresting job into something actually
motivating. No bug was too stupid to take care of and research.
He also did some testing, which was very helpful. I have to admit Mark's
work motivated me to do a bit more with MediaWiki!

//Saper




Re: [Wikitech-l] Integrating code common to several extensions into core?

2012-03-15 Thread Marcin Cieslak
 Max Semenik maxsem.w...@gmail.com wrote:
 What is our policy/best practices on code needed by several
 extensions? Does it make sense to integrate such code into core?

 For example, my current situation: there is a class in MobileFrontend
 that performs reformatting of HTML: remove some tags depending on
 their ID/class and so on. MF uses it to make page content more
 suitable for mobile devices. At the same time, it can be used for
 other transformations such as getting plain-text extracts of articles.
 Consequentially, producing such extracts is currently part of
 MobileFrontend although such functionality should belong to a separate
 extension (and why not core?). So if I want to use this functionality
 in several extensions, they should either depend on one of them or some
 meta-extension, both of which would be inconvenient.

If done *cleverly*, why not? Do you mean something like preventing
content from being added via OutputPage::addHTML()/addElement() in the first place?
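The kind of ID/class-based stripping Max describes could look roughly like
this. MediaWiki itself is PHP, so this Python sketch (with a made-up
ClassFilter class and an assumed "navbox" class name) only illustrates the
technique, not MobileFrontend's actual code:

```python
from html.parser import HTMLParser

class ClassFilter(HTMLParser):
    """Drop any element (and everything nested inside it) whose class
    attribute matches one of the unwanted classes; pass the rest through."""
    def __init__(self, unwanted):
        super().__init__()
        self.unwanted = unwanted
        self.skip_depth = 0   # > 0 while inside an element being dropped
        self.out = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.skip_depth or self.unwanted.intersection(classes):
            self.skip_depth += 1          # still (or newly) inside a dropped subtree
        else:
            self.out.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if self.skip_depth:
            self.skip_depth -= 1
        else:
            self.out.append("</%s>" % tag)

    def handle_data(self, data):
        if not self.skip_depth:
            self.out.append(data)

f = ClassFilter({"navbox"})
f.feed('<div><p>kept</p><table class="navbox"><tr><td>dropped</td></tr></table></div>')
print("".join(f.out))   # prints: <div><p>kept</p></div>
```

If something like this lived in core, both MobileFrontend and a
plain-text-extracts extension could share it without depending on each
other.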

//Saper




Re: [Wikitech-l] We need to use git-review - need help packaging for Win/Mac (Gerrit)

2012-03-14 Thread Marcin Cieslak
 Sumana Harihareswara suma...@wikimedia.org wrote:
 I wrote:
  So if you have tried and failed to install and use
 git-review, please speak up ASAP so we can make our git-review
 instructions and workflow more robust.

 I'm not hearing anyone saying that this is still failing for them.  And,
 as Roan pointed out, people can use

There is still some problem with pushing to 

/test/mediawiki/extensions/examples

with git-review:

http://tools.wikimedia.pl/~saper/fail/gerrit-fail-01

or without:

http://tools.wikimedia.pl/~saper/fail/gerrit-fail-02

It seems to be something in the configuration:

https://labsconsole.wikimedia.org/w/index.php?title=Gerrit_bugs_that_matter&action=historysubmit&diff=2638&oldid=2611

https://labsconsole.wikimedia.org/wiki/File:Gerrit_branch_permissions_wrong.png

notice refs/for/refs/*, but changing this apparently does not help.

Fortunately, /test/mediawiki/core works fine for me.

//Saper

