Re: [Wikitech-l] pt.wikimedia.org - database naming

2016-03-03 Thread Waldir Pimenta
Well, the conversation stalled in a way that made it unclear (to me at
least) who can decide what option to choose (rename the db, if that's at
all viable, or start a new wiki using pt2wikimedia, if that's acceptable,
or something else). Is there anything we from Wikimedia Portugal can do or
say to help move this forward?

On Thu, Mar 3, 2016 at 2:16 PM, Alex Monk  wrote:

> I'm not sure what you're expecting, Alchimista. I haven't received any
> extra emails on the subject beyond those from the Phabricator task and
> those on this list.
>
> On 3 March 2016 at 11:34, Alchimista  wrote:
>
> > Any update on this?
> >
> > 2016-02-24 18:39 GMT+00:00 Legoktm :
> >
> > > Hi,
> > >
> > > On 02/24/2016 06:18 AM, Antoine Musso wrote:
> > > > Given most of the useful data/history has been exported, I would
> > suggest
> > > > to rename on WMF cluster the ptwikimedia DB to something like
> > > > ptwikimedia_old or even just archive a dump of it and drop it.
> > > >
> > > > Then create a new ptwikimedia and import the external db there.
> > >
> > > I think reusing the "ptwikimedia" db name would be ideal. My only
> > > concern with doing so is that it might be in databases somewhere, but I
> > > checked CentralAuth and there are no ptwikimedia entries, so at least
> > > that will be fine.
> > >
> > > -- Legoktm
> > >
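Legoktm's CentralAuth check above could be reproduced with a query against the centralauth database's localuser table, which records per-wiki account attachments in its lu_wiki column. The sketch below is hypothetical: it runs against an in-memory SQLite stand-in with made-up rows, not the production schema or cluster.

```python
import sqlite3

# Stand-in for CentralAuth's `localuser` table; the rows are made up
# for illustration and do not reflect real accounts.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE localuser (lu_name TEXT, lu_wiki TEXT)")
db.executemany(
    "INSERT INTO localuser VALUES (?, ?)",
    [("Alice", "ptwiki"), ("Alice", "metawiki"), ("Bob", "enwiki")],
)

# The check: are any accounts still attached to the old dbname?
(count,) = db.execute(
    "SELECT COUNT(*) FROM localuser WHERE lu_wiki = ?", ("ptwikimedia",)
).fetchone()
print("ptwikimedia attachments:", count)  # 0 here, so reuse looks safe
```

If the count came back non-zero, reusing the name would risk confusing those attachments with accounts on the new wiki.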
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> >
> >
> > --
> > Alchimista
> > http://pt.wikipedia.org/wiki/Utilizador:Alchimista
> >
>

Re: [Wikitech-l] pt.wikimedia.org - database naming

2016-02-24 Thread Waldir Pimenta
Assuming there's no easy way to merge the databases, we are fine with
dropping the old db. I believe most content was imported to the current
wiki at the time of the migration, see
https://phabricator.wikimedia.org/T25537. An XML dump was used, not an SQL
one, so I suppose stuff like logs may not have been preserved, but in any
case it's not critical that we preserve all that historical info. I mean,
it would certainly be nice, but we can live without it.

Or we could use the pt2wikimedia name, as that would allow future archeologists
to recover the data from the beginning of the wiki :) Either option is fine.

On Wed, Feb 24, 2016 at 12:13 PM, This, that and the other <
at.li...@live.com.au> wrote:

> Why not just delete the old ptwikimedia site and put the new one in its
> place, using the same dbname?
>
> The old wiki is inaccessible, since pt.wikimedia.org redirects offsite,
> so it's unclear if the old DB even needs to be preserved. And presumably
> any configuration bits that refer to ptwikimedia will still be relevant to
> the new site.
>
> If for some reason that is not feasible, I guess pt2wikimedia is
> acceptable, though only as a last resort. As I've said before, there really
> needs to be a better way to rename wikis without wasting hours of
> everyone's time...
>
> TTO
>
> --
> "Alex Monk"  wrote in message
> news:CALMPGzX_E4ML0xD2A_2vTRZ+2a+nWtpC9KkJe58x=mdg-uo...@mail.gmail.com...
>
>
> Hi all,
>
> A request has come up (https://phabricator.wikimedia.org/T126832) to
> re-create pt.wikimedia.org on the wikimedia cluster. Unfortunately it was
> previously hosted there and so the 'ptwikimedia' database name is already
> taken.
> Since database renaming does not really appear to be an option, does anyone
> have any objections to using 'pt2wikimedia' (or similar, suggestions
> welcome) instead for the new wiki? I know this doesn't fit the existing
> pattern so I'm unsure about just going ahead without asking for input from
> a wider audience.
>
> Alex
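The fallback naming Alex describes, i.e. keep the `<lang><family>` convention and fall back to a numeric infix such as pt2wikimedia when the plain name is taken, can be sketched as follows. The helper name and behaviour are illustrative only; MediaWiki has no such function.

```python
def next_free_dbname(lang, family, taken):
    """Propose a database name following the '<lang><family>' convention,
    inserting a numeric infix (pt2wikimedia-style) when the name is taken."""
    base = lang + family
    if base not in taken:
        return base
    n = 2
    while f"{lang}{n}{family}" in taken:
        n += 1
    return f"{lang}{n}{family}"

print(next_free_dbname("pt", "wikimedia", {"ptwikimedia"}))  # pt2wikimedia
print(next_free_dbname("pt", "wikimedia", set()))            # ptwikimedia
```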
>
>
>

[Wikitech-l] Primerpedia updated

2014-02-26 Thread Waldir Pimenta
Hi all!

Some of you might remember my previous announcement
(http://www.gossamer-threads.com/lists/wiki/wikitech/321799) of the
Primerpedia proof of concept for the Concise Wikipedia proposal
(https://meta.wikimedia.org/wiki/Concise_Wikipedia) on
Meta. After a long while, I finally implemented a basic search
meta. After a long while, I finally implemented a basic search
functionality and some other fixes, so it should now be a minimally usable
tool. Give it a try! http://waldir.github.io/primerpedia/

--Waldir

ps - please file issues or pull requests on the GitHub repo (linked in the
footer of the site)

Re: [Wikitech-l] Fwd: suggestions regarding my proposals for gsoc and outreach program

2013-04-12 Thread Waldir Pimenta
Hi Richa.

Your project seems interesting. I wonder if you have already heard of
hypothes.is: http://hypothes.is/what-is-it
It seems quite similar to what you're proposing, and they have conducted a
very detailed study of previous projects:
https://docs.google.com/spreadsheet/ccc?key=0Am32b0H2bOCCdFgwRDh6UVIxVzI4ajdadEJVTEZkX0E

Maybe you could make use of the findings they uncovered in their research,
or even implement your project in a way that's interoperable with their API.

Their code is in github: https://github.com/hypothesis/h


On Fri, Apr 12, 2013 at 3:42 PM, Richa Jain richa.jain1...@gmail.com wrote:

 Hi, I have written a rough draft for my proposal to the Outreach Program and
 GSoC, and need your suggestions regarding it:
 http://www.mediawiki.org/wiki/User:Rjain/Gsoc-Prototyping-inline-comments.

 Thank you.

Re: [Wikitech-l] [Design] Wikimedia Iconathon Remote Participation

2013-04-09 Thread Waldir Pimenta

  On 09/04/13 04:19, Steven Walling wrote:

 On Monday, April 8, 2013, Waldir Pimenta wrote:

  Apart from that, where were the results posted? All I found were
 pictures from the event, spread across several categories. I've collected
 them on http://commons.wikimedia.org/wiki/Category:Iconathon_2013 and
 cleaned up the other categories they were polluting (e.g. Category:I !!). I
 would expect the results to be uploaded to
 http://commons.wikimedia.org/wiki/Category:The_Noun_Project but so far I
 haven't seen any new icons there. Is this coming soon?


  Waldir, the event just happened two days ago, and yesterday was a
 Sunday. Please show a little patience.

  The main activity during the event was design work and critique that did
 not yet involve making permanent digital assets. The core part of this was
 not making simple black and white SVGs of icons, but churning out as many
 ideas for representing concepts as possible, which we did collaboratively
 in groups on paper. Then we sorted and assessed these ideas with lots of
 back and forth between designers, Wikimedians present, etc.

  Now that the idea phase is over and it's time to move toward execution,
 Vibha and others who ran this will be posting more updates this week. All
 relevant materials will of course be available for use on Commons.


On Tue, Apr 9, 2013 at 6:06 AM, Isarra Yos zhoris...@gmail.com wrote:

  I think the point Waldir was making was about a lack of information
 preceding the event, not so much the follow up.


Thanks, that was indeed my main concern. I was a little frustrated as I
would like to have at least watched parts of the event, and even accepted
that it wouldn't be possible due to the nature of the event, only to be
informed, after the fact (I didn't read wikitech-l in the hours prior to
the event) that some sort of remote participation was actually possible.
This led me to vent a little more than I should have in my message, and I
apologize for that (also, I didn't mean to imply that making black and
white icons is a simple matter that should be done quickly; sorry if that
was the impression I gave). Nevertheless, my concern still stands:
although several event pages were created (more than should have been, imo,
but that's another issue), none of them made any reference to either remote
participation, or how non-participants could expect to know about the
outcomes.

Re: [Wikitech-l] Wikimedia Iconathon Remote Participation

2013-04-08 Thread Waldir Pimenta
Sorry, but this announcement was frankly too little, too late. I had looked
at the Wikimedia blog post, the Noun Project blog post, the eventbrite
page, the talk pages of the event pages in mediawiki, meta and wikipedia
(btw, why so much duplication? where was the central hub?) and there was
nothing even suggesting that remote participation would be possible, or
that a hangout could be expected. This is even worse considering that the
question was asked a while ago
(http://lists.wikimedia.org/pipermail/design/2013-March/000423.html) but
nobody seems to have answered it.

Apart from that, where were the results posted? All I found were pictures
from the event, spread across several categories. I've collected them on
http://commons.wikimedia.org/wiki/Category:Iconathon_2013 and cleaned up
the other categories they were polluting (e.g. Category:I !!). I would
expect the results to be uploaded to
http://commons.wikimedia.org/wiki/Category:The_Noun_Project but so far I
haven't seen any new icons there. Is this coming soon?

Self-answer, after doing a little research: "Afterwards, The Noun Project
will take the pen/paper sketches and have their vector artists render them
in SVG. They will be released about a month after the event" — from
http://lists.wikimedia.org/pipermail/design/2013-March/000421.html. Yet
this wasn't mentioned anywhere in any of the pages I listed above. A
"results" section would be most welcome for those who weren't able to
participate.

Related, for anyone interested: I recently came across interesting
resources for the kinds of icons that the Noun Project seems to focus on,
and tried to organize the categories in commons to make them more findable:
http://commons.wikimedia.org/wiki/Category:Plain_black_icons (and
subcategories) and fontello.com (font-based icons, all of them freely
licensed).

--Waldir

ps - I don't understand why this announcement wasn't also posted to the
design mailing list. I'm CC'ing it.


On Sat, Apr 6, 2013 at 6:34 PM, Munaf Assaf mas...@wikimedia.org wrote:

 Hello all,

 You can watch the iconathon here:
 http://youtu.be/sei9SeoObJA

 If you would like to participate in the hangout, please email me, Pau Giner
 (pgi...@wikimedia.org) or Vibha Bamba (vba...@wikimedia.org).

 Thanks!
 Munaf

Re: [Wikitech-l] A new project - Wikicards

2013-04-08 Thread Waldir Pimenta
I share Sumana's and Max's concerns. The lead paragraph and the lead
sentence of the article are indeed meant to provide a good, stand-alone
summary of the subject, withing the limits of their corresponding lengths.
The guidelines concerning these are located at
WP:LEADhttp://en.wikipedia.org/wiki/Wikipedia:Manual_of_Style/Lead_sectionand
its section
WP:BEGINhttp://en.wikipedia.org/wiki/Wikipedia:Manual_of_Style/Lead_section#First_sentence,
respectively. The changes you suggest as an example for the Madonna article
are actually largely compatible with the spirit of those guidelines. I
suggest you become familiar with both as you review your proposal.
You should also take a look at
http://meta.wikimedia.org/wiki/Concise_Wikipedia, which presents several of
the use cases you already mention. The Related projects/proposals section
of that page is particularly interesting for seeing what others have
already proposed/implemented.

In summary, I think your proposal *could* be useful if it implemented a way
to provide subject-specific layouts for the article leads or first sentences
(e.g. highlighting the birth date if it's a person, etc.), but Wikidata
(https://www.wikidata.org/wiki/Wikidata:Introduction) kind of renders such a
heuristic and/or manual annotation approach obsolete, since those pieces of
information will be explicitly annotated as such, and in a much more
universal way than current semantic markup systems
such as PERSONDATA (http://en.wikipedia.org/wiki/Wikipedia:Persondata). Not
to mention the advantage of not requiring manual sync between the
cardified data and the current lead section, and also the ability to use
the same set of data in any Wikipedia, and indeed, in any site of the web
as Wikidata matures.

My suggestion is to reconsider your proposal, but I would like to leave you
a word of encouragement, since you clearly had a good idea (the fact that
many before you have thought of ways to tackle the same problems confirms
that), and took the time to investigate existing implementations (i.e. the
Popups) and create a detailed exposition of your plan. I'm sure your GSoC
project, whatever you choose, will have good chances of success. Keep up
the good work!


On Mon, Apr 8, 2013 at 1:56 PM, Sumana Harihareswara
suma...@wikimedia.org wrote:

 On 04/07/2013 02:52 PM, Max Semenik wrote:
  On 07.04.2013, 20:11 Paul wrote:
 
  Indeed. We have a Universal Language Selector for 3rd party websites
  already. This one would be super-duper great. This is where Wikidata
  can play its part as well!
 
  As for images, we have already a PageImages extension in all wikis,
  and we have Wikidata, where item properties may point to images.
 
  Also, there's Navigation popups[1] while with
  api.php?action=queryprop=extracts you can get the text of page lede,
  or N first sentences. To summarize, almost everything (or
  everything) needed for this project is already available, so in a
  couple hours of hacking it should be possible to get navpopups to
  work in the way described in the proposal. I wonder how much research
  the proposer did before making it public? Nevertheless, I'm not
  saying that this proposal is not worth a SoC, however it should be
  heavily revised based on input from this thread, then we could decide
  if it has enough potential.

 I'm glad Gaurav shared this proposal with us so early so that we can
 help with criticism and other feedback.  Everyone else reading this
 who's thinking about proposing a summer project: please tell the list
 ASAP so we can help you too!

 I think my main concern regarding Gaurav's proposal is the same one
 Yaron mentioned in https://www.mediawiki.org/wiki/User_talk:Grv99 : as
 our guidelines state, YES to projects already backed by a Wikimedia
 community. NO to projects requiring Wikipedia to be convinced.
 https://www.mediawiki.org/wiki/Summer_of_Code_2013#Your_project

 Gaurav, the reason we have these guidelines is that we have seen past
 failures and want to avoid them in the future.

 Maybe you could consider thinking about what you are really interested
 in and finding a more achievable way of working towards that within the
 structure of GSoC.  For example, if you want to ensure that there's a
 canonical photo and one-sentence summary associated with every article
 topic, maybe you could work with Wikidata on that -- and then a future
 student can improve the Navigation Popups gadget (and spread it across
 all the wikis) to make use of the photo and summary sentence.  That's
 just one idea for an approach that's more likely to succeed.
 --
 Sumana Harihareswara
 Engineering Community Manager
 Wikimedia Foundation



Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-04 Thread Waldir Pimenta
Keeping a single wiki seems to be the most sensible approach. I agree with
Erik that there are good ways to separate different types of content, such
as namespaces, and I agree with Yuri that wikitech is the best choice of
name, since it is a superset of mediawiki.
I think we should agree on this first, and then decide whether we want the
more ambitious, SMW-powered features (I am for them as well, btw).

--Waldir

On Thu, Apr 4, 2013 at 8:10 AM, Erik Moeller e...@wikimedia.org wrote:
 If feasible, I would at the end of the day still argue in favor of a
 single consolidated technical wiki. I realize calling that wiki
 mediawiki.org is not ideal, but beyond the domain name, a lot can be
 done to provide reasonable divisions (namespaces, navigation, etc.),
 so that MediaWiki the product and other Wikimedia technical projects
 and processes are clearly distinct.

On Thu, Apr 4, 2013 at 8:58 AM, Yuri Astrakhan yastrak...@wikimedia.org
wrote:
 If we are not happy with the URL, let's come up with another one, or decide
 that wikitech is the preferred name, and redirect everything there.

Re: [Wikitech-l] Making it possible to create method documentation links on MW.org

2013-03-22 Thread Waldir Pimenta
On Fri, Mar 22, 2013 at 9:43 AM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:

 Right now a lot of our links to generated documentation on php classes are
 manually copied.

 One of the reasons for this seems to be how badly Doxygen handles the
 anchors to sections on individual methods.
 Instead of simply using the method name it generates a md5 hash of a
 signature that is impossible to know simply with a class and method name.
 Meaning we can't simply have a template to link to documentation.

 However Doxygen exports a tagfile that includes a list of classes, their
 methods, and the anchors used for their links.

 This is currently at: (*warning* large file)
 https://doc.wikimedia.org/mediawiki-core/master/php/html/tagfile.xml

 What does everyone think of making it so that when Jenkins generates this
 documentation, it processes the tagfile, splits it up and converts it into
 multiple Lua tables, then uses the API to update a Module: page on
 mediawiki.org?

 This way templates can have some Lua code that uses that data to create
 links with something like:

 local classes = mw.loadData( 'DoxygenTags mediawiki-core class' )
 ...

 function p.methodLink(className, methodName)
 local class = classes[className]
 ...
 local method = members["function"][methodName]
 ...
 return "https://doc.wikimedia.org/mediawiki-core/master/php/html/"
 .. method.anchorfile .. "#" .. method.anchor
 end

 ...


I would go further and suggest a way to integrate code comments into manual
pages at mediawiki.org, so that we could have good documentation in both
code and mw.org, without needing to sync it manually. For example, the
detailed description of api.php found at
https://doc.wikimedia.org/mediawiki-core/master/php/html/api_8php.html#details
could be integrated into https://www.mediawiki.org/wiki/Manual:api.php

Similarly, it should be possible to integrate the contents of README files
and the docs folder into mw.org.

If what Daniel suggests is feasible, I assume this also is, and imo would
greatly improve the availability and quality of both in-wiki and code
documentation.
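The Jenkins step Daniel sketches above (parse tagfile.xml and turn it into Lua tables readable via mw.loadData) could look roughly like this. A hypothetical sketch: the inline tagfile snippet, the output layout, and the function name are invented for illustration, though the element names (compound, member, anchorfile, anchor) follow Doxygen's actual tagfile format.

```python
import xml.etree.ElementTree as ET

# Minimal Doxygen-style tagfile snippet; the real one lives at
# https://doc.wikimedia.org/mediawiki-core/master/php/html/tagfile.xml
TAGFILE = """<tagfile>
  <compound kind="class">
    <name>Title</name>
    <member kind="function">
      <name>newFromText</name>
      <anchorfile>classTitle.html</anchorfile>
      <anchor>a1b2c3</anchor>
    </member>
  </compound>
</tagfile>"""

def tagfile_to_lua(xml_text):
    """Convert class/method anchors into a Lua table literal, suitable
    for pasting into a Module: page and reading via mw.loadData()."""
    root = ET.fromstring(xml_text)
    lines = ["return {"]
    for compound in root.findall("compound[@kind='class']"):
        cls = compound.findtext("name")
        lines.append('  ["%s"] = {' % cls)
        for m in compound.findall("member[@kind='function']"):
            lines.append('    ["%s"] = { anchorfile = "%s", anchor = "%s" },'
                         % (m.findtext("name"),
                            m.findtext("anchorfile"),
                            m.findtext("anchor")))
        lines.append("  },")
    lines.append("}")
    return "\n".join(lines)

lua = tagfile_to_lua(TAGFILE)
print(lua)
```

A template's Lua code could then look up `classes["Title"]["newFromText"].anchorfile` and `.anchor` to build a stable documentation link.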

--Waldir

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-10 Thread Waldir Pimenta
On Sat, Mar 9, 2013 at 8:39 PM, Ryan Lane rlan...@gmail.com wrote:

 On Sat, Mar 9, 2013 at 12:15 PM, Yuri Astrakhan yuriastrak...@gmail.com
 wrote:

  Should we re-start the lets migrate to github discussion?
 
 
 No.


To be fair, while I understand the arguments against using GitHub (though
accepting pull requests from it is a must!), I always had the impression
that other options weren't given enough attention during that discussion.
Particularly GitLab looked very usable, could be used for self-hosting,
etc., and never even got much content in the "case against" section (only
two items, both currently marked as no longer true). I am not entirely
sure why it was discarded so promptly in favor of the admittedly powerful,
but clearly problematic/controversial Gerrit. Are there strong reasons to
dismiss it that weren't stated on that page?

--Waldir

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-10 Thread Waldir Pimenta
On Sun, Mar 10, 2013 at 8:55 AM, Waldir Pimenta wal...@email.com wrote:

 Are there strong reasons to dismiss it that weren't stated on that page?


Sorry, forgot the link:
http://www.mediawiki.org/wiki/Git/Gerrit_evaluation#GitLab

[Wikitech-l] Announcing the Wikimedia technical search tool

2013-03-10 Thread Waldir Pimenta
Hi all,

I'd like to announce a recently created tool that might help the Wikimedia
technical community find stuff more easily. Sometimes relevant information
is buried in IRC chat logs, messages in any of several mailing lists, pages
in mediawiki.org, commit messages, etc. This tool (essentially a custom
google search engine that filters results to a few relevant URL patterns)
is aimed at relieving this problem. Test it here: http://hexm.de/mw-search

The motivation for the tool came from a post by Niklas [1], specifically
the section Coping with the proliferation of tools within your community.
In the comments section, Nemo announced his initiative to create a custom
google search to fit at least some of the requirements presented in that
section, and I've offered to help him tweak it further. The URL list is
still incomplete and can be customized by editing the page
http://www.mediawiki.org/wiki/Wikimedia_technical_search (syncing with the
actual engine will still have to happen by hand, but should be quick).

Besides feedback on whether the engine works as you'd expect, I would like
to start some discussion about the ability for Google's bots to crawl some
of the resources that are currently included in the URL filters, but return
no results. For example, the IRC logs at bots.wmflabs.org/~wm-bot/logs/.
Some workarounds are used (e.g. using github for code search since gitweb
isn't crawlable) but that isn't possible for all resources. What can we do
to improve the situation?
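On the crawlability side, one thing that can be checked mechanically is whether a host's robots.txt even permits indexing of a given path; Python's stdlib robotparser makes this easy. The rules below are invented for illustration, and the actual policy on bots.wmflabs.org may be entirely different.

```python
from urllib import robotparser

# Hypothetical robots.txt rules for a labs host serving IRC logs;
# the real rules (if any) would be fetched from the host itself.
RULES = """User-agent: *
Disallow: /~wm-bot/logs/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("Googlebot", "http://bots.wmflabs.org/~wm-bot/logs/foo.txt"))  # False
print(rp.can_fetch("Googlebot", "http://bots.wmflabs.org/index.html"))            # True
```

If the logs turned out to be disallowed like this, fixing robots.txt would be step one; making the pages reachable through plain crawlable links would be the other half of the problem.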

--Waldir

1.
http://laxstrom.name/blag/2013/02/11/fosdem-talk-reflections-23-docs-code-and-community-health-stability/

Re: [Wikitech-l] Announcing the Wikimedia technical search tool

2013-03-10 Thread Waldir Pimenta
On Sun, Mar 10, 2013 at 8:53 PM, Platonides platoni...@gmail.com wrote:

  I'm not convinced about [[en:MediaWiki_talk:*]] and
 [[en:Template_talk:*]], they can bring quite a bit of noise (similarly
 for [[en:Wikipedia:Village_pump_(technical)]]). I see how interesting
 discussions could be happening there, though.


The tabs in the search results page (sorry I didn't mention them in the
previous email) can be used to filter results to more relevant content, if
desired. I think that might help cope with the noise.

  Besides feedback on whether the engine works as you'd expect, I would
 like
  to start some discussion about the ability for Google's bots to crawl
 some
  of the resources that are currently included in the URL filters, but
 return
  no results. For example, the IRC logs at bots.wmflabs.org/~wm-bot/logs/.
  Some workarounds are used (e.g. using github for code search since gitweb
  isn't crawlable) but that isn't possible for all resources. What can we
 do
  to improve the situation?
 Do we really want Google to index them?


Why log them publicly if we don't make them searchable? Either we're
committed to being open or we're not... having a public but hard-to-use
archive seems somewhat contradictory to me.

--Waldir

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-09 Thread Waldir Pimenta
On Sat, Mar 9, 2013 at 11:48 AM, Brian Wolff bawo...@gmail.com wrote:

 On 3/8/13, Waldir Pimenta wal...@email.com wrote:
  ...I would suggest the mw-config directory to be renamed to something
  that more clearly identifies its purpose. I'm thinking first-run or
  something to that effect. I'll submit a patchset proposing this.

 The installer is also used to do database updates when upgrading
 MediaWiki, so it's not just a run-once-and-only-once thing.


So mw-config can't be deleted after all? Or do you mean the installer at
includes/installer?
If you mean the former, then how about "run-installer" instead of my
previous proposal of "first-run"?
Either of these would be clearer than "mw-config", imo.

--Waldir

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-08 Thread Waldir Pimenta
On Wed, Feb 27, 2013 at 9:13 PM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:


 index.php, api.php, etc... provide entrypoints into the configured wiki.

 mw-config/ installs and upgrades the wiki. With much of itself
 disconnected from core code that requires a configured wiki. And after
 installation it can even be eliminated completely without issue.


I think this clarifies the issue for me. Correct me if I'm wrong, but
basically the entry points are for continued, repeated use, for indeed
*accessing* wiki resources (hence I suggest normalizing the name of these
scripts to "access points" everywhere in the docs, because "entry" is a
little more generic), while mw-config/index.php is a one-off script that
has no use once the wiki installation is done. I'll update the docs on
mw.org accordingly, to make this clear.


 I wouldn't even include mw-config in entrypoint modifications that would
 be applied to other entrypoint code.


You mean like this one (https://gerrit.wikimedia.org/r/#/c/49208/)? I can
understand, in the sense that it gives people the wrong idea regarding its
relationship with the other access points, but if the documentation is
clear, I see no reason not to have mw-config/index.php benefit from changes
when the touched code is the part common to all *entry* points (in the
strict meaning of files that can be used to enter the wiki from a web
browser).

That said, and considering what Platonides mentioned:

It was originally named config. It came from the link that sent you
 there: You need to configure your wiki first. Then someone had
 problems with another program that was installed sitewide on his host
 appropriating the /config/ folder, so it was renamed to mw-config.


...I would suggest the mw-config directory to be renamed to something that
more clearly identifies its purpose. I'm thinking first-run or something
to that effect. I'll submit a patchset proposing this.

--Waldir

Re: [Wikitech-l] Poor man's Liquid Threads

2013-03-04 Thread Waldir Pimenta
Seems pretty usable and a good lightweight alternative to LT. If its bugs
are fixed and it's made independent of Wiktionary's common.js, I'd say it
would be an interesting feature to add to core as a preference (at least
while better options aren't available natively).

--Waldir

On Sun, Mar 3, 2013 at 11:33 PM, Lars Aronsson l...@aronsson.se wrote:

 Have a look at
 https://en.wiktionary.org/wiki/User:Yair_rand/addcomment.js

 It allows you to add a comment, signed and correctly
 indented, to the end of a section of a talk page.

 It works fine on my Android phone, which is more than
 can be said of the regular edit function.

 The discussion (ironically in Liquid Threads) is found here,
 https://en.wiktionary.org/wiki/Thread:User_talk:Yair_rand/Comment_gadget


 --
   Lars Aronsson (l...@aronsson.se)
   Aronsson Datateknik - http://aronsson.se



Re: [Wikitech-l] Developer Hub simplification

2013-02-27 Thread Waldir Pimenta
On Wed, Feb 27, 2013 at 3:39 PM, Quim Gil q...@wikimedia.org wrote:

 And more: that overwhelming page focuses on PHP / MediaWiki core /
 extensions while all the newer areas of development are still missing. A
 Developer Hub must be the hub for all developers.


Agreed.


 Following with your proposal, we could have:
 - Developer Hub: a general overview with a very focused selection of links
 to send people to appropriate places. Less is more.
 - /Get started (or similar name): intro for newcomers with relevant links.
 - /Reference (or similar name): what developers need on a regular basis.

 Developer Hub and Get started should be more concise and visually
 pleasant, satisfying new developers. Reference is for the usual suspects, so it
 could have more beef and links in order to have everything in one place.

 Do you think this makes sense? Should we start discussing at
 http://www.mediawiki.org/wiki/Talk:Developer_hub?


Yes, that makes sense, though we need to think about how the pages in the
Manual namespace would fit this arrangement. Other developers' opinions
(seasoned and newcomers alike) would be quite valuable to prevent wasted
effort due to overlooking details like these.

Also, I agree that it would be best to have this discussion at
Talk:Developer hub. Perhaps copying this thread there (or linking to the
mailing list archives) would be a good idea, to provide context.

--Waldir

Re: [Wikitech-l] Gsoc 2013 guidelines

2013-02-26 Thread Waldir Pimenta
On Tue, Feb 26, 2013 at 9:23 AM, Atul Agrawal adi.md...@gmail.com wrote:

 Thanks a lot. Will try to fix bugs. But what knowledge should I have to fix
 bugs? Is knowing PHP sufficient? Till now I have read this page:
 Developing extensions
 (http://www.mediawiki.org/wiki/Manual:Developing_extensions). If you can
 share the documents one should read before fixing bugs, I will be
 highly obliged.


I've been collecting a few links at
http://www.mediawiki.org/?oldid=651991#MediaWiki_reference
that might be useful for newcomer developers to get a first overview of
various components of the MediaWiki universe.

Let me know if those are helpful to you and what you think is missing from
that list. Bear in mind that it's a work in progress, though.

--Waldir

ps - while you're at that page, if any of the bugs I've listed in the easy
part of the "Potential bugs for hacking sessions" section interest you,
I'd love to help you go about fixing them.

Re: [Wikitech-l] Developer Hub simplification (was Re: Gsoc 2013 guidelines)

2013-02-26 Thread Waldir Pimenta
Thanks for the pointer.

My main concern with that page, and with most documentation on mw.org (and
the reason I started a personal list, so I could quickly find the stuff
that matters to me), is that it's too massive, with too many links (as you
noted yourself) and too much stuff highlighted. I honestly feel a little
overwhelmed reading it, finding myself following links only to discover
that their target doesn't provide the information at the abstraction level
I expected.

I'll speculate a little, and suggest that the root of this disorientation
is that we don't separate reference docs from tutorial docs and from what
I'll call guided overviews, and don't let each page clearly assume one of
those identities.

The list I've been putting together comprises reference docs: essentially
index-like material that one can look up whenever one needs information on
a specific part of MediaWiki. There isn't much done in this regard. Brion
once started a page in 2009 which he called Module registry
(https://www.mediawiki.org/wiki/Modular_registry_of_components); it looked
promising but hasn't been updated since.

Tutorials should be mostly aimed at absolute newcomers: very short, and
with a rather gentle learning curve; i.e., they should not introduce too
many concepts too fast, nor assume a lot of previous knowledge. Modular,
interconnected components should be described in separate pages and
hyperlinked so budding developers can RTFM, choose-your-own-adventure
style. Currently very little of the docs fits that goal.

Guided overviews are like Wikipedia articles: long, descriptive, mostly
sequential texts describing an important component of the MediaWiki
infrastructure, such as the skinning system or the messages API. I'm
actually not sure about the usefulness of such long guides, since
newcomers will have a hard time following them, and experts will likely
benefit more from index-like reference material. Maybe they should be
broken down into interconnected bite-sized tutorials.

I just came up with these categories, so there are probably many flaws in
this analysis, but at a glance this seems like a good-enough overview of
the state of our docs and what (in my humble opinion) needs to be done.
Maybe my position won't be shared by many, but I'll be content if it
sparks some discussion about what overall strategy we need to implement to
improve MW docs (which is undeniably considered important: let's not
forget it's bug #1).

--Waldir

On Tue, Feb 26, 2013 at 10:14 PM, Quim Gil q...@wikimedia.org wrote:

 On 02/26/2013 06:40 AM, Waldir Pimenta wrote:

 I've been collecting a few links at
 http://www.mediawiki.org/?oldid=651991#MediaWiki_reference
 that might be useful for newcomer developers to get a first overview of
 various components of the MediaWiki universe.


 Your help is welcome aligning your selection with
 http://www.mediawiki.org/wiki/Developer_hub

 By the way, do we really need 23 links under Key documents,
 after the 23 links contained at the Overview section,
 following the first paragraph containing 5 links?

 No wonder someone like Atul keeps asking about what to read for a first
 contribution.

  Let me know if those are helpful to you and what you think is missing from
 that list. Bear in mind that it's a work in progress, though.



 --
 Quim Gil
 Technical Contributor Coordinator @ Wikimedia Foundation
 http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-18 Thread Waldir Pimenta
On 15/02/13 09:16, Waldir Pimenta wrote:

 should all access points be on the root directory of the wiki, for
 consistency? currently mw-config/index.php is the only one not in the root.


On Mon, Feb 18, 2013 at 3:56 PM, Platonides platoni...@gmail.com wrote:

 Well, every bit of the installer used to be in the config folder, until
 the rewrite, which moved the classes to includes/installer.
 Now you mention it, you're probably right in that it could be moved to
 e.g. /installer.php


OK, so I'll ask here on the list: is there any reason we shouldn't move
[root]/mw-config/index.php to [root]/installer.php?

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-18 Thread Waldir Pimenta
On Mon, Feb 18, 2013 at 5:17 PM, Krinkle krinklem...@gmail.com wrote:

 But before more bike shedding (have we had enough these last 2 months
 yet?), is there a problem with having a directory?


It somewhat breaks the pattern, considering that all the other access
points (and their corresponding php5 files) are located in the root. That
leaves only overrides.php, which I'm not sure why was kept in mw-config,
considering that (quoting Platonides) the installer used to be in the
config folder, until the rewrite, which *moved the classes* to
includes/installer (emphasis mine). If the classes were moved to
includes/installer, why did overrides.php remain? So it can be easily
deleted? It's hardly the only file kept around that people won't use, so
the convenience of easy deletion doesn't seem that big an advantage,
especially at the expense of a logical organization of the code.

By the way, there's also an INSTALL file in the root, so even if
overrides.php were moved there, it wouldn't be the only
installation-related non-access-point kept there.

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-15 Thread Waldir Pimenta
On Fri, Feb 15, 2013 at 11:58 AM, Platonides platoni...@gmail.com wrote:

 On 15/02/13 09:16, Waldir Pimenta wrote:
  1) should all access points be on the root directory of the wiki, for
  consistency?

 No. The installer is on its on folder on purpose, so that you can delete
 that folder once you have installed the wiki.


Sorry if I wasn't clear. I meant that mw-config/index.php should be in the
root, not that the installer files should. Unless I'm misunderstanding you,
and by "the installer" you mean the contents of mw-config/ rather than
/includes/installer (which would prove my point about unclear nomenclature).

 Also, I used Tim Starling's suggestion on IRC to make sure the list of
  entry point scripts listed in Manual:Code was complete: git grep -l
  /includes/WebStart.php
  I am not sure that exhausts the list, however, since thumb_handler.php
  doesn't show up on its results. Any pointers regarding potential entry
  points currently omitted from that list are most welcome.

 That's probably because it doesn't include WebStart (it includes
 thumb.php, which is the one including WebStart).

 Take a look at tools/code-utils/find-entries.php. I have updated it to
 add a few new rules in https://gerrit.wikimedia.org/r/49230

 It will give you about 100 files to check, most of them CLI scripts,
 although there are a few web-enabled ones, such as
 tests/qunit/data/styleTest.css.php

 Use -d to see why that file was considered an entry point. As you'll
 see, it is very strict (with reason) in what it considers safe.


Thanks, that sounds quite useful, but I don't seem to be able to run it
properly (I get a few PHP warnings, and 0/0 as output). I placed it in
the root dir of my local wiki. Am I missing anything?

Re: [Wikitech-l] Gerrit reviewer bot update

2013-02-15 Thread Waldir Pimenta
Update: A new version has been deployed that supports the match_all_files
parameter to make file_regexp apply to all files in the patchset, rather
than any of the files. Example use:

{{Gerrit-reviewer|User 1|file_regexp=<nowiki>\.txt</nowiki>|match_all_files}}

Still lacking is support for a commit_msg_regexp parameter, which should
work similarly to the existing file_regexp.
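For illustration, the any-vs-all semantics of the file filter could be
sketched in Python like this (the function and parameter names here are
hypothetical, not the bot's actual internals):

```python
import re

def file_filter_matches(file_regexp, changed_files, match_all_files=False):
    """Return True if the regexp matches any changed file, or, with
    match_all_files, only if it matches every changed file."""
    check = all if match_all_files else any
    return check(re.search(file_regexp, f) for f in changed_files)

changed = ["skins/common/shared.css", "README"]
print(file_filter_matches(r"\.css$", changed))                        # True
print(file_filter_matches(r"\.css$", changed, match_all_files=True))  # False
```

This is the use case from the earlier thread: a reviewer who only wants
changes consisting entirely of .css files would set match_all_files.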

--Waldir

On Fri, Feb 15, 2013 at 7:42 AM, Waldir Pimenta wal...@email.com wrote:

 It's not critical to support diff content if commit messages can be used.
 In fact diffs would just be a nice-to-have feature, but the commit
 message ought to be much more informative anyway.

 As for a pull request, I'll give it a shot.

 --Waldir


 On Thu, Feb 14, 2013 at 8:31 PM, Merlijn van Deen valhall...@arctus.nlwrote:

 Hi Waldir,

 On 14 February 2013 02:21, Waldir Pimenta wal...@email.com wrote:
  Any chance the reviewer-bot can support additional triggers? For
 example,
  diff content, commit-message content, etc.

 Anything that is available from the changes REST api (see [1]) can be
 added with relative ease. This includes the commit message, but not
 the diff content (but it might be available in a newer Gerrit
 release).

 It would be possible to get the data from either the JSON-RPC api
 (with a high risk of breakage on new Gerrit deployments) or via git,
 but this would be a considerable effort.

  Also, it would be nice to specify whether the currently supported filters
  should apply to ANY of the files in the change (the current behavior) or
  ALL of the changed files.

 There is no fundamental reason why this would be impossible, but the
 syntax might quickly become complicated. It would require only a few
 lines of code and an extra parameter ('file_regexp_all'). The use case
 you described on IRC ('new reviewers who only want to review changes
 that only contain .css files') makes sense.


 I won't have time in the coming weeks to implement either one of
 those, though. Feel free to implement it yourself & submit a pull
 request. I have just added some functionality to easily test
 suggested reviewers from the command line. For more details, please
 see [2]. In any case, it's on my to-do list, and I'll get to it when I
 get to it ;-)


 Best,
 Merlijn


 [1]
 https://gerrit.wikimedia.org/r/Documentation/rest-api-changes.html#_get_changes_query_changes
 [2] https://github.com/valhallasw/gerrit-reviewer-bot


Re: [Wikitech-l] Gerrit reviewer bot update

2013-02-14 Thread Waldir Pimenta
It's not critical to support diff content if commit messages can be used.
In fact diffs would just be a nice-to-have feature, but the commit
message ought to be much more informative anyway.

As for a pull request, I'll give it a shot.

--Waldir

On Thu, Feb 14, 2013 at 8:31 PM, Merlijn van Deen valhall...@arctus.nlwrote:

 Hi Waldir,

 On 14 February 2013 02:21, Waldir Pimenta wal...@email.com wrote:
  Any chance the reviewer-bot can support additional triggers? For example,
  diff content, commit-message content, etc.

 Anything that is available from the changes REST api (see [1]) can be
 added with relative ease. This includes the commit message, but not
 the diff content (but it might be available in a newer Gerrit
 release).

 It would be possible to get the data from either the JSON-RPC api
 (with a high risk of breakage on new Gerrit deployments) or via git,
 but this would be a considerable effort.

  Also, it would be nice to specify whether the currently supported filters
  should apply to ANY of the files in the change (the current behavior) or
  ALL of the changed files.

 There is no fundamental reason why this would be impossible, but the
 syntax might quickly become complicated. It would require only a few
 lines of code and an extra parameter ('file_regexp_all'). The use case
 you described on IRC ('new reviewers who only want to review changes
 that only contain .css files') makes sense.


 I won't have time in the coming weeks to implement either one of
 those, though. Feel free to implement it yourself & submit a pull
 request. I have just added some functionality to easily test
 suggested reviewers from the command line. For more details, please
 see [2]. In any case, it's on my to-do list, and I'll get to it when I
 get to it ;-)


 Best,
 Merlijn


 [1]
 https://gerrit.wikimedia.org/r/Documentation/rest-api-changes.html#_get_changes_query_changes
 [2] https://github.com/valhallasw/gerrit-reviewer-bot


Re: [Wikitech-l] Gerrit reviewer bot update

2013-02-13 Thread Waldir Pimenta
Any chance the reviewer-bot can support additional triggers? For example,
diff content, commit-message content, etc.

Also, it would be nice to specify whether the currently supported filters
should apply to ANY of the files in the change (the current behavior) or
ALL of the changed files.

--Waldir

On Wed, Feb 13, 2013 at 8:19 PM, Merlijn van Deen valhall...@arctus.nlwrote:

 Hello all,

 With the upgrade of Gerrit, I have also taken the time to improve the
 Reviewer bot. For those who do not know: the reviewer bot is a tool
 that adds potential reviewers to new changesets, based on
 subscriptions [1].

 One of the problems we have encountered is the use of the SSH-based
 Gerrit change feed. This effectively requires 100% uptime to not miss
 any changesets. This has now been solved: we are now using the
 mediawiki-commits mailing list, and are reading the messages in a
 batched way (every five minutes).
 This change also makes development & testing much easier, which is a nice
 bonus.

 On the backend, there are two changes: instead of the JSON RPC api, we
 are now using the REST api, and we have moved hosting from the
 toolserver to translatewiki.net (thank you, Siebrand!)

 Best,
 Merlijn

 [1] http://www.mediawiki.org/wiki/Git/Reviewers


Re: [Wikitech-l] Minimalist MediaWiki? (was Re: Merge Vector extension into core)

2013-02-06 Thread Waldir Pimenta
On Thu, Feb 7, 2013 at 1:23 AM, MZMcBride z...@mzmcbride.com wrote:

 Ori Livneh wrote:
 On Wednesday, February 6, 2013 at 3:58 PM, Tim Starling wrote:
  I think we can turn MediaWiki into a fully featured wiki engine which
  can compete with the likes of Confluence.
 
 What would it take?

 * a proper graphical configuration interface.


I'll just use this to mention
https://www.mediawiki.org/wiki/Extension:Configure. It seems like
something that, if nurtured into a stable form, would add great value to
MediaWiki regarding the point quoted above.

Re: [Wikitech-l] Disambiguation features: Do they belong in core or in an extension?

2013-01-16 Thread Waldir Pimenta
On Wed, Jan 16, 2013 at 2:11 AM, Jon Robson jdlrob...@gmail.com wrote:

 To me disambiguation seems like a common problem of wikis and thus
 should be a core feature.

 On a wiki about people, people share the same name
 On a wiki about cities, cities share the same name
 etc etc you get the idea.


Agreed. Also, it only makes sense for MediaWiki to natively provide proper
support for disambiguations, the same way there is support for redirects.

Furthermore, I'd like to underline what Ryan said in his original message,
since several people seem to be ignoring it, and using code bloat as an
argument for using an extension:

 The code is pretty clean and lightweight, so it wouldn't increase the
 footprint of core MediaWiki (it would actually decrease the existing
 footprint slightly since it replaces more hacky existing core code). So *
 core bloat isn't really an issue*.


[Wikitech-l] Demo for Concise Wikipedia proposal

2012-12-19 Thread Waldir Pimenta
Hi all,

Since the Concise Wikipedia proposal[1] has been mentioned in the last two
Signpost editions[2][3] (and after being nudged by Sumana), I figured I'd
drop a note here in case anyone will be interested in trying out a demo I
set up to explore the idea, espoused by many in the proposal discussion,
that such a proposal should not be a separate project, but integrate with
Wikipedia's lead/introduction sections instead.

I called my demo Primerpedia (suggestions for better names welcome, see
[4]), and it can be accessed here: http://waldir.github.com/primerpedia

It uses the API to fetch the lead section of an article (currently only
loading random articles is implemented) and displays it in isolation, which
should provide a good way to test its expected self-contained, summarizing
properties, as defined in the MOS:LEAD guideline.
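For reference, a query for the plain-text lead of one random article can be
built against the TextExtracts API roughly as follows. This is a sketch of
the general approach, not Primerpedia's actual code, and the specific
parameter choices are my assumption:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def random_lead_url():
    """Build a TextExtracts request for the plain-text intro section of
    one random article (illustrative parameters, not Primerpedia's own)."""
    params = {
        "action": "query",
        "format": "json",
        "prop": "extracts",
        "exintro": 1,       # only the text before the first heading
        "explaintext": 1,   # strip HTML from the extract
        "generator": "random",
        "grnnamespace": 0,  # main (article) namespace only
        "grnlimit": 1,
    }
    return API + "?" + urlencode(params)

print(random_lead_url())
```

Fetching that URL returns a JSON response whose pages object carries the
extract, which a client can then render on its own.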

I am aware of several shortcomings in the current implementation and plan
to improve it further; any suggestions or issue reports are welcome at [5]
or directly on this thread for convenience.

--Waldir

1. https://meta.wikimedia.org/wiki/Concise_Wikipedia
2.
https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2012-12-03/Discussion_report
3.
https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2012-12-17/Discussion_report
4. https://github.com/waldir/primerpedia/wiki
5. https://github.com/waldir/primerpedia/issues