Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Daniel Friesen
On Sun, 12 May 2013 21:26:42 -0700, Moriel Schottlender mor...@gmail.com  
wrote:



Hello everyone,

I'd like to get your opinions and critique on my very first MediaWiki
extension, which, I hope, will be helpful to other developers.
...
The extension is available on GitHub:
https://github.com/mooeypoo/MediaWiki-ExtensionStatus along with
screenshots and a short possible todo list.


* Your RL module should have a remote path in addition to a local one; it's
  also proper to declare the media type for your CSS. You should also drop
  the `@CHARSET UTF-8`.
* The wfMessage calls should probably be $this->msg() instead.
* i18n messages are generally used for the opening and closing parentheses,
  iirc.

* You want $this->getLanguage(), not $wgLang.
* Don't mess with the error_reporting setting in extension code.
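For the last two points, the context-aware pattern looks roughly like this (the class name and message key are illustrative, not taken from the actual extension):

```php
<?php
// Sketch only: names below (class, message key) are made up for illustration.
class SpecialExtensionStatus extends SpecialPage {
	public function execute( $par ) {
		$out = $this->getOutput();

		// Context-aware message lookup instead of the global wfMessage():
		$out->addWikiText( $this->msg( 'extensionstatus-intro' )->text() );

		// Context language instead of the global $wgLang:
		$lang = $this->getLanguage();
		$out->addHTML( htmlspecialchars(
			$lang->timeanddate( wfTimestampNow(), true )
		) );
	}
}
```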


...
Please let me know what you think!

Moriel
(mooeypoo)


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Moriel Schottlender
On Mon, May 13, 2013 at 2:21 AM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:

 * Your RL module should have a remote path in addition to a local one; it's
   also proper to declare the media type for your CSS. You should also drop
   the `@CHARSET UTF-8`.
 * The wfMessage calls should probably be $this->msg() instead.
 * i18n messages are generally used for the opening and closing parentheses,
   iirc.
 * You want $this->getLanguage(), not $wgLang.
 * Don't mess with the error_reporting setting in extension code.


Thanks a lot Daniel! Much appreciated.

I also noticed I should go through the style guide to make sure I follow the
proper conventions (spaces, loops, etc.). I'm going to go over it all in the
next couple of days.

One quick question -- I'm not sure I understand your first comment. What do
you mean by RL module? A remote path where? Do you mean in the
$wgExtensionCredits[ 'specialpage' ][] array?

Thanks again!

Moriel

-- 
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Yury Katkov
I very much like the idea of this extension: it's very useful to know
what changes have been made in the software one is using. Does it only
work with WMF's gerrit? Many extensions are not hosted there. I think
that for those you could just compare the version number of the extension
with its current version number on mediawiki.org.
-
Yury Katkov, WikiVote



On Mon, May 13, 2013 at 8:26 AM, Moriel Schottlender mor...@gmail.com wrote:
 Hello everyone,

 I'd like to get your opinions and critique on my very first MediaWiki
 extension, which, I hope, will be helpful to other developers.

 I noticed that there's no easy way of seeing if extensions that we have
 installed on our MediaWiki require an update, and there are even some
 extensions that get so little regular attention that when they do get
 updated, there's no real way of knowing (unless we check specifically).

 It can be annoying when encountering problems or bugs, and then discovering
 that one of our extensions (probably the one we least expected) actually
 has an update that fixed this bug.

 So, I thought to try and solve this issue with my extension. Since
 MediaWiki's changes are submitted through gerrit, I thought I'd take
 advantage of that and perform a remote check to see if there are any new
 commits that appeared in any of the extensions since they were installed.

 How it works, briefly: the system compares the local repository date to the
 list of latest commits on gerrit's repo for the extension to see how many
 commits a user is behind on. If the user doesn't have a local git for the
 extension (or if they downloaded the extension manually) the system falls
 back to testing the local modification date for the files. It's not
 perfect, but it can give people a general idea of whether or not their
 extensions need some TLC.
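The "how many commits behind" part of that check can be sketched with a toy pair of repositories (all paths and repo layout are invented for the demo; the real extension talks to gerrit rather than a second local clone):

```shell
# Toy demonstration of the comparison described above.
# "remote" stands in for the gerrit-hosted extension repo.
set -e
tmp=$(mktemp -d)

git init -q "$tmp/remote"
git -C "$tmp/remote" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m 'first commit'

# "local" is the wiki's checkout, cloned before a newer commit lands upstream
git clone -q "$tmp/remote" "$tmp/local"
git -C "$tmp/remote" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m 'second commit'

# Reduce the comparison to commit counts (the extension compares dates;
# counting commits answers the same "how far behind" question here)
local_n=$(git -C "$tmp/local" rev-list --count HEAD)
remote_n=$(git -C "$tmp/remote" rev-list --count HEAD)
echo "commits behind: $((remote_n - local_n))"
```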

 The extension is available on GitHub:
 https://github.com/mooeypoo/MediaWiki-ExtensionStatus along with
 screenshots and a short possible todo list.

 There's a list of things I plan to try and improve, some of them are meant
 to make the lives of newbie developers (like me!) easier, but I'd love it
 if I could get feedback from you all and see if you think this could be
 helpful to you.
 Would you like to see anything else in it? Do you think I'm in the right
 direction or am I doing it all wrong?

 Be merciless!
 Okay, maybe not *completely* merciless, but please don't hold back. This is
 my very first extension, and beyond wanting to make this a good extension,
 I also want to get a sense of whether or not I got into MW development
 right, and if there is anything I should have done (or be doing)
 differently.

 Please let me know what you think!

 Moriel
 (mooeypoo)



 --
 No trees were harmed in the creation of this post.
 But billions of electrons, photons, and electromagnetic waves were terribly
 inconvenienced during its transmission!

[Wikitech-l] GSoC - Prototype for jQuery.IME extensions for Firefox and Chrome

2013-05-13 Thread Praveen Singh
Hi,

I have made prototype extensions for Firefox and Chrome related to this
project.
These extensions port jQuery.IME completely to the client side and can be
installed in the browser.
These extensions would allow users to use multilingual input methods on any
website, not just on MediaWiki-powered websites.

Source code and installation instructions are available at:
https://github.com/pravee-n/prototype.jquery.ime

Let me know if you have any queries. I would love to get some suggestions
regarding how to improve these extensions.

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Daniel Friesen
On Sun, 12 May 2013 23:35:12 -0700, Moriel Schottlender mor...@gmail.com  
wrote:



On Mon, May 13, 2013 at 2:21 AM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:
Thanks a lot Daniel! Much appreciated.

I also noticed I should go through the style guide to make sure I follow the
proper conventions (spaces, loops, etc.). I'm going to go over it all in the
next couple of days.

One quick question -- I'm not sure I understand your first comment. What do
you mean by RL module? A remote path where? Do you mean in the
$wgExtensionCredits[ 'specialpage' ][] array?


In $wgResourceModules['ext.ExtensionStatus'] you need to set a remote
path in addition to the local one. Otherwise ResourceLoader (RL) will point
things to the wrong path in debug mode.
Since you're in an extension you can use 'remoteExtPath' => 'ExtensionStatus'.
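A sketch of the registration Daniel describes (file names are illustrative; 'remoteExtPath' and the per-file 'media' key are standard ResourceLoader options):

```php
<?php
$wgResourceModules['ext.ExtensionStatus'] = array(
	// Declare the media type for the stylesheet instead of leaving it implicit
	'styles' => array(
		'ext.extensionStatus.css' => array( 'media' => 'screen' ),
	),
	// Local path for the server; remote path so ResourceLoader can build
	// correct URLs in debug mode
	'localBasePath' => dirname( __FILE__ ),
	'remoteExtPath' => 'ExtensionStatus',
);
```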



Thanks again!

Moriel




--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Moriel Schottlender
On Mon, May 13, 2013 at 2:44 AM, Yury Katkov katkov.ju...@gmail.com wrote:

 I very much like the idea of this extension: it's very useful to know
 what changes have been made in the software one is using. Does it only
 work with WMF's gerrit? Many extensions are not hosted there. I think
 that for those you could just compare the version number of the extension
 with its current version number on mediawiki.org.


Thanks for the comment, Yury!

For the moment it's only using gerrit for the extension repos, but the
plan is to add more options once the extension is stable and has passed
some review of its value and the strategy I'm using. It's definitely
planned.

I was actually thinking of adding an alternative repo location option for
extensions that can't be found on gerrit, so users can point to the
repository themselves.

I didn't think about comparing the versions; I was thinking of checking
whether there are new revisions in the commit history -- but this is a good
idea. I'll look into that.
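A version comparison along those lines could be as small as this (the version strings are stand-ins; fetching the remote value from mediawiki.org is left out):

```php
<?php
// Stand-in values; in practice $local would come from $wgExtensionCredits
// and $remote from a lookup against the extension's page on mediawiki.org.
$local  = '1.0.0';
$remote = '1.1.0';

if ( version_compare( $local, $remote, '<' ) ) {
	echo "A newer version ($remote) appears to be available.\n";
}
```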

Thanks!


Moriel

-- 
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Markus Glaser
Hi Moriel,

I like that idea very much. In the use case I have in mind, though, I do have 
actual releases. Do you think it's possible for your extension to also consider 
tags? I am thinking of something like a tagging convention, e.g. "RELEASE 
v1.20". ExtensionStatus could then parse the tag and send it back to the user.

Best,
Markus
(mglaser)


-----Original Message-----
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Moriel 
Schottlender
Sent: Monday, 13 May 2013 06:27
To: wikitech-l@lists.wikimedia.org
Subject: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

Hello everyone,

I'd like to get your opinions and critique on my very first MediaWiki 
extension, which, I hope, will be helpful to other developers.

I noticed that there's no easy way of seeing if extensions that we have 
installed on our MediaWiki require update, and there are even some extensions 
that get so little regular attention that when they do get updated, there's no 
real way of knowing it (unless we check specifically).

It can be annoying when encountering problems or bugs, and then discovering 
that one of our extensions (probably the one we least expected) actually has an 
update that fixed this bug.

So, I thought to try and solve this issue with my extension. Since MediaWiki's 
changes are submitted through gerrit, I thought I'd take advantage of that and 
perform a remote check to see if there are any new commits that appeared in any 
of the extensions since they were installed.

How it works, briefly: the system compares the local repository date to the 
list of latest commits on gerrit's repo for the extension to see how many 
commits a user is behind on. If the user doesn't have a local git for the 
extension (or if they downloaded the extension manually) the system falls back 
to testing the local modification date for the files. It's not perfect, but it 
can give people a general idea of whether or not their extensions need some TLC.

The extension is available on GitHub:
https://github.com/mooeypoo/MediaWiki-ExtensionStatus along with screenshots 
and a short possible todo list.

There's a list of things I plan to try and improve, some of them are meant to 
make the lives of newbie developers (like me!) easier, but I'd love it if I 
could get feedback from you all and see if you think this could be helpful to 
you.
Would you like to see anything else in it? Do you think I'm in the right 
direction or am I doing it all wrong?

Be merciless!
Okay, maybe not *completely* merciless, but please don't hold back. This is my 
very first extension, and beyond wanting to make this a good extension, I also 
want to get a sense of whether or not I got into MW development right, and if 
there is anything I should have done (or be doing) differently.

Please let me know what you think!

Moriel
(mooeypoo)



--
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly 
inconvenienced during its transmission!

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Moriel Schottlender
On Mon, May 13, 2013 at 4:06 AM, Markus Glaser gla...@hallowelt.biz wrote:

 I like that idea very much. In the use case I have in mind, though, I do
 have actual releases. Do you think it's possible for your extension to also
 consider tags? I am thinking of something like a tagging convention, e.g.
 "RELEASE v1.20". ExtensionStatus could then parse the tag and send it
 back to the user.


Hi Markus,

Absolutely, I wish it were already a convention. I created the 'read the
remote git' approach to get around that problem. I could, however, set the
system to first check a release tag and then fall back to testing
dates/commits as it does now if the release tag is unavailable.

The problem with tags, though, is that we would need some common
location that keeps the newest release on record so the extension can then
compare the local tag against a remote update. I believe this is what's
done in systems like WordPress and Drupal, but their extension database
system is completely different, too, and I don't think it fits MW at all.
For that matter, their extensions are more 'individual-based' rather than
community-collaborative; that won't work here.

It would also require extension updaters/developers to update those tags.
I think it's fairly easy to add a local vs. remote tag comparison; the
question is how it can be implemented as a convention so that all
extensions end up following it. Is that realistic?
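Under such a convention, the local-vs-remote tag check could look like this toy sketch (tag names and repo layout are invented; `git ls-remote` means the remote side never needs a clone):

```shell
# Toy sketch of a release-tag comparison; "upstream" stands in for the
# canonical extension repo.
set -e
tmp=$(mktemp -d)

git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m 'release 1.19'
git -C "$tmp/upstream" tag v1.19
git -C "$tmp/upstream" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m 'release 1.20'
git -C "$tmp/upstream" tag v1.20

# local wiki checked out at the older release
git clone -q "$tmp/upstream" "$tmp/local"
git -C "$tmp/local" checkout -q v1.19

local_tag=$(git -C "$tmp/local" describe --tags --abbrev=0)
# ls-remote lists the published tags without cloning anything
newest=$(git ls-remote --tags "$tmp/upstream" \
    | sed 's|.*refs/tags/||' | sort -V | tail -n1)
echo "local=$local_tag newest=$newest"
```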


-- 
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Brian Wolff
On 2013-05-13 5:16 AM, Moriel Schottlender mor...@gmail.com wrote:

 On Mon, May 13, 2013 at 4:06 AM, Markus Glaser gla...@hallowelt.biz
wrote:

  I like that idea very much. In the use case I have in mind, though, I do
  have actual releases. Do you think it's possible for your extension to
also
  consider tags? I am thinking of something like a tagging convention,
e.g.
  RELEASE v1.20. ExtensionStatus could then parse the tag and send it
  back to the user.
 

 Hi Markus,

 Absolutely, I wish it was already a convention. I created the 'read the
 remote git' to go around that problem. I could, however, set the system to
 first check a release tag and then fall back to testing dates/commits like
 it does now if the release tag is unavailable.

 The problem with tags, though, is that we will need to have some common
 location that keeps the newest release on record so the extension can then
 compare the local tag against a remote update. I believe this is what's
 done in systems like Wordpress and Drupal, but their extension database
 system is completely different, too, and I don't think it fits MW at all.
 For that matter, their extensions are more 'individual-based' rather than
 collaborative to the community, that won't work here.

 It will also require extension updaters/developers to update those tags.
 I think it's fairly easy to add a local vs remote tag comparison, the
 question is how it can be implemented in terms of convention so all
 extensions end up following it. Is it realistic?


 --
 No trees were harmed in the creation of this post.
 But billions of electrons, photons, and electromagnetic waves were
terribly
 inconvenienced during its transmission!

We actually do auto-create tags (REL1_XX) whenever we do a release; they
correspond to the version of the extension included in the installer (if it
is included) and are used by Special:ExtensionDistributor. How much the tags
get updated varies by extension -- many aren't really updated, but some are.

-bawolff

Re: [Wikitech-l] GSOC - 2013, Project Help

2013-05-13 Thread Andre Klapper
Hi Vishal,

On Sat, 2013-05-11 at 06:20 +, hungers.to.nurt...@gmail.com wrote:
 This is my project http://www.mediawiki.org/wiki/User:Making-it-yours
 for GSoC 2013, which I hope will interest you.

Dozens of projects were introduced on this list in the last weeks, so
I'd appreciate it if you could include a summary in the email subject
instead of making me click your link to find out whether your topic
interests me or not.

 Recently, I encountered an analogy for my project which I wish to
 discuss with you so that I can persuade you. Head to
 https://play.google.com/store/books and select a free book (say, Pride
 and Prejudice). Select some text and see how elegantly we can do the
 kind of operations I wish Wikipedia had.

Again, I'd very much appreciate it if you could describe *in words* what
you refer to, instead of making me click and only get a "Sorry, the Books
category on Google Play is not yet available in your country."

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



[Wikitech-l] [GSoC 2013] Wikidata Entity Suggester prototype

2013-05-13 Thread Nilesh Chakraborty
Hi everyone,

I'm working on a prototype for the Wikidata Entity Suggester (bug 46555:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46555).
As of now, it is a command-line client, completely written in Java, that
fetches recommendations from a Myrrix server layer.

Please take a look at the GitHub repository here:
https://github.com/nilesh-c/wikidata-entity-suggester/
I would really appreciate it if you can take the time to go through the
README and provide me with some much-needed feedback. Any questions or
suggestions are welcome. If you're curious, you can set up the whole thing
on your own machine.

Check out a few examples too:
https://github.com/nilesh-c/wikidata-entity-suggester/wiki/Examples

It can suggest properties and values for new, not-yet-created items (and
for existing items as well), if it's given a few properties/values as
input data.

I intend to write a REST API and/or a simple PHP frontend for it before I
set it up on a remote VPS, so that everyone can test it out. Some
experimentation and quality optimization is also due.

Cheers,
Nilesh
(User Page - https://www.mediawiki.org/wiki/User:Nilesh.c)

-- 
A quest eternal, a life so small! So don't just play the guitar, build one.
You can also email me at cont...@nileshc.com or visit my
website: http://www.nileshc.com/

Re: [Wikitech-l] Bugzilla Weekly Report

2013-05-13 Thread Željko Filipin
On Mon, May 13, 2013 at 5:00 AM, reporter repor...@kaulen.wikimedia.org wrote:

 General/Unknown 12
 Site requests   12
 General/Unknown 11


General/Unknown is listed twice.

Željko

Re: [Wikitech-l] Bugzilla Weekly Report

2013-05-13 Thread Brian Wolff
On 2013-05-13 7:21 AM, Željko Filipin zfili...@wikimedia.org wrote:

 On Mon, May 13, 2013 at 5:00 AM, reporter repor...@kaulen.wikimedia.org
wrote:

  General/Unknown 12
  Site requests   12
  General/Unknown 11
 

 General/Unknown is listed twice.

 Željko

With different results each time ;)

Presumably they represent General/Unknown components from two different
products.

-bawolff

Re: [Wikitech-l] [GSoC 2013] Wikidata Entity Suggester prototype

2013-05-13 Thread Denny Vrandečić
That's awesome!

Two things:
* how set are you on a Java-based solution? We would prefer PHP in order to
make it more likely to be deployed.
* could you provide a link to a running demo?

Cheers,
Denny



2013/5/13 Nilesh Chakraborty nil...@nileshc.com

 Hi everyone,

 I'm working on a prototype for the Wikidata Entity Suggester (Bug
 #46555https://bugzilla.wikimedia.org/show_bug.cgi?id=46555).
 As of now, it is a command-line client, completely written in Java, that
 fetches recommendations from a Myrrix server layer.

 Please take a look at the GitHub repository here:
 https://github.com/nilesh-c/wikidata-entity-suggester/
 I would really appreciate it if you can take the time to go through the
 README and provide me with some much-needed feedback. Any questions or
 suggestions are welcome. If you're curious, you can set up the whole thing
 on your own machine.

 Check out a few examples too:
 https://github.com/nilesh-c/wikidata-entity-suggester/wiki/Examples

 It can suggest properties and values for new/not-yet-created items (and
 also currently present items), if it's given a few properties/values as
 input data.

 I intend to write a REST API and/or a simple PHP frontend for it before I
 set it up on a remote VPS, so that everyone can test it out. Some
 experimentation and quality optimization is also due.

 Cheers,
 Nilesh
 (User Page - https://www.mediawiki.org/wiki/User:Nilesh.c)

 --
 A quest eternal, a life so small! So don't just play the guitar, build one.
 You can also email me at cont...@nileshc.com or visit my
 websitehttp://www.nileshc.com/




-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/681/51985.

Re: [Wikitech-l] Bugzilla Weekly Report

2013-05-13 Thread Željko Filipin
Fresh charts:

http://www.mediawiki.org/wiki/Bugzilla_Weekly_Report

On Mon, May 13, 2013 at 5:00 AM, reporter repor...@kaulen.wikimedia.org wrote:

 Top 5 bug report closers
 ..
 jrobson [AT] wikimedia.org  17
 mmullie [AT] wikimedia.org  17


jrobson and mmullie now appear in the "Top 5 bug report closers" chart.

Željko

Re: [Wikitech-l] [GSoC 2013] Wikidata Entity Suggester prototype

2013-05-13 Thread Nilesh Chakraborty
Thank you! :)

Simply put, there are two prime components here:
1. The recommendation engine (Myrrix; you need to run a .jar file as a
daemon -- done. Easier than deploying Lucene.)
2. The recommendation client (Myrrix has a rich Java API, which my current
code uses to provide recommendations. The actual client-side code using the
Java API is less than 150 LOC, minus a couple of classes that are injected
into the Myrrix daemon.)

Now I have three options:
i) Write a PHP wrapper over (2).
ii) Expose (2) as a REST-based API which can easily be used from PHP
code.
iii) Completely replace (2) with PHP code.

(i) and (ii) are feasible options, but (iii) would mean rewriting quite a
large amount of code/functionality, already present in the Java API, in PHP.
And I can't see any gains from going with (iii), since it wouldn't really
help deployment any more than (i) or (ii).
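For option (ii), the PHP side could stay very thin -- something like the sketch below, where the endpoint URL and the response shape are entirely hypothetical:

```php
<?php
// Hypothetical REST endpoint exposed by the Java/Myrrix client; nothing
// here exists yet -- it only shows how little PHP option (ii) would need.
$query = http_build_query( array( 'properties' => 'P31|P279' ) );
$json = file_get_contents( 'http://localhost:8080/suggest?' . $query );

foreach ( json_decode( $json, true ) as $suggestion ) {
	echo $suggestion['property'] . "\t" . $suggestion['score'] . "\n";
}
```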

I am a bit busy with my university exams; I will try to deploy this on a
VPS, update the repo with some PHP code and a link to the demo, and share
it here in a couple of days.

Cheers,
Nilesh



On Mon, May 13, 2013 at 4:02 PM, Denny Vrandečić 
denny.vrande...@wikimedia.de wrote:

 That's awesome!

 Two things:
 * how set are you on a Java-based solution? We would prefer PHP in order to
 make it more likely to be deployed.
 * could you provide a link to a running demo?

 Cheers,
 Denny



 2013/5/13 Nilesh Chakraborty nil...@nileshc.com

  Hi everyone,
 
  I'm working on a prototype for the Wikidata Entity Suggester (Bug
  #46555https://bugzilla.wikimedia.org/show_bug.cgi?id=46555).
  As of now, it is a command-line client, completely written in Java, that
  fetches recommendations from a Myrrix server layer.
 
  Please take a look at the GitHub repository here:
  https://github.com/nilesh-c/wikidata-entity-suggester/
  I would really appreciate it if you can take the time to go through the
  README and provide me with some much-needed feedback. Any questions or
  suggestions are welcome. If you're curious, you can set up the whole
 thing
  on your own machine.
 
  Check out a few examples too:
  https://github.com/nilesh-c/wikidata-entity-suggester/wiki/Examples
 
  It can suggest properties and values for new/not-yet-created items (and
  also currently present items), if it's given a few properties/values as
  input data.
 
  I intend to write a REST API and/or a simple PHP frontend for it before I
  set it up on a remote VPS, so that everyone can test it out. Some
  experimentation and quality optimization is also due.
 
  Cheers,
  Nilesh
  (User Page - https://www.mediawiki.org/wiki/User:Nilesh.c)
 
  --
  A quest eternal, a life so small! So don't just play the guitar, build
 one.
  You can also email me at cont...@nileshc.com or visit my
  websitehttp://www.nileshc.com/




 --
 Project director Wikidata
 Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
 Tel. +49-30-219 158 26-0 | http://wikimedia.de

 Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
 Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
 der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
 Körperschaften I Berlin, Steuernummer 27/681/51985.




-- 
A quest eternal, a life so small! So don't just play the guitar, build one.
You can also email me at cont...@nileshc.com or visit my
website: http://www.nileshc.com/

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Aarti K. Dwivedi
Hi,

   The idea is obviously good and useful. In OPW round 5 (
http://www.mediawiki.org/wiki/Outreach_Program_for_Women/Round_5 ) there
was a project for pulling files from a git repository. A part of the
project dealt with updating the files. The approach (checking for the last
commit id or modification date) could be similar. Just telling you that
similar work has been done on a smaller scale before.

Cheers,
Rtdwivedi


On Mon, May 13, 2013 at 1:52 PM, Brian Wolff bawo...@gmail.com wrote:

 On 2013-05-13 5:16 AM, Moriel Schottlender mor...@gmail.com wrote:
 
  On Mon, May 13, 2013 at 4:06 AM, Markus Glaser gla...@hallowelt.biz
 wrote:
 
   I like that idea very much. In the use case I have in mind, though, I
 do
   have actual releases. Do you think it's possible for your extension to
 also
   consider tags? I am thinking of something like a tagging convention,
 e.g.
   RELEASE v1.20. ExtensionStatus could then parse the tag and send it
   back to the user.
  
 
  Hi Markus,
 
  Absolutely, I wish it was already a convention. I created the 'read the
  remote git' to go around that problem. I could, however, set the system
 to
  first check a release tag and then fall back to testing dates/commits
 like
  it does now if the release tag is unavailable.
 
  The problem with tags, though, is that we will need to have some common
  location that keeps the newest release on record so the extension can
 then
  compare the local tag against a remote update. I believe this is what's
  done in systems like Wordpress and Drupal, but their extension database
  system is completely different, too, and I don't think it fits MW at all.
  For that matter, their extensions are more 'individual-based' rather than
  collaborative to the community, that won't work here.
 
  It will also require extension updaters/developers to update those tags.
  I think it's fairly easy to add a local vs remote tag comparison, the
  question is how it can be implemented in terms of convention so all
  extensions end up following it. Is it realistic?
 
 
  --
  No trees were harmed in the creation of this post.
  But billions of electrons, photons, and electromagnetic waves were
 terribly
  inconvenienced during its transmission!

 We actually do autocreate tags (REL1_XX) whenever we do a release that
 corresponds to the version of extension included in the installer (if it is
 included) and for use by special:extensiondistributor. How much people
 update the tags varies by extension, with many not really being updated
 but some are.

 -bawolff




-- 
Aarti K. Dwivedi

[Wikitech-l] Quo Vadis, Vagrant

2013-05-13 Thread Ori Livneh
I'm writing to report on progress with Mediawiki-Vagrant, and to tell
you a bit about how it could be useful to you. If you've looked at
MediaWiki-Vagrant in the past and thought, 'big deal -- I know how to
configure MediaWiki', this e-mail is for you.

But first, a small snippet to whet your appetite:

    # == Class: role::mobilefrontend
    # Configures MobileFrontend, the MediaWiki extension which powers
    # Wikimedia mobile sites.
    class role::mobilefrontend {
        include role::mediawiki
        include role::eventlogging

        mediawiki::extension { 'MobileFrontend':
            settings => {
                wgMFForceSecureLogin => false,
                wgMFLogEvents        => true,
            }
        }
    }

MW-V has evolved to become a highly organized and consistent Puppet
code base for describing MediaWiki development environments. Puppet,
you'll recall, is the same configuration management and software
automation tool that TechOps uses to run the cluster. Puppet provides
a domain-specific language for articulating software configuration in
a declarative way. You tell Puppet what resources are configured on
your machine and what relationships inhere among them, and Puppet
takes your description and executes it.

MW-V uses Puppet to automate the configuration of MediaWiki & related
bits of software. But MW-V goes a bit further, too: it exploits the
flexibility of Puppet's syntax and semantics to give MediaWiki
developers a set of affordances for describing their own development
setup. The description takes the concrete form of a Puppet 'role'.
These can be submitted as patches against the MW-V repo and thus
shared with others. The combination of Vagrant / Puppet / VirtualBox
makes it quite easy to select and swap machine roles.

Roles can be toggled by adding or removing a line, 'include
role::name', from puppet/manifests/site.pp. The current set of roles
is listed here:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/vagrant.git;a=blob;f=puppet/manifests/roles.pp

They include a VisualEditor role that is powered by a local Parsoid
instance, and roles for Selenium testing, Echo, MobileFrontend,
GettingStarted, and EventLogging.

If you're interested in checking it out:
0. Delete any old instances.
1. Download and install VirtualBox: https://www.virtualbox.org/wiki/Downloads
2. Download and install Vagrant: http://downloads.vagrantup.com/tags/v1.2.2
3. git clone https://gerrit.wikimedia.org/r/p/mediawiki/vagrant.git
4. Edit puppet/manifests/site.pp and uncomment the role you want to check out.
5. Run 'vagrant up' from the repository's root directory.
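Step 4 is the only manual edit, and it can also be scripted. A sketch (the role name is illustrative, and this assumes GNU sed and a commented-out include line):

```shell
# Scratch file standing in for puppet/manifests/site.pp (contents illustrative)
cat > /tmp/site.pp <<'EOF'
include role::mediawiki
# include role::mobilefrontend
EOF

# Enable a role by uncommenting its include line
sed -i 's/^# *include role::mobilefrontend$/include role::mobilefrontend/' /tmp/site.pp

cat /tmp/site.pp
```

After that, 'vagrant up' (or 'vagrant provision' on an already-running VM) applies the newly enabled role.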

Finally: wait! The first run takes an obnoxiously long time (15-20
minutes). Subsequent runs are much faster (~5 seconds if there are no
major configuration changes.)

The documentation on MediaWiki.org is still meager, but it is
concentrated here: http://www.mediawiki.org/wiki/Mediawiki-Vagrant

If you're interested in using MW-V to document and automate your
development setup, get in touch -- I'd be happy to help. I'm 'ori-l'
on IRC, and I'll be at the Amsterdam Hackathon later this month too.

Finally, if you're interested in how this relates to Labs (short
answer: the use-cases are largely complementary), Andrew Bogott and I
have an open submission for a Wikimania talk on this subject:
https://wikimania2013.wikimedia.org/wiki/Submissions/Turnkey_Mediawiki_Test_Platforms:_Vagrant_and_Labs


Re: [Wikitech-l] How to get permission to view deleted revision information via API?

2013-05-13 Thread Claudia Müller-Birn
On May 8, 2013, at 3:46 PM, Chad innocentkil...@gmail.com wrote:
 
 For a personal wiki, this means you need the 'deletedhistory' and 
 'deletedtext'
 permissions (which is by default assigned to the admin group).
 
 On WMF wikis, this requires being an administrator (although I think there's
 some exception for researchers handed out via Meta).
 
 -Chad

Hi Chad,

Thank you very much for your information. Your reply came so fast that I 
somehow missed it. Sorry. 

Since this question occurred within a research project, I will try to apply for
researcher access.

Claudia 
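For a personal wiki, the grant Chad describes is a couple of lines in LocalSettings.php; a sketch (the 'researcher' group name is illustrative, not a stock group):

```php
// LocalSettings.php sketch: let members of a custom group view deleted
// revisions ('researcher' is an illustrative group name, not a default one).
$wgGroupPermissions['researcher']['deletedhistory'] = true;
$wgGroupPermissions['researcher']['deletedtext'] = true;
```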

 


Re: [Wikitech-l] [IRC - bot] wm-bot outage

2013-05-13 Thread Petr Bena
ok so after almost 2 days of recovering I am getting somewhere... so
here is a list of things that were lost and I am unable to recover
them:

* changes to access lists for all channels since 1 February 2013
* statistics for all channels where they were enabled since 1 February 2013
* seen data since 1 February 2013
* infobot metadata (names of users who created a key etc.) (for
#mediawiki completely gone; for other channels all key data since 1
February is gone)
* configuration changes of all channels

here is the list of things that were somewhat recovered or need
additional recovery, but where recovery is possible

* list of all removed / added channels since 1 February 2013
(channels that were added after 1 February have empty access lists, so
should you need to regain access to them, just poke me on irc - petan)
* infobot data (not including metadata) - raw keys and raw aliases.
They were preserved thanks to html dumps that live on different
physical storage. I copied all of them to
http://bots.wmflabs.org/~wm-bot/db/backup/ - there is a way to convert
these html dumps back to the xml format they are stored in, but it's long
and painful. So far I have recovered only the largest DB, that of #mediawiki,
which is itself far larger than all the other DBs together. Given that most
other channels contain only a few keys, I decided not to recover them.
You can always copy-paste the keys from the preserved html back into the bot.
* RC changes
* RSS changes


Right now the bot (wm-bot) is running ONLY with the logging module, so
it can only log channels where logging was enabled; all other
functions are disabled. I am still working on recovery, but it
/should/ be back fully operational today.


On Sun, May 12, 2013 at 10:59 AM, Petr Bena benap...@gmail.com wrote:
 Hi, I got 2 news, one good and one bad

 Good one is that since today anybody can freely call me idiot as long you 
 want.

 The bad one is that I /accidentally/ rm -r'ed system folder of wm-bot.
 That doesn't mean logs are gone, they were in data folder which is
 unaffected (and backed up every day). System folder contains mostly
 only configuration and stuff that I would never expect to get
 accidentally rm'ed, and last external backup I have is 3 months old.

 So... wm-bot will be down for some time (hopefully several hours at most)
 until I manually rewrite all the missing configuration files (200+ files).

 I apologize for all issues caused by this, I will also try to start up
 some minimal version asap that will only perform the logging of
 channels.


Re: [Wikitech-l] [IRC - bot] wm-bot outage

2013-05-13 Thread Petr Bena
wm-bot is back as it used to be. Should there be any problem with
anything - poke me on irc, nick petan

On Mon, May 13, 2013 at 4:16 PM, Petr Bena benap...@gmail.com wrote:
 ok so after almost 2 days of recovering I am getting somewhere... so
 here is a list of things that were lost and I am unable to recover
 them:

 * changes to access lists for all channels since 1. February 2013
 * statistics for all channels where they were enabled since 1. February 2013
 * seen data since 1. February 2013
 * infobot metadata (names of users who created a key etc) (for
 #mediawiki completely gone, for other channels all key data since 1.
 February are gone)
 * configuration changes of all channels

 here is the list of things that were somewhat recovered or need
 additional recovery, but where recovery is possible

 * list of all removed / added channels since 1. February 2013
 (channels that were added after 1. February have empty access l. so
 should you need to regain access to them, just poke me on irc - petan)
 * infobot data (not including metadata) - raw keys and raw aliases.
 They were preserved thanks to html dumps that live on different
 physical storage. I copied all of them to
 http://bots.wmflabs.org/~wm-bot/db/backup/ there is a way to convert
 these html dumps back to xml format they are stored in - but it's long
 and painfull. So far I recovered only largest DB of #mediawiki which
 is itself far larger than all other db's together. Given that most of
 other channels contain only few keys, I decided not to recover them.
 You can always copy paste the keys from preserved html back to bot.
 * RC changes
 * RSS changes


 Right now the bot (wm-bot) is running ONLY with logging module - so
 that it only can log channels, where logging was enabled, all other
 functions are disabled and I am still working on recovery - but it
 /should/ be back fully operational today.


 On Sun, May 12, 2013 at 10:59 AM, Petr Bena benap...@gmail.com wrote:
 Hi, I got 2 news, one good and one bad

 Good one is that since today anybody can freely call me idiot as long you 
 want.

 The bad one is that I /accidentally/ rm -r'ed system folder of wm-bot.
 That doesn't mean logs are gone, they were in data folder which is
 unaffected (and backed up every day). System folder contains mostly
 only configuration and stuff that I would never expect to get
 accidentally rm'ed, and last external backup I have is 3 months old.

 So... wm-bot will be down for some time (hopefully max several hours
 until I manually rewrite all missing configuration files (200+ files)

 I apologize for all issues caused by this, I will also try to start up
 some minimal version asap that will only perform the logging of
 channels.


[Wikitech-l] Please Welcome Latest TechOps member - Alexandros Kosiaris

2013-05-13 Thread Ct Woo
 All,

We are excited to announce that Alexandros Kosiaris will join us this Monday
(2013-05-17) as a full-time member of the Technical Operations staff. He
will be based in Athens, Greece.

Alex (short for Alexandros) has 10 years of experience in systems
engineering, having worked as a senior systems engineer at GRNET NOC, the
Network Operations Center of the Greek Research and Education Network, and
at the National Technical University of Athens' Network Operations Center.

An avid user of open-source software for more than 13 years, he also tries
to contribute whenever he can. He maintains a number of FreeBSD ports plus
a couple of small open-source projects.

He lives in Athens, Greece. Aside from all things technical, he also enjoys
wind-surfing, skiing and basketball.

Alex will be in the San Francisco office this coming Monday, so please drop by
to welcome him!

Thanks,

CT Woo

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Mark A. Hershberger
On 05/13/2013 12:26 AM, Moriel Schottlender wrote:
 I'd like to get your opinions and critique on my very first MediaWiki
 extension, which, I hope, will be helpful to other developers.

I like this idea.  I wonder if you could work with the WikiApiary site
(http://wikiapiary.com/) as you adapt this to track releases rather than
just updates in Gerrit.

I've CC'd Jamie Thingelstad of WikiApiary in case he has any insights
into this.  Maybe an API is available that you could use.

Finally, have you thought of making it possible to update the extension
in-place like WordPress does?


-- 
http://hexmode.com/

Imagination does not breed insanity. Exactly what does breed insanity
is reason. Poets do not go mad; but chess-players do.
-- G.K. Chesterson


[Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Max Semenik
Hi, I've seen recently a lot of code like this:

$html = Html::openElement( 'div', array( 'class' => 'foo' ) )
    . Html::rawElement( 'p', array(),
        Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
            $somePotentiallyUnsafeText
        )
    )
    . Html::closeElement( 'div' );

IMO, cruft like this makes things harder to read and adds additional
performance overhead. It can be simplified to

$html = '<div class="foo'><p>'
    . Html::rawElement( 'p', array(),
        Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
            $somePotentiallyUnsafeText
        )
    )
    . '</p></div>';

What's your opinion, guys and gals?

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])



Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Daniel Barrett
I think your question answers itself, because you have a syntax error in your 
suggested HTML string:

  $html = '<div class="foo'><p>'
  ...

Use the Html class and you'll have fewer worries about malformed HTML.

WordPress code is full of hard-coded HTML strings and it's maddening to read 
and understand. Personally I'm thankful for MediaWiki's Html class to keep 
things orderly.

DanB

-Original Message-
From: wikitech-l-boun...@lists.wikimedia.org 
[mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Max Semenik
Sent: Monday, May 13, 2013 1:27 PM
To: Wikimedia developers
Subject: [Wikitech-l] Code style: overuse of Html::element()

Hi, I've seen recently a lot of code like this:

$html = Html::openElement( 'div', array( 'class' => 'foo' ) )
    . Html::rawElement( 'p', array(),
        Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
            $somePotentiallyUnsafeText
        )
    )
    . Html::closeElement( 'div' );

IMO, cruft like this makes things harder to read and adds additional 
performance overhead. It can be simplified to

$html = '<div class="foo'><p>'
    . Html::rawElement( 'p', array(),
        Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
            $somePotentiallyUnsafeText
        )
    )
    . '</p></div>';

What's your opinion, guys and gals?

--
Best regards,
  Max Semenik ([[User:MaxSem]])



Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Chris Steipp
On Mon, May 13, 2013 at 10:26 AM, Max Semenik maxsem.w...@gmail.com wrote:
 Hi, I've seen recently a lot of code like this:

 $html = Html::openElement( 'div', array( 'class' => 'foo' ) )
     . Html::rawElement( 'p', array(),
         Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
             $somePotentiallyUnsafeText
         )
     )
     . Html::closeElement( 'div' );

 IMO, cruft like this makes things harder to read and adds additional
 performance overhead. It can be simplified to

 $html = '<div class="foo'><p>'
     . Html::rawElement( 'p', array(),
         Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
             $somePotentiallyUnsafeText
         )
     )
     . '</p></div>';

 What's your opinion, guys and gals?

I'm probably a bad offender here, but you've unintentionally proved my
point ;). Note that in your example, you used a single instead of a
double quote after foo. Obviously, if you're using an IDE, syntax
highlighting would have helped you, but my point being that when you
use the classes, you're less likely to make those little mistakes that
could potentially have disastrous consequences (like using single
quotes around an entity and relying on htmlspecialchars for escaping,
etc). And for security, I prefer for people to use whatever will cause
the least amount of mistakes.

Personally also, when I'm code reviewing I don't like to see  in the
php, but that's my personal preference.


Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Tyler Romeo
Chris makes a good point. Also, it should be noted that the Html class does
a lot more than just escape stuff. It does a whole bunch of attribute
validation and standardization to make output HTML5-sanitary. While in
simple cases like the one above it will not make a difference, it is
probably better to maintain a uniform approach when generating HTML output.
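For illustration only, a tiny stand-in for what such helpers do on the escaping side (this is not MediaWiki's actual Html class, which also validates and normalizes attributes; the function here is a simplified sketch):

```php
<?php
// Simplified stand-in for an Html::element()-style helper: it builds the tag
// and runs both attribute values and text content through htmlspecialchars().
function element( $tag, array $attrs, $text ) {
    $html = '<' . $tag;
    foreach ( $attrs as $name => $value ) {
        $html .= ' ' . $name . '="' . htmlspecialchars( $value, ENT_QUOTES ) . '"';
    }
    return $html . '>' . htmlspecialchars( $text, ENT_QUOTES ) . '</' . $tag . '>';
}

// A hostile id and text come out inert:
echo element( 'span', array( 'id' => 'x"y' ), '<script>' ), "\n";
// -> <span id="x&quot;y">&lt;script&gt;</span>
```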

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com


On Mon, May 13, 2013 at 2:05 PM, Chris Steipp cste...@wikimedia.org wrote:

 On Mon, May 13, 2013 at 10:26 AM, Max Semenik maxsem.w...@gmail.com
 wrote:
  Hi, I've seen recently a lot of code like this:
 
  $html = Html::openElement( 'div', array( 'class' = 'foo' )
  . Html::rawElement( 'p', array(),
  Html::element( 'span', array( 'id' = $somePotentiallyUnsafeId ),
  $somePotentiallyUnsafeText
  )
  )
  . Html::closeElement( 'div' );
 
  IMO, cruft like this makes things harder to read and adds additional
  performance overhead. It can be simplified to
 
  $html = 'div class=foo'p'
  . Html::rawElement( 'p', array(),
  Html::element( 'span', array( 'id' = $somePotentiallyUnsafeId ),
  $somePotentiallyUnsafeText
  )
  )
  . '/p/div';
 
  What's your opinion, guys and gals?

 I'm probably a bad offender here, but you've unintentionally proved my
 point ;). Note that in your example, you used a single instead of a
 double quote after foo. Obviously, if you're using an IDE, syntax
 highlighting would have helped you, but my point being that when you
 use the classes, you're less likely to make those little mistakes that
 could potentially have disastrous consequences (like using single
 quotes around an entity and relying on htmlspecialchars for escaping,
 etc). And for security, I prefer for people to use whatever will cause
 the least amount of mistakes.

 Personally also, when I'm code reviewing I don't like to see  in the
 php, but that's my person preference.


[Wikitech-l] 1.21rc5 -- final release candidate before release on May 15

2013-05-13 Thread Mark A. Hershberger
The final release candidate for MediaWiki 1.21 (rc5) is available for
download.  See the end of this email for details.  There are a number of
bug fixes in this release that were not in rc4.  Please test this
version and report any problems.  Barring any show-stoppers, the final
release on May 15 will be the same as this, with just a change to
the version number and updated release notes.

Changes since rc4:

* Changes to JS/JSON value encoding were backed out.
* API: Fix parameter validation in setnotificationtimestamp
commit 18eb39ecb5b89518c3d4a6cc79834a2c121477ff
Author: Brad Jorsch bjor...@wikimedia.org
Date:   Fri Mar 15 14:03:19 2013 -0400

* (bug 47271) $wgContentHandlerUseDB should be set to false during the upgrade
* (bug 47489) Installer now automatically selects the next-best database type
  if the PHP mysql extension is not loaded, preventing fatal errors in some cases.
* (bug 47202) wikibits: FF2Fixes.css should not be loaded in Firefox 20.
* (bug 46590) 'AbortChangePassword' hook added/documented (find bug)
* (bug 46848) toggleLinkPremade added linksPassthru
* (bug 47950) Fix IndexPager detection of is first state
* (bug 47304) added safeXmlEncodings to UploadBase for svg uploads
* (bug 46608) CoreParserFunctions::anchorencode should return a string


* Full release notes:
https://www.mediawiki.org/wiki/Release_notes/1.21


**
Download:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-core-1.21.0rc5.tar.gz
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.0rc5.tar.gz

Patch to previous version (1.20.0), without interface text:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.0rc5.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-i18n-1.21.0rc5.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-core-1.21.0rc5.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.0rc5.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.0rc5.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-i18n-1.21.0rc5.patch.gz.sig

Public keys:
https://secure.wikimedia.org/keys.html



Re: [Wikitech-l] [IRC - bot] wm-bot outage

2013-05-13 Thread Ori Livneh
On Mon, May 13, 2013 at 9:06 AM, Petr Bena benap...@gmail.com wrote:

 wm-bot is back as it used to be. Should there be any problem with
 anything - poke me on irc, nick petan


Thanks! wm-bot is very useful, and your diligence in reporting the outage
and fixing it was excellent.

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Moriel Schottlender
Guys - thank you so much for the comments, I really appreciate it!

The future goal for this could very well be something that helps people
upgrade their extensions (as similar as possible to something like
WordPress, which is very newbie-friendly) and download the files directly
with or without git available.

I think the biggest challenge is the target audience here. I wanted to try
and make sure this extension works well for newbies who need to upgrade but
*also* for experienced users who might need to pull from the repositories.
Also, I'm a Windows user (I do all the gerrit stuff from a VM) so I aimed
towards compliance there too.

Aiming towards both crowds is tricky, but I think it's possible with your
feedback :)

I'm going to sit and write a to-do list and try to prioritize it. Now
that I know the extension is worthwhile, I might as well open a page for it
in MW and put it there, and let people contribute their thoughts here or in
its talk page.

(Quick disclaimer -- my finals are this week, and I do this in my 'break'
time, so please bear with me if I'm a tad slow on answering or submitting
improvements! Sadly, exams don't study themselves -- though that *could*
have been an awesome extension ;)


Please keep the comments coming! This is really helpful :)

Thanks,

Moriel



On Mon, May 13, 2013 at 12:58 PM, Mark A. Hershberger m...@everybody.org wrote:

 On 05/13/2013 12:26 AM, Moriel Schottlender wrote:
  I'd like to get your opinions and critique on my very first MediaWiki
  extension, which, I hope, will be helpful to other developers.

 I like this idea.  I wonder if you could work with the WikiApiary site
 (http://wikiapiary.com/) as you adapt this to track releases rather than
 just updates in Gerrit.

 I've CC'd Jamie Thingelstad of WikiApiary in case he has any insights
 into this.  Maybe an API is available that you could use.

 Finally, have you thought of making it possible to update the extension
 in-place like WordPress does?


 --
 http://hexmode.com/

 Imagination does not breed insanity. Exactly what does breed insanity
 is reason. Poets do not go mad; but chess-players do.
 -- G.K. Chesterson




-- 
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Thomas Gries
Am 13.05.2013 21:12, schrieb Moriel Schottlender:
 Guys - thank you so much for the comments, I really appreciate it!
Hi,
I just wanted to mention:

1.
This was merged into MediaWiki core on 16 April 2013:
https://gerrit.wikimedia.org/r/#/c/54986/
Add git HEAD date to Special:Version for core and extensions

2.
The following is pending, but perhaps better be solved by your and
DaSch's extension:
https://gerrit.wikimedia.org/r/#/c/59373/
(bug 47264 https://bugzilla.wikimedia.org/47264) Special:Version: add
last local update timestamp for core and extensions


Re: [Wikitech-l] [GSoC 2013] Wikidata Entity Suggester prototype

2013-05-13 Thread Matthew Flaschen
On 05/13/2013 06:11 AM, Nilesh Chakraborty wrote:
 Hi everyone,
 
 I'm working on a prototype for the Wikidata Entity Suggester (Bug #46555,
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46555).
 As of now, it is a command-line client, completely written in Java, that
 fetches recommendations from a Myrrix server layer.

Is all the Myrrix code you're using open source?  It looks like only the
Serving Layer is, but they also have a proprietary Computation Layer.

Matt Flaschen


Re: [Wikitech-l] [Wikitech] Please Welcome Latest TechOps member - Alexandros Kosiaris

2013-05-13 Thread Matthew Flaschen
On 05/13/2013 12:48 PM, Ct Woo wrote:
  All,
 
 We are excited to announce Alexandros Kosiaris will join us this Monday
  (2013-05-17) as a full-time member of the Technical Operations staff. 
 He will be based in Athens, Greece.

Welcome!

Matt Flaschen


Re: [Wikitech-l] [GSoC 2013] Wikidata Entity Suggester prototype

2013-05-13 Thread Nilesh Chakraborty
Hi Matt,

Yes, you're right, they are available as separately licensed downloads.
Only the stand-alone Serving Layer is needed for the Entity Suggester.
It's licensed under Apache v. 2.0. Since I'm using the software as-is,
without any code modifications, I suppose it's compatible with what
Wikidata would allow?

Given the amount of data in the data dump, we won't need to use a
Hadoop cluster with multiple machines. The proprietary Computation Layer
is only needed for heavy-weight distributed processing.

Cheers,
Nilesh



On Tue, May 14, 2013 at 1:48 AM, Matthew Flaschen
mflasc...@wikimedia.org wrote:

 On 05/13/2013 06:11 AM, Nilesh Chakraborty wrote:
  Hi everyone,
 
  I'm working on a prototype for the Wikidata Entity Suggester (Bug
  #46555https://bugzilla.wikimedia.org/show_bug.cgi?id=46555).
  As of now, it is a command-line client, completely written in Java, that
  fetches recommendations from a Myrrix server layer.

 Is all the Myrrix code you're using open source?  It looks like only the
 Serving Layer is, but they also have a proprietary Computation Layer.

 Matt Flaschen





-- 
A quest eternal, a life so small! So don't just play the guitar, build one.
You can also email me at cont...@nileshc.com or visit my website:
http://www.nileshc.com/

Re: [Wikitech-l] [GSoC 2013] Wikidata Entity Suggester prototype

2013-05-13 Thread Matthew Flaschen
On 05/13/2013 04:28 PM, Nilesh Chakraborty wrote:
 Hi Matt,
 
 Yes, you're right, they are available as separately licensed downloads.
 Only the stand-alone Serving Layer is needed for the Entity Suggester.
 It's licensed under Apache v. 2.0. Since I'm using the software as-is,
 without any code modifications, I suppose it's compatible with what
 Wikidata would allow?

Apache 2.0-licensed software should be fine, even if you do need/want to
modify it.

Matt Flaschen


Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Paul Selitskas
Also, standards sometimes change, and with them the tags themselves may change.
It's better to change things in one place than to chase a mysterious bug later.


On Mon, May 13, 2013 at 9:22 PM, Tyler Romeo tylerro...@gmail.com wrote:

 Chris makes a good point. Also, it should be noted that the Html class does
 a lot more than just escape stuff. It does a whole bunch of attribute
 validation and standardization to make output HTML5-sanitary. While in
 simple cases like the one above it will not make a difference, it is
 probably better to maintain a uniform approach when generating HTML output.

 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com


 On Mon, May 13, 2013 at 2:05 PM, Chris Steipp cste...@wikimedia.org
 wrote:

  On Mon, May 13, 2013 at 10:26 AM, Max Semenik maxsem.w...@gmail.com
  wrote:
   Hi, I've seen recently a lot of code like this:
  
   $html = Html::openElement( 'div', array( 'class' = 'foo' )
   . Html::rawElement( 'p', array(),
   Html::element( 'span', array( 'id' = $somePotentiallyUnsafeId
 ),
   $somePotentiallyUnsafeText
   )
   )
   . Html::closeElement( 'div' );
  
   IMO, cruft like this makes things harder to read and adds additional
   performance overhead. It can be simplified to
  
   $html = 'div class=foo'p'
   . Html::rawElement( 'p', array(),
   Html::element( 'span', array( 'id' = $somePotentiallyUnsafeId
 ),
   $somePotentiallyUnsafeText
   )
   )
   . '/p/div';
  
   What's your opinion, guys and gals?
 
  I'm probably a bad offender here, but you've unintentionally proved my
  point ;). Note that in your example, you used a single instead of a
  double quote after foo. Obviously, if you're using an IDE, syntax
  highlighting would have helped you, but my point being that when you
  use the classes, you're less likely to make those little mistakes that
  could potentially have disastrous consequences (like using single
  quotes around an entity and relying on htmlspecialchars for escaping,
  etc). And for security, I prefer for people to use whatever will cause
  the least amount of mistakes.
 
  Personally also, when I'm code reviewing I don't like to see  in the
  php, but that's my person preference.
 




-- 
Regards,
Pavel Selitskas
Wizardist @ Wikimedia projects

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Antoine Musso
On 13/05/13 19:26, Max Semenik wrote:
 Hi, I've seen recently a lot of code like this:
 
 $html = Html::openElement( 'div', array( 'class' = 'foo' )
 . Html::rawElement( 'p', array(),
 Html::element( 'span', array( 'id' = $somePotentiallyUnsafeId ),
 $somePotentiallyUnsafeText
 )
 )
 . Html::closeElement( 'div' );
 
 IMO, cruft like this makes things harder to read and adds additional
 performance overhead.

Html is just like our Xml class: it lets us raise the probability that
the resulting code will be valid. It is also a good way to make sure the
content is properly escaped, though in the example above all the nested
calls could lead to mistakes.

For the performance overhead, it surely exists, but it is most probably
negligible unless the methods are on a heavily used code path.


Ideally we would use templates to generate all of that. That would let us
extract the view logic out of the PHP code.


-- 
Antoine hashar Musso


Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Yuri Astrakhan
On one hand, I prefer to have properly formatted code, but on the other,
most systems I have worked with have a very high cost of string
concatenation, and I have a strong suspicion PHP is guilty of that too.
Constructing HTML one element/value at a time might prove to be one of the
bigger perf bottlenecks.

In my personal experience, I once worked on a networking lib for a
proprietary protocol and noticed that there were a lot of logging calls in
the form Log("value1=" + value1 + " value2=" + value2 ...). After I
switched them to the form Log("value1={0}, value2={1}", value1, value2), the
code became an order of magnitude faster, because the logging
framework deferred concatenation until the last moment, after it knew that
logging was needed, and the actual concatenation was done for the whole
complex string with values, not one substring at a time.
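Python's standard logging module is a concrete example of this deferred style: arguments are only interpolated into the format string if the record is actually emitted. (Illustrative sketch, not PHP or the library from the anecdote:)

```python
import io
import logging

logging.basicConfig(stream=io.StringIO(), level=logging.WARNING)
log = logging.getLogger("demo")

class Expensive:
    """Counts how many times it is actually rendered to a string."""
    renders = 0

    def __str__(self):
        Expensive.renders += 1
        return "expensive-value"

v = Expensive()
log.debug("value1=%s", v)    # below WARNING: the message is never formatted
log.warning("value1=%s", v)  # emitted: formatted exactly once
print(Expensive.renders)     # -> 1
```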


On Mon, May 13, 2013 at 6:10 PM, Antoine Musso hashar+...@free.fr wrote:

 On 13/05/13 19:26, Max Semenik wrote:
  Hi, I've seen recently a lot of code like this:
 
  $html = Html::openElement( 'div', array( 'class' = 'foo' )
  . Html::rawElement( 'p', array(),
  Html::element( 'span', array( 'id' = $somePotentiallyUnsafeId ),
  $somePotentiallyUnsafeText
  )
  )
  . Html::closeElement( 'div' );
 
  IMO, cruft like this makes things harder to read and adds additional
  performance overhead.

 Html is just like our Xml class: it lets us raise the probability that
 the resulting code will be valid. It is also a good way to make sure the
 content is properly escaped, though in the example above that could lead
 to some mistakes due to all the nested calls.

 As for the performance overhead, it surely exists, but it is most probably
 negligible unless the methods are in a heavily used code path.


 Ideally we would use templates to generate all of that. That would let us
 extract the view logic out of the PHP code.


 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Jon Robson
Personally, as a frontend developer, I find maintaining HTML the
Html::openElement way in PHP a nightmare.

Following on from Antoine's post, I recently experimented with Mustache, a
template engine that works in both JavaScript and PHP and allows
separation of HTML templates from PHP code. In the mobile team we are
increasingly finding overlap in things we need to render in JavaScript and
in PHP. MobileFrontend currently uses Hogan (which is essentially
Mustache) in our JavaScript code base, and it helps make our JavaScript
easier to maintain and read. We'd love the same for PHP - there's even a
bug for that [1].

These templates make escaping simple - you either write {{variable}} to
render escaped output or {{{html}}} to render raw HTML.

My personal belief is that taking this approach would lead to much more
readable code (especially when it comes to skins). The proof is in the
pudding - [2][3]

We also have an HTML validation script [4], which I wrote about earlier,
that allows us to validate pages and avoid submitting invalid code, so I
wouldn't use validity as an argument against templates...

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=44130
[2] https://github.com/jdlrobson/Minerva/blob/evenmorevanilla/Minerva.php#L72
[3] 
https://github.com/jdlrobson/Minerva/blob/evenmorevanilla/minerva/templates/main.html
[4] http://www.gossamer-threads.com/lists/wiki/wikitech/355021
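For readers unfamiliar with the convention Jon mentions, the {{variable}} vs {{{html}}} distinction can be illustrated with a toy renderer. This is a hedged sketch, not Hogan's implementation: real Mustache engines also handle sections, partials, dotted names, and more escaping cases.

```javascript
// Minimal HTML escaper covering the characters that matter for injection.
function escapeHtml(s) {
  return String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// Toy Mustache-style renderer: {{{name}}} inserts raw HTML,
// {{name}} inserts HTML-escaped text.
function render(template, data) {
  // Handle triple braces first so {{{x}}} is not caught by the {{x}} rule.
  return template
    .replace(/\{\{\{(\w+)\}\}\}/g, (_, k) => String(data[k]))
    .replace(/\{\{(\w+)\}\}/g, (_, k) => escapeHtml(data[k]));
}

const out = render('<p>{{text}} {{{html}}}</p>', {
  text: 'a < b',
  html: '<b>bold</b>',
});
console.log(out); // <p>a &lt; b <b>bold</b></p>
```

The design point is that escaping is the default and raw output is the visually louder, opt-in form, which is the property Jon is praising.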

On Mon, May 13, 2013 at 3:27 PM, Yuri Astrakhan
yastrak...@wikimedia.org wrote:
 On one hand, I prefer properly formatted code, but on the other, most
 systems I have worked with have a very high cost of string concatenation,
 and I have a strong suspicion PHP is guilty of that too. Constructing HTML
 one element/value at a time might prove to be one of the bigger perf
 bottlenecks.

 From my personal experience: I once worked on a networking lib for a
 proprietary protocol and noticed that there were a lot of logging calls in
 the form Log("value1=" + value1 + " value2=" + value2 ...). After I
 switched them to the form Log("value1={0}, value2={1}", value1, value2),
 the code became an order of magnitude faster, because the logging
 framework deferred concatenation until the last moment, after it knew that
 logging was needed, and the actual concatenation was done for the whole
 complex string with values, not one substring at a time.


 On Mon, May 13, 2013 at 6:10 PM, Antoine Musso hashar+...@free.fr wrote:

 Le 13/05/13 19:26, Max Semenik a écrit :
  Hi, I've seen recently a lot of code like this:
 
  $html = Html::openElement( 'div', array( 'class' => 'foo' ) )
  . Html::rawElement( 'p', array(),
  Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
  $somePotentiallyUnsafeText
  )
  )
  . Html::closeElement( 'div' );
 
  IMO, cruft like this makes things harder to read and adds additional
  performance overhead.

 Html is just like our Xml class: it lets us raise the probability that
 the resulting code will be valid. It is also a good way to make sure the
 content is properly escaped, though in the example above that could lead
 to some mistakes due to all the nested calls.

 As for the performance overhead, it surely exists, but it is most probably
 negligible unless the methods are in a heavily used code path.


 Ideally we would use templates to generate all of that. That would let us
 extract the view logic out of the PHP code.


 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



-- 
Jon Robson
http://jonrobson.me.uk
@rakugojon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Matthew Walker

 In the mobile team we are increasingly finding overlap in things we need
 to render in javascript and in PHP

Out of curiosity -- what are you rendering in JS that you wouldn't already
have in the DOM or wouldn't be able to render server side?

We'd love the same for PHP - there's even a bug for that.

Not that it's ready for general release yet, but Fundraising has been using
Twig for a while in parts of its code (our thank-you email generator and
unsubscribe processor). I enjoy the ability to create reusable and
code-reviewable extensions like [1] -- which seems to be a plus for Twig
over Mustache. I also enjoy the built-in file system loader and template
inheritance mechanisms that Twig offers.

Chris S. has it in his review queue to security-review Twig so that I can
make my Twig/MediaWiki integration more official.

[1] https://gerrit.wikimedia.org/r/#/c/63252/

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Mon, May 13, 2013 at 5:23 PM, Jon Robson jdlrob...@gmail.com wrote:

 Personally as a frontend developer I find maintaining html the
 Html::openElement way in PHP a nightmare.

 Following on from Antoine's post, I experimented recently with using a
 template engine Mustache that works on both javascript and PHP and
 allows separation of HTML templates from PHP code.  In the mobile team
 we are increasingly finding overlap in things we need to render in
 javascript and in PHP. MobileFrontend currently uses Hogan (which is
 essentially Mustache) in our javascript code base and it helps make
 our javascript easier to maintain and read. We'd love the same for PHP
 - there's even a bug for that [1].

 These templates make escaping simple - you either do {{variable}} to
 render escaped or {{{html}}} to render HTML.

 My personal belief is taking this approach would lead to much more
 readable code (especially when it comes to skins). The proof is in the
 pudding - [2][3]

 We also have an HTML validation script [4] that I wrote about earlier
 which allows us to validate pages and avoid submitting invalid code so
 I wouldn't use this as an argument against...

 [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=44130
 [2]
 https://github.com/jdlrobson/Minerva/blob/evenmorevanilla/Minerva.php#L72
 [3]
 https://github.com/jdlrobson/Minerva/blob/evenmorevanilla/minerva/templates/main.html
 [4] http://www.gossamer-threads.com/lists/wiki/wikitech/355021

 On Mon, May 13, 2013 at 3:27 PM, Yuri Astrakhan
 yastrak...@wikimedia.org wrote:
  On one hand, I prefer properly formatted code, but on the other, most
  systems I have worked with have a very high cost of string concatenation,
  and I have a strong suspicion PHP is guilty of that too. Constructing
  HTML one element/value at a time might prove to be one of the bigger
  perf bottlenecks.
 
  From my personal experience: I once worked on a networking lib for a
  proprietary protocol and noticed that there were a lot of logging calls
  in the form Log("value1=" + value1 + " value2=" + value2 ...). After I
  switched them to the form Log("value1={0}, value2={1}", value1, value2),
  the code became an order of magnitude faster, because the logging
  framework deferred concatenation until the last moment, after it knew
  that logging was needed, and the actual concatenation was done for the
  whole complex string with values, not one substring at a time.
 
 
  On Mon, May 13, 2013 at 6:10 PM, Antoine Musso hashar+...@free.fr
 wrote:
 
  Le 13/05/13 19:26, Max Semenik a écrit :
   Hi, I've seen recently a lot of code like this:
  
   $html = Html::openElement( 'div', array( 'class' => 'foo' ) )
   . Html::rawElement( 'p', array(),
   Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
   $somePotentiallyUnsafeText
   )
   )
   . Html::closeElement( 'div' );
  
   IMO, cruft like this makes things harder to read and adds additional
   performance overhead.
 
  Html is just like our Xml class: it lets us raise the probability that
  the resulting code will be valid. It is also a good way to make sure the
  content is properly escaped, though in the example above that could lead
  to some mistakes due to all the nested calls.
 
  As for the performance overhead, it surely exists, but it is most
  probably negligible unless the methods are in a heavily used code path.
 
 
  Ideally we would use templates to generate all of that. That will let us
  extract the views logic out of the PHP code.
 
 
  --
  Antoine hashar Musso
 
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 Jon Robson
 http://jonrobson.me.uk
 @rakugojon

 ___
 Wikitech-l mailing list
 

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Jon Robson
Currently we are experimenting with lazy-loading pages and with talk pages
that are loaded within a page. Both need templates that render in both PHP
and JavaScript. The fact that we use sections in mobile requires looping,
so it's not possible to determine a template for a page with sections from
the HTML in a stub page, for example. We are also thinking about separating
language from HTML. The JavaScript would need an equivalent PHP fallback.
(see https://bugzilla.wikimedia.org/show_bug.cgi?id=40678)

Twig reminds me of Jinja Python templates. It looks far too powerful for my
liking (casual glance, so sorry if I've read this wrong). Personal
experience tells me a powerful templating language encourages bad habits.
Filters like upper belong in code, not templates, in my opinion. The thing
Juliusz and I liked about Hogan was that it is very simple and close to
logic-less (it only has ifs and loops). The dumber the better :) Also, is
there a JS implementation?

That said, I like the idea that MediaWiki could be template-agnostic and
work with various templating languages, and we could RESOLVED LATER a
standard :)
On 13 May 2013 18:18, Matthew Walker mwal...@wikimedia.org wrote:

 
  In the mobile team we are increasingly finding overlap in things we need
  to render in javascript and in PHP

 Out of curiosity -- what are you rendering in JS that you wouldn't already
 have in the DOM or wouldn't be able to render server side?

 We'd love the same for PHP - there's even a bug for that.

 Not that it's ready for general release yet, but Fundraising has been
 using Twig for a while in parts of its code (our thank-you email
 generator and unsubscribe processor). I enjoy the ability to create
 reusable and code-reviewable extensions like [1] -- which seems to be a
 plus for Twig over Mustache. I also enjoy the built-in file system loader
 and template inheritance mechanisms that Twig offers.

 Chris S. has it in his review queue to security-review Twig so that I can
 make my Twig/MediaWiki integration more official.

 [1] https://gerrit.wikimedia.org/r/#/c/63252/

 ~Matt Walker
 Wikimedia Foundation
 Fundraising Technology Team


 On Mon, May 13, 2013 at 5:23 PM, Jon Robson jdlrob...@gmail.com wrote:

  Personally as a frontend developer I find maintaining html the
  Html::openElement way in PHP a nightmare.
 
  Following on from Antoine's post, I experimented recently with using a
  template engine Mustache that works on both javascript and PHP and
  allows separation of HTML templates from PHP code.  In the mobile team
  we are increasingly finding overlap in things we need to render in
  javascript and in PHP. MobileFrontend currently uses Hogan (which is
  essentially Mustache) in our javascript code base and it helps make
  our javascript easier to maintain and read. We'd love the same for PHP
  - there's even a bug for that [1].
 
  These templates make escaping simple - you either do {{variable}} to
  render escaped or {{{html}}} to render HTML.
 
  My personal belief is taking this approach would lead to much more
  readable code (especially when it comes to skins). The proof is in the
  pudding - [2][3]
 
  We also have an HTML validation script [4] that I wrote about earlier
  which allows us to validate pages and avoid submitting invalid code so
  I wouldn't use this as an argument against...
 
  [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=44130
  [2]
 
 https://github.com/jdlrobson/Minerva/blob/evenmorevanilla/Minerva.php#L72
  [3]
 
 https://github.com/jdlrobson/Minerva/blob/evenmorevanilla/minerva/templates/main.html
  [4] http://www.gossamer-threads.com/lists/wiki/wikitech/355021
 
  On Mon, May 13, 2013 at 3:27 PM, Yuri Astrakhan
  yastrak...@wikimedia.org wrote:
   On one hand, I prefer properly formatted code, but on the other, most
   systems I have worked with have a very high cost of string
   concatenation, and I have a strong suspicion PHP is guilty of that too.
   Constructing HTML one element/value at a time might prove to be one of
   the bigger perf bottlenecks.
  
   From my personal experience: I once worked on a networking lib for a
   proprietary protocol and noticed that there were a lot of logging calls
   in the form Log("value1=" + value1 + " value2=" + value2 ...). After I
   switched them to the form Log("value1={0}, value2={1}", value1, value2),
   the code became an order of magnitude faster, because the logging
   framework deferred concatenation until the last moment, after it knew
   that logging was needed, and the actual concatenation was done for the
   whole complex string with values, not one substring at a time.
  
  
   On Mon, May 13, 2013 at 6:10 PM, Antoine Musso hashar+...@free.fr
  wrote:
  
   Le 13/05/13 19:26, Max Semenik a écrit :
Hi, I've seen recently a lot of code like this:
   
    $html = Html::openElement( 'div', array( 'class' => 'foo' ) )
    . Html::rawElement( 'p', array(),
    Html::element( 'span', array( 'id' =>
  

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Ryan Lane
On Sun, May 12, 2013 at 11:26 PM, Moriel Schottlender mor...@gmail.com wrote:

 Hello everyone,

 I'd like to get your opinions and critique on my very first MediaWiki
 extension, which, I hope, will be helpful to other developers.

 I noticed that there's no easy way of seeing if extensions that we have
 installed on our MediaWiki require update, and there are even some
 extensions that get so little regular attention that when they do get
 updated, there's no real way of knowing it (unless we check specifically).

 It can be annoying when encountering problems or bugs, and then discovering
 that one of our extensions (probably the one we least expected) actually
 has an update that fixed this bug.

 So, I thought to try and solve this issue with my extension. Since
 MediaWiki's changes are submitted through gerrit, I thought I'd take
 advantage of that and perform a remote check to see if there are any new
 commits that appeared in any of the extensions since they were installed.


I like the idea of this, but if this is used widely it's going to kill our
gerrit server. It hits gitweb directly. Gitweb calls are uncached and are
fairly expensive to run against our server. We replicate all repositories
to github, and they allow this kind of thing. Is there any way you can
change this to use the github replicas?

- Ryan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Dmitriy Sintsov

On 13.05.2013 21:26, Max Semenik wrote:

Hi, I've seen recently a lot of code like this:

$html = Html::openElement( 'div', array( 'class' => 'foo' ) )
 . Html::rawElement( 'p', array(),
 Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
 $somePotentiallyUnsafeText
 )
 )
 . Html::closeElement( 'div' );

IMO, cruft like this makes things harder to read and adds additional
performance overhead. It can be simplified to

$html = '<div class="foo"><p>'
 . Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
 $somePotentiallyUnsafeText
 )
 . '</p></div>';

What's your opinion, guys and gals?

In my Extension:QPoll I implemented tag arrays, which are more compact 
and support auto-closing of tags (no closeElement() is needed):

http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/QPoll/includes/qp_renderer.php?revision=103452view=markup

For performance reasons I did not bother with attribute validation and
inner text quoting, performing these tasks in the caller instead.

Output generation was a simple recursive traversal of nested PHP arrays.

However, it also has methods to dynamically add columns and rows to HTML
tables, including colspans and rowspans.

The class worked well enough, allowing content manipulation; however,
subtree insertion and moving were not so elegant.

When I was forced to abandon MediaWiki development for financial reasons,
I rethought the class and made the same tag arrays based on XMLDocument
and XMLWriter:

https://bitbucket.org/sdvpartnership/questpc-framework/src/a5482dd1035b6393f52049cda98c9539b6f77b6c/includes/Xml/XmlTree.php?at=master
https://bitbucket.org/sdvpartnership/questpc-framework/src/a5482dd1035b6393f52049cda98c9539b6f77b6c/includes/Xml/Writer/GenericXmlWriter.php?at=master

They are slower than echo $var; for sure, but allow much more powerful
tag manipulation and templating in a jQuery-like way. And the output is
always valid; XMLDocument automatically takes care of inner text escaping
and so on.

Here's an example of the newer version of a tag array definition:

array( '@tag' => 'div', 'class' => 'foo', array( '@tag' => 'p', array(
'@tag' => 'span', 'id' => $id, $text ) ) );


String keys starting with '@' are special keys: '@tag' is the tag name,
'@len' (optional) is the count of child nodes.

Other string keys are attribute key / value pairs.
Integer keys are nested nodes - either nested tags or text nodes.
There are also three special tag names:
'@tag' => '#text'
'@tag' => '#comment'
'@tag' => '#cdata'
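A rough transliteration of the tag-array scheme above, sketched in JavaScript for brevity (Dmitriy's implementation is PHP-based; this toy version omits '@len', the '#text'/'#comment'/'#cdata' node types, and escaping):

```javascript
// Recursively render a nested tag-array structure to an HTML string.
// '@tag' names the element, other string keys become attributes, and
// integer keys become child nodes (nested objects or bare text).
function renderTag(node) {
  if (typeof node === 'string') {
    return node; // bare text node (escaping omitted for brevity)
  }
  const tag = node['@tag'];
  let attrs = '';
  const children = [];
  for (const [key, value] of Object.entries(node)) {
    if (key === '@tag') continue;
    if (/^\d+$/.test(key)) {
      children.push(renderTag(value)); // integer keys: nested nodes
    } else {
      attrs += ` ${key}="${value}"`;   // other string keys: attributes
    }
  }
  return `<${tag}${attrs}>${children.join('')}</${tag}>`;
}

// Mirrors array( '@tag' => 'div', 'class' => 'foo',
//                array( '@tag' => 'span', 'id' => ..., ... ) );
const html = renderTag({
  '@tag': 'div',
  'class': 'foo',
  0: { '@tag': 'span', 'id': 'x', 0: 'hello' },
});
console.log(html); // <div class="foo"><span id="x">hello</span></div>
```

This is the "simple recursive traversal" idea in miniature; a production version would validate attributes and escape text, which is exactly the work Dmitriy notes he delegated to the caller or to XMLDocument.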

Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Moriel Schottlender
I can change this to the GitHub repos, yes - all I need to know is the
schema of the URL. Also, Reedy made the excellent suggestion of caching
results -- which I am working on -- which will also reduce both load time
and heavy traffic.

But actually, working against GitHub will be somewhat easier, too, since
GitHub has a better REST API.

Are all (or most) extensions on GitHub under https://github.com/wikimedia/
with a mediawiki-extensions- prefix? Can I use that as a general rule of
thumb?


On Mon, May 13, 2013 at 11:33 PM, Ryan Lane rlan...@gmail.com wrote:

 On Sun, May 12, 2013 at 11:26 PM, Moriel Schottlender mor...@gmail.com
 wrote:

  Hello everyone,
 
  I'd like to get your opinions and critique on my very first MediaWiki
  extension, which, I hope, will be helpful to other developers.
 
  I noticed that there's no easy way of seeing if extensions that we have
  installed on our MediaWiki require update, and there are even some
  extensions that get so little regular attention that when they do get
  updated, there's no real way of knowing it (unless we check
 specifically).
 
  It can be annoying when encountering problems or bugs, and then
 discovering
  that one of our extensions (probably the one we least expected) actually
  has an update that fixed this bug.
 
  So, I thought to try and solve this issue with my extension. Since
  MediaWiki's changes are submitted through gerrit, I thought I'd take
  advantage of that and perform a remote check to see if there are any new
  commits that appeared in any of the extensions since they were installed.
 
 
 I like the idea of this, but if this is used widely it's going to kill our
 gerrit server. It hits gitweb directly. Gitweb calls are uncached and are
 fairly expensive to run against our server. We replicate all repositories
 to github, and they allow this kind of thing. Is there any way you can
 change this to use the github replicas?

 - Ryan
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New extension: Special:ExtensionStatus (opinions?)

2013-05-13 Thread Tyler Romeo
On Tue, May 14, 2013 at 1:57 AM, Moriel Schottlender mor...@gmail.com wrote:

 Are all (or most) extensions on GitHub under https://github.com/wikimedia/
 with a mediawiki-extensions- prefix? Can I use that as a general rule of
 thumb?


Yes. They're mirrored automatically.

Also, you're probably thinking this already, but just in case: make sure
not to remove Gerrit checking when you add GitHub checking. It's always
good to have that as a backup on the chance we ever move away from GitHub.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l