[Wikitech-l] PHP 5.5.8 and 5.4.24 have been released on 2014-01-10

2014-01-14 Thread Thomas Gries

The PHP development team announces the immediate availability of PHP
5.4.24. About 14 bugs were fixed. All PHP 5.4 users are encouraged to
upgrade to this version.

The PHP development team announces the immediate availability of PHP
5.5.8. This release fixes about 20 bugs against PHP 5.5.7 components.

See http://www.php.net/archive/2014.php#id2014-01-10-1 .

Be reminded that PHP 5.5.x includes the bytecode cache Zend
Optimizer+ (now known as OPcache), which usually makes your MediaWiki run faster.
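
For reference, a minimal php.ini sketch to enable it (the values shown are
common illustrative settings, not requirements; on Windows the extension
line is php_opcache.dll rather than opcache.so):

    zend_extension=opcache.so
    opcache.enable=1
    opcache.memory_consumption=128
    opcache.max_accelerated_files=4000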

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] PHPUnit versioning

2014-01-14 Thread Brad Jorsch (Anomie)
This morning I tried to run some unit tests, and to my surprise they failed
with an error saying that PHPUnit 3.7.0 is now required. This was apparently
done in Gerrit change 105920[1] in response to bug 59759.[2]

Grepping through 1.23wmf10 finds the PHPUnit function complained about in
that bug in only a few extensions (Diff, EducationProgram, DataTypes). I
ran unit tests against various extensions that I have locally installed,
and core, AntiSpoof, cldr, FlaggedRevs, Gadgets, GeoData, Math, OAuth,
ParserFunctions, Scribunto, and TitleBlacklist all appear to work fine in
3.6.10. TemplateData uses another 3.7-only function
(assertJsonStringEqualsJsonString), which is also used in one test in
EventLogging.

The problem here is that Ubuntu's upcoming 14.04 Trusty Tahr, as well as
Debian unstable (sid), currently contain PHPUnit 3.6.10.[3][4] It seems to
me that requiring our developers to manually install a different version of
phpunit will only make it even less likely that developers write or run
tests, and will make it that much more difficult to fix tests if they
break.

Is there anything in 3.7 that we actually need? The problematic functions
seem to just be additional asserts that could probably be worked around. Or
is there any hope that Debian and/or Ubuntu will upgrade PHPUnit any time
soon?[5]


 [1]: https://gerrit.wikimedia.org/r/#/c/105920/
 [2]: https://bugzilla.wikimedia.org/show_bug.cgi?id=59759
 [3]: http://packages.debian.org/phpunit
 [4]: http://packages.ubuntu.com/phpunit
 [5]: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=697343

-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Chad
On Tue, Jan 14, 2014 at 7:58 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:

 The problem here is that Ubuntu's upcoming 14.04 Trusty Tahr, as well as
 Debian unstable (sid), currently contain PHPUnit 3.6.10.[3][4] It seems to
 me that requiring our developers to manually install a different version of
 phpunit will only make it even less likely that developers write or run
 tests, and will make it that much more difficult to fix tests if they
 break.

 Is there anything in 3.7 that we actually need? The problematic functions
 seem to just be additional asserts that could probably be worked around. Or
 is there any hope that Debian and/or Ubuntu will upgrade PHPUnit any time
 soon?[5]


What version is available via PEAR? Installing via that is no more
manual than apt.

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Tyler Romeo
On Tue, Jan 14, 2014 at 11:07 AM, Chad innocentkil...@gmail.com wrote:

 What version is available via PEAR? Installing via that is no more
 manual than apt.


Don't forget Composer as well.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Brian Wolff
On Jan 14, 2014 11:58 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:

 This morning I tried to run some unit tests, and to my surprise they failed
 with an error saying that PHPUnit 3.7.0 is now required. This was apparently
 done in Gerrit change 105920[1] in response to bug 59759.[2]

 Grepping through 1.23wmf10 finds the PHPUnit function complained about in
 that bug in only a few extensions (Diff, EducationProgram, DataTypes). I
 ran unit tests against various extensions that I have locally installed,
 and core, AntiSpoof, cldr, FlaggedRevs, Gadgets, GeoData, Math, OAuth,
 ParserFunctions, Scribunto, and TitleBlacklist all appear to work fine in
 3.6.10. TemplateData uses another 3.7-only function
 (assertJsonStringEqualsJsonString), which is also used in one test in
 EventLogging.

 The problem here is that Ubuntu's upcoming 14.04 Trusty Tahr, as well as
 Debian unstable (sid), currently contain PHPUnit 3.6.10.[3][4] It seems to
 me that requiring our developers to manually install a different version of
 phpunit will only make it even less likely that developers write or run
 tests, and will make it that much more difficult to fix tests if they
 break.

 Is there anything in 3.7 that we actually need? The problematic functions
 seem to just be additional asserts that could probably be worked around. Or
 is there any hope that Debian and/or Ubuntu will upgrade PHPUnit any time
 soon?[5]


  [1]: https://gerrit.wikimedia.org/r/#/c/105920/
  [2]: https://bugzilla.wikimedia.org/show_bug.cgi?id=59759
  [3]: http://packages.debian.org/phpunit
  [4]: http://packages.ubuntu.com/phpunit
  [5]: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=697343

 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I've been manually commenting out the version check for a little while now
when I run them locally.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Jeroen De Dauw
Hey,

What version is available via PEAR? Installing via that is no more
 manual than apt.


Sebastian recommends that you use the phar, which is a lot easier than
PEAR. Instructions on how to use it can be found at:

https://github.com/sebastianbergmann/phpunit#installation

This morning I tried to run some unit tests, and to my surprise they failed
 with an error saying that PHPUnit 3.7.0 is now required.


For Diff this has been required for close to a year, I think.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Chad
On Tue, Jan 14, 2014 at 8:21 AM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

 What version is available via PEAR? Installing via that is no more
  manual than apt.
 

 Sebastian recommends that you use the phar, which is a lot easier than
 PEAR. Instructions on how to use it can be found at:

 https://github.com/sebastianbergmann/phpunit#installation


Can we use the phar in core? Since we construct our own PHPUnit
environment and don't run the standard PHPUnit binary...can you
use a phar file for loading a library and not just executing a script?

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Jeroen De Dauw
Hey,

 can you use a phar file for loading a library and not just executing a script?

Yeah, you can include the phar (with a PHP include statement).
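
For example, a minimal sketch (the phar path and the sample test class are
placeholders, nothing MediaWiki-specific):

    // Including the phar registers PHPUnit's classes via its bundled autoloader.
    require_once '/path/to/phpunit.phar';

    class SampleTest extends PHPUnit_Framework_TestCase {
        public function testAddition() {
            $this->assertSame( 2, 1 + 1 );
        }
    }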

 Can we use the phar in core?

Sure. One reason I've seen put forward for bundling such a phar with a
project is that everyone then runs the same version of PHPUnit.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Tim Landscheidt
Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:

 [...]

 The problem here is that Ubuntu's upcoming 14.04 Trusty Tahr, as well as
 Debian unstable (sid), currently contain PHPUnit 3.6.10.[3][4] It seems to
 me that requiring our developers to manually install a different version of
 phpunit is instead going to make it even less likely for developers to
 write or run tests, and will make it that much more difficult to fix tests
 if they break.

 [...]

+1.  This has been a problem in the past as well, and it's
especially frustrating to see that ancient browsers or components
no longer supported upstream receive attention, while for a
central development tool manual intervention is needed for
essentially syntactic sugar.

Tim


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Chad
On Tue, Jan 14, 2014 at 8:39 AM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

  can you use a phar file for loading a library and not just executing a
 script?

 Yeah, you can include the phar (with a PHP include statement).

  Can we use the phar in core?

 Sure. One reason I've seen put forward for bundling such a phar with a
 project is that everyone then runs the same version of PHPUnit.


Well, we shouldn't bundle it, but it'd be nice to use it if you've
got it, rather than having PHPUnit installed in your include_path.

Something like:

`php phpunit.php --phar=/foo/bar/phpunit.phar`

would be fantastic imho.
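
A rough sketch of what phpunit.php could do with such an option (this
option handling is hypothetical, not MediaWiki's actual code):

    // Hypothetical: load PHPUnit from a phar named on the command line.
    foreach ( $argv as $i => $arg ) {
        if ( strpos( $arg, '--phar=' ) === 0 ) {
            // Including the phar registers PHPUnit's classes and autoloader.
            require_once substr( $arg, strlen( '--phar=' ) );
            // Strip the option so later argument parsing doesn't choke on it.
            unset( $argv[$i] );
        }
    }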

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Brad Jorsch (Anomie)
On Tue, Jan 14, 2014 at 11:07 AM, Chad innocentkil...@gmail.com wrote:

 What version is available via PEAR? Installing via that is no more
 manual than apt.


Err, yes it is. With apt it gets upgraded whenever I upgrade anything else,
while pear is in its own little world with its own dependencies and such.

pear, pecl, cpan, npm, and the rest of those language-specific package
managers seem like a very poor substitute to me.


On Tue, Jan 14, 2014 at 11:07 AM, Chad innocentkil...@gmail.com wrote:

 What version is available via PEAR?


On Tue, Jan 14, 2014 at 11:13 AM, Tyler Romeo tylerro...@gmail.com wrote:

 Don't forget Composer as well.


On Tue, Jan 14, 2014 at 11:21 AM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Sebastian recommends that you use the phar


All the more reason to use none of the above.


-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] RFC process - open questions

2014-01-14 Thread Sumana Harihareswara
On the RFC Process talk page, I'm presenting some questions about our RFC
process and suggesting *my* answers:
https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Process#Process_questions
You may find this super boring and I will not blame you if you skip the
whole discussion, but I may ask some of you personally to give your
thoughts on specific questions.

Big questions include:

- Should we just have one RFC process, or a few different processes (to
  fast-track some proposals, divide them up by core/mobile/language/UX/apps
  or something like that, etc.)?
- Should we add time limits to any part of the process?
- How do we ensure that WMF does not do all the decision-making?

I'd love to keep discussion on that talk page - please feel free to
rearrange it into sections if you think that makes sense!

Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Chad
On Tue, Jan 14, 2014 at 9:31 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:

 On Tue, Jan 14, 2014 at 11:07 AM, Chad innocentkil...@gmail.com wrote:

  What version is available via PEAR? Installing via that is no more
  manual than apt.
 

 Err, yes it is. With apt it gets upgraded whenever I upgrade anything else,
 while pear is in its own little world with its own dependencies and such.

 pear, pecl, cpan, npm, and the rest of those language-specific package
 managers seem like a very poor substitute to me.


Fair enough. I just realized what a hypocrite I was being :)

I do like the idea of letting you specify the phar from the command
line though, so you don't have to worry about include_paths and can
swap out different versions at runtime.

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] A Multimedia Vision for 2016

2014-01-14 Thread Fabrice Florin
Dear Gerard,

Thank you so much for your kind words about the proposed Multimedia Vision for 
Wikimedia sites by 2016. (1)

I am glad that our first user stories resonate with you. They intentionally 
focus on ways that our community may interact through multimedia -- and we view 
these types of productive collaborations between different user groups as a key 
objective for our work.

We really appreciate your thoughtful blog post about this vision (2) and fully
agree with you that more user stories will be needed to illustrate the scope of
possible interactions between different communities -- from schools to
professional or personal sites around the world. We aim to identify more user
stories like these to inform our next steps.

We are actively working with Lydia, Daniel and the Wikidata team to implement 
structured data on Commons and integrate it with Wikidata later this year, in 
collaboration with our community. We expect this work will improve a range of 
multimedia workflows as a result, from curation to search and beyond. We will 
definitely address the points you raise.

I would also like to thank all the community members who have joined our 
discussion about this multimedia vision (3). We are grateful for your feedback, 
and very glad to see a partnership develop between our community and the 
foundation around these goals, so we may better serve our users together.

If you haven’t commented yet, please share your feedback here, after viewing 
the video:

http://ur1.ca/gdljy

You are all invited to join our office hours IRC chat about multimedia this 
Thursday, January 16 at 19:00 UTC (4) — we look forward to discussing this 
vision and other media projects with you then. More on this later.

Thanks again for everyone’s wonderful work in helping share free knowledge 
through multimedia.

All the best,


Fabrice


(1) Multimedia Vision 2016:
https://commons.wikimedia.org/wiki/Commons:Multimedia_Features/Vision_2016

(2) Blog Post by Gerard:
http://ultimategerardm.blogspot.nl/2014/01/wikimedia-multimedia-featuresvision-2016.html

(3) Discuss the Multimedia Vision:
https://commons.wikimedia.org/wiki/Commons_talk:Multimedia_Features/Vision_2016

(4) Multimedia Office Hours chat on IRC: Thursday at 19:00 UTC
https://meta.wikimedia.org/wiki/IRC_office_hours#Upcoming_office_hours


___

Fabrice Florin
Product Manager, Multimedia
Wikimedia Foundation

Multimedia Hub:
https://www.mediawiki.org/wiki/Multimedia

Profile:
https://www.mediawiki.org/wiki/User:Fabrice_Florin_(WMF)


On Jan 10, 2014, at 4:01 AM, wikimedia-l-requ...@lists.wikimedia.org wrote:

 Date: Fri, 10 Jan 2014 09:33:30 +0100
 From: Gerard Meijssen gerard.meijs...@gmail.com
 To: Wikimedia developers wikitech-l@lists.wikimedia.org, WikiData-l
   wikidat...@lists.wikimedia.org, Wikimedia Commons Discussion List
   common...@lists.wikimedia.org, Wikimedia Mailing List
   wikimedi...@lists.wikimedia.org
 Subject: Re: [Wikimedia-l] [Wikitech-l] A Multimedia Vision for 2016
 Message-ID:
   cao53wxuxycnbfoe6fkugohvvopgb+mc+h9kbhk4btm74fqo...@mail.gmail.com
 Content-Type: text/plain; charset=UTF-8
 
 Hoi,
 Fabrice, I very much love the two stories described in the vision. It
 describes not only technical functionality; it also describes how
 our community may interact. That is great.
 
 What I missed are the consequences of the planned integration of Commons
 with Wikidata. I blogged about it [1] and I suggest three more stories that
 could be told because they are enabled by this integration. What I do not
 fully understand is how the community aspects will integrate in an
 environment that will be more multi lingual and multi cultural as a
 consequence.
 
 I have confidence that the three stories that I suggest will be realised by
 2016. Not only that, I am pretty sure that as a consequence the amount of
 traffic that our servers will have to handle will grow enormously, to the
 extent that I am convinced that our current capacity will not be able to
 cope. Then again, these are the luxury problems that make us appreciate how
 much room we still have for growth.
 Thanks,
 GerardM
 
 
 [1]
 http://ultimategerardm.blogspot.nl/2014/01/wikimedia-multimedia-featuresvision-2016.html
 
 
 On 10 January 2014 01:39, Fabrice Florin fflo...@wikimedia.org wrote:
 
 Happy new year, everyone!
 
 Many thanks to all of you who contributed to our multimedia programs last
 year! Now that we have a new multimedia team at WMF, we look forward to
 making some good progress together this year.
 
 To kick off the new year, here is a proposed multimedia vision for 2016,
 which was prepared by our multimedia and design teams, with guidance from
 community members:
 
 http://blog.wikimedia.org/2014/01/09/multimedia-vision-2016/
 
 This possible scenario is intended for discussion purposes, to help us
 visualize how we could improve our user experience over the next three
 

Re: [Wikitech-l] RFC process - open questions

2014-01-14 Thread Gryllida


On Wed, 15 Jan 2014, at 4:52, Sumana Harihareswara wrote:
 On the RFC Process talk page, I'm presenting some questions about our RFC
 process and suggesting *my* answers:

Where can I find previous RFCs? It's a thing I haven't heard of before.
How are contributors expected to find it?

Gryllida

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC process - open questions

2014-01-14 Thread Matthew Flaschen

On 01/14/2014 04:21 PM, Gryllida wrote:



On Wed, 15 Jan 2014, at 4:52, Sumana Harihareswara wrote:

On the RFC Process talk page, I'm presenting some questions about our RFC
process and suggesting *my* answers:


Where can I find previous RFCs?


They are listed at https://www.mediawiki.org/wiki/Requests_for_comment .

Matt Flaschen


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Fwd: Wikimania 2014 scholarship now accepting application

2014-01-14 Thread Quim Gil
It looks like this announcement didn't make it to this list before.


 Original Message 
Subject: [MediaWiki-l] Wikimania 2014 scholarship now accepting application
Date: Wed, 08 Jan 2014 17:43:31 +
From: Katie Chan k...@ktchan.info
Reply-To: MediaWiki announcements and site admin list
mediawik...@lists.wikimedia.org
To: wikivoyag...@lists.wikimedia.org, mediawik...@lists.wikimedia.org

Hi all,

Scholarship applications for Wikimania 2014 in London are now being
accepted. Applications are open until the end of the day UTC on 17 February.

A Wikimania 2014 scholarship is an award given to an individual to enable
them to attend Wikimania in London from 6-10 August, 2014.

Only a single type of scholarship will be available from the Wikimedia
Foundation for Wikimania 2014. A Wikimedia Foundation scholarship will
cover the cost of an individual's round-trip travel costs as arranged by
the Wikimedia Foundation travel agency, shared accommodation as arranged
by the Wikimedia Foundation, and registration for Wikimania.

Applicants will be rated by the Scholarship Committee using a pre-determined
selection process and selection criteria; the committee will determine
which applications are successful. To learn more about Wikimania 2014
scholarships, please visit https://wikimania2014.wikimedia.org/wiki/Scholarships.

To apply for a scholarship, fill out the application form on
http://scholarships.wikimedia.org/apply. It is highly recommended that
applicants review all of the material on the Scholarships page and the
associated FAQ before submitting an application.

If you have any questions, please contact
wikimania-scholarsh...@wikimedia.org or leave a message on
https://wikimania2014.wikimedia.org/wiki/Talk:Scholarships.

Katie Chan
Chair, Scholarship Committee

-- 
Katie Chan
Any views or opinions presented in this e-mail are solely those of the
author and do not necessarily represent the view of any organisation the
author is associated with or employed by.


Experience is a good school but the fees are high.
  - Heinrich Heine


___
MediaWiki-l mailing list
mediawik...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Need a way to modify text before indexing (was SearchUpdate)

2014-01-14 Thread vitalif

Hi!

Change https://gerrit.wikimedia.org/r/#/c/79025/, which was merged to 1.22,
breaks my TikaMW extension - I used that hook to extract content from
binary files so that users can then search it.


Maybe you can add some other hook for this purpose?

See also https://github.com/mediawiki4intranet/TikaMW/issues/2

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Matthew Flaschen

On 01/13/2014 01:59 PM, Nathan Larson wrote:

I can't exactly post a bug to MediaZilla saying "Create Inclupedia" and
then have a bunch of different bugs it depends on, because non-WMF projects
are beyond the scope of MediaZilla.


That's not the case.  There are components for software the WMF does not 
use.  This ranges from major projects like Semantic MW to one-off 
extensions that WMF does not have a use for (e.g. Absentee Landlord) to 
tools that work *with* MW but are not part of it (e.g. Tools Labs tools, 
Pywikibot).


It's true everything on Bugzilla is related to MediaWiki or the WMF in
some way, though (some more directly than others).


Matt Flaschen


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Facebook Open Academy coordination

2014-01-14 Thread Quim Gil
Hi, today at the Engineering Community Team IRC meeting we had a
discussion about the Facebook Open Academy program:

https://www.mediawiki.org/wiki/Facebook_Open_Academy#Projects

See the minutes and full logs at
https://www.mediawiki.org/wiki/Engineering_Community_Team/Meetings#2014-01-14

In order to be on the same page, mentors of the different projects
please get your basic project information ready at the table in the
Facebook_Open_Academy page. Also ask your students to add themselves there.

As commented in the meeting, I'm hoping to have Flow enabled on the main
project pages, as a way to channel project discussions better.
Maryana could not promise a date, but she likes the idea. I'm hoping for
early February. If the teams want to consider other options, such as an
existing or new specialized mailing list, that is also fine.

Questions? Feedback? Be my guest. This is the first time we have joined this
program and there are still many question marks. It's going to be fine.  :)

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Need a way to modify text before indexing (was SearchUpdate)

2014-01-14 Thread Chad
On Tue, Jan 14, 2014 at 2:33 PM, vita...@yourcmc.ru wrote:

 Hi!

 Change https://gerrit.wikimedia.org/r/#/c/79025/, which was merged to 1.22,
 breaks my TikaMW extension - I used that hook to extract content from
 binary files so that users can then search it.

 Maybe you can add some other hook for this purpose?

 See also https://github.com/mediawiki4intranet/TikaMW/issues/2


SearchEngine subclasses can implement getTextFromContent() if they want
to override the normal text fetching behavior.
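
A minimal sketch of what that could look like for TikaMW (the subclass name
and the TikaMW::extractText() helper are hypothetical; only
getTextFromContent() itself is the real override point):

    class TikaSearchEngine extends SearchEngine {
        // Override text extraction for the search index: use Tika for
        // file pages, fall back to the default for everything else.
        public function getTextFromContent( Title $t, Content $c = null ) {
            if ( $t->getNamespace() === NS_FILE ) {
                return TikaMW::extractText( $t ); // hypothetical helper
            }
            return parent::getTextFromContent( $t, $c );
        }
    }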

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Nathan Larson
On Tue, Jan 14, 2014 at 6:22 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:

 That's not the case.  There are components for software the WMF does not
 use.  This ranges from major projects like Semantic MW to one-off
 extensions that WMF does not have a use for (e.g. Absentee Landlord) to
 tools that work *with* MW but are not part of it (e.g. Tools Labs tools,
 Pywikibot).


I guess it would depend on making the scope of the bug broad enough that it
would seem useful for more than just one site. E.g. one could put
"implement the functionality needed for Inclupedia". That functionality
could be reused for any number of sites, just like SMW's code. On the other
hand, if someone were to say "Switch configuration setting x to true on
Inclupedia", that would be of little interest to non-Inclupedia users, I
would think. I assume that Bugzilla is not intended as the place for all
technical requests for the entire wikisphere?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC process - open questions

2014-01-14 Thread MZMcBride
Sumana Harihareswara wrote:
On the RFC Process talk page, I'm presenting some questions about our RFC
process and suggesting *my* answers:
https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Process

You've begun a discussion about changes to the process seemingly without
making any attempt to discuss or define deficiencies or problems in the
current process. Your talk page questions have every indication of a
classic pattern in bug reporting, where a user shows up having a bit of
knowledge and a proposed solution, but doesn't describe the symptoms or
the problem or try to explain what he or she is trying to accomplish.
Unsurprisingly, this approach often works very poorly.

As a direct example, you ask "Should we add time limits to any part of the
process?" and then proceed to lay out your personal views on what an
appropriate timeline might look like for RFCs. But taking a step back:
what problem, exactly, are you trying to solve? This doesn't appear to be
addressed anywhere on that talk page.

Almost all of the other sections/questions have the same issue.

On the talk page, you also reference an "architecture review committee"
as though that's a real thing that exists. I'm not sure this is the case.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC process - open questions

2014-01-14 Thread Sumana Harihareswara
Yo, MZ, did you miss the very first question, where I asked whether the
current process is good enough? I'm totally cool with the answer "yeah it
is" (except for numbering; I really want to be able to disambiguate RFCs on
similar topics).

Sounds like you'd like more clarification on problems I'm seeing - sure,
happy to talk about that, I'll see what needs adding. Other people who have
interacted more thoroughly with the RFC process could also jump in. And it
seems reasonable to me to ask whether specific ideas make sense for us.
I've found that sometimes seeing what other similar communities do has
given me ideas for how to improve our own processes -- sometimes we don't
see a certain kind of problem until we see someone else solving it!

But whoa, harsh phrasings, dude. :(

Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation


On Tue, Jan 14, 2014 at 7:47 PM, MZMcBride z...@mzmcbride.com wrote:

 Sumana Harihareswara wrote:
 On the RFC Process talk page, I'm presenting some questions about our RFC
 process and suggesting *my* answers:
 https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Process

 You've begun a discussion about changes to the process seemingly without
 making any attempt to discuss or define deficiencies or problems in the
 current process. Your talk page questions have every indication of a
 classic pattern in bug reporting, where a user shows up having a bit of
 knowledge and a proposed solution, but doesn't describe the symptoms or
 the problem or try to explain what he or she is trying to accomplish.
 Unsurprisingly, this approach often works very poorly.

 As a direct example, you ask "Should we add time limits to any part of the
 process?" and then proceed to lay out your personal views on what an
 appropriate timeline might look like for RFCs. But taking a step back:
 what problem, exactly, are you trying to solve? This doesn't appear to be
 addressed anywhere on that talk page.

 Almost all of the other sections/questions have the same issue.

 On the talk page, you also reference an "architecture review committee"
 as though that's a real thing that exists. I'm not sure this is the case.

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Inclupedia: Developers wanted

2014-01-14 Thread Brian Wolff
On Jan 14, 2014 8:20 PM, Nathan Larson nathanlarson3...@gmail.com wrote:

 On Tue, Jan 14, 2014 at 6:22 PM, Matthew Flaschen
 mflasc...@wikimedia.org wrote:

  That's not the case.  There are components for software the WMF does not
  use.  This ranges from major projects like Semantic MW to one-off
  extensions that WMF does not have a use for (e.g. Absentee Landlord) to
  tools that work *with* MW but are not part of it (e.g. Tools Labs tools,
  Pywikibot).


 I guess it would depend on making the scope of the bug broad enough that it
 would seem useful for more than just one site. E.g. one could put
 "implement the functionality needed for Inclupedia". That functionality
 could be reused for any number of sites, just like SMW's code. On the other
 hand, if someone were to say "Switch configuration setting x to true on
 Inclupedia", that would be of little interest to non-Inclupedia users, I
 would think. I assume that Bugzilla is not intended as the place for all
 technical requests for the entire wikisphere?


Yeah, I think shell-type requests for non-WMF wikis have traditionally been
out of scope for our Bugzilla (possible exception: acawiki). OTOH I don't
know if anyone has ever really asked to use our Bugzilla in such a manner,
so maybe opinions differ.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ARM servers

2014-01-14 Thread Tim Starling
I wrote:
 But I think
 it would make more sense to have a bare metal provisioning process for
 misc servers which allowed smaller numbers of Intel cores per server,
 where that fits the application. That would improve energy efficiency
 without the need to deploy a new architecture.

Actually, after looking into this, I don't think it would help very much.

I had a look at ssl1, which is an idle PowerEdge 410. It has two Intel
Xeon X5650 packages, i.e. 12 cores in total. According to powertop,
one package is in the C6 state about 95% of the time, and the other is
about 99% C6. According to the datasheet [1], this processor model
should use 10W per package in C6, i.e. 20W total.

Meanwhile, the server's RAC is reporting a system power usage of 160W,
which would imply that the non-CPU parts of the system are using 88%
of the idle power. I don't know where all the power goes, but it looks
like it isn't to the processor.
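
(A quick check of the arithmetic: 160 W total minus the 20 W the two CPU
packages draw in C6 leaves 140 W, and 140 / 160 = 87.5%, i.e. the roughly
88% quoted above.)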

[1]
http://www.intel.com/content/www/us/en/processors/xeon/xeon-5600-vol-1-datasheet.html

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is Foxway a right way?

2014-01-14 Thread Pavel Astakhov

On 13.01.2014 14:58, Pavel Astakhov wrote:

Hi! I would like to discuss an idea.

In MediaWiki it is not very convenient to do computing using the wiki
syntax. We have to use several extensions like Variables, Arrays,
ParserFunctions and others. If there is a lot of computing, such as
processing data received from Semantic MediaWiki, the speed of page
construction becomes unacceptable. To resolve this, one has to write yet
another extension (e.g. Semantic Maps displays data from SMW on maps).
There end up being a lot of these extensions; they don't work well with
each other, and they are time-consuming to maintain.


I know about the existence of the Scribunto extension, but I think that
this problem can be solved in another, more natural way. I suggest
using PHP code in wiki pages, in the same way as it is used in html
files. In this case, extensions can be unified. For example, get the
data from DynamicPageList, process it if necessary, and pass it on to
other extensions for display, such as Semantic Result Formats. This will
give users more freedom for creativity.


In order to execute PHP code safely, I decided to try to make a
controlled environment. I wrote it in pure PHP; it is lightweight and
in future could be included in the core. It can be viewed as the
Foxway extension. The first version is in the master branch. It gives an
idea of what is possible to do in principle, and there's even something
like a debugger. It does not work very quickly, so I decided to try to
fix that in the develop branch. There I created two classes, Compiler and
Runtime.


The first one processes PHP source code and converts it into a set of
instructions that the Runtime class can execute very quickly. I took a
part of the code from phpunit tests to check the performance. On my
computer, pure PHP executes them on average in 0.0025 seconds, and the
Runtime class in 0.05 seconds; that is 20 times slower, but there is
room for even better results. I do not count the time taken by the
Compiler class, because it only needs to run once, when a wiki page is
saved. The data returned from this class can be serialized and stored
in the database. Also, if all dynamic data is handled as PHP code, the
wiki markup can be converted into html on save and stored in the
database. Thus, when a wiki page is requested from the server, it will
not be necessary to build it every time (I know about the cache). Take
the already prepared data (for Runtime and html) and enjoy. A cache is
certainly still necessary, but only for pages with dynamic data, and
the lifetime of the objects in it can be greatly reduced, since
performance will be higher.
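
A rough sketch of the pattern being described (the class names, method
names and cache interface are illustrative, not Foxway's actual API):

    // On page save (hypothetical): compile once, cache the instructions.
    $instructions = FoxwayCompiler::compile( $phpSourceFromPage );
    $cache->set( 'foxway:' . $pageId, serialize( $instructions ) );

    // On page view (hypothetical): skip compilation, just execute.
    $instructions = unserialize( $cache->get( 'foxway:' . $pageId ) );
    $runtime = new FoxwayRuntime();
    echo $runtime->run( $instructions );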


I also have other ideas associated with the features that this
implementation provides. I have already made some steps in this
direction and I think that all of this is realistic and useful.
I'm not saying that Foxway is ready for use. It shows that this idea can
work, and can work fast enough. It needs to be rewritten to make it
easier to maintain, and I believe that it can work even faster.


I did not invent anything new. We all use html + PHP. Wiki markup
replaces difficult html and provides security, but what can replace
the scripting language?


I would like to know your opinion: is it really useful, or am I wasting
my time?


Best wishes. Pavel Astakhov (pastakhov).



Do not consider Foxway a PHP interpreter.
Consider it a faster and more powerful alternative to Magic words.


Why is it faster?

1) Foxway compiles the source code (Magic words do not). It does not
matter whether the source is PHP, Lua or wiki markup; it could be
anything. What is important is that the compiled code is executed very
quickly (20 times slower than pure PHP).


2) Foxway itself does not touch the data being operated on; it can be
anything. (Magic words handle all data only as strings.) It's safe,
because Foxway can only transmit the data. All responsibility lies with
the extensions that process the data. All built-in functions that
process data are built-in extensions and can be disabled or replaced by
others; they were integrated for convenience only. Foxway uses the echo
function to push data to the html page. This is a pre-sanitized string,
or data that another extension provides under its own responsibility
(just like Magic words).


Why is it more powerful?

1) All extensions can be unified. They will do one thing, but do it 
well. Such extensions will be easier to maintain and develop. We all get 
a lot of freedom to choose what to use and how. This is the Unix philosophy.


2) Huge number of functions already done, just use them.

3) It should look like a scripting language, and it does. Magic
words look very confusing and are inconvenient for heavier use.


Why does Foxway understand the syntax of PHP code and not another,
such as Lua or Python?

Indeed, it could understand any syntax, but I chose PHP for several reasons.
1) MediaWiki is written in PHP, and PHP provides the lexical analysis of
the PHP source code and