Re: [Wikitech-l] Writing unit tests for extensions that use dbr calls

2017-09-28 Thread Kevin Israel
On 09/27/2017 11:31 AM, Jim Hu wrote:
> Query: SELECT  
> annotation_id,row_id,original_row_data,annotation_timestamp,user_id,team_id,session_id,annotation_inning
>   FROM `unittest_cacao_annotation` INNER JOIN `unittest_cacao_user` ON 
> ((annotation_user = cacao_user.id))   WHERE annotation_id = "6057"  
> Function: CacaoModelAnnotation::load
> Error: 1054 Unknown column 'cacao_user.id' in 'on clause' (localhost)
MediaWikiTestCase sets $wgDBprefix to cause database queries to refer to
temporary tables instead of the original tables. If your code does not
handle table prefixes correctly, you can get such an error message.
>   'cacao_user' => array('INNER JOIN', 
> 'annotation_user = cacao_user.id' ),
To make a query like this work with a table prefix, you should alias the
referenced table ("cacao_user" in this case), then in the condition,
refer to the table using the alias:

$result = $dbr->select(
    array( 'cacao_annotation', 'u' => 'cacao_user' ),

    [...]

    'u' => array( 'INNER JOIN', 'annotation_user = u.id' ),
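
Putting it together, a complete call might look something like this
(an untested sketch, not the extension's actual code; the column list
is abbreviated, and on MediaWiki older than 1.28 use DB_SLAVE instead
of DB_REPLICA):

    $dbr = wfGetDB( DB_REPLICA );
    $result = $dbr->select(
        // 'u' is an alias, so the join condition keeps working
        // when $wgDBprefix is set (as it is in unit tests)
        array( 'cacao_annotation', 'u' => 'cacao_user' ),
        array( 'annotation_id', 'original_row_data', 'user_id' ),
        array( 'annotation_id' => 6057 ),
        __METHOD__,
        array(),
        array( 'u' => array( 'INNER JOIN', 'annotation_user = u.id' ) )
    );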

-- 
Kevin Israel -- Wikipedia editor, MediaWiki developer
https://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The right way to inject user-supplied JavaScript?

2015-12-29 Thread Kevin Israel
On 12/29/2015 03:59 PM, Daniel Barrett wrote:
> I run a private wiki for developers (not accessible from the Internet) that 
> lets any wiki page author run JavaScript on a page by adding a tag:
> 
>  alert("hi"); 
> 
> (We understand  the security implications, which is why the wiki isn't 
> accessible by the world.) When we upgraded to MediaWiki 1.26 (from 1.24), a 
> problem occurred: the  tag stopped recognizing the "mediawiki" 
> and "mw" objects, but otherwise works. The following code reports an 
> undefined variable "mw":
> 
>  mw.loader.using() 
> 
> I assume this is because the  extension builds a 

Re: [Wikitech-l] [BREAKING CHANGE] Plans to move Cite configuration from wikitext messages to CSS styles

2014-12-16 Thread Kevin Israel
On 12/16/2014 07:45 AM, Brad Jorsch (Anomie) wrote:
> On Tue, Dec 16, 2014 at 5:44 AM, Marc Ordinas i Llopis
> <marc...@wikimedia.org> wrote:
>
>> Due to how MediaWiki's messages system works, changes to the display
>> styles need to be copied into each of the ~300 display languages for users,
>> else those users with different languages will see different reference
>> styles on the same page.
>
> That sounds like bug T33216, which was fixed a while ago. Does this
> actually occur now?

Though I'm not completely sure this is the one in question, for quite
some time there has been a known issue[1] with the way MediaWiki's
message fallback chain is implemented. A patch[2] to fix this was posted
for review, and I left some comments on it almost a year ago. There have
been no updates since.

[1]: Multiple reports:
 https://phabricator.wikimedia.org/T3495
 https://phabricator.wikimedia.org/T48579
 https://phabricator.wikimedia.org/T50956
 https://phabricator.wikimedia.org/T57473

[2]: https://gerrit.wikimedia.org/r/#/c/72867/

-- 
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to flag new messages as read using api

2014-04-21 Thread Kevin Israel
On 04/21/2014 07:49 AM, Bartosz Dziewoński wrote:
> On Mon, 21 Apr 2014 13:40:16 +0200, Petr Bena <benap...@gmail.com> wrote:
>
>> Using api.php?action=query&meta=userinfo&uiprop=rights|hasmsg I get
>> information about new messages.
>>
>> Now I can read it using the api, but that doesn't flag the talk page as
>> read. What do I need to do in order to flag it as read other than
>> running an external browser instead of the api?
>
> Use the action=setnotificationtimestamp API to mark your own talk page
> as 'visited'.

I looked at this a while ago and found that this won't get rid of the
orange bar (or its equivalent in the API). And this still seems to be true.

The relevant method is User::clearNotification() (which in turn calls
User::setNewtalk()). That API module doesn't call it, instead updating
the watchlist table directly to allow batching.

So the user_newtalk row (or for anons, the object cache entry) is not
deleted or updated.
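
If you really do need the flag cleared from code, the direct route is
the method mentioned above -- an untested sketch, with a placeholder
user name:

    // Clear the "new messages" flag the way User::clearNotification()
    // ultimately does; this removes the user_newtalk row.
    $user = User::newFromName( 'ExampleUser' ); // placeholder name
    if ( $user && $user->getNewtalk() ) {
        $user->setNewtalk( false );
    }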

-- 
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Config class and 1.23

2014-04-18 Thread Kevin Israel
Should the Config and GlobalConfig classes and the associated
RequestContext methods be reverted from 1.23 as an incomplete feature?
As far as I can tell, they are not yet used anywhere, so reverting them
should be easy.

getConfig() was added to IContextSource in 101a2a160b05[1]. Then
the method was changed to return a new class of object (Config) instead
of a SiteConfiguration object in fbfe789b987b[2]; however, the Config
class faces significant changes in I5a5857fc[3].

[1]: https://gerrit.wikimedia.org/r/#/c/92004/
[2]: https://gerrit.wikimedia.org/r/#/c/109266/
[3]: https://gerrit.wikimedia.org/r/#/c/109850/

-- 
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] importImages.php on symlinked windows wikis

2013-10-03 Thread Kevin Israel
On 10/03/2013 07:08 PM, James Montalvo wrote:
> I have several wikis all symlinked to a single prime wiki. Only the
> images, cache, and mw-config directories and LocalSettings.php file
> are unique for each wiki.
>
> Normally for maintenance scripts I just add --conf
> C:/path/to/LocalSettings.php to perform the maintenance script on a
> particular wiki but for importImages.php that does not entirely work. If I
> use it to upload to a particular wiki, the file-namespace pages get created
> on the correct wiki, but the actual files get put in the images directory
> of the prime wiki.
>
> Using --conf C:/path/to/LocalSettings.php does partially work. If I leave
> it off then the file-namespace pages get created in the prime wiki, instead
> of the intended wiki. So it appears --conf just isn't impacting file upload
> location.

The default value for $wgUploadDirectory is $IP/images (as determined
in includes/Setup.php), and the --conf option does not affect $IP. Have
you tried setting the MW_INSTALL_PATH environment variable to override
the $IP detection, or manually setting $wgUploadDirectory in the
appropriate LocalSettings.php file?
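
For the latter, something like this in each wiki's LocalSettings.php
should do (the paths are examples only, not your actual layout):

    // Point uploads at this wiki's own directory instead of the
    // prime wiki's $IP/images. Adjust both paths for your setup.
    $wgUploadDirectory = 'C:/path/to/thiswiki/images';
    $wgUploadPath = "$wgScriptPath/images"; // matching URL path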

> Any assistance would be greatly appreciated. I'm fairly new to MW
> development, and this is my first post on the mailing list.

This question might have been better suited for mediawiki-l.

-- 
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is assert() allowed?

2013-07-30 Thread Kevin Israel
On 07/30/2013 06:28 PM, Tim Starling wrote:
> On 31/07/13 07:28, Max Semenik wrote:
>> I remember we discussed using asserts and decided they're a bad
>> idea for WMF-deployed code - yet I see
>>
>> Warning: assert() [<a href='function.assert'>function.assert</a>]:
>> Assertion failed in
>> /usr/local/apache/common-local/php-1.22wmf12/extensions/WikibaseDataModel/DataModel/Claim/Claims.php
>> on line 291
>
> The original discussion is here:
>
> http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/59620
>
> Judge for yourself.

I'll further elaborate on the "[...] you have to put the source code
inside a string [...]" part. From the [documentation][1]:

> If the assertion is given as a string it will be evaluated as PHP
> code by assert().

As in: that function is just as evil as eval(), and the innocent looking

assert( $_GET[id] > 0 );

can actually be a security vulnerability, depending on server
configuration (yes, servers can be and are misconfigured). And when
assert() is used like this (yes, there actually is one of these in
WikibaseDataModel):

assert( $this->functionFromSuperclass() );

it might be necessary to check multiple files to verify that a string
is not passed to assert().

Perhaps it might make sense to do

assert( (bool)( ... ) );

though, as pointed out previously, this really is no better than, say:

if ( !( ... ) ) {
throw new MWException( '...' );
}

[1]: http://php.net/manual/en/function.assert.php

-- 
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Special:Upload

2013-06-10 Thread Kevin Israel
On 06/10/2013 04:55 PM, Lee Worden wrote:
> Hi -
>
> In the extension development I'm doing, I need a custom file-upload
> interface.  I'm building it around the existing Special:Upload, and as a
> side benefit I've been building a rewritten version of
> Extension:MultiUpload.
>
> In order to do this in what seems to me a reasonable, future-compatible
> way - particularly by calling Special:Upload's methods rather than
> duplicating their code in extension classes - I've needed to split out
> some of Special:Upload's code into separate functions that can be
> overridden in subclasses.  Those changes are in gerrit and bugzilla now:
> https://gerrit.wikimedia.org/r/#/c/67173/
> https://bugzilla.wikimedia.org/show_bug.cgi?id=48581
>
> I'm posting here in case people want to discuss those changes.  Ideally
> I'd like to backport that to 1.19 so I can support LTS users.

Only bug fixes are backported to released versions of MediaWiki - not
new features. https://www.mediawiki.org/wiki/Backporting_fixes

> Also, I'd like to submit my MultiUpload code for review, but I'm not
> sure how to do that, because it looks like Extension:MultiUpload hasn't
> been brought over from svn to gerrit.  I'd either submit it as a commit
> that replaces most of the extension's code, or propose it as a separate
> extension.  Please advise me...

https://www.mediawiki.org/wiki/Git/New_repositories/Requests seems to
be the place to request that a new repo be created. When creating a new
repo in Gerrit, Demon/Chad can import the SVN history for you.

> Thanks!
> Lee Worden

-- 
Kevin Israel - MediaWiki developer, Wikipedia editor
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] database encoding for field with mathematical expressions

2013-05-24 Thread Kevin Israel
On 05/23/2013 11:31 PM, phy...@physikerwelt.de wrote:
> Hi,
>
> I'm testing a new rendering option for the <math/> element and had
> problems storing MathML elements in the database field
> math_mathml, which is of type text.
> The MathML elements contain a wide range of Unicode characters, like
> INVISIBLE TIMES, which is encoded as 0xE2 0x81 0xA2 in UTF-8, or even
> 4-byte chars like MATHEMATICAL BOLD CAPITAL A (0xF0 0x9D 0x90 0x80).
> In some rare cases I had problems retrieving the stored value correctly
> from MySQL.
> To fix that problem I'm now using the PHP functions utf8_encode/decode,
> which is not a very intuitive solution.
> Do you know a better method to solve this issue without changing the
> database layout?
>
> Best
> Physikerwelt

If you use MySQL, when you installed MediaWiki (or created the table),
did you choose the "UTF-8" option instead of "binary"? The underlying
MySQL character set is utf8[1], which does not support characters
above U+FFFF (four-byte characters).

This is mentioned in the web installer (message 'config-charset-help'):

> In binary mode, MediaWiki stores UTF-8 text to the database in binary
> fields. This is more efficient than MySQL's UTF-8 mode, and allows
> you to use the full range of Unicode characters. In UTF-8 mode, MySQL
> will know what character set your data is in, and can present and
> convert it appropriately, but it will not let you store characters
> above the Basic Multilingual Plane[2].

MySQL 5.5 did introduce a new utf8mb4 character set, which does
support four-byte characters; however, MediaWiki does not currently
support that option (now filed as bug 48767).

The WMF of course has to use the 'binary' option (actually, UTF-8 stored
in latin1 columns, as mentioned in bug 32217) to allow storage
of all sorts of obscure characters from different languages.

utf8_encode()/utf8_decode() work around the problem because they replace
byte values 80 to FF with two-byte characters from U+0080 to U+00FF
(encoded as C2 80 to C3 BF), and the 'utf8' option does allow those
characters.
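
The round trip is easy to demonstrate in plain PHP:

    // utf8_encode() maps every input byte 0x00-0xFF to a code point no
    // higher than U+00FF, which the 'utf8' charset can store;
    // utf8_decode() reverses the mapping exactly.
    $mathml = "a\xE2\x81\xA2b"; // 'a', INVISIBLE TIMES (U+2062), 'b'
    $stored = utf8_encode( $mathml );
    var_dump( utf8_decode( $stored ) === $mathml ); // bool(true)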

[1]: https://dev.mysql.com/doc/refman/5.5/en/charset-unicode-utf8.html
[2]: http://en.wikipedia.org/wiki/Mapping_of_Unicode_character_planes
-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC How/whether MediaWiki could use ZendOptimizuerPlus -- ZendOptimizerPlus, opcode cache, PHP 5.4, APC, memcache ???

2013-03-22 Thread Kevin Israel
On 03/22/2013 07:16 AM, Platonides wrote:
> APC can do two things:
> 1) Keep the compiled php opcodes, so php execution is faster.
> 2) Allow the application to store values in the web server memory (kept
> across requests).
>
> ZendOptimizer only does 1. [...]
> The «APC is a must have for larger MediaWikis» is due to 1. In fact,
> wikimedia is not using APC for 2, but memcached.

With one exception: a [live hack][1] to use apc_inc() instead of rand()
to generate a 32-bit TRANS-ID for HTCP cache purging.
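
In outline, the hack amounts to something like this (my paraphrase of
the linked diff, not the exact production code; the key name is
illustrative):

    // Sequential TRANS-IDs from an APC counter, falling back to a
    // random value when APC is unavailable.
    function getHtcpTransId() {
        if ( function_exists( 'apc_inc' ) ) {
            apc_add( 'htcp-trans-id', mt_rand( 0, 0x7fffffff ) ); // seed once
            $id = apc_inc( 'htcp-trans-id' );
            if ( $id !== false ) {
                return $id & 0x7fffffff;
            }
        }
        return mt_rand( 0, 0x7fffffff );
    }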

Why is this hack in place? Is it particularly useful for [monitoring
packet loss][2]?

[1]:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blobdiff;h=897397e41fe14bdf7dcd02eb61a9744880a5e1a3;hb=b7bc01d0ccea6a6a817aed31d781ce6693ee9417;hpb=1256724550556e5e35810bb88b20ef87dbe1ce47

[2]:
https://svn.wikimedia.org/viewvc/mediawiki/trunk/udpmcast/htcpseqcheck.py?view=markup

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Live recent changes feed

2013-03-10 Thread Kevin Israel
On 03/10/2013 12:19 AM, Victor Vasiliev wrote:
> After recent discussion on this list I realized that this has been
> in discussion for as long as four years. I went WTF and decided to
> Just Go Ahead and Fix It. As a result, I made a patch to MediaWiki
> which allows it to output the recent changes feed in JSON:
> https://gerrit.wikimedia.org/r/#/c/52922/
>
> Also, I wrote a daemon which captures this feed and serves it
> through WebSockets and a simple text-oriented protocol [...]:
> https://github.com/wikimedia/mediawiki-rcsub
>
> This daemon is written in Python using Twisted and Autobahn and it
> takes ~200 lines of code (initial version took ~80).

One thing you should consider is whether to escape non-ASCII
characters (characters above U+007F) or to encode them using UTF-8.

Python's json.dumps() escapes these characters by default
(ensure_ascii = True). If you don't want them escaped (as hex-encoded
UTF-16 code units), it's best to decide now, before clients with
broken UTF-8 support come into use.

I recently made a [patch][1] (not yet merged) that would add an opt-in
UTF8_OK feature to FormatJson::encode(). The new option would
unescape everything above U+007F (except for U+2028 and U+2029, for
compatibility with JavaScript eval() based parsing).
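
The trade-off is easy to see with plain json_encode() on PHP 5.4+,
which exposes the same choice via JSON_UNESCAPED_UNICODE (this only
illustrates the behavior; it is not the patch itself):

    $summary = "撤销破坏"; // a short Chinese edit summary
    echo json_encode( $summary );
    // "\u64a4\u9500\u7834\u574f" -- 24 bytes of payload
    echo json_encode( $summary, JSON_UNESCAPED_UNICODE );
    // "撤销破坏" -- 12 bytes of payload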

> I hope that now getting recent changes via a reasonable format is a
> matter of code review and deployment, and we will finally get
> something reasonable to work with (with access from web
> browsers!).

I don't consider encoding 撤销由158.64.77.102于2013年1月22日 (二)
16:46的版本24659468中的繁简破坏 (90 bytes using UTF-8) as

\u64a4\u9500\u7531158.64.77.102\u4e8e2013\u5e741\u670822\u65e5
(\u4e8c)
16:46\u7684\u7248\u672c24659468\u4e2d\u7684\u7e41\u7b80\u7834\u574f
(141 bytes)

to be reasonable at all for a brand-new protocol running over an 8-bit
clean channel.

[1]: https://gerrit.wikimedia.org/r/#/c/50140/

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-10 Thread Kevin Israel
On 03/10/2013 06:03 PM, Bartosz Dziewoński wrote:
> A shallow clone certainly shouldn't be as large as a normal one.
> Something's borked.

--depth 0 is what's broken. --depth 1 works fine.

$ git clone --depth 1
https://gerrit.wikimedia.org/r/p/mediawiki/core.git core-shallow
Cloning into 'core-shallow'...
remote: Counting objects: 2815, done
remote: Finding sources: 100% (2815/2815)
remote: Getting sizes: 100% (2665/2665)
remote: Compressing objects:  61% (1631/2660)
remote: Total 2815 (delta 317), reused 1182 (delta 147)
Receiving objects: 100% (2815/2815), 17.87 MiB | 1.16 MiB/s, done.
Resolving deltas: 100% (342/342), done.

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Live recent changes feed

2013-03-10 Thread Kevin Israel
On 03/10/2013 06:27 PM, Victor Vasiliev wrote:
> On 03/10/2013 06:30 AM, Kevin Israel wrote:
>> On 03/10/2013 12:19 AM, Victor Vasiliev wrote:
>> One thing you should consider is whether to escape non-ASCII
>> characters (characters above U+007F) or to encode them using UTF-8.
>
> Whatever the JSON encoder we use does.
>
>> Python's json.dumps() escapes these characters by default
>> (ensure_ascii = True). If you don't want them escaped (as hex-encoded
>> UTF-16 code units), it's best to decide now, before clients with
>> broken UTF-8 support come into use.
>
> As long as it does not add newlines, this is perfectly fine protocol-wise.

If "Whatever the JSON encoder we use does" means that one day, the
daemon starts sending UTF-8 encoded characters, it is quite possible
that existing clients will break because of previously unnoticed
encoding bugs. So I would like to see some formal documentation of the
protocol.

>> I recently made a [patch][1] (not yet merged) that would add an opt-in
>> UTF8_OK feature to FormatJson::encode(). The new option would
>> unescape everything above U+007F (except for U+2028 and U+2029, for
>> compatibility with JavaScript eval() based parsing).
>
> The part between MediaWiki and the daemon does not matter that much
> (except for hitting the size limit on packets, and even then we are on
> WMF's internal network, so we should not expect any packet loss and
> problems with fragmentation). The daemon extracts the wiki name from the
> JSON it received, so it reencodes the change anyways in the middle.

It's good to know that it's quite easy to change the format of the
internal UDP packets without breaking existing clients -- that it's
possible to start using UTF-8 on the UDP side if necessary.

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Kevin Israel
On 03/06/2013 07:30 AM, Platonides wrote:
> On 06/03/13 13:24, Platonides wrote:
>> I just checked and there are 73 authors of the resources of MediaWiki
>> core. More than I expected, but not unworkable. We could relicense our
>> css and javascript as MIT, MPL, GPL-with-explicit-exception...
>
> I was going to provide the full list: [...]

Don't forget the 58 other authors of skins/ (although some commits
touching that path might not be to CSS or JS):

$ git log --format=format:%an --no-merges resources/ | sort -u > ../resources.txt
$ git log --format=format:%an --no-merges skins/ | sort -u > ../skins.txt
$ comm -23 ../skins.txt ../resources.txt
Adam Miller
Ævar Arnfjörð Bjarmason
Alex Shih-Han Lin
Alex Z
Anders Wegge Jakobsen
ankur
Arne Heizmann
Benny Situ
Charles Melbye
Daniel Cannon
Daniel Kinzler
Erik Moeller
Evan Prodromou
Gabriel Wicke
Guillaume Blanchard
Guy Van den Broeck
Huji
Ilmari Karonen
isarra
Jack Phoenix
Jan Luca Naumann
Jan Paul Posma
Jens Frank
Jimmy Collins
Jon Harald Søby
Jure Kajzer
karun
Katie Filbert
Laurence Parry
Leon Weber
Lisa Ridley
Lupin
Magnus Manske
Marcin Cieślak
Matt Johnston
Michael Dale
Mohamed Magdy
Nicholas Pisarro, Jr
Nick Jenkins
Nimish Gautam
Patrick Reilly
Philip Tzou
Platonides
Purodha B Blissenbach
Remember the dot
River Tarnell
Rob Church
Robert Stojnić
Rotem Liss
Ryan Schmidt
Shinjiman
SQL
Tobias
Tom Gilder
Tpt
Victor Vasiliev
X!
Zheng Zhu

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP Analyzer (now open-source!)

2013-03-04 Thread Kevin Israel
On 03/04/2013 10:35 AM, Tyler Romeo wrote:
 Well, recently PHP Analyzer, which is made by the same company as php-cs
 and does code logic analysis, was just open sourced. [...]

As far as I can tell, PHP Analyzer is a completely separate program
written by a different author (Johannes Schmitt), whose [hosted
service][1] also incorporates PHPCS. What, if anything, suggests he is
involved in the development of PHPCS at all?

[1]: https://scrutinizer-ci.com/
-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Merging wikitech and labsconsole on Thursday at 1 PM PDT

2013-02-27 Thread Kevin Israel
On 02/27/2013 09:25 AM, Tyler Romeo wrote:
 Is using rewrite a good idea, or would it be better to just redirect so
 that there's only one actual URI? (I don't have an answer to that, just
 asking.)

https://httpd.apache.org/docs/current/rewrite/flags.html#flag_r

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] ResourceLoader sanity check

2013-02-14 Thread Kevin Israel
On 02/14/2013 05:29 AM, Nikola Smolenski wrote:
 It starts by performing a quick sanity check that bails out if the
 current browser is not supported. [...]
 Browsers such as Internet Explorer 5 and early versions of Mozilla fall
 in this category.
 
 While I see that it works, I can't find where exactly in the code is
 this sanity check located. Any pointers?

This is located in resources/startup.js, although I only see a check
for IE < 6, not a check for early Mozilla versions. This file is loaded
by ResourceLoaderStartUpModule::getScript() using file_get_contents()
and becomes the beginning of the dynamically generated startup module.

git log led me to https://bugzilla.wikimedia.org/show_bug.cgi?id=35906
regarding the missing Mozilla check.

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making the query opensearch API working fine on a self hosted wikimedia server

2013-02-13 Thread Kevin Israel
Wikipedia database dumps do include the site's templates; however, you
need to install all the extensions listed under "Greffons de l'analyseur
syntaxique" ("parser hooks") on

http://fr.wikipedia.org/wiki/Sp%C3%A9cial:Version

In particular, the ParserFunctions extension is necessary for {{#if:
(used by many Wikipedia templates) to work correctly and not show up on
screen.

http://www.mediawiki.org/wiki/Extension:ParserFunctions
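
Installing it is one download plus one line in LocalSettings.php (the
pre-extension.json loader style current today):

    // Load ParserFunctions after extracting it into
    // extensions/ParserFunctions/
    require_once( "$IP/extensions/ParserFunctions/ParserFunctions.php" );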

On 02/13/2013 05:13 PM, Hicham TAHIRI wrote:
> Thanks Andre & Brian!
> In fact I didn't install any template!
> Is there a tutorial about that? Same for adding extensions, I didn't find
> a way to do that from the admin panel?
> Any link(s) will be welcome
>
> 2013/2/13 Andre Klapper <aklap...@wikimedia.org>
>
>> Wild guess: Your HTML is not well-formed (unclosed elements like <p> or
>> <small>) because many used templates (e.g. literal {{#if:
>> 2-7499-0796-9 in the Manoukian article) are missing / uninterpreted.
>> Templates used in the New York article don't include such markup.
>>
>> Make sure templates are installed and try again? :)

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anonymous user id on wikipedia?

2012-12-18 Thread Kevin Israel
On 12/18/2012 03:28 PM, Tyler Romeo wrote:
> 1) This I have no idea about, but it's definitely not in the core, because
> my test wiki doesn't set this cookie. It has to be an extension.

Merely calling the mediaWiki.user.id() JavaScript function, which was
introduced into core MediaWiki in
https://www.mediawiki.org/wiki/Special:Code/MediaWiki/78539 , sets the
one-year cookie. Nothing in core MW (except for the corresponding QUnit
test) actually uses the function.

However, the following code in
extensions/E3Experiments/experiments/openTask.js does call that
function. I can confirm this code is executed merely by loading
Wikipedia's Main Page.

    // FIXME for anons, calling mw.user.id() simply ensures the
    // mediaWiki.user.id cookie is set, if it isn't already.
    if ( !$.cookie( 'mediaWiki.user.id' ) ) {
        if ( mw.user.id() === mw.user.getName() ) {
            $.cookie( 'mediaWiki.user.id', generateId(),
                { expires: 365, path: '/' } );
        }
    }

> 3) That is done on purpose. It's a convenience feature. Notice how when you
> logout and then go back to the login page that your username is already
> filled in for you. AFAIK, it isn't used in any way by MediaWiki to identify
> the user.

Even if you do not check "Remember my login on this browser", the
username is saved for 180 days (which, by the way, is four times the
duration set out in the WMF privacy policy). As far as I can tell, this
feature has existed at least since the phase3 reorg in 2003, if not
before then.

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Anonymous user id on wikipedia?

2012-12-18 Thread Kevin Israel
On 12/19/2012 12:47 AM, Tyler Romeo wrote:
> Maybe I'm missing something, but where is the 180 days number coming from?
> When User::setCookies() sets the cookies, it gives it no expiry, so in
> reality the cookie persists until the browser removes it.

From User::setCookies():

    foreach ( $cookies as $name => $value ) {
        if ( $value === false ) {
            $this->clearCookie( $name );
        } else {
            $this->setCookie( $name, $value, 0, $secure );
        }
    }

From the doc comment for User::setCookie():

    @param $exp Int Expiration time, as a UNIX time value; if 0 or not
    specified, use the default $wgCookieExpiration

From WebResponse::setcookie():

    if ( $expire == 0 ) {
        $expire = time() + $wgCookieExpiration;
    }

From DefaultSettings.php:

    $wgCookieExpiration = 180*86400;

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Help with Gerrit

2012-12-16 Thread Kevin Israel
On 12/16/2012 08:44 AM, Leonard Wallentin wrote:
 
> Hello, here's from a Gerrit newbie: I get the following error message when
> running 'git review' on some changes, after following the instructions at
> http://www.mediawiki.org/wiki/Git/Workflow#How_to_submit_a_patch line by
> line. I have previously submitted a change (once) without any errors. What
> could be wrong? Where do I start looking?
>
> 2012-12-16 14:33:21.901478 Running: git push gerrit HEAD:refs/publish/master/addrealname
> Permission denied (publickey).
> fatal: The remote end hung up unexpectedly
>
> I have copied the contents of ~/.ssh/id_rsa.pub to the "SSH Public Keys" tab
> of my settings at gerrit.wikimedia.org

When you run `git remote -v`, you should get this line in the output:

gerrit  ssh://rot...@gerrit.wikimedia.org:29418/mediawiki/core.git (push)

The port number (29418) does matter; the SSH server running on the
standard port (22) is not the one built into Gerrit. See bug 35611.

If the port number (or your username, as indicated in Gerrit's settings)
is wrong, you need to change it in your Git configuration:

git remote set-url gerrit
ssh://rot...@gerrit.wikimedia.org:29418/mediawiki/core.git

 Thanks  for any hints you may be able to give me!/Leo

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Help with Gerrit

2012-12-16 Thread Kevin Israel
On 12/16/2012 02:50 PM, Kevin Israel wrote:
> The port number (29418) does matter; the SSH server running on the
> standard port (22) is not the one built into Gerrit. See bug 35611.

The "Permission denied (publickey)." error can also occur if your private
key has not yet been loaded into ssh-agent. See

https://www.mediawiki.org/wiki/Git/Workflow#Troubleshooting

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] want to contribute

2012-11-04 Thread Kevin Israel
On 11/04/2012 01:00 PM, Harsh Kothari wrote:
> Hello all
>
> I am Harsh Kothari from Gujarat, India. I want to contribute. I am more
> active on Gujarati Wikipedia. More than 3k+ edits are there and I have also
> made some wikibot scripts for doing various work on Gujarati Wikipedia. I
> want to solve bugs, create new scripts for Gujarati as well as English. So
> how can I start? I am expert in JavaScript and Python and have very good
> knowledge of PHP, JSON, XML etc. So please guide me how can I start. How I
> can solve the bugs?

Harsh,

Our homepage at https://www.mediawiki.org/ links to some information for
new contributors, including How to become a MediaWiki hacker.

If you have a question, even after reading our FAQ and help pages, feel
free to ask it in #mediawiki on irc.freenode.net.
https://www.mediawiki.org/wiki/MediaWiki_on_IRC

Thanks for offering to contribute!

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] want to contribute

2012-11-04 Thread Kevin Israel
On 11/04/2012 02:06 PM, Harsh Kothari wrote:
> Hi
> Thanks for your answer :). How can I fix or solve the bug on Gujarati
> Wikipedia?

Which bug? Has it been reported on Bugzilla? We can't answer your
question until you tell us about the bug you're referring to.

https://bugzilla.wikimedia.org/

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New Imagescaler disto/packages

2012-10-05 Thread Kevin Israel
On 10/02/2012 09:10 AM, Peter Youngmeister wrote:
> Hello all,
>
> I just enabled srv190 as an imagescaler running ubuntu 12.04 precise with
> new versions of imagemagick and librsvg.

Here's an error message I got. It looks like srv190's copy of rsvg
doesn't have our special security patch[1]:

<html><head><title>Error generating thumbnail</title></head>
<body>
<h1>Error generating thumbnail</h1>
<p>
Error creating thumbnail: Unknown option --no-external-files<br />

</p>
<!--
http://en.wikipedia.org/w/thumb_handler.php/f/f0/Consolidated_Contractors_Company_Logo.svg/200px-Consolidated_Contractors_Company_Logo.svg.png
-->
<!-- srv190 -->

</body>
</html>


> This is a test for upgrading all of our imagescalers to newer versions
> of... many things. Upgrading these boxes and these packages is often
> problematic, so please let me know if you notice any issues with the new
> setup.

So I'm letting you know.

> Thanks!
>
> --peter

[1]:
https://svn.wikimedia.org/viewvc/mediawiki/trunk/debs/librsvg/debian/patches/

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Reworking API help

2012-05-14 Thread Kevin Israel
On 05/14/2012 02:57 AM, Max Semenik wrote:
> On 14.05.2012, 7:04 Kevin wrote:
>> I propose moving API help to a new special page Special:ApiHelp, which
>> would run a few preg_replace operations and then the parser on
>> individual portions of the documentation to format them as HTML. The
>> combined output for all modules would be cached in memcached as in the
>> old ApiHelp class.
>
> https://en.wikipedia.org/wiki/Special:ApiSandbox

Are you suggesting the removal of the existing API help function from
core and the introduction of these missing features into ApiSandbox: a
clear, unambiguous "Must POST" indication, "Required" and "Deprecated"
markers for parameters, and a way of viewing default parameter values?

Currently, Special:ApiSandbox only serves as a supplement to, not a
direct replacement for, the complete documentation page, and the
documentation at MediaWiki.org will always be out of date or partially
irrelevant to an older version of MediaWiki a particular wiki might run
(along with the set of installed extensions).

Some API functionality is implemented in the form of specific parameters
part of more general modules. For example, one can use
action=query&meta=userinfo&uiprop=hasmsg to check for new messages, not
the (potentially) confusingly similar meta=allmessages. I am thinking of
users who have no prior knowledge of the API or of MediaWiki internals.
If users are truly familiar with MediaWiki internals, they might as well
just look at the PHP source code in Git.

Do you see a way that Special:ApiSandbox could accommodate a single-page
view of all API modules, so that API users can more easily determine
which parts of the API they need to use?

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Reworking API help

2012-05-13 Thread Kevin Israel
While considering adding syntax highlighting to the format=jsonfm output
of the API, I noticed the hackishness of the current API help page
system[1] and started working on a replacement that generates "[...] a
fully-HTML version of the help message"[2] while remaining
fully compatible with existing extensions.

I propose moving API help to a new special page Special:ApiHelp, which
would run a few preg_replace operations and then the parser on
individual portions of the documentation to format them as HTML. The
combined output for all modules would be cached in memcached as in the
old ApiHelp class.
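
Roughly, the caching part of that plan would look like this (a sketch
only; the key name, expiry, and generateHelpHtml() helper are
placeholders, not written code):

    global $wgMemc;
    $key = wfMemcKey( 'apihelp-html', SpecialVersion::getVersion() );
    $html = $wgMemc->get( $key );
    if ( $html === false ) {
        $html = $this->generateHelpHtml(); // hypothetical helper
        $wgMemc->set( $key, $html, 3600 );
    }
    $this->getOutput()->addHTML( $html );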

Here are a few questions about the best way to implement this:

1. Some members of ApiBase I need access to are marked protected, yet
   special pages have to subclass SpecialPage, not ApiBase. Which of
   these possible solutions is least hackish?

   (a) Generating the help page in an API module and then making
   an internal API request to execute that module when accessing
   the special page. The special page would show the result.

   (b) Internally calling action=paraminfo, individually requesting
   information on each API module to avoid encountering API limits.
   This would avoid duplicating lines 183-200 and 223-227 of
   includes/api/ApiParamInfo.php .

   (c) Adding an allmodules option to action=paraminfo, which would
   only be allowed for internal requests because I am unsure of how
   to cache the result.[3]
   This would have the same advantage as option (b).

2. In bug 26681[1], Sam Reed suggested moving ApiHelp out of core.
   I disagree. One of the main uses of the API is for coding bots
   and user scripts, which are a quicker and more convenient way to
   automate wiki processes than extensions that a server admin must
   install. Having accurate, easy-to-read documentation specific to
   the MediaWiki version and set of extensions is extremely useful
   when coding a bot or user script. So does API help really not
   belong in core?

3. Special:ApiHelp would need about ten CSS rules to display properly.
   Is creating a separate ResourceLoader module the norm in
   this situation?

4. To fit as many parameters on screen as possible, Special:ApiHelp
   would use a tabular layout similar to the current text-based output
   format. Is there any advantage to using definition lists over tables
   (or vice-versa), keeping in mind that CSS can style the definition
   list to appear in two columns?

5. Certain tags can apply to modules (i.e. "Read", "Write",
   "Must POST", "Can generate"), which will go in the module's heading.
   Therefore, I need to reduce the tags' font size to that of the
   body text similar to .editsection. Is there a good alternative to
   copying the .editsection code for each skin (or just using the
   percentages for Vector), given the limitations of CSS?

I would greatly appreciate your input.

[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=26681
[2]: quoted from includes/api/ApiFormatBase.php
[3]: https://bugzilla.wikimedia.org/show_bug.cgi?id=26680

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] WURFL licensing concerns and Git migration

2012-03-20 Thread Kevin Israel
Our MobileFrontend extension, which is currently deployed on Wikimedia
sites, uses WURFL to detect the mobile devices it targets. However, I
recently became aware the version of the WURFL data files we use has a
rather restrictive license.

http://tech.groups.yahoo.com/group/wmlprogramming/message/34311

The license seems to suggest we are not even supposed to redistribute
verbatim copies or install the data files on multiple servers rather
than only making "[...] one copy [...]", if it does not merely fail to
grant such permission. Currently, the files are in our Subversion
repository and are going to end up in Git soon.

I am not a lawyer, and I realize this is probably a matter for the
Wikimedia Foundation to handle, albeit one of urgent importance to us.
If I am not mistaken, proper removal of infringing material from Git
repositories is somewhat painful in that it causes all child SHA-1
hashes to change, so I feel resolution of the above licensing concern
blocks Git migration of at least the MobileFrontend extension.

-- 
Wikipedia user PleaseStand
http://en.wikipedia.org/wiki/User:PleaseStand

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l