[Wikitech-l] Ajaxify page requests extension

2016-05-06 Thread Cyken Zeraux
I've been working on a little project that stops normal wiki links from
reloading the entire page and instead requests the page with Ajax,
separating out the content and replacing it on the page. So far, I've
gotten this to work with only the main content; however, I need to also be
able to safely grab 'mw-navigation' and 'footer', which also seem to be
standardized in most skins.

The only problem with that is that client-side JS cannot safely parse the
returned pages for all three elements, because users can place replica
elements with the same ID in the page, and DOMDocument is not optimized
or well supported.

I would like to be able to grab all of the page content before it's echoed
out, load it into DOMDocument in PHP, grab my elements, and echo out JSON
of them. The client-side JS plugin would then refresh those elements.

I've done a good bit of trial and error and research in the MediaWiki docs
for this, and it seems this isn't currently supported, because the active
skin is what echoes out the entire page, not MediaWiki itself.

Am I wrong in my findings, and are there any plans to make MediaWiki handle
pages and echo them out? It would break the current standard, but I feel
that having skins build up an output string and pass it to MediaWiki,
rather than delivering the content themselves, is a better approach.

Thank you for your time.
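The server-side idea can be sketched roughly as follows (Python here for illustration; MediaWiki itself would use PHP's DOMDocument). The class and function names are made up for the sketch, and it only handles well-formed skin output; the ids are the standard 'content', 'mw-navigation', and 'footer':

```python
import json
from html.parser import HTMLParser

class ElementGrabber(HTMLParser):
    """Captures the outer HTML of the first element carrying each wanted id."""

    # Void elements never get a closing tag, so they must not affect depth.
    VOID = {'area', 'base', 'br', 'col', 'embed', 'hr', 'img', 'input',
            'link', 'meta', 'source', 'track', 'wbr'}

    def __init__(self, wanted_ids):
        super().__init__(convert_charrefs=False)
        self.wanted = set(wanted_ids)
        self.captured = {}   # id -> list of raw HTML fragments
        self.stack = []      # (id, depth) of elements currently being captured
        self.depth = 0

    def _emit(self, fragment):
        # Append a fragment to every element we are currently inside.
        for cap_id, _ in self.stack:
            self.captured[cap_id].append(fragment)

    def handle_starttag(self, tag, attrs):
        self._emit(self.get_starttag_text())
        if tag in self.VOID:
            return
        self.depth += 1
        elem_id = dict(attrs).get('id')
        # Only the first occurrence of an id is captured.
        if elem_id in self.wanted and elem_id not in self.captured:
            self.captured[elem_id] = [self.get_starttag_text()]
            self.stack.append((elem_id, self.depth))

    def handle_startendtag(self, tag, attrs):
        self._emit(self.get_starttag_text())  # e.g. <br/>; no depth change

    def handle_endtag(self, tag):
        if tag in self.VOID:
            return
        self._emit('</%s>' % tag)
        self.stack = [(i, d) for i, d in self.stack if d != self.depth]
        self.depth -= 1

    handle_data = _emit

    def handle_entityref(self, name):
        self._emit('&%s;' % name)

    def handle_charref(self, name):
        self._emit('&#%s;' % name)

def page_fragments(html, ids=('content', 'mw-navigation', 'footer')):
    """Return a JSON payload mapping each wanted id to its outer HTML."""
    parser = ElementGrabber(ids)
    parser.feed(html)
    return json.dumps({i: ''.join(parts) for i, parts in parser.captured.items()})
```

The client-side plugin would then request this JSON and swap each fragment into the live page by id, instead of re-parsing a full HTML response in the browser.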
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Discovery Weekly Update for the week starting 2016-05-02

2016-05-06 Thread Chris Koerner
Greetings,
I hope you had a good week.

Here are the updates from the Discovery department for the week.

* Wikipedia.org portal A/B test for adding descriptive text to the
sister wiki links went live on May 3, 2016.
* Updated statistics for wiktionary.org on May 3, 2016.
* Released a few minor fixes to the Portal on May 3, 2016.
* New version of Blazegraph and WDQS deployed, including geospatial search.
* New completion suggester code was reverted from the planned Elasticsearch
2.x release, affecting our plans for runtime updates to the suggester.
* It is now possible to use SPARQL queries from interactive graphs.
* Deb attended the Reading department offsite this week. Initial response
was that it was a great opportunity to learn and discover new ways we can
collaborate in the future.



Feedback and suggestions on this weekly update are welcome.

The full update, and an archive of past updates, can be found on MediaWiki.org:

https://www.mediawiki.org/wiki/Discovery/Status_updates

-- 
Yours,
Chris Koerner
Community Liaison - Discovery
Wikimedia Foundation

[Wikitech-l] MediaWiki-Codesniffer 0.7.1 released

2016-05-06 Thread Legoktm
Hello!

MediaWiki-Codesniffer 0.7.1 is now available for use in your MediaWiki
extensions and other projects. 0.7.0 was a botched release that had
bugs; please don't use it. Here are the notable changes since the
last release (0.6.0):

* Also check for space after elseif in SpaceAfterControlStructureSniff
(Lethexie)
* Factor out tokenIsNamespaced method (addshore)
* Make IfElseStructureSniff detect and fix multiple white spaces
after else (Lethexie)
* Make SpaceyParenthesisSniff fix multiple white spaces between
parentheses (Lethexie)
* Make the spacey parenthesis sniff work with short array syntax (Kunal Mehta)
* Speed up PrefixedGlobalFunctionsSniff (addshore)
* Update squizlabs/php_codesniffer to 2.6.0 (Paladox)

Thanks to some awesome work by Addshore, MW-CS is now twice as fast for
large repos like mediawiki/core!

This release also features contributions from GSoC student Lethexie who
will be working on improving our static analysis tools this summer!

-- Legoktm


Re: [Wikitech-l] REL1_27 branches up

2016-05-06 Thread Adam Baso
Chad and Chris, thanks for the update. We'll keep the thread posted.

On Thu, May 5, 2016 at 11:25 AM, Chris Steipp  wrote:

> On Thu, May 5, 2016 at 8:50 AM, Chad  wrote:
>
> > On Thu, May 5, 2016 at 8:19 AM Gergo Tisza  wrote:
> >
> > > On Thu, May 5, 2016 at 4:31 PM, Chad  wrote:
> > >
> > > > Well then it sounds like it won't make the 1.27 release. We've known
> > > > this branching was coming for the last 6 months :)
> > > >
> > >
> > > Is there a way to do a backport before the release candidate gets
> > > published? The problem with doing major changes after an LTS release is
> > > that both the old and the new versions have to be maintained for the full
> > > support cycle.
> > >
> > >
> > If it can get done in time, sure. I know there's a strong desire to land
> it
> > prior to the release but we can't hold it up forever :)
> >
> >
> There were very minor changes I suggested to the patches that Gergo was
> cleaning up earlier this week (like, can we not call the API module
> ResetPassword when the special page is PasswordReset). At this point, I'm
> fine if it just gets merged.

Re: [Wikitech-l] 2016-05-04 Scrum of Scrums meeting notes

2016-05-06 Thread Željko Filipin
On Thu, May 5, 2016 at 6:27 PM, Jon Robson  wrote:

> Yes. Thanks. I think I've done my bit. Let me know if there is
> anything else you need from me.
>

Thanks Jon. I will add you to a few gerrit patches.

Željko

Re: [Wikitech-l] garbage characters show up when fetching wikimedia api

2016-05-06 Thread Marius Hoch

Hi,

that sounds like https://phabricator.wikimedia.org/T133866.

Cheers

Marius

On 05.05.2016 21:56, Trung Dinh wrote:

Hi all,
I have an issue when trying to parse data fetched from the Wikipedia API.
This is the piece of code that I am using:

api_url = 'http://en.wikipedia.org/w/api.php'
api_params = ('action=query&list=recentchanges&rclimit=5000&rctype=edit'
              '&rcnamespace=0&rcdir=newer&format=json&rcstart=20160504022715')

f = urllib2.Request(api_url, api_params)
print ('requesting ' + api_url + '?' + api_params)
source = urllib2.urlopen(f, None, 300).read()
source = json.loads(source)

json.loads(source) raised the following exception: "Expecting ',' delimiter:
line 1 column 817105 (char 817104)"

I tried to use source.encode('utf-8') and some other encodings, but none
of them helped.
Do we have any workaround for that issue? Thanks :)
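For what it's worth, an error like this at char 817104 of a large response usually points at a body that was truncated or corrupted in transit (consistent with the bug Marius links above), not at a text-encoding problem, which would explain why re-encoding didn't help. A minimal reproduction of the failure mode, with made-up sample JSON:

```python
import json

# A complete response parses fine.
good = '{"query": {"recentchanges": [{"title": "A"}, {"title": "B"}]}}'
assert json.loads(good)["query"]["recentchanges"][1]["title"] == "B"

# Cut the body off mid-stream, as a dropped or mangled transfer would.
truncated = good[:43]
try:
    json.loads(truncated)
    failed = False
except ValueError as err:
    failed = True
    message = str(err)

# Same shape of error as in the original post: the reported column points
# at the first byte where the data stops looking like valid JSON.
assert failed and "Expecting ',' delimiter" in message
```

If that is the cause, no amount of decoding on the client can fix it; the fetch itself has to be retried or the server-side bug fixed.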