Re: [Wikitech-l] InstantClick

2014-02-09 Thread Marco Fleckinger



On 02/09/2014 04:03 AM, Brandon Harris wrote:


Mobile users don’t have hover effects, and thus can only use this on
click, which is how things get loaded anyway.


Samsung's Note series has supported hover events (via the S Pen) since the
Galaxy Note 2.
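
For context, InstantClick-style preloading works roughly like this: start
fetching a page the moment the pointer hovers a link, so the hover-to-click
gap hides part of the request latency; on touch devices the nearest
equivalent is touchstart, which fires shortly before the synthesized click.
A minimal sketch of the idea in TypeScript -- not InstantClick's actual
code, and the fetch-into-cache behaviour is an assumption:

    // Best-effort preload so the page lands in the browser's HTTP cache.
    function preload(url: string): void {
      fetch(url, { credentials: "same-origin" }).catch(() => { /* ignore */ });
    }

    document.querySelectorAll<HTMLAnchorElement>("a[href]").forEach((link) => {
      // Desktop: the gap between hover and click is the head start.
      link.addEventListener("mouseover", () => preload(link.href), { once: true });
      // Touch devices: touchstart fires shortly before the click.
      link.addEventListener("touchstart", () => preload(link.href),
        { once: true, passive: true });
    });

On hover-capable pens like the S Pen, the mouseover path works as on desktop.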

--Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Status of the new PDF Renderer

2014-01-28 Thread Marco Fleckinger

Hi,

On 2014-01-23 19:38, Matthew Walker wrote:


If you want to set this up locally, I can help with that if you jump on IRC:
#mediawiki-pdfhack on freenode. I'm mwalker.

Thank you very much for helping me install the stack. Although the project 
is at an early stage, it is working quite well. We had been looking for 
such a solution for months; this is the first one that does the job 
we need.


Yesterday I showed the output to my employer, who was delighted with the 
result. So, many compliments from our company.


Cheers,

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] new PDF Renderer - mw-ocg-bundler issue

2014-01-27 Thread Marco Fleckinger


C. Scott Ananian canan...@wikimedia.org wrote:
I don't happen to know the Fedora packages, but I'd be glad to add
them to the README when you get a set which works.

could you also add it to the article:

https://mediawiki.org/wiki/PDF_rendering/Installation/en

Unfortunately, the hard part is identifying all the font packages,
as package names for non-Latin fonts seem to be wildly inconsistent
even between different Ubuntu releases. :(

Hm, interesting. Concerning TeX, everything went quite smoothly for me on Debian, 
although Matthew shared Ubuntu's package names. Is this relevant to the article as well?

Marco
-- 
This message was sent from my Android mobile phone with K-9 Mail.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Status of the new PDF Renderer

2014-01-19 Thread Marco Fleckinger

Hi Matthew,

great work, thank you for sharing. In my company we need an extension 
like this. A few months ago I had no success finding a solution that 
accepts UTF-8-encoded Unicode characters greater than 0x7F inside URLs.


On the test wiki I couldn't find an article that was not a redirect to 
another article, so I just created the article [1] and tried whether it 
is possible to render it as PDF.


Though this is just the result of one sprint and there are still some 
bugs, it was mostly possible to render the article. Compared to my 
earlier experiences, that is a really good result.


Is it also possible to set this up behind a firewall?

[1] 
http://ocg-collection-alpha.wmflabs.org/index.php/Test_German_Umlauts_%C3%A4%C3%B6%C3%BC%C3%84%C3%96%C3%9C%C3%9F%E2%86%92%E2%80%93%E2%80%9E%E2%80%9C


Cheers,

Marco

On 01/18/2014 03:42 AM, Matthew Walker wrote:

All,

We've just finished our second sprint on the new PDF renderer. A
significant chunk of renderer development time this cycle went into non-Latin
script support, as well as puppetization and packaging for deployment. We
have a work-in-progress pipeline up and running in labs, which I encourage
everyone to go try and break. You can use the following featured articles
just to see what our current output is:
 * http://ocg-collection-alpha.wmflabs.org/index.php/Alexis_Bachelot
 *
http://ocg-collection-alpha.wmflabs.org/index.php/Atlantis:_The_Lost_Empire

Some other articles imported on that test wiki:
 * http://ur1.ca/gg0bw

Please note that some of these will fail due to known issues noted below.

You can render any page in the new renderer by clicking the sidebar link
"Download as WMF PDF"; if you use "Download as PDF" you'll be using the old
renderer (useful for comparison). Additionally, you can create full books
via Special:Book -- our renderer is "RDF to LaTeX (PDF)" and the old
renderer is "e-book (PDF)". You can also try out the "RDF to Text (TXT)"
renderer, but that's not on the critical path. As of right now we do not
have a Bugzilla project entry, so reply to this email or email me directly
-- to debug we'll need one of: the name of the page, the name of the
collection, or the collection_id parameter from the URL.

There are some code bits that we know are still missing that we will have
to address in the coming weeks or in another sprint.
 * Attribution for images and text. The APIs are done, but we still need
to massage that information into the document.
 * Message translation -- right now all internal messages are in English,
which is not so helpful to non-English speakers.
 * Things using the cite tag and the Cite extension are not currently
supported (meaning you won't get nice references.)
 * Tables may not render at all, or may break the renderer.
 * Caching needs to be greatly improved.

Looking longer term at deployment on wiki, my plans right now are to get
this into beta labs for general testing and to connect test.wikipedia.org up
to our QA hardware for load testing. The major blocker there is acceptance
of the Node.js 0.10 and TeX Live 2012 packages into reprap, our internal
apt package source. This is not quite as easy as it sounds: we already
use TeX Live 2009 in production for the Math extension, and we must apply
thorough tests to ensure we do not introduce any regressions when we update
to the 2012 package. I'm not sure what the actual dates for those migrations /
testing will be, because that greatly depends on when Ops has time. In the
meantime, our existing PDF cluster based on mwlib will continue to serve
our offline needs. Once our solution is deployed and tested, mwlib
(pdf[1-3]) will be retired here at the WMF and print-on-demand services
will be provided directly by PediaPress servers.

For the technically curious: we're approximately following the Parsoid
deployment model -- using Trebuchet to push out a source repository
(services/ocg-collection) that has the configuration and node dependencies
built on tin, along with git submodules containing the actual service code.

It may not look like it on the surface, but we've come a long way and it
wouldn't have been possible without the (probably exasperated) help from
Jeff Green, Faidon, and Ori. Also big thanks to Brad and Max for their
work, and Gabriel for some head thunking. C. Scott and I are not quite off
the hook yet, as indicated by the list above, but hopefully soon enough
we'll be enjoying the cake and cookies from another new product launch.
(And yes, even if you're remote if I promised you cookies as bribes I'll
ship them to you :p)

~Matt Walker



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Analytics] Fwd: Page view stats we can believe in

2013-02-14 Thread Marco Fleckinger

Hi,

On 02/14/2013 12:04 PM, Federico Leva (Nemo) wrote:


Why are those bots not using the API, by the way?


One possible reason I can imagine:

Maybe because the API is turned off on many private wikis. I have seen 
spam on such wikis as well, so obviously there is a spam framework that 
doesn't need the API.


Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-13 Thread Marco Fleckinger



On 02/12/2013 05:30 PM, Mark A. Hershberger wrote:

On 02/11/2013 11:25 AM, Daniel Barrett wrote:

Imagine if Wikipedia had a separate wiki for every city in the world. The same 
problem would result.


I find it is easier to imagine what would happen if each language had a
separate Wikipedia.  We would end up with slightly different facts
maintained on each wiki.

Come on, this would be a discussion similar to the one about what the NPOV 
is concerning the Falkland Islands on the English and the Spanish Wikipedia. 
IMHO each community should organize its wiki on its own. Meta, 
MediaWiki, Commons and Wikidata already have interlanguage communities, 
and I think that works reasonably well.


Wikidata will be a bit different, because it will integrate itself into 
the wikis' structures. Therefore I think there will be discussion. 
So it's really great that the developers give the communities the choice 
of whether to use Wikidata or not.


Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Font at dv.wikipedia and dv.wiktionary

2013-02-13 Thread Marco Fleckinger

Hi,

On 02/12/2013 10:25 PM, Ryan Kaldari wrote:

We don't have to limit ourselves to free license fonts for what we use
on our own servers. We only have to limit ourselves to free license
fonts for what we distribute with our software. Of course, we should
always try to support free license fonts when they are available, but
there is no reason for us to artificially limit ourselves to free fonts
for our own projects (assuming the licensing fees are reasonable).

So, what about Gill Sans MT, which is used by Wikimedia and its chapters? Do we 
have a license that also covers using it on non-M$ operating systems?


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deployment of the first phase of Wikidata on enwp

2013-02-13 Thread Marco Fleckinger
Congrats, also on this list :-)

Cheers,

Marco



Lydia Pintscher lydia.pintsc...@wikimedia.de wrote:

Heya :)

Third time's a charm, right? We're live on the English Wikipedia with
phase 1 now  \o/
Details are in this blog post:
http://blog.wikimedia.de/2013/02/13/wikidata-live-on-the-english-wikipedia
An FAQ is being worked on at
http://meta.wikimedia.org/wiki/Wikidata/Deployment_Questions
Thanks everyone who helped! I'm happy to answer questions at
http://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical).
Please also let me know about any issues there.


Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Society for the Promotion of Free Knowledge e. V.

Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Re: How to speed up the review in gerrit?

2013-02-13 Thread Marco Fleckinger



On 02/14/2013 02:12 AM, Matthew Flaschen wrote:

On 02/13/2013 07:57 PM, Chad wrote:

On Wed, Feb 13, 2013 at 4:35 PM, Sumana Harihareswara
suma...@wikimedia.org  wrote:

Maybe our rule should be: if an extension is not
deployed on Wikimedia sites, then we should basically allow anyone to
merge new code in (disallowing self-merges), unless the existing
maintainers object.



Having a "can review all extensions" group is easy, but allowing for
exemptions will be a pain to manage the ACLs for. For every extension
that opts out of being reviewed by this group, we'd have to adjust its
ACL to block the inherited permissions.


How about instead of "can review all extensions", we make it easier to
request review rights on non-WMF extensions?

Good idea, but couldn't there just be three or more different classes of 
extensions? The class could be determined by an extension's importance, 
e.g. whether it is installed on WMF sites, the number of other wikis 
using it, etc.



You still have to ask for each extension you want, but if the
maintainer's okay with it (or not around), the burden of proof is less.

No one really needs review rights on *all* the non-deployed extensions,
only the few they work on.

Matt Flaschen



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Re: How to speed up the review in gerrit?

2013-02-13 Thread Marco Fleckinger



On 02/14/2013 03:28 AM, Chad wrote:

On Wed, Feb 13, 2013 at 9:22 PM, Marco Fleckinger
marco.fleckin...@wikipedia.at  wrote:

Having a "can review all extensions" group is easy, but allowing for
exemptions will be a pain to manage the ACLs for. For every extension
that opts out of being reviewed by this group, we'd have to adjust its
ACL to block the inherited permissions.



How about instead of "can review all extensions", we make it easier to
request review rights on non-WMF extensions?


Good idea, but couldn't there just be three or more different classes of
extensions? The class could be determined by an extension's importance, e.g.
whether it is installed on WMF sites, the number of other wikis using it, etc.



Having classes of extensions is difficult to maintain from an ACL
standpoint. Permissions in Gerrit are directly inherited (and there's no
multiple inheritance), so things in mediawiki/extensions/* all have the
same permissions. So having rules that apply to only some of those
repositories requires editing ACLs for each repository in each group.


Sorry, I think you misunderstood me. I meant classes like:

* Used by WMF
* non-WMF very important
* non-WMF important
* non-WMF less important
* non-WMF unimportant

No multiple inheritance would be needed for this model.

Cheers,

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Integrating MediaWiki into MS SBS

2013-02-05 Thread Marco Fleckinger



On 02/05/2013 12:03 PM, Bináris wrote:

Failed. Our sysop is like a stone. The main argument is that we have a
Microsoft environment and no need of new development. I was advised to
use the SharePoint wiki instead, which is a pale mockery of MediaWiki. I give it
up. :-(((

"The farmer won't eat what he doesn't know." I don't know how popular 
this saying is in Hungary, but in German it's quite famous.


Sorry about that.

You may annoy him a little bit. :-D

For that you will need user credentials. BTW, is it already activated? 
Does your company even want to use it? If you already have some 
wiki content, you may need to convert it, and people will need to be 
introduced to it.


Just be creative somehow! :-)

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Integrating MediaWiki into MS SBS

2013-02-02 Thread Marco Fleckinger

Hi,

We're using this at another organization, though the servers are 
Debian-based. It is cool to have the same password for nearly 
everything. If you need it, I could extract the relevant 
information from our config for you.


Try to get your sysadmin to let you operate a VM, possibly with Linux, 
just as a test. Then nothing affects the original installation and 
you're independent.


Marco

On 02/02/2013 03:42 PM, Ryan Lane wrote:

On Fri, Feb 1, 2013 at 6:29 PM, Jay Ashworth j...@baylink.com wrote:


- Original Message -

From: Dan Andreescu dandree...@wikimedia.org



The following manual seems to be the most actively maintained guide
for getting MediaWiki installed on Windows:

http://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Windows

If you run into any problems, I'd suggest adding them to the manual
along with any resolutions you or others come up with. Good luck!


I'm not sure he actually wants to run it on Windows.

He may just need SSO with Active Directory.

https://encrypted.google.com/search?q=mediawiki+active+directory



If that's the case, the LDAP extension works for that:

http://www.mediawiki.org/wiki/Extension:LDAP_Authentication

- Ryan


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcoming Runa Bhattacharjee

2013-01-28 Thread Marco Fleckinger


Welcome!

Languages are your topic? A really great playground!

Marco

Matthew Flaschen mflasc...@wikimedia.org wrote:

On 01/28/2013 04:10 PM, Alolita Sharma wrote:
 Runa believes newer platforms and devices used for content delivery and
 adoption through local languages make it an exciting time for growth of
 open source language tools.

Welcome!  Indeed, it's an exciting time.

Matt Flaschen



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Interactions between the visual editor and Wikidata

2013-01-11 Thread Marco Fleckinger

Hi,

I would assume that there will be something like a shortcut or a button 
like "integrate Wikidata property here", where you could search inline 
for the property.


Internally I could imagine a list in which all items are tracked, and a 
<span class="wikidata" id="wikidata_key"> element whose id attribute is 
the key into that internal list.
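
A minimal sketch of what I mean, in TypeScript (hypothetical names, not
actual VisualEditor code):

    // Track inline Wikidata values in a map keyed by the id of the span
    // that renders them, so an edit inside the span can be traced back
    // to the property it came from.
    const wikidataValues = new Map<string, { property: string; value: string }>();

    function registerValue(id: string, property: string, value: string): void {
      wikidataValues.set(id, { property, value });
      const span = document.getElementById(id);
      if (span) {
        span.className = "wikidata";
        span.textContent = value;
      }
    }

    // When the editor detects a change inside such a span, the id leads
    // back to the tracked item that would need updating on Wikidata.
    function onSpanEdited(id: string, newText: string): void {
      const entry = wikidataValues.get(id);
      if (entry) {
        console.log(`property ${entry.property}: "${entry.value}" -> "${newText}"`);
      }
    }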


Marco

On 01/11/13 21:32, Strainu wrote:

Hi,

Starting from a very different problem, I found myself asking a very
strange question: How will the Visual Editor interact with Wikidata?

Namely, let's say that somebody is editing the article about Romania
in the visual editor. The article contains an infobox with all the
data brought from Wikidata. The user will edit one of those fields
(say, the president's name). Will that change automatically be
reflected back to Wikidata? If not, will it be even possible to edit
those fields? If the change is pushed to Wikidata, how are conflicts
handled (2 users editing the president's name on different wikis
during a very crowded period when changes take a while to replicate
and one of them makes a mistake)?

I seem to remember that Wikidata edit-in-place is planned for phase
II, but I always assumed this would happen using some kind of popup
window, which kind of defeats the purpose of a visual editor.

Thanks,
   Strainu




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Interactions between the visual editor and Wikidata

2013-01-11 Thread Marco Fleckinger

[include wikidata-l]

On 01/11/13 23:27, Chris Steipp wrote:

Since we have CORS, JavaScript on the Wikipedia page could make the
update directly on Wikidata. I'd let the Wikidata / VE people decide
if that's desirable, but it's possible.

Maybe yes, but Wikidata allows different opinions. How do you decide 
whether the chunk of text replaced in the VE should replace the data on 
Wikidata, or whether it should just be added as another chunk of data?


Just a few examples:

1. Replacement:

a) The data gets clearer:
Shortly after a big accident or catastrophe relevant to all Wikipedias 
(including the German one ;-) ), we do not know the exact number of 
people killed or injured. Fukushima would be an example, or a big 
earthquake like the one in Haiti.


Over time an exact number emerges, because it takes a while to figure 
out, and the number changes as more people die later from injuries 
caused by the accident.


Articles are created very soon after such events. In the beginning they 
use the data circulated by the media. Over time the picture gets clearer 
and the data becomes more precise, so those values need to be changed 
as well.


b) Wrongly entered data:
Sometimes somebody enters wrong data, whether intentionally or just by 
accident. If you correct such data in the VE, you definitely have to 
replace it on Wikidata.


2. Additional Information:

There is an election in a country and the former president is not 
reelected. In this case the information should not be replaced on 
Wikidata; the new value should be added with a since:date qualifier, 
and the previous value should be amended with until:date.



In the end, all edits by authors, whether in the current editor or in 
the visual one, are just tracked as changes; but how do you decide 
whether to replace the information or add to it?
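
To make the two cases concrete, here is a minimal sketch in TypeScript of
the decision I mean; the data model is hypothetical, not the real Wikibase
API:

    // Hypothetical claim model: each claim may carry "since"/"until"
    // qualifiers, as in the election example above.
    interface Claim { value: string; since?: string; until?: string; }

    function applyEdit(claims: Claim[], newValue: string,
                       isCorrection: boolean, date?: string): Claim[] {
      const current = claims.find((c) => c.until === undefined);
      if (isCorrection || !current) {
        // Case 1: the old value was imprecise or wrong -- replace in place.
        if (current) current.value = newValue;
        else claims.push({ value: newValue });
      } else {
        // Case 2: a real change in the world -- close the old claim with
        // "until" and append the new one with "since", preserving history.
        current.until = date;
        claims.push({ value: newValue, since: date });
      }
      return claims;
    }

The hard part is exactly the isCorrection flag: nothing in a plain text
edit tells the software which case applies.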


Marco


On Fri, Jan 11, 2013 at 1:41 PM, Marco Fleckinger
marco.fleckin...@wikipedia.at  wrote:

Hi,

I would assume that there will be something like a shortcut or a button like
"integrate Wikidata property here", where you could search inline for the property.

Internally I could imagine a list in which all items are tracked, and a
<span class="wikidata" id="wikidata_key"> element whose id attribute is the key
into that internal list.

Marco


On 01/11/13 21:32, Strainu wrote:


Hi,

Starting from a very different problem, I found myself asking a very
strange question: How will the Visual Editor interact with Wikidata?

Namely, let's say that somebody is editing the article about Romania
in the visual editor. The article contains an infobox with all the
data brought from Wikidata. The user will edit one of those fields
(say, the president's name). Will that change automatically be
reflected back to Wikidata? If not, will it be even possible to edit
those fields? If the change is pushed to Wikidata, how are conflicts
handled (2 users editing the president's name on different wikis
during a very crowded period when changes take a while to replicate
and one of them makes a mistake)?

I seem to remember that Wikidata edit-in-place is planned for phase
II, but I always assumed this would happen using some kind of popup
window, which kind of defeats the purpose of a visual editor.

Thanks,
Strainu




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Fwd: Re: [Wikimedia-l] No access to the Uzbek Wikipedia in Uzbekistan

2012-12-27 Thread Marco Fleckinger



Do we have a spare machine left? Then we could set it up as a NAT router 
in place of another machine if we do not have a spare IP left. The 
original ports would then need to be forwarded through it.
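
To illustrate, here is a minimal userspace sketch in TypeScript (Node) of
the port-forwarding idea -- a stand-in for the iptables rules mentioned in
the quoted mail below; the port-to-host mapping is invented for the example:

    import * as net from "net";

    // One nonstandard listen port per wiki/language combo behind one IP.
    const portMap: Record<number, { host: string; port: number }> = {
      8001: { host: "10.0.0.11", port: 80 },
      8002: { host: "10.0.0.12", port: 80 },
    };

    for (const [listenPort, target] of Object.entries(portMap)) {
      net.createServer((client) => {
        const upstream = net.connect(target.port, target.host);
        client.pipe(upstream).pipe(client); // raw TCP relay, nothing rewritten
        client.on("error", () => upstream.destroy());
        upstream.on("error", () => client.destroy());
      }).listen(Number(listenPort));
    }

Note that this relays TCP as-is and never touches the HTTP Host header,
which is exactly the limitation Leslie describes below: the caches route on
Host, so a plain port forward only helps if each port really maps to a
single backend.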

Cheers

Marco 

 Original message 
From: Leslie Carr lc...@wikimedia.org
Sent: Fri Dec 28 00:03:33 CET 2012
To: Wikimedia Mailing List wikimedi...@lists.wikimedia.org
Subject: Re: [Wikimedia-l] No access to the Uzbek Wikipedia in Uzbekistan

On Thu, Dec 27, 2012 at 2:37 PM, Marco Fleckinger
marco.fleckin...@wikipedia.at wrote:




 Leslie Carr lc...@wikimedia.org wrote:

On Thu, Dec 27, 2012 at 1:39 PM, Marco Fleckinger
marco.fleckin...@wikipedia.at wrote:

 Just an idea, which is not very beautiful: What about a router
forwarding ports to the correct machine by using iptables? Would that
also work in connection with search engines?

Are you suggesting we use different nonstandard ports for each
different wiki/language combo that resides on the same IP?

 Yes exactly!


I guess that is theoretically possible with a more intrusive load
balancer in the middle. We need the Host information from the HTTP
header to be added, as we have our Varnish caches serving multiple
services, not one (or more) per language/project combo. I'm pretty
sure that LVS (which we use) doesn't have this ability. Some large
commercial load balancers have the ability to rewrite some headers,
but that would be a pretty intensive operation (think lots of CPU
needed, since it needs to terminate SSL and then rewrite headers) and
would probably be expensive. If you have another way you think we can
do this, I am all ears!

We may want to move this discussion to wikitech-l as all the technical
discussions probably bore most of the people on wikimedia-l

Leslie





-- 
Leslie Carr
Wikimedia Foundation
AS 14907, 43821
http://as14907.peeringdb.com/




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-14 Thread Marco Fleckinger

Hi Gabriel,

thank you for that information. Actually, I already knew of the project; 
that is why I could imagine the process I described. IMHO this doesn't 
need much i18n, because there is a defined syntax for wikitext and for HTML.


On 2012-12-13 20:57, Gabriel Wicke wrote:

On 12/13/2012 06:43 AM, Marco Fleckinger wrote:

Implementing this is not very easy, but developers may be able to reuse
some of the old ideas. Parsing in the other direction has to be implemented
from scratch, but it is easier because everything is in a tree, not in a
single text string.

Neither de- nor serializing involves any UI surface, so testing could be
done automatically quite easily by comparing the results of the conventional
and the new parser. The result of the serialization can be compared with the
original markup.


we (the Parsoid team) have been doing many of the things you describe in
the last year:


Ah, that was the project's name. ;-)


* We wrote a new bidirectional parser / serializer - see
http://www.mediawiki.org/wiki/Parsoid. This includes a grammar-based
tokenizer, async/parallel token stream transformations and HTML5 DOM
building.

Thank you for pointing that out. It will also be interesting for one of my 
private projects.



* We developed a HTML5 / RDFa document model spec at
http://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec.

* Our parserTests runner tests wt2html (wikitext to html), wt2wt,
html2html and html2wt modes with the same wikitext / HTML pairs as used
in the PHP parser tests. We have roughly doubled the number of such
pairs in the process.

* Automated and distributed round-trip tests are currently run over a
random selection of 100k English Wikipedia pages:
http://parsoid.wmflabs.org:8001/. This test infrastructure can easily be
pointed at a different set of pages or another wiki.

Once good results are reached for English, it should not be a big deal to 
test it on other wikis.
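
The round-trip check itself is conceptually simple. A minimal sketch in
TypeScript, with the converter functions as hypothetical stand-ins for
Parsoid's wt2html/html2wt:

    // Hypothetical stand-ins for Parsoid's converters.
    declare function wikitextToHtml(wikitext: string): Promise<string>;
    declare function htmlToWikitext(html: string): Promise<string>;

    async function roundTripTest(original: string): Promise<boolean> {
      const html = await wikitextToHtml(original);
      const back = await htmlToWikitext(html);
      // A clean round trip reproduces the original markup exactly;
      // anything else is flagged for inspection.
      if (back !== original) {
        console.warn("round-trip diff: " + firstDiff(original, back));
        return false;
      }
      return true;
    }

    // Report the first differing line.
    function firstDiff(a: string, b: string): string {
      const la = a.split("\n"), lb = b.split("\n");
      for (let i = 0; i < Math.max(la.length, lb.length); i++) {
        if (la[i] !== lb[i]) {
          return `line ${i + 1}: "${la[i] ?? ""}" vs "${lb[i] ?? ""}"`;
        }
      }
      return "";
    }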



Parsoid is by no means complete, but we are very happy with how far we
already got since last October.


Congratulations on the results so far.

Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-13 Thread Marco Fleckinger

On 2012-12-12 16:58, Chris McMahon wrote:

Would it be possible to enable VE on test2 in the same way?  I would like
to use it in a noisy way, and would rather not make noise on enwiki.

Do you really mean http://test2.wikipedia.org? AFAIK the Wikidata team 
uses that installation as a test client for the http://wikidata.org 
repository, as far as interwiki links are concerned.


Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Alpha version of the VisualEditor now available on the English Wikipedia

2012-12-13 Thread Marco Fleckinger

Hello,

On 2012-12-12 20:04, James Alexander wrote:

On Wed, Dec 12, 2012 at 8:57 AM, Andre Klapper aklap...@wikimedia.org wrote:


On Tue, 2012-12-11 at 19:30 -0800, James Forrester wrote:

This is not the final form of the VisualEditor in lots of different
ways. We know of a number of bugs, and we expect you to find more. We
do not recommend people trying to use the VisualEditor for their
regular editing yet. We would love your feedback on what we have done
so far – whether it’s a problem you discovered, an aspect that you
find confusing, what area you think we should work on next, or
anything else, please do let us know.[1]

[1] - https://en.wikipedia.org/wiki/Wikipedia:VisualEditor/Feedback



Playing the bad cop who's reading random feedback pages daily:

As https://www.mediawiki.org/wiki/VisualEditor/Feedback also exists I
wonder if the VisualEditor deployment on en.wp and its related feedback
is so different from upstream that it needs a separate feedback page
(instead of e.g. a soft redirect to the mw: one), or other reasons. Or
does the en.wp one somehow make it easier for testers to report issues?
When we deploy VE to other Wikipedias, will there also be separate VE
feedback pages (maybe due to the different languages)?

Note: I'm not criticizing it, I'm just trying to understand, and I'm
picking VE as the most recent example.

Thanks in advance for explaining,
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/




Risker said many of the reasons but the biggest reason is that a large
portion of testers would not move wiki. Opening up a local spot for
feedback drastically increases the amount of feedback you get which can be
really helpful. Personally I think we should do it on as many wikis as we
can for major projects like this but it's obviously difficult to do on many
because of both the language barriers and watching too many feedback
channels.

Yet another thing that a product like Echo, once it works cross-wiki, could
be helpful for :) but that's a bit of a ways away.

The Wikidata team puts a focus on testing on RTL wikis. The first 
Wikipedia ever will be huwiki, because their community themselves decided 
to be first. Itwiki wanted to be second, but the Wikidata team wanted to 
test RTL, so they asked hewiki.


Here I think i18n is very important as well, but that has already been 
tested for many years. The unidirectional PHP regex construct a.k.a. the 
wiki parser has to be reimplemented as a real parser in JS.


Implementing this is not very easy, but developers may be able to reuse 
some of the old ideas. Parsing in the other direction has to be 
implemented from scratch, but it is easier because everything is in a 
tree, not in a single text string.


Neither de- nor serializing involves any UI surface, so testing could be 
done automatically quite easily by comparing the results of the 
conventional and the new parser. The result of the serialization can be 
compared with the original markup.


The components that include UI surfaces will certainly need i18n. Using 
icons instead of text in a menu can help get around this issue easily. 
Even with many icons we still need some text, but IMHO it should be kept 
to a minimum, also in anticipation of translation needs. ;-)


So this early version is for testing users' experience with this 
surface. I think this is really great, and I am impressed by the result. 
It is a bit slow, but, guys, it's still alpha 1; what should I expect?


It also does not work in all browsers, as mentioned. E.g. the latest 
Firefox (a.k.a. Iceweasel) on Debian Wheezy (not yet released) is not 
supported. This is okay for me.


I also think that people using outdated browsers cannot expect the same 
great experience. Everything necessary should still work, but is the VE 
really essential?


Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bikol Wikipedia logo issue

2012-12-13 Thread Marco Fleckinger

Hello,

On 2012-12-13 16:35, Butch Bustria wrote:

Can you help us fix the Wikipedia logo on bcl.wikipedia.org?

The local community is asking for assistance.


In the HTML, the rule background-size: contain; is missing on the a 
element that carries the logo as its background.


You could also use a reduced resolution, as e.g. de.wikipedia.org does: 
there a 135x155 version is used, whereas you use 1650x1751.


Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Sul on external Scripts

2012-12-07 Thread Marco Fleckinger

Hi,

I know it is technically possible to use the SUL account outside of 
Wikimedia projects. We have heard that using this possibility is not well 
liked. Could somebody here tell us whether that is so, and why?


Cheers,

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Sul on external Scripts

2012-12-07 Thread Marco Fleckinger

No, I'm asking about using Wikimedia's CentralAuth in our tools.

On 08/12/12 00:23, K. Peachey wrote:

Are you referring to using Wikimedia CentralAuth accounts to auth
against other provider wiki sites?

Or using your own CentralAuth setup for your site(s)?

On Sat, Dec 8, 2012 at 9:07 AM, Marco Fleckinger
marco.fleckin...@wikipedia.at  wrote:

Hi,

I know it is technically possible to use the SUL account outside of
Wikimedia projects. We have heard that using this possibility is not well
liked. Could somebody here tell us whether that is so, and why?

Cheers,

Marco




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Sul on external Scripts

2012-12-07 Thread Marco Fleckinger

Hi,

Actually it would be for a WLM project we want to extend. This has a 
somewhat higher priority, because it will not be such a big deal. In the 
meantime we could use the old authentication stuff or TUSC in our project.


While doing that – why not work on such a proposal? I could do some 
tests and then develop a concept to decide whether to implement it or not.
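
As a starting point for such a concept, here is a minimal sketch in
TypeScript of the authorization-code flow Sébastien describes below; every
endpoint URL and parameter name here is a placeholder assumption, since
Wikimedia does not offer such an OAuth provider yet:

    import * as crypto from "crypto";

    const AUTH_URL = "https://meta.wikimedia.org/oauth/authorize"; // placeholder
    const TOKEN_URL = "https://meta.wikimedia.org/oauth/token";    // placeholder

    // Step 1: send the user to their home project to log in.
    function buildLoginRedirect(clientId: string, callback: string): string {
      // The state value must be stored and verified on return (CSRF check).
      const state = crypto.randomBytes(16).toString("hex");
      return `${AUTH_URL}?response_type=code&client_id=${clientId}` +
             `&redirect_uri=${encodeURIComponent(callback)}&state=${state}`;
    }

    // Step 2: the provider redirects back with a code; exchange it for a token.
    async function exchangeCode(code: string, clientId: string,
                                secret: string): Promise<string> {
      const res = await fetch(TOKEN_URL, {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({ grant_type: "authorization_code", code,
                                    client_id: clientId, client_secret: secret }),
      });
      const data = (await res.json()) as { access_token: string };
      return data.access_token;
    }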


@Alex: Sorry somebody forgot to reply to all.

On 08/12/12 01:11, Sébastien Santoro wrote:

Yes, we need that. Toys like TUSC should be replaced by scalable and
correct stuff in the middle term. But this is not currently one of the top
priorities, so we also need people to implement and maintain it. This is
not only a development issue (I wouldn't be surprised if the current
OAuth provider extensions were virtually mature) but also a matter of strong
support and integration work afterwards.

You need to prepare something like an OAuth2 provider infrastructure
to ask the user's Wikimedia home project to perform the authentication and
send your tool the appropriate result.

Please write down your proposal somewhere and, if possible, what and whom
you need to make it real.

On Sat, Dec 8, 2012 at 12:33 AM, Marco Fleckinger
marco.fleckin...@wikipedia.at  wrote:

No, I'm asking about using Wikimedia's CentralAuth in our tools.


On 08/12/12 00:23, K. Peachey wrote:


Are you referring to using Wikimedia CentralAuth accounts to auth
against other provider wiki sites?

Or using your own CentralAuth setup for your site(s)?

On Sat, Dec 8, 2012 at 9:07 AM, Marco Fleckinger
marco.fleckin...@wikipedia.at   wrote:


Hi,

I know it is technically possible to use the SUL account outside of
Wikimedia projects. We have heard that using this possibility is not well
liked. Could somebody here tell us whether that is so, and why?

Cheers,

Marco





___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bugzilla upgrade [was: bugzilla.wikimedia.org downtime: Now.]

2012-12-06 Thread Marco Fleckinger



On 06/12/12 18:06, Chad wrote:

On Wed, Dec 5, 2012 at 2:57 PM, Brion Vibber br...@pobox.com wrote:

On Wed, Dec 5, 2012 at 11:53 AM, Chad innocentkil...@gmail.com wrote:

On Tue, Dec 4, 2012 at 2:44 PM, Andre Klapper aklap...@wikimedia.org
wrote:


Bah, plaintext 4 ever ;-)


Everything else is nonsense with K9 on Android.


Seems I'm outnumbered on this. I've already changed my preferences,
so it's not a big deal.


+1

Cheers

Marco

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l