You might want to have a look at the Annoying little bugs page [1],
I think it's meant exactly for people like you.
[1]: https://www.mediawiki.org/wiki/Annoying_little_bugs
Petr Onderka
[[en:User:Svick]]
On Thu, Dec 12, 2013 at 10:27 AM, Amanpreet Singh
amanpreet.iitr2...@gmail.com wrote:
articles (e.g. w/, w/e).
Petr Onderka
[[en:User:Svick]]
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Shouldn't Special:MWOAuth with no other parameters do something better than
just returning an error?
Also, how is a normal user supposed to learn about
Special:MWOAuthManageMyGrants?
I would expect this to be available from Preferences, but I didn't find
anything there.
Petr Onderka
awesome, but not as easy
to just drop in as a library.)
I'm certainly going to try to use some library for delta compression,
because they seem to do pretty much exactly what's needed here. Thanks for
the suggestions.
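For illustration, here is a minimal sketch of the idea behind delta compression of revision texts. This is not any real library's format (a real delta coder emits copy/insert instructions and does much more); it just shows why storing a new revision relative to the previous one is so effective: consecutive revisions usually share almost all of their text.

```cpp
#include <string>

// Toy delta: store a new revision as (shared prefix length,
// shared suffix length, changed middle). Illustration only.
struct Delta {
    size_t prefix;
    size_t suffix;
    std::string middle;
};

Delta makeDelta(const std::string& oldRev, const std::string& newRev) {
    // Longest common prefix.
    size_t p = 0;
    while (p < oldRev.size() && p < newRev.size() && oldRev[p] == newRev[p])
        ++p;
    // Longest common suffix of the remainders.
    size_t s = 0;
    while (s < oldRev.size() - p && s < newRev.size() - p &&
           oldRev[oldRev.size() - 1 - s] == newRev[newRev.size() - 1 - s])
        ++s;
    return Delta{p, s, newRev.substr(p, newRev.size() - p - s)};
}

std::string applyDelta(const std::string& oldRev, const Delta& d) {
    // Reassemble the new revision from the old one plus the delta.
    return oldRev.substr(0, d.prefix) + d.middle +
           oldRev.substr(oldRev.size() - d.suffix);
}
```

For a typical wiki edit, the stored middle is a few bytes instead of the whole revision text.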
Petr Onderka
.
Petr Onderka
a big C++11 fan. ;) ).
Yeah, I'm already using unique_ptr. And I will use lambdas if I think they
would be useful in some code.
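As a small illustration of the C++11 style being discussed (this is not code from the actual dumps project), unique_ptr with a lambda deleter handles cleanup automatically, with no hand-written RAII wrapper:

```cpp
#include <memory>

// Returns how many times the custom deleter ran: exactly once,
// when the unique_ptr goes out of scope.
int countedDeletion() {
    int deletions = 0;
    {
        // Lambda deleter captures a local counter.
        auto del = [&deletions](int* p) { delete p; ++deletions; };
        std::unique_ptr<int, decltype(del)> value(new int(42), del);
        // *value can be used here; no manual delete needed.
    }  // deleter runs here, even if an exception had been thrown
    return deletions;
}
```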
Petr Onderka
the new dumps with minimal changes to
your code.
Keeping the dumps in a text-based format doesn't make sense, because that
can't be updated efficiently, which is the whole reason for the new dumps.
Petr Onderka
On Mon, Jul 1, 2013 at 11:10 PM, Byrial Jensen byr...@vip.cybercity.dk wrote:
Hi
tell the application what you want, it will
look it up and output only that.
Also, even if you couldn't seek, I don't see how this is any worse than the
current situation, where you also can't seek to a specific position in the
compressed XML (unless you use multistream dumps).
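A toy sketch of how such a lookup can work (the real dump format is more involved): an index maps each page id to its record's offset, so the application reads only the requested record instead of scanning the whole file.

```cpp
#include <map>
#include <string>

// Simulates an indexed dump file: the index maps page id -> offset,
// so a lookup jumps straight to the record it wants.
std::string lookupPage(int pageId) {
    std::string file;             // stands in for the dump file on disk
    std::map<int, size_t> index;  // page id -> byte offset into the file

    auto append = [&](int id, const std::string& record) {
        index[id] = file.size();  // remember where this record starts
        file += record;
        file += '\n';
    };
    append(1, "page 1 text");
    append(2, "page 2 text");
    append(3, "page 3 text");

    // "Seek" directly to the requested record; pages before and after
    // it are never read.
    size_t off = index.at(pageId);
    return file.substr(off, file.find('\n', off) - off);
}
```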
Petr Onderka
I'm primarily a Windows guy, so I'm trying to write the code in a portable
way and I will make sure the application works on both Linux and Windows.
Petr Onderka
On Wed, Jul 3, 2013 at 4:49 PM, Erik Zachte ezac...@wikimedia.org wrote:
it will now be a command line application that outputs
/operations%2Fdumps%2Fincremental/refs%2Fheads%2Fgsoc
On Wed, Jul 3, 2013 at 10:13 PM, Byrial Jensen byr...@vip.cybercity.dk wrote:
At 03-07-2013 18:29, Petr Onderka wrote:
I'm primarily a Windows guy, so I'm trying to write the code in a
portable way and I will make sure the application works
model and format.
Is there something else needed for ContentHandler?
The dumps don't really care what the format or encoding of the revision
text is; it's just a byte stream to them.
Petr Onderka
it work for your use case?
Any comments or suggestions are welcome.
Petr Onderka
[[User:Svick]]
[1]: http://www.mediawiki.org/wiki/User:Svick/Incremental_dumps
: I'd like to use this library in my future C# code.
But I realize creating something that works only in C# doesn't make sense,
because most people in this community don't use it.
So, to me, writing the code so that it can be used from anywhere makes the
most sense.
Petr Onderka
On Mon, Jul 1
might consider adding
a split instruction to diff dumps, but that's probably not necessary now.
Petr Onderka
[1]: http://www.mediawiki.org/wiki/Manual:Page_table#page_restrictions
and the implementation sticks to it.)
Theoretically, I could use compressed XML in internal data structures, but
I think that just combines the disadvantages of both.
So, the size is not the main reason not to use XML, it's just one of the
reasons.
Petr Onderka
On Mon, Jul 1, 2013 at 7:26 PM
forget about deletions (and undeletions).
You could partially solve some of these problems (e.g. by adding indexes),
but I don't think you can solve all of them.
Petr Onderka
On Mon, Jul 1, 2013 at 9:13 PM, Dmitriy Sintsov ques...@rambler.ru wrote:
On 01.07.2013 22:56, Tyler Romeo wrote:
Petr
if it works.
Also, I think that reading the binary format isn't going to be the biggest
issue if you're implementing your own library for incremental dumps,
especially if I'm going to use delta compression of revision texts.
Petr Onderka
On Mon, Jul 1, 2013 at 9:16 PM, Daniel Friesen
dan...@nadir
- modal view opens
Click 2 on the video in modal view - video starts playing
And the proposed change is to:
Click 1 on thumbnail - modal window opens and starts playing
This is not about playing videos as soon as a page loads.
Petr Onderka
[[en:User:Svick
This is probably not what you want to hear, but one way would be to get a
Toolserver account.
That way, you wouldn't need the query service, you could run those queries
by yourself.
Petr Onderka
[[en:User:Svick]]
On Wed, May 22, 2013 at 10:03 PM, Tuszynski, Jaroslaw W.
jaroslaw.w.tuszyn
of Wikimedia projects). The primary
advantage of this new format will be that it should take less time to
create the dump, because the previous dump can be reused.
Any comments or co-mentors (as far as I know, Ariel Glenn is currently the
only potential mentor on this project) are welcome.
Petr
When I'm confused about timezones, I tend to use Wolfram Alpha:
http://www.wolframalpha.com/input/?i=19%3A00pm+UTC+on+May+1%2C+2013
19:00pm UTC on May 1, 2013
1 day 33 minutes 44 seconds in the future
Petr Onderka
[[en:User:Svick]]
On Tue, Apr 30, 2013 at 8:22 PM, Jiabao Wu jiabao.f
There is a user script [1] that does a primitive version of this.
I have found it to be quite useful, so I think it's a good idea to do this
properly.
Petr Onderka
[[en:User:Svick]]
[1]: http://en.wikipedia.org/wiki/User:Lampak/MyLanguages
On Thu, Apr 18, 2013 at 6:50 PM, Pau Giner pgi
for my local MediaWiki installation even with Expect:
100-Continue set.
Petr Onderka
[[en:User:Svick]]
On Wed, Apr 17, 2013 at 5:50 AM, Tyler Romeo tylerro...@gmail.com wrote:
Found this interesting article on designing an API, for what it's worth.
Thought some people might find it interesting
], it says that a proxy *has to* return
417 if the target server doesn't support HTTP/1.1.
Though I have no idea why the specification would require this.
Petr Onderka
[[en:user:Svick]]
[1]: http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html#sec8.2.3
On Wed, Apr 17, 2013 at 10:33 PM, Brian Wolff
So you're suggesting we go *against* the HTTP standard? That's not exactly
what you're supposed to do.
Well, ignoring the header makes more sense to me and, personally, I would
prefer that behavior.
But it's a minor issue and I think going against the standard is not
actually worth it.
Petr
now, I think the best option is to download the whole dump again.
Petr Onderka
[[en:User:Svick]]
[1]: http://dumps.wikimedia.org/other/incr/
[2]:
https://www.mediawiki.org/wiki/Summer_of_Code_2013#Incremental_data_dumps
On Sun, Apr 14, 2013 at 4:37 PM, Sajid Hussain sajidbinrah...@gmail.com wrote:
wouldn't end well.
Petr Onderka
[[en:User:Svick]]
2013/3/28 Denny Vrandečić denny.vrande...@wikimedia.de:
We have a first write up of how we plan to support queries in Wikidata.
Comments on our errors and requests for clarifications are more than
welcome.
https://meta.wikimedia.org/wiki/Wikidata
will have to find help elsewhere.
Personally, I would suggest that you ask your questions on stackoverflow.com.
Petr Onderka
[[en:User:Svick]]
On Sun, Mar 17, 2013 at 1:39 PM, Ted Reynard ted.reyn...@yahoo.com wrote:
Hi guys
Does anybody know about serial programming using termios?
I found this link
itself.
And, as I understand it, that's what you claim is required and
what others claim would be a waste of bandwidth.
[1]: http://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js
Petr Onderka
[[en:User:Svick]]
According to
http://hu.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces
(and http://en.wikipedia.org/wiki/Wikipedia:Namespace), it's 828.
Petr Onderka
[[en:User:Svick]]
On Thu, Feb 21, 2013 at 1:30 AM, Bináris wikipo...@gmail.com wrote:
What is the number of the new namespace?
at compile time.
More information is at http://en.wikipedia.org/wiki/User:Svick/LinqToWiki.
Any comments are welcome.
Petr Onderka
[[en:User:Svick]]
I didn't realize that was a requirement for libraries (the library
allowed setting the UserAgent, but it didn't force it until now).
I have made that change and setting the UserAgent is now required.
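The enforced behavior could look roughly like this sketch (in C++ rather than the library's actual C#; the WikiClient class and its methods are invented for illustration, not LinqToWiki's real API): requests simply cannot be built until a UserAgent has been set.

```cpp
#include <stdexcept>
#include <string>

// Hypothetical client that refuses to build a request without a
// UserAgent, per the API etiquette guidelines.
class WikiClient {
    std::string userAgent;
public:
    void setUserAgent(const std::string& ua) { userAgent = ua; }

    std::string buildRequestHeader() const {
        if (userAgent.empty())
            throw std::logic_error(
                "UserAgent must be set before making requests");
        return "User-Agent: " + userAgent;
    }
};
```

This makes forgetting the header a loud failure at the first request instead of a silent etiquette violation.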
Petr Onderka
[[en:User:Svick]]
On Sun, Feb 17, 2013 at 7:42 PM, Yuri Astrakhan yuriastrak
://www.mediawiki.org/wiki/API:Etiquette
Petr Onderka
[[en:User:Svick]]
On Sat, Feb 2, 2013 at 8:07 PM, Small M smallma...@yahoo.com wrote:
Hello,
There is a lack of guidelines regarding the rate at which the MediaWiki API
may be used.
Specifically for the enwiki and commons api (if it doesn't matter, please
and autopratica are synonyms (or at least closely related).
Petr Onderka
[[en:User:Svick]]
On Sun, Sep 16, 2012 at 4:28 PM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
Hi,
If I search for the word autopratica [1] in the Italian Wikipedia,
the article Terapia [2] comes up as the first
I have a feeling that they are trying to make Wikipedia pretty,
but at the cost of making it much less functional.
For example, to get to Czech Wikipedia from www.wikipedia.org,
I have to roll over the top right corner?
That's absolutely unusable, I would never think of that.
Petr Onderka
email),
it is the way I described.
Petr Onderka
[[en:User:Svick]]
On Tue, Aug 14, 2012 at 10:51 PM, Daniel Zahn dz...@wikimedia.org wrote:
On 14 August 2012 20:08, Mark Holmquist mtrac...@member.fsf.org wrote:
For example, to get to Czech Wikipedia from www.wikipedia.org,
I have to roll over
This information is in the iwlinks table [1]; I don't know of any
Special: page that can be used to access it.
You can search the table for iwl_prefix = 'wikisource'.
[1]: http://www.mediawiki.org/wiki/Manual:Iwlinks_table
Petr Onderka
[[en:User:Svick]]
On Thu, May 3, 2012 at 10:46 PM, Lars
You can do that, only the URL is slightly longer:
http://en.wikipedia.org/wiki?curid=2312711
Although I don't understand what the benefit of doing that would be.
Petr Onderka
[[User:Svick]]
On Sat, Feb 18, 2012 at 14:09, John Erling Blad jeb...@gmail.com wrote:
In some cases it would
specific query.
Petr Onderka
[[en:User:Svick]]
://github.com/svick/mediawiki/commit/868910637445ea0dcf3ad84bc1ee9fc337f7b9c3
Petr Onderka
[[en:User:Svick]]
On Thu, Nov 10, 2011 at 11:37, Roan Kattouw roan.katt...@gmail.com wrote:
On Wed, Nov 9, 2011 at 11:36 PM, Petr Onderka gsv...@gmail.com wrote:
Is this information available somewhere
It's caused by incorrect use of templates for Harvard referencing like
Template:Harv.
Some of those errors are caused by a change to Template:Cite book and
similar templates a few months back: they were changed so that they don't
produce the anchors by default, because that was often causing invalid
HTML.