Just to clarify - you saw this on the password recovery only, not the
main password set from user prefs right?
Indeed.
--
Jeroen De Dauw
* http://blog.bn2vs.com
* http://wiki.bn2vs.com
Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69
66 65!
--
2010/5/25 Platonides platoni...@gmail.com:
Seems it doesn't work so well. It was inadvertently broken for wikitext
transclusions when the interwiki points to the nice url. See
'wgEnableScaryTranscluding and Templates/Images?' thread at mediawiki-l
Well, in my tests, images are included correctly
About Alex's question on transcluding sections: is it possible to
request only a section through the API? I searched for this but didn't
find anything :(
--
Peter Potrowl
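Regarding the question above: the API can return the wikitext of a single
section via action=query with prop=revisions and the rvsection parameter.
A minimal sketch that just builds such a request URL (the endpoint, page
title, and section index below are illustrative):

```python
# Sketch: asking the MediaWiki API for the wikitext of one section only,
# using action=query / prop=revisions / rvsection.
from urllib.parse import urlencode

def section_request_url(api_endpoint, title, section):
    """Build an API URL that requests the wikitext of a single section."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvsection": section,  # restrict the response to this section index
        "titles": title,
        "format": "json",
    }
    return api_endpoint + "?" + urlencode(params)

print(section_request_url("https://en.wikipedia.org/w/api.php",
                          "Transclusion", 1))
```

Fetching that URL returns only the requested section's wikitext, not the
whole page.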
Ask ThomasV; #lst is a particular care of his, down to the deepest level
of knowledge! I guess he ran into the same issue as you
On Tue, May 25, 2010 at 7:41 AM, Peter17 peter...@gmail.com wrote:
Mmmh sorry, I'm not really sure I understand... My suggestion is
to use a shared database that would store the remote calls, not the
content of the pages... In my mind, fetching the distant pages would
be done through the
Aryeh Gregor wrote:
On Mon, May 24, 2010 at 8:27 PM, Q overlo...@gmail.com wrote:
I would have to suggest to not go the shared database route unless the
code can be fixed so that shared databases actually work with all of the
DB backends.
I don't see why it shouldn't be easy to get it
church.of.emacs.ml wrote:
However, you'd have to worry that each distant wiki uses only a fair
amount of the home wiki server's resources. E.g. set a limit of
inclusions (that limit would have to be on the home-wiki-server-side)
and disallow infinite loops (they're always fun).
Infinite loops
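The two safeguards mentioned above (a loop check plus a limit enforced on
the serving side) can be sketched as follows, in Python rather than
MediaWiki's PHP; fetch_template is a hypothetical callback returning a
template's wikitext, and parameter handling is omitted:

```python
import re

class TransclusionLimitError(Exception):
    """Raised when a transclusion loop or the inclusion limit is hit."""

def expand(title, fetch_template, seen=frozenset(), depth=20):
    """Recursively expand {{Template}} calls with both safeguards."""
    if title in seen:
        raise TransclusionLimitError("transclusion loop at %r" % title)
    if depth <= 0:
        raise TransclusionLimitError("inclusion depth limit exhausted")
    text = fetch_template(title)
    # Replace each {{Name}} with its own expansion; `seen` carries the
    # chain of titles currently being expanded, so A -> B -> A is caught.
    return re.sub(
        r"\{\{([^{}|]+)\}\}",
        lambda m: expand(m.group(1), fetch_template, seen | {title}, depth - 1),
        text,
    )
```

A real implementation would also cap the total number of inclusions, not
just the depth, as the parser's expansion limits do.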
On 25 May 2010 15:30, Platonides platoni...@gmail.com wrote:
church.of.emacs.ml wrote:
However, you'd have to worry that each distant wiki uses only a fair
amount of the home wiki server's resources. E.g. set a limit of
inclusions (that limit would have to be on the home-wiki-server-side)
and
Hi, all,
I am trying to use MWDumper to import data; however, the importer only
offers two choices, MySQL and PostgreSQL. What am I supposed to do if I
want to import Wikipedia data into an Oracle database?
Thanks very much.
Zeyi
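One option, since MWDumper only emits SQL for MySQL and PostgreSQL: use
MediaWiki's own importDump.php maintenance script instead. It goes through
the database abstraction layer, so it works with whichever backend
(including Oracle) LocalSettings.php is configured for, though it is
slower than MWDumper on large dumps. File names below are illustrative:

```shell
# Import an XML dump through MediaWiki's DB abstraction layer
# (backend-agnostic, so it also works with an Oracle install).
bzcat pages-articles.xml.bz2 | php maintenance/importDump.php

# Rebuild derived tables afterwards.
php maintenance/rebuildrecentchanges.php
```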
___
Wikitech-l mailing list
(I'm going to use local wiki here for what Peter is calling distant
wiki, and foreign wiki for what he's calling home wiki. This
seems to better match the terminology we use for Commons.)
On Tue, May 25, 2010 at 7:41 AM, Peter17 peter...@gmail.com wrote:
Yes. The shared database would be only
2010/5/25 Aryeh Gregor simetrical+wikil...@gmail.com:
Having Wikimedia servers send HTTP requests to each other instead of
just doing database queries does not sound like a great idea to me.
You're hitting several extra servers for no reason, including extra
requests to an application server.
On Tue, May 25, 2010 at 2:58 PM, Roan Kattouw roan.katt...@gmail.com wrote:
This is true if, indeed, all parsing is done on the distant wiki.
However, if parsing is done on the home wiki, you're not simply
requesting data that's ready-baked in the DB and API calls make sense.
That's true -- if
2010/5/25 Aryeh Gregor simetrical+wikil...@gmail.com:
Templates will often miss the parser cache, because different
invocations will use different parameters. Even *with* the parser
cache, parsing is *still* one of the most expensive operations
Wikimedia does, so I'm not so sanguine.
I
On Tue, May 25, 2010 at 8:58 PM, Roan Kattouw roan.katt...@gmail.com wrote:
To the point of whether parsing on the distant wiki makes more
sense: I guess there are points to be made both ways. I originally
subscribed to the idea of parsing on the home wiki so expanding the
same template
On Tue, May 25, 2010 at 3:48 PM, Roan Kattouw roan.katt...@gmail.com wrote:
Also note that you wouldn't technically be parsing, just preprocessing
on the home wiki, which is certain to be less expensive (how much less
I don't know)
This is a good point.
and that you'd be doing this on some
As a followup to this thread: I'm going to make some minor modifications to
the strings in the file, then check in what I've got. There may be specific
instances where we'll need to figure out a better way of handling things,
but after looking at this more, I think those instances may be more
Aryeh Gregor wrote:
Ok, I will keep this in mind. Parsing the template on the home wiki
seems necessary because it can use other templates hosted on that wiki
to render correctly... I think it is the most logical way to do it,
isn't it?
I think parsing the template on the local wiki is better,
On Tue, May 25, 2010 at 5:50 PM, Platonides platoni...@gmail.com wrote:
But I guess that's much better handled by just using a proper export,
and having the templates included in that, so never mind.
Yes. Perhaps they could have a Special:ImportFromRemote to do one-click
imports.
And this
On Tue, May 25, 2010 at 5:50 PM, Platonides platoni...@gmail.com wrote:
There are, IMHO, fewer variables set by the caller wiki, which could be
passed with the query.
I don't get what you're saying here.
For intra-Wikimedia queries, they could directly ask an Apache. They can
even send the query
On 2010-05-25 23:41, Peter17 wrote:
2010/5/25 Platonides platoni...@gmail.com:
Seems it doesn't work so well. It was inadvertently broken for wikitext
transclusions when the interwiki points to the nice url. See
'wgEnableScaryTranscluding and Templates/Images?' thread at mediawiki-l
Well,
Should this not be reported in Bugzilla (with the upstream keyword) so
that if others look for it later they can find it?
-Peachey
On Tue, May 25, 2010 at 7:02 PM, K. Peachey p858sn...@yahoo.com.au wrote:
Should this not be reported in Bugzilla (with the upstream keyword) so
that if others look for it later they can find it?
-Peachey
No, it should be filed upstream :)
-Chad
2010/5/26 Jim Tittsler j...@onnz.net:
On 2010-05-25 23:41, Peter17 wrote:
2010/5/25 Platonides platoni...@gmail.com:
Seems it doesn't work so well. It was inadvertently broken for wikitext
transclusions when the interwiki points to the nice url. See
'wgEnableScaryTranscluding and