Hi,
I have written a parser for MediaWiki syntax and have set up a test
site for it here:
http://libmwparser.kreablo.se/index.php/Libmwparsertest
and the source code is available here:
http://svn.wikimedia.org/svnroot/mediawiki/trunk/parsers/libmwparser
A preprocessor will take care of parser
Hi,
Pretty awesome work you've done!
On Thu, Sep 23, 2010 at 11:27 AM, Andreas Jonsson
andreas.jons...@kreablo.se wrote:
I think that this demonstrates the feasibility of replacing the
MediaWiki parser. There is still a lot of work to do in order to turn
it into a full replacement, however.
On 2010-09-23 11:34, Bryan Tong Minh wrote:
Hi,
Pretty awesome work you've done!
On Thu, Sep 23, 2010 at 11:27 AM, Andreas Jonsson
andreas.jons...@kreablo.se wrote:
I think that this demonstrates the feasibility of replacing the
MediaWiki parser. There is still a lot of work to do in
On 23 Sep 2010, at 14:14, Andreas Jonsson wrote:
On 2010-09-23 11:34, Bryan Tong Minh wrote:
Hi,
Pretty awesome work you've done!
On Thu, Sep 23, 2010 at 11:27 AM, Andreas Jonsson
andreas.jons...@kreablo.se wrote:
I think that this demonstrates the feasibility of
On 23.09.2010 11:34, Bryan Tong Minh wrote:
Pretty awesome work you've done!
+1.
I was just about to write my own parser for an application, but this
is really great. :)
Robin
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
On 2010-09-23 14:56, Krinkle wrote:
On 23 Sep 2010, at 14:47, Andreas Jonsson wrote:
On 2010-09-23 14:17, Krinkle wrote:
On 23 Sep 2010, at 14:14, Andreas Jonsson wrote:
On 2010-09-23 11:34, Bryan Tong Minh wrote:
Hi,
On 23 September 2010 14:25, Andreas Jonsson andreas.jons...@kreablo.se wrote:
That's possible, but I believe that the set of broken templates can be
limited to a great extent. To deploy a new parser on an existing
site, one would need a tool that walks the existing pages and warns
about
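The page-walking checker described above could be sketched roughly as follows. This is a hypothetical illustration, not code from libmwparser: `old_parse` and `new_parse` are toy stand-ins for the two parsers, and the page set is invented.

```python
# Hypothetical sketch: render every existing page with both the old and
# the new parser, and warn about any page whose output differs.

def find_broken_pages(pages, old_parse, new_parse):
    """Return titles whose rendering differs between the two parsers."""
    broken = []
    for title, wikitext in pages.items():
        if old_parse(wikitext) != new_parse(wikitext):
            broken.append(title)
    return sorted(broken)

# Toy stand-ins: the "new" parser rejects unbalanced template braces,
# while the "old" one silently passes them through.
def old_parse(text):
    return text.replace("{{", "<tpl>").replace("}}", "</tpl>")

def new_parse(text):
    if text.count("{{") != text.count("}}"):
        return "<error>unbalanced template</error>"
    return text.replace("{{", "<tpl>").replace("}}", "</tpl>")

pages = {
    "Good": "Hello {{Template}} world",
    "Broken": "Hello {{Template world",
}
print(find_broken_pages(pages, old_parse, new_parse))  # → ['Broken']
```

On a real site the walker would pull wikitext from the database dump or API rather than a dict, but the diff-and-warn loop stays the same.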
On Wed, 22 Sep 2010 12:30:35 -0700, Brion Vibber wrote:
On Wed, Sep 22, 2010 at 11:09 AM, Dan Nessett dness...@yahoo.com
wrote:
Some have mentioned the possibility of using the wiki family logic to
help achieve these objectives. Do you have any thoughts on this? If you
think it is a good
Given a test matrix with multiple OSes, this ain't something
individual devs will be running in full over and over as they work.
Assume automated batch runs, which can be distributed over as many
databases and clients as you like.
For small test subsets that are being used during testing the
Another trick that comes to mind is adding a little smarts to your
tests to prevent multiple runs from stomping on each other.
For StatusNet's remote subscription features, I have a client-side
test set which registers user accounts on two sites and confirms that
behavior is as expected posting
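The "keep concurrent runs from stomping on each other" trick can be sketched minimally: derive every fixture name (test account, test page) from a per-run unique token, so two runs against the same wiki never collide. The naming scheme below is illustrative, not StatusNet's actual one.

```python
# Minimal sketch: namespace all test fixtures with a per-run token so
# concurrent test runs on the same wiki never touch the same accounts.
import uuid

def make_run_namespace():
    """One short unique token per test run."""
    return uuid.uuid4().hex[:8]

def fixture_name(run_ns, base):
    """Namespaced name for a test account or test page."""
    return f"test_{run_ns}_{base}"

run_a = make_run_namespace()
run_b = make_run_namespace()
# Two concurrent runs register different accounts for the same fixture.
assert fixture_name(run_a, "alice") != fixture_name(run_b, "alice")
```

A teardown step can then delete everything matching the run's prefix without risk of clobbering another run's data.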
On Thu, 23 Sep 2010 09:24:18 -0700, Brion Vibber wrote:
Given a test matrix with multiple OSes, this ain't something individual
devs will be running in full over and over as they work. Assume
automated batch runs, which can be distributed over as many databases
and clients as you like.
For
Hi.
Another good testing tool is JMeter: http://jakarta.apache.org/jmeter/
You can record and replay your HTTP requests with it.
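JMeter records traffic through a proxy and replays it; the core of a bare-bones replay loop over a capture log might look like this. The URLs and payloads are illustrative, not taken from the thread.

```python
# Bare-bones sketch of replaying a log of captured HTTP requests.
import urllib.request

recorded = [
    {"method": "GET",  "url": "http://localhost/index.php?title=Main_Page"},
    {"method": "POST", "url": "http://localhost/api.php",
     "data": b"action=query&format=json"},
]

def build_requests(log):
    """Turn a capture log into urllib Request objects, ready to send."""
    reqs = []
    for entry in log:
        reqs.append(urllib.request.Request(
            entry["url"], data=entry.get("data"), method=entry["method"]))
    return reqs

reqs = build_requests(recorded)
# Replaying is one urllib.request.urlopen(r) call per request; omitted
# here so the sketch runs without a live server.
print([r.get_method() for r in reqs])  # → ['GET', 'POST']
```

JMeter adds timing, assertions, and load generation on top of this, which is why it is worth using over a hand-rolled loop.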
--
Lcdo. Wilfredo Rafael Rodríguez Hernández
msn,googletalk = wilfre...@gmail.com
cv = http://www.wilfredor.co.cc
Tim Starling wrote:
Ryan Lane wrote:
ParserFunctions is an amazing example of this. MediaWiki
simply doesn't work without ParserFunctions. You can tell me it does,
and that people can live without it, but I refuse to believe that. We
get support issues extremely frequently that end in install
On Thu, Sep 23, 2010 at 9:46 AM, Dan Nessett dness...@yahoo.com wrote:
I am very much in favor of keeping it simple. I think the issue is
whether we will support more than one regression test (or individual test
associated with a regression test) running concurrently on the same test
wiki. If
I have been making the assumption that in MediaWiki, the $_SESSION is
hidden from the
user. While applications may use the session to obtain data that's later
shown to the user,
there should be no way for the user to obtain the entire $_SESSION
contents.
So, for instance, I can hide a
On Thu, 23 Sep 2010 10:29:58 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 9:46 AM, Dan Nessett dness...@yahoo.com wrote:
I am very much in favor of keeping it simple. I think the issue is
whether we will support more than one regression test (or individual
test associated with a
2010/9/23 Neil Kandalgaonkar ne...@wikimedia.org:
I have been making the assumption that in MediaWiki, the $_SESSION is
hidden from the
user. While applications may use the session to obtain data that's later
shown to the user,
there should be no way for the user to obtain the entire
On Thu, Sep 23, 2010 at 1:04 PM, Dan Nessett dness...@yahoo.com wrote:
After thinking about this some more I think you are right. We should at
least start with something simple and only make it more complex (e.g.,
wiki resource switching) if the simple approach has significant problems.
As far as I know, yes. MediaWiki sets a session cookie with an ID that
uniquely identifies the session. The session data itself is stored in
some session storage (by default we let PHP handle it, on WMF we stick
it in memcached, I believe). So unless there's some ridiculous
vulnerability
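The session model described above can be sketched in a few lines: the browser only ever holds an opaque random ID, while the session contents live in a server-side store (PHP's default file backend, or memcached on WMF). This is a simplified illustration, not MediaWiki's actual session code.

```python
# Sketch: opaque session ID in the cookie, contents kept server-side.
import secrets

SESSION_STORE = {}  # stand-in for PHP's session files or memcached

def start_session():
    """Create a session and return the opaque ID sent as the cookie."""
    sid = secrets.token_hex(16)
    SESSION_STORE[sid] = {}
    return sid

def session_get(sid, key):
    return SESSION_STORE[sid].get(key)

sid = start_session()
SESSION_STORE[sid]["wsUserName"] = "Example"
# The client knows only `sid`; it has no way to enumerate the store,
# so the full session contents stay hidden from the user.
print(session_get(sid, "wsUserName"))  # → Example
```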
On Thu, 23 Sep 2010 14:10:24 -0700, Brion Vibber wrote:
+ URLs identify test wikis. Only one regression test can run at a time on
any one of these. How do you synchronize regression test initiation so
there is some sort of lock on a test wiki currently running a
regression test?
Simplest
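One simple way to enforce "one regression run per test wiki" is an exclusive lock file named after the wiki: `O_CREAT | O_EXCL` is atomic, so only one runner can create it. This is a sketch of the idea under that assumption, not code from the thread.

```python
# Sketch: serialize regression runs per test wiki with a lock file.
import os
import tempfile

def acquire_lock(wiki_id, lock_dir):
    """Return the lock path on success, or None if the wiki is busy."""
    path = os.path.join(lock_dir, f"testwiki-{wiki_id}.lock")
    try:
        # O_CREAT|O_EXCL fails atomically if the file already exists.
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return None
    os.close(fd)
    return path

def release_lock(path):
    os.remove(path)

lock_dir = tempfile.mkdtemp()
lock = acquire_lock("enwiki-test", lock_dir)
assert lock is not None
assert acquire_lock("enwiki-test", lock_dir) is None  # second runner refused
release_lock(lock)
```

A crashed run can leave a stale lock behind, so a production version would also record the holder's PID and break locks whose owner is gone.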
In fact, I advised Arthur not to store exactly that (credit card
information) in sessions for this reason - but I also think there are
few things that are as sensitive as credit card information, passwords,
and social security numbers.
- Trevor
On 9/23/10 2:24 PM, Ryan Lane wrote:
As far
On Thu, Sep 23, 2010 at 2:31 PM, Dan Nessett dness...@yahoo.com wrote:
Not sure I get this. Here is what I understand would happen when a
developer checks in a revision:
+ A script runs that manages the various regression tests run on the
revision (e.g., parserTests, PHPUnit tests, the
On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 2:31 PM, Dan Nessett dness...@yahoo.com wrote:
Not sure I get this. Here is what I understand would happen when a
developer checks in a revision:
+ A script runs that manages the various regression tests run on
On Thu, Sep 23, 2010 at 2:54 PM, Dan Nessett dness...@yahoo.com wrote:
On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
There's no need to have a fixed set of URLs; just as with Wikimedia's
public-hosted sites you can add individually-addressable wikis
dynamically at whim without
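Avoiding a fixed URL list can be as simple as deriving the wiki ID from the request hostname, the way a wiki family routes requests. The naming scheme below is hypothetical, not Wikimedia's actual configuration.

```python
# Sketch: resolve an individually-addressable test wiki from the host
# name, so new wikis can be added at whim with no fixed URL list.
def wiki_id_from_host(host, suffix=".test.example.org"):
    """Map e.g. 'run42.test.example.org' to the wiki ID 'run42'."""
    if not host.endswith(suffix):
        return None
    return host[: -len(suffix)]

assert wiki_id_from_host("run42.test.example.org") == "run42"
assert wiki_id_from_host("example.com") is None
```

Each resolved ID would then select that wiki's database and upload directory, exactly as a wiki-family setup does for its permanent members.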
On Thu, 23 Sep 2010 15:50:48 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 2:54 PM, Dan Nessett dness...@yahoo.com wrote:
On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
There's no need to have a fixed set of URLs; just as with Wikimedia's
public-hosted sites you can add
On 9/23/10 2:24 PM, Ryan Lane wrote:
The contents of that session on the server are unencrypted, correct?
Depending on what the secret is, he may or may not want to use it. For
instance, that is probably a terrible place to put credit card numbers
temporarily.
Good point, but in this case
Hi all,
I have made a list of all the 1.9M articles in NS0 (including
redirects / short pages) using the Toolserver; now I have the list I'm
going to download every single one of 'em (after the trial period tonight,
I want to see how this works out. I'd like to begin with downloading
the whole thing
On Fri, Sep 24, 2010 at 1:36 AM, Neil Kandalgaonkar ne...@wikimedia.org wrote:
On 9/23/10 2:24 PM, Ryan Lane wrote:
The contents of that session on the server are unencrypted, correct?
Depending on what the secret is, he may or may not want to use it. For
instance, that is probably a terrible
On Thu, Sep 23, 2010 at 4:03 PM, Dan Nessett dness...@yahoo.com wrote:
Thinking about this a bit, we seem to have come full circle. If we use a
URL per regression test run, then we need to multiplex wiki resources.
When you set up a wiki family, the resources are permanent. But, for a
test
On 9/24/10, Marcin Cieslak sa...@saper.info wrote:
There are static dumps available here:
http://download.wikimedia.org/dewiki/
Is there any problem with using them?
I think they are from June 2008.
A fresh static dump would be good.
--
John Vandenberg
On Fri, Sep 24, 2010 at 3:44 AM, Marcin Cieslak sa...@saper.info wrote:
John Vandenberg jay...@gmail.com wrote:
http://download.wikimedia.org/dewiki/
Is there any problem with using them?
I think they are from June 2008.
Are they?
http://download.wikimedia.org/dewiki/20100903/
These
On Thu, 23 Sep 2010 20:13:23 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 7:19 PM, Dan Nessett dness...@yahoo.com wrote:
I appreciate your recent help, so I am going to ignore the tone of your
last message and focus on issues. While a test run can set up, use and
then delete the
On 2010-09-24, Dmitriy Sintsov wrote:
One could probably rename it to another temporary name, then move it to
the final location during the next request, according to a previously
passed cookie?
Speaking of cookies, there are millions of ways of looking at them: FF's
WebDeveloper extension, HTTP