[Wikitech-l] Major trouble with Cirrus forceSearchIndex.php script

2019-11-16 Thread Aran via Wikitech-l
Hi all,

We're having terrible trouble with the Cirrus search maintenance script
for initialising the elastic indexes:
forceSearchIndex.php --skipLinks --indexOnSkip...

It's happening with MW 1.31 .. 1.33; we're using the Redis job queue and a
single instance of Elasticsearch on the same host (these are low-traffic
wikis). Debian 10.2, PHP 7.3.

No matter what parameters we use (--queue or not, different --maxJobs,
--fromId/--toId, --batchSize, etc.) we're always finding that hundreds
of elastic docs are not being created.
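
For reference, the full rebuild sequence we normally run is the same
three commands Erik Bernhardson recommended on this list a while back:

updateSearchIndexConfig.php --startOver
forceSearchIndex.php --skipLinks --indexOnSkip
forceSearchIndex.php --skipParse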

There's nothing about the articles themselves that is preventing it: if
we run the maintenance script on just a single missing page afterwards,
it gets created no problem, and each time this happens the set of
missing docs is largely different.

If anyone has heard of this kind of thing and could point us in the
right direction, that would be awesome!

Thanks a lot,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] InnoDB / MyISAM

2019-07-28 Thread Aran via Wikitech-l
Yep, I just had a look and all the searchindex tables are empty, and the
tables don't seem to be accessed at all when using $wgSearchType =
'DcsCirrusSearch', so there's definitely no need for any MyISAM anywhere :-)

On 28/07/19 2:33 PM, Aran via Wikitech-l wrote:
> We're using InnoDB > 5.7 so full text search is definitely supported -
> but also we use Elastic search anyway, does this mean we don't even need
> the searchindex table?
>
>
> On 28/07/19 1:07 PM, Manuel Arostegui wrote:
>> On Sat, Jul 27, 2019 at 4:03 PM Bartosz Dziewoński 
>> wrote:
>>
>>> The 'searchindex' table uses MyISAM because until recently, InnoDB did
>>> not support fulltext indexes, which MediaWiki uses for the search. All
>>> other tables should use InnoDB.
>>>
>>> According to https://stackoverflow.com/a/9397060 fulltext indexes are
>>> available on InnoDB since MySQL 5.6.4. If you're running that version or
>>> newer, it is possible you could use InnoDB for that table, but as far as
>>> I know no one has tried it before.
>>>
>>> According to https://www.mediawiki.org/wiki/Compatibility#Database
>>> MediaWiki only requires MySQL 5.5.8, so we can't change that in our
>>> table definitions (yet).
>>>
>>> No idea about the 'math' table.
>> The math table isn't used, and it is being dropped in production:
>> https://phabricator.wikimedia.org/T196055
>>
>> Regarding MyISAM vs InnoDB: Always use InnoDB unless you have a very good
>> reason to use MyISAM (like the one mentioned about full-text indexes).
>>
>> Cheers
>> Manuel.
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] InnoDB / MyISAM

2019-07-28 Thread Aran via Wikitech-l
We're using InnoDB > 5.7 so full text search is definitely supported -
but also we use Elastic search anyway, does this mean we don't even need
the searchindex table?


On 28/07/19 1:07 PM, Manuel Arostegui wrote:
> On Sat, Jul 27, 2019 at 4:03 PM Bartosz Dziewoński 
> wrote:
>
>> The 'searchindex' table uses MyISAM because until recently, InnoDB did
>> not support fulltext indexes, which MediaWiki uses for the search. All
>> other tables should use InnoDB.
>>
>> According to https://stackoverflow.com/a/9397060 fulltext indexes are
>> available on InnoDB since MySQL 5.6.4. If you're running that version or
>> newer, it is possible you could use InnoDB for that table, but as far as
>> I know no one has tried it before.
>>
>> According to https://www.mediawiki.org/wiki/Compatibility#Database
>> MediaWiki only requires MySQL 5.5.8, so we can't change that in our
>> table definitions (yet).
>>
>> No idea about the 'math' table.
>
> The math table isn't used, and it is being dropped in production:
> https://phabricator.wikimedia.org/T196055
>
> Regarding MyISAM vs InnoDB: Always use InnoDB unless you have a very good
> reason to use MyISAM (like the one mentioned about full-text indexes).
>
> Cheers
> Manuel.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] InnoDB / MyISAM

2019-07-27 Thread Aran via Wikitech-l
Hello,

I've recently noticed that in a wiki farm I look after, about 20% of the
tables are MyISAM, but it doesn't seem very consistent - i.e. some
tables that are MyISAM in some wikis are InnoDB in others. searchindex
and math look to be all MyISAM.

From what I've read about them it seems that InnoDB is the best option,
especially on recent MySQL versions. Would it be a good idea to change
them all to InnoDB?
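
If so, something like this is roughly what I have in mind - a rough,
untested sketch, leaving searchindex and math alone until I know why
they're MyISAM:

$dbw = wfGetDB( DB_MASTER );
$res = $dbw->query( "SELECT TABLE_NAME FROM information_schema.TABLES" .
    " WHERE TABLE_SCHEMA = '" . $dbw->getDBname() . "' AND ENGINE = 'MyISAM'" );
foreach ( $res as $row ) {
    if ( in_array( $row->TABLE_NAME, [ 'searchindex', 'math' ] ) ) {
        continue; // these look like they're MyISAM deliberately
    }
    $dbw->query( "ALTER TABLE " . $row->TABLE_NAME . " ENGINE=InnoDB" );
}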

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Redis job runner service

2019-06-22 Thread Aran via Wikitech-l
Hello,

Could somebody please help me understand the configuration of the
Wikimedia Job Runner service?
https://github.com/wikimedia/mediawiki-services-jobrunner/blob/master/jobrunner.sample.json

I'm not quite clear on how groups work; it would seem that each group's
"runners" parameter is the number of threads assigned to that group..?
But then why does the "gwt" group have runners=0?

Thanks,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Redis job runner

2019-06-19 Thread Aran via Wikitech-l
Hello,

I've set up the redisJobRunnerService and redisJobChronService from here:
https://github.com/wikimedia/mediawiki-services-jobrunner

They're working well and processing jobs as they should, but I was
wondering if anyone knows of any documentation on how to configure it to
fine-tune the prioritisation of different job types. All I've really been
able to find about the configuration is the README in the repo.

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Write expectations exception

2019-06-18 Thread Aran via Wikitech-l
Hi Kosta,

I had initially not wanted to use the job queue, because it can take
minutes for jobs to execute and most of our writes need to happen within
a few seconds. But I see from that manual page that I could define a job
class and a job-runner setup that would let the important jobs get
executed almost immediately, which makes the job queue a workable
solution.
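
So the plan is roughly: define a job class for the urgent writes, push it
from the request, and keep a dedicated runner working on just that type.
A minimal sketch ('urgentWrite' and the params are made-up names):

// extension setup:
$wgJobClasses['urgentWrite'] = UrgentWriteJob::class;

class UrgentWriteJob extends Job {
    public function __construct( Title $title, array $params ) {
        parent::__construct( 'urgentWrite', $title, $params );
    }
    public function run() {
        // do the actual DB write here, using $this->params
        return true;
    }
}

// in the web request, instead of writing directly:
JobQueueGroup::singleton()->push( new UrgentWriteJob( $title, [ 'value' => $value ] ) );

A dedicated runner (e.g. runJobs.php --type urgentWrite on a tight loop,
or its own group in the redis job runner config) then keeps those jobs
from waiting behind everything else.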

Thanks,
Aran


On 18/06/19 11:41 AM, Kosta Harlan wrote:
> My understanding is that eventually there will be enforcement in the
> WMF production environments, but I’m not sure about MediaWiki itself.
>
> If you’re doing writes on GET requests, the job queue might be useful
> to you: https://www.mediawiki.org/wiki/Manual:Job_queue
>
> Kosta
>
>> On Jun 17, 2019, at 8:29 AM, Aran via Wikitech-l
>> <wikitech-l@lists.wikimedia.org> wrote:
>>
>> In a MediaWiki-based project I'm working on I'm getting many of these
>> kinds of exceptions: [DBPerformance] Expectation (writes <= 0) by
>> MediaWiki::main not met (actual: 8)
>>
>> I've read up on the Database transactions article on mediawiki.org and
>> can see that to remove the exceptions we'd need to conform to some very
>> specific criteria for all our db writes, which is quite a problem for
>> this particular project.
>>
>> My question is, are these criteria ever likely to be enforced in future
>> MW versions, or will they always just be warnings to help guide
>> performance improvements?
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Write expectations exception

2019-06-17 Thread Aran via Wikitech-l
In a MediaWiki-based project I'm working on I'm getting many of these
kinds of exceptions: [DBPerformance] Expectation (writes <= 0) by
MediaWiki::main not met (actual: 8)

I've read up on the Database transactions article on mediawiki.org and
can see that to remove the exceptions we'd need to conform to some very
specific criteria for all our db writes, which is quite a problem for
this particular project.
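
One option I've been considering (a minimal sketch only, assuming the
write doesn't have to block the response - the table and field names are
made up) is to push the write into a deferred update so it runs after the
GET request has been answered:

DeferredUpdates::addCallableUpdate( function () use ( $value, $pageId ) {
    $dbw = wfGetDB( DB_MASTER );
    $dbw->update( 'our_table', [ 'our_field' => $value ],
        [ 'page_id' => $pageId ], __METHOD__ );
} );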

My question is, are these criteria ever likely to be enforced in future
MW versions, or will they always just be warnings to help guide
performance improvements?


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Problem importing images

2018-11-18 Thread Aran via Wikitech-l
Hi all,

I've installed a new 1.31 and imported content from an xml dump using
importDump.php which included file descriptions, then imported the
images from a directory of files using importImages.php.

About half of the files have imported successfully, but the other half
say the image is not found when I go to the image page in the wiki. I
can see the image in the proper place in the upload directory, so it has
imported properly.

If I try to re-upload the image manually I can't, because it says an
exact replica already exists, but clicking on the link to that already
existing file still says "No file by this name exists, but you can upload
it".

I've tried running all the related maintenance scripts such as
rebuildImages.php with no success :-(

Any ideas?

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] $wgVisualEditorParsoidForwardCookies

2018-10-19 Thread Aran via Wikitech-l
Thanks, that's the page I was following, but I haven't needed to do any
of those things it says are needed for private wikis and it's saving
edits no problem.

On 19/10/18 2:47 PM, C. Scott Ananian wrote:
> That's a good question!  I'm not aware of any recent changes that
> would have made $wgVisualEditorParsoidForwardCookies not necessary,
> but I *do* recall that we don't use those long-form configuration
> variable names any more.
>
> Latest documentation seems to be at
> https://www.mediawiki.org/wiki/Extension:VisualEditor#Linking_with_Parsoid_in_private_wikis
>
> Hope this helps!
>  --scott
>
> On Fri, Oct 19, 2018 at 7:25 AM Aran via Wikitech-l
> <wikitech-l@lists.wikimedia.org> wrote:
>
> Hi,
>
> just wondering what the situation is with
> $wgVisualEditorParsoidForwardCookies these days. The documentation for
> visual editor says that it's needed if you have a wiki that's not
> editable by the public. But I've just set it up on such a wiki and
> editing/saving an article through visual editor works fine without me
> having set this or allowed anonymous edits by localhost. Is there no
> longer any need to worry about settings specifically for locked
> down wikis?
>
> Thanks,
>
> Aran
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> -- 
> (http://cscott.net)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] $wgVisualEditorParsoidForwardCookies

2018-10-19 Thread Aran via Wikitech-l
Hi,

Just wondering what the situation is with
$wgVisualEditorParsoidForwardCookies these days. The documentation for
VisualEditor says that it's needed if you have a wiki that's not
editable by the public. But I've just set it up on such a wiki, and
editing/saving an article through VisualEditor works fine without me
having set this or allowed anonymous edits from localhost. Is there no
longer any need to worry about settings specifically for locked-down wikis?
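
(For reference, the setting in question is just a LocalSettings.php line,
$wgVisualEditorParsoidForwardCookies = true;, which forwards the editing
user's cookies to Parsoid so that its API requests back to a private wiki
are authenticated.)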

Thanks,

Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bitcoin donations

2017-08-22 Thread Aran
Cool! I've made my donation - I see that after clicking "cash" on the
Wikipedia form, the "other ways to give" section shows up. I think that
should be permanently visible, because I didn't guess that I needed to
click there to find other methods.


On 22/08/17 19:59, David Gerard wrote:
> https://wikimediafoundation.org/wiki/Ways_to_Give
>
> Thank you very much!
>
>
> - d.
>
> On 22 August 2017 at 23:58, Aran <a...@organicdesign.co.nz> wrote:
>> If you guys had a bitcoin option in your donation form you'd get more
>> donations (like from me!). I don't have any balance in paypal, but
>> sending some bitcoin would be easy.
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Bitcoin donations

2017-08-22 Thread Aran
If you guys had a bitcoin option in your donation form you'd get more
donations (like from me!). I don't have any balance in paypal, but
sending some bitcoin would be easy.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question re what-links-here

2017-06-02 Thread Aran
Perhaps 1.28's script already does this, since my 1.28s didn't have this
problem - their backlinks were up to date immediately after running the
script.


On 02/06/17 16:34, zppix e wrote:
> Sounds like a decent compromise. I'm not too familiar with the code behind
> maintenance scripts to do this myself, so I could make a ticket if need be.
>
> On Friday, June 2, 2017, Gergo Tisza  wrote:
>
>> On Fri, Jun 2, 2017 at 7:38 AM, zppix e wrote:
>>
>>> I was wondering if maybe we should have refreshLinks.php invaldiate
>> cache,
>>> cause when I think of RefreshLinks.php I think it will update the links
>>> instantly, not overtime.
>>>
>> On large sites that would have an unwelcome performance impact. Adding a
>> command-line option to do so would make sense I guess.
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org 
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question re what-links-here

2017-05-31 Thread Aran
What I meant by the code being changed programmatically is that some of
the content in the wiki is in the form of parser-functions, and the PHP
code responsible for rendering the parser-function was changed, which
meant that the wiki's content changed without any edits having been made.
Thus I needed to refresh the links manually with the maintenance script.


On 31/05/17 15:17, Aran wrote:
> No, I just ran the refreshLinks.php maintenance script from the shell.
>
>
> On 31/05/17 15:12, יגאל חיטרון wrote:
>> Did you use a bot? An API javascript? In any case, try the rest one.
>> Igal
>>
>> On May 31, 2017 21:04, "Aran" <a...@organicdesign.co.nz> wrote:
>>
>>> Hello,
>>>
>>> I have some wikis which have had some of their content changed
>>> programmatically due to the code in a parser-function being changed.
>>>
>>> In the wikis that are MediaWiki 1.27, the links in the what-links-here
>>> didn't update after I ran the refreshLinks maintenance script. The ones
>>> that are 1.28's in the same situation the links updated fine after
>>> running the script... is this a known issue? is there any way I can
>>> update the links in the 1.27's without upgrading the wiki?
>>>
>>> Thanks,
>>> Aran
>>>
>>>
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question re what-links-here

2017-05-31 Thread Aran
No, I just ran the refreshLinks.php maintenance script from the shell.


On 31/05/17 15:12, יגאל חיטרון wrote:
> Did you use a bot? An API javascript? In any case, try the rest one.
> Igal
>
> On May 31, 2017 21:04, "Aran" <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I have some wikis which have had some of their content changed
>> programmatically due to the code in a parser-function being changed.
>>
>> In the wikis that are MediaWiki 1.27, the links in the what-links-here
>> didn't update after I ran the refreshLinks maintenance script. The ones
>> that are 1.28's in the same situation the links updated fine after
>> running the script... is this a known issue? is there any way I can
>> update the links in the 1.27's without upgrading the wiki?
>>
>> Thanks,
>> Aran
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question re what-links-here

2017-05-31 Thread Aran
Strange, I've just noticed after coming back to it that the 1.27's are
looking fine now... does the refreshLinks script launch child processes
in the background or something?


On 31/05/17 15:03, Aran wrote:
> Hello,
>
> I have some wikis which have had some of their content changed
> programmatically due to the code in a parser-function being changed.
>
> In the wikis that are MediaWiki 1.27, the links in the what-links-here
> didn't update after I ran the refreshLinks maintenance script. The ones
> that are 1.28's in the same situation the links updated fine after
> running the script... is this a known issue? is there any way I can
> update the links in the 1.27's without upgrading the wiki?
>
> Thanks,
> Aran
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Question re what-links-here

2017-05-31 Thread Aran
Hello,

I have some wikis which have had some of their content changed
programmatically due to the code in a parser-function being changed.

In the wikis that are MediaWiki 1.27, the links in the what-links-here
didn't update after I ran the refreshLinks maintenance script. The ones
that are 1.28's in the same situation the links updated fine after
running the script... is this a known issue? is there any way I can
update the links in the 1.27's without upgrading the wiki?

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] CirrusSearch forceSearchIndex.php issue

2017-04-02 Thread Aran
Hello,

I've found that running forceSearchIndex.php with the --skipLinks
--indexOnSkip options is only processing about 1000-1500 pages and then
stopping. It's not due to an error or anything, because I can set --fromId
and run it again and it does another batch. I've tried setting --limit
and --batchSize and all sorts, but nothing allows it to do more than this
amount at a time - does anyone have any idea what might be going on here?

(It's happening on both MW 1.27 with Elasticsearch 1.7.5 and MW 1.28
with Elasticsearch 2.4.4)

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Problem with updateSearchIndexConfig.php --startover

2017-03-31 Thread Aran
For some reason forceSearchIndex.php --skipLinks --indexOnSkip is only
doing around 1000 pages and then stopping; if I set --fromId to the next
ID it does another 1000 or so, and setting --limit to a high number does
not allow it to do more...?


On 31/03/17 13:02, Erik Bernhardson wrote:
> On Fri, Mar 31, 2017 at 8:25 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I have some wikis running elastic search and after some development
>> updates I find I need to rebuild the search indexes which I do as follows:
>>
>> .../CirrusSearch/maintenance/forceSearchIndex.php --skipLinks
>> --indexOnSkip
>> .../CirrusSearch/maintenance/forceSearchIndex.php --skipParse
>>
>> But I found that this doesn't work for new wikis that don't have indexes
>> at all and so I put the following command before the other two:
>>
>> .../CirrusSearch/maintenance/updateSearchIndexConfig.php --startOver
>>
>>
> The above three commands should be all you need, I use it all the time for
> resetting development indices. The full order of commands should be:
>
> updateSearchIndexConfig.php --startOver
> forceSearchIndex.php --skipLinks --indexOnSkip
> forceSearchIndex.php --skipParse
>
> Later I found that this resulted in only the first namespace in each
>> wiki being included in any search results...
>>
>> What is the proper command I should run to completely nuke and rebuild
>> all indexes on a wiki?
>>
>> Thanks,
>> Aran
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Problem with updateSearchIndexConfig.php --startover

2017-03-31 Thread Aran
Thanks, that's good to know - I've narrowed my problem down a bit
further and found that the problem is with the second of those commands,
not the first. For some reason forceSearchIndex --skipLinks
--indexOnSkip is only processing one namespace; it will process others
only if I explicitly specify them with the --namespace option?!


On 31/03/17 13:02, Erik Bernhardson wrote:
> On Fri, Mar 31, 2017 at 8:25 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I have some wikis running elastic search and after some development
>> updates I find I need to rebuild the search indexes which I do as follows:
>>
>> .../CirrusSearch/maintenance/forceSearchIndex.php --skipLinks
>> --indexOnSkip
>> .../CirrusSearch/maintenance/forceSearchIndex.php --skipParse
>>
>> But I found that this doesn't work for new wikis that don't have indexes
>> at all and so I put the following command before the other two:
>>
>> .../CirrusSearch/maintenance/updateSearchIndexConfig.php --startOver
>>
>>
> The above three commands should be all you need, I use it all the time for
> resetting development indices. The full order of commands should be:
>
> updateSearchIndexConfig.php --startOver
> forceSearchIndex.php --skipLinks --indexOnSkip
> forceSearchIndex.php --skipParse
>
> Later I found that this resulted in only the first namespace in each
>> wiki being included in any search results...
>>
>> What is the proper command I should run to completely nuke and rebuild
>> all indexes on a wiki?
>>
>> Thanks,
>> Aran
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Problem with updateSearchIndexConfig.php --startover

2017-03-31 Thread Aran
Hello,

I have some wikis running elastic search and after some development
updates I find I need to rebuild the search indexes which I do as follows:

.../CirrusSearch/maintenance/forceSearchIndex.php --skipLinks --indexOnSkip
.../CirrusSearch/maintenance/forceSearchIndex.php --skipParse

But I found that this doesn't work for new wikis that don't have indexes
at all and so I put the following command before the other two:

.../CirrusSearch/maintenance/updateSearchIndexConfig.php --startOver

Later I found that this resulted in only the first namespace in each
wiki being included in any search results...

What is the proper command I should run to completely nuke and rebuild
all indexes on a wiki?

Thanks,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CirrusSearch index incompleteness problem

2017-01-28 Thread Aran
This has finally been solved!

The problem was a parser-function that relied on the $wgTitle global
which was not available in the context of a command-line index update.
Now it's using $parser->getTitle() instead of relying on a global and
all results are present :-)
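
In case it helps anyone else, the fixed pattern looks roughly like this
(simplified - 'ourfunc' is a made-up name):

$wgHooks['ParserFirstCallInit'][] = function ( Parser $parser ) {
    $parser->setFunctionHook( 'ourfunc', function ( Parser $parser ) {
        // previously this read $GLOBALS['wgTitle'], which is unset when
        // forceSearchIndex.php parses pages from the command line
        $title = $parser->getTitle();
        return $title ? $title->getPrefixedText() : '';
    } );
    return true;
};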

Thanks for your help,
Aran

On 26/01/17 17:36, Erik Bernhardson wrote:
> On Thu, Jan 26, 2017 at 11:30 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I'm managing some mediawiki 1.27.1's running CirrusSearch 0.2 with
>> Elasticsearch 1.7.5. I been noticing that there are often search results
>> missing so I started running the forceSearchIndex.php script each night
>> on a cron job.
>>
>> But I'm still finding results missing. Today I re-ran the script
>> manually and then found that one of the missing results showed up and
>> that the result count for that term had increased from 18 to 23. I ran
>> the script again and it increased more to 37. I ran more times but the
>> result count did not increase any more.
>>
>> The commands I've been doing are:
>> forceSearchIndex.php --skipLinks --indexOnSkip
>> forceSearchIndex.php --skipParse
>>
>> Is this the correct way to do a full index rebuild? is there some
>> parameter that can ensure that no pages get missed?
>>
>>
> This is the correct way to rebuild documents in place. It sounds like
> something is running into errors while building documents though. Could you
> check your logs for errors related to CirrusSearch?
>
>
>> Thanks,
>> Aran
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CirrusSearch index incompleteness problem

2017-01-26 Thread Aran
I can't see any errors in the debug log... one page had five occurrences
of a search term and only showed up after rebuilding the index many
times. I kept running it because I knew the page should be showing up in
the results and eventually it got included.


On 26/01/17 17:36, Erik Bernhardson wrote:
> On Thu, Jan 26, 2017 at 11:30 AM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hello,
>>
>> I'm managing some mediawiki 1.27.1's running CirrusSearch 0.2 with
>> Elasticsearch 1.7.5. I been noticing that there are often search results
>> missing so I started running the forceSearchIndex.php script each night
>> on a cron job.
>>
>> But I'm still finding results missing. Today I re-ran the script
>> manually and then found that one of the missing results showed up and
>> that the result count for that term had increased from 18 to 23. I ran
>> the script again and it increased more to 37. I ran more times but the
>> result count did not increase any more.
>>
>> The commands I've been doing are:
>> forceSearchIndex.php --skipLinks --indexOnSkip
>> forceSearchIndex.php --skipParse
>>
>> Is this the correct way to do a full index rebuild? is there some
>> parameter that can ensure that no pages get missed?
>>
>>
> This is the correct way to rebuild documents in place. It sounds like
> something is running into errors while building documents though. Could you
> check your logs for errors related to CirrusSearch?
>
>
>> Thanks,
>> Aran
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] CirrusSearch index incompleteness problem

2017-01-26 Thread Aran
Hello,

I'm managing some MediaWiki 1.27.1s running CirrusSearch 0.2 with
Elasticsearch 1.7.5. I've been noticing that there are often search
results missing, so I started running the forceSearchIndex.php script
each night on a cron job.

But I'm still finding results missing. Today I re-ran the script
manually and then found that one of the missing results showed up and
that the result count for that term had increased from 18 to 23. I ran
the script again and it increased further, to 37. I ran it more times
but the result count did not increase any more.

The commands I've been doing are:
forceSearchIndex.php --skipLinks --indexOnSkip
forceSearchIndex.php --skipParse

Is this the correct way to do a full index rebuild? is there some
parameter that can ensure that no pages get missed?

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Creating a broken link

2017-01-25 Thread Aran
You need to add the "new" class to a link for it to be a red link, but
there's no simple way to do this in standard wikitext markup. You'd have
to do something like enable raw HTML, add an extension that allows adding
attributes to link syntax, or add a CSS rule so that you could, for
example, put a span with class "new" inside the link anchor.
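
For example, from PHP something along these lines produces the same kind
of markup MediaWiki itself emits for a missing page ("Wanted page" is
just a placeholder title):

$title = Title::newFromText( 'Wanted page' );
$html = Html::element( 'a', [
    'href'  => $title->getLocalURL( [ 'action' => 'edit', 'redlink' => 1 ] ),
    'class' => 'new',
    'title' => $title->getPrefixedText() . ' (page does not exist)',
], $title->getText() );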


On 25/01/17 11:52, Victor Porton wrote:
> On Wed, 2017-01-25 at 13:45 +, John wrote:
>> Does the page exist already?
> No. The issue is to create a link to a page which does not exist.
>
> I can hard-code HTML link in my PHP script, but I wonder if there is a
> better way.
>
>> On Wed, Jan 25, 2017 at 8:23 AM Victor Porton 
>> wrote:
>>
>>> How to create a broken ("edit", "red") link to a page?
>>>
>>> That is I want to generate a HTML code which displays a link,
>>> clicking
>>> which leads to the editor (for a page). The link should be red.
>>>
>>> What is the right way to do this?
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] New 1.27/1.28 login mechanism

2017-01-17 Thread Aran
Hello,

I have a login system that extends the
AbstractPrimaryAuthenticationProvider and uses an AuthenticationRequest
that returns an empty fields array so that the login form is bypassed
completely and login is determined by some other environmental
parameters. But in 1.27 this does not bypass the login form.
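
For context, the relevant bits look roughly like this (simplified;
userFromEnvironment() stands in for our own logic):

class EnvAuthRequest extends MediaWiki\Auth\AuthenticationRequest {
    public function getFieldInfo() {
        return []; // no form fields - we expected this to skip the form
    }
}

// in the provider (extends AbstractPrimaryAuthenticationProvider):
public function beginPrimaryAuthentication( array $reqs ) {
    $username = $this->userFromEnvironment(); // our environment check
    return $username !== null
        ? MediaWiki\Auth\AuthenticationResponse::newPass( $username )
        : MediaWiki\Auth\AuthenticationResponse::newAbstain();
}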

What is the proper way I should be making the login page determine login
without a login form?

Thanks,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Adding custom parameters to Elastica back-end

2016-10-31 Thread Aran
Hi Guys,

I'm using CirrusSearch on MW 1.27 and wondering if there's a way to add
custom parameters to the Elastica back-end, such as, for example:

"sort": [ { "namespace_text": "asc" } ]

Thanks,

Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changing a default system message

2016-10-26 Thread Aran
Thanks a lot, that hook did the trick :-)
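
For the record, what we ended up with is roughly this ('myext-comment' is
a message key we define in the extension's own i18n files):

$wgHooks['MessageCache::get'][] = function ( &$lckey ) {
    if ( $lckey === 'talk' ) {
        $lckey = 'myext-comment';
    }
    return true;
};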


On 26/10/16 18:53, Krinkle wrote:
> If you want this to be part of site configuration or an extension, you can
> consider using the 'MessageCache::get' hook, which allows you to override
> which key will be retrieved. (For an individual site, I'd recommend
> changing the value through the MediaWiki namespace page for that interface
> message key.)
>
> We use this hook at Wikimedia to change the default for some localisation
> messages on all Wikimedia wikis at once - while still allowing individual
> sites to also override themselves via the MediaWiki-namespace.
>
> See the WikimediaMessages extension for an example:
>
> https://www.mediawiki.org/wiki/Manual:Hooks/MessageCache::get
>
> https://github.com/wikimedia/mediawiki-extensions-WikimediaMessages/blob/d742c363/WikimediaMessages.hooks.php#L22-L53
>
> This example may be more complicated than what you need, but it's a
> starting point from which you can simplify. here is a bit more
>
> -- Krinkle
>
>
>
>
> On Sat, Oct 22, 2016 at 7:12 PM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hi,
>>
>> I have an extension that change some of the default system messages for
>> example "talk" to "comment", but since upgrading to 1.27 these messages
>> no longer change. I've tried rebuildmessages.php and
>> rebuildLocalisationCache.php but nothing seems to allow me to override
>> these default messages any more. New messages that the extension
>> introduces can be changed no problem.
>>
>> Does anyone here know the proper procedure for doing this?
>>
>> Thanks,
>>
>> Aran
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Override system message from php

2016-10-23 Thread Aran
Can someone please tell me a way to override a system message such as
'nosuchuser' or 'wrongpassword' from PHP?


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changing a default system message

2016-10-22 Thread Aran
Oh sorry, I misread your message - you mean the article, not the file...
but still, it would be preferable if the functionality could all be
packaged into the extension without needing to apply content as well.


On 22/10/16 16:15, James Hare wrote:
> On Saturday, October 22, 2016 at 2:12 PM, Aran wrote:
>> Hi,
>>  
>> I have an extension that change some of the default system messages for
>> example "talk" to "comment", but since upgrading to 1.27 these messages
>> no longer change. I've tried rebuildmessages.php and
>> rebuildLocalisationCache.php but nothing seems to allow me to override
>> these default messages any more. New messages that the extension
>> introduces can be changed no problem.
>>  
>> Does anyone here know the proper procedure for doing this?
>>  
>> Thanks,
>>  
>> Aran
>>  
> Is there any reason this needs to be handled by the extension? Can you edit 
> the corresponding MediaWiki-namespace page to change the label?
>
> —
> James Hare
> http://harej.co  
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changing a default system message

2016-10-22 Thread Aran
All the functionality really needs to be packaged into the extension so
that no code-base files are being changed - and also that's only one
example; there are other messages changed too that are not namespaces,
such as the failed-login message.


On 22/10/16 16:15, James Hare wrote:
> On Saturday, October 22, 2016 at 2:12 PM, Aran wrote:
>> Hi,
>>  
>> I have an extension that change some of the default system messages for
>> example "talk" to "comment", but since upgrading to 1.27 these messages
>> no longer change. I've tried rebuildmessages.php and
>> rebuildLocalisationCache.php but nothing seems to allow me to override
>> these default messages any more. New messages that the extension
>> introduces can be changed no problem.
>>  
>> Does anyone here know the proper procedure for doing this?
>>  
>> Thanks,
>>  
>> Aran
>>  
> Is there any reason this needs to be handled by the extension? Can you edit 
> the corresponding MediaWiki-namespace page to change the label?
>
> —
> James Hare
> http://harej.co  
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Changing a default system message

2016-10-22 Thread Aran
Hi,

I have an extension that changes some of the default system messages, for
example "talk" to "comment", but since upgrading to 1.27 these messages
no longer change. I've tried rebuildmessages.php and
rebuildLocalisationCache.php, but nothing seems to allow me to override
these default messages any more. New messages that the extension
introduces can be changed with no problem.

Does anyone here know the proper procedure for doing this?

Thanks,

Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SQLite Mediawiki support

2016-09-26 Thread Aran
I developed a MediaWikiLite system many years ago which worked
reasonably well. It was for having a wiki on a memory stick that
included the content and the ability to edit it in the field without net
access. It ran on SQLite and used Nanoweb, a web server written in PHP,
to reduce dependencies further.

It's very out of date now, but may be helpful:

https://www.organicdesign.co.nz/MediaWikiLite


On 26/09/16 11:00, Jefsey wrote:
> The personal way I am using wikimedia as an SQLite textbase I can
> easily copy/backup from machine to machine and modify through
> external bundled applications leads me to consider there is a need for
> a wikimedia user 100% compatible "WIKILite" integrated/maintained
> solution set.
>
> 1. has something like that been investigated in the past?
> 2. I would be interested in comments on the idea?
> 3. also about the approach that can best help users and possibly
> wikimedia development?
>
> I note that as a networked individual/professional I am interested in
> multi-agent oriented interwares and would like to investigate
> "wikilite" networking capabilities (both about what networked
> architectures could bring, and about capacity-based content
> security/protection).
>
> Thank you for your attention.
> Best
> jfc
>  
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CirrusSearch index not update after edits

2016-08-08 Thread Aran
Ah yes that was the problem, the run rate was set to zero! Thanks a lot :-)
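
(For anyone finding this later: that's $wgJobRunRate in LocalSettings.php.
With $wgJobRunRate = 0; no jobs are run during page requests, so the
CirrusSearch update jobs just sit in the queue - setting it back to a
positive value, or running maintenance/runJobs.php from cron, clears them.)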


On 07/08/16 16:03, John wrote:
> Check your job queue?
>
> On Sun, Aug 7, 2016 at 2:43 PM, Aran <a...@organicdesign.co.nz> wrote:
>
>> Hi,
>>
>> I've installed the current CirrusSearch on a MediaWiki 1.27 with local
>> ElasticSearch 1.7.5 and everything went smoothly, except that the index
>> doesn't update by itself after I edit pages. Only if I re-run the
>> maintenance script do new edits get included in search results.
>>
>> I don't have $wgDisableSearchUpdate=true in LocalSettings.php and have
>> tried settings it to false too.
>>
>> Any ideas what might be the trouble?
>>
>> Thanks,
>>
>> Aran
>>
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] CirrusSearch index not update after edits

2016-08-07 Thread Aran
Hi,

I've installed the current CirrusSearch on a MediaWiki 1.27 with local
ElasticSearch 1.7.5 and everything went smoothly, except that the index
doesn't update by itself after I edit pages. Only if I re-run the
maintenance script do new edits get included in search results.

I don't have $wgDisableSearchUpdate=true in LocalSettings.php, and I have
tried setting it to false too.

Any ideas what might be the trouble?

Thanks,

Aran



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming SyntaxHighlight_GeSHi changes

2015-06-27 Thread Aran
A nodejs service would be nice, but probably not on the cards in the
near future - note however that the extension does include a simple
method of ensuring that the highlighting module is called for blocks
even if they're inserted into the page after load time such as in Ajax
calls like live preview.


On 23/06/15 12:14, Ori Livneh wrote:
 On Tue, Jun 23, 2015 at 7:23 AM, Aran a...@organicdesign.co.nz wrote:

 Also for those that prefer the client to do the work I recently made an
 extension to use the highlight.js module at highlightjs.org.
 https://www.mediawiki.org/wiki/Extension:HighlightJS

 This is very nice, and may be a target for a future migration. It'd be
 great if the extension had a server-side implementation as well in the form
 of a node.js service that MediaWiki could call.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming SyntaxHighlight_GeSHi changes

2015-06-23 Thread Aran
Also for those that prefer the client to do the work I recently made an
extension to use the highlight.js module at highlightjs.org.
https://www.mediawiki.org/wiki/Extension:HighlightJS

On 22/06/15 21:48, Ori Livneh wrote:
 Hello,

 Over the course of the next two days, a major update to the
 SyntaxHighlight_GeSHi extension will be rolled out to Wikimedia wikis. The
 change swaps geshi, the unmaintained PHP library which performs the lexical
 analysis and output formatting of code, for another library, called
 Pygments.

 The roll-out will remove support for 31 languages while adding support for
 several hundred languages not previously supported, including Dart, Rust,
 Julia, APL, Mathematica, SNOBOL, Puppet, Dylan, Racket, Swift, and many
 others. See https://people.wikimedia.org/~ori/geshi_changes.txt for a
 full list. The languages that will lose support are mostly obscure, with
 the notable exception of ALGOL68, Oz, and MMIX.

 The change is expected to slightly improve the time it takes to load and
 render all pages on all wikis (not just those that contain code blocks!),
 at the cost of a slight penalty (about a tenth of a second) on the time it
 takes to save edits which introduce or modify a block of highlighted code
 to an article.

 Lastly, the way the extension handles unfamiliar languages will change.
 Previously, if the specified language was not supported by the extension,
 instead of a code block, the extension would print an error message. From
 now on, it will simply output a plain, unhighlighted block of monospaced
 code.

 The wikitext syntax for highlighting code will remain the same.

 -- Ori
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Getting article content

2015-06-18 Thread Aran
I just noticed that Article::getContent is deprecated now, and the code
says that WikiPage::getContent is now the preferred method. What's the
recommended way to get the current text content of a normal wikitext
article now? Would it be this?
$text = $article->getPage()->getContent()->getNativeData();
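
Or, guarding against pages that have no content, presumably something like:

$content = $article->getPage()->getContent();
$text = $content === null ? '' : ContentHandler::getContentText( $content );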


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RfC] Feature to watch categorylinks

2015-03-30 Thread Aran
The the CategoryWatch extension was created a few years ago for this,
but hasn't been updated in a long time so may need some minor fixes by now.
https://www.mediawiki.org/wiki/Extension:CategoryWatch

On 30/03/15 12:40, Kai Nissen wrote:
 Hi there,

 I created a Request for comments[1] and a corresponding Phabricator
 Task[2] regarding a feature that enables users to watch categories for
 page additions and removals.

 If you are interested in this topic or somehow involved in
 Extension:Echo or watchlist features, feel free to participate in the
 discussion.

 Cheers,
 Kai


 [1]: https://www.mediawiki.org/wiki/Requests_for_comment/Watch_Categorylinks
 [2]: https://phabricator.wikimedia.org/T94414




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Parsoid

2014-08-28 Thread Aran
Hello,

I'm trying to install Parsoid on Ubuntu 12. I installed nodejs from
source, but when I try to install parsoid via apt-get it fails saying
that it depends on nodejs (>= 0.8.0) even though node --version returns
v0.10.31!

Anyone have any ideas what could be wrong?

Cheers,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Parsoid

2014-08-28 Thread Aran
Yeah I tried installing from apt-get first, but it installed 0.6.x,
Ubuntu 12 is quite old now.

On 28/08/14 13:23, Gabriel Wicke wrote:
 On 08/28/2014 08:46 AM, Brad Jorsch (Anomie) wrote:
 On Thu, Aug 28, 2014 at 11:25 AM, Aran a...@organicdesign.co.nz wrote:

 I'm trying to install parsoid on Ubuntu 12. I installed nodejs from
 source, but when I try and install parsoid via apt-get it fails saying
 that it depends on nodejs (>= 0.8.0) even though node --version returns
 v0.10.31!

 Anyone have any ideas what could be wrong?

 The package manager doesn't know anything about software you manually
 installed.

 The ideal thing to do would be to just install the nodejs package: I see
 Ubuntu trusty has 0.10.25, and Debian has 0.10.29 in both testing and
 unstable.
 +1 for using the regular package rather than a manual install from source.
 Normally the right nodejs package should be automatically pulled in when you
 install parsoid from the repository as described in [1]. What happens when
 you just do a 'apt-get install nodejs' ?

 Gabriel

 [1]: https://www.mediawiki.org/wiki/Parsoid/Setup#Ubuntu_.2F_Debian_on_amd64

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Parsoid

2014-08-28 Thread Aran
Yeah that's what it installed, so then I uninstalled and did it from
source instead.

On 28/08/14 13:40, Brad Jorsch (Anomie) wrote:
 On Thu, Aug 28, 2014 at 12:23 PM, Gabriel Wicke gwi...@wikimedia.org
 wrote:

 What happens when you just do a 'apt-get install nodejs' ?

 Presumably it installs nodejs version 0.6.12~dfsg1-1ubuntu1, since that's
 the version available in Ubuntu precise (which I assume is what was meant
 by Ubuntu 12).




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Parsoid

2014-08-28 Thread Aran
Thanks, I was able to use the equivs package to get Parsoid to run
properly - I also then found the following link in some fine print on the
MW Parsoid/Setup page, which works too:
https://github.com/joyent/node/wiki/Installing-Node.js-via-package-manager

On 28/08/14 12:46, Brad Jorsch (Anomie) wrote:
 On Thu, Aug 28, 2014 at 11:25 AM, Aran a...@organicdesign.co.nz wrote:

 I'm trying to install parsoid on Ubuntu 12. I installed nodejs from
 source, but when I try and install parsoid via apt-get it fails saying
 that it depends on nodejs (>= 0.8.0) even though node --version returns
 v0.10.31!

 Anyone have any ideas what could be wrong?

 The package manager doesn't know anything about software you manually
 installed.

 The ideal thing to do would be to just install the nodejs package: I see
 Ubuntu trusty has 0.10.25, and Debian has 0.10.29 in both testing and
 unstable. You may be able to just download the source package and rebuild
 it for precise.

 Or you could try using the equivs package to fake out the package manager.




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Help Please! mangled special characters after import

2014-07-10 Thread Aran
Hello,

I've recently had a server disaster and had to recover my wiki from
backup dumps. But after importing, all the special characters are
mangled, causing many links to be broken, and pages that have special
characters in their titles are missing!

Here's an example of a page with many broken characters and red links
showing for pages with special characters in the titles.
http://www.organicdesign.co.nz/2014_Holiday_in_Brazil

I'm sure this must be fixable, because when I look at the DB content
from the shell by selecting content from the text table, all the special
characters render with no problem...?

I noticed that the mysqldump command was using
--default-character-set=latin1, which may be a legacy setting now?

Help much appreciated
Thanks a lot,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help Please! mangled special characters after import

2014-07-10 Thread Aran
Problem solved thanks!

Just needed the same default charset on import :-)
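
(For the record, that just meant passing the same flag when loading the
dump, along the lines of:

mysql --default-character-set=latin1 wikidb < backup.sql

where "wikidb" and "backup.sql" stand for the real names.)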

On 10/07/14 16:36, Aran wrote:
 Hello,

 I've recently had a server disaster and had to recover my wiki from
 backup dumps. But after importing all the special characters are mangled
 causing many links to be broken and pages that have special characters
 in the titles are missing!

 Here's an example of a page with many broken characters and red links
 showing for pages with special characters in the titles.
 http://www.organicdesign.co.nz/2014_Holiday_in_Brazil

 I'm sure this must be able to be fixed because when I look at the DB
 content from the shell by selecting content from the text table all the
 special characters are rendering with no problem...?

 I noticed that the mysqdump command was using
 --default-character-set=latin1 which may be a legacy setting now?

 Help much appreciated
 Thanks a lot,
 Aran

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Sub-categories in SMW

2014-03-26 Thread Aran

Hello,

Could someone please tell me if there's any way to have an #ask query 
return sub-categories? I've tried all the different configuration 
options and nothing seems to allow it.


For example if I write the query {{#ask:[[Category:Foo]]}} then all the 
normal pages which are categories in Foo will show up in the results, 
but none of the sub-categories of Foo.


Thanks,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Sub-categories in SMW

2014-03-26 Thread Aran

Sorry I phrased that very badly, I meant:

If I write the query {{#ask:[[Category:Foo]]}} then all the normal pages 
which are in Category:Foo will show up in the results, but none of the 
sub-categories of Foo (i.e. category pages which are members of 
Category:Foo).



On 26/03/14 15:19, Aran wrote:

Hello,

Could someone please tell me if there's any way to have an #ask query 
return sub-categories? I've tried all the different configuration 
options and nothing seems to allow it.


For example if I write the query {{#ask:[[Category:Foo]]}} then all 
the normal pages which are categories in Foo will show up in the 
results, but none of the sub-categories of Foo.


Thanks,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Comet

2013-11-14 Thread Aran
Wouldn't WebSocket be the better choice for a full duplex channel?

On 14/11/13 21:12, Lee Worden wrote:
 In the MW extension development I'm doing, I'm thinking of writing
 some operations that use [[Comet_(programming)]] to deliver continuous
 updates to the client, rather than the Ajax pattern of one request,
 one response.

 Has anyone messed with this?  Any code I should crib from, or advice
 or cautionary tales?  Also, if it develops into something useful, I
 could split it out for others to use.

 Thanks,
 LW

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New API options problem

2013-10-10 Thread Aran
My command was:

action = 'options',
token = $token,
format = 'xml',
change = 'realname=Foo Bar'

Which resulted in success but did nothing.


On 10/10/13 06:22, Andre Klapper wrote:
 Hi Aran,

 On Wed, 2013-10-09 at 16:53 -0300, Aran wrote:
 I'm trying to set a user's realname option on a local wiki (1.21.1)
 using the API. I can login successfully, confirm my login token fine and
 then I can get the options token and make the call to change options
 using the token.

 I get Success returned when I change the realname (or any other valid
 option such as password etc). But when I look in the user's preferences,
 or directly in the user db table there's nothing changed.
 Could you provide the exact command, so somebody else could try to
 reproduce?

 andre

 (Also, for future reference, please avoid replying to other emails and
 replacing the subject line with your topic. Start a new thread instead.)


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] New API options problem

2013-10-09 Thread Aran
Hi,

I'm trying to set a user's realname option on a local wiki (1.21.1)
using the API. I can log in successfully, confirm my login token fine,
and then I can get the options token and make the call to change options
using the token.

I get Success returned when I change the realname (or any other valid
option such as password etc). But when I look in the user's preferences,
or directly in the user db table there's nothing changed.

Is there something else I need to configure? or is this a bug?

Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Problem with SVG thumbnails

2013-07-02 Thread Aran
Yep that's my problem, thanks a lot :-)

On 02/07/13 07:28, Bartosz Dziewoński wrote:
 This is most likely bug 45054, fixed in MediaWiki 1.21. It has a
 rather simple workaround, too, see
 https://bugzilla.wikimedia.org/show_bug.cgi?id=45054 .



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Math rendering problem

2013-07-02 Thread Aran
Hi Guys,

I've just upgraded my wiki from 1.19.2 to 1.21.1 to fix the SVG
rendering problem, which is now all fine, but now my Math rendering has
broken. I'm getting the following error:

Failed to parse (PNG conversion failed; check for correct installation
of latex and dvipng (or dvips + gs + convert))

This error seems very common, but none of the solutions I've found have
worked (creating latex.fmt, running fmtutil-sys --all, setting $wgTexvc
etc).

All the packages are installed and were running fine for 1.19, I've
downloaded Extension:Math for 1.21 and ran 'make' which generated a
texvc binary with no errors.

Any ideas what may be wrong?

Thanks,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Math rendering problem

2013-07-02 Thread Aran
I've found that the logged shell command actually does execute properly
and creates the .png when run manually from the shell - even when I run
it as the www-data user that the web server runs as.

When run from the wiki, however, it creates the tmp/hash.tex file but not
the .png, and there's nothing logged anywhere to say why it couldn't.

On 02/07/13 12:32, Aran wrote:
 Hi Guys,

 I've just upgraded my wiki from 1.19.2 to 1.21.1 to fix the SVG
 rendering problem which now is all fine, but now my Math rendering has
 broken. I'm getting the following error:

 Failed to parse (PNG conversion failed; check for correct installation
 of latex and dvipng (or dvips + gs + convert))

 This error seems very common, but none of the solutions I've found have
 worked (creating latex.fmt, running fmtutil-sys --all, setting $wgTexvc
 etc).

 All the packages are installed and were running fine for 1.19, I've
 downloaded Extension:Math for 1.21 and ran 'make' which generated a
 texvc binary with no errors.

 Any ideas what may be wrong?

 Thanks,
 Aran

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Problem with SVG thumbnails

2013-07-01 Thread Aran
Hello,

My wiki's giving an error generating SVG thumbnails, e.g.

Cannot parse integer value '-h214' for -w

Has anyone come across a solution for this? I'm seeing it on many sites
around the net including my own - I think it started after I upgraded to
1.19.

Here's a live example:
http://www.organicdesign.co.nz/File:Nginx-logo.svg

Thanks,
Aran

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Problems uploading word 2007 .doc files

2013-01-09 Thread Aran Dunkley
Thanks, yes that's the same problem, and they have some potential
workarounds there.
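For reference, one workaround of that sort that gets cited on the bug (hedged - I
haven't verified it here, and it disables the ZIP/Java-applet scan wholesale, so
it only makes sense where uploads are restricted to trusted users):

  // In LocalSettings.php
  $wgAllowJavaUploads = true;  // skip the ZIP-structure scan that rejects some .doc/.docx uploads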

On 08/01/13 16:51, Luke Welling WMF wrote:
 Is this bug the same issue?  It looks like somebody put up a partial fix

 https://bugzilla.wikimedia.org/show_bug.cgi?id=38432

 - Luke Welling


 On Mon, Jan 7, 2013 at 6:30 PM, Aran Dunkley a...@organicdesign.co.nzwrote:

 The file was a .doc, but I've tried changing it to docx and get the same
 result. Some other .doc and .docx files that are word 2007 upload no
 problem. But I don't see how it can complain when I have the MimeType
 verification disabled - completely disabling the verification would be
 no problem since only specific users can upload.

 p.s. this is a MW 1.19.2 on Ubuntu Server 11.10 with PHP 5.3.6

 On 07/01/13 21:21, Matma Rex wrote:
 On Mon, 07 Jan 2013 23:47:58 +0100, Aran Dunkley
 a...@organicdesign.co.nz wrote:

 Hello, can someone please help me with this .doc upload problem? I've
 tried everything and even setting  $wgVerifyMimeType to false fails to
 solve it. No matter what I do I keep getting the following error when I
 upload *some* word 2007 .doc files:

 The file is a corrupt or otherwise unreadable ZIP file. It cannot be
 properly checked for security.

 I don't know how that check can even be happening with $wgVerifyMimeType
 disabled, but still the error occurs?!
 Word 2007 uses a .docx format as far as I know, not .doc. Which one
 were you using in your configuration?

 Also, .docx files are essentially ZIP files with magic data inside.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Problems uploading word 2007 .doc files

2013-01-07 Thread Aran Dunkley
Hello, can someone please help me with this .doc upload problem? I've
tried everything and even setting  $wgVerifyMimeType to false fails to
solve it. No matter what I do I keep getting the following error when I
upload *some* word 2007 .doc files:

The file is a corrupt or otherwise unreadable ZIP file. It cannot be
properly checked for security.

I don't know how that check can even be happening with $wgVerifyMimeType
disabled, but still the error occurs?!

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Problems uploading word 2007 .doc files

2013-01-07 Thread Aran Dunkley
The file was a .doc, but I've tried changing it to docx and get the same
result. Some other .doc and .docx files that are word 2007 upload no
problem. But I don't see how it can complain when I have the MimeType
verification disabled - completely disabling the verification would be
no problem since only specific users can upload.

p.s. this is a MW 1.19.2 on Ubuntu Server 11.10 with PHP 5.3.6

On 07/01/13 21:21, Matma Rex wrote:
 On Mon, 07 Jan 2013 23:47:58 +0100, Aran Dunkley
 a...@organicdesign.co.nz wrote:

 Hello, can someone please help me with this .doc upload problem? I've
 tried everything and even setting  $wgVerifyMimeType to false fails to
 solve it. No matter what I do I keep getting the following error when I
 upload *some* word 2007 .doc files:

 The file is a corrupt or otherwise unreadable ZIP file. It cannot be
 properly checked for security.

 I don't know how that check can even be happening with $wgVerifyMimeType
 disabled, but still the error occurs?!

 Word 2007 uses a .docx format as far as I know, not .doc. Which one
 were you using in your configuration?

 Also, .docx files are essentially ZIP files with magic data inside.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Clone a specific extension version

2012-12-05 Thread Aran Dunkley
Hi Guys,
How do I get a specific version of an extension using git?
I want to get Validator 0.4.1.4 and Maps 1.0.5, but I can't figure out
how to use git to do this...
Thanks,
Aran
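For reference, a hedged sketch of one way to do this, assuming the extensions live
in the usual gerrit repositories and that those releases were tagged (the clone URL
and tag name below are assumptions - check with git tag -l):

  git clone https://gerrit.wikimedia.org/r/p/mediawiki/extensions/Validator.git
  cd Validator
  git tag -l             # list the available release tags
  git checkout 0.4.1.4   # assumed tag name; use whatever the list actually shows

The same pattern would apply to the Maps repository for 1.0.5.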

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Awful trouble with 1.19 adding p to html

2012-09-27 Thread Aran Dunkley
Hello, does anyone here know why the parser insists on adding p tags
to HTML results returned by parser functions?

I'm trying to upgrade the TreeAndMenu extension to work with MediaWiki
1.19 and I can't get the parser to stop adding p tags throughout the
result returned by the parser-function expansion.

I've said isHTML = true and tried setting noparse to true and many other
things, but whatever I do - even removing all whitespace, it still adds
p's!
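For reference, a hedged sketch of the return form being described - a
parser-function handler that hands back finished HTML and asks the parser not to
re-parse it (the function name and markup here are hypothetical):

  function efTreeAndMenuRender( $parser ) {
      $html = '<div class="tam-tree">...</div>';  // hypothetical markup built by the extension
      // 'isHTML' marks the result as finished HTML; 'noparse' asks for no further wikitext parsing
      return array( $html, 'isHTML' => true, 'noparse' => true );
  }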

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Awful trouble with 1.19 adding p to html

2012-09-27 Thread Aran Dunkley
I removed all line breaks from the text being returned - but the
confusing thing is that the content being returned is marked as isHTML
(not wikitext), so the parser should be leaving it alone completely and
not checking for line breaks.

On 27/09/12 16:39, Isarra Yos wrote:
 On 27/09/2012 07:56, Aran Dunkley wrote:
 Hello, does anyone here know why the parser insists on adding p tags
 to HTML results returned by parser functions?

 I'm trying to upgrade the TreeAndMenu extension to work with MediaWiki
 1.19 and I can't get the parser to stop adding p tags throughout the
 result returned by the parser-function expansion.

 I've said isHTML = true and tried setting noparse to true and many other
 things, but whatever I do - even removing all whitespace, it still adds
 p's!

 Is it inserting extra linebreaks? I don't really know anything about
 what you're working with specifically, but general wikitext can be
 really thrown off by an unexpected linebreak.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Clone an extension from a specific REL branch

2012-07-11 Thread Aran
Hi,

I'm just wondering how to clone an extension for a particular branch...
e.g. using Subversion I could do this:

svn co
http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_18/extensions/CheckUser

What's the equivalent git command to get that same version of the extension?

Thanks,
Aran
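For reference, a hedged sketch of the git equivalent, assuming the extension's
gerrit repository and that a REL1_18 branch exists there:

  git clone -b REL1_18 https://gerrit.wikimedia.org/r/p/mediawiki/extensions/CheckUser.git

or, from an existing clone:

  git checkout -b REL1_18 origin/REL1_18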

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Technical questions regarding categorisation

2012-02-27 Thread Aran
Hi there,

I'm writing an extension that needs to have the same effect on
categorisation as editing a page and saving it, but from PHP code.
The article being edited/saved has a parser function in it which
categorises the article.

When I manually edit the page and then save it without changing
anything, the categorisation is updated by the parser-function and I can
see the article listed in that category page.

But when I do this same operation from the code, I see the updated
category link at the bottom of the article, but the title does not show
up in the associated category page unless the categorisation took place
via a manual edit/save.

To do the edit/save from the PHP I'm doing this:

$article->doEdit( $article->fetchContent(), $summary, EDIT_UPDATE );

Does anyone here have any idea what might be different between this
approach and a manual edit/save? Neither one creates a new revision, but
the manual process does something more which places the article into the
category page properly.

Cheers,
Aran
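For reference, a hedged sketch of one thing worth trying (not necessarily the
cause): run the links update explicitly after re-parsing, so the categorylinks
rows get refreshed even if doEdit() short-circuits on an unchanged revision.
Class and method names are as I recall them for core of that era:

  global $wgParser, $wgUser;
  $title   = $article->getTitle();
  $options = ParserOptions::newFromUser( $wgUser );
  $output  = $wgParser->parse( $article->fetchContent(), $title, $options );
  $update  = new LinksUpdate( $title, $output );  // rewrites categorylinks, pagelinks, etc.
  $update->doUpdate();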

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] ResourceLoaderDynamicStyles meta tag

2012-02-20 Thread Aran
I noticed that new MediaWiki installations have a new meta tag which is
not in the HTML specification and means that MediaWiki sites cannot be
made W3C compliant.

For example, running the validator over mediawiki.org gives the
following error (amongst a few others complaining about deprecated
cellpadding etc):

Bad value ResourceLoaderDynamicStyles for attribute name on element
meta: Keyword resourceloaderdynamicstyles is not registered.

Would it be worth creating a short description page about the tag in
mediawiki.org and registering it on
http://wiki.whatwg.org/wiki/MetaExtensions to propose it be included in
the specification?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Loading jQuery

2011-08-02 Thread Aran Dunkley
I've figured it out now, thanks :) The problem was that my scripts had
things running inline that needed to be deferred until after the JS
modules had loaded.

On 02/08/11 18:10, Roan Kattouw wrote:
 On Mon, Aug 1, 2011 at 1:55 PM, Aran Dunkley a...@organicdesign.co.nz wrote:
 Yes I'm using both jQuery and jQueryUI in some of my extensions and
 they've broken in MediaWiki 1.17 because neither of them are being
 loaded any more. I've tried many different variations of using
 $wgOut->addModules and setting items in $wgResourceModules but no matter
 what I do I can't seem to get the jQuery scripts to load. The browser's
 error log just says that the $, jQuery and other functions aren't defined.

 That's very strange. jQuery should definitely be there, otherwise
 something is very wrong. I wouldn't know how to debug this without
 more information. Is the JavaScript console reporting any JS errors?
 
 Roan Kattouw (Catrope)
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Loading jQuery

2011-08-01 Thread Aran Dunkley
Hello, I'm trying to update some of my extensions to work with the
ResourceLoader in 1.17, and I can't work out how to get jQuery to load.
I've tried the example at
http://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers

and tried adding it directly,
$out->addModules( array( 'jquery.ui' ) );

But nothing I do will actually get the script to load and become
available, can anyone tell me the new syntax to get jQuery there?

Thanks,
Aran
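For reference, a hedged sketch of the 1.17-style registration pattern (the module
name and file paths are placeholders, and the dependency has to name a registered
module such as jquery.ui.core or jquery.ui.dialog - I don't believe plain
'jquery.ui' is registered as a module name):

  // In the extension's setup file:
  $wgResourceModules['ext.myExtension'] = array(
      'scripts'       => 'myExtension.js',
      'dependencies'  => array( 'jquery.ui.core' ),  // pulls in jQuery itself as well
      'localBasePath' => dirname( __FILE__ ),
      'remoteExtPath' => 'MyExtension',
  );

  // ...and wherever the page is built:
  $wgOut->addModules( 'ext.myExtension' );

As the earlier follow-up in this thread notes, any inline code that uses $ or
jQuery also has to be deferred until the modules have loaded.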

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Loading jQuery

2011-08-01 Thread Aran Dunkley
Yes I'm using both jQuery and jQueryUI in some of my extensions and
they've broken in MediaWiki 1.17 because neither of them are being
loaded any more. I've tried many different variations of using
$wgOut->addModules and setting items in $wgResourceModules but no matter
what I do I can't seem to get the jQuery scripts to load. The browser's
error log just says that the $, jQuery and other functions aren't defined.

On 01/08/11 23:51, Jelle Zijlstra wrote:
 Are you aware that jQuery and jQuery UI are two different things?
 
 2011/8/1 Aran Dunkley a...@organicdesign.co.nz
 
 Hello, I'm trying to update some of my extensions to work with the
 ResourceLoader in 1.17, and I can't work out how to get jQuery to load.
 I've tried the example at

 http://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers

 and tried adding it directly,
  $out->addModules( array( 'jquery.ui' ) );

 But nothing I do will actually get the script to load and become
 available, can anyone tell me the new syntax to get jQuery there?

 Thanks,
 Aran

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Nuking deleted pages and their revisions

2011-05-30 Thread Aran Dunkley
Hello,

I have a lot of deleted pages in my wiki and was wondering how to free
up the database by getting rid of them completely. I don't want to lose
the history of the pages that aren't deleted though - is there an
extension for this?

Thanks,
Aran
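For reference, a hedged sketch of the core maintenance-script route rather than
an extension - check the options for your MediaWiki version and back up first,
since removal from the archive table is irreversible:

  cd /path/to/wiki
  php maintenance/deleteArchivedRevisions.php --delete   # purge archived (deleted) page revisions
  php maintenance/deleteArchivedFiles.php --delete       # likewise for deleted file versions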
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Cite extension problem

2010-12-13 Thread Aran Dunkley
Hello, I've just upgraded the Cite extension on our wikis and it's now
giving me this error on all refs: Cite error: Ran out of custom link
labels for group . Define more in the
[[MediaWiki:cite_link_label_group-]] message.], does anyone know the
quickest way to fix this?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Database dump character set problems

2010-04-15 Thread Aran Dunkley
Hi I'm wondering if anyone can help with this multibyte character 
corruption:
http://aqes.organicdesign.tv/Categor%C3%ADa:Arquitecto

The site was moved from a shared host to a dedicated server, but now 
it's not rendering the multibyte characters properly in the titles, but 
seems ok in the content. The mediawiki version went from 1.15.1 to 
1.15.3 and the MySQL version 5.1.30 to 5.1.45 and all the configuration 
seems identical on the new server.

I've tried all the tips about --default-character-set to latin1 or utf8 
etc mentioned in the MW manual page but nothing seems to change it no 
matter what I use for export or import.

A solution would be greatly appreciated as their site has been down for 
almost a week now while trying to solve this.

Thanks,
Aran
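For reference, a hedged sketch of the dump/restore pair usually suggested for
latin1-schema wikis that hold UTF-8 bytes - the point is to use the same charset
flag on both ends and to stop mysqldump emitting its own SET NAMES, so the bytes
pass through untouched (database names are placeholders):

  mysqldump --default-character-set=latin1 --skip-set-charset olddb > dump.sql
  mysql --default-character-set=latin1 newdb < dump.sql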

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Database dump character set problems

2010-04-15 Thread Aran Dunkley
$wgDBmysql5 is set to false, and SHOW CREATE TABLE for the page table
gives this on both the original and the new server:

mwiki_page | CREATE TABLE `mwiki_page` (
  `page_id` int(8) unsigned NOT NULL AUTO_INCREMENT,
  `page_namespace` int(11) NOT NULL DEFAULT '0',
  `page_title` varchar(255) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL,
  `page_restrictions` tinyblob NOT NULL,
  `page_counter` bigint(20) unsigned NOT NULL DEFAULT '0',
  `page_is_redirect` tinyint(1) unsigned NOT NULL DEFAULT '0',
  `page_is_new` tinyint(1) unsigned NOT NULL DEFAULT '0',
  `page_random` double unsigned NOT NULL DEFAULT '0',
  `page_touched` varchar(14) CHARACTER SET latin1 COLLATE latin1_bin NOT NULL DEFAULT '',
  `page_latest` int(8) unsigned NOT NULL DEFAULT '0',
  `page_len` int(8) unsigned NOT NULL DEFAULT '0',
  PRIMARY KEY (`page_id`),
  UNIQUE KEY `name_title` (`page_namespace`,`page_title`),
  KEY `page_random` (`page_random`),
  KEY `page_len` (`page_len`)
) ENGINE=MyISAM AUTO_INCREMENT=19105 DEFAULT CHARSET=latin1 COLLATE=latin1_spanish_ci

Platonides wrote:
 Aran Dunkley wrote:
   
 Hi I'm wondering if anyone can help with this multibyte character 
 corruption:
 http://aqes.organicdesign.tv/Categor%C3%ADa:Arquitecto

 The site was moved from a shared host to a dedicated server, but now 
 it's not rendering the multibyte characters properly in the titles, but 
 seems ok in the content. The mediawiki version went from 1.15.1 to 
 1.15.3 and the MySQL version 5.1.30 to 5.1.45 and all the configuration 
 seems identical on the new server.

 I've tried all the tips about --default-character-set to latin1 or utf8 
 etc mentioned in the MW manual page but nothing seems to change it no 
 matter what I use for export or import.

 A solution would be greatly appreciated as their site has been down for 
 almost a week now while trying to solve this.

 Thanks,
 Aran
 

 What do you have $wgDBmysql5 set to?
 What's the output of doing SHOW CREATE TABLE page; ?



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] GoogleWave extension

2009-08-03 Thread Aran
If the Google Wave Federation Protocol is really all it's cracked up to
be and becomes a properly open XMPP extension maintained by the IETF,
then it could be good to support it at the level of the back end along
with the database classes...

Magnus Manske wrote:
 On Mon, Aug 3, 2009 at 4:07 PM, Micke Nordinmickew...@gmail.com wrote:
   
 If somebody is interested, I wrote an extension for embedding (the upcoming)
 Google waves in MediaWiki as a tag extension:

 http://www.mediawiki.org/wiki/Extension:GoogleWave

 If someone has a wave developer sandbox account I'd be glad if that person
 would help me test the extension a bit more than I'm able to do by myself.
 A preview of the extension is available at http://mickenordin.se/wiki/Wave
 for now (I know, my wiki looks more like WordPress than MediaWiki :)).

 If nothing shows up try this:
 http://mickenordin.se/mediawiki/index.php?title=Wave&action=purge

 /Micke
 

 Nice! I tried to guess your sandbox name and shared a test wave with
 you, but I don't know if it worked...

 But now: When can we open any MediaWiki page as a wave, replay the
 entire history, use the WYSIWYG wave editor to fix a typo and insert^W
 remove some POV, save and wonder how it will deal with an edit
 conflict? And flagged revisions?

 Seriously, I think there is quite some potential here...

 Magnus

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Problems with the recent version of Cite Extension

2009-03-17 Thread Aran
I've had this trouble too, I just commented out the offending line - 
that could be a problem under some specific condition, but it works fine 
for me as an interim solution.

O. O. wrote:
 Brad Jorsch wrote:
   
 On Mon, Mar 16, 2009 at 04:47:18PM -0700, O. O.  wrote:
 
 I have installed Mediawiki 1.14.0 http://www.mediawiki.org/wiki/Download  
 and am trying to get the  Cite Extension  
 http://www.mediawiki.org/wiki/Extension:Cite version 1.14.0 to work.

 When accessing the Main_Page I get the error:

 Fatal error: Call to undefined method  
 ParserOptions::getIsSectionPreview() in  
 /var/www/wiki2/extensions/Cite/Cite_body.php on line 699
   
 In r47655 (the official 1.14.0 revision), line 699 is a blank line
 between two functions. Note that the to-be-1.14.0 branch was branched
 from trunk r45489, and no subsequent backports added a call to
 getIsSectionPreview.

 In the version currently served when 1.14.x is selected at
 http://www.mediawiki.org/wiki/Special:ExtensionDistributor/Cite (the
 filename claims it to be r45577), line 699 is a blank line between two
 functions.

 The function call to getIsSectionPreview was added in r46271
 (getIsSectionPreview itself was added in r46270), but in that revision
 it was on line 674, not 699. Only from r47190 can I find a version of
 Cite_body.php with a call to getIsSectionPreview on line 699.

 Based on the above, it seems you're using the *trunk* version if the
 Cite Extension, not the 1.14.0 version. It's possible someone changed
 http://www.mediawiki.org/wiki/Special:ExtensionDistributor/Cite to serve
 a correct version of the extension since you downloaded it, I don't know
 how that extension works.
 

 Thanks Brad. I too do not know how the extension works though I did not 
 go into the code. Yes, I am using the version of Cite that is available 
 from http://www.mediawiki.org/wiki/Special:ExtensionDistributor/Cite and 
 I chose the 1.14.x version. As I could not get this to work, I chose the 
 1.13.x version. I checked this yesterday and it did not work for me.

 Thanks again,
 O. O.


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Illegal mix of collations after upgrade to 1.14

2009-03-11 Thread Aran
Hi, I've tried upgrading my 1.11 to 1.14 and get this "illegal mix of
collations" error. I went through the normal upgrade procedure first but
this failed, so I then tried exporting as XML and importing into a 
completely fresh 1.14 install, and still I get the error!

I've found that by setting $wgDBmysql5 to false things seem to work ok, 
but is this really a good solution? I'd like everything to be running up 
to date, not in some backward compatibility mode... does anyone have any 
idea how to fix the problem properly?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Backward compatibility in svn extensions

2009-02-20 Thread Aran
Hi, I'm just wondering what the policy is with regard to changes to
extension code in the svn when a modification is compatible only with
recent versions. Shouldn't extensions be designed to be as backward
compatible as is practical, rather than focussing exclusively on
supporting the current release?

If there is a conflict in this regard, what's the best solution? Is it to
wrap the new change in a conditional block based on $wgVersion?
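For reference, a hedged sketch of that $wgVersion-conditional pattern (the version
number and the two branches are placeholders):

  global $wgVersion;
  if ( version_compare( $wgVersion, '1.14', '>=' ) ) {
      // call the newer core API here
  } else {
      // fall back to what older releases expect
  }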

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] GitTorrent (pie-in-the-sky post)

2008-12-04 Thread Aran
I've looked into this a little, but still in quite a pie-in-the-sky way.
I made the SQLite DB layer with the idea of it being simpler to
incorporate into a client-based MediaWikiLite app, and I made some notes
at these articles:
   http://www.organicdesign.co.nz/MediaWikiLite
   http://www.organicdesign.co.nz/PeerPedia
A lite MediaWiki could then work as a peer, and SQLite could integrate
with a distributed storage system such as a DHT or with Git. Perhaps
interwiki prefixes could be used as an addressing scheme to separate
different wikis within the common distributed storage space?

David Gerard wrote:
 http://advogato.org/article/994.html

 Peer-to-peer git repositories. Imagine a MediaWiki with the data
 stored in git, and updates distributed peer-to-peer.

 Imagine if Wikipedia could be mirrored locally, run on a local
 mirror, where content was pushed and pulled, GPG-Digitally-signed;
 content shared via peer-to-peer instead of overloading the Wikipedia
 servers.

 This would certainly go some way to solving the a good dump is all
 but impossible problem ...


 (so, anyone hacked up a git backend for MediaWiki revisions rather
 than MySQL? :-) )


 - d.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l