[Wikitech-l] Major trouble with Cirrus forceSearchIndex.php script

2019-11-16 Thread Aran via Wikitech-l
Hi all,

We're having terrible trouble with the CirrusSearch maintenance script
for initialising the Elastic indexes:
forceSearchIndex.php --skipLinks --indexOnSkip...

It's happening with MW 1.31 .. 1.33; we're using the Redis job queue and
a single instance of Elasticsearch on the same host (these are
low-traffic wikis). Debian 10.2, PHP 7.3.

No matter what parameters we use (--queue or not, different --maxJobs,
--fromId/--toId, --batchSize, etc.), we always find that hundreds of
Elastic docs are not being created.

There's nothing about the articles themselves that is preventing it: if
we run the maintenance script afterwards on just a single missing one, it
gets created with no problem. Also, each time this problem happens, the
set of missing docs is largely different.
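As a stopgap, since reindexing a single page always works, the missing docs can be replayed in bulk. Here is a minimal sketch of the bookkeeping (the helper name is made up, and both ID lists have to be dumped separately - page IDs from the `page` table, doc IDs from the Elastic index): it groups the missing IDs into contiguous ranges suitable for `--fromId`/`--toId` reruns.

```python
# Sketch: group missing page IDs into contiguous ranges that can be
# replayed one by one with:
#   php forceSearchIndex.php --fromId N --toId M
# Both ID lists must be dumped yourself; this only does the grouping.

def missing_ranges(db_ids, indexed_ids):
    missing = sorted(set(db_ids) - set(indexed_ids))
    ranges = []
    for pid in missing:
        if ranges and pid == ranges[-1][1] + 1:
            ranges[-1] = (ranges[-1][0], pid)  # extend the current range
        else:
            ranges.append((pid, pid))          # start a new range
    return ranges

# Pages 3-4 and 7 are absent from the index:
print(missing_ranges(range(1, 11), [1, 2, 5, 6, 8, 9, 10]))
```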

If anyone has heard of this kind of thing and could point us in the
right direction, that would be awesome!

Thanks a lot,
Aran


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] InnoDB / MyISAM

2019-07-28 Thread Aran via Wikitech-l
Yep, I just had a look: all the searchindex tables are empty, and the
tables don't seem to be accessed at all when using $wgSearchType =
'DcsCirrusSearch', so there's definitely no need for any MyISAM anywhere :-)

On 28/07/19 2:33 PM, Aran via Wikitech-l wrote:
> We're using MySQL > 5.7, so full-text search on InnoDB is definitely
> supported - but we also use Elasticsearch anyway; does this mean we
> don't even need the searchindex table?
>
>
> On 28/07/19 1:07 PM, Manuel Arostegui wrote:
>> On Sat, Jul 27, 2019 at 4:03 PM Bartosz Dziewoński wrote:
>>
>>> The 'searchindex' table uses MyISAM because until recently, InnoDB did
>>> not support fulltext indexes, which MediaWiki uses for the search. All
>>> other tables should use InnoDB.
>>>
>>> According to https://stackoverflow.com/a/9397060 fulltext indexes are
>>> available on InnoDB since MySQL 5.6.4. If you're running that version or
>>> newer, it is possible you could use InnoDB for that table, but as far as
>>> I know no one has tried it before.
>>>
>>> According to https://www.mediawiki.org/wiki/Compatibility#Database
>>> MediaWiki only requires MySQL 5.5.8, so we can't change that in our
>>> table definitions (yet).
>>>
>>> No idea about the 'math' table.
>> The math table isn't used, and it is being dropped in production:
>> https://phabricator.wikimedia.org/T196055
>>
>> Regarding MyISAM vs InnoDB: Always use InnoDB unless you have a very good
>> reason to use MyISAM (like the one mentioned about full-text indexes).
>>
>> Cheers
>> Manuel.



Re: [Wikitech-l] InnoDB / MyISAM

2019-07-28 Thread Aran via Wikitech-l
We're using MySQL > 5.7, so full-text search on InnoDB is definitely
supported - but we also use Elasticsearch anyway; does this mean we don't
even need the searchindex table?


On 28/07/19 1:07 PM, Manuel Arostegui wrote:
> On Sat, Jul 27, 2019 at 4:03 PM Bartosz Dziewoński wrote:
>
>> The 'searchindex' table uses MyISAM because until recently, InnoDB did
>> not support fulltext indexes, which MediaWiki uses for the search. All
>> other tables should use InnoDB.
>>
>> According to https://stackoverflow.com/a/9397060 fulltext indexes are
>> available on InnoDB since MySQL 5.6.4. If you're running that version or
>> newer, it is possible you could use InnoDB for that table, but as far as
>> I know no one has tried it before.
>>
>> According to https://www.mediawiki.org/wiki/Compatibility#Database
>> MediaWiki only requires MySQL 5.5.8, so we can't change that in our
>> table definitions (yet).
>>
>> No idea about the 'math' table.
>
> The math table isn't used, and it is being dropped in production:
> https://phabricator.wikimedia.org/T196055
>
> Regarding MyISAM vs InnoDB: Always use InnoDB unless you have a very good
> reason to use MyISAM (like the one mentioned about full-text indexes).
>
> Cheers
> Manuel.



[Wikitech-l] InnoDB / MyISAM

2019-07-27 Thread Aran via Wikitech-l
Hello,

I've recently noticed that in a wiki farm I look after, about 20% of the
tables are MyISAM, but it doesn't seem very consistent - i.e. some
tables that are MyISAM in some wikis are InnoDB in others. searchindex
and math look to be all MyISAM.

From what I've read, InnoDB seems to be the better option, especially on
recent MySQL versions. Would it be a good idea to convert them all to
InnoDB?
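A sketch of how the conversion could be scripted, assuming the MyISAM table list has already been pulled from information_schema (the schema name in the comment is a placeholder): `searchindex` is skipped by default because it relies on fulltext indexes, which InnoDB only supports from MySQL 5.6.4.

```python
# Emit ALTER statements converting MyISAM tables to InnoDB.
# The input list would come from something like:
#   SELECT TABLE_NAME FROM information_schema.TABLES
#   WHERE TABLE_SCHEMA = 'wikidb' AND ENGINE = 'MyISAM';
# 'searchindex' is skipped: it needs fulltext indexes, which InnoDB
# only gained in MySQL 5.6.4.

def innodb_conversions(tables, skip=("searchindex",)):
    return [f"ALTER TABLE `{t}` ENGINE=InnoDB;"
            for t in tables if t not in skip]

for stmt in innodb_conversions(["math", "searchindex", "objectcache"]):
    print(stmt)
```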

Thanks,
Aran



[Wikitech-l] Redis job runner service

2019-06-22 Thread Aran via Wikitech-l
Hello,

Could somebody please help me understand the configuration of the
Wikimedia Job Runner service?
https://github.com/wikimedia/mediawiki-services-jobrunner/blob/master/jobrunner.sample.json

I'm not quite clear on how groups work. It seems that each group's
"runners" parameter is the number of threads assigned to that group - but
then why does the "gwt" group have runners=0?
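For anyone finding this later, a hedged reading of the sample file (not verified against the service source): `runners` is the number of runner loops the service dedicates to a group, and a group with `runners: 0` is declared (so its job types can be excluded from other groups) but not processed by that instance. A hypothetical fragment with made-up values:

```json
{
    "groups": {
        "basic": {
            "runners": 5,
            "include": ["*"],
            "exclude": ["gwtJob"]
        },
        "gwt": {
            "runners": 0,
            "include": ["gwtJob"]
        }
    }
}
```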

Thanks,
Aran


[Wikitech-l] Redis job runner

2019-06-19 Thread Aran via Wikitech-l
Hello,

I've set up the redisJobRunnerService and redisJobChronService from here:
https://github.com/wikimedia/mediawiki-services-jobrunner

They're working well and processing jobs as they should, but I was
wondering if anyone knows of any documentation on configuring the
prioritisation of the various job types. All I've really been able to
find about the configuration is the README in the repo.

Thanks,
Aran



Re: [Wikitech-l] Write expectations exception

2019-06-18 Thread Aran via Wikitech-l
Hi Kosta,

I had initially not wanted to use the job queue because it can take
minutes for jobs to execute, and most of our writes need to happen within
a few seconds. But I see from that manual page that I could create a
class of jobs, plus a job-runner mechanism, that would let important jobs
always be executed almost immediately, which makes the job queue a
workable solution.
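To make that concrete, a hypothetical jobrunner-service config fragment along those lines (group and job-type names invented): a small dedicated group keeps the urgent job class from waiting behind bulk work.

```json
{
    "groups": {
        "urgent": {
            "runners": 2,
            "include": ["myUrgentJob"]
        },
        "basic": {
            "runners": 4,
            "include": ["*"],
            "exclude": ["myUrgentJob"]
        }
    }
}
```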

Thanks,
Aran


On 18/06/19 11:41 AM, Kosta Harlan wrote:
> My understanding is that eventually there will be enforcement in the
> WMF production environments, but I’m not sure about MediaWiki itself.
>
> If you’re doing writes on GET requests, the job queue might be useful
> to you: https://www.mediawiki.org/wiki/Manual:Job_queue
>
> Kosta
>
>> On Jun 17, 2019, at 8:29 AM, Aran via Wikitech-l wrote:
>>
>> In a MediaWiki-based project I'm working on, I'm getting many of these
>> kinds of exceptions: [DBPerformance] Expectation (writes <= 0) by
>> MediaWiki::main not met (actual: 8)
>>
>> I've read up on the Database transactions article on mediawiki.org and
>> can see that to remove the exceptions we'd need to conform to some very
>> specific criteria for all our db writes, which is quite a problem for
>> this particular project.
>>
>> My question is: are these criteria ever likely to be enforced in future
>> MW versions, or will they always just be warnings to help guide
>> performance improvements?


[Wikitech-l] Write expectations exception

2019-06-17 Thread Aran via Wikitech-l
In a MediaWiki-based project I'm working on, I'm getting many of these
kinds of exceptions: [DBPerformance] Expectation (writes <= 0) by
MediaWiki::main not met (actual: 8)

I've read up on the Database transactions article on mediawiki.org and
can see that to remove the exceptions we'd need to conform to some very
specific criteria for all our db writes, which is quite a problem for
this particular project.

My question is: are these criteria ever likely to be enforced in future
MW versions, or will they always just be warnings to help guide
performance improvements?



[Wikitech-l] Problem importing images

2018-11-18 Thread Aran via Wikitech-l
Hi all,

I've installed a new 1.31 wiki and imported content from an XML dump
using importDump.php (which included the file description pages), then
imported the images from a directory of files using importImages.php.

About half of the files imported successfully, but for the other half the
wiki says the image is not found when I go to its image page. I can see
the image in the proper place in the upload directory, so the file itself
seems to have been imported properly.

If I try to re-upload the image manually, I can't, because it says an
exact replica already exists; but clicking the link to the already
existing one still says "No file by this name exists, but you can upload
it".

I've tried running all the related maintenance scripts, such as
rebuildImages.php, with no success :-(

Any ideas?

Thanks,
Aran



Re: [Wikitech-l] $wgVisualEditorParsoidForwardCookies

2018-10-19 Thread Aran via Wikitech-l
Thanks, that's the page I was following, but I haven't needed to do any
of the things it says are needed for private wikis, and it's saving edits
with no problem.

On 19/10/18 2:47 PM, C. Scott Ananian wrote:
> That's a good question!  I'm not aware of any recent changes that
> would have made $wgVisualEditorParsoidForwardCookies not necessary,
> but I *do* recall that we don't use those long-form configuration
> variable names any more.
>
> Latest documentation seems to be at
> https://www.mediawiki.org/wiki/Extension:VisualEditor#Linking_with_Parsoid_in_private_wikis
>
> Hope this helps!
>  --scott
>
> On Fri, Oct 19, 2018 at 7:25 AM Aran via Wikitech-l wrote:
>
> Hi,
>
> just wondering what the situation is with
> $wgVisualEditorParsoidForwardCookies these days. The documentation for
> visual editor says that it's needed if you have a wiki that's not
> editable by the public. But I've just set it up on such a wiki and
> editing/saving an article through visual editor works fine without me
> having set this or allowed anonymous edits by localhost. Is there no
> longer any need to worry about settings specifically for locked
> down wikis?
>
> Thanks,
>
> Aran
>
>
>
>
>
> -- 
> (http://cscott.net)

[Wikitech-l] $wgVisualEditorParsoidForwardCookies

2018-10-19 Thread Aran via Wikitech-l
Hi,

Just wondering what the situation is with
$wgVisualEditorParsoidForwardCookies these days. The VisualEditor
documentation says it's needed if you have a wiki that's not editable by
the public. But I've just set VisualEditor up on such a wiki, and
editing/saving an article works fine without me having set this or
allowed anonymous edits from localhost. Is there no longer any need to
worry about settings specifically for locked-down wikis?
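For reference, the two forms of the setting being discussed, as far as I understand them (the $wgVirtualRestConfig flag is the newer mechanism; treat the exact key as unverified for any given MW version):

```php
// Legacy VisualEditor setting for private wikis:
$wgVisualEditorParsoidForwardCookies = true;

// Newer setups express the same idea via the Parsoid virtual REST
// service configuration instead:
$wgVirtualRestConfig['modules']['parsoid']['forwardCookies'] = true;
```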

Thanks,

Aran

