Yes, it probably is due to a race condition.

Hopefully there's a better solution than using unbuffered queries - if
that even is a solution.
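One race-free alternative worth considering is to let the database enforce
uniqueness rather than trying to serialize the writers: put a UNIQUE index on
the queue table's natural key and use an insert that ignores duplicates. This
is only a sketch - the table and column names here are hypothetical, not the
real MediaWiki schema - shown with SQLite for brevity (in MySQL the equivalent
would be INSERT IGNORE, or INSERT ... ON DUPLICATE KEY UPDATE, against a
UNIQUE key):

```python
import sqlite3

# Hypothetical job-queue table; the real schema will differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE queue (
           page_id INTEGER NOT NULL,
           payload TEXT NOT NULL,
           UNIQUE (page_id)  -- the DB itself rejects a duplicate queue entry
       )"""
)

def enqueue(page_id, payload):
    """Insert a job; a racing duplicate insert is silently dropped."""
    with conn:
        conn.execute(
            "INSERT OR IGNORE INTO queue (page_id, payload) VALUES (?, ?)",
            (page_id, payload),
        )

# Two racing enqueues of the same page leave exactly one row.
enqueue(7, "regenerate")
enqueue(7, "regenerate")  # the re-edit that races the first insert
count = conn.execute("SELECT COUNT(*) FROM queue").fetchone()[0]
print(count)  # 1
```

Since the constraint lives in the database, it holds regardless of how many
connections or buffering modes are involved.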

-Yaron

On Sun, Dec 13, 2015 at 12:51 AM, Ed <[email protected]> wrote:

> Hi,
>
> The duplicate rows seem to happen as a result of a race condition: the
> new record has been queued, and somewhere in the process we re-edit or
> regenerate and queue a second one.
>
> I'm not sure if the cause is buffering, multiple connections, or
> something else.
>
> I was going to mess around with the DBO_NOBUFFER setting but the comments
> in "includes/db/Database.php" are a bit ominous:
>
>          * Unbuffered queries are very troublesome in MySQL:
>          *
>          *   - If another query is executed while the first query is being
>          *     read out, the first query is killed. This means you can't
>          *     call normal MediaWiki functions while you are reading an
>          *     unbuffered query result from a normal wfGetDB() connection.
>          *
>          *   - Unbuffered queries cause the MySQL server to use large
>          *     amounts of memory and to hold broad locks which block
>          *     other queries.
>
> Thoughts?
>
_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l