https://bugzilla.wikimedia.org/show_bug.cgi?id=38822

--- Comment #7 from Tim Starling <[email protected]> 2012-09-27 23:46:01 UTC ---
(In reply to comment #6)
> Thanks for the feedback, Tim. You are mentioning 5 issues:
> 
> 1) 57KB of site and language data: yes, this has been on our todo list 
> forever.
> I hope we get this moved to a separate resource next week.
> 
> 2) "Wikipedia" should not be hardcoded anywhere - where did you find this?
> Maybe as a default setting, or some such?

In ItemView.php:

/**
 * Returns a list of all the sites that can be used as a target for a site link.
 *
 * @static
 * @return array
 */
public static function getSiteDetails() {
...
    if ( $site->getType() === Site::TYPE_MEDIAWIKI && $site->getGroup() === 'wikipedia' ) {

The autocomplete feature that this function services also references Wikipedia
in a message (wikibase-error-autocomplete-connection). 
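For what it's worth, a minimal sketch of how the hardcoded group check could be lifted into configuration. This is only an illustration; the setting name 'siteLinkGroups' is an assumption, not an existing Wikibase setting:

```php
// Hypothetical: read the allowed site-link groups from Wikibase settings
// instead of hardcoding 'wikipedia'. 'siteLinkGroups' is an assumed key.
$allowedGroups = Settings::get( 'siteLinkGroups' ); // e.g. array( 'wikipedia' )

if ( $site->getType() === Site::TYPE_MEDIAWIKI
	&& in_array( $site->getGroup(), $allowedGroups )
) {
	// ...
}
```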

There doesn't seem to be any way to populate the sites table with data other
than the data that comes from meta.wikimedia.org; I had to patch
Utils::insertDefaultSites() to set up my test instance.

populateInterwiki.php also unconditionally references Wikipedia.

> 3) global variables: will do. The convention changed a couple of times it
> seems, causing confusion. Is our main settings array acceptable as
> $egWBSettings, or would it become $wgWBSettings?

I think $wg is the best convention, since if everything uses it, a
configuration UI can drop the prefix. It's almost universal among extensions
deployed to WMF; the only exception is the Contest extension, which is another
of Jeroen's projects.
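Concretely, with the $wg convention the settings array in LocalSettings.php would just look like this (the key shown is illustrative only; the actual keys are whatever Wikibase defines):

```php
// LocalSettings.php -- illustrative config fragment; key names are assumed.
$wgWBSettings = array(
	'siteGlobalID' => 'enwiki',
	// ...
);
```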

> The last two points are really only relevant to the client, even though the
> code is in Wikibase/lib. These issues shouldn't block the deployment of the
> repo, some of the code, like the pollForChanges script, should probably be
> moved. Anyway: 
> 
> 4) That we have to (potentially) update all Wikipedias after a single edit on
> Wikidata lies in the nature of the project, I would think. We are thinking
> about how to make this more efficient by batching updates. I'll try to prepare
> a writeup explaining how we currently envision the percolation of the changes.

Aren't we talking about a deployment in October? It seems like a pretty basic
feature to be starting so late.

> 5) I have worked on remote DB support for ORMTable (and by extension
> ChangesTable) yesterday, see I261a2a31. I have not yet figured out though how
> to correctly set up a LBFactory_multi to test this. Can you help me with that?
> What would a simple setup for two masters (and no slaves) look like?

Here is my LocalSettings.php, if it helps:

http://paste.tstarling.com/p/drrHMe.html

Apologies for the accumulated cruft. It has configuration for various
multi-wiki features. For multiple masters, it would be basically the same,
except with $wgLBFactoryConf having:

'sectionsByDB' => array(
   'enwiki' => 's1',
),
'sectionLoads' => array(
   's1' => array( 'local1' => 1 ),
   'DEFAULT' => array( 'local2' => 1 ),
),
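Fleshed out, a minimal two-master $wgLBFactoryConf might look something like the following. The host names, credentials, and the local1/local2 mapping are placeholders; check the class name against your MediaWiki version (it's LBFactory_Multi in current core):

```php
$wgLBFactoryConf = array(
	'class' => 'LBFactory_Multi',
	// enwiki lives on section s1; everything else uses the DEFAULT section
	'sectionsByDB' => array(
		'enwiki' => 's1',
	),
	'sectionLoads' => array(
		's1'      => array( 'local1' => 1 ),
		'DEFAULT' => array( 'local2' => 1 ),
	),
	// map the logical server names to actual hosts/IPs (placeholders)
	'hostsByName' => array(
		'local1' => '127.0.0.1',
		'local2' => '127.0.0.2',
	),
	// settings shared by all servers (placeholder credentials)
	'serverTemplate' => array(
		'dbname'   => 'wikidb',
		'user'     => 'wikiuser',
		'password' => 'secret',
		'type'     => 'mysql',
		'flags'    => DBO_DEFAULT,
	),
);
```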

It's possible to run multiple MySQL servers on the same host. There's a helper
script for it called mysqld_multi:

http://dev.mysql.com/doc/refman/5.1/en/mysqld-multi.html

For MediaWiki, it's necessary to use different IP addresses rather than
different ports to separate the instances.
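For reference, the corresponding my.cnf for mysqld_multi has one numbered [mysqldN] section per instance; to satisfy the MediaWiki constraint above, each instance binds to its own IP address while keeping the standard port. Paths and IPs here are placeholders:

```ini
[mysqld_multi]
mysqld     = /usr/bin/mysqld_safe
mysqladmin = /usr/bin/mysqladmin

[mysqld1]
bind-address = 127.0.0.1
port         = 3306
datadir      = /var/lib/mysql1
socket       = /var/lib/mysql1/mysql.sock

[mysqld2]
bind-address = 127.0.0.2
port         = 3306
datadir      = /var/lib/mysql2
socket       = /var/lib/mysql2/mysql.sock
```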

_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l