Re: [Wikitech-l] Interwiki recommendations

2009-02-05 Thread Rolf Lampa
Lars Aronsson wrote:

 These are two recipes for how you can improve your language 
 version of Wikipedia in an evening.  But we don't have any 
 cookbook to collect all such recipes, do we?  Should we?
 These two recipes have now been used for the Swedish Wikipedia 
 and we don't need to follow them again.  But they can be reused for 
 every other language.  

Hi Lars,

It'd probably be a good idea to start up a collection of Wiki Patterns 
somewhere, in English, with some kind of voting for the best 
ideas/practices. Then the best ideas might spread.

Regards,

// Rolf Lampa




Re: [Wikitech-l] Interwiki recommendations

2009-02-03 Thread Chad
On Tue, Feb 3, 2009 at 12:26 PM, Gerard Meijssen
gerard.meijs...@gmail.com wrote:

 Hoi,
 Have a look at this: http://www.omegawiki.org/Expression:Nederland
 This is structured data. It can be shown in multiple languages. It does allow for
 interwiki links... It already works with MediaWiki ...

 So the question is, why re-invent the wheel?
 Thanks,
GerardM

 2009/2/3 Michael Dale md...@wikimedia.org

  We really need a wikidata type site. We ran into similar issues with
  structured data between government data wikis. Yaron hacked up a
  (relatively simple) extension called External Data for pulling external
  data into a given page.
 
  This ends up working very well, allowing us to effortlessly transclude
  shared datasets into templates of multiple wikis. This is fundamentally
  good, as it moves queryable, maintained structured data away from multiple
  instances of user-maintained semi-structured data.
 
  For the Wikimedia context, I think something like wikidata.wikimedia.org
  needs to be created. It could be a Semantic MediaWiki installation
  extended with localized page aliases. A single page-id or concept
  would have many title columns, one for each language. (The localized title
  columns can be propagated by the existing database of inter-wiki
  language links.) Furthermore, since properties/relations have
  titles/page-ids, they could also be localized, allowing you to query
  the shared structured dataset in your local language.
 
  Then something like the External Data extension would tie wikidata to all the
  current language wikis. This can be thought of as Commons, but for data.
  (Likewise, wikis external to Wikimedia could use this structured data.)
  This lets template authors concentrate on localized representation of
  the data (calling the native-language properties), article authors can
  focus on the article (instead of a huge seed of hard-to-maintain template
  data), and structured-data folks can focus on importing data into the
  central shared repository.
 
 
  --michael
 


I think Michael Dale said the same thing. He was just saying that perhaps
we could have a central wiki (based on SMW) to hold the data, and allow
the individual projects to use ExternalData to grab the data into the
presentation format of their choice. Not a terrible idea, IMO :)

Combining enwiki and frwiki into one mega-pedia isn't the best idea. Each
wiki should be free not only to decide its content as it chooses, but also
to use the presentation it prefers. However, as mentioned, there's no need to
duplicate raw facts, so a central wiki of raw facts, query-able via
ExternalData, would serve this purpose. It would allow wikis to retain their
autonomy while giving them access to centralized data that doesn't change
between languages.
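
To make that concrete, here is a rough sketch of how a local template could
pull such a value with the External Data extension. The URL, file layout and
column names are made up for illustration, and the exact function and
parameter names should be checked against the extension's documentation:

  {{#get_web_data: url=http://wikidata.example.org/Adelzhausen.csv
   | format=csv with header
   | data=mayor=mayor, population=population }}
  The current mayor is {{#external_value:mayor}}
  (population: {{#external_value:population}}).

Each wiki would keep its own wording and layout; only the raw values would
come from the central source.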

-Chad


Re: [Wikitech-l] Interwiki recommendations

2009-02-03 Thread Marcus Buck
Gerard Meijssen wrote:
 Hoi,
 Have a look at this: http://www.omegawiki.org/Expression:Nederland
 This is structured data. It can be shown in multiple languages. It does allow for
 interwiki links... It already works with MediaWiki ...

 So the question is, why re-invent the wheel?
 Thanks,
 GerardM
Is it transcludable to other wikis?

Marcus Buck



Re: [Wikitech-l] Interwiki recommendations

2009-02-03 Thread Roan Kattouw
Gerard Meijssen wrote:
 Hoi,
 Have a look at this: http://www.omegawiki.org/Expression:Nederland
 This is structured data. It can be shown in multiple languages. It does allow for
 interwiki links... It already works with MediaWiki ...

 So the question is, why re-invent the wheel?
I think the wheel is called Semantic MediaWiki in this case :P

Roan Kattouw (Catrope)



Re: [Wikitech-l] Interwiki recommendations

2009-02-03 Thread Michael Dale
hmm ... my feeling is that it would be easier to adapt Semantic MediaWiki to 
include language aliasing than to adapt OmegaWiki to be less dictionary-centric 
in the context of a shared structured-data site with loosely defined 
community ontologies... But fundamentally there is no reason why 
an external data extension could not pull from both such external 
structured-data systems.

to elaborate a bit...
As far as I can tell, OmegaWiki would have to go a long way to be used in 
the same way that Semantic MediaWiki is used today, and even further 
to integrate with how Wikimedia uses templates and infoboxes today.  
OmegaWiki seems to tie relations to items in a specific defined-concepts 
namespace, rather than to arbitrary wiki pages. This makes a lot of sense if 
you are designing a multilingual language representation system, but it is not so 
ideal for a general-purpose shared structured-data template propagation 
system tied to specific article entries.

OmegaWiki seems to be focused on multilingual language representation 
rather than on multiple data-type representation, re-use, and template 
integration. Semantic MediaWiki appears to be a more flexible platform in 
this area, as it has spawned dozens of extensions and hundreds of usages 
in a wide range of contexts.

Semantic MediaWiki has already gone through a few revisions to optimize the storage 
and retrieval of multiple data types; it's more closely tied to the SVN 
version of MediaWiki, and they do regular releases, etc.

That being said, let me reiterate that an external data extension could 
support multiple remote data systems.

--michael


Gerard Meijssen wrote:
 Hoi,
 Have a look at this: http://www.omegawiki.org/Expression:Nederland
 This is structured data. It can be shown in multiple languages. It does allow for
 interwiki links... It already works with MediaWiki ...

 So the question is, why re-invent the wheel?
 Thanks,
 GerardM

 2009/2/3 Michael Dale md...@wikimedia.org

   
 We really need a wikidata type site. We ran into similar issues with
 structured data between government data wikis. Yaron hacked up a
 (relatively simple) extension called External Data for pulling external
 data into a given page.

 This ends up working very well, allowing us to effortlessly transclude
 shared datasets into templates of multiple wikis. This is fundamentally
 good, as it moves queryable, maintained structured data away from multiple
 instances of user-maintained semi-structured data.

 For the Wikimedia context, I think something like wikidata.wikimedia.org
 needs to be created. It could be a Semantic MediaWiki installation
 extended with localized page aliases. A single page-id or concept
 would have many title columns, one for each language. (The localized title
 columns can be propagated by the existing database of inter-wiki
 language links.) Furthermore, since properties/relations have
 titles/page-ids, they could also be localized, allowing you to query
 the shared structured dataset in your local language.

 Then something like the External Data extension would tie wikidata to all the
 current language wikis. This can be thought of as Commons, but for data.
 (Likewise, wikis external to Wikimedia could use this structured data.)
 This lets template authors concentrate on localized representation of
 the data (calling the native-language properties), article authors can
 focus on the article (instead of a huge seed of hard-to-maintain template
 data), and structured-data folks can focus on importing data into the
 central shared repository.


 --michael

 Marcus Buck wrote:
 
 Lars Aronsson wrote:

   
 What is the best way to organize infobox templates for geographic
 places: the one used on the French, the Polish, or the Turkish
 Wikipedia?  What are the most important features in use in other
 language versions of Wikipedia that my language is still missing?

 Are these questions of a kind that you sometimes ask yourself?
 If so, where do you go to find the answers?  Are we all just
 copying ideas from the English Wikipedia?  Or inventing our own
 wheels? Has anybody collected stories of how one project learned
 something useful from another one?

 
 As you are speaking of infoboxes and crosswiki, I want to chip in
 another thought: why do we actually place infobox templates on every
 single wiki? In 2007 I created some semiautomatic bot articles about
 municipalities on my home wiki. In 2008 they had elections and elected
 new mayors. So my articles mentioning the mayors were outdated. The
 articles in the main language of that country were updated relatively
 quickly; mine are not yet. I plan to, but who does that for all
 articles in all language editions?

 An example: Bavaria held communal elections in March 2008. Enough time
 to update infoboxes. The municipality Adelzhausen got Lorenz Braun as
 new mayor, replacing Thomas Goldstein. I checked all interwikis of the
 German article. Two had it right. Both were created after the elections.
 Four don't mention the 

Re: [Wikitech-l] Interwiki recommendations

2009-02-03 Thread Lars Aronsson
Michael Dale wrote:

 We really need a wikidata type site.

A very easy and ugly workaround would be to store an image on 
Wikimedia Commons, containing the letters "Barack Obama" and 
having the filename President_of_USA.png.  Next time change comes 
to the White House, the image is replaced with a new version.  
Then each language article could contain "The president of the 
United States is [[file:President of USA.png]]", and the right 
name would automatically appear there.  You'd need such images for 
the names of all officials, the population of all cities, and so on. 

(Don't scream. I copied this idea from the visitor counters of 
mid-1990s websites, which were implemented by transcluding 
images presenting the current number of visitors.)

From this ugly hack, it's easy to conceive that Wikimedia Commons 
could distribute not only images but also text snippets and data.
An easy way to do this is to treat the template namespace 
similarly to the file namespace.  Files (images) that aren't found 
on the local wiki are imported from Wikimedia Commons.  If the 
template {{president of the USA}} is not found locally, that 
template would be sought on Wikimedia Commons.

With the right parameter setup and #switch: constructs, it could 
be handled with just a few templates, e.g. {{president|USA}}
containing {{#switch: {{{1}}}|USA=Obama|Russia=Medvedev}}.
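
For example, an article on any wiki could then simply contain

  The current president of the United States is {{president|USA}}.

and as long as the central template is kept up to date, the rendered 
name would change everywhere at once.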

Of course, we'd need similar templates for Cyrillic and Arabic 
scripts. But they could be named {{Президент}} and contain
{{#switch:{{{1}}}|США=Обама|Россия=Медведев}}, etc.

Of course, this kind of transclusion doesn't help you to write the 
article about Barack Obama, where it says that he was elected 
president of the United States in 2008 and took office in January 
2009.  But that is not a piece of text that needs to be 
automatically updated.

It's interesting what would happen if the template imported from 
Commons calls other templates that do exist on the local wiki.  
In programming language terms, it would imply that template 
expansion has dynamic scope, just like Emacs Lisp, rather than 
static or lexical scope (like most languages); see
http://en.wikipedia.org/wiki/Scope_(programming)
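
To illustrate with a made-up example: suppose Commons had

  Template:President of the USA:  {{officeholder|Obama}}

while each local wiki defined its own {{officeholder}} template with its 
own formatting and script. With dynamic scope, the template imported from 
Commons would expand using the local {{officeholder}}; with static or 
lexical scope it would only ever see the one defined on Commons.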


-- 
  Lars Aronsson (l...@aronsson.se)
  Aronsson Datateknik - http://aronsson.se


Re: [Wikitech-l] Interwiki recommendations

2009-02-03 Thread Gerard Meijssen
Hoi,
I really like Semantic MediaWiki. But horses for courses. SMW is great for
content where articles exist within one MediaWiki installation. It is TRULY
project-based. OmegaWiki's data, on the other hand, is indeed in a separate database. The
reason for a separate database is exactly that you do not want to create
unnecessary dependencies. This is where OmegaWiki and Semantic MediaWiki are
essentially different.
   GerardM

2009/2/3 Roan Kattouw roan.katt...@home.nl

 Gerard Meijssen wrote:
  Hoi,
  Have a look at this: http://www.omegawiki.org/Expression:Nederland
  This is structured data. It can be shown in multiple languages. It does allow for
  interwiki links... It already works with MediaWiki ...

  So the question is, why re-invent the wheel?
 I think the wheel is called Semantic MediaWiki in this case :P

 Roan Kattouw (Catrope)




Re: [Wikitech-l] Interwiki recommendations

2009-02-02 Thread Marcus Buck
Lars Aronsson wrote:
 What is the best way to organize infobox templates for geographic 
 places: the one used on the French, the Polish, or the Turkish 
 Wikipedia?  What are the most important features in use in other 
 language versions of Wikipedia that my language is still missing?

 Are these questions of a kind that you sometimes ask yourself?  
 If so, where do you go to find the answers?  Are we all just 
 copying ideas from the English Wikipedia?  Or inventing our own 
 wheels? Has anybody collected stories of how one project learned 
 something useful from another one?
As you are speaking of infoboxes and crosswiki, I want to chip in 
another thought: why do we actually place infobox templates on every 
single wiki? In 2007 I created some semiautomatic bot articles about 
municipalities on my home wiki. In 2008 they had elections and elected 
new mayors. So my articles mentioning the mayors were outdated. The 
articles in the main language of that country were updated relatively 
quickly; mine are not yet. I plan to, but who does that for all 
articles in all language editions?

An example: Bavaria held communal elections in March 2008. Enough time 
to update infoboxes. The municipality Adelzhausen got Lorenz Braun as 
new mayor, replacing Thomas Goldstein. I checked all interwikis of the 
German article. Two had it right. Both were created after the elections. 
Four don't mention the mayor at all, and six still mentioned the old 
mayor. No wiki had bothered to update the information.

It would be much easier if we had a central repository for the data. We 
would place infoboxes in the central wiki. Each wiki then could fetch 
the data from the central wiki just as images are fetched from Commons 
and render the data into a localised infobox. That would be much more 
accurate than maintaining redundant info on potentially hundreds of wikis.
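
Just to sketch what that could look like on the local wiki (the parser 
function and the field names below are invented, nothing like this exists yet):

  {{Infobox municipality
  | name       = {{#centraldata:Adelzhausen|name}}
  | mayor      = {{#centraldata:Adelzhausen|mayor}}
  | population = {{#centraldata:Adelzhausen|population}}
  }}

The labels and layout would stay local and localised; only the values would 
be fetched from the central wiki, just as image data is fetched from Commons.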

Marcus Buck

PS: And that would be interesting in regard to botopedias too. Volapük 
Wikipedia was massively criticized for creating masses of bot content. 
With a central wiki for data, creating articles, for example, for all the 
~37,000 municipalities of France would essentially be reduced to 
creating a template that renders the central content into an article. 
Little Wikipedias could greatly benefit if they just had to create some 
templates to make info on hundreds of thousands of topics available to 
the speakers of their language. It would be very basic, infobox-like 
information, but it would be information.



Re: [Wikitech-l] Interwiki recommendations

2009-02-02 Thread Chad
Step 1 would be making interwiki transclusion not suck.
It's been a long-standing back-burner project of mine.
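
(MediaWiki already has a rudimentary form of it behind the
$wgEnableScaryTranscluding setting, which, as far as I recall, lets a page
transclude from another wiki via its interwiki prefix, roughly

  {{wikipedia:Some template}}

but it fetches the wikitext over HTTP at parse time, which is exactly the
part that needs to not suck.)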

-Chad

On Feb 2, 2009 8:05 PM, Marcus Buck w...@marcusbuck.org wrote:

Lars Aronsson wrote:

 What is the best way to organize infobox templates for geographic 
places, the one used on the F...
As you are speaking of infoboxes and crosswiki, I want to chip in
another thought: why do we actually place infobox templates on every
single wiki? In 2007 I created some semiautomatic bot articles about
municipalities on my home wiki. In 2008 they had elections and elected
new mayors. So my articles mentioning the mayors were outdated. The
articles in the main language of that country were updated relatively
quickly; mine are not yet. I plan to, but who does that for all
articles in all language editions?

An example: Bavaria held communal elections in March 2008. Enough time
to update infoboxes. The municipality Adelzhausen got Lorenz Braun as
new mayor, replacing Thomas Goldstein. I checked all interwikis of the
German article. Two had it right. Both were created after the elections.
Four don't mention the mayor at all, and six still mentioned the old
mayor. No wiki had bothered to update the information.

It would be much easier if we had a central repository for the data. We
would place infoboxes in the central wiki. Each wiki then could fetch
the data from the central wiki just as images are fetched from Commons
and render the data into a localised infobox. That would be much more
accurate than maintaining redundant info on potentially hundreds of wikis.

Marcus Buck

PS: And that would be interesting in regard to botopedias too. Volapük
Wikipedia was massively criticized for creating masses of bot content.
With a central wiki for data, creating articles, for example, for all the
~37,000 municipalities of France would essentially be reduced to
creating a template that renders the central content into an article.
Little Wikipedias could greatly benefit if they just had to create some
templates to make info on hundreds of thousands of topics available to
the speakers of their language. It would be very basic, infobox-like
information, but it would be information.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l