I wonder whether a model like the following would work:

* A group of volunteers selects images from Wikimedia Commons, uses a
wiki to sort and rank them, and finally creates the list of images on a
particular page. This page can also hold images "parked" for future
publication - if we plan a few weeks ahead, we only have to go through
the whole selection process maybe four times a year.
This could be a designated area on the members wiki, which includes the
Commons files.

* A script regularly reads this wiki page and determines which image is
now the current image of the day. It adds it to an RSS feed that holds
the last n images of the day in descending order.
This could run as a cron job on the WMCH server; I am happy to set up
such a script (I have already written both PHP / MediaWiki API and RSS
scripts) or simply provide shell access to whoever ends up maintaining
it. A rough sketch of what it could look like is below the list.

* Everyone else can then use the RSS feed to get the image of the day on
their website, post it to Tumblr, send a tweet, share it on Facebook,
and so on (the latter two can easily be done with
http://twitterfeed.com/, which automatically posts from RSS to Twitter,
Facebook and LinkedIn). A tiny example of embedding the feed on a
website is further down.
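
To make the second point a bit more concrete, here is a minimal sketch of
what such a cron script could look like, assuming the PHP / MediaWiki API
route mentioned above. The wiki URL, page title, line format of the
schedule page and output path are all placeholders, nothing is decided yet:

<?php
// Minimal sketch of the cron job (all names below are assumptions):
// read the schedule page from the members wiki via the MediaWiki API,
// keep the entries up to today, and write an RSS 2.0 feed with the
// last $n images in descending order.

$api   = 'https://members.wikimedia.ch/w/api.php';  // assumed API URL
$page  = 'Image_of_the_day/Schedule';               // assumed page title
$out   = '/var/www/iotd.rss';                       // assumed output path
$n     = 14;                                        // feed length
$today = date('Y-m-d');

// Fetch the raw wikitext of the schedule page.
$url = $api . '?' . http_build_query([
    'action' => 'parse', 'page' => $page,
    'prop'   => 'wikitext', 'format' => 'json',
]);
$data     = json_decode(file_get_contents($url), true);
$wikitext = $data['parse']['wikitext']['*'];

// Assumed line format on the page: "2013-05-01|File:Example.jpg|Caption"
// Lines with a future date are the "parked" images and are skipped.
$entries = [];
foreach (explode("\n", $wikitext) as $line) {
    if (preg_match('/^(\d{4}-\d{2}-\d{2})\|(File:[^|]+)\|?(.*)$/', trim($line), $m)
            && $m[1] <= $today) {
        $entries[$m[1]] = ['file' => trim($m[2]), 'caption' => trim($m[3])];
    }
}
krsort($entries);                                   // newest first
$entries = array_slice($entries, 0, $n, true);

// Build a minimal RSS 2.0 document, one <item> per day.
$rss  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$rss .= "<rss version=\"2.0\"><channel>\n";
$rss .= "<title>Wikimedia CH image of the day</title>\n";
$rss .= "<link>https://www.wikimedia.ch/</link>\n";
$rss .= "<description>One Commons image per day</description>\n";
foreach ($entries as $date => $e) {
    $link = 'https://commons.wikimedia.org/wiki/' . rawurlencode($e['file']);
    $rss .= '<item>'
          . '<title>' . htmlspecialchars("$date: {$e['caption']}") . '</title>'
          . "<link>$link</link><guid>$link</guid>"
          . '<pubDate>' . date(DATE_RSS, strtotime($date)) . '</pubDate>'
          . "</item>\n";
}
$rss .= "</channel></rss>\n";
file_put_contents($out, $rss);

Run once a day from cron, this would keep the feed current without anyone
having to touch it.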

The article of the day on the German Wikipedia (de:wp) is also published
via RSS.
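
On the consumer side (the third point above), embedding the current image
of the day on an external website could be as small as this; the feed URL
is again only a placeholder:

<?php
// Sketch of a consumer: fetch the feed and show the newest image of the
// day as a link. The feed URL is an assumption, see the script above.
$feed = simplexml_load_file('https://www.wikimedia.ch/iotd.rss');
if ($feed !== false && isset($feed->channel->item[0])) {
    $item = $feed->channel->item[0];   // items are newest first
    echo '<p><a href="' . htmlspecialchars($item->link) . '">'
       . htmlspecialchars($item->title) . '</a></p>';
}

Tumblr, twitterfeed.com and the like would simply be pointed at the same
feed URL.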


/Manuel
-- 
Wikimedia CH - Verein zur Förderung Freien Wissens
Lausanne, +41 (21) 34066-22 - www.wikimedia.ch
