Hey, my bot lists disambiguation conflicts under
https://www.wikidata.org/wiki/User:Sk!dbot/disambiguation_page_conflict
This list is a little bit old, but I can start the task again and update it. I
could also start from nl wiki to make sure that on every disambiguation
conflict page there is a
As for the bot removing interwiki links that are redirects etc., my new code
should be ready by this weekend (I hope), and this should give the lists I
have a big clear-out! :)
Addshore
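A rough sketch of what such a clean-up pass might look like with pywikibot
(my guess at the approach, not the actual bot code; assumes a configured
pywikibot install):

    import pywikibot

    def drop_redirect_sitelinks(item: pywikibot.ItemPage) -> None:
        """Remove sitelinks that point at redirects on the client wikis."""
        item.get()
        for page in item.iterlinks():      # one Page object per sitelink
            if page.isRedirectPage():      # the linked page is a redirect
                item.removeSitelink(page.site,
                                    summary="remove sitelink to redirect")

    repo = pywikibot.Site("wikidata", "wikidata").data_repository()
    drop_redirect_sitelinks(pywikibot.ItemPage(repo, "Q42"))  # Q42 only as an example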
On 8 Jul 2013 04:32, Romaine Wiki romaine_w...@yahoo.com wrote:
Today at nl-wiki we reached the situation that +
We have received quite a few requests for an extended deadline. We
understand that working with a large amount of data such as DBpedia is
difficult and time-consuming.
The deadline will therefore be extended until Thursday, July 18th, 23:59
Hawaii time. However, we would like to appeal to all
I just wanted to say thank you! That's truly amazing work.
As far as I can tell, more than 200 million lines of wikitext have so far
been removed from the Wikipedias. That's 200 million lines that do not have
to be maintained anymore.
(I have not run the actual analysis yet, I have been waiting for
Hi,
I have one question concerning Wikidata:
at
http://en.wikipedia.org/wiki/Schr%C3%B6dinger_equation#Time-dependent_equation
we have the statement
Ψ is the wave function
I have developed a system that discovers the relation between Ψ and
the page wave function.
Is there a way to model that in
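One building block for that: once the system has a concept label such as
"wave function", the label can be resolved to a Wikidata item through the
standard wbsearchentities API. A minimal sketch in Python (the find_item
helper is hypothetical, just for illustration):

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def find_item(label, language="en"):
        """Return candidate (item id, description) pairs for a label."""
        params = {
            "action": "wbsearchentities",
            "search": label,
            "language": language,
            "format": "json",
        }
        r = requests.get(API, params=params, timeout=10)
        r.raise_for_status()
        return [(hit["id"], hit.get("description", ""))
                for hit in r.json()["search"]]

    for qid, description in find_item("wave function"):
        print(qid, description)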
Just a quick add-on to Jane and Paul about the scope of data in Wikidata. I
think it is inevitable that Wikidata will start holding excess data that isn't
being used in Wikipedia. Take the climate boxes on many city pages that show
the average high and low per month for the last 5
Here's my approach to software code problems: we need less of it, not
more. We need to remove domain logic from source code and move it into
data, which can be managed and on which a UI can be built.
In that way we can build generic, scalable software agents. That is the
way to the Semantic Web.
http://en.wikipedia.org/wiki/Homoiconicity
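As a toy illustration of that idea (my own example, nothing Wikidata-specific):
the domain rules live in a plain data table, and a single generic interpreter
applies whatever it finds there, so changing behaviour means editing data, not
source code:

    # Rules as data: (field, predicate, error message). The field names
    # here are hypothetical, purely for illustration.
    RULES = [
        ("population", lambda v: v >= 0, "population must be non-negative"),
        ("area_km2",   lambda v: v > 0,  "area must be positive"),
    ]

    def validate(record):
        """Generic agent: knows nothing about cities, only how to apply rules."""
        return [msg for field, ok, msg in RULES
                if field in record and not ok(record[field])]

    print(validate({"population": -5, "area_km2": 219.3}))
    # -> ['population must be non-negative']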
From: hale.michael...@live.com
To: wikidata-l@lists.wikimedia.org
Date: Mon, 8 Jul 2013 15:57:44 -0400
Subject: Re: [Wikidata-l] Accelerating software innovation with Wikidata and
improved Wikicode
In the functional programming language family
Yes, that is one of the reasons functional languages are getting popular:
https://www.fpcomplete.com/blog/2012/04/the-downfall-of-imperative-programming
With PHP and JavaScript being the most widespread (and still misused)
languages, however, we will not get there soon.
On Mon, Jul 8, 2013 at
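A tiny illustration of that article's point (my own sketch): pure functions
over immutable inputs leave no shared state to corrupt, so evaluation order,
and with it parallelism, stops mattering:

    from functools import reduce

    def sum_of_squares_imperative(xs):
        total = 0                # shared mutable state; update order matters
        for x in xs:
            total += x * x
        return total

    def sum_of_squares_functional(xs):
        # map + reduce over immutable inputs: nothing is mutated, so the
        # squarings could run in any order, or concurrently
        return reduce(lambda acc, x: acc + x, map(lambda x: x * x, xs), 0)

    assert sum_of_squares_imperative(range(10)) == \
           sum_of_squares_functional(range(10)) == 285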
All positive change is gradual. In the meantime, for those of us with ample
free time for coding, it'd be nice to have a place to check in code and unit
tests that are organized roughly in the same way as Wikipedia. Maybe such a
project already exists and I just haven't found it yet.
My bot started its task. All items with links to disambiguation pages and
links to non-disambiguation pages will get reported to:
https://www.wikidata.org/wiki/User:Sk!dbot/disambiguation_page_conflict
The nl community did a good job: there are currently none (my bot already
scanned all articles
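A minimal sketch of the check such a bot performs (my reconstruction, not
Sk!dbot's actual code): flag items whose sitelinks mix disambiguation pages
with ordinary articles:

    import pywikibot

    def has_disambiguation_conflict(item: pywikibot.ItemPage) -> bool:
        """True if the item links to both disambiguation and ordinary pages."""
        item.get()
        kinds = {page.isDisambig() for page in item.iterlinks()}
        return kinds == {True, False}   # both kinds present -> conflict

    repo = pywikibot.Site("wikidata", "wikidata").data_repository()
    print(has_disambiguation_conflict(pywikibot.ItemPage(repo, "Q42")))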
Could you also offer the same service for other Wikipedias? It would be
really useful.
Thanks for the service.
- Svavar Kjarrval