Well, you would run into many of the same decisions we already face about how
much to limit automated uploads of data if you wanted to turn it into a live
programming platform. You can certainly already use DBpedia and Wikidata to get
datasets for many cool demonstrations of functional programming…
Could you also offer the same service for other Wikipedias? That would be
really useful.
Thanks for the service.
- Svavar Kjarrval
On 08/07/13 23:05, swuensch wrote:
> My bot started its task. All items with links to both disambiguation pages
> and non-disambiguation pages will get reported to:
Wikidata seems like a good platform for functional computing; it "just"
needs Lisp-like lists (which would be an expansion of
queries/tree-searches) and processing capabilities. What you say is also
true: it would be ahead of its time, because high-level computing
languages never expanded as much…
My bot started its task. All items with links to both disambiguation pages
and non-disambiguation pages will get reported to:
https://www.wikidata.org/wiki/User:Sk!dbot/disambiguation_page_conflict
The nl community did a good job: there are currently none (my bot already
scanned all articles).
All positive change is gradual. In the meantime, for those of us with ample
free time for coding, it'd be nice to have a place to check in code and unit
tests that are organized roughly in the same way as Wikipedia. Maybe such a
project already exists and I just haven't found it yet.
Yes, that is one of the reasons functional languages are getting popular:
https://www.fpcomplete.com/blog/2012/04/the-downfall-of-imperative-programming
With PHP and JavaScript being the most widespread (and still misused)
languages we will not get there soon, however.
On Mon, Jul 8, 2013 at 10:57
http://en.wikipedia.org/wiki/Homoiconicity
From: hale.michael...@live.com
To: wikidata-l@lists.wikimedia.org
Date: Mon, 8 Jul 2013 15:57:44 -0400
Subject: Re: [Wikidata-l] Accelerating software innovation with Wikidata and
improved Wikicode
In the functional programming language family (think Lisp) there is no
fundamental distinction between code and data.
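The point above can be made concrete with a minimal sketch (in Python rather than Lisp, with plain lists standing in for s-expressions; the evaluator and the example program are invented for illustration): because a program is just a nested list, ordinary data operations can inspect and rewrite it.

```python
# A minimal sketch of "code is data": arithmetic expressions are plain
# Python lists (standing in for Lisp s-expressions), and eval_sexp walks
# them. Because programs are ordinary lists, they can be stored, queried,
# and transformed like any other data.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def eval_sexp(expr):
    """Evaluate a nested-list expression such as ["+", 1, ["*", 2, 3]]."""
    if not isinstance(expr, list):
        return expr                      # a number evaluates to itself
    op, *args = expr
    return OPS[op](*(eval_sexp(a) for a in args))

program = ["+", 1, ["*", 2, 3]]          # the program is just a list...
print(eval_sexp(program))                # 1 + (2 * 3) = 7
program[2][0] = "+"                      # ...so data operations rewrite code
print(eval_sexp(program))                # 1 + (2 + 3) = 6
```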
> Date: Mon, 8 Jul 2013 22:47:46 +0300
> From: marty...@graphity.org
> To: wikidata-l@lists.wikimedia.org
> Subject: Re: [Wikidata-l] Accelerating software innovation with Wikidata and
> improved Wikicode
Here's my approach to software code problems: we need less of it, not
more. We need to remove domain logic from source code and move it into
data, which can be managed and on which a UI can be built.
In that way we can build generic, scalable software agents. That is the
way to the Semantic Web.
Martynas
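A hedged sketch of that idea, with invented rule names and fields: the domain logic (which discounts apply) lives in a plain data structure that could be edited or given a UI, while the engine below stays generic.

```python
# Domain logic as data: the rules are records, not code. Changing the
# business logic means editing RULES, not the engine. All names here are
# invented for illustration.
RULES = [
    {"if_field": "quantity",      "at_least": 10, "discount": 0.05},
    {"if_field": "loyalty_years", "at_least": 3,  "discount": 0.10},
]

def applicable_discount(order, rules):
    """Generic engine: sum the discounts of every rule the order satisfies."""
    return sum(r["discount"] for r in rules
               if order.get(r["if_field"], 0) >= r["at_least"])

order = {"quantity": 12, "loyalty_years": 1}
print(applicable_discount(order, RULES))  # 0.05: only the quantity rule fires
```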
Just a quick add-on to Jane and Paul about the scope of data in Wikidata. I
think it is inevitable that Wikidata will start holding excess data that isn't
being used in Wikipedia. Take the climate boxes on many city pages that
show the average high and low per month for the last 5 years…
There are lots of code snippets scattered around the internet, but most of them
can't be wired together in a simple flowchart manner. If you look at object
libraries that are designed specifically for that purpose, like Modelica, you
can do all sorts of neat engineering tasks like simulate the t…
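A rough sketch of the "wired together in a flowchart manner" idea (not Modelica, just Python; the block names are invented): each component is a callable, and a connector chains them so one block's output feeds the next block's input.

```python
# Flowchart-style wiring: connect() composes blocks left to right into a
# single block. Blocks here are simple signal-processing stages invented
# for illustration.
def connect(*blocks):
    """Compose blocks left to right into one block."""
    def wired(signal):
        for block in blocks:
            signal = block(signal)
        return signal
    return wired

def gain(k):
    """An amplifier block that multiplies its input by k."""
    return lambda x: k * x

def offset(b):
    """A bias block that adds b to its input."""
    return lambda x: x + b

system = connect(gain(2.0), offset(1.0))  # x -> 2x + 1
print(system(3.0))                        # 7.0
```

Note that the order of wiring matters: `connect(offset(1.0), gain(2.0))` computes `2(x + 1)` instead.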
Here are my 2 cents.
I have paid my dues writing CRUD apps for business. They all want the same
thing, something that keeps track of entities and controls how the
organization interacts with those entities.
In one year, for instance, I worked on systems for an academic department
and a lo…
I am all for a "dictionary of code snippets", but as with all
dictionaries, you need a way to group them, either by alphabetical
order or "birth date". It sounds like you have an idea of how to group
those code samples, so why don't you share it? I would love to build
my own "pipeline" from a series of…
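One possible shape for such a dictionary-plus-pipeline, sketched under invented names: snippets are stored under keys (one way of grouping them), and a pipeline is assembled from a series of those keys.

```python
# A "dictionary of code snippets": each snippet is a small function stored
# under a name, and pipeline() builds one function that applies a series
# of named snippets in order. The snippet names are invented examples.
from functools import reduce

SNIPPETS = {
    "strip":     str.strip,   # drop surrounding whitespace
    "lowercase": str.lower,   # normalize case
    "words":     str.split,   # split into a list of words
    "count":     len,         # count the items
}

def pipeline(names):
    """Compose the named snippets into a single function."""
    steps = [SNIPPETS[n] for n in names]
    return lambda value: reduce(lambda acc, f: f(acc), steps, value)

word_count = pipeline(["strip", "lowercase", "words", "count"])
print(word_count("  Accelerating Software Innovation  "))  # 3
```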
Are you wanting to add a string property in Wikidata to store MathML equations
and formulas?
http://www.wikidata.org/wiki/Wikidata:Property_proposal
As a side note, when I look here I get a bunch of script errors starting
halfway down the page:
http://www.wikidata.org/wiki/Wikidata:List_of_pro
All disambiguation pages of the Dutch Wikipedia are in this category:
https://nl.wikipedia.org/wiki/Categorie:Wikipedia:Doorverwijspagina
Let me know when you have created such list.
Romaine
--- On Mon, 7/8/13, swuensch wrote:
From: swuensch
Subject: Re: [Wikidata-l] Update nl-wiki & requ
Hi,
I have one question concerning Wikidata:
at
http://en.wikipedia.org/wiki/Schr%C3%B6dinger_equation#Time-dependent_equation
we have the statement
Ψ is the wave function
I have developed a system that discovers the relation between Ψ and
the page "wave function".
Is there a way to model that in Wikidata?
Denny Vrandečić, 08/07/2013 13:16:
I just wanted to say thank you! That's truly amazing work.
As far as I can tell, more than 200 million lines of wikitext have so far
been removed from the Wikipedias. That's 200 million lines that do not have
to be maintained anymore.
(I have not run the actual analysis yet; I have been waiting for…)
We have received quite a few requests for an extended deadline. We
understand that working with large amounts of data such as DBpedia is
difficult and time-consuming.
The deadline will therefore be extended until Thursday, July 18th, 23:59
Hawaii time. However, we would like to appeal to all authors…
For the bot removing interwiki links that are redirects etc., my new code
should be ready by this weekend (I hope), and this should give the lists I
have a big clear-out! :)
Addshore
On 8 Jul 2013 04:32, "Romaine Wiki" wrote:
> Today we reached at nl-wiki the situation that over 64% of the
> interwiki…
Hey, my bot lists disambiguation conflicts under
https://www.wikidata.org/wiki/User:Sk!dbot/disambiguation_page_conflict
This list is a little bit old, but I can start the task again and update it.
I could also start from nl-wiki to make sure that on every
disambiguation conflict page there is a n…