On 17.10.2013 20:16, Kingsley Idehen wrote:
On 10/17/13 12:46 PM, Daniel Kinzler wrote:
I've run it through our variant of Vapour re. Linked Data verification:
http://bit.ly/1gM7oYa .
Nearly there. Your use of 302 is what's going to trip up existing Linked Data
clients. Why aren't you
On 21.10.2013 16:48, Kingsley Idehen wrote:
Can someone not change 302 to 303 re: RewriteRule ^/entity/(.*)$
https://www.wikidata.org/wiki/Special:EntityData/$1 [R=302,QSA] ?
The thing is that we intended this to be an internal Apache rewrite, not an HTTP
redirect at all. Because
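For reference, the variant Kingsley suggests would change only the redirect flag in the rule quoted above. This is a sketch of the suggested change, not the deployed configuration (which used an internal rewrite with no external redirect at all):

```apache
# Sketch: external 303 (See Other) redirect from the abstract entity URI
# to the data page, as Linked Data clients expect, instead of [R=302].
RewriteRule ^/entity/(.*)$ https://www.wikidata.org/wiki/Special:EntityData/$1 [R=303,QSA]
```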
On 16.10.2013 15:11, Kingsley Idehen wrote:
On 10/2/13 1:09 PM, Daniel Kinzler wrote:
On 02.10.2013 17:00, Kingsley Idehen wrote:
Daniel,
When will the fixed data be generated and published?
October 14, if all goes well.
-- daniel
On 01.10.2013 20:14, Tom Morris wrote:
How about not creating a fork just so you can delete a couple of
directories? The full download is a whopping 260KB. Is that really too
big/complex to include in its entirety and just ignore the parts you don't
use?
Not deploying code we do not use,
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
/reduce the code
base to the parts needed in a particular scenario?
-- daniel
On 27.09.2013 01:17, Nicholas Humfrey wrote:
On 26/09/2013 15:33, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
On 26.09.2013 14:54, Nicholas Humfrey wrote:
Wikidata uses a fork of EasyRdf:
https
On 26.09.2013 14:54, Nicholas Humfrey wrote:
Wikidata uses a fork of EasyRdf:
https://github.com/Wikidata/easyrdf_lite
Which should handle this correctly.
Looks like it doesn't, but I'll investigate.
However I don't seem to be able to content negotiate for Turtle today.
This is
Hi all!
As discussed at the MediaWiki Architecture session at Wikimania, I have created
an RFC for the TitleValue class, which could be used to replace the heavy-weight
Title class in many places. The idea is to showcase the advantages (and
difficulties) of using true value objects as opposed to
On 25.09.2013 14:06, Lydia Pintscher wrote:
On Wed, Sep 25, 2013 at 1:13 PM, Antoine Isaac ais...@few.vu.nl wrote:
Hello Denny,
I think we in Europeana had the same problem in the GLAMwiki toolset project
[1].
We wanted to submit the metadata we had for Europeana objects to be uploaded
in
Hi all!
We have to impose a fixed limit on search results, since search results cannot
be ordered by a unique ID, so paging is expensive.
The default for this limit is 50, but it SHOULD be 500 for bots. But the higher
limit for bots is currently not applied by the wbsearchentities module -
On 03.09.2013 21:43, David Cuenca wrote:
A couple of months ago there was a conversation about what to do with the
identifiers that should be owl:sameAs [1]
It's unclear to me where owl:sameAs would be used... it should definitely NOT
be used to point to descriptions of the same thing in
On 13.09.2013 18:24, Benjamin Good wrote:
Daniel,
Even 500 seems like a very low limit for this system unless I'm
misunderstanding something. Unless there is another way to execute queries
that return more rows than that, this would negate the possibility of a
huge number of applications
Hi all.
With today's deployment, the Wikibase API modules used on wikidata.org will
change from using lower-case IDs (q12345) to upper-case IDs (Q12345). This is
done for consistency with the way IDs are shown in the UI and used in URLs.
The API will continue to accept entity IDs in lower-case
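The case handling described above can be pictured with a small sketch (a hypothetical helper, not the actual Wikibase code) that maps lower-case IDs to the canonical upper-case form shown in the UI and URLs:

```python
import re

def normalize_entity_id(raw_id: str) -> str:
    """Map an entity ID to its canonical upper-case form, e.g.
    'q12345' -> 'Q12345'. Raises ValueError for malformed IDs."""
    m = re.fullmatch(r"([A-Za-z]+)(\d+)", raw_id.strip())
    if m is None:
        raise ValueError(f"not a well-formed entity ID: {raw_id!r}")
    return m.group(1).upper() + m.group(2)
```

The API can then keep accepting either form while always emitting the canonical one.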
On 06.09.2013 15:02, Denny Vrandečić wrote:
in the case of MediaWiki wikis, I guess
http://en.wikipedia.org/wiki/Technical_University_of_Denmark should
be preferred, but that is beside the point.
Yes, spaces are not
On 03.09.2013 11:50, Lydia Pintscher wrote:
On Mon, Sep 2, 2013 at 11:56 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
OK, based on the discussion so far, we will add the data type to the snak in
the external export, and keep the string data value for the URL data type.
That should
On 30.08.2013 17:21, Denny Vrandečić wrote:
I do see an advantage of stating the property datatype in a snak in the
external JSON representation, and am trying to understand what prevents us
from doing so.
Not much, the SnakSerializer would need access to the PropertyDataTypeLookup
service,
On 25.08.2013 19:19, Markus Krötzsch wrote:
If we have an IRI DV, considering that URLs are special IRIs, it seems clear
that IRI would be the best way of storing them.
The best way of storing them really depends on the storage platform. It may be a
string or something else.
I think the
On 23.08.2013 19:19, Lydia Pintscher wrote:
Heya folks :)
The URL datatype is now available on test.wikidata.org. It'd be
amazing if you could give it some testing. We hope to roll it out on
Monday; however, there are a few things outside our hands that still
need review. It might happen that
On 10.08.2013 22:42, Jiang BIAN wrote:
So is there a spec about the stable external format?
If you could include a version number of the format used by the data, it
will be much easier to write compatible code and/or notice the changes
immediately.
I don't think there's a formal spec,
On 31.07.2013 13:42, Tim Starling wrote:
We could have a library of PHPUnit-style assertion functions which
throw exceptions and don't act like eval(), I would be fine with that.
Maybe MWAssert::greaterThan( $foo, $bar ) or something.
I like that! Should support an error message as an
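A sketch of what such assertion helpers might look like, transposed to Python for illustration (the MWAssert name and signatures above are only a suggestion, and the helpers below are hypothetical):

```python
class AssertionFailure(Exception):
    """Raised when a runtime assertion fails. Unlike assert statements,
    these helpers are never compiled away and never eval() a string."""

def assert_greater_than(value, bound, message=None):
    # Counterpart of the suggested MWAssert::greaterThan( $foo, $bar ),
    # extended with the optional error message proposed in the reply.
    if not value > bound:
        raise AssertionFailure(message or f"{value!r} is not greater than {bound!r}")

def assert_not_none(value, message=None):
    if value is None:
        raise AssertionFailure(message or "unexpected None")
```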
There seems to be an issue with Jenkins. It appears to use an old version of
other extensions under some circumstances.
It's like this:
If you submit change 33 for extension A, which needs change 44 in extension B
(which isn't merged yet), jenkins will correctly fail.
BUT: When change 44
On 29.06.2013 18:21, Sven Manguard wrote:
I have to imagine that the reason why is that Wikivoyage is the closest
project
to Wikipedia out of all of the sister projects in many important ways. Yes,
their page organization system is a little bit different, but not as different
as say
On 28.06.2013 11:45, Denny Vrandečić wrote:
* Wikipedia will not automatically and suddenly display links to
Wikivoyage.
The behavior on Wikipedia actually remains completely unchanged by this
deployment.
Let's make sure we have thorough tests for this, I'm not 100% sure how
On 27.06.2013 16:08, Hady elsahar wrote:
Hello all,
inside : http://www.wikidata.org/wiki/Special:EntityData/Q1.nt
page i found the triple
<http://www.wikidata.org/wiki/Special:EntityData/Q1000> <http://schema.org/about> <http://www.wikidata.org/entity/Q1000> .
it's a little bit
On 28.06.2013 18:10, Gregor Hagedorn wrote:
But the problem seems to be that http://www.wikidata.org/wiki/Q1000
actually seems to be an information resource, i.e. under this URI html
content is directly being returned (rather than being http 303
redirected).
A quick follow up to this morning's mail:
I discussed this issue with Denny for a while, and we came up with this:
* I'll explore the possibility of using a BadValue object instead of a BadSnak,
that is, model the error on the DataValue level. My initial impression was that
this would be more
On 25.06.2013 14:58, Nikola Smolenski wrote:
Do you think it would be possible to have this data on the actual image
page,
where current page text would be just one of the items?
In theory yes, but I think that would create more problems than it would solve.
For one, wikitext as data
On 25.06.2013 15:37, David Cuenca wrote:
From the proposal it is not very clear to me what the relationship between
the data stored in Commons and the data stored in Wikidata is going to look
like.
I assume that Work (item) will be linked to existing work items in
Wikidata,
is that
On 21.06.2013 14:44, Gerard Meijssen wrote:
Hoi,
Denny, when you look at the data currently in Wikidata, you find what is in
essence more than a basis for a translation dictionary.
I would say it's excellent as a thesaurus which can be used for cross-lingual
tagging, named entity
On 19.06.2013 15:57, Denny Vrandečić wrote:
http://www.wikidata.org/wiki/Wikidata:Wiktionary
To the best of our knowledge, we have checked all discussions on this topic,
and
also related work like OmegaWiki, Wordnet, etc., and are building on top of
that.
I would like to point out that
On 13.06.2013 06:38, Jeroen De Dauw wrote:
Hey,
Putting the DataType id in PropertyValueSnaks at this point seems like a bad
idea for several reasons. Doing so would cost us quite some work and end up
with
a more complicated system as foundation.
Changing it now would be hard.
But I
On 13.06.2013 03:22, Daniel Werner wrote:
-1
Had to deal with this in the frontend as well and don't think this is
inconvenient. It seems like the cleanest approach. Polluting the Snaks with
information like this for performance or convenience reasons will probably
cause
more trouble in
On 26.04.2013 21:13, Sebastian Hellmann wrote:
Hi Daniel,
On 26.04.2013 18:01, Daniel Kinzler wrote:
You guys are the only reason the interface still exists :) DBpedia is the
only
(regular) external user (LuceneSearch is the only internal user). Note that
there's nobody really
On 04.05.2013 12:05, Jona Christopher Sahnwaldt wrote:
On 26 April 2013 17:15, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
*internal* JSON representation, which is different from what the API returns,
and may change at any time without notice.
Somewhat off-topic: I didn't know you have
On 04.05.2013 19:13, Jona Christopher Sahnwaldt wrote:
We will produce a DBpedia release pretty soon, I don't think we can
wait for the real dumps. The inter-language links are an important
part of DBpedia, so we have to extract data from almost all Wikidata
items. I don't think it's sensible
non-JS *view* without full editing capabilities would be sufficient for
supporting the mobile version.
Another note: Daniel Werner and Henning Snater are the people most involved with
designing CSS and JS for the Wikibase UI.
Good luck and have fun,
-- daniel
--
Daniel Kinzler
is perfectly in sync, it might work...
Are you going to try this? Would be great if you could give us feedback!
-- daniel
--
Daniel Kinzler, Softwarearchitekt
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
On 22.04.2013 21:23, Lydia Pintscher wrote:
Heya folks :)
Phase 2 is not live on English Wikipedia. For more details please see
http://blog.wikimedia.de/?p=14896
I suppose it is *now* live on English Wikipedia :)
-- daniel
, there is no roadmap for
integrating Wikibase specific Solr search with the MediaWiki search page. It's
on the list, but there are no concrete plans yet.
-- daniel
.
-- daniel
to the
property's talk page.
-- daniel
unsure though how well we could control scaling in such a
setup.
-- daniel
of the
wikidata page).
hth
-- daniel
?
But it's not a fact. It's a claim someone makes.
That may seem like a fine distinction, but it's really fundamental to
understanding how Wikidata/Wikibase is different from DBpedia, Freebase, Cyc,
etc.
Wikidata doesn't collect facts. It collects statements (sourced claims).
-- daniel
the Wikidata API for integration? Or is it talking directly to
the Wikidata database?
It's talking directly to the database.
-- daniel
like to see this, but our priority is to get Wikimedia sites feature
complete first.
-- daniel
for keeping the bots from adding interlanguage links would
evolve. Can't huwiki simply opt out of the interwiki bot stuff?
-- daniel
and telling MediaWiki that it's MySQL. From looking at [1],
this should Just Work (tm).
-- daniel
[1] https://en.wikipedia.org/wiki/MariaDB
is probably not the best choice for
your Wikibase install.
-- daniel
page (say, the page Moon on en.wikipedia.org). There
either is one, or none.
5. you only get one result or nothing (I expected a list of all Items
with
the given label in the given language)
Again, that's what Special:ItemDisambiguation does.
-- daniel
re-using Wikibase code for that extension, especially for storing the data. They
would probably also be happy about a common code base for parsing, normalizing
and rendering coordinates.
We should cut them in on this discussion, I think.
-- daniel
require one query for each player.
-- daniel
be bad though, I
think.
However, we should probably store whether the level of certainty was given
explicitly or estimated automatically based on the number of significant digits
- then we can still ignore automatic values when desired.
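The explicit-versus-estimated distinction could be recorded roughly like this (an illustrative sketch, not the actual Wikibase parser or data model):

```python
from dataclasses import dataclass
from decimal import Decimal

@dataclass
class Quantity:
    value: Decimal
    uncertainty: Decimal
    explicit: bool  # True only if the editor typed the +/- themselves

def parse_quantity(text: str) -> Quantity:
    """Parse '324' as 324 +/- 1 (estimated from significant digits)
    and '324+-0.5' as an explicitly given uncertainty."""
    if "+-" in text:
        value, uncertainty = text.split("+-", 1)
        return Quantity(Decimal(value), Decimal(uncertainty), explicit=True)
    value = Decimal(text)
    # One unit in the last significant digit: '324' -> 1, '3.20' -> 0.01
    estimated = Decimal(1).scaleb(value.as_tuple().exponent)
    return Quantity(value, estimated, explicit=False)
```

Automatically estimated uncertainties can then be ignored when desired, as suggested above.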
-- daniel
On 20.12.2012 20:31, Friedrich Röhrs wrote:
Hi,
I tried to enter the height of the Eiffel Tower. 324 meters. It suggested
324m +-100m.
That's strange. When I enter 324m, it correctly suggests 324m+/-1 for me.
-- daniel
to be 90+/-20cm or something.
-- daniel
system defined. You can
still use them, but you cannot compare them to values in another system of
measurement.
-- daniel
PS: the above reflects my personal ideas on how to best do this, this is not a
finished design and wasn't discussed with the rest of the wikidata team.
On 19.12.2012 18:13, Gregor Hagedorn wrote:
On 19 December 2012 17:03, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
Indeed we do: https://wikidata.org/wiki/Wikidata:Glossary
I use precision exactly like that: significant digits when rendering
output or
parsing input. It can be used
be in any unit we like.
The rendering of a value will be based on the primary data record, so the
conversion and rendering logic has access to all the additional information it
may want to use.
-- daniel
and sorted by the database (at the very least by
MySQL, but ideally, by many different database systems).
-- daniel
, interface
language, and heuristics for picking a decent unit based on dimension and
accuracy. The internal representation should use the same unit for all
quantities of a given dimension.
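One way to picture "the same unit for all quantities of a given dimension" (a toy sketch with a made-up unit table, not a proposed implementation):

```python
# Each unit maps to its dimension and a conversion factor into that
# dimension's internal base unit (metres and seconds here, arbitrarily).
UNITS = {
    "m":  ("length", 1.0),
    "km": ("length", 1000.0),
    "cm": ("length", 0.01),
    "s":  ("time",   1.0),
}

def to_internal(value: float, unit: str) -> tuple[str, float]:
    """Convert a value into the base unit of its dimension for storage."""
    dimension, factor = UNITS[unit]
    return dimension, value * factor

def comparable(unit_a: str, unit_b: str) -> bool:
    """Values are only comparable within a single dimension."""
    return UNITS[unit_a][0] == UNITS[unit_b][0]
```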
-- daniel
not be converted.
-- daniel
. I can't
think of an example where that wouldn't feel natural.
-- daniel
, geographical longitude/latitude, etc.
Indeed.
-- daniel
know.
-- daniel
that the English article can only have one outgoing interlanguage link
to German, the others are ignored (this was changed in core a few weeks ago,
unrelated to wikidata).
One solution would be to create a hub page for the law in general on the
German wikipedia too.
-- daniel
as such are not
copyrightable. But if there was a bot transferring stuff from infoboxes, it
should at least check for any actual text (e.g. long values with spaces), and
not transfer it, because of license reasons.
-- daniel
to inject changes
into the local recentchanges feed (and thus watchlist, relatedchanges, etc) was
made at the Hackathon in Berlin in June. I *thought* we had that written down
somewhere...
-- daniel
messages.
-- daniel
they are not the same but it's better to be!
: geo
latitude : 32.233,
longitude : -2.233,
},
}
Using magic keys (prefixed with _ or whatever) is kind of nasty, but saves
quite a bit of structural complexity.
-- daniel
On 05.10.2012 20:33, Amir E. Aharoni wrote:
Would it be possible to show
the number there, to let everybody know the precise version?
I think https://gerrit.wikimedia.org/r/#/c/17333/ would do that... it's pending
review.
-- daniel
is marked as preferred. This is already necessary for basic
things like population numbers. However, I don't think we need to expire data
automatically - it should just be superseded by newer information.
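The "superseded by newer information" behaviour described above is commonly expressed through statement ranks: clients take the preferred statements if any exist, otherwise the normal ones, and skip deprecated ones. An illustrative sketch, not Wikibase's actual code:

```python
def best_rank_statements(statements):
    """Select the statements a client should use by default:
    'preferred' ones if present, otherwise 'normal'; 'deprecated'
    statements (e.g. superseded population numbers) are skipped."""
    preferred = [s for s in statements if s["rank"] == "preferred"]
    return preferred or [s for s in statements if s["rank"] == "normal"]
```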
-- daniel
to enforce any
constraint based on a class system or similar, a true ontology in the RDFS/OWL
sense is unlikely to appear. We hope however that the relations and properties
we collect about items will be useful in building such an ontology in other
systems.
-- daniel
.
-- daniel
--
Daniel Kinzler, Softwarearchitekt
Wikimedia Deutschland e.V. | Eisenacher Straße 2 | 10777 Berlin
http://wikimedia.de | Tel. (030) 219 158 260
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
.
This is a regression, I have reopened
https://bugzilla.wikimedia.org/show_bug.cgi?id=38263. See also
https://bugzilla.wikimedia.org/show_bug.cgi?id=40077.
-- daniel
.
The idea behind the format we are proposing is to use self-contained entries
(records) for everything, and support named keys for convenience where possible.
-- daniel
2. Phase 1 (which we are currently finishing) is
only about interwikis. Phase 2 will cover almost everything you can find in
infoboxes today, including population data.
-- daniel
Works in Wikipedia either.
Anything that constitutes even a sentence cannot be copied to Wikidata, or
needs an extra license statement attached. Luckily, our data structure is flexible
enough that we could even do that, though I'd like to avoid it.
-- daniel
and the local transclusion interface (parser
functions).
A new transclusion draft is due some time next week, and we may also have a
first draft for the API by then. Keep an eye on the list!
-- daniel
of dynamic features, because we have to make sure that the basic
information is always available without the use of javascript.
-- daniel
for edit links. Especially not if the edit link is
supposed to invoke the on-site ajax editing interface. How do you generate a
link/button for doing that?
-- daniel
way to create infoboxes etc from
Wikidata items is very important for the acceptance of Wikidata on the client
wikis, I believe.
Thanks,
Daniel
be detected, is a much more tricky problem... but a
different topic.
-- daniel
parameter syntax with
pseudo-parameters, e.g. {{{data.color}}} for retrieving the flat wikitext value
of the property.
Do you think that's ok? Or does it introduce complications with respect to Lua,
etc?
-- daniel
to chatzilla on evenings or week-ends.
Cheers,
Christophe
, the names of
the git repositories of the Wikibase and WikibaseClient extensions. Apparently
it is not possible to rename git repositories on Wikimedia's infrastructure, so
we are stuck with them for now.
-- daniel
This is an interesting criticism, and there's an excellent retort by Denny in
the comments. Just fyi.
-- daniel
Original Message
Subject: [Wiki-research-l] Wikidata opinion piece in The Atlantic
Date: Tue, 10 Apr 2012 16:50:49 -0700
From: En Pine deyntest...@hotmail.com
data tables (albeit in MySQL) for many things
it needs quick access to, e.g. the categories assigned to wiki pages.
HTH
Daniel Kinzler
On 09.04.2012 11:55, Soslan Khubulov wrote:
Hello!
What about the engine of Wikidata?
Do you think MediaWiki is good for structured data?
I think MediaWiki
of discussions about the project.
We could go to mediawiki.org, LQT is enabled there.
-- daniel
this is often not the case. RDF is an intentionally simple model. This makes it
easy to mix and match data from different sources using different standards, but
it also makes it hard to represent certain types of data efficiently or
conveniently.
Regards,
Daniel