Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-10-01 Thread Lydia Pintscher
On Thu, Oct 1, 2015 at 10:38 AM, Markus Krötzsch
 wrote:
> This is still *very* far away from what I would consider the core tasks of
> Wikibase development now -- better get a snappy UI first -- but it is
> conceivable in the long run.

^ This!  :)  See also https://www.wikidata.org/wiki/Wikidata:Development_plan


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable
organisation by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-10-01 Thread Markus Krötzsch

On 01.10.2015 00:13, Daniel Kinzler wrote:

On 30.09.2015 at 23:38, Peter F. Patel-Schneider wrote:

I would argue that inference-making bots should be considered only as a
stop-gap measure, and that a different mechanism should be considered for
making inferences in Wikidata.  I am not arguing for Inference done Just Right
(tm).  It is not necessary to get inference perfect the first time around.
All that is required is an inference mechanism that is examinable and maybe
overridable.


To do that, you would have to bake the inference rules into software in the
backend software, out of community control, maintained by a small group of
people. It's contrary to the idea of letting the community define and maintain
the ontology and semantics.

We are actually experimenting with something in that direction -- checking
constraints defined on-wiki using rules written into software on the backend,
hard-coding rules that were defined by the community. It's conceivable that we
might end up doing something like that for inference, too, but it's a lot
harder, and the slippery slope away from the community model seems much steeper
to me.

When I started to think about, and work on, wikidata/wikibase, I believed doing
inference on the server would be very useful. The longer I work on the
project, the more convinced I become that we have to be very careful with this.
Wikidata is a "social machine", cutting the community out of the loop is
detrimental in the long run, even if it would make some processes more 
efficient.


As you already said, it would be a bad idea to hardcode rules in the 
backend, out of community control. But the solution to this is obvious 
and already partially realised for constraints: the rules (ontology) can 
be part of the user-edited data, and the software would only need a 
mechanism for interpreting them. This is still *very* far away from what 
I would consider the core tasks of Wikibase development now -- better 
get a snappy UI first -- but it is conceivable in the long run. The path 
towards this will be via third-party tools, which may also deploy bots 
(but maybe one "reasoning" bot rather than an army of them for all kinds 
of different inferences).


The first step is to design the rule/ontology language and to provide a 
scalable implementation to compute the resulting inferences. We have 
started a discussion on the Wikiproject:Reasoning, and I am looking into 
the technical aspects, but please give me a few more months to get a 
proper infrastructure set up there. It would not start with bots that 
import anything -- rather one would at first just produce inferences, 
display them somewhere, and let the community get a feeling of what 
could be worth importing there. It is also possible to add inferences to 
a query service/data dump only, avoiding all editing-related questions. 
I believe that the community cannot really start writing useful 
inference rules without tool support that allows you to preview the 
(global) consequences of such rules. I have some initial components for 
this, but Wikidata integration is only at its beginning. The goal would 
be to set up a public demonstrator this year, which proposes a format 
for inference rules and a machine to efficiently compute all of their 
conclusions.
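
The inference-without-import workflow described above can be sketched as a tiny forward chainer that records, for every derived statement, the premises it follows from, and only *proposes* the results for display rather than editing anything. This is a minimal illustration under assumed simplifications (triples as tuples, a single symmetric-property rule for P190 "sister city"; not the actual rule format or Wikibase data model under discussion):

```python
# Hypothetical, drastically simplified rule set: only one rule, stating
# that P190 (sister city) is symmetric. Real rules would live on-wiki.
SYMMETRIC = {"P190"}

def infer(statements):
    """statements: a set of (subject, property, value) triples.
    Returns a dict mapping each *derived* triple to the set of premise
    triples that support it; triples already present are skipped, and
    nothing is ever written back -- the output is for display only."""
    derived = {}
    for (s, p, v) in statements:
        if p in SYMMETRIC:
            reverse = (v, p, s)
            if reverse not in statements:
                derived.setdefault(reverse, set()).add((s, p, v))
    return derived

# Example: Q64 (Berlin) lists Q65 (Los Angeles) as a sister city,
# but the reverse statement is missing.
data = {("Q64", "P190", "Q65")}
proposals = infer(data)
# proposals == {("Q65", "P190", "Q64"): {("Q64", "P190", "Q65")}}
```

Because each proposal carries its premises, a preview tool could show the community exactly which existing statements would justify each suggested import.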


Markus





Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-30 Thread Daniel Kinzler
On 30.09.2015 at 23:38, Peter F. Patel-Schneider wrote:
> I would argue that inference-making bots should be considered only as a
> stop-gap measure, and that a different mechanism should be considered for
> making inferences in Wikidata.  I am not arguing for Inference done Just Right
> (tm).  It is not necessary to get inference perfect the first time around.
> All that is required is an inference mechanism that is examinable and maybe
> overridable.

To do that, you would have to bake the inference rules into software in the
backend software, out of community control, maintained by a small group of
people. It's contrary to the idea of letting the community define and maintain
the ontology and semantics.

We are actually experimenting with something in that direction -- checking
constraints defined on-wiki using rules written into software on the backend,
hard-coding rules that were defined by the community. It's conceivable that we
might end up doing something like that for inference, too, but it's a lot
harder, and the slippery slope away from the community model seems much steeper
to me.

When I started to think about, and work on, wikidata/wikibase, I believed doing
inference on the server would be very useful. The longer I work on the
project, the more convinced I become that we have to be very careful with this.
Wikidata is a "social machine", cutting the community out of the loop is
detrimental in the long run, even if it would make some processes more 
efficient.


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-30 Thread Peter F. Patel-Schneider
On 09/29/2015 08:01 AM, Daniel Kinzler wrote:
> On 29.09.2015 at 11:05, Thomas Douillard wrote:
>> No it's not, because of the "undoing" problem. A user can't delete a
>> statement assuming this will be enough, as it will not be explicit that the
>> statement is bot-added and implied by other statements, as opposed to a
>> statement explicitly inferred by Wikibase and marked explicitly as such in
>> the UI. If Wikibase tracks the root explicit statements used to make the
>> inference, they could be exposed in the UI as well to tell the user what he
>> might have to do to correct the mistake (closer to, or at, the actual root).
> 
> I agree: if we had built in inference done Just Right (tm), with everything
> editable and visible in all the right places, that would be great. But this
> would add a lot of complexity to the system, and would take a lot of resources
> to implement. It would also diverge quite a bit from the classic idea of a 
> wiki,
> potentially cause community issues.
> 
> The approach using bots was never ideal, but is still hugely successful on
> wikipedia. The same seems to be the case here. Also don't underestimate the 
> fact
> that the community has a lot of experience with bots, but is generally very
> skeptical against automatic content (even just including information from
> wikidata on wikipedia pages).
> 
> So, while bots are not ideal, and a better solution is conceivable, I think
> bots are the optimal solution for the moment. We should not ignore the issues that
> exist with bots, and we should not lose sight of other options. But I think we
> should focus development on more urgent things, like a better system for 
> source
> references, or unit conversion, or better tools for constraints, or for re-use
> on wikipedia.

I also strongly agree that inference-making tools should record their
premises.  There are lots of excellent reasons to do this recording, including
showing editors where changes need to be made to remove the inferred claim.
Inference-making bots that do not record how a claim was inferred are even
worse than an inferential system that does not do so, as determining which bot
made a particular inference is harder than determining which part of an
inferential system sanctions a particular inference.


What is the difference between a system of inference-making bots that record
their premises and an inferential system that records its premises?   In some
sense, not much.  I would thus argue that an inferential system is no more
complex than a set of inference-making bots.

However, an inferential system is not limited to the implementation techniques
that are needed in a bot system. It can, for example, perform some inferences
only on an as-needed basis. An inferential system can also be analyzed
as a whole, something that is quite difficult with a bot system.
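
The premises-recording behaviour argued for above can be sketched as a small truth-maintenance structure (hypothetical classes and triple encoding, not an actual Wikibase design): every inferred claim keeps the set of premises that support it, so it is examinable, and retracting a premise automatically retracts the claims that depended on it — something a fleet of independent bots cannot guarantee.

```python
class InferenceStore:
    """Asserted (community-edited) statements plus inferred claims,
    each inferred claim keeping the premise sets that support it."""

    def __init__(self):
        self.asserted = set()
        self.inferred = {}   # claim -> set of frozenset(premises)

    def assert_stmt(self, stmt):
        self.asserted.add(stmt)

    def record_inference(self, claim, premises):
        self.inferred.setdefault(claim, set()).add(frozenset(premises))

    def retract(self, stmt):
        """Remove a statement and every inference whose support included it."""
        self.asserted.discard(stmt)
        for claim in list(self.inferred):
            supports = {ps for ps in self.inferred[claim] if stmt not in ps}
            if supports:
                self.inferred[claim] = supports
            else:
                del self.inferred[claim]

store = InferenceStore()
store.assert_stmt(("Q64", "P190", "Q65"))
store.record_inference(("Q65", "P190", "Q64"), [("Q64", "P190", "Q65")])
store.retract(("Q64", "P190", "Q65"))
# store.inferred is now {} -- retracting the premise removed the inference
```

The same premise record is what lets the UI point an editor at the root statements to change, instead of leaving them to fight a bot.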

I would argue that inference-making bots should be considered only as a
stop-gap measure, and that a different mechanism should be considered for
making inferences in Wikidata.  I am not arguing for Inference done Just Right
(tm).  It is not necessary to get inference perfect the first time around.
All that is required is an inference mechanism that is examinable and maybe
overridable.


Peter F. Patel-Schneider
Nuance Communications






Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Yongmin Hong
On 2015. 9. 30. at 1:38, "Thad Guidry" wrote:
>
> Revi,
>
> I do not speak or write Korean, so I could not help you or Wikidata with
> adding a description in the Korean language.
> My edit is not useless for English speakers but helps make the
description more clear and easier to understand.
>

I live in Korea, and I have never been to any other country since I was born.
So I don't need your help, but that's not the point ;) (Maybe you don't speak
Korean? I dunno... This shows my English is not that good!)

But I am busy in real life and with other wiki matters, so I am not as
active as I was in 2013~early 2014, when I was a somewhat active
admin. (Check out the log for -revi.) Also, if you need a Korean native
speaker, IIRC, an admin list by native language is always available on-wiki.

ko was just my example because it's what I learned for the first time. ;)

>
> My understanding is that enhancing parts and areas of Wikidata and its
data is the responsibility of everyone involved.  If someone only knows or
can help in one small area and make an improvement, then it is acceptable.
>
> Is this not a correct assumption ?
>

What I wanted to say is, "You can update the en description (or any language
you know), but you cannot expect others who have edited their own languages'
descriptions to regularly check the en description for updates and keep them
in sync". Sorry for the confusion, and I hope this clarifies my
initial comment.

>
> Perhaps you know someone who speaks and writes Korean and could add a
Korean description to help Wikidata further as I have ? :)
>

[[d:User_talk:-revi]] is always open, but if I am unresponsive,
[[d:WD:사랑방|Korean VP]] is always open to anyone including you as well.(*)

(*) I'm typing this message at 02:48 AM and feeling too lazy to put the
full link here.

ps. Cutting the messages to part on Android is always a pain.  :-p

--
revi
https://revi.me
-- Sent from Android --


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Bene*

Hey guys

On 29.09.2015 at 09:39, Thomas Douillard wrote:
> > Some did, I think. :) Anything that doesn't create a recentchanges
> > entry is "hiding that it happened".
>
> Then I'd say the alternatives are either
> 1) no inferences at all
> 2) find a solution for inferences to show up on recent changes on
> related items


This brings up a very interesting idea. We could display certain 
properties which are said to be symmetric on both items and treat them 
like we treat statement usages on clients: showing an entry in the 
watchlist and (later) in the page history of that item. The mechanism 
for this already exists on client wikis, maybe we can make it work on 
wikidata.org as well.
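
The idea above — notifying watchers on both sides of a symmetric statement — can be sketched as a tiny routing function (hypothetical event model and property table, not the actual Wikibase change-dispatch code):

```python
# Hypothetical table of symmetric properties; P190 is "sister city".
SYMMETRIC = {"P190"}

def watchlist_targets(action, subject, prop, value):
    """Return the items whose watchers should see this edit. For a
    symmetric property, the item on the other side is notified as well,
    mirroring how client wikis surface Wikidata changes."""
    targets = [subject]
    if prop in SYMMETRIC and value != subject:
        targets.append(value)   # notify watchers of the "other side" too
    return targets

targets = watchlist_targets("add", "Q64", "P190", "Q65")
# targets == ["Q64", "Q65"] -- both items get a watchlist entry
```

With such routing, the inferred or mirrored side of a statement would still show up in recentchanges, answering the "hiding that it happened" objection.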


Best regards
Bene



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Thomas Douillard
Time available and priorities of the dev team are something entirely
different :)

Just to add a thought and be complete, I must add that we have focused the
discussion on bots, but that misses the fact that there is not only bot work
but also manual and semi-automated work involved (typically with Magnus'
tools): removing a statement can trigger a constraint on symmetry, which
another user will resolve by re-adding the statement you just removed. This
might trigger community issues as well.

2015-09-29 17:01 GMT+02:00 Daniel Kinzler :

> On 29.09.2015 at 11:05, Thomas Douillard wrote:
> > No it's not, because of the "undoing" problem. A user can't delete a
> > statement assuming this will be enough, as it will not be explicit that
> > the statement is bot-added and implied by other statements, as opposed to
> > a statement explicitly inferred by Wikibase and marked explicitly as such
> > in the UI. If Wikibase tracks the root explicit statements used to make
> > the inference, they could be exposed in the UI as well to tell the user
> > what he might have to do to correct the mistake (closer to, or at, the
> > actual root).
>
> I agree: if we had built in inference done Just Right (tm), with everything
> editable and visible in all the right places, that would be great. But this
> would add a lot of complexity to the system, and would take a lot of
> resources
> to implement. It would also diverge quite a bit from the classic idea of a
> wiki,
> potentially cause community issues.
>
> The approach using bots was never ideal, but is still hugely successful on
> wikipedia. The same seems to be the case here. Also don't underestimate
> the fact
> that the community has a lot of experience with bots, but is generally very
> skeptical against automatic content (even just including information from
> wikidata on wikipedia pages).
>
> So, while bots are not ideal, and a better solution is conceivable, I
> think bots are the optimal solution for the moment. We should not ignore the issues
> that
> exist with bots, and we should not lose sight of other options. But I
> think we
> should focus development on more urgent things, like a better system for
> source
> references, or unit conversion, or better tools for constraints, or for
> re-use
> on wikipedia.
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Thad Guidry
Revi,

I do not speak or write Korean, so I could not help you or Wikidata with
adding a description in the Korean language.
My edit is not useless for English speakers but helps make the description
clearer and easier to understand.

My understanding is that enhancing parts and areas of Wikidata and its data
is the responsibility of everyone involved.  If someone only knows or can
help in one small area and make an improvement, then it is acceptable.

Is this not a correct assumption ?

Perhaps you know someone who speaks and writes Korean and could add a
Korean description to help Wikidata further as I have ? :)

Thad
+ThadGuidry 

On Mon, Sep 28, 2015 at 7:08 PM, Yongmin Hong  wrote:

>
> On 2015. 9. 28. at 5:11 AM, "Thad Guidry" wrote:
> >
> > I have improved the Sister City property with a MUCH better description
> to save others further grief.
> >
>
> If there is translated (e.g. ko) description and user's interface is set
> to Korean, then just editing English description is useless, since the
> Korean description will override the English one. And you cannot expect
> everyone to magically notice update on English description and update them
> every time it is updated.
>
> disclaimer: I'm on my phone now, and haven't verified whether there is a Korean
> description on P190 or not. But this applies to all users with non-English
> default language in their preferences.
>
> --
> revi
> https://revi.me
> -- Sent from Android --
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Daniel Kinzler
On 29.09.2015 at 11:05, Thomas Douillard wrote:
> No it's not, because of the "undoing" problem. A user can't delete a statement
> assuming this will be enough, as it will not be explicit that the statement is
> bot-added and implied by other statements, as opposed to a statement
> explicitly inferred by Wikibase and marked explicitly as such in the UI. If Wikibase
> tracks the root explicit statements used to make the inference, they could be
> exposed in the UI as well to tell the user what he might have to do to correct
> the mistake (closer to, or at, the actual root).

I agree: if we had built in inference done Just Right (tm), with everything
editable and visible in all the right places, that would be great. But this
would add a lot of complexity to the system, and would take a lot of resources
to implement. It would also diverge quite a bit from the classic idea of a
wiki, and potentially cause community issues.

The approach using bots was never ideal, but is still hugely successful on
wikipedia. The same seems to be the case here. Also don't underestimate the fact
that the community has a lot of experience with bots, but is generally very
skeptical against automatic content (even just including information from
wikidata on wikipedia pages).

So, while bots are not ideal, and a better solution is conceivable, I think bots
are the optimal solution for the moment. We should not ignore the issues that
exist with bots, and we should not lose sight of other options. But I think we
should focus development on more urgent things, like a better system for source
references, or unit conversion, or better tools for constraints, or for re-use
on wikipedia.


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Peter F. Patel-Schneider
On 09/28/2015 11:24 PM, Federico Leva (Nemo) wrote:
> Peter F. Patel-Schneider, 28/09/2015 22:27:
>>> > I'm arguing against making such inference part of wikibase/wikidata core
>>> > functionality, and hiding its working ("magic").
>>> >
>>> > However, I very much hope for a whole ecosystem of tools that apply and
>>> > use such inference, and make the results obvious to users, both
>>> > integrated with wikidata.org and outside.
>>
>> Has anyone argued for performing inference and then hiding that it happened?
> 
> Some did, I think. :) Anything that doesn't create a recentchanges entry is
> "hiding that it happened".
> 
> Nemo
> 

Citation please.


peter




Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Daniel Kinzler
On 28.09.2015 at 20:42, John Erling Blad wrote:
> Probability of detection (PoD) is central to fighting vandalism, and that does
> not imply making the vandalism less visible.
> 
> Symmetric statements make vandalism appear in more places, making it more
> visible, and thereby increasing the chance of detection.

If such "magic" changes show up on the watchlist and in the page history, I
agree. Then they are edits (or at least pseudo-edits). The simplest way to
achieve this is by using bots.

> If you isolate the vandalism it will be less visible, but then it will be more
> likely that no one will ever spot it.
> 
> And yes, PoD is a military thingy and as such is disliked by the
> wikicommunities. Still sometimes it is wise to check out what is actually
> working and why it is working.

Isolation reduces the impact. I think Probability of Detection is actually a
very good metric when thinking about vandalism. But it needs to be weighed
against impact/visibility.

For Wikibase integration in Wikipedia, we worked to make the PoD grow along with
the impact: pages that use info from wikidata will have changes to that data
show up in the watchlist and recentchanges (integration with the page history
is still planned).

-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Thomas Douillard
No it's not, because of the "undoing" problem. A user can't delete a
statement assuming this will be enough, as it will not be explicit that the
statement is bot-added and implied by other statements, as opposed to a
statement explicitly inferred by Wikibase and marked explicitly as such
in the UI. If Wikibase tracks the root explicit statements used to make the
inference, they could be exposed in the UI as well to tell the user what he
might have to do to correct the mistake (closer to, or at, the actual root).

Actually, instead of making things explicit, it makes things hidden inside
complex workflows, so IMHO this goes in the opposite direction to the
intended one.
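
The "undoing" problem described above can be made concrete with a sketch of a naive symmetry bot (hypothetical bot logic and triple encoding, purely illustrative): because the bot only sees the current state, it cannot distinguish "the reverse statement is missing" from "a user deliberately removed it", and so it re-adds the removed statement.

```python
# Hypothetical table of symmetric properties; P190 is "sister city".
SYMMETRIC = {"P190"}

def symmetry_bot_pass(statements):
    """One pass of a naive symmetry bot: add the reverse of every
    symmetric statement that is missing. The bot has no record of
    *why* a reverse statement is absent."""
    additions = {(v, p, s) for (s, p, v) in statements
                 if p in SYMMETRIC and (v, p, s) not in statements}
    return statements | additions

data = {("Q64", "P190", "Q65"), ("Q65", "P190", "Q64")}
data.discard(("Q65", "P190", "Q64"))   # a user removes one direction...
data = symmetry_bot_pass(data)         # ...and the bot puts it straight back
# ("Q65", "P190", "Q64") is in data again
```

An inference mechanism that tracked premises could instead surface the remaining root statement to the user, rather than silently reverting them.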

2015-09-29 10:07 GMT+02:00 Andrew Gray :

> On 29 September 2015 at 08:39, Thomas Douillard <
> thomas.douill...@gmail.com> wrote:
>
>> > Some did, I think. :) Anything that doesn't create a recentchanges
>> entry is "hiding that it happened".
>>
>> Then I'd say the alternatives are either
>> 1) no inferences at all
>> 2) find a solution for inferences to show up on recent changes on related
>> items
>>
>> Of course if we want to have inferences 1 is not really an option :) and
>> if two is possible, and I don't see why it should not be, then this solves
>> the (not) hiding problem.
>>
>
> Isn't option #2 effectively the same (to the end-user) as having someone
> build a bot to fill in all the inferences as soon as one part is created?
> It's just making it faster and more robust...
>
> Andrew.
>
> --
> - Andrew Gray
>   andrew.g...@dunelm.org.uk
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Andrew Gray
On 29 September 2015 at 08:39, Thomas Douillard 
wrote:

> > Some did, I think. :) Anything that doesn't create a recentchanges
> entry is "hiding that it happened".
>
> Then I'd say the alternatives are either
> 1) no inferences at all
> 2) find a solution for inferences to show up on recent changes on related
> items
>
> Of course if we want to have inferences 1 is not really an option :) and
> if two is possible, and I don't see why it should not be, then this solves
> the (not) hiding problem.
>

Isn't option #2 effectively the same (to the end-user) as having someone
build a bot to fill in all the inferences as soon as one part is created?
It's just making it faster and more robust...

Andrew.

-- 
- Andrew Gray
  andrew.g...@dunelm.org.uk


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-29 Thread Thomas Douillard
> Some did, I think. :) Anything that doesn't create a recentchanges entry
> is "hiding that it happened".

Then I'd say the alternatives are either
1) no inferences at all
2) find a solution for inferences to show up on recent changes on related
items

Of course if we want to have inferences 1 is not really an option :) and if
two is possible, and I don't see why it should not be, then this solves the
(not) hiding problem.

2015-09-29 8:24 GMT+02:00 Federico Leva (Nemo) :

> Peter F. Patel-Schneider, 28/09/2015 22:27:
>
>>> > I'm arguing against making such inference part of wikibase/wikidata core
>>> > functionality, and hiding its working ("magic").
>>> >
>>> > However, I very much hope for a whole ecosystem of tools that apply and
>>> > use such inference, and make the results obvious to users, both
>>> > integrated with wikidata.org and outside.
>>>
>>
>> Has anyone argued for performing inference and then hiding that it
>> happened?
>>
>
> Some did, I think. :) Anything that doesn't create a recentchanges entry
> is "hiding that it happened".
>
> Nemo
>
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Federico Leva (Nemo)

Peter F. Patel-Schneider, 28/09/2015 22:27:

>> I'm arguing against making such inference part of wikibase/wikidata core
>> functionality, and hiding its working ("magic").
>>
>> However, I very much hope for a whole ecosystem of tools that apply and use
>> such inference, and make the results obvious to users, both integrated with
>> wikidata.org and outside.

> Has anyone argued for performing inference and then hiding that it happened?


Some did, I think. :) Anything that doesn't create a recentchanges entry 
is "hiding that it happened".


Nemo



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Yongmin Hong
On 2015. 9. 28. at 5:11 AM, "Thad Guidry" wrote:
>
> I have improved the Sister City property with a MUCH better description
to save others further grief.
>

If there is a translated (e.g. ko) description and the user's interface is set
to Korean, then just editing the English description is useless, since the
Korean description will override the English one. And you cannot expect
everyone to magically notice updates to the English description and update
their own every time it is updated.

disclaimer: I'm on my phone now, and haven't verified whether there is a Korean
description on P190 or not. But this applies to all users with a non-English
default language in their preferences.

--
revi
https://revi.me
-- Sent from Android --


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Peter F. Patel-Schneider
On 09/28/2015 08:30 AM, Daniel Kinzler wrote:
On 28.09.2015 at 17:27, Peter F. Patel-Schneider wrote:
>> Are you arguing against any tool that makes inferences combining multiple
>> pieces of data in Wikidata?  Would you also argue against this if the 
>> inferred
>> information is flagged in some way?
> 
> I'm arguing against making such inference part of wikibase/wikidata core
> functionality, and hiding its working ("magic").
> 
> However, I very much hope for a whole ecosystem of tools that apply and use 
> such
> inference, and make the results obvious to users, both integrated with
> wikidata.org and outside.

Has anyone argued for performing inference and then hiding that it happened?

One problem that I see with Wikidata at the moment is that it is not obvious
what inferences should or could be done. There is no theory of knowledge that
stands behind Wikidata. It seems to me that in the absence of such a theory
people are building bots that do some checks on the data in Wikidata in an
attempt to check some of the inferences that would be sanctioned by such a
theory. However, I do not see any determination that the bots are covering
those checks that should be made. I guess that now people are also building
bots that make some of the inferences that would be sanctioned by a theory of
knowledge for Wikidata. Again, however, there doesn't seem to be any
determination that these bots are making correct inferences or that they are
covering a group of inferences that should be made.

I view this situation as inferior to an implementation of an integrated set of
inferences for Wikidata.


Admittedly, coming up with a knowledge theory for Wikidata is not going to be
easy.  It is much easier to just write a bot that does something that might be
useful.


peter




Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread John Erling Blad
Probability of detection (PoD) is central to fighting vandalism, and that
does not imply making the vandalism less visible.

Symmetric statements make vandalism appear in more places, making it more
visible, and thereby increasing the chance of detection.

If you isolate the vandalism it will be less visible, but then it will be
more likely that no one will ever spot it.

And yes, PoD is a military thingy and as such is disliked by the
wikicommunities. Still sometimes it is wise to check out what is actually
working and why it is working.


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread John Erling Blad
Depending on bots to set up symmetric relations is one of the things I find
very weird in Wikidata. It creates a situation where a user makes an edit,
and a bot later overrides the user's previous edit. It is exactly the
same race condition that we fought earlier with the iw-bots, but now
replicated in Wikidata - a system that was supposed to remove the problem.

/me dumb, me confused.. o_O



On Mon, Sep 28, 2015 at 5:12 PM, Daniel Kinzler  wrote:

> On 28.09.2015 at 16:43, Thomas Douillard wrote:
> > Daniel Wrote:
> >> (*) This follows the principle of "magic is bad, let people edit".
> >> Allowing inconsistencies means we can detect errors by finding such
> >> inconsistencies. Automatically enforcing consistency may lead to errors
> >> propagating out of view of the curation process. The QA process on wikis
> >> is centered around edits, so every change should be an edit. Using a bot
> >> to fill in missing "reverse" links follows this idea. The fact that you
> >> found an issue with the data because you saw a bot do an edit is an
> >> example of this principle working nicely.
> >
> > That might prove to become a worse nightmare than the magic one ... It
> > seems like refusing any kind of automation because it might surprise
> > people, for the sake of exhausting them with a lot of manual work.
>
> I'm not arguing against "any" kind of automation. I'm arguing against
> "invisible" automation baked into the backend software. We(*) very much
> encourage "visible" automation under community control like bots and other
> (semi-)automatic import tools like WiDaR.
>
> -- daniel
>
>
> (*) I'm part of the wikidata developer team, not an active member of the
> community. I'm primarily speaking for myself here, from my personal
> experience
> as a Wikipedia and Commons admin. I know from past discussions that "bots
> over
> magic" is considered Best Practice among the dev team, and I believe it's
> also
> the approach preferred by the Wikidata community, but I cannot speak for
> them.
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Daniel Kinzler
Am 28.09.2015 um 17:27 schrieb Peter F. Patel-Schneider:
> Are you arguing against any tool that makes inferences combining multiple
> pieces of data in Wikidata?  Would you also argue against this if the inferred
> information is flagged in some way?

I'm arguing against making such inference part of Wikibase/Wikidata core
functionality, and hiding its workings ("magic").

However, I very much hope for a whole ecosystem of tools that apply and use such
inference, and make the results obvious to users, both integrated with
wikidata.org and outside.


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Thomas Douillard
2015-09-28 17:20 GMT+02:00 Daniel Kinzler :

> Slow edit processes give
> people time for review
>


Faster propagation of edits with logical inferences would also make
mistakes more visible, hence faster to find and review. While I might once
have argued that a little redundancy, like having symmetric statements, would
make vandalism harder, I have totally changed my mind. It will make vandalism
harder to track and really tedious to correct, hence a maintenance
nightmare, especially for complex chains of automation.
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Peter F. Patel-Schneider
On 09/28/2015 08:12 AM, Daniel Kinzler wrote:
> Am 28.09.2015 um 16:43 schrieb Thomas Douillard:
>> Daniel Wrote:
>>> (*) This follows the principle of "magic is bad, let people edit". Allowing
>>> inconsistencies means we can detect errors by finding such inconsistencies.
>>> Automatically enforcing consistency may lead to errors propagating out of 
>>> view
>>> of the curation process. The QA process on wikis is centered around edits, 
>>> so
>>> every change should be an edit. Using a bot to fill in missing "reverse" 
>>> links
>>> follows this idea. The fact that you found an issue with the data because 
>>> you
>>> saw a bot do an edit is an example of this principle working nicely.
>>
>> That might prove to become a worse nightmare than the magic one ... It
>> seems
>> like refusing any kind of automation because it might surprise people for the
>> sake of exhausting them to let them do a lot of manual work.
> 
> I'm not arguing against "any" kind of automation. I'm arguing against
> "invisible" automation baked into the backend software. We(*) very much
> encourage "visible" automation under community control like bots and other
> (semi-)automatic import tools like WiDaR.
> 
> -- daniel
> 
> 
> (*) I'm part of the wikidata developer team, not an active member of the
> community. I'm primarily speaking for myself here, from my personal experience
as a Wikipedia and Commons admin. I know from past discussions that "bots over
> magic" is considered Best Practice among the dev team, and I believe it's also
> the approach preferred by the Wikidata community, but I cannot speak for them.

I'm not sure what you are arguing against here.

Are you arguing against any tool that makes inferences combining multiple
pieces of data in Wikidata?  Would you also argue against this if the inferred
information is flagged in some way?

peter

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Daniel Kinzler
Am 28.09.2015 um 16:41 schrieb Peter F. Patel-Schneider:
> I agree that finding the right thing to use is not easy.
> 
> However, I think that a uniform search space is better than a non-uniform one.
>   I would much prefer to look through a collection of properties than a
> collection of properties and qualifiers.  If I am writing a tool to help the
> process, I would much prefer to display a collection of properties than a
> collection of properties plus qualifiers.

I understand. But you buy flatness of the search space by only having one
dimension for modelling, instead of the multiple degrees of freedom you have
with qualifiers.

Similarly: of course it would be nice to build an ontology of everything in the
world using a single class hierarchy (taxonomy). Nice and clean and easy to
handle. But this approach runs into problems as soon as you try to model a
non-trivial domain. The result is usually a very awkward modelling of the
domain.

Using a more expressive model (e.g. adding interfaces as in typical OO
languages, or adding mixins, facets, traits, etc.) makes the model as such more
complex, but the mapping to the domain (in our case: the world) less complex,
and more natural.
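A minimal sketch of this analogy, with hypothetical class names (none of this is Wikibase code): a single hierarchy would force every facet into one chain of subclasses, while mixins let an item combine independent facets naturally.

```python
# Hypothetical illustration: facets of a "city" modelled as mixins
# rather than forced into a single class hierarchy.

class Settlement:
    """Base class in the main taxonomy."""

class HasMayor:
    """Mixin: a 'mayor' facet, independent of the main hierarchy."""
    mayor = None

class HasTwinCities:
    """Mixin: a 'twin cities' facet."""
    def __init__(self):
        super().__init__()
        self.twins = []

class City(Settlement, HasMayor, HasTwinCities):
    """Combines the taxonomy with whichever facets apply."""
    def __init__(self, name):
        super().__init__()  # cooperative init runs the mixin setup
        self.name = name

berlin = City("Berlin")
berlin.twins.append("Los Angeles")
print(berlin.twins)  # ['Los Angeles']
```

The model (four classes instead of one) is more complex, but each facet maps to one aspect of the domain instead of being wedged into a single tree.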


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Thomas Douillard
Bots are actually much more opaque than explicit inference rules could be
(we don't have the source code of Krbot, for example). It seems my problem
originated with lsjbot, which created articles on nlwiki; these were imported
into Wikidata, and then other statements were created ... this is actually hard
to maintain, and the origin of the data is traceable, but not easily. For the
user, a bot's work is as opaque as Wikidata's own work, if not more so, whereas
rules could be transparent, and Wikibase could itself provide explanations and
trace the origin of the data and of the inferences.
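To make the contrast concrete, here is a small sketch (not Wikibase code; the statement format and property ID usage are illustrative) of an explicit inference rule that records where each derived statement came from, so its origin stays traceable:

```python
# Sketch: an explicit symmetric-property rule that tags every derived
# statement with a provenance note instead of editing silently.

def infer_symmetric(statements, prop):
    """For each (subject, prop, value) statement, add the missing
    reverse statement, annotated with where it was inferred from."""
    have = {(s["subject"], s["property"], s["value"]) for s in statements}
    derived = []
    for s in statements:
        if s["property"] != prop:
            continue
        reverse = (s["value"], prop, s["subject"])
        if reverse not in have:
            derived.append({
                "subject": reverse[0],
                "property": prop,
                "value": reverse[2],
                "provenance": f"inferred: {prop} is symmetric, from {s['subject']}",
            })
    return derived

# Toy data: one direction of a twin-city link (P190).
stmts = [{"subject": "Q64", "property": "P190", "value": "Q65"}]
print(infer_symmetric(stmts, "P190"))
```

Because the provenance travels with the derived statement, a user who finds a wrong claim can follow it back to the rule and the triggering statement, rather than reverse-engineering a bot's behaviour.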

2015-09-28 17:12 GMT+02:00 Daniel Kinzler :

> Am 28.09.2015 um 16:43 schrieb Thomas Douillard:
> > Daniel Wrote:
> >> (*) This follows the principle of "magic is bad, let people edit".
> Allowing
> >> inconsistencies means we can detect errors by finding such
> inconsistencies.
> >> Automatically enforcing consistency may lead to errors propagating out
> of view
> >> of the curation process. The QA process on wikis is centered around
> edits, so
> >> every change should be an edit. Using a bot to fill in missing
> "reverse" links
> >> follows this idea. The fact that you found an issue with the data
> because you
> >> saw a bot do an edit is an example of this principle working nicely.
> >
> > That might prove to become a worse nightmare than the magic one ... It seems
> > like refusing any kind of automation because it might surprise people for the
> > sake of exhausting them to let them do a lot of manual work.
>
> I'm not arguing against "any" kind of automation. I'm arguing against
> "invisible" automation baked into the backend software. We(*) very much
> encourage "visible" automation under community control like bots and other
> (semi-)automatic import tools like WiDaR.
>
> -- daniel
>
>
> (*) I'm part of the wikidata developer team, not an active member of the
> community. I'm primarily speaking for myself here, from my personal
> experience
> as a Wikipedia and Commons admin. I know from past discussions that "bots
> over
> magic" is considered Best Practice among the dev team, and I believe it's
> also
> the approach preferred by the Wikidata community, but I cannot speak for
> them.
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Daniel Kinzler
Am 28.09.2015 um 16:54 schrieb Thomas Douillard:
> An example that just happened a few minutes ago : I did this kind of edits
> because the claims are wrong :
> https://www.wikidata.org/w/index.php?title=Q12191&diff=254099391&oldid=254099376
> I think I already did this in the past. After a chat with Yamaha5, it appears 
> he
> added this to complete a symmetric relation, which means if I just remove the
> claim in this item they are likely to come back. But it might be a chain of
> works : maybe a robot had imported this from some Wikipedia, then Yamaha
> completed the symmetric relation. If I remove the symmetric claims, the robot
> might reimport them, so ... with or without inferences and magic, we will 
> have
> to trace the origin of the problem to solve it once and for all.

Yes, bad info coming back via bots is a problem, but often it actually helps to
surface some underlying issue with the data. It's not always easy to find or fix
that underlying problem, but it's possible.

Making individual edits more powerful by adding internal automation (e.g.
removing all "child" statements from a person's item would also "magically"
remove the "parent" statement from the children's items) would not only make it
easier to fix problems. It would also make mistakes harder to find, and
it would give vandals a real boost in efficiency.

The Wiki Way is, in some ways, inefficient on purpose. Slow edit processes give
people time for review. It's a bit like democracy in that regard...
dictatorships are a lot more efficient than democracies, but does that make them
better?...


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Daniel Kinzler
Am 28.09.2015 um 16:43 schrieb Thomas Douillard:
> Daniel Wrote:
>> (*) This follows the principle of "magic is bad, let people edit". Allowing
>> inconsistencies means we can detect errors by finding such inconsistencies.
>> Automatically enforcing consistency may lead to errors propagating out of 
>> view
>> of the curation process. The QA process on wikis is centered around edits, so
>> every change should be an edit. Using a bot to fill in missing "reverse" 
>> links
>> follows this idea. The fact that you found an issue with the data because you
>> saw a bot do an edit is an example of this principle working nicely.
> 
> That might prove to become a worse nightmare than the magic one ... It
> seems
> like refusing any kind of automation because it might surprise people for the
> sake of exhausting them to let them do a lot of manual work.

I'm not arguing against "any" kind of automation. I'm arguing against
"invisible" automation baked into the backend software. We(*) very much
encourage "visible" automation under community control like bots and other
(semi-)automatic import tools like WiDaR.

-- daniel


(*) I'm part of the wikidata developer team, not an active member of the
community. I'm primarily speaking for myself here, from my personal experience
as a Wikipedia and Commons admin. I know from past discussions that "bots over
magic" is considered Best Practice among the dev team, and I believe it's also
the approach preferred by the Wikidata community, but I cannot speak for them.


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Thomas Douillard
An example that just happened a few minutes ago: I did this kind of edit
because the claims are wrong:
https://www.wikidata.org/w/index.php?title=Q12191&diff=254099391&oldid=254099376
I think I already did this in the past. After a chat with Yamaha5, it
appears he added this to complete a symmetric relation, which means if I
just remove the claim in this item it is likely to come back. But it
might be a chain of operations: maybe a robot had imported this from some
Wikipedia, then Yamaha completed the symmetric relation. If I remove the
symmetric claims, the robot might reimport them, so ... with or without
inferences and magic, we will have to trace the origin of the problem to
solve it once and for all.

2015-09-28 16:43 GMT+02:00 Thomas Douillard :

> >
> (*) This follows the principle of "magic is bad, let people edit". Allowing
> inconsistencies means we can detect errors by finding such inconsistencies.
> Automatically enforcing consistency may lead to errors propagating out of
> view
> of the curation process. The QA process on wikis is centered around edits,
> so
> every change should be an edit. Using a bot to fill in missing "reverse"
> links
> follows this idea. The fact that you found an issue with the data because
> you
> saw a bot do an edit is an example of this principle working nicely.
>
> That might prove to become a worse nightmare than the magic one ... It
> seems like refusing any kind of automation because it might surprise people
> for the sake of exhausting them to let them do a lot of manual work.
>
> 2015-09-28 16:23 GMT+02:00 Daniel Kinzler :
>
>> Am 27.09.2015 um 21:19 schrieb Thad Guidry:
>> > Both Sides ?  Wikidata has a true graph representation like FB ?
>> didn't know
>> > that.  Can you show me the other side your referring too ?
>>
>> "Both sides" probably means that "sister city" is a symmetric property,
>> so if
>> item A refers to item B as a sister city, item B should also refer to
>> item A as
>> a sister city. This is not automatic, and it was a conscious design
>> decision to
>> not make it automatic(*).
>>
>> What do you mean by "true graph representation"? Wikidata internally uses
>> JSON
>> structures to represent items, and items reference each other, forming a
>> graph.
>> We have a linked data interface for traversing the graph[1]. We also have
>> an RDF
>> mapping with a SPARQL endpoint[2] that allows queries against that graph.
>>
>> -- daniel
>>
>>
>> [1]
>> https://www.wikidata.org/wiki/Wikidata:Data_access#Linked_Data_interface
>> [2] https://www.wikidata.org/wiki/Wikidata:Data_access#SPARQL_endpoints
>>
>> (*) This follows the principle of "magic is bad, let people edit".
>> Allowing
>> inconsistencies means we can detect errors by finding such
>> inconsistencies.
>> Automatically enforcing consistency may lead to errors propagating out of
>> view
>> of the curation process. The QA process on wikis is centered around
>> edits, so
>> every change should be an edit. Using a bot to fill in missing "reverse"
>> links
>> follows this idea. The fact that you found an issue with the data because
>> you
>> saw a bot do an edit is an example of this principle working nicely.
>>
>>
>> --
>> Daniel Kinzler
>> Senior Software Developer
>>
>> Wikimedia Deutschland
>> Gesellschaft zur Förderung Freien Wissens e.V.
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Thomas Douillard
>
(*) This follows the principle of "magic is bad, let people edit". Allowing
inconsistencies means we can detect errors by finding such inconsistencies.
Automatically enforcing consistency may lead to errors propagating out of
view
of the curation process. The QA process on wikis is centered around edits,
so
every change should be an edit. Using a bot to fill in missing "reverse"
links
follows this idea. The fact that you found an issue with the data because
you
saw a bot do an edit is an example of this principle working nicely.

That might prove to become a worse nightmare than the magic one ... It
seems like refusing any kind of automation because it might surprise people
for the sake of exhausting them to let them do a lot of manual work.

2015-09-28 16:23 GMT+02:00 Daniel Kinzler :

> Am 27.09.2015 um 21:19 schrieb Thad Guidry:
> > Both Sides ?  Wikidata has a true graph representation like FB ?  didn't
> know
> > that.  Can you show me the other side your referring too ?
>
> "Both sides" probably means that "sister city" is a symmetric property, so
> if
> item A refers to item B as a sister city, item B should also refer to item
> A as
> a sister city. This is not automatic, and it was a conscious design
> decision to
> not make it automatic(*).
>
> What do you mean by "true graph representation"? Wikidata internally uses
> JSON
> structures to represent items, and items reference each other, forming a
> graph.
> We have a linked data interface for traversing the graph[1]. We also have
> an RDF
> mapping with a SPARQL endpoint[2] that allows queries against that graph.
>
> -- daniel
>
>
> [1]
> https://www.wikidata.org/wiki/Wikidata:Data_access#Linked_Data_interface
> [2] https://www.wikidata.org/wiki/Wikidata:Data_access#SPARQL_endpoints
>
> (*) This follows the principle of "magic is bad, let people edit". Allowing
> inconsistencies means we can detect errors by finding such inconsistencies.
> Automatically enforcing consistency may lead to errors propagating out of
> view
> of the curation process. The QA process on wikis is centered around edits,
> so
> every change should be an edit. Using a bot to fill in missing "reverse"
> links
> follows this idea. The fact that you found an issue with the data because
> you
> saw a bot do an edit is an example of this principle working nicely.
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Peter F. Patel-Schneider
On 09/28/2015 07:25 AM, Daniel Kinzler wrote:
> Am 28.09.2015 um 16:14 schrieb Peter F. Patel-Schneider:
>> I worry about this way of specializing properties.   How are people, and
>> particularly programs, going to be able to find out that a qualifier is
>> needed, which qualifier it is, and how it is to be used, or which broad
>> property is to be used for a specific purpose?
> 
> This problem exists either way: either you have to know which specific 
> property
> to use, or you have to know how to qualify the broader property. In both 
> cases,
> human readable documentation is the way to find out.

I agree that finding the right thing to use is not easy.

However, I think that a uniform search space is better than a non-uniform one.
I would much prefer to look through a collection of properties than a
collection of properties and qualifiers.  If I am writing a tool to help the
process, I would much prefer to display a collection of properties than a
collection of properties plus qualifiers.

peter

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Daniel Kinzler
Am 28.09.2015 um 16:14 schrieb Peter F. Patel-Schneider:
> I worry about this way of specializing properties.   How are people, and
> particularly programs, going to be able to find out that a qualifier is
> needed, which qualifier it is, and how it is to be used, or which broad
> property is to be used for a specific purpose?

This problem exists either way: either you have to know which specific property
to use, or you have to know how to qualify the broader property. In both cases,
human readable documentation is the way to find out.


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread John Erling Blad
I would like to add "minister", as there are some fine distinctions about who
is and who is not in a government. Still, we call them all "ministers". Very
confusing, and very obvious at the same time.

There are also the differences in organisation of American municipalities,
oh what a glorious mess!

Then you have the differences between a state and its different
main/bi-lands, not to mention the uninhabited parts.

There is a lot that doesn't have an obvious description.

And btw, twin cities, I found a lot of errors and pretended I didn't see
them. Don't tell anyone.

On Mon, Sep 28, 2015 at 4:04 PM, Markus Krötzsch <
mar...@semantic-mediawiki.org> wrote:

> On 28.09.2015 13:31, Luca Martinelli wrote:
>
>> 2015-09-28 11:16 GMT+02:00 Markus Krötzsch > >:
>>
>>> If this is the case, then maybe it
>>> should just be kept as an intentionally broad property that captures
>>> what we
>>> now find in the Wikipedias.
>>>
>>
>> +1, the more broad the application of certain property is, the better.
>> We really don't need to be 100% specific with a property, if we can
>> exploit qualifiers.
>>
>
> I would not completely agree to this: otherwise we could just have a
> property "related to" and use qualifiers for the rest ;-) It's always about
> finding the right balance for each case. Many properties (probably most)
> have a predominant natural definition that is quite clear. Take "parent" as
> a simple example of a property that can have a very strict definition
> (biological parent) and still be practically useful and easy to understand.
> The trouble is often with properties that have a legal/political meaning
> since they are different in each legislation (which in itself changes over
> space and time). "Twin city" is such a case; "mayor" is another; also
> classes like "company" are like this. I think we do well to stick to the
> "folk terminology" in such cases, which lacks precision but caters to our
> users.
>
> This can then be refined in the mid and long term (maybe using qualifiers,
> more properties, or new editing conventions). Each domain could have a
> dedicated Wikiproject to work this out (the Wikiproject Names is a great
> example of such an effort [1]).
>
> Markus
>
> [1] https://www.wikidata.org/wiki/Wikidata:WikiProject_Names
>
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Daniel Kinzler
Am 27.09.2015 um 21:19 schrieb Thad Guidry:
> Both Sides ?  Wikidata has a true graph representation like FB ?  didn't know
> that.  Can you show me the other side your referring too ?

"Both sides" probably means that "sister city" is a symmetric property, so if
item A refers to item B as a sister city, item B should also refer to item A as
a sister city. This is not automatic, and it was a conscious design decision to
not make it automatic(*).

What do you mean by "true graph representation"? Wikidata internally uses JSON
structures to represent items, and items reference each other, forming a graph.
We have a linked data interface for traversing the graph[1]. We also have an RDF
mapping with a SPARQL endpoint[2] that allows queries against that graph.

-- daniel


[1] https://www.wikidata.org/wiki/Wikidata:Data_access#Linked_Data_interface
[2] https://www.wikidata.org/wiki/Wikidata:Data_access#SPARQL_endpoints
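Letting inconsistencies exist, as the footnote below argues, also means they can be found. One kind of check the SPARQL endpoint makes possible is finding items that declare a twin city (P190) whose twin does not link back. The query string below is illustrative and untested against the live endpoint; `find_asymmetric()` runs the same check locally on a toy edge list.

```python
# Illustrative SPARQL for the asymmetry check (not executed here):
QUERY = """
SELECT ?a ?b WHERE {
  ?a wdt:P190 ?b .
  FILTER NOT EXISTS { ?b wdt:P190 ?a . }
}
"""

def find_asymmetric(edges):
    """Return (a, b) pairs where a links to b but b does not link back."""
    edge_set = set(edges)
    return sorted(pair for pair in edge_set
                  if (pair[1], pair[0]) not in edge_set)

# Toy data: Q64/Q65 link both ways; Q90 -> Q84 is one-way.
edges = [("Q64", "Q65"), ("Q65", "Q64"), ("Q90", "Q84")]
print(find_asymmetric(edges))  # [('Q90', 'Q84')]
```

Each pair the check returns is either an error in the data or a missing reverse link, i.e. exactly the kind of surfaced inconsistency the curation process can act on.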

(*) This follows the principle of "magic is bad, let people edit". Allowing
inconsistencies means we can detect errors by finding such inconsistencies.
Automatically enforcing consistency may lead to errors propagating out of view
of the curation process. The QA process on wikis is centered around edits, so
every change should be an edit. Using a bot to fill in missing "reverse" links
follows this idea. The fact that you found an issue with the data because you
saw a bot do an edit is an example of this principle working nicely.
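The "bots over magic" principle sketched above can be illustrated as follows (hypothetical code, not an actual Wikidata bot): instead of the backend silently enforcing symmetry, a bot emits one explicit, logged edit per missing reverse link, so every change is visible for review.

```python
# Sketch: a bot that plans one visible edit per missing reverse
# "twinned with" (P190) link, rather than enforcing symmetry silently.

def plan_reverse_edits(edges):
    """Return a human-readable edit log entry for each missing reverse link."""
    edge_set = set(edges)
    log = []
    for a, b in sorted(edge_set):
        if (b, a) not in edge_set:
            log.append(f"edit: add ({b}, P190, {a}) [completing symmetric link]")
    return log

# Toy data: both links are one-way, so two edits get planned.
for entry in plan_reverse_edits([("Q64", "Q65"), ("Q90", "Q84")]):
    print(entry)
```

Because every completion is an ordinary edit, it shows up in watchlists and recent changes, which is precisely how the original poster noticed the underlying data problem.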


-- 
Daniel Kinzler
Senior Software Developer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Bene*
If you want to use any data source in your program/tool, you first need 
to look into the data and see how things are specified. You will soon 
get an idea which qualifiers/properties are used to describe certain 
circumstances. Much more important in my opinion is that we do this 
consistently for all facts in some area so that we don't end up using 
two different qualifiers to describe the same fact on different statements.


Best regards
Bene

Am 28.09.2015 um 16:14 schrieb Peter F. Patel-Schneider:

On 09/28/2015 04:31 AM, Luca Martinelli wrote:

2015-09-28 11:16 GMT+02:00 Markus Krötzsch :

If this is the case, then maybe it
should just be kept as an intentionally broad property that captures what we
now find in the Wikipedias.

+1, the more broad the application of certain property is, the better.
We really don't need to be 100% specific with a property, if we can
exploit qualifiers.

L.

I worry about this way of specializing properties.   How are people, and
particularly programs, going to be able to find out that a qualifier is
needed, which qualifier it is, and how it is to be used, or which broad
property is to be used for a specific purpose?

peter



___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata



___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Luca Martinelli
My intervention was limited to cases that *can* be generalised (such
as "points scored" or "twin cities"), I wasn't talking about a "one
property fits all". Sorry for the misunderstanding.

I meant, in other words, that whenever it's possible (as in *this*
case) to create one property that covers several types of comparable
things, it should be preferable to make just one property, instead of
several ones which are hard to distinguish or have too few differences
among themselves.

Hope this time I was more clear :)

L.

2015-09-28 16:14 GMT+02:00 Peter F. Patel-Schneider :
> On 09/28/2015 04:31 AM, Luca Martinelli wrote:
>> 2015-09-28 11:16 GMT+02:00 Markus Krötzsch :
>>> If this is the case, then maybe it
>>> should just be kept as an intentionally broad property that captures what we
>>> now find in the Wikipedias.
>>
>> +1, the more broad the application of certain property is, the better.
>> We really don't need to be 100% specific with a property, if we can
>> exploit qualifiers.
>>
>> L.
>
> I worry about this way of specializing properties.   How are people, and
> particularly programs, going to be able to find out that a qualifier is
> needed, which qualifier it is, and how it is to be used, or which broad
> property is to be used for a specific purpose?
>
> peter
>
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata



-- 
Luca "Sannita" Martinelli
http://it.wikipedia.org/wiki/Utente:Sannita

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Peter F. Patel-Schneider
On 09/28/2015 04:31 AM, Luca Martinelli wrote:
> 2015-09-28 11:16 GMT+02:00 Markus Krötzsch :
>> If this is the case, then maybe it
>> should just be kept as an intentionally broad property that captures what we
>> now find in the Wikipedias.
> 
> +1, the more broad the application of certain property is, the better.
> We really don't need to be 100% specific with a property, if we can
> exploit qualifiers.
> 
> L.

I worry about this way of specializing properties.   How are people, and
particularly programs, going to be able to find out that a qualifier is
needed, which qualifier it is, and how it is to be used, or which broad
property is to be used for a specific purpose?

peter



___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Markus Krötzsch

On 28.09.2015 13:31, Luca Martinelli wrote:

2015-09-28 11:16 GMT+02:00 Markus Krötzsch :

If this is the case, then maybe it
should just be kept as an intentionally broad property that captures what we
now find in the Wikipedias.


+1, the more broad the application of certain property is, the better.
We really don't need to be 100% specific with a property, if we can
exploit qualifiers.


I would not completely agree to this: otherwise we could just have a 
property "related to" and use qualifiers for the rest ;-) It's always 
about finding the right balance for each case. Many properties (probably 
most) have a predominant natural definition that is quite clear. Take 
"parent" as a simple example of a property that can have a very strict 
definition (biological parent) and still be practically useful and easy 
to understand. The trouble is often with properties that have a 
legal/political meaning since they are different in each legislation 
(which in itself changes over space and time). "Twin city" is such a 
case; "mayor" is another; also classes like "company" are like this. I 
think we do well to stick to the "folk terminology" in such cases, which 
lacks precision but caters to our users.


This can then be refined in the mid and long term (maybe using 
qualifiers, more properties, or new editing conventions). Each domain 
could have a dedicated Wikiproject to work this out (the Wikiproject 
Names is a great example of such an effort [1]).


Markus

[1] https://www.wikidata.org/wiki/Wikidata:WikiProject_Names


___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Luca Martinelli
2015-09-28 11:16 GMT+02:00 Markus Krötzsch :
> If this is the case, then maybe it
> should just be kept as an intentionally broad property that captures what we
> now find in the Wikipedias.

+1, the more broad the application of certain property is, the better.
We really don't need to be 100% specific with a property, if we can
exploit qualifiers.

L.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-28 Thread Markus Krötzsch

Hi,

Important discussion (but please don't get angry over such things -- 
some emails sounded a bit rough to my taste if I may say so :-).


Property definitions are an important issue, and ours are too vague in 
general. However, some properties need to be quite broad to be useful: 
they need some semantic wiggle room to allow them to be used in slightly 
different situations rather than having hundreds of hardly-used (but 
very precise) properties that are not natural. If such broadness is 
intended for a property, it should of course still be documented.


As it is now "twin cities" seem to be all cities that have some form of 
bilateral partnership contract that defines some such status. One could 
use a qualifier to specify which kind of contract it is (if someone can 
find out what the main types of partner cities are!). However, to be 
honest, it is hard to see an application where this information would be 
relevant, other than for trivia (the last time I needed such information 
was in a pub quiz ;-) and for display in Wikipedia pages. If this is the 
case, then maybe it should just be kept as an intentionally broad 
property that captures what we now find in the Wikipedias. The 
ontologists among us could better spend their time on properties like 
"part of".
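For readers curious how such a qualifier would surface in the data: a rough sketch of reading a qualified P190 statement out of Wikidata's entity JSON, using P580 ("start time") as an example qualifier. The item ids and the date below are invented for illustration, not real statements about any city.

```python
# Hand-made fragment in the shape of Wikidata's entity JSON "claims" section.
# P580 is Wikidata's real "start time" property; the values are made up.
sample_claims = {
    "P190": [
        {
            "mainsnak": {
                "snaktype": "value",
                "datavalue": {"value": {"id": "Q1726"}, "type": "wikibase-entityid"},
            },
            "qualifiers": {
                "P580": [
                    {"datavalue": {"value": {"time": "+1989-00-00T00:00:00Z"},
                                   "type": "time"}}
                ]
            },
        }
    ]
}

def twin_cities_with_start(claims):
    """Yield (partner item id, start time or None) for each P190 statement."""
    for stmt in claims.get("P190", []):
        partner = stmt["mainsnak"]["datavalue"]["value"]["id"]
        starts = stmt.get("qualifiers", {}).get("P580", [])
        start = starts[0]["datavalue"]["value"]["time"] if starts else None
        yield partner, start

print(list(twin_cities_with_start(sample_claims)))
```

The point is only that a broad property plus qualifiers stays machine-readable: consumers that care about the refinement can look at the qualifier, everyone else can ignore it.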


Cheers,

Markus


On 27.09.2015 23:45, Thad Guidry wrote:


On Sun, Sep 27, 2015 at 4:37 PM, Jan Ainali <jan.ain...@wikimedia.se> wrote:

2015-09-27 23:03 GMT+02:00 Thad Guidry <thadgui...@gmail.com>:

I have added my viewpoint to the P190 Discussion page.


Great, thanks!


Are you seriously saying that a Talk Page (where many are 2 - 10
pages scrollable) is the culmination of a definition of a
property in Wikidata ?


In theory, only the top box should be needed. In practice, since
everything in the Wikimedia movement is a work in progress, the talk
page are almost always worth reading, especially if you have doubts
on how to use it.


Good to know.  I will use it profusely from now on.


You expect folks to read 10 pages of Talk to fully understand
the intent and how to use a Property in Wikidata ?


No, _I_ do not expect "folks" to read it. But I do expect people who
want to improve the use of a property to read it before they setup
their own definitions of that property.


As I shall from now on, knowing its importance, or more to the point,
where I can help with confusion.
Thad
+ThadGuidry 




Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
On Sun, Sep 27, 2015 at 4:37 PM, Jan Ainali  wrote:

> 2015-09-27 23:03 GMT+02:00 Thad Guidry :
>
>> I have added my viewpoint to the P190 Discussion page.
>>
>
> Great, thanks!
>
>
>>
>> Are you seriously saying that a Talk Page (where many are 2 - 10 pages
>> scrollable) is the culmination of a definition of a property in Wikidata ?
>>
>
> In theory, only the top box should be needed. In practice, since
> everything in the Wikimedia movement is a work in progress, the talk page
> are almost always worth reading, especially if you have doubts on how to
> use it.
>
>

Good to know.  I will use it profusely from now on.


>
>> You expect folks to read 10 pages of Talk to fully understand the intent
>> and how to use a Property in Wikidata ?
>>
>
> No, _I_ do not expect "folks" to read it. But I do expect people who want
> to improve the use of a property to read it before they setup their own
> definitions of that property.
>
>

As I shall from now on, knowing its importance, or more to the point,
where I can help with confusion.
Thad
+ThadGuidry 


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Jan Ainali
2015-09-27 23:03 GMT+02:00 Thad Guidry :

> I have added my viewpoint to the P190 Discussion page.
>

Great, thanks!


>
> Are you seriously saying that a Talk Page (where many are 2 - 10 pages
> scrollable) is the culmination of a definition of a property in Wikidata ?
>

In theory, only the top box should be needed. In practice, since everything
in the Wikimedia movement is a work in progress, the talk page is almost
always worth reading, especially if you have doubts about how to use it.


>
> You expect folks to read 10 pages of Talk to fully understand the intent
> and how to use a Property in Wikidata ?
>

No, _I_ do not expect "folks" to read it. But I do expect people who want
to improve the use of a property to read it before they set up their own
definitions of that property.


>
> Does WP not value the input of ontologists and researchers ?
>

I am pretty sure they do (as long as they cite their sources), but that
might be out of the scope of this mailing list (which is supposed to cover
Wikidata topics, not Wikipedia topics).

/Jan


>
>
> Thad
> +ThadGuidry 
>
> On Sun, Sep 27, 2015 at 3:29 PM, Jan Ainali 
> wrote:
>
>> 2015-09-27 22:11 GMT+02:00 Thad Guidry :
>>
>>> I have improved the Sister City property with a MUCH better description
>>> to save others further grief.
>>>
>>
>> You have improved the English description on the property. However,
>> without updating the documentation page (which is the talk page) I guess
>> most community members will not notice. Just to take myself as an example,
>> I only check the documentation page on how a property is supposed to be
>> used, not a random language version description of it.
>>
>> /Jan
>>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Vi to
But also Wikidata brings lots of mistakes/issues up.

Vito

2015-09-27 23:12 GMT+02:00 Gerard Meijssen :

> Hoi,
> In them old days, this would be called a typical case of gigo or garbage
> in garbage out. It is hardly the only instance where what we define in
> Wikidata is not what we find in one of our sources.
>
> It is to be expected, it is not something to get upset about.
>
> At this time I am working on a division of the Netherlands that is largely
> wrong in Wikipedia. That is at most worthy of a blog post, telling people
> about it, making a list that makes this division understandable for use on
> any Wikipedia that wants it.
>
> It is good to appreciate how erroneous Wikidata can be. It is good to
> appreciate that this happens with the best of intentions. The problem is
> not that there are errors, the problem is that we are lacking in well
> crafted workflows, built to help people fix what is wrong and report on it.
> We do not zoom in on known issues and therefore every item, every statement
> without a source is an issue. Even worse, many sourced statements are
> questionable but we have no way of classifying sources. Everything is a
> potential problem and we do not contain errors and have people work on what
> we know may be wrong.
>
> That is an issue worth talking about.. The rest is just a drop in the
> ocean.
> Thanks,
>GerardM
>
> On 27 September 2015 at 20:06, Thad Guidry  wrote:
>
>> I live there sometimes. :)
>>
>> Partner City (an alias of that Sister City description) does even help
>> make them closer to any sister city.
>>
>> If anyone disagrees with me, then you would have to fully describe what
>> the property Sister City really means...because there is little to go on
>> other than the idea of a "twin city".
>>
>> Thad
>> +ThadGuidry 
>>
>> On Sun, Sep 27, 2015 at 12:07 PM, Vi to  wrote:
>>
>>> From a technical point of view there's nothing wrong with those
>>> statements.
>>>
>>> Original infos were added by
>>> https://meta.wikimedia.org/wiki/Special:CentralAuth/ILoveKorea which
>>> doesn't seem to be a vandal. How did you find they are wrong?
>>>
>>>
>>> Vito
>>>
>>> 2015-09-27 18:57 GMT+02:00 Amir Ladsgroup :
>>>
 Italian Wikipedia says it's correct:
 https://it.wikipedia.org/wiki/Hefei



 On Sun, Sep 27, 2015 at 8:21 PM Thad Guidry 
 wrote:

> OK, it seems that Wikipedia does have a few nice features. :)
>
> I was able to quickly search History on the entity and find that
> Dexbot had imported the erroneous statements
> https://www.wikidata.org/wiki/User:Dexbot  <-- pretty cool options
> there, Good Job whomever !
>
> and let the User (owner of the bot) know of the problem.
> https://www.wikidata.org/wiki/User_talk:Ladsgroup
>
>
> Thad
> +ThadGuidry 
>
> On Sun, Sep 27, 2015 at 11:38 AM, Thad Guidry 
> wrote:
>
>> I had to clean up this entity that had Sister City property filled in
>> with lots of erroneous statements for it that I removed.
>>
>> https://www.wikidata.org/wiki/Q185684
>>
>> How can I figure out where the import went wrong, how it happened,
>> and how to ensure it doesn't happen again ?  How does one look at 
>> Wikidata
>> bots and their efficiency or incorrectness ?
>>
>> Trying to learn more,
>>
>> Thad
>> +ThadGuidry 
>>
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Gerard Meijssen
Hoi,
In them old days, this would be called a typical case of gigo or garbage in
garbage out. It is hardly the only instance where what we define in
Wikidata is not what we find in one of our sources.

It is to be expected, it is not something to get upset about.

At this time I am working on a division of the Netherlands that is largely
wrong in Wikipedia. That is at most worthy of a blog post, telling people
about it, making a list that makes this division understandable for use on
any Wikipedia that wants it.

It is good to appreciate how erroneous Wikidata can be. It is good to
appreciate that this happens with the best of intentions. The problem is
not that there are errors, the problem is that we are lacking in
well-crafted workflows, built to help people fix what is wrong and report on it.
We do not zoom in on known issues and therefore every item, every statement
without a source is an issue. Even worse, many sourced statements are
questionable but we have no way of classifying sources. Everything is a
potential problem and we do not contain errors and have people work on what
we know may be wrong.

That is an issue worth talking about. The rest is just a drop in the ocean.
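A first step toward such workflows could be as simple as scanning an item's claims for statements that carry no reference at all. A minimal sketch, assuming only the shape of Wikidata's entity JSON; the claims fragment below is invented for illustration.

```python
# Invented fragment in the shape of Wikidata's entity JSON "claims" section:
# one P190 statement with a reference, one without.
claims = {
    "P190": [
        {"mainsnak": {"datavalue": {"value": {"id": "Q1726"}}},
         "references": [{"snaks": {"P143": [{}]}}]},   # has one reference
        {"mainsnak": {"datavalue": {"value": {"id": "Q90"}}}},  # unreferenced
    ]
}

def unsourced(claims):
    """Return (property id, value item id) for every statement without references."""
    hits = []
    for prop, statements in claims.items():
        for stmt in statements:
            if not stmt.get("references"):
                hits.append((prop, stmt["mainsnak"]["datavalue"]["value"]["id"]))
    return hits

print(unsourced(claims))  # → [('P190', 'Q90')]
```

A report built from such a scan would at least make the "everything without a source is a potential issue" problem enumerable instead of invisible.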
Thanks,
   GerardM

On 27 September 2015 at 20:06, Thad Guidry  wrote:

> I live there sometimes. :)
>
> Partner City (an alias of that Sister City description) does even help
> make them closer to any sister city.
>
> If anyone disagrees with me, then you would have to fully describe what
> the property Sister City really means...because there is little to go on
> other than the idea of a "twin city".
>
> Thad
> +ThadGuidry 
>
> On Sun, Sep 27, 2015 at 12:07 PM, Vi to  wrote:
>
>> From a technical point of view there's nothing wrong with those statements.
>>
>> Original infos were added by
>> https://meta.wikimedia.org/wiki/Special:CentralAuth/ILoveKorea which
>> doesn't seem to be a vandal. How did you find they are wrong?
>>
>>
>> Vito
>>
>> 2015-09-27 18:57 GMT+02:00 Amir Ladsgroup :
>>
>>> Italian Wikipedia says it's correct: https://it.wikipedia.org/wiki/Hefei
>>>
>>>
>>>
>>> On Sun, Sep 27, 2015 at 8:21 PM Thad Guidry 
>>> wrote:
>>>
 OK, it seems that Wikipedia does have a few nice features. :)

 I was able to quickly search History on the entity and find that Dexbot
 had imported the erroneous statements
 https://www.wikidata.org/wiki/User:Dexbot  <-- pretty cool options
 there, Good Job whomever !

 and let the User (owner of the bot) know of the problem.
 https://www.wikidata.org/wiki/User_talk:Ladsgroup


 Thad
 +ThadGuidry 

 On Sun, Sep 27, 2015 at 11:38 AM, Thad Guidry 
 wrote:

> I had to clean up this entity that had Sister City property filled in
> with lots of erroneous statements for it that I removed.
>
> https://www.wikidata.org/wiki/Q185684
>
> How can I figure out where the import went wrong, how it happened, and
> how to ensure it doesn't happen again ?  How does one look at Wikidata 
> bots
> and their efficiency or incorrectness ?
>
> Trying to learn more,
>
> Thad
> +ThadGuidry 
>



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Vi to
Wikipedia is built upon a bet: better ideas prevail anyway, regardless of
who is expressing them ;)

Vito

2015-09-27 23:03 GMT+02:00 Thad Guidry :

> I have added my viewpoint to the P190 Discussion page.
>
> Are you seriously saying that a Talk Page (where many are 2 - 10 pages
> scrollable) is the culmination of a definition of a property in Wikidata ?
>
> You expect folks to read 10 pages of Talk to fully understand the intent
> and how to use a Property in Wikidata ?
>
> Does WP not value the input of ontologists and researchers ?
>
>
> Thad
> +ThadGuidry 
>
> On Sun, Sep 27, 2015 at 3:29 PM, Jan Ainali 
> wrote:
>
>> 2015-09-27 22:11 GMT+02:00 Thad Guidry :
>>
>>> I have improved the Sister City property with a MUCH better description
>>> to save others further grief.
>>>
>>
>> You have improved the English description on the property. However,
>> without updating the documentation page (which is the talk page) I guess
>> most community members will not notice. Just to take myself as an example,
>> I only check the documentation page on how a property is supposed to be
>> used, not a random language version description of it.
>>
>> /Jan
>>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
I have added my viewpoint to the P190 Discussion page.

Are you seriously saying that a Talk Page (where many are 2 - 10 pages
scrollable) is the culmination of a definition of a property in Wikidata ?

You expect folks to read 10 pages of Talk to fully understand the intent
and how to use a Property in Wikidata ?

Does WP not value the input of ontologists and researchers ?


Thad
+ThadGuidry 

On Sun, Sep 27, 2015 at 3:29 PM, Jan Ainali  wrote:

> 2015-09-27 22:11 GMT+02:00 Thad Guidry :
>
>> I have improved the Sister City property with a MUCH better description
>> to save others further grief.
>>
>
> You have improved the English description on the property. However,
> without updating the documentation page (which is the talk page) I guess
> most community members will not notice. Just to take myself as an example,
> I only check the documentation page on how a property is supposed to be
> used, not a random language version description of it.
>
> /Jan
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Jan Ainali
2015-09-27 22:11 GMT+02:00 Thad Guidry :

> I have improved the Sister City property with a MUCH better description to
> save others further grief.
>

You have improved the English description on the property. However, without
updating the documentation page (which is the talk page) I guess most
community members will not notice. Just to take myself as an example, I
only check the documentation page on how a property is supposed to be used,
not a random language version description of it.

/Jan


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
I have improved the Sister City property with a MUCH better description to
save others further grief.

Freebase had the same problems.  Sometimes we were not descriptive enough
on our Types and Properties and it caused confusion and bad data.
Luckily, Experts and Staff and Volunteers would review descriptions on new
properties introduced and cooperatively suggest and vote on really good
descriptions that everyone could understand easily.  (We did not just use
synonyms as a description, but made sure it was always a nice full sentence
or two that described its use and intent well to the users.  We also tried
to use Simple English as much as possible to help with translating easily
to other languages.)

Thad
+ThadGuidry


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
Yup, just as I feared... it's almost a useless property as it stands now.  No
clear definition. Confusion is rampant among everyone.  And where I think
things could be vastly improved...is here

Wikipedia has some great pages that typically describe the Property.
https://en.wikipedia.org/wiki/Twin_towns_and_sister_cities is one of
those... yet, it takes 4 clicks to get to it from the Sister City property
https://www.wikidata.org/wiki/Property:P190  and then you finally have a
better description or intent of what the authors of the Sister City
property were trying to describe.

Improvement idea and Suggestion to Wikidata admins:   Let Descriptions
allow (and show) the link directly to a Wikipedia page describing more
fully the idea of the property.

sister city (P190)


twin towns, sister cities, twinned municipalities and other localities  :
MORE DESCRIPTION HERE (Link to WP page)


Thad
+ThadGuidry 


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Federico Leva (Nemo)

Thad Guidry, 27/09/2015 18:51:

OK, it seems that Wikipedia does have a few nice features. :)

I was able to quickly search History


Congratulations for using action=history! Those interested in learning
more features of a wiki can check https://www.mediawiki.org/wiki/Principles
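Filtering an item's history for one account's edits can also be done programmatically. A small sketch, assuming revisions in the shape the MediaWiki API returns for action=query&prop=revisions; the revision ids and summaries below are invented for illustration.

```python
# Invented revision list in the shape of the MediaWiki API's
# action=query & prop=revisions output (revid / user / comment per revision).
revisions = [
    {"revid": 101, "user": "Dexbot", "comment": "bot: syncing claims"},
    {"revid": 102, "user": "ThadGuidry", "comment": "removed P190 statements"},
    {"revid": 103, "user": "Dexbot", "comment": "bot: syncing claims"},
]

def edits_by(revisions, user):
    """Return the revision ids contributed by `user`, in list order."""
    return [r["revid"] for r in revisions if r["user"] == user]

print(edits_by(revisions, "Dexbot"))  # → [101, 103]
```

The same filtering the history page does by eye, in other words, only scriptable.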


> then you would have to fully describe what the property Sister City 
really means...


Good idea. Please add your point of view to the discussion on the 
matter: https://www.wikidata.org/wiki/Property_talk:P190


Nemo



Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
Both Sides ?  Wikidata has a true graph representation like FB ?  didn't
know that.  Can you show me the other side you're referring to ?


Thad
+ThadGuidry 

On Sun, Sep 27, 2015 at 2:06 PM, Sjoerd de Bruin 
wrote:

> It seems that the bot only wanted to sync statements, so the claim is on
> both items instead of one. You've only removed the statements at one side,
> so another bot could do the same again and add the statements back.
>
> Sjoerddebruin
>
>
> On 27 Sep 2015, at 21:02, Thad Guidry wrote:
>
> I didn't revert a bot.  I removed a few erroneous statements that a bot
> made incorrectly.
>
>
> Thad
> +ThadGuidry 
>
> On Sun, Sep 27, 2015 at 1:58 PM, Vi to  wrote:
>
>> Actually you didn't talk about it, but still you can ask him for
>> explanations. Generally reverting a bot which is working fine (since the
>> mistake, if any, is onwiki) is useless.
>>
>> Vito
>>
>> 2015-09-27 20:11 GMT+02:00 Thad Guidry :
>>
>>> Great.
>>>
>>> ValterVB doesn't think I am right in removing them and has reverted them.
>>>
>>> I think I now remember why I left Wikipedia... too many chiefs who think
>>> they know better, rather than talking through things as a community.
>>>
>>>
>>>
>>> Thad
>>> +ThadGuidry 
>>>
>>> On Sun, Sep 27, 2015 at 1:06 PM, Thad Guidry 
>>> wrote:
>>>
 sorry, that should have been ... does NOT even help make them closer to
 any sister city.

 Thad
 +ThadGuidry 

 On Sun, Sep 27, 2015 at 1:06 PM, Thad Guidry 
 wrote:

> I live there sometimes. :)
>
> Partner City (an alias of that Sister City description) does even help
> make them closer to any sister city.
>
> If anyone disagrees with me, then you would have to fully describe
> what the property Sister City really means...because there is little to go
> on other than the idea of a "twin city".
>
> Thad
> +ThadGuidry 
>
> On Sun, Sep 27, 2015 at 12:07 PM, Vi to 
> wrote:
>
>> From a technical point of view there's nothing wrong with those
>> statements.
>>
>> Original infos were added by
>> https://meta.wikimedia.org/wiki/Special:CentralAuth/ILoveKorea which
>> doesn't seem to be a vandal. How did you find they are wrong?
>>
>>
>> Vito
>>
>> 2015-09-27 18:57 GMT+02:00 Amir Ladsgroup :
>>
>>> Italian Wikipedia says it's correct:
>>> https://it.wikipedia.org/wiki/Hefei
>>>
>>>
>>>
>>> On Sun, Sep 27, 2015 at 8:21 PM Thad Guidry 
>>> wrote:
>>>
 OK, it seems that Wikipedia does have a few nice features. :)

 I was able to quickly search History on the entity and find that
 Dexbot had imported the erroneous statements
 https://www.wikidata.org/wiki/User:Dexbot  <-- pretty cool options
 there, Good Job whomever !

 and let the User (owner of the bot) know of the problem.
 https://www.wikidata.org/wiki/User_talk:Ladsgroup


 Thad
 +ThadGuidry 

 On Sun, Sep 27, 2015 at 11:38 AM, Thad Guidry >>> > wrote:

> I had to clean up this entity that had Sister City property filled
> in with lots of erroneous statements for it that I removed.
>
> https://www.wikidata.org/wiki/Q185684
>
> How can I figure out where the import went wrong, how it happened,
> and how to ensure it doesn't happen again ?  How does one look at 
> Wikidata
> bots and their efficiency or incorrectness ?
>
> Trying to learn more,
>
> Thad
> +ThadGuidry 
>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Sjoerd de Bruin
It seems that the bot only wanted to sync statements, so the claim is on both 
items instead of one. You've only removed the statements at one side, so 
another bot could do the same again and add the statements back.

Sjoerddebruin

> On 27 Sep 2015, at 21:02, Thad Guidry wrote:
> 
> I didn't revert a bot.  I removed a few erroneous statements that a bot made 
> incorrectly.
> 
> 
> Thad
> +ThadGuidry 
> 
> On Sun, Sep 27, 2015 at 1:58 PM, Vi to wrote:
> Actually you didn't talk about it, but still you can ask him for 
> explanations. Generally reverting a bot which is working fine (since the 
> mistake, if any, is onwiki) is useless.
> 
> Vito
> 
> 2015-09-27 20:11 GMT+02:00 Thad Guidry:
> Great.
> 
> ValterVB doesn't think I am right in removing them and has reverted them.
> 
> I think I now remember why I left Wikipedia... too many chiefs who think they 
> know better, rather than talking through things as a community.
> 
> 
> 
> Thad
> +ThadGuidry 
> 
> On Sun, Sep 27, 2015 at 1:06 PM, Thad Guidry wrote:
> sorry, that should have been ... does NOT even help make them closer to any 
> sister city.
> 
> Thad
> +ThadGuidry 
> 
> On Sun, Sep 27, 2015 at 1:06 PM, Thad Guidry wrote:
> I live there sometimes. :)
> 
> Partner City (an alias of that Sister City description) does even help make 
> them closer to any sister city.
> 
> If anyone disagrees with me, then you would have to fully describe what the 
> property Sister City really means...because there is little to go on other 
> than the idea of a "twin city".
> 
> Thad
> +ThadGuidry 
> 
> On Sun, Sep 27, 2015 at 12:07 PM, Vi to wrote:
> From a technical point of view there's nothing wrong with those statements.
> 
> Original infos were added by
> https://meta.wikimedia.org/wiki/Special:CentralAuth/ILoveKorea which
> doesn't seem to be a vandal. How did you find they are wrong?
> 
> Vito
> 
> 2015-09-27 18:57 GMT+02:00 Amir Ladsgroup:
> Italian Wikipedia says it's correct: https://it.wikipedia.org/wiki/Hefei
> 
> On Sun, Sep 27, 2015 at 8:21 PM Thad Guidry wrote:
> OK, it seems that Wikipedia does have a few nice features. :)
> 
> I was able to quickly search History on the entity and find that Dexbot had
> imported the erroneous statements https://www.wikidata.org/wiki/User:Dexbot
> <-- pretty cool options there, Good Job whomever !
> 
> and let the User (owner of the bot) know of the problem.
> https://www.wikidata.org/wiki/User_talk:Ladsgroup
> 
> Thad
> +ThadGuidry
> 
> On Sun, Sep 27, 2015 at 11:38 AM, Thad Guidry wrote:
> I had to clean up this entity that had Sister City property filled in with
> lots of erroneous statements for it that I removed.
> 
> https://www.wikidata.org/wiki/Q185684
> 
> How can I figure out where the import went wrong, how it happened, and how to
> ensure it doesn't happen again ?  How does one look at Wikidata bots and
> their efficiency or incorrectness ?
> 
> Trying to learn more,
> 
> Thad
> +ThadGuidry

Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
I didn't revert a bot.  I removed a few erroneous statements that a bot
made incorrectly.


Thad
+ThadGuidry 

On Sun, Sep 27, 2015 at 1:58 PM, Vi to  wrote:

> Actually you didn't talk about it, but still you can ask him for
> explanations. Generally reverting a bot which is working fine (since the
> mistake, if any, is onwiki) is useless.
>
> Vito
>
> 2015-09-27 20:11 GMT+02:00 Thad Guidry :
>
>> Great.
>>
>> ValterVB doesn't think I am right in removing them and has reverted them.
>>
>> I think I now remember why I left Wikipedia... too many chiefs who think
>> they know better, rather than talking through things as a community.
>>
>>
>>
>> Thad
>> +ThadGuidry 
>>
>> On Sun, Sep 27, 2015 at 1:06 PM, Thad Guidry 
>> wrote:
>>
>>> sorry, that should have been ... does NOT even help make them closer to
>>> any sister city.
>>>
>>> Thad
>>> +ThadGuidry 
>>>
>>> On Sun, Sep 27, 2015 at 1:06 PM, Thad Guidry 
>>> wrote:
>>>
 I live there sometimes. :)

 Partner City (an alias of that Sister City description) does even help
 make them closer to any sister city.

 If anyone disagrees with me, then you would have to fully describe what
 the property Sister City really means...because there is little to go on
 other than the idea of a "twin city".

 Thad
 +ThadGuidry 

 On Sun, Sep 27, 2015 at 12:07 PM, Vi to  wrote:

> From a technical point of view there's nothing wrong with those
> statements.
>
> Original infos were added by
> https://meta.wikimedia.org/wiki/Special:CentralAuth/ILoveKorea which
> doesn't seem to be a vandal. How did you find they are wrong?
>
>
> Vito
>
> 2015-09-27 18:57 GMT+02:00 Amir Ladsgroup :
>
>> Italian Wikipedia says it's correct:
>> https://it.wikipedia.org/wiki/Hefei
>>
>>
>>
>> On Sun, Sep 27, 2015 at 8:21 PM Thad Guidry 
>> wrote:
>>
>>> OK, it seems that Wikipedia does have a few nice features. :)
>>>
>>> I was able to quickly search History on the entity and find that
>>> Dexbot had imported the erroneous statements
>>> https://www.wikidata.org/wiki/User:Dexbot  <-- pretty cool options
>>> there, Good Job whomever !
>>>
>>> and let the User (owner of the bot) know of the problem.
>>> https://www.wikidata.org/wiki/User_talk:Ladsgroup
>>>
>>>
>>> Thad
>>> +ThadGuidry 
>>>
>>> On Sun, Sep 27, 2015 at 11:38 AM, Thad Guidry 
>>> wrote:
>>>
 I had to clean up this entity that had Sister City property filled
 in with lots of erroneous statements for it that I removed.

 https://www.wikidata.org/wiki/Q185684

 How can I figure out where the import went wrong, how it happened,
 and how to ensure it doesn't happen again ?  How does one look at 
 Wikidata
 bots and their efficiency or incorrectness ?

 Trying to learn more,

 Thad
 +ThadGuidry 

>>>


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Vi to
Actually you didn't talk about it, but still you can ask him for
explanations. Generally reverting a bot which is working fine (since the
mistake, if any, is onwiki) is useless.

Vito

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
Great.

ValterVB doesn't think I am right in removing them and has reverted them.

I think I now remember why I left Wikipedia... too many chiefs who think
they know better, rather than talking through things as a community.



Thad
+ThadGuidry 

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
sorry, that should have been ... does NOT even help make them closer to any
sister city.

Thad
+ThadGuidry 

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
I live there sometimes. :)

Partner City (an alias of that Sister City description) does even help make
them closer to any sister city.

If anyone disagrees with me, then you would have to fully describe what the
property Sister City really means...because there is little to go on other
than the idea of a "twin city".

Thad
+ThadGuidry 

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Luca Martinelli
I'll try to give it a look in the next few hours.

L.
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Vi to
From a technical point of view there's nothing wrong with those statements.

The original info was added by
https://meta.wikimedia.org/wiki/Special:CentralAuth/ILoveKorea, who
doesn't seem to be a vandal. How did you determine they are wrong?


Vito

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Amir Ladsgroup
Italian Wikipedia says it's correct: https://it.wikipedia.org/wiki/Hefei



___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
OK, it seems that Wikipedia does have a few nice features. :)

I was able to quickly search the entity's history and found that Dexbot
had imported the erroneous statements:
https://www.wikidata.org/wiki/User:Dexbot <-- pretty cool options there,
good job, whoever made it!

I also let the user (the bot's owner) know of the problem:
https://www.wikidata.org/wiki/User_talk:Ladsgroup


Thad
+ThadGuidry 

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Italian Wikipedia imports gone haywire ?

2015-09-27 Thread Thad Guidry
I had to clean up this entity, whose Sister City property was filled in
with lots of erroneous statements, which I removed:

https://www.wikidata.org/wiki/Q185684

How can I figure out where the import went wrong, how it happened, and how
to ensure it doesn't happen again? And how does one review Wikidata bots
for efficiency or correctness?

Trying to learn more,

Thad
+ThadGuidry 
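
[Editor's note: the revision-history question above can also be answered
programmatically. A minimal, hedged sketch follows — the endpoint and
parameters are the standard MediaWiki action API, but the helper names are
invented for illustration, and the item ID (Q185684) and bot name ("Dexbot")
are simply the ones discussed in this thread.]

```python
# Sketch: trace which user or bot added statements to a Wikidata item by
# walking its revision history through the MediaWiki action API.
API = "https://www.wikidata.org/w/api.php"  # standard action-API endpoint

def history_params(item_id, limit=20):
    """Build action-API query parameters for an item's revision history."""
    return {
        "action": "query",
        "prop": "revisions",
        "titles": item_id,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }

def edits_by(revisions, user):
    """Keep only the revisions made by a given user or bot account."""
    return [r for r in revisions if r.get("user") == user]

# The network call itself is left out of this sketch; with the `requests`
# library it would look roughly like:
#   data = requests.get(API, params=history_params("Q185684")).json()
#   page = next(iter(data["query"]["pages"].values()))
#   for rev in edits_by(page["revisions"], "Dexbot"):
#       print(rev["timestamp"], rev["comment"])
```

The same information is visible in the item's on-wiki history tab; the API
route is just easier to filter when one bot has made many edits.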
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata