It is possible to represent linguistic patterns as items, but it takes some
work to conceptualize a structure that can be used across several languages.
One could then implement a complex query that selects the pattern that
displays the most information.
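For illustration, here is a minimal sketch of "select the pattern that displays the most information." Everything here is invented for the example: the pattern format, the helper name, and the idea of scoring by the number of properties a pattern uses are assumptions, not an existing Wikidata or Wikibase API.

```python
# Hypothetical sketch: given several candidate description patterns,
# pick the one the item can actually fill in that conveys the most
# information (approximated here as the number of properties used).
# Property IDs and pattern syntax are illustrative only.

def pick_pattern(item_claims, patterns):
    """item_claims: dict mapping property id -> value, e.g. {'P106': 'painter'}.
    patterns: list of (template_string, required_property_ids)."""
    best = None
    for template, required in patterns:
        # A pattern is usable only if the item has all properties it needs.
        if all(pid in item_claims for pid in required):
            if best is None or len(required) > len(best[1]):
                best = (template, required)
    if best is None:
        return None
    template, required = best
    return template.format(**{pid: item_claims[pid] for pid in required})

claims = {"P31": "human", "P106": "painter", "P27": "Netherlands"}
patterns = [
    ("{P106} from {P27}", ["P106", "P27"]),  # richer pattern
    ("{P106}", ["P106"]),                    # fallback pattern
]
print(pick_pattern(claims, patterns))  # → "painter from Netherlands"
```

The same idea could be run per language, with each language contributing its own pattern list.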

The biggest hurdle I see is that we are not storing enough linguistic data.
We could already start without any further development, but I think we need
more time to thoroughly understand the structure that is emerging.

Maybe it could be thought of as a research project to explore the
possibilities.


On Tue, Aug 19, 2014 at 4:34 PM, Thomas Douillard <
[email protected]> wrote:

> One answer would be deeper integration of autodescription into Wikidata. I
> don't know how to make this flexible and generic enough to make it depend
> only on the Wikibase model concepts, though.
>
> One rough, not-thought-through proposition would be: associate a
> (mediawiki?) template with a query or a class for each language, which
> generates a description if an item matches the query, according to the
> values of its properties.
>
> Then we could add an API call that uses the template.
>
>
> 2014-08-19 16:13 GMT+02:00 Lydia Pintscher <[email protected]>:
>
> On Tue, Aug 19, 2014 at 11:19 AM, David Cuenca <[email protected]> wrote:
>> > Thanks for the stats, Gerard. Two thoughts:
>> > - With so many items without description I wonder why we don't have the
>> > automatic descriptions gadget enabled by default.
>>
>> I am a bit worried about enabling this by default for everyone as a
>> gadget. We need the descriptions in a lot of places where people
>> search for items. The next big one will be Commons. But _a lot_ more
>> will come in the future. Think for example of tagging your blog post
>> on Wordpress with Wikidata concepts. You'll need the descriptions. If
>> we enable automatic descriptions on Wikidata now we will actively
>> discourage people from entering more descriptions. That would be bad
>> as 3rd parties then don't get the benefit of them.
>> I am also hesitant to build this into Wikibase directly as it'd need
>> quite some domain-knowledge for all I can tell at this point. That's
>> something we need to avoid.
>> Anyone got ideas how to get out of this?
>>
>>
>> Cheers
>> Lydia
>>
>> --
>> Lydia Pintscher - http://about.me/lydia.pintscher
>> Product Manager for Wikidata
>>
>> Wikimedia Deutschland e.V.
>> Tempelhofer Ufer 23-24
>> 10963 Berlin
>> www.wikimedia.de
>>
>> Wikimedia Deutschland - Society for the Promotion of Free Knowledge e. V.
>>
>> Registered in the register of associations of the Amtsgericht
>> Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by
>> the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
>>
>> _______________________________________________
>> Wikidata-l mailing list
>> [email protected]
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>
>
>


-- 
Etiamsi omnes, ego non
