Tried a couple of times to rewrite this, but it grows out of bounds
anyway. It seems to have a life of its own.
There is a book from 2000 by Robert Dale and Ehud Reiter: Building
Natural Language Generation Systems, ISBN 978-0-521-02451-8.
Wikibase items can be rebuilt as Plans from the type statement.
Cool, thanks! I read this a while ago, rereading again.
On Tue, Jan 15, 2019 at 3:28 AM Sebastian Hellmann <
hellm...@informatik.uni-leipzig.de> wrote:
Hi all,
let me send you a paper from 2013, which might either help directly or
at least give you some ideas...
"A lemon lexicon for DBpedia", Christina Unger, John McCrae, Sebastian
Walter, Sara Winter, Philipp Cimiano, 2013, Proceedings of the 1st
International Workshop on NLP and DBpedia.
Felipe,
thanks for the kind words.
There are a few research projects that use Wikidata to generate parts of
Wikipedia articles - see for example https://arxiv.org/abs/1702.06235,
which comes close to human quality and beats templates by far, but only
for the first sentence of biographies.
An additional note: what Wikipedia urgently needs is a way to create
and reuse canned text (aka "templates"), and a way to adapt that text
to data from Wikidata. That mostly requires just inflection rules, but in
some cases it involves grammar rules. Creating larger pieces of text
is much harder.
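The canned-text-plus-inflection idea can be sketched roughly as follows. This is a minimal toy, not a real system: the `person` dict stands in for Wikidata statements (a real implementation would query the Wikidata API), and `inflect` is a stand-in for proper per-language inflection rules backed by lexical data.

```python
# Toy lexical data: irregular plurals a real system would pull from a
# per-language lexicon (this is exactly where inflection rules live).
IRREGULAR_PLURALS = {"child": "children"}

def inflect(noun, count):
    # Toy English inflection rule: singular if count == 1, otherwise
    # look up an irregular plural or fall back to adding "s".
    if count == 1:
        return noun
    return IRREGULAR_PLURALS.get(noun, noun + "s")

def render(item):
    # Fill a canned sentence with values from the item's statements,
    # applying the inflection rule where the slot count demands it.
    n = len(item["children"])
    return "%s was a %s %s and had %d %s." % (
        item["label"], item["nationality"], item["occupation"],
        n, inflect("child", n))

# Hypothetical data standing in for Wikidata statements about an item.
person = {
    "label": "Ada Lovelace",
    "nationality": "British",
    "occupation": "mathematician",
    "children": ["Byron", "Anne", "Ralph"],
}

print(render(person))
# -> Ada Lovelace was a British mathematician and had 3 children.
```

Even this tiny example shows why "mostly inflection rules" is the right framing: the template itself is trivial, and nearly all the language-specific work hides inside `inflect`.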
Using an abstract language as a basis for translation has been
tried before, and it is almost as hard as translating between two
natural languages.
There are two really hard problems: implied references and
cultural context. An artificial language can get rid of the
implied