Hello Fabian,
The LKB gazetteer uses its own tokenization, which is generally
whitespace-based. This is why it won't work on Asian texts.
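To see why whitespace tokenization breaks down here, consider a minimal sketch (this is illustrative only, not GATE's actual tokenizer code; the example strings are my own):

```python
# Illustrative sketch: why whitespace-based tokenization fails for Chinese.
# Chinese text is written without spaces between words, so splitting on
# whitespace yields one giant token instead of individual words.

def whitespace_tokens(text):
    """Split text on runs of whitespace (a stand-in for a whitespace tokenizer)."""
    return text.split()

english = "hello world"
chinese = "你好世界"  # "hello" + "world", written with no spaces

print(whitespace_tokens(english))  # two tokens
print(whitespace_tokens(chinese))  # a single token: nothing to split on

# A gazetteer entry "你好" never matches the whole-string token "你好世界",
# so no Lookup annotation is ever produced for the Chinese alias.
assert "你好" not in whitespace_tokens(chinese)
```

Handling Chinese properly would require a word-segmentation step (dictionary- or model-based) in place of whitespace splitting.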
Unfortunately, we no longer support it.
All the best,
Philip
On 4 Apr 2012, at 10:59 AM, Fabian Cretton wrote:
Dear all,
I am using GATE 6.1 and the Large KB Gazetteer. It works just fine with
French and English.
But when I include Chinese 'aliases', no lookups appear for the Chinese
words.
Where should I look for a mistake?
The ontology does have Chinese labels, such as "你好"@zh.
They are loaded in O