Forgot to say why this is important. Neural nets, especially recurrent
neural nets (RNNs), can do inflection and thus make reuse of Wikidata
statements inside running text possible.
Many languages have quite complex rules for inflection and agreement.

An alternative to RNNs is a finite-state transducer (FST).
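To make the FST idea concrete, here is a toy sketch of what such a transducer does for inflection: rewrite a suffix according to which arc (rule) matches. The rules, tags, and words below are invented for illustration, not a real morphology.

```python
# Toy sketch of FST-style inflection: a lemma plus a grammatical tag is
# mapped to an inflected form by rewriting its suffix. A real FST would
# encode these rules as states and arcs; the rules here are made up.

RULES = {
    # (suffix, tag) -> replacement
    ("y", "PL"): "ies",   # city -> cities
    ("", "PL"): "s",      # book -> books
}

def inflect(lemma, tag):
    # The longest matching suffix rule wins, like following the most
    # specific arc out of the current state in a transducer.
    for (suffix, t), repl in sorted(RULES.items(),
                                    key=lambda rule: -len(rule[0][0])):
        if t == tag and lemma.endswith(suffix):
            return lemma[:len(lemma) - len(suffix)] + repl
    return lemma

print(inflect("city", "PL"))   # cities
print(inflect("book", "PL"))   # books
```

A real system (e.g. one compiled with a finite-state toolkit) would compose many such rewrite rules, but the lookup-and-rewrite core is the same.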

On Thu, Sep 26, 2019 at 3:03 PM John Erling Blad <[email protected]> wrote:

> A project that could be really interesting is to make a Lua interface for
> some of the new neural nets, especially those based on the Tsetlin machine.
> Sounds nifty, but it is nothing more than a slight reformulation of an old
> learning algorithm (early 1970s vintage), where the old algorithm has
> problems converging on bad training data (i.e. data that is not separable).
> What is really nice is that a trained network is extremely efficient, as it
> is mostly just bit operations or add operations. Which means we can make
> rather fancy classifiers that run in the web servers, and thus without any
> delayed update of the pages.
>
> The bad thing is that the training must be done offline, because that is
> nowhere near lightweight.
>
> Ordinary classifiers seem to work well, that is, the equivalent of fully
> connected layers. So do some types of convolutional layers. Some
> regressions can be done, but the networks are binary in nature, and mapping
> to and from linear scaling adds complexity.
>
> But running neural nets inside a PHP-based web server… I doubt we would
> hit the 10 sec limit for a Lua module even if we added several such
> networks.
>
> Ok, too much coffee today…
>
> John
>
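The "mostly just bit-operations" point in the quoted mail can be sketched as follows. In a trained Tsetlin machine, each clause is a conjunction of literals, so evaluating it is bitwise AND over the input bits and their negations; classification is a vote count over clauses. The masks below are invented for illustration, not a trained model.

```python
# Minimal sketch of Tsetlin-machine inference with made-up clause masks.
# A trained clause is a conjunction of literals, so evaluating it needs
# only bitwise AND and comparisons; no floating point is involved.

def clause_fires(x_bits, neg_bits, include_pos, include_neg):
    # A clause fires iff every included positive literal is 1 in the
    # input, and every included negated literal is 1 in the complement.
    return ((x_bits & include_pos) == include_pos and
            (neg_bits & include_neg) == include_neg)

def classify(x_bits, n_features, pos_clauses, neg_clauses):
    # Sum the votes of clauses for and against the class.
    neg_bits = ~x_bits & ((1 << n_features) - 1)
    votes = 0
    for inc_pos, inc_neg in pos_clauses:
        votes += clause_fires(x_bits, neg_bits, inc_pos, inc_neg)
    for inc_pos, inc_neg in neg_clauses:
        votes -= clause_fires(x_bits, neg_bits, inc_pos, inc_neg)
    return votes >= 0

# Toy example with two features: the positive clause encodes
# "x0 AND NOT x1", the negative clause encodes "x1".
pos = [(0b01, 0b10)]
neg = [(0b10, 0b00)]
print(classify(0b01, 2, pos, neg))  # x0=1, x1=0 -> True
print(classify(0b10, 2, pos, neg))  # x0=0, x1=1 -> False
```

Since the inner loop is just masked integer comparisons, this is the kind of workload that could plausibly fit inside a Scribunto/Lua module's time budget, as the mail suggests.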
_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
