Hello, Lucene community
The project I'm working on requires a Hebrew analyzer, so I searched
through the OpenSearch and Elasticsearch documentation but couldn't find
anything relevant. I would like to know whether the Lucene community has
any plans to integrate a Hebrew tokenizer in the future.
I would also like to know what obstacles other users have run into while
trying to construct a Hebrew analyzer, and whether it is possible to make
one work with stopwords, synonyms, lemmatizers, and other features.
Additionally, I discovered a GitHub repository whose authors claim to have
created a Hebrew plugin for Elasticsearch, but it appears to be out of
date, and the links break when I attempt to add it to my Elasticsearch
installation. What other options do we have to fulfill this requirement?
Warm regards,