Hi,

I had a very brief look. I am not sure why there is a Python server
implementation.
We already have a fully gRPC-capable backend implementation located here:
https://github.com/apache/opennlp-sandbox/tree/main/opennlp-grpc
You can just run that via Docker (for example). That would also be the place
to upgrade / enhance for further NLP tasks.

On the Python side, we would need to implement a client (based on the proto
definition) and wrap it into something a Python developer is more familiar
with; a rough sketch is below.
I think the best way forward would be to fork opennlp-sandbox and provide your
updates in your fork of the sandbox, so they can be integrated easily.
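
To make that concrete, below is a very rough, untested sketch of such a
wrapper. The generated module names, the TokenizerService / Tokenize names,
the request and response fields and the port are all assumptions on my side,
so please check them against the actual proto and against the port you map
when running the backend via Docker:

import grpc

# Modules generated from the proto via grpcio-tools; the names below are
# assumptions and must match the actually generated files.
import opennlp_pb2
import opennlp_pb2_grpc


def tokenize(text: str, target: str = "localhost:7071") -> list[str]:
    # "localhost:7071" is a placeholder; use whatever port the Docker
    # container exposes for the opennlp-grpc backend.
    with grpc.insecure_channel(target) as channel:
        stub = opennlp_pb2_grpc.TokenizerServiceStub(channel)
        # Service, method and field names are guesses; adjust to the proto.
        reply = stub.Tokenize(opennlp_pb2.TokenizeRequest(sentence=text))
        return list(reply.tokens)


if __name__ == "__main__":
    print(tokenize("Apache OpenNLP provides a gRPC backend."))

Something in that shape (plain functions or a small class, published as a
pip-installable package) is probably what a Python user would expect, rather
than a second server implementation.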

Regards
Richard

> On 16.09.2025 at 13:56, Jobin Sabu <85jobins...@gmail.com> wrote:
> 
> Good evening OpenNLP devs,
> I hope you’re doing well. I wanted to apologise for the delay in following
> up on the gRPC-based Python integration we discussed earlier — university
> exams and related academic commitments took up more time than I expected.
> Things have settled down now, and I’m ready to continue the work.
> 
> Quick status update (minimal prototype):
> 
> I generated and tested the Python gRPC stubs from proto/opennlp.proto (the
> generation command is sketched below, after the repo link).
> 
> I implemented a small Python gRPC server that exposes the tokenizer
> service, along with a simple client example.
> 
> I verified the flow locally using the official Apache OpenNLP CLI (OpenNLP
> 2.5.5).
> 
> Code + usage examples are on GitHub:
> https://github.com/JOBIN-SABU/Apache_opennlp_grpc
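> 
> For completeness: the stubs can be (re)generated from proto/opennlp.proto
> with grpcio-tools, roughly along these lines (run from the repo root):
> 
> # Requires: pip install grpcio grpcio-tools
> from grpc_tools import protoc
> 
> # Writes opennlp_pb2.py and opennlp_pb2_grpc.py into the current directory.
> protoc.main([
>     "grpc_tools.protoc",  # argv[0] placeholder, ignored by protoc
>     "-Iproto",
>     "--python_out=.",
>     "--grpc_python_out=.",
>     "proto/opennlp.proto",
> ])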
> 
> 
> Next steps I intend to take (and would value your input on):
> 
> Open a short PR with this minimal prototype so it’s easy for reviewers to
> run and test.
> 
> Expand to additional services (POS tagging, sentence detection) based on
> your feedback and preferences.
> 
> Ensure the implementation and packaging meet ASF guidelines for
> contribution.
> 
> 
> If you or other maintainers have a moment, could you try the repo and share
> any initial feedback? I can open a PR with a concise README and test
> instructions so it’s straightforward to run locally. I’d greatly appreciate
> any guidance on preferred design choices or packaging requirements before I
> expand the scope.
> 
> Thank you again for the earlier encouragement — I’m excited to contribute
> and will follow your lead on next steps.
> 
> Best regards,
> Jobin Sabu
> 85jobins...@gmail.com
> https://github.com/JOBIN-SABU/Apache_opennlp_grpc
> 
> 
> On Sun, 18 May, 2025, 12:38 am Richard Zowalla, <r...@apache.org> wrote:
> 
>> Hi,
>> 
>> Feel free to contribute to OpenNLP.
>> 
>> You can ask any question on the list, or we can open a discussion with code
>> (e.g., in a PR).
>> Every contribution in that area is valuable, imho.
>> 
>> Regards
>> Richard
>> 
>>> On 10.05.2025 at 05:38, Jobin Sabu <85jobins...@gmail.com> wrote:
>>> 
>>> Respected Apache OpenNLP Team,
>>> 
>>> I hope you're doing well.
>>> 
>>> Although I wasn't selected for GSoC this year, I remain deeply interested
>>> in contributing to OpenNLP — particularly the gRPC-based Python
>>> integration
>>> project we previously discussed. I found the idea both technically
>>> exciting
>>> and valuable for the community, and I’d still love to pursue it
>>> independently.
>>> 
>>> Currently, I’m in the middle of my 4th semester university exams, which
>>> will conclude on *May 17*. I plan to begin contributing more actively
>>> around *May 20* and would be truly grateful for your *guidance and
>>> mentorship* throughout the process.
>>> 
>>> Please let me know if you’re open to supporting this contribution and if
>>> there are any specific expectations or directions I should follow to
>>> align
>>> well with the project.
>>> 
>>> Looking forward to your response.
>>> 
>>> Warm regards,
>>> *Jobin Sabu*
>> 
>> 
