Hi everyone, 

I'm excited to share a project I've been heads-down on for the better part 
of this year: an LLM backend management and orchestration API written in Go.

After a few failed attempts and several months of work, it's reached a 
point where I'd genuinely value feedback from developers who understand 
this space.

In a nutshell, it's a system to manage multiple LLM backends (Ollama, vLLM, 
OpenAI, etc.), execute complex conditional workflows ("Task Chains") that 
can branch based on model output, call external hooks, and parse a variety 
of output data types.
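
To give a rough flavor of how a Task Chain might be defined from Go, here is 
an illustrative sketch. The field names, endpoint path, and port below are 
hypothetical placeholders; the real schema is in the API reference linked 
further down:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    func main() {
        // Illustrative only: a chain whose first step asks a model to score
        // an input from 0 to 10, with branches choosing the next step based
        // on the parsed number. These field names are placeholders, not the
        // project's actual schema.
        chain := map[string]any{
            "id": "triage-example",
            "steps": []map[string]any{
                {
                    "name":   "score",
                    "prompt": "Rate the urgency of this ticket from 0 to 10: {{.input}}",
                    "parse":  "number",
                    "branches": []map[string]any{
                        {"when": ">= 7", "goto": "escalate"},
                        {"when": "< 7", "goto": "archive"},
                    },
                },
                {"name": "escalate", "hook": "https://example.internal/escalate"},
                {"name": "archive", "hook": "https://example.internal/archive"},
            },
        }

        body, _ := json.Marshal(chain)
        // Hypothetical endpoint; consult the OpenAPI spec for the real routes.
        resp, err := http.Post("http://localhost:8080/task-chains",
            "application/json", bytes.NewReader(body))
        if err != nil {
            fmt.Println("request failed:", err)
            return
        }
        defer resp.Body.Close()
        fmt.Println("status:", resp.Status)
    }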

Key Components: 

- Unified API: Manage models, backends, and provider configs through a 
single API, described by an OpenAPI 3.1 spec. 

- Affinity Groups: Control exactly which models are available to which 
backends for routing and access control. 

- Powerful Task Engine: Define workflows with multiple steps that can 
conditionally branch, parse responses (as numbers, scores, ranges, etc.), 
and integrate with external systems via hooks. 

- OpenAI-Compatible: Includes endpoints that mimic the OpenAI API, making 
it easier to integrate with existing tools (a rough usage sketch follows 
this list).
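
To make the OpenAI-compatible piece concrete, here is roughly what a chat 
completion call against a local instance might look like from Go. This is a 
sketch under assumptions: the base URL, port, model name, and the standard 
/v1/chat/completions path are placeholders, so check the docs for the exact 
routes:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // Assumes a runtime instance on localhost:8080; the model name and
        // path are placeholders -- see the API reference for actual values.
        reqBody, _ := json.Marshal(map[string]any{
            "model": "llama3",
            "messages": []map[string]string{
                {"role": "user", "content": "Summarize this in one sentence: ..."},
            },
        })

        resp, err := http.Post("http://localhost:8080/v1/chat/completions",
            "application/json", bytes.NewReader(reqBody))
        if err != nil {
            fmt.Println("request failed:", err)
            return
        }
        defer resp.Body.Close()

        out, _ := io.ReadAll(resp.Body)
        fmt.Println(string(out))
    }

Because the surface mimics the OpenAI API, existing SDKs and tools should 
also work by simply pointing their base URL at the runtime.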

It's Apache 2.0 licensed and available on GitHub. 

I'd be incredibly grateful if you could take a look, star it if it seems 
interesting, and open an issue with any thoughts, feedback, or questions—no 
matter how small.

GitHub: https://github.com/contenox/runtime 

Docs & API Spec: 
https://github.com/contenox/runtime/blob/main/docs/api-reference.md

Thanks for your time and any feedback you might have. 

Best, 
Alexander Ertli
