Jean Louis <[email protected]> writes:

> * David Masterson <[email protected]> [2026-03-25 08:12]:

>> Would this be possible given today's technology?
>
> Yes, it is possible—I am already doing it daily with local models
> using llama.cpp running entirely on CPU, with no proprietary
> dependencies. The tooling required is minimal: a text editor,
> llama.cpp, and a directory of prompt files or a list of database
> entries. This runs on standard GNU/Linux systems today.
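> For the curious, a minimal sketch of what such a setup can look
> like—the model filename, directory layout, and output path here are
> illustrative assumptions, not a prescription:
>
>   #!/bin/sh
>   # Run every prompt file through a local GGUF model on CPU.
>   # Assumes llama.cpp is built and llama-cli is on PATH, and that
>   # a quantized model (e.g. model.gguf) has been downloaded.
>   MODEL="$HOME/models/model.gguf"   # hypothetical path
>   for prompt in prompts/*.txt; do
>       llama-cli -m "$MODEL" -f "$prompt" -n 512 \
>           > "results/$(basename "$prompt" .txt).out"
>   done
>
> Everything stays on the local machine; the loop is just ordinary
> shell scripting over text files.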

This is very interesting.  Where can I get a good primer on how to do
this?
--
David Masterson


Reply via email to