> On Dec 25, 2025, at 10:47 AM, John R. Hogerhuis <[email protected]> wrote:
> 
> It seems like a focused base local model for Model 100 BASIC programming
> would be the way to go

I've got about 50 bucks credit on RunPod that's been sitting there for a year. 
If I had a decent dataset it'd be effectively free to train something on, say, 
3x3090 or 1xH100. Quality hand-tagged data is always best and it's a huge pain 
to put together. Maybe several large listings plus a wad of good snippets 
would do it.
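To make that dataset idea concrete, here's a minimal sketch of packing listings and snippets into JSONL, one record per line, which most fine-tuning tooling will eat. The record schema, the tags, and the sample programs are all made up for illustration, not an agreed format:

```python
# Hypothetical sketch: pack Model 100 BASIC listings and snippets
# into JSONL records for fine-tuning. Schema and tags are invented.
import json

def make_records(listings, snippets):
    """Turn raw BASIC sources into tagged dicts, one per training example."""
    records = []
    for title, code in listings:
        records.append({"kind": "listing", "title": title, "text": code})
    for tag, code in snippets:
        records.append({"kind": "snippet", "tag": tag, "text": code})
    return records

# Toy inputs standing in for real hand-collected material.
listings = [("hello loop", '10 CLS\n20 PRINT "HELLO M100"\n30 GOTO 20')]
snippets = [("screen-clear", "10 CLS")]

jsonl_lines = [json.dumps(r) for r in make_records(listings, snippets)]
print(len(jsonl_lines))  # → 2
```

The hand-tagging would be the real work; the packing is trivial.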

The pain of making training data is why a lot of local LLM stuff is very 
ChatGPT-like in behavior and language: it was trained on synthetic data 
generated by a more expensive LLM. It's turned into something of a plague.
