> On Dec 7, 2025, at 7:28 AM, George M. Rimakis <[email protected]> wrote:
> 
> I hope everyone is well. As new AI-agent coding tools and models are released 
> and improved, I've started to adopt them in my professional life and, to some 
> extent, personally as well. I've tried out Claude Code and OpenAI's GPT Codex, 
> with a handful of models on both sides.

> On Dec 7, 2025, at 9:35 AM, Scott McDonnell <[email protected]> wrote:
> Use the AI tools just like you do traditional programming. Break it into 
> functions, modules, and prototypes. This gives it one thing to do at a time. 
> You describe the inputs, the outputs, and the expected processing.

I've mentioned before that I have some professional experience on the back end 
of LLMs. That keeps me pretty wary of the pitfalls of using them thoughtlessly, 
but they're great tools within their limitations. The best way I've found to 
use them in a software development context is to treat them as junior 
developers: assume they know very little (they do!), are worse at the job than 
you are, and are prone to jumping the gun (they are!).

Sometimes the smaller, more focused open-weight models of the kind you can run 
at home are easier to keep on track. There's also the advantage of never 
"running out of tokens," which helps given how much back-and-forth winds up 
accumulating in the inference context over time. Context gets expensive fast 
with the big providers, because you're chucking the entire conversation back in 
for processing each time you reply.
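To make that concrete, here's a back-of-the-envelope sketch (the 500-token 
turn size is a made-up illustration, not a measurement) of why resending the 
whole conversation makes the total tokens processed grow quadratically with 
the number of turns, not linearly:

```python
def cumulative_tokens(turns, tokens_per_turn=500):
    """Total tokens processed over a conversation where each request
    re-includes all prior turns plus the new one."""
    total = 0
    context = 0
    for _ in range(turns):
        context += tokens_per_turn  # conversation grows each turn
        total += context            # the whole context is processed again
    return total

# 10 turns of ~500 tokens each: not 5,000 tokens billed, but 27,500.
print(cumulative_tokens(10))  # 27500
```

Ten modest turns already costs over five times what the raw conversation 
length suggests, which is why long sessions with the hosted providers add up.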

It's never occurred to me to do LLM-assisted development on old-school systems, 
and I'm really interested in seeing how that goes for folks. I might try my 
hand at some point. I don't have the 8080/85 chops to have a model assist 
without imposing a bunch of debugging on me, particularly on a platform I doubt 
any LLMs have much training data for, but BASIC shouldn't be an issue at all.

Very cool project. Please share as much of the experience as you're comfortable 
sharing!
