* Dr. Arne Babenhauserheide <[email protected]> [2026-03-28 16:44]:
> Ihor Radchenko <[email protected]> writes:
>
> > Jean Louis <[email protected]> writes:
> >
> >> One quick clarification on determinism, though: while LLMs can be
> >> non-deterministic, temperature 0 gives fully repeatable outputs. I
> >> just tested this with a local model (Qwen3.5-9B) — three runs each
> >> on math and even a haiku, identical results every time.
> >
> > If I remember correctly, temperature 0 also makes the more complex
> > outputs non-usable. In other words, performance is severely degraded
> > on complex tasks.
>
> Also, as far as I understand, the determinism only applies to the same
> version of the model: any update would break it.
Yes, right, though the same is true of any particular version of the
Linux kernel, for example: reproducibility holds only as long as you
stay on that exact version.
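P.S. The reason temperature 0 is repeatable is that it reduces sampling
to a pure argmax over the logits, so the random number generator never
enters the picture. A minimal sketch in plain Python (toy logits, not
any real model API; the function name is made up for illustration):

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Pick a token index from raw logits.

    temperature == 0 means greedy decoding: always the argmax, so
    repeated runs are identical no matter what the RNG state is.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # softmax with temperature, then sample from the distribution
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    r = rng.random()
    acc = 0.0
    for i, e in enumerate(exps):
        acc += e / total
        if r <= acc:
            return i
    return len(logits) - 1

logits = [1.2, 3.4, 0.5, 2.9]  # toy values, not from a real model

# temperature 0: identical across runs, seeds are irrelevant
greedy = [sample_token(logits, 0, random.Random(s)) for s in range(5)]
print(greedy)  # -> [1, 1, 1, 1, 1]

# temperature 1: the chosen index depends on the RNG seed
sampled = [sample_token(logits, 1.0, random.Random(s)) for s in range(5)]
print(sampled)
```

Of course this only models one decoding step; in a real model the
logits themselves must also be bit-identical, which is exactly why a
model update (or even a different kernel/hardware path) breaks the
guarantee.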
