On 7/16/25 13:30, Jean Louis wrote:
> What you call "AI" is just new technology powered with knowledge that
> gives us good outcomes, it is new computing age, and not "intelligent"
> by any means. It is just computer and software. So let's not give it
> too much of the importance.
There is no knowledge involved, just statistical probabilities in these
"plausible sentence generators" or "stochastic parrots". Thus we see
daily the catastrophic failures of these systems with regard to factual
output. More money just makes them more expensive. More electricity
just makes them more polluting. LLMs have peaked, technologically, but
the investment bubble still grows. This relates to software freedom in
that these parrots strip the freedom-preserving attribution and
licensing information from the code snippets they regurgitate.
AI (using today's definitions) is good at recombining pieces, once the
pieces have been identified. So it can be useful right now in areas
like protein folding, I would expect. However, as far as producing code
goes, it cannot. All it can do in that regard is strip licensing and
attribution from existing code and mix the pieces until something
compiles. As pointed out earlier in the thread, that reduces
productivity.
Programmers using LLMs may /feel/ that they are 24% more effective, but
the data actually shows a 19% drop in productivity. From a software
freedom perspective, though, the stripping of licensing and attribution
may be a greater harm than the reduced productivity. Indeed, it is the
licensing, specifically copyleft, which ensures the freedom to code
going forward. Once that is stripped from the files, the freedom is
gone.
Furthermore, the LLMs are being used to take away agency from coders,
turning them into what Cory Doctorow calls reverse centaurs, which were
already mentioned in an earlier message:
"A centaur is someone whose work is supercharged by
automation: you are a human head atop the tireless body
of a machine that lets you get more done than you could
ever do on your own."
"A reverse-centaur is someone who is harnessed to the
machine, reduced to a mere peripheral for a cruelly
tireless robotic overlord that directs you to do the
work that it can’t, at a robotic pace, until your body
and mind are smashed."
https://doctorow.medium.com/https-pluralistic-net-2024-08-02-despotism-on-demand-virtual-whips-4919c7e3d2bc
See also:
"Revenge of the Chickenized Reverse-Centaurs"
https://pluralistic.net/2022/04/17/revenge-of-the-chickenized-reverse-centaurs/
That situation is antithetical to the goals of software freedom, whose
aim is for the human to be in charge of the system and to use it as a
tool to amplify his or her abilities.
The people maneuvering to take away freedom and agency from the public
are working hard in the press to present "AI" as a done deal. It is
not, at least not so long as those working towards software freedom are
able to keep pushing back. These LLMs are enjoying an extended,
overtime investment bubble which I posit will leave nothing useful
behind when it finally bursts.
But as for Akira's question at the start of the thread: is AI-generated
code changing free software? Since the LLMs strip both attribution and
licensing information, I would say yes, AI-generated code is changing
free software by stripping away the freedom while simultaneously
detaching the code from the upstream projects it has been plagiarized
from. In that way it separates people from the free software projects
they could be working with.
/Lars
_______________________________________________
libreplanet-discuss mailing list
libreplanet-discuss@libreplanet.org
https://lists.libreplanet.org/mailman/listinfo/libreplanet-discuss