IMHO we are missing one big topic in the current DebConf schedule - the
explosion of large language models and other generative AI models. This
raises the question of what Debian's role is, or could be, in these
developments. So my question is - is there anyone who will be at DebConf
and feels competent enough to chat about this topic, and perhaps also give
a quick summary of the current landscape of this technology and its
challenges, to kick off the discussion?
I expect that tooling for developing such AI systems either is in Debian or
is easy to get into Debian, and that training models is (currently) too
cost-intensive for Debian to do itself. But what about distributing
pre-trained models - what are the technical and legal challenges there?
What about hosting execution engines for pre-trained models - is that
within Debian's compute capability? And what about distributing tools with
AI integration - whether local, Debian-hosted or externally hosted? What
are the technical and legal challenges there?
If we don't have a better candidate, I can volunteer to host/moderate the
BoF, but we will certainly need some more competent people who can actually
speak there. Any ideas?
--
Best regards,
Aigars Mahinovs