> On 31 Jan 2026, at 03:47, hongxu via lists.openembedded.org 
> <[email protected]> wrote:
> 
> On 1/31/26 00:56, Richard Purdie wrote:
>> 
>> Hi Hongxu,
>> 
>> On Fri, 2026-01-30 at 13:20 +0800, hongxu via lists.openembedded.org wrote:
>>> Would you please approve adding this layer to
>>> https://git.yoctoproject.org/? I will maintain
>>> meta-ollama for recipe uprevs and bug fixes.
>> Thanks for sharing this, it looks really interesting and I like the
>> idea of it a lot.

Great timing as well :) I’m currently at FOSDEM asking around about how people 
would feel about a meta-ai layer that would contain various bits: tflite, 
onnx-runtime, llama.cpp, and other things that need compilation.

At the OE BoF this morning I was pointed at this thread, so I’ll hook into that 
to ask:

How would people feel about a layer that combines at least the runtimes and has 
active maintainers?

The number of ai/ml recipes I touch daily has grown too large to stuff into the 
qcom-distro layer, but they are also too spread out across other (vendor) layers 
to include by default.
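
For illustration only (the layer name, hosting location and recipe names below 
are purely hypothetical at this point), consuming such a combined layer would be 
the usual drill:

    # hypothetical layer, assuming it ended up hosted on git.yoctoproject.org
    git clone https://git.yoctoproject.org/meta-ai ../meta-ai
    # from an initialized build directory, register it in bblayers.conf
    bitbake-layers add-layer ../meta-ai
    # then opt in to the runtimes you want; recipe names are guesses:
    # IMAGE_INSTALL:append = " llama-cpp onnxruntime"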

regards,

Koen

>> 
>> Also, thanks for volunteering to maintain it, that does help alleviate
>> various concerns.
>> 
>> Approval for this rests with the Yocto Project TSC so I have asked them
>> about it. We should have a decision after our next meeting.
> 
> Copy, thank you for your support!
> 
> //Hongxu
> 
>> Cheers,
>> 
>> Richard