[Buildroot] [PATCH v4] package/llama-cpp: new package

Joseph Kogut joseph.kogut at gmail.com
Wed Oct 29 20:36:01 UTC 2025


Hi Julien,


On Wed, Oct 29, 2025 at 1:28 PM Julien Olivain <ju.o at free.fr> wrote:
>
> Hi Joseph,
>
> On 28/10/2025 18:16, Joseph Kogut wrote:
> > Add a package for llama.cpp, a C/C++ LLM inference library, used in
> > popular projects like Ollama, RamaLama, and more.
> >
> > Signed-off-by: Joseph Kogut <joseph.kogut at gmail.com>
>
> I applied this patch to master.
>
> I did two minor changes, see:
> https://gitlab.com/buildroot.org/buildroot/-/commit/03f35bc63ba01ede5e78900cd98e5217b7e8d70b
>
> I was not sure about the real value of supporting a musl static build
> with such a package... If we see in the autobuilders that those static
> builds are a source of build failures that are complex to fix, we could
> limit the package to shared libraries later...
>

Looks good, thanks for your support and reviews!
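
For reference, if static builds do turn out to be a recurring source of
autobuilder failures, the usual Buildroot way to restrict a package to
shared-library configurations is a "depends on !BR2_STATIC_LIBS" guard
plus a matching comment. A hypothetical sketch of what that could look
like in the package's Config.in (illustrative only, not the file as
merged):

```
config BR2_PACKAGE_LLAMA_CPP
	bool "llama-cpp"
	depends on !BR2_STATIC_LIBS
	help
	  llama.cpp is a C/C++ LLM inference library.

comment "llama-cpp needs a toolchain w/ dynamic library"
	depends on BR2_STATIC_LIBS
```

The comment entry tells users selecting a static-only toolchain why the
package is hidden, which is the convention used by other Buildroot
packages with the same limitation.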

> I also wrote a runtime test for that package. If you want to review
> or test it:
> https://patchwork.ozlabs.org/project/buildroot/patch/20251029202331.16192-1-ju.o@free.fr/
>

This is great, I'll test and follow up on that patch.

> Best regards,
>
> Julien.

Best,
Joseph

