[Buildroot] [PATCH v4] package/llama-cpp: new package

Julien Olivain ju.o at free.fr
Wed Oct 29 20:28:12 UTC 2025


Hi Joseph,

On 28/10/2025 18:16, Joseph Kogut wrote:
> Add a package for llama.cpp, a C/C++ LLM inference library, used in
> popular projects like Ollama, RamaLama, and more.
> 
> Signed-off-by: Joseph Kogut <joseph.kogut at gmail.com>

I applied this patch to master.

I made two minor changes, see:
https://gitlab.com/buildroot.org/buildroot/-/commit/03f35bc63ba01ede5e78900cd98e5217b7e8d70b

I was not sure about the real value of supporting a musl static build
for such a package... If the autobuilders show that those static builds
are a source of build failures that are complex to fix, we could limit
the package to shared libraries later...
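For reference, limiting a Buildroot package to shared-library builds is
usually done with a "depends on !BR2_STATIC_LIBS" guard and a matching
comment in its Config.in. A rough sketch of what that could look like
for this package (not the actual Config.in from the commit, just the
usual pattern):

```
config BR2_PACKAGE_LLAMA_CPP
	bool "llama-cpp"
	depends on BR2_INSTALL_LIBSTDCPP
	# hypothetical guard: disallow static-only builds
	depends on !BR2_STATIC_LIBS
	help
	  llama.cpp, a C/C++ LLM inference library.

comment "llama-cpp needs a toolchain w/ C++, dynamic library"
	depends on !BR2_INSTALL_LIBSTDCPP || BR2_STATIC_LIBS
```

The comment entry tells users why the package is hidden when their
toolchain configuration does not satisfy the dependencies.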

I also wrote a runtime test for that package. If you want to review
or test it:
https://patchwork.ozlabs.org/project/buildroot/patch/20251029202331.16192-1-ju.o@free.fr/

Best regards,

Julien.
