Matt Rickard
Optimizing $Model.cpp
Jul 25, 2023
llama.cpp and llama.c are both libraries for fast inference of Llama-based models.