Running LLMs Locally
3 AI tools in the Running LLMs Locally category
llama.cpp
LLM inference in C/C++.
Agent · 2 dirs
llama.cpp guide
Running LLMs locally, on any hardware, from scratch.
Skill · 1 dir
PowerInfer
A high-speed inference engine for deploying LLMs locally.
Skill · 1 dir