What backend do you use to run local models?

by a guest
KoboldCPP: 25.9% (7 votes)
Kobold: 0% (0 votes)
Kobold Horde: 0% (0 votes)
Oobabooga: 18.5% (5 votes)
Some paid service: 3.7% (1 vote)
I don’t run local models: 11.1% (3 votes)
Other (if you put which one in the comments I will add it): 0% (0 votes)
Exllama/ExllamaV2/TabbyAPI/exui/anything from turboderp: 11.1% (3 votes)
Llama.cpp: 14.8% (4 votes)
Ollama: 0% (0 votes)
Aphrodite: 3.7% (1 vote)
vLLM: 11.1% (3 votes)

This content is neither created nor endorsed by StrawPoll.