So on the Fedora Start page there's the following link on how to set up Ollama and OpenWebUI to run AI models locally, which seems great.

fedoramagazine.org/running-gen…

But 1stn00b in the comments mentions Alpaca on Flathub, which is an even simpler way to get started.

flathub.org/apps/com.jeffser.A…

So, anyway, Alpaca is installing ...
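For anyone following along in a terminal instead of the software center, the install is a single `flatpak` invocation. A minimal sketch, assuming the Flathub remote is already configured and that `com.jeffser.Alpaca` is the app ID behind the link above:

```shell
# Install Alpaca from Flathub (assumes the flathub remote is set up)
flatpak install flathub com.jeffser.Alpaca

# Launch it once installed
flatpak run com.jeffser.Alpaca
```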

#Fedora #Ollama #Alpaca #LLM

DouDou reshared this.

in reply to Steven

Yes, Alpaca is dead simple. In any case, you need top-notch hardware for it to be useful.
in reply to DouDou

Yeah, I'm finding the large models on my server to be ... slow as shit. Have to stick with smaller ones for the time being. After I play a bit more I may start looking into some upgrades, if it's worth it to me.
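A rough rule of thumb for why the big models crawl: a model's weights need roughly (parameters × bits per weight ÷ 8) bytes of RAM/VRAM, plus some runtime overhead. A quick sketch of that arithmetic (the 20% overhead factor for KV cache and buffers is my assumption, not a measured figure):

```python
def model_memory_gb(params_billion, bits_per_weight, overhead=1.2):
    """Back-of-envelope memory estimate for running a quantized LLM.

    params_billion  -- model size in billions of parameters
    bits_per_weight -- quantization level (4 for Q4, 8 for Q8, 16 for FP16)
    overhead        -- fudge factor for KV cache and runtime buffers (assumed)
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantization comes out around 4 GB,
# while a 70B model at the same quantization needs ~42 GB --
# which is why the large models spill out of VRAM and get slow.
print(f"7B  @ Q4: {model_memory_gb(7, 4):.1f} GB")
print(f"70B @ Q4: {model_memory_gb(70, 4):.1f} GB")
```

If a model doesn't fit in VRAM (or RAM), the runtime falls back to slower tiers, so sticking to models that fit is usually the bigger win over raw upgrades.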
