So on the Fedora Start page there's the following link on how to set up Ollama and Open WebUI to work with AI models locally, which seems great.
fedoramagazine.org/running-gen…
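For reference, the gist of that approach is to install Ollama as a local service and then run Open WebUI in a container pointed at it. A rough sketch from memory, not copied from the article, so treat the exact commands, image, and model name as assumptions:

    # Ollama: upstream install script, then pull and chat with a model in the terminal
    curl -fsSL https://ollama.com/install.sh | sh
    ollama run llama3.2

    # Open WebUI: official container image, pointed at Ollama's default port (11434)
    podman run -d -p 3000:8080 \
      -e OLLAMA_BASE_URL=http://host.containers.internal:11434 \
      -v open-webui:/app/backend/data \
      ghcr.io/open-webui/open-webui:main

After that the web UI should be reachable at http://localhost:3000.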
But 1stn00b in the comments mentions Alpaca on Flathub, which is an even simpler way to get started.
flathub.org/apps/com.jeffser.A…
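For anyone who'd rather install it from the terminal than from GNOME Software, the usual Flathub commands should do it (assuming the app ID is com.jeffser.Alpaca, going by the truncated link above):

    flatpak install flathub com.jeffser.Alpaca
    flatpak run com.jeffser.Alpaca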
So, anyway, Alpaca is installing ...
Linked article: Running Generative AI Models Locally with Ollama and Open WebUI, by Sumantro Mukherjee (Fedora Magazine)