Tag: ollama
Private LLM Chats with Ollama and Open WebUI – including Llama 3.1
Perhaps you’re concerned about privacy when chatting with one of these publicly hosted LLMs, or you’ve repeatedly hit your limit on the number of questions you can ask. These are just a few of the reasons you may want to run an LLM locally. I’ll be using Docker to run these applications because I like being able to easily…
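For reference, here is a minimal sketch of the kind of Docker setup the post describes, assuming the official images (ollama/ollama and ghcr.io/open-webui/open-webui) and their documented default ports; the exact flags in the full post may differ:

```bash
# Start the Ollama server; models persist in a named volume,
# and the API is served on port 11434
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull Llama 3.1 into the running Ollama container
docker exec -it ollama ollama pull llama3.1

# Start Open WebUI and let it reach the host's Ollama instance
# via host.docker.internal; the chat UI is then at http://localhost:3000
docker run -d --name open-webui --restart always \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

With both containers running, browsing to http://localhost:3000 gives you a private chat interface backed entirely by the local Ollama server.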