Tag: llama
-
Prompting Local Models with Node.js and Ollama’s API – Classifying Food Items
I’ve created a script to scrape local flyer data. It’s a great start for a larger project, but it grabs every item in the flyer. For stores that sell both groceries and non-edible merchandise, the list ends up full of potentially unwanted items. The goal of this project is to create a meal plan…
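The classification step described above can be sketched in Node.js against Ollama's `/api/generate` endpoint. This is a minimal sketch, not the post's actual code: the helper names, the `llama3.1` model tag, and the yes/no prompt format are my assumptions; it only assumes Node 18+ (built-in `fetch`) and a local Ollama instance on its default port.

```javascript
// Sketch: ask a local Ollama model whether a flyer item is food.
// Assumes Ollama is running on its default port (11434) and that
// a model tagged "llama3.1" has been pulled.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Constrain the prompt so the model answers with a single word,
// which keeps parsing trivial.
function buildPrompt(item) {
  return `Is "${item}" a food item? Answer with exactly one word: yes or no.`;
}

async function isFoodItem(item, model = "llama3.1") {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // stream: false returns one JSON object instead of a token stream
    body: JSON.stringify({ model, prompt: buildPrompt(item), stream: false }),
  });
  const data = await res.json();
  return data.response.trim().toLowerCase().startsWith("yes");
}
```

Filtering the scraped flyer is then a matter of mapping `isFoodItem` over the item names and keeping the ones that come back `true`.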
-
Private LLM Chats with Ollama and Open WebUI – including LLaMa 3.1
Perhaps you’re concerned about the privacy of chatting with one of these publicly hosted LLMs, or you’ve repeatedly hit the limit on how many questions you can ask. These are just a few of the reasons you might want to run an LLM locally. I’ll be using Docker to run these applications because I like being able to easily…
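The Docker setup mentioned above can be sketched with the two projects' published images. This is a sketch of one common arrangement, not necessarily the post's exact commands; the container and volume names are my choices, and the ports are the images' defaults.

```shell
# Ollama, with a named volume so pulled models persist across restarts
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull Llama 3.1 inside the running container
docker exec -it ollama ollama pull llama3.1

# Open WebUI, pointed at the host's Ollama instance; the chat UI
# is then available at http://localhost:3000
docker run -d --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

On Linux, `host.docker.internal` may need `--add-host=host.docker.internal:host-gateway`, or you can put both containers on a shared Docker network instead.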