Prompting Local Models with Node.js and Ollama’s API – Classifying Food Items

I’ve created a script to scrape local flyer data. It’s a great start for a larger project, but it pulls every item in the flyer. For stores that sell both groceries and non-edible merchandise, the list ends up full of unwanted items. The goal of this project is to build a meal plan out of mostly on-sale items, so inedible items are a waste of tokens.

I tried dumping the entire list of items into a chat with a few different local models and got undesirable results. The model often assumed I wanted the items reformatted, even though I opened the prompt by asking for a recipe based on the items. So we need to trim the list down to edible items for any chance of fitting it into the context window of a larger model. But how do we classify hundreds, possibly thousands, of items without breaking the bank? A common approach is to use smaller models for simple tasks like this, and we can do that locally for free.

Earlier I created a post on how to run models locally using Ollama, as well as how to install models not provided by Ollama. That works great when you have access to the user interface to start a conversation, but if you want to reach those models from another application, you need a different means of delivery. Luckily, Ollama exposes an HTTP API as well.

The beauty of APIs is that you can use any programming language capable of making HTTP requests with JSON. For this example, we’ll be using Node.js.

If you have made API requests before, this should be very simple. The only difference from a common HTTP request is how we handle the response. If you treat this request like any other, you’ll find the body of the response looks like a mess. That’s because, by default, Ollama streams its response. We can collect the stream and do our processing once it’s complete.
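
To see why, here’s roughly what the raw body looks like. Ollama streams newline-delimited JSON, one object per chunk of generated text (fields trimmed here for illustration):

{"model":"llama3.1","response":"Recipe","done":false}
{"model":"llama3.1","response":" Ingredient","done":false}
{"model":"llama3.1","response":"","done":true}

Each object carries a small piece of the text in its response field, and the final object arrives with done set to true.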

const axios = require("axios");
const fs = require("fs");

// Path to your JSON file
const inputFilePath = "./allItems.json";
const outputFilePath = "./classifiedProducts.json";

// Configuration for Ollama API
const ollamaUrl = "http://localhost:11434/api/generate";
const modelName = "llama3.1"; // Replace with the specific model name if needed

/**
 * Classify an item using the Ollama API
 * @param {string} itemName Item name to classify
 * @returns {Promise<string>} The classification text, or "Error" on failure
 */
const classifyItemWithOllama = async (itemName) => {
  const prompt = `Is the following item commonly used as an ingredient in recipes?
     Consider items like raw fruits, vegetables, meats, grains, dairy, herbs, 
     spices, and similar basic ingredients. Items that are prepared meals, snacks,
     or heavily processed foods should not be considered. If the item is a common 
     recipe ingredient, respond with 'Recipe Ingredient'; otherwise, respond with 
     'Not Recipe Ingredient'. Do not include any other text in your response.
     Item: '${itemName}'.`;
  
  // Ollama streams by default, so no stream flag is needed here
  const payload = {
    model: modelName,
    prompt,
  };

  try {
    const response = await axios.post(ollamaUrl, payload, {
      responseType: "stream",
    });

    let fullResponse = "";
    let buffer = "";

    // Ollama streams newline-delimited JSON; a network chunk may contain
    // several lines or end mid-line, so buffer and parse complete lines only
    for await (const chunk of response.data) {
      buffer += chunk.toString();
      const lines = buffer.split("\n");
      buffer = lines.pop(); // keep any partial line for the next chunk
      for (const line of lines) {
        if (!line.trim()) continue;
        try {
          const parsedChunk = JSON.parse(line);
          if (parsedChunk.response) {
            fullResponse += parsedChunk.response;
          }
        } catch (error) {
          console.error(`Error parsing chunk: ${line}`);
        }
      }
    }

    return fullResponse.trim();
  } catch (error) {
    console.error(`Error communicating with Ollama: ${error.message}`);
    return "Error";
  }
};

/**
 * Classify products in the input JSON file
 * and save the results to a new JSON file.
 */
const classifyProducts = async () => {
  try {
    // Read the input JSON file
    const rawData = fs.readFileSync(inputFilePath, "utf-8");
    const flyers = JSON.parse(rawData);

    // Classify each product
    for (const flyer of Object.values(flyers)) {
      for (const product of flyer) {
        const itemName = product.name || "Unknown Item";
        console.log(`Classifying: ${itemName}`);
        const classification = await classifyItemWithOllama(itemName);
        product.classification = classification;
        console.log(`Classified '${itemName}' as '${classification}'`);
      }
    }

    // Save the results to a new JSON file
    fs.writeFileSync(outputFilePath, JSON.stringify(flyers, null, 4));
    console.log(`Classification complete. Results saved to ${outputFilePath}`);
  } catch (error) {
    console.error(`Error processing products: ${error.message}`);
  }
};

// Run the script
classifyProducts();
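
For reference, the script expects allItems.json to map each flyer to an array of product objects. The exact shape depends on your scraper; mine looks roughly like this (the price field here is illustrative; the classifier only reads name):

{
  "store-flyer-week-49": [
    { "name": "Bananas", "price": "0.59/lb" },
    { "name": "Paper Towels 6-Pack", "price": "7.99" }
  ]
}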

You can see in the classifyItemWithOllama() function that the response arrives in chunks. Those chunks are cast to strings, buffered, and split into complete JSON lines, and each line is parsed as a JSON object. The response field of each object is then appended to a running string named fullResponse.
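
If you’d rather not deal with the stream at all, the generate endpoint also accepts stream: false, which makes Ollama buffer the whole generation and return a single JSON object. A minimal sketch of classifyItemWithOllama() using that option:

const classifyItemWithOllama = async (itemName) => {
  const prompt = `...same prompt as above...`;

  try {
    // With stream: false, Ollama returns one JSON object
    // instead of newline-delimited chunks
    const response = await axios.post(ollamaUrl, {
      model: modelName,
      prompt,
      stream: false,
    });

    return response.data.response.trim();
  } catch (error) {
    console.error(`Error communicating with Ollama: ${error.message}`);
    return "Error";
  }
};

The trade-off is that you wait for the full generation before seeing anything, which matters little for a two-word classification like this.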

You can check out the YouTube video of this in action. For the full Node.js application, check out the GitHub repo.

