
Ollama

Ollama runs embedding models locally and exposes an OpenAI-compatible embeddings API. seekdb provides OllamaEmbeddingFunction, which calls that API so the resulting embeddings can be used with seekdb collections.
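Because the API is OpenAI-compatible, you can also query it directly, which is useful as a sanity check before wiring up seekdb. A minimal sketch in TypeScript, assuming a local Ollama server on its default port with nomic-embed-text already pulled:

// Sanity check: call Ollama's OpenAI-compatible embeddings endpoint directly.
// Assumes Ollama is running locally on the default port and the model is pulled.
const res = await fetch("http://localhost:11434/v1/embeddings", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ model: "nomic-embed-text", input: ["hello world"] }),
});
const body = await res.json();
console.log(body.data[0].embedding.length); // vector dimensionality (768 for nomic-embed-text)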

tip

Use of the Ollama service is subject to Ollama's pricing rules and may incur fees. Before proceeding, visit the official website or relevant documentation to confirm and accept those terms. If you do not agree, do not proceed.

Dependencies and environment

  • Ollama installed and running.

  • The embedding model available locally. Pull it before use (a programmatic check follows this list):

    ollama pull nomic-embed-text
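To confirm programmatically that the model has been pulled, Ollama's local REST API lists installed models at /api/tags. The sketch below uses that endpoint (it is part of Ollama's API, not seekdb's):

// Check that the embedding model is available locally via Ollama's /api/tags endpoint.
const resp = await fetch("http://localhost:11434/api/tags");
const { models } = (await resp.json()) as { models: { name: string }[] };
// Pulled models are listed with a tag suffix, e.g. "nomic-embed-text:latest".
if (!models.some((m) => m.name.startsWith("nomic-embed-text"))) {
  throw new Error('Model missing; run "ollama pull nomic-embed-text" first.');
}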

Installation

npm i seekdb @seekdb/ollama

Example: create an Ollama embedding function

import { OllamaEmbeddingFunction } from "@seekdb/ollama";

const ef = new OllamaEmbeddingFunction({
  url: "http://localhost:11434/v1",
  modelName: "nomic-embed-text",
  // apiKeyEnv: "OLLAMA_API_KEY",
});

Configuration options:

  • url: Base URL for the Ollama API (default: "http://localhost:11434/v1").
  • modelName: Name of a model you have pulled locally (default: "nomic-embed-text").
  • apiKeyEnv: Name of the environment variable holding the API key (optional; default: "OLLAMA_API_KEY"). A local Ollama server typically does not validate this value.

tip

Ensure Ollama is running and the target model is available (for example, via ollama pull <model_name>).
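With the server running and the model pulled, the embedding function can produce vectors for seekdb. The sketch below assumes OllamaEmbeddingFunction exposes a generate(texts) method returning one vector per input string, as is common for embedding-function interfaces; check the seekdb API reference for the exact method name before relying on it.

import { OllamaEmbeddingFunction } from "@seekdb/ollama";

const ef = new OllamaEmbeddingFunction({
  url: "http://localhost:11434/v1",
  modelName: "nomic-embed-text",
});

// Assumed interface: generate(texts: string[]) => Promise<number[][]>.
const vectors = await ef.generate(["first document", "second document"]);
console.log(vectors.length, vectors[0].length); // 2 vectors, one per input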