Ollama Embedder

The OllamaEmbedder embeds text into vectors locally using Ollama.

Because embeddings are generated on your own machine, the model must be pulled and running locally before use.
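To make a model available locally, pull it with the Ollama CLI. The example below uses `openhermes`, the embedder's default model; substitute another model name if you configure one.

```shell
# Download the default embedding model so Ollama can serve it locally
ollama pull openhermes
```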

Usage

cookbook/embedders/ollama_embedder.py

from bitca.agent import AgentKnowledge
from bitca.embedder.ollama import OllamaEmbedder
from bitca.vectordb.pgvector import PgVector

embeddings = OllamaEmbedder().get_embedding("The quick brown fox jumps over the lazy dog.")

# Print a preview of the embedding and its dimensionality
print(f"Embeddings: {embeddings[:5]}")
print(f"Dimensions: {len(embeddings)}")

# Example usage with a PgVector-backed knowledge base:
knowledge_base = AgentKnowledge(
    vector_db=PgVector(
        db_url="postgresql+psycopg://ai:ai@localhost:5532/ai",
        table_name="ollama_embeddings",
        embedder=OllamaEmbedder(),
    ),
    num_documents=2,
)
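Once you have embedding vectors, a common next step is to compare them. As a minimal, self-contained sketch, the snippet below computes cosine similarity using only the standard library; the toy vectors stand in for real outputs of `OllamaEmbedder().get_embedding(...)`.

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy vectors used so the sketch runs without Ollama; with real data,
# each would come from OllamaEmbedder().get_embedding("some text").
v1 = [0.1, 0.3, 0.5]
v2 = [0.1, 0.29, 0.52]
print(f"Similarity: {cosine_similarity(v1, v2):.4f}")
```

Vectors pointing in nearly the same direction score close to 1.0, which is how semantically similar texts are retrieved from a vector database such as PgVector.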

Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `model` | `str` | `"openhermes"` | The name of the model used for generating embeddings. |
| `dimensions` | `int` | `4096` | The dimensionality of the embeddings generated by the model. |
| `host` | `str` | `-` | The host address for the API endpoint. |
| `timeout` | `Any` | `-` | The timeout duration for API requests. |
| `options` | `Any` | `-` | Additional options for configuring the API request. |
| `client_kwargs` | `Optional[Dict[str, Any]]` | `-` | Additional keyword arguments for configuring the API client. Optional. |
| `ollama_client` | `Optional[OllamaClient]` | `-` | An instance of OllamaClient to use for making API requests. Optional. |
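Putting the parameters above together, a configured embedder might look like the following sketch. The parameter names come from the table; the concrete values (host URL, timeout, options) are illustrative assumptions only.

```python
from bitca.embedder.ollama import OllamaEmbedder

# Parameter names are from the table above; every concrete value here
# (host URL, timeout, options dict) is an illustrative assumption.
embedder = OllamaEmbedder(
    model="openhermes",                # default model
    dimensions=4096,                   # default dimensionality
    host="http://localhost:11434",     # assumed local Ollama endpoint
    timeout=30,                        # assumed request timeout
    options={"num_ctx": 2048},         # assumed Ollama request option
)
```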
