

Trieve Vector Inference is compatible with the OpenAI API, so you can point your existing code at a new base URL without changing anything else. Here's an example with the openai Python SDK:
1. Install the dependencies

pip install openai requests python-dotenv
2. Update the OpenAI base URL

Replace base_url with your embedding endpoint.
openai_compatibility.py
import openai
import os
from dotenv import load_dotenv

load_dotenv()

# Your Trieve Vector Inference ingress endpoint
endpoint = "http://<your-ingress-endpoint-here>"

client = openai.OpenAI(
    # This is the default and can be omitted
    api_key=os.environ.get("OPENAI_API_KEY"),
    base_url=endpoint
)

# The response follows the OpenAI embeddings schema:
# the vector is at embedding.data[0].embedding
embedding = client.embeddings.create(
    input="This is some example input",
    model="BAAI/bge-m3"
)
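A common next step is comparing the returned vectors with cosine similarity. This is a minimal, self-contained sketch in plain Python (no server required); the short 3-element vectors are illustrative stand-ins, as real BAAI/bge-m3 embeddings are much higher-dimensional:

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Illustrative vectors; in practice, pass embedding.data[0].embedding
# from two embeddings.create calls.
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
print(cosine_similarity(v1, v2))  # identical vectors score 1.0
```

Higher scores mean the two inputs are semantically closer, which is the basis for most retrieval and ranking built on these embeddings.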