How to integrate TVI with existing OpenAI compatible endpoints
Trieve Vector Inference is compatible with the OpenAI API. This means you can simply swap the endpoint without changing any pre-existing code. Here's an example with the OpenAI Python SDK:
1
Install the dependencies
```shell
pip install openai requests python-dotenv
```
2
Update OpenAI URL
Replace `base_url` with your embedding endpoint.
openai_compatibility.py
```python
import os

import openai
from dotenv import load_dotenv

load_dotenv()

endpoint = "http://<your-ingress-endpoint-here>"

client = openai.OpenAI(
    # This is the default and can be omitted
    api_key=os.environ.get("OPENAI_API_KEY"),
    base_url=endpoint,
)

embedding = client.embeddings.create(
    input="This is some example input",
    model="BAAI/bge-m3",
)
```
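The response follows the OpenAI embeddings schema, so the vector itself is available at `embedding.data[0].embedding`. A common next step is comparing two vectors with cosine similarity; below is a minimal, dependency-free sketch (the toy vectors are illustrative placeholders, not real model output):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product of the vectors
    # divided by the product of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# With real responses you would pass embedding.data[0].embedding;
# these short vectors just demonstrate the call.
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.25]
print(cosine_similarity(v1, v2))
```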