Basic Environment Variables
Configure your acontext core services using these essential environment variables. All environment variables use uppercase field names corresponding to the configuration schema.

LLM Configuration
API key for your LLM provider (OpenAI or Anthropic). This is the primary authentication credential for AI model access.
Custom base URL for LLM API endpoints. Leave unset to use the provider’s default endpoint.
LLM provider to use. Supported values: openai, anthropic.

Default model identifier for LLM operations. Examples: gpt-4, gpt-3.5-turbo, claude-3-sonnet.

Timeout in seconds for LLM API responses. Increase for longer operations.
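The LLM settings above can be read at startup roughly like this. This is a minimal sketch: only `LLM_API_KEY` is named in this document, so the other variable names (`LLM_BASE_URL`, `LLM_PROVIDER`, `LLM_DEFAULT_MODEL`, `LLM_TIMEOUT`) are illustrative placeholders following the uppercase naming convention, not confirmed schema fields.

```python
import os

def load_llm_config() -> dict:
    # LLM_API_KEY is named in this document; the other variable
    # names below are illustrative placeholders, not confirmed names.
    return {
        "api_key": os.environ.get("LLM_API_KEY"),    # primary credential
        "base_url": os.environ.get("LLM_BASE_URL"),  # None -> provider default
        "provider": os.environ.get("LLM_PROVIDER", "openai"),
        "model": os.environ.get("LLM_DEFAULT_MODEL", "gpt-4"),
        "timeout": float(os.environ.get("LLM_TIMEOUT", "60")),
    }

os.environ["LLM_PROVIDER"] = "anthropic"
os.environ["LLM_TIMEOUT"] = "120"
cfg = load_llm_config()
print(cfg["provider"], cfg["timeout"])
```

Unset variables fall back to defaults, matching the "leave unset to use the provider's default" behavior described above.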
Embedding Configuration
Embedding provider for vector operations. Supported values: openai, jina.

Embedding model to use for generating vectors. Examples: text-embedding-3-small, text-embedding-ada-002.

Dimension size for embedding vectors. Must match your chosen embedding model’s output dimensions.

Separate API key for embedding service. If not set, uses LLM_API_KEY.

Custom base URL for embedding API endpoints. Leave unset to use the provider’s default.
Cosine distance threshold for embedding similarity searches. Lower values enforce stricter matching.
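How the threshold filters matches can be sketched as follows. The threshold value here is illustrative only; this document does not state a default.

```python
import math

def cosine_distance(a, b):
    # 1 - cosine similarity: 0.0 = identical direction, 2.0 = opposite.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

THRESHOLD = 0.3  # illustrative value, not a documented default

query = [1.0, 0.0]
close = [0.9, 0.1]  # nearly the same direction as the query
far = [0.0, 1.0]    # orthogonal to the query

print(cosine_distance(query, close) <= THRESHOLD)  # accepted as a match
print(cosine_distance(query, far) <= THRESHOLD)    # rejected
```

Lowering the threshold shrinks the accepted region, which is why lower values mean stricter matching.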
.env Examples
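A hedged sketch of what such a .env file might look like. Only LLM_API_KEY is named in this document; every other variable name and value below is an illustrative placeholder, not a confirmed default.

```
# LLM settings (LLM_API_KEY is the only name confirmed above;
# the rest are illustrative placeholders)
LLM_API_KEY=your-api-key
LLM_PROVIDER=openai
LLM_DEFAULT_MODEL=gpt-4

# Embedding settings (placeholder names; dimension must match the model)
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=text-embedding-3-small
EMBEDDING_DIM=1536
```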
Appendix
Ollama Setup Instructions
1. Install Ollama: go to Ollama to download and install Ollama.

2. Start Ollama.

3. Enable OpenAI compatibility: Ollama automatically provides OpenAI-compatible endpoints at http://localhost:11434/v1.

Local LLM setups are perfect for development, privacy-sensitive applications, or when you want to avoid API costs. Ollama provides OpenAI-compatible APIs, making integration seamless.
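OpenAI compatibility means a request to Ollama has the same shape as one to OpenAI, just aimed at the local base URL. A minimal stdlib sketch, assuming the endpoint from the steps above; "llama3" is an illustrative model name, not one this document prescribes:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible base URL, from the setup steps above.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    # OpenAI-style chat completion payload, pointed at the local server.
    payload = json.dumps({
        "model": model,  # illustrative model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("llama3", "Hello!")
print(req.full_url)
# Actually sending it requires a running Ollama server:
# urllib.request.urlopen(req)
```

Because only the base URL changes, any OpenAI-style client can be redirected at the local server the same way.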