Acontext

Core Dependencies

Configure LLM providers, embedding models, and core services

Basic Environment Variables

Configure Acontext's core services using these essential environment variables. All variable names are uppercase and correspond to fields in the configuration schema.

LLM Configuration

LLM_API_KEY (string, required)

API key for your LLM provider (OpenAI or Anthropic). This is the primary authentication credential for AI model access.

LLM_BASE_URL (string)

Custom base URL for LLM API endpoints. Leave unset to use the provider's default endpoint.

LLM_SDK (string)

LLM provider to use. Supported values: openai, anthropic

LLM_SIMPLE_MODEL (string)

Default model identifier for LLM operations. Examples: gpt-4, gpt-3.5-turbo, claude-3-sonnet

LLM_RESPONSE_TIMEOUT (float)

Timeout in seconds for LLM API responses. Increase for longer operations.
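As a rough illustration of how these variables fit together, the sketch below reads them into a plain dict. This is a hypothetical helper, not Acontext's actual loader; the fallback defaults (`openai`, `gpt-4`, `60` seconds) are assumptions chosen to mirror the examples later in this page.

```python
import os

def load_llm_config() -> dict:
    """Read the LLM_* environment variables into a plain dict.

    Hypothetical helper for illustration; the defaults are assumptions.
    """
    api_key = os.environ.get("LLM_API_KEY")
    if not api_key:
        raise RuntimeError("LLM_API_KEY is required")
    return {
        "api_key": api_key,
        # None means "use the provider's default endpoint"
        "base_url": os.environ.get("LLM_BASE_URL"),
        # Supported values: openai, anthropic
        "sdk": os.environ.get("LLM_SDK", "openai"),
        "model": os.environ.get("LLM_SIMPLE_MODEL", "gpt-4"),
        # Parsed as a float, matching the schema's type
        "timeout": float(os.environ.get("LLM_RESPONSE_TIMEOUT", "60")),
    }
```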

Embedding Configuration

BLOCK_EMBEDDING_PROVIDER (string)

Embedding provider for vector operations. Supported values: openai, jina

BLOCK_EMBEDDING_MODEL (string)

Embedding model to use for generating vectors. Examples: text-embedding-3-small, text-embedding-ada-002

BLOCK_EMBEDDING_DIM (integer)

Dimension size for embedding vectors. Must match your chosen embedding model's output dimensions.

BLOCK_EMBEDDING_API_KEY (string)

Separate API key for the embedding service. If not set, LLM_API_KEY is used.

BLOCK_EMBEDDING_BASE_URL (string)

Custom base URL for embedding API endpoints. Leave unset to use the provider's default.

BLOCK_EMBEDDING_SEARCH_COSINE_DISTANCE_THRESHOLD (float)

Cosine distance threshold for embedding similarity searches. Lower values enforce stricter matching.
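To make the threshold concrete, here is a minimal sketch (not Acontext's internal search code) of how a cosine-distance cutoff filters candidate vectors: a distance of 0 means identical direction, and candidates at or below the threshold are kept.

```python
import math

def cosine_distance(a, b):
    """Cosine distance = 1 - cosine similarity; 0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def filter_by_threshold(query, candidates, threshold):
    """Keep only candidates within the cosine-distance threshold of the query."""
    return [c for c in candidates if cosine_distance(query, c) <= threshold]
```

With a threshold of 0.5, a candidate pointing the same way as the query (distance 0.0) and one at 45 degrees (distance ~0.29) pass, while an orthogonal candidate (distance 1.0) is rejected.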

Be careful when choosing your embedding model. Changing the embedding model after data has been stored will require you to clean and rebuild your databases, as existing vector embeddings will be incompatible with the new model's output format and dimensions.
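A defensive check along these lines (hypothetical, not part of Acontext) can catch a model/dimension mismatch before incompatible vectors are written to storage:

```python
import os

def check_embedding_dim(vector):
    """Raise if an embedding's length disagrees with BLOCK_EMBEDDING_DIM.

    Illustrative guard, not Acontext's actual validation logic.
    """
    expected = int(os.environ["BLOCK_EMBEDDING_DIM"])
    if len(vector) != expected:
        raise ValueError(
            f"embedding has {len(vector)} dimensions, expected {expected}; "
            "changing embedding models requires rebuilding stored vectors"
        )
```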

.env Examples

# Required LLM Configuration
LLM_API_KEY=sk-your-openai-api-key-here

# Optional LLM Settings
LLM_SDK=openai
LLM_SIMPLE_MODEL=gpt-4
LLM_RESPONSE_TIMEOUT=60

# Embedding Configuration
BLOCK_EMBEDDING_PROVIDER=openai
BLOCK_EMBEDDING_MODEL=text-embedding-3-small
BLOCK_EMBEDDING_DIM=1536
BLOCK_EMBEDDING_SEARCH_COSINE_DISTANCE_THRESHOLD=0.8

# Using Anthropic Claude
LLM_API_KEY=your-anthropic-api-key
LLM_SDK=anthropic
LLM_SIMPLE_MODEL=claude-3-sonnet-20240229

# Keep OpenAI for embeddings (recommended)
BLOCK_EMBEDDING_PROVIDER=openai
BLOCK_EMBEDDING_API_KEY=sk-your-openai-key-for-embeddings

# Custom LLM endpoint (e.g., Azure OpenAI)
LLM_API_KEY=your-azure-key
LLM_BASE_URL=https://your-resource.openai.azure.com/
LLM_SDK=openai

# Custom embedding endpoint
BLOCK_EMBEDDING_API_KEY=your-embedding-key
BLOCK_EMBEDDING_BASE_URL=https://api.jina.ai/v1/embeddings
BLOCK_EMBEDDING_PROVIDER=jina

# Ollama server running locally
LLM_API_KEY=dummy-key-not-required
LLM_BASE_URL=http://localhost:11434/v1
LLM_SDK=openai
LLM_SIMPLE_MODEL=qwen3:8b

# Local embedding with Ollama
BLOCK_EMBEDDING_PROVIDER=openai
BLOCK_EMBEDDING_API_KEY=dummy-key
BLOCK_EMBEDDING_BASE_URL=http://localhost:11434/v1
BLOCK_EMBEDDING_MODEL=qwen3-embedding:0.6b
BLOCK_EMBEDDING_DIM=1024
