🔧 Core Configuration

  • CAN_CHANGE_KEYS Controls whether users can view or modify API keys via the interface. Set to "false" to keep keys hidden and locked, or "true" to allow modification. Example:
    CAN_CHANGE_KEYS="false"
    
  • LLM Select the Large Language Model (LLM) provider to use. Supported values: "openai", "google", "anthropic", "ollama", "custom". Example:
    LLM="openai"
    
  • IMAGE_PROVIDER Select the image provider to use. Supported values: "pexels", "dall-e-3", "gemini_flash", "pixabay". Example:
    IMAGE_PROVIDER="pexels"
    
  • DATABASE_URL (optional) Defines the external database connection URL. If not provided, the application will default to using SQLite for local storage. Supports both PostgreSQL and MySQL connection strings. Examples:
    DATABASE_URL="postgresql://user:password@host:port/dbname"
    DATABASE_URL="mysql://user:password@host:port/dbname"
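
Putting the core variables together, a minimal `.env` might look like this (all values below are placeholders, not real credentials):

```shell
# Core configuration (placeholder values)
LLM="openai"
IMAGE_PROVIDER="pexels"
CAN_CHANGE_KEYS="false"
# DATABASE_URL is optional; omit it to fall back to local SQLite
DATABASE_URL="postgresql://user:password@host:5432/dbname"
```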
    

🧠 Model Provider Specific Variables

  • WEB_GROUNDING (Optional) Uses web search and other tools to improve presentation quality. Supported LLM providers: "openai", "google", "anthropic". Example:
    WEB_GROUNDING="true"

🔹 OpenAI

  • OPENAI_API_KEY Required if LLM="openai". Example:
    OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxx"
    
  • OPENAI_MODEL (Optional) Defaults to gpt-4.1 if not specified. Example:
    OPENAI_MODEL="gpt-4o"
    

🔹 Google

  • GOOGLE_API_KEY Required if LLM="google". Example:
    GOOGLE_API_KEY="AIzaSyXXXXXXXXXXXX"
    
  • GOOGLE_MODEL (Optional) Defaults to models/gemini-2.0-flash if not specified. Example:
    GOOGLE_MODEL="models/gemini-1.5-pro"
    
⚠️ Image generation with Google is not supported in EU regions.

🔹 Anthropic

  • ANTHROPIC_API_KEY Required if LLM="anthropic". Example:
    ANTHROPIC_API_KEY="sk-ant-xxxxxxxxxxxx"
    
  • ANTHROPIC_MODEL (Optional) Defaults to claude-3-5-sonnet-20241022 if not specified. Example:
    ANTHROPIC_MODEL="claude-3-opus-20240229"
    

🔹 Ollama

  • OLLAMA_URL (Optional) URL of your custom Ollama server. Useful if you’re self-hosting. Example:
    OLLAMA_URL="http://localhost:11434"
    
  • OLLAMA_MODEL Required if LLM="ollama". Example:
    OLLAMA_MODEL="llama3.2:3b"
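
If you point OLLAMA_URL at a self-hosted server, the model named in OLLAMA_MODEL must already be available on that server. With the standard Ollama CLI and REST API, that setup might look like this (URL and model name are the example values from above; adjust for your host):

```shell
# Pull the model referenced by OLLAMA_MODEL onto the Ollama server
ollama pull llama3.2:3b

# Verify the server is reachable and lists the model
# (GET /api/tags returns the locally available models)
curl http://localhost:11434/api/tags
```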
    

🔹 Custom (OpenAI-compatible LLMs)

  • CUSTOM_LLM_URL Required if LLM="custom". Example:
    CUSTOM_LLM_URL="https://api.your-custom-llm.com/v1"
    
  • CUSTOM_LLM_API_KEY (Optional) API key for your custom endpoint when LLM="custom"; supply it only if the endpoint requires authentication. Example:
    CUSTOM_LLM_API_KEY="your_custom_key"
    
  • CUSTOM_MODEL Required if LLM="custom". Example:
    CUSTOM_MODEL="llama3.2:3b"
    
  • TOOL_CALL (Optional) When LLM="custom", uses tool calling for structured outputs instead of JSON Schema. Example:
    TOOL_CALL="true"
    
  • DISABLE_THINKING (Optional) When LLM="custom", disables thinking output for custom models. Example:
    DISABLE_THINKING="true"
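
As a combined sketch, a local OpenAI-compatible server could be wired up like this (the URL, key, and model name are assumptions about your setup, not defaults):

```shell
# Custom OpenAI-compatible endpoint (placeholder values)
LLM="custom"
CUSTOM_LLM_URL="http://localhost:8000/v1"
CUSTOM_LLM_API_KEY="your_custom_key"
CUSTOM_MODEL="llama3.2:3b"
# Optional toggles for endpoints without JSON Schema support
# or for models that emit thinking output
TOOL_CALL="true"
DISABLE_THINKING="true"
```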
    


🖼️ Image Providers

  • PEXELS_API_KEY (Optional) Used to fetch high-quality stock images from Pexels. Example:
    PEXELS_API_KEY="vzXXXXXXXXXXXXXX"
    
  • OPENAI_API_KEY (Optional) Used to generate images using DALL·E 3 via the OpenAI API. Example:
    OPENAI_API_KEY="sk-XXXXXXXXXXXXXXXX"
    
  • GOOGLE_API_KEY (Optional) Used to access Gemini Flash Image. Example:
    GOOGLE_API_KEY="AIzaSyXXXXXXXXXXXXXX"
    
  • PIXABAY_API_KEY (Optional) Used to fetch stock images from Pixabay. Example:
    PIXABAY_API_KEY="3883XXXXXXXXXXXXX"
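
Each IMAGE_PROVIDER value pairs with one of the keys above; for instance (placeholder key):

```shell
# "pexels"       -> PEXELS_API_KEY
# "dall-e-3"     -> OPENAI_API_KEY
# "gemini_flash" -> GOOGLE_API_KEY
# "pixabay"      -> PIXABAY_API_KEY
IMAGE_PROVIDER="dall-e-3"
OPENAI_API_KEY="sk-XXXXXXXXXXXXXXXX"
```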
    

🐳 Docker Example

docker run -it --name presenton -p 5000:80 \
  -e LLM="ollama" \
  -e OLLAMA_MODEL="llama3.2:3b" \
  -e OLLAMA_URL="http://localhost:11434" \
  -e CAN_CHANGE_KEYS="false" \
  -e IMAGE_PROVIDER="pexels" \
  -e PEXELS_API_KEY="your_pexels_key" \
  -v "./app_data:/app_data" \
  ghcr.io/presenton/presenton:latest
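
The same pattern works for hosted providers; for instance, an OpenAI-backed run might look like this (the API key is a placeholder):

```shell
docker run -it --name presenton -p 5000:80 \
  -e LLM="openai" \
  -e OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxx" \
  -e IMAGE_PROVIDER="dall-e-3" \
  -e CAN_CHANGE_KEYS="false" \
  -v "./app_data:/app_data" \
  ghcr.io/presenton/presenton:latest
```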