## 🌐 Example: Run Presenton with a Custom LLM

```bash
docker run -it --name presenton -p 5000:80 \
  -e LLM="custom" \
  -e CUSTOM_LLM_URL="http://XXXXXXXXXXX/v1" \
  -e CUSTOM_LLM_API_KEY="your_custom_api_key" \
  -e CUSTOM_MODEL="your-model-name" \
  -e CAN_CHANGE_KEYS="false" \
  -v "./user_data:/app/user_data" \
  ghcr.io/presenton/presenton:latest
```
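
Before starting the container, it can help to confirm that your endpoint actually speaks the OpenAI-compatible chat API with the same URL, key, and model you plan to pass in. Below is a minimal smoke test; the URL, key, and model name are the same placeholders used above, not real values.

```bash
# Hypothetical smoke test: send a one-message chat completion to the endpoint
# Presenton will use. Replace the placeholder URL, key, and model with your own.
curl -sS "http://XXXXXXXXXXX/v1/chat/completions" \
  -H "Authorization: Bearer your_custom_api_key" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "your-model-name",
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 8
      }'
```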

## 🔧 Environment Variables for Custom LLM

| Variable | Description |
| --- | --- |
| `LLM` | Set to `custom` to enable OpenAI-compatible API support |
| `CUSTOM_LLM_URL` | Base URL of your OpenAI-compatible API (e.g. `http://XXXXXXXXXXX/v1`) |
| `CUSTOM_LLM_API_KEY` | API key used for authorization (sent as a `Bearer` header) |
| `CUSTOM_MODEL` | ID of the model to use (as defined by your API provider) |
| `PEXELS_API_KEY` | (Optional) Used to fetch high-quality images to enhance presentations |
| `CAN_CHANGE_KEYS` | Set to `false` to hide API keys from the frontend |
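
If you are unsure which ID to use for `CUSTOM_MODEL`, many OpenAI-compatible servers also expose a models listing endpoint. A quick way to check, assuming your provider implements `GET /v1/models` (not all do):

```bash
# Hypothetical check: list model IDs exposed by the OpenAI-compatible server
# so you can pick a valid CUSTOM_MODEL value.
curl -sS "http://XXXXXXXXXXX/v1/models" \
  -H "Authorization: Bearer your_custom_api_key"
```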