Empirical supports a set of popular LLM inference providers out-of-the-box. These can be specified as the `provider` in the Empirical configuration file.
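
For example, a model run in the configuration file (typically `empiricalrc.json`) sets `provider` alongside the model to test. A minimal sketch, where the `model` and `prompt` values are placeholders:

```json
{
  "runs": [
    {
      "type": "model",
      "provider": "openai",
      "model": "gpt-3.5-turbo",
      "prompt": "Extract the name from the following text: {{user_message}}"
    }
  ]
}
```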

Supported providers

| Provider | Description |
| --- | --- |
| `openai` | All chat models are supported. Requires the `OPENAI_API_KEY` environment variable. |
| `azure-openai` | All OpenAI chat models hosted on Azure are supported. Requires `AZURE_OPENAI_API_KEY` and either `AZURE_OPENAI_RESOURCE_NAME` or `AZURE_OPENAI_BASE_URL`. |
| `anthropic` | Claude 3 models are supported. Requires the `ANTHROPIC_API_KEY` environment variable. |
| `mistral` | All chat models are supported. Requires the `MISTRAL_API_KEY` environment variable. |
| `google` | Gemini Pro models are supported. Requires the `GOOGLE_API_KEY` environment variable. |
| `fireworks` | Models hosted on Fireworks (e.g. `dbrx-instruct`) are supported. Requires the `FIREWORKS_API_KEY` environment variable. |
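
Since each run declares its own `provider`, a single configuration can compare models across providers side-by-side. A sketch with illustrative model names:

```json
{
  "runs": [
    {
      "type": "model",
      "provider": "openai",
      "model": "gpt-4",
      "prompt": "Summarize: {{user_message}}"
    },
    {
      "type": "model",
      "provider": "anthropic",
      "model": "claude-3-haiku-20240307",
      "prompt": "Summarize: {{user_message}}"
    }
  ]
}
```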

Environment variables

API calls to model providers require API keys, which are stored as environment variables. The CLI can work with:

- Existing environment variables (via `process.env`)
- Environment variables defined in `.env` or `.env.local` files in the current working directory (see the sample `.env` file after this list)
  - For `.env` files located elsewhere, pass the `--env-file` flag:

```sh
npx empiricalrun --env-file <PATH_TO_ENV_FILE>
```
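
A `.env` file is a plain list of `KEY=value` pairs. For example, with placeholder values:

```sh
OPENAI_API_KEY=sk-xxxxxxxx
ANTHROPIC_API_KEY=sk-ant-xxxxxxxx
```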