# Hosted models
LLM providers supported out-of-the-box
Empirical supports a set of popular LLM inference providers out-of-the-box. These can be specified as `provider` in the Empirical configuration file.
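For example, a run entry in the configuration file selects one of the providers listed below together with a model. The following is a minimal sketch, assuming the configuration file is named `empiricalrc.json` and uses a `runs` array as in Empirical's examples; the model name is illustrative:

```json
{
  "runs": [
    {
      "type": "model",
      "provider": "openai",
      "model": "gpt-3.5-turbo"
    }
  ]
}
```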
## Supported providers
| Provider | Description |
|---|---|
| `openai` | All chat models are supported. Requires the `OPENAI_API_KEY` environment variable. |
| `azure-openai` | All chat models from OpenAI that are hosted on Azure are supported. Requires the `AZURE_OPENAI_API_KEY` environment variable, along with either `AZURE_OPENAI_RESOURCE_NAME` or `AZURE_OPENAI_BASE_URL`. |
| `anthropic` | Claude 3 models are supported. Requires the `ANTHROPIC_API_KEY` environment variable. |
| `mistral` | All chat models are supported. Requires the `MISTRAL_API_KEY` environment variable. |
| `google` | Gemini Pro models are supported. Requires the `GOOGLE_API_KEY` environment variable. |
| `fireworks` | Models hosted on Fireworks (e.g. `dbrx-instruct`) are supported. Requires the `FIREWORKS_API_KEY` environment variable. |
## Environment variables
API calls to model providers require API keys, which are stored as environment variables. The CLI can work with:
- Existing environment variables (using `process.env`)
- Environment variables defined in `.env` or `.env.local` files in the current working directory
- For `.env` files that are located elsewhere, you can pass the `--env-file` flag
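For instance, a `.env` file in the current working directory might define the keys for the providers in use (values elided; the variable names come from the table above):

```sh
# .env file, picked up automatically from the current working directory
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
```

If the file lives elsewhere, pass its path with the `--env-file` flag when invoking the CLI, e.g. `--env-file ./env/.env.staging` (the path here is only illustrative).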