# Model Aliases
Model aliases allow you to create shorthand names for models you frequently use. This is particularly useful for models with long names or when you want to standardize model usage across your team.
## Command Line Usage
You can define aliases when launching cecli using the `--alias` option:

```bash
cecli --alias "fast:gpt-5-mini" --alias "smart:o3-mini"
```
Multiple aliases can be defined by repeating the `--alias` option. Each alias definition must use the format `alias:model-name`.
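The `alias:model-name` format splits on the first colon only, so the model side can contain further structure. A minimal Python sketch of that parsing (hypothetical helper names, not cecli's internals):

```python
def parse_alias(definition: str) -> tuple[str, str]:
    """Split an alias definition of the form "alias:model-name".

    Only the first colon separates alias from model name.
    """
    alias, sep, model = definition.partition(":")
    if not sep or not alias or not model:
        raise ValueError(f"expected alias:model-name, got {definition!r}")
    return alias, model

# Build the alias table from repeated --alias values.
aliases = dict(parse_alias(d) for d in ["fast:gpt-5-mini", "smart:o3-mini"])
print(aliases["fast"])  # gpt-5-mini
```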
## Configuration File
You can also define aliases in your `.cecli.conf.yml` file:
```yaml
alias:
  - "fast:gpt-5-mini"
  - "smart:o3-mini"
  - "hacker:claude-3-sonnet-20240229"
```
## Using Aliases
Once defined, you can use the alias instead of the full model name from the command line:
```bash
cecli --model fast   # Uses gpt-5-mini
cecli --model smart  # Uses o3-mini
```
Or with the /model command in-chat:
```
cecli v0.75.3
Main model: anthropic/claude-3-7-sonnet-20250219 with diff edit format, prompt cache, infinite output
Weak model: claude-3-5-sonnet-20241022
Git repo: .git with 406 files
Repo-map: using 4096 tokens, files refresh
─────────────────────────────────────────────────────────────
> /model fast

cecli v0.75.3
Main model: gpt-5-mini with diff edit format
─────────────────────────────────────────────────────────────
diff> /model smart

cecli v0.75.3
Main model: o3-mini with diff edit format
─────────────────────────────────────────────────────────────
>
```
## Built-in Aliases
cecli includes some built-in aliases for convenience:
- `3`: gpt-3.5-turbo
- `35-turbo`: gpt-3.5-turbo
- `35turbo`: gpt-3.5-turbo
- `4`: gpt-4-0613
- `4-turbo`: gpt-4-1106-preview
- `4o`: gpt-4o
- `5`: gpt-5
- `deepseek`: deepseek/deepseek-chat
- `flash`: gemini/gemini-2.5-flash
- `flash-lite`: gemini/gemini-2.5-flash-lite
- `gemini`: gemini/gemini-3-pro-preview
- `gemini-2.5-pro`: gemini/gemini-2.5-pro
- `gemini-3-pro-preview`: gemini/gemini-3-pro-preview
- `gemini-exp`: gemini/gemini-2.5-pro-exp-03-25
- `grok3`: xai/grok-3-beta
- `haiku`: claude-3-5-haiku-20241022
- `optimus`: openrouter/openrouter/optimus-alpha
- `opus`: claude-opus-4-20250514
- `quasar`: openrouter/openrouter/quasar-alpha
- `r1`: deepseek/deepseek-reasoner
- `sonnet`: anthropic/claude-sonnet-4-20250514
## Advanced Model Settings
cecli supports model names with colon-separated suffixes (e.g., `gpt-5:high`) that map to additional configuration parameters defined in the relevant config.yml file. This allows you to create named configurations for different use cases. These configurations map directly to the parameters of the LiteLLM `completion()` method, though additional parameters are supported for specific models and providers. Any key under the `model_settings` key will override the model parameters defined in files like `.cecli.model.settings.yml`.
### Configuration File
Add a structure like the following to your config.yml file, or create a `.cecli.model.overrides.yml` file (or point at a different file with `--model-overrides-file`, for example if you want to share global defaults):
```yaml
model-overrides:
  gpt-5:
    high:        # Use with: --model gpt-5:high
      temperature: 0.8
      top_p: 0.9
      extra_body:
        reasoning_effort: high
    low:         # Use with: --model gpt-5:low
      temperature: 0.2
      top_p: 0.5
    creative:    # Use with: --model gpt-5:creative
      temperature: 0.9
      top_p: 0.95
      frequency_penalty: 0.5
  claude-3-5-sonnet:
    fast:        # Use with: --model claude-3-5-sonnet:fast
      temperature: 0.3
    detailed:    # Use with: --model claude-3-5-sonnet:detailed
      temperature: 0.7
      thinking_tokens: 4096
```
### Usage
You can use these suffixes with any model argument:
```bash
# Main model with high reasoning effort (using a file)
cecli --model gpt-5:high --model-overrides-file .cecli.model.overrides.yml

# Main model with high reasoning effort (using inline JSON/YAML)
cecli --model gpt-5:high --model-overrides '{"gpt-5": {"high": {"temperature": 0.8, "top_p": 0.9, "extra_body": {"reasoning_effort": "high"}}}}'

# Different configurations for main and weak models
cecli --model claude-3-5-sonnet:detailed --weak-model claude-3-5-sonnet:fast

# Editor model with creative settings
cecli --model gpt-5 --editor-model gpt-5:creative
```
### How It Works
- When you specify a model with a suffix (e.g., `gpt-5:high`), cecli splits it into the base model name (`gpt-5`) and the suffix (`high`).
- It looks up the suffix in the overrides configuration for that model.
- The corresponding configuration parameters are applied to the model's API calls.
- The parameters are deep-merged into the model's existing settings, with overrides taking precedence.
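The split-and-merge steps above can be sketched in Python. This is a simplified illustration with hypothetical helpers (`deep_merge`, `apply_suffix`), not cecli's actual code:

```python
def deep_merge(base: dict, overrides: dict) -> dict:
    """Recursively merge overrides into base; override values win."""
    merged = dict(base)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# Overrides table, as it would be loaded from the YAML above.
model_overrides = {
    "gpt-5": {"high": {"temperature": 0.8, "extra_body": {"reasoning_effort": "high"}}}
}

def apply_suffix(model: str, settings: dict) -> tuple[str, dict]:
    """Split base:suffix, then deep-merge the matching overrides."""
    base, _, suffix = model.partition(":")
    overrides = model_overrides.get(base, {}).get(suffix, {})
    return base, deep_merge(settings, overrides)

base, settings = apply_suffix("gpt-5:high", {"temperature": 0.2, "extra_body": {"seed": 1}})
print(base, settings)
# gpt-5 {'temperature': 0.8, 'extra_body': {'seed': 1, 'reasoning_effort': 'high'}}
```

Note that nested keys such as `extra_body` are merged rather than replaced, so existing settings survive unless an override names the same key.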
### Default Overrides
In addition to suffix-based overrides, you can define default overrides that apply directly to a model by name, without requiring a colon-separated suffix. Use the special `defaults` key within your `model-overrides` configuration:
```yaml
model-overrides:
  defaults:
    gpt-5:
      temperature: 0.7
      top_p: 0.9
    claude-3-5-sonnet:
      temperature: 1
      model_settings:
        cache_control: true
```
When you run cecli --model gpt-5, the default overrides specified under defaults are applied automatically. This is useful for setting baseline parameters for specific models without creating a named configuration.
Default overrides work alongside suffix-based overrides. If both a default override and a suffix override match the same parameter, the suffix override takes precedence:
```bash
# Applies default overrides for gpt-5
cecli --model gpt-5

# Applies suffix-based overrides for gpt-5:high, merged on top of the defaults
cecli --model gpt-5:high
```
For example:

```yaml
model-overrides:
  defaults:
    gpt-5:
      temperature: 0.7
  gpt-5:
    high:
      temperature: 0.9  # Overrides the default of 0.7
```
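That precedence can be traced with a small Python sketch (a hypothetical helper mirroring the YAML above, not cecli's implementation):

```python
# Overrides table as loaded from the YAML above.
overrides = {
    "defaults": {"gpt-5": {"temperature": 0.7}},
    "gpt-5": {"high": {"temperature": 0.9}},
}

def effective_settings(model: str) -> dict:
    """Apply defaults first, then let any :suffix overrides win."""
    base, _, suffix = model.partition(":")
    settings = dict(overrides.get("defaults", {}).get(base, {}))
    settings.update(overrides.get(base, {}).get(suffix, {}))
    return settings

print(effective_settings("gpt-5"))       # {'temperature': 0.7}
print(effective_settings("gpt-5:high"))  # {'temperature': 0.9}
```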
### Combining with Aliases

Model overrides work alongside aliases. For example, you can use:

- `cecli --model fast:high` (if `fast` is an alias for `gpt-5-mini`)
- `cecli --model sonnet:detailed` (if `sonnet` is an alias for `anthropic/claude-sonnet-4-20250514`)
The suffix is applied after alias resolution.
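That order — resolve the alias, then apply the suffix — can be sketched as follows (hypothetical helper, not cecli's code):

```python
aliases = {"fast": "gpt-5-mini", "sonnet": "anthropic/claude-sonnet-4-20250514"}

def resolve(model: str) -> tuple[str, str]:
    """Resolve the alias part of the name, then split off any :suffix."""
    name, _, suffix = model.partition(":")
    return aliases.get(name, name), suffix

print(resolve("fast:high"))  # ('gpt-5-mini', 'high')
```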
## Priority
If the same alias is defined in multiple places, the priority is:
1. Command line aliases (highest priority)
2. Configuration file aliases
3. Built-in aliases (lowest priority)
This allows you to override built-in aliases with your own preferences.
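The lookup order can be sketched as a chain of sources checked highest-priority first (a hypothetical illustration; the alias values for the CLI layer are invented here):

```python
builtin = {"sonnet": "anthropic/claude-sonnet-4-20250514"}
config_file = {"fast": "gpt-5-mini"}
command_line = {"sonnet": "claude-opus-4-20250514"}  # hypothetical user override

def lookup(alias: str):
    # Earlier sources shadow later ones: CLI > config file > built-in.
    for source in (command_line, config_file, builtin):
        if alias in source:
            return source[alias]
    return None

print(lookup("sonnet"))  # the command-line value wins over the built-in
```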
Model overrides with suffixes provide an additional layer of configuration that works alongside aliases, giving you fine-grained control over model parameters for different use cases.