It would be very convenient to allow defining an alias such as `o3-mini-high` that is an alias for `o3-mini` but also automatically supplies `-o reasoning high`. In particular, it makes it easier to use shell completions, and to swap one model for another when re-running a previous `llm` command from bash/shell history to compare responses: you only edit the `-m` value instead of also adding or removing three extra arguments. That matters especially when comparing, say, o3-mini-medium vs gemini-2.5 vs o3-mini-high, where the base model is not the same.
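The closest workaround I can see today is a shell wrapper along these lines (the `llm_alias` name is arbitrary, and the `-o reasoning` option is just the one used above), but that sits outside `llm`'s own completions and model listing, which is exactly the friction this request is about:

```bash
# Hypothetical interim workaround, not part of llm itself: expand
# pseudo model ids into a base model plus the extra -o options.
llm_alias() {
  local model="$1"; shift
  case "$model" in
    o3-mini-high)   llm -m o3-mini -o reasoning high "$@" ;;
    o3-mini-medium) llm -m o3-mini -o reasoning medium "$@" ;;
    *)              llm -m "$model" "$@" ;;
  esac
}

# Usage: llm_alias o3-mini-high 'your prompt here'
```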
Alternatively, it could be defined as a separate model in a models `.yaml` instead of in the `alias.yaml`, for example:
```yaml
- model_id: o3-mini-high
  model_name: o3-mini
  model_args:
    - reasoning: high
- model_id: o3-mini-medium
  model_name: o3-mini
  model_args:
    - reasoning: medium
```
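With entries like these, comparing reasoning levels from shell history would only mean editing the `-m` value. These model ids don't exist yet; the invocations below just illustrate the intent:

```bash
llm -m o3-mini-medium 'Summarise the attached log'
llm -m o3-mini-high 'Summarise the attached log'
llm -m gemini-2.5 'Summarise the attached log'
```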