Model Select
Given an array of messages and the selection of LLMs you want to route between, /model-select returns a label identifying which LLM you should call.
Example usage:
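A minimal sketch in Python. The endpoint URL, the request field names, and the header shape are all assumptions for illustration; consult the official schema and your dashboard for the real values.

```python
import json

# Hypothetical endpoint URL -- check the Not Diamond docs/dashboard for the real one.
API_URL = "https://api.notdiamond.ai/model-select"


def build_model_select_request(token, messages, llm_providers):
    """Assemble headers and a JSON body for a /model-select call (illustrative sketch).

    The "messages" and "llm_providers" keys are assumed names, not confirmed schema.
    """
    headers = {
        "Authorization": f"Bearer {token}",  # Bearer auth, per the Authorizations section
        "Content-Type": "application/json",
    }
    body = {
        "messages": messages,            # messages array in OpenAI format
        "llm_providers": llm_providers,  # candidate LLMs to route between
    }
    return headers, body


headers, body = build_model_select_request(
    "YOUR_AUTH_TOKEN",
    messages=[{"role": "user", "content": "Summarize this article."}],
    llm_providers=[
        {"provider": "openai", "model": "gpt-4o"},
        {"provider": "anthropic", "model": "claude-3-5-sonnet-20240620"},
    ],
)
print(json.dumps(body, indent=2))

# To actually send the request you could use, e.g.:
#   import requests
#   resp = requests.post(API_URL, headers=headers, json=body)
```

The helper only assembles the request; the POST itself is left as a comment so the sketch runs without network access.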
Authorizations
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
Body
The messages array in OpenAI format.
A list of LLM providers you want to route between. You can optionally define their custom attributes, such as price and latency.
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A maximum of 128 functions is supported.
The maximum depth considered in the ranked LLM providers list.
Additional routing tradeoffs. Valid values are "cost" and "latency".
The preference ID created on the Not Diamond dashboard. Setting this value changes the ranking algorithm to reflect the preferences you have set.
Whether to hash the content of messages.
The session ID from a previous API call. This allows you to link recommendations such that the recommendation engine knows which requests belong in the same group.
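The body parameters above can be sketched as a single request payload. Every key name below is an assumption chosen to mirror the descriptions; the real schema may use different names, so treat this as illustrative only.

```python
import json

# Illustrative request body covering the fields described above.
# All key names are assumptions, not confirmed schema.
body = {
    # Required: messages array in OpenAI format
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
    # Candidate LLMs, optionally with custom attributes such as price and latency
    "llm_providers": [
        {"provider": "openai", "model": "gpt-4o", "price": 5.0, "latency": 1.2},
        {"provider": "anthropic", "model": "claude-3-5-sonnet-20240620"},
    ],
    # OpenAI-style function tools (only functions are supported; max 128)
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                },
            },
        }
    ],
    "max_model_depth": 2,          # maximum depth in the ranked providers list
    "tradeoff": "cost",            # or "latency"
    "preference_id": "PREFERENCE_ID_FROM_DASHBOARD",
    "hash_content": True,          # hash message content
    "previous_session": "SESSION_ID_FROM_PRIOR_CALL",  # link related requests
}
print(json.dumps(body, indent=2))
```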
Response
Label of the LLM you should call.
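A sketch of handling the response. The response shape shown here (a ranked provider list plus a session ID) is an assumption for illustration, not the documented schema.

```python
# Hypothetical response payload -- the real schema may differ.
response = {
    "providers": [{"provider": "openai", "model": "gpt-4o"}],
    "session_id": "abc-123",  # pass as the previous session ID in follow-up calls
}

# Route the actual completion request to the recommended LLM.
choice = response["providers"][0]
print(f"Route to {choice['provider']}/{choice['model']}")
```

Reusing the returned session ID in subsequent calls lets the recommendation engine group related requests, as described under the body parameters.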