OpenRouter Integration

With the OpenRouter API, users can leverage OctoAI's best-in-class LLM endpoints.

OpenRouter provides a unified interface for using various LLMs and lets users find and compare models based on price, latency, and throughput. This lets users find the right LLM, and the right mix of price and performance, for their use case.

Using OctoAI’s LLMs and OpenRouter

To access OctoAI’s best-in-class LLMs via OpenRouter, sign in to OpenRouter and create an account to obtain an OPENROUTER_API_KEY.
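As a small sketch (the helper name is illustrative and not part of any SDK), you might load that key from your environment and fail fast if it is missing:

```typescript
// Illustrative helper (not part of the OpenRouter or OpenAI SDKs): read the
// OpenRouter key from the environment and raise a clear error if it is unset.
function getOpenRouterKey(
  env: Record<string, string | undefined> = process.env
): string {
  const key = env.OPENROUTER_API_KEY
  if (!key) {
    throw new Error("OPENROUTER_API_KEY is not set; create one in your OpenRouter account")
  }
  return key
}
```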

Using the code snippet below, you can route your calls to OpenRouter via OpenAI’s client API. Set the providers OpenRouter should use for your request with the order field. The router filters this list down to the providers that are available for the model you want to use, then tries them one at a time; the request fails if none are available.

If you do not set the field, the router will use the default ordering shown on the model page.
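The routing behavior described above can be sketched as a small function (the names here are illustrative; the actual routing happens on OpenRouter's servers):

```typescript
// Illustrative sketch of the provider-ordering rules described above.
function pickProviderOrder(
  requestedOrder: string[] | undefined, // the `order` field, if you set it
  defaultOrder: string[],               // default ordering from the model page
  availableForModel: Set<string>        // providers that serve this model
): string[] {
  // Fall back to the model page's default ordering when `order` is unset.
  const preference = requestedOrder ?? defaultOrder
  // Keep only providers that are available for the requested model.
  const candidates = preference.filter((p) => availableForModel.has(p))
  // The router tries candidates one at a time; with none left, the call fails.
  if (candidates.length === 0) {
    throw new Error("No provider in the list is available for this model")
  }
  return candidates
}
```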

In the following code snippet, we assume the order of preference is as follows: OctoAI, Azure, then TogetherAI.

import OpenAI from "openai"

const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: $OPENROUTER_API_KEY,
  defaultHeaders: {
    "HTTP-Referer": $YOUR_SITE_URL, // Optional, for including your app on rankings.
    "X-Title": $YOUR_SITE_NAME, // Optional. Shows in rankings on openrouter.ai.
  },
  // dangerouslyAllowBrowser: true, // Only enable if calling from a browser.
})

async function main() {
  const completion = await openai.chat.completions.create({
    model: "mistralai/mixtral-8x7b-instruct",
    messages: [
      { role: "user", content: "Say this is a test" }
    ],
    provider: { // OpenRouter-specific request field
      order: ["OctoAI", "Azure", "Together"]
    }
  })
  console.log(completion.choices[0].message)
}

main()

Find all models from OctoAI you can use at