OpenRouter

Properties used to connect to OpenRouter.

openRouter

  • Type: {
         model?: string,
         max_tokens?: number,
         temperature?: number,
         top_p?: number,
         frequency_penalty?: number,
         presence_penalty?: number,
         system_prompt?: string,
         tools?: OpenRouterTool[],
         function_handler?: FunctionHandler
    }
  • Default: {model: "openai/gpt-4o"}

Connect to OpenRouter's chat completion API.
model is the name of the model to be used by the API (e.g., "openai/gpt-3.5-turbo").
max_tokens limits the maximum number of tokens in the generated response (from 1 up to the model's context length).
temperature controls the randomness of responses (0.0-2.0). Higher values produce more creative outputs.
top_p controls diversity through nucleus sampling (0.0-1.0).
frequency_penalty reduces repetition by penalizing frequently used tokens (-2.0 to 2.0).
presence_penalty encourages topic diversity by penalizing tokens that have appeared (-2.0 to 2.0).
system_prompt provides behavioral context and instructions to the model.
tools defines available function declarations for the model to call.
function_handler enables function calling capabilities for tool use.

Example

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-3.5-turbo",
      "temperature": 0.7
    }
  }'
></deep-chat>
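Other generation properties can be combined in the same configuration. A minimal sketch using the max_tokens and system_prompt properties described above (the values are illustrative):

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o",
      "max_tokens": 512,
      "system_prompt": "You are a concise assistant."
    }
  }'
></deep-chat>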
info

Use stream to stream the AI responses.
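A minimal sketch of streaming, assuming the component's stream attribute is set alongside the connection (the key and model are placeholders):

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-3.5-turbo"
    }
  }'
  stream="true"
></deep-chat>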

Vision Example

Upload images alongside your text prompts for visual understanding. You must use a model with vision capabilities.

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o"
    }
  }'
  images="true"
  camera="true"
></deep-chat>
tip

When sending images, we advise setting maxMessages to 1 to send less data and reduce costs, as sketched below.
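A sketch of this, assuming message history is capped via the component's requestBodyLimits property (this property name comes from the wider component API, not this section, so treat it as an assumption):

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o"
    }
  }'
  images="true"
  requestBodyLimits='{"maxMessages": 1}'
></deep-chat>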

Audio Example

Upload audio files alongside your text prompts for speech understanding. You must use a model with audio capabilities.

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o-audio-preview"
    }
  }'
  audio="true"
></deep-chat>

Tool Calling

OpenRouter supports function calling:

OpenRouterTool

  • Type: {
         type: "function",
         function: {
             name: string,
             description: string,
             parameters: object
         }
    }

Array describing tools that the model may call.
type must be "function" for function tools.
name is the name of the tool function.
description explains what the tool does and when it should be used.
parameters defines the parameters the tool accepts in JSON Schema format.

FunctionHandler

The actual function that the component will call if the model wants to use tools.
functionsDetails contains information about what tool functions should be called.
This function should either return an array of JSON objects, one per tool function (in the same order as in functionsDetails), each containing a response property that is fed back into the model to finalize its reply, or return a single JSON object containing a text property, which is displayed in the chat immediately.

Example

// using JavaScript for a simplified example

chatElementRef.directConnection = {
  openRouter: {
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather in a given location',
          parameters: {
            type: 'object',
            properties: {
              location: {
                type: 'string',
                description: 'The city and state, e.g. San Francisco, CA',
              },
              unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
            },
            required: ['location'],
          },
        },
      },
    ],
    function_handler: (functionsDetails) => {
      return functionsDetails.map((functionDetails) => {
        return {
          response: getCurrentWeather(functionDetails.arguments),
        };
      });
    },
    key: 'placeholder-key',
  },
};
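As described above, the handler can instead return a JSON containing text to display a message in the chat immediately, without a second model call. A minimal sketch (the reply string is illustrative):

chatElementRef.directConnection = {
  openRouter: {
    tools: [/* same tool definitions as above */],
    function_handler: (functionsDetails) => {
      // Returning {text: ...} displays this message right away
      // instead of feeding tool results back into the model.
      return {text: 'The weather service is currently unavailable.'};
    },
    key: 'placeholder-key',
  },
};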