Mistral

Properties used to connect to Mistral AI.

mistral

  • Type: {
         model?: string,
         system_prompt?: string,
         max_tokens?: number,
         temperature?: number,
         top_p?: number,
         random_seed?: number,
         n?: number,
         safe_mode?: boolean,
         reasoning_mode?: string,
         presence_penalty?: number,
         frequency_penalty?: number,
         tools?: MistralTool[],
         tool_choice?: "auto" | "any" | "none" | {type: "function", function: {name: string}},
         function_handler?: FunctionHandler
    }
  • Default: {model: "mistral-small-latest"}

Connect to Mistral AI's chat completion API.
model is the name of the Mistral model to be used by the API.
system_prompt provides behavioral context and instructions to the model.
max_tokens limits the maximum number of tokens in the generated response.
temperature controls the randomness of responses (0.0-1.0). Higher values produce more creative outputs.
top_p controls diversity through nucleus sampling (0.0-1.0).
random_seed sets a seed for deterministic generation.
n specifies the number of response choices to generate.
safe_mode enables or disables safe mode for content filtering.
reasoning_mode controls the reasoning behavior of the model.
presence_penalty reduces repetition by penalizing tokens that have appeared (-2.0 to 2.0).
frequency_penalty reduces repetition by penalizing frequently used tokens (-2.0 to 2.0).
tools defines available function declarations for the model to call.
tool_choice controls which (if any) tool should be called.
function_handler enables function calling capabilities for tool use.

Example

<deep-chat
  directConnection='{
    "mistral": {
      "key": "placeholder key",
      "temperature": 0.7
    }
  }'
></deep-chat>
info

Use stream to stream the AI responses.
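As a minimal sketch (the key is a placeholder), streaming is enabled by adding the stream attribute to the component:

```html
<!-- A sketch: stream="true" displays the response as it is generated -->
<deep-chat
  stream="true"
  directConnection='{"mistral": {"key": "placeholder key"}}'
></deep-chat>
```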

Vision Example

Upload images alongside your text prompts for visual understanding. You must use a model with vision capabilities.

<deep-chat
  directConnection='{
    "mistral": {
      "key": "placeholder key",
      "model": "pixtral-12b-latest"
    }
  }'
  images="true"
  camera="true"
></deep-chat>
tip

When sending images, we advise setting maxMessages to 1 to reduce the amount of data sent and lower costs.

Tool Calling

Mistral supports function calling:

MistralTool

  • Type: {
         type: "function",
         function: {
             name: string,
             description?: string,
             parameters: object
         }
    }

Array describing tools that the model may call.
type must be "function" for function tools.
name is the name of the tool function.
description explains what the tool does and when it should be used.
parameters defines the parameters the tool accepts in JSON Schema format.

FunctionHandler

The actual function that the component will call if the model wants to use tools.
functionsDetails contains information about what tool functions should be called.
This function should either return an array of JSONs containing a response property for each tool function (in the same order as in functionsDetails), which is fed back into the model to finalize a response, or return a JSON containing a text property, which is displayed in the chat immediately.
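As an illustrative sketch of the second return shape (the tool name and message text are hypothetical, and each entry in functionsDetails is assumed to carry a name property), a handler can bypass the model and display text directly:

```javascript
// Hypothetical handler: returning {text: ...} displays the message
// in the chat immediately instead of feeding results back to the model.
const functionHandler = (functionsDetails) => {
  const names = functionsDetails.map((details) => details.name).join(', ');
  return {text: `Tools requested: ${names}`};
};
```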

Example

// using JavaScript for a simplified example

chatElementRef.directConnection = {
  mistral: {
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather in a given location',
          parameters: {
            type: 'object',
            properties: {
              location: {
                type: 'string',
                description: 'The city and state, e.g. San Francisco, CA',
              },
              unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
            },
            required: ['location'],
          },
        },
      },
    ],
    function_handler: (functionsDetails) => {
      return functionsDetails.map((functionDetails) => {
        return {
          response: getCurrentWeather(functionDetails.arguments),
        };
      });
    },
    key: 'placeholder-key',
  },
};
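The getCurrentWeather helper in the example is user-defined. A minimal sketch, assuming arguments arrives as a JSON string matching the declared parameter schema (a real implementation would call a weather service):

```javascript
// Hypothetical helper for the example above; returns canned data
// instead of querying a real weather API.
function getCurrentWeather(argumentsString) {
  // Assumption: `arguments` is a JSON string of the schema-declared parameters
  const {location, unit = 'celsius'} = JSON.parse(argumentsString);
  // The returned string is fed back to the model as the tool result
  return JSON.stringify({location, unit, temperature: 22});
}
```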