
Ollama

Properties used to connect to Ollama.

ollama

  • Type: true | {
         model?: string,
         system_prompt?: string,
         think?: boolean,
         keep_alive?: boolean,
         tools?: OllamaTool[],
         function_handler?: FunctionHandler,
         options?: {
             temperature?: number,
             top_k?: number,
             top_p?: number,
             min_p?: number
         }
    }
  • Default: {model: "llama3.2"}

Connect to your locally running Ollama instance. Ollama is a tool that allows you to run large language models locally on your machine.
model is the name of the Ollama model to use. See the Ollama model library at https://ollama.com/library for available models.
system_prompt provides system instructions for the model's behavior.
think enables the model's reasoning capabilities when supported.
keep_alive controls whether to keep the model loaded in memory after the request.
tools defines functions that the model can call.
function_handler is the function that the component calls when the model requests a tool call.
options contains additional model configuration parameters.

info

Ollama does not require an API key as it runs locally.

Example

<deep-chat
  directConnection='{
    "ollama": {
      "system_prompt": "You are a helpful assistant.",
      "options": {"temperature": 0.7}
    }
  }'
></deep-chat>
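
A fuller configuration sketch using the remaining documented properties (the values shown are illustrative, not recommendations):

<deep-chat
  directConnection='{
    "ollama": {
      "model": "llama3.2",
      "system_prompt": "You are a helpful assistant.",
      "think": false,
      "keep_alive": true,
      "options": {"temperature": 0.7, "top_k": 40, "top_p": 0.9, "min_p": 0.05}
    }
  }'
></deep-chat>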
info

Use stream to stream the AI responses.
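
For example, a minimal sketch with streaming enabled on the element:

<deep-chat directConnection='{"ollama": true}' stream="true"></deep-chat>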

Custom URL Example

By default, Ollama connects to http://localhost:11434/api/chat. You can specify a custom URL using the connect property:

<deep-chat directConnection='{"ollama": true}' connect='{"url": "http://localhost:11434/api/chat"}'></deep-chat>

Vision Example

Upload images alongside your text prompts for visual understanding. You must use a vision-capable model.

<deep-chat
  directConnection='{
    "ollama": {
      "model": "llava"
    }
  }'
  images="true"
  camera="true"
></deep-chat>
tip

When sending images, we advise you to set maxMessages to 1 to reduce the amount of data sent with each request.
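
A sketch of that setup, assuming the maxMessages limit is applied via the requestBodyLimits property:

<deep-chat
  directConnection='{"ollama": {"model": "llava"}}'
  requestBodyLimits='{"maxMessages": 1}'
  images="true"
></deep-chat>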

Tool Calling

Ollama supports tool calling functionality with compatible models:

OllamaTool

  • Type: {
         type: "function",
         function: {
             name: string,
             description: string,
             parameters: object
         }
    }

Array describing tools that the model may call.
name is the name of the tool function.
description explains what the tool does and when it should be used.
parameters defines the parameters the tool accepts in JSON Schema format.

FunctionHandler

The actual function that the component will call if the model wants to use tools.
functionsDetails contains information about which tool functions should be called and with what arguments.
This function should either return an array of JSONs containing a response property for each tool function (in the same order as in functionsDetails), which is fed back into the model to finalize the response, or return a JSON containing a text property, which is displayed in the chat immediately.

Example

// using JavaScript for a simplified example

// Illustrative user-defined function that does the actual work -
// replace with your own implementation
const getCurrentWeather = (args) => {
  const {location, unit = 'celsius'} = typeof args === 'string' ? JSON.parse(args) : args;
  return JSON.stringify({location, temperature: 22, unit});
};

chatElementRef.directConnection = {
  ollama: {
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather in a given location',
          parameters: {
            type: 'object',
            properties: {
              location: {
                type: 'string',
                description: 'The city and state, e.g. San Francisco, CA',
              },
              unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
            },
            required: ['location'],
          },
        },
      },
    ],
    // called when the model requests a tool call - returns one response
    // per entry in functionsDetails, in the same order
    function_handler: (functionsDetails) => {
      return functionsDetails.map((functionDetails) => {
        return {
          response: getCurrentWeather(functionDetails.arguments),
        };
      });
    },
  },
};
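
Per the description above, a handler can instead return a JSON with a text property to display a message immediately, without a second round-trip to the model. A minimal sketch reusing the tool definition above:

chatElementRef.directConnection = {
  ollama: {
    tools: [/* same get_current_weather tool as above */],
    function_handler: (functionsDetails) => {
      // returning {text} displays this message in the chat immediately
      // instead of feeding tool results back into the model
      return {text: 'The weather in San Francisco, CA is currently 22°C.'};
    },
  },
};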

Prerequisites

To use Ollama with Deep Chat, you need to:

  1. Install Ollama on your machine from ollama.com
  2. Download a model: Run ollama pull llama3.2 (or any other model)
  3. Start Ollama: The service should be running on http://localhost:11434
tip

You can list available models with ollama list and see running models with ollama ps.