Ollama
Properties used to connect to Ollama.
ollama
- Type: true | {
    model?: string,
    system_prompt?: string,
    think?: boolean,
    keep_alive?: boolean,
    tools?: OllamaTool[],
    function_handler?: FunctionHandler,
    options?: {
      temperature?: number,
      top_k?: number,
      top_p?: number,
      min_p?: number
    }
  }
- Default: {model: "llama3.2"}
Connect to your locally running Ollama instance. Ollama is a tool that allows you to run large language models locally on your machine.
model is the name of the Ollama model to use. See the Ollama model library for the available models.
system_prompt provides system instructions for the model's behavior.
think enables the model's reasoning capabilities when supported.
keep_alive controls whether to keep the model loaded in memory after the request.
tools defines functions that the model can call.
function_handler is the actual function called with the model's instructions.
options contains additional model configuration parameters.
Ollama does not require an API key as it runs locally.
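For reference, a configuration that touches most of these properties could look like the sketch below (the values are illustrative only; tools and function_handler are covered in the Tool Calling section):
// illustrative sketch - all property names come from the type above, values are examples
const chatElementRef = document.querySelector('deep-chat');
chatElementRef.directConnection = {
  ollama: {
    model: 'llama3.2', // any model you have pulled locally
    system_prompt: 'You are a helpful assistant.',
    think: false, // enable reasoning on models that support it
    keep_alive: true, // keep the model loaded in memory after the request
    options: {temperature: 0.7, top_k: 40, top_p: 0.9, min_p: 0.05},
  },
};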
Example
- Sample code
- Full code
<deep-chat
directConnection='{
"ollama": {
"system_prompt": "You are a helpful assistant.",
"options": {"temperature": 0.7}
}
}'
></deep-chat>
<!-- This example is for Vanilla JS and should be tailored to your framework (see Examples) -->
<deep-chat
directConnection='{
"ollama": {
"system_prompt": "You are a helpful assistant.",
"options": {"temperature": 0.7}
}
}'
style="border-radius: 8px"
></deep-chat>
Use stream to stream the AI responses.
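For example, a minimal sketch enabling streaming alongside the direct connection (see the stream documentation for all accepted values):
// minimal sketch - enable streamed responses for the Ollama direct connection
const chatElementRef = document.querySelector('deep-chat');
chatElementRef.stream = true;
chatElementRef.directConnection = {ollama: true};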
Custom URL Example
By default, Ollama connects to http://localhost:11434/api/chat. You can specify a custom URL using the connect property:
- Sample code
- Full code
<deep-chat directConnection='{"ollama": true}' connect='{"url": "http://localhost:11434/api/chat"}'></deep-chat>
<!-- This example is for Vanilla JS and should be tailored to your framework (see Examples) -->
<deep-chat
directConnection='{"ollama": true}'
connect='{"url": "http://localhost:11434/api/chat"}'
style="border-radius: 8px"
></deep-chat>
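The same approach can point the component at an Ollama instance running on another machine. The host below is hypothetical - replace it with your own address:
// hypothetical remote host - replace with the address of your Ollama instance
const chatElementRef = document.querySelector('deep-chat');
chatElementRef.directConnection = {ollama: true};
chatElementRef.connect = {url: 'http://192.168.1.50:11434/api/chat'};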
Vision Example
Upload images alongside your text prompts for visual understanding. You must use a vision-capable model.
- Sample code
- Full code
<deep-chat
directConnection='{
"ollama": {
"model": "llava"
}
}'
images="true"
camera="true"
></deep-chat>
<!-- This example is for Vanilla JS and should be tailored to your framework (see Examples) -->
<deep-chat
directConnection='{
"ollama": {
"model": "llava"
}
}'
images="true"
camera="true"
style="border-radius: 8px"
textInput='{"styles": {"container": {"width": "77%"}}}'
></deep-chat>
When sending images we advise you to set maxMessages to 1 to send less data and reduce the model's processing time.
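A sketch of this, assuming maxMessages is configured through the requestBodyLimits property:
// assumption: requestBodyLimits.maxMessages limits how many past messages are sent per request
const chatElementRef = document.querySelector('deep-chat');
chatElementRef.directConnection = {ollama: {model: 'llava'}};
chatElementRef.images = true;
chatElementRef.requestBodyLimits = {maxMessages: 1};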
Tool Calling
Ollama supports tool calling with compatible models:
OllamaTool
- Type: {
    type: "function",
    function: {
      name: string,
      description: string,
      parameters: object
    }
  }
Describes a tool that the model may call. The tools property accepts an array of these objects.
name is the name of the tool function.
description explains what the tool does and when it should be used.
parameters defines the parameters the tool accepts in JSON Schema format.
FunctionHandler
- Type: (functionsDetails: FunctionsDetails) => {response: string}[] | {text: string}
The actual function that the component will call if the model wants to use tools.
functionsDetails contains information about what tool functions should be called.
This function should either return an array of JSONs containing a response property for each tool function (in the same order as in functionsDetails), which is fed back into the model to finalize its response, or return a JSON containing a text property, which is displayed in the chat immediately.
Example
- Sample code
- Full code
// using JavaScript for a simplified example
chatElementRef.directConnection = {
ollama: {
tools: [
{
type: 'function',
function: {
name: 'get_current_weather',
description: 'Get the current weather in a given location',
parameters: {
type: 'object',
properties: {
location: {
type: 'string',
description: 'The city and state, e.g. San Francisco, CA',
},
unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
},
required: ['location'],
},
},
},
],
function_handler: (functionsDetails) => {
return functionsDetails.map((functionDetails) => {
return {
response: getCurrentWeather(functionDetails.arguments),
};
});
},
},
};
// using JavaScript for a simplified example
chatElementRef.directConnection = {
ollama: {
tools: [
{
type: 'function',
function: {
name: 'get_current_weather',
description: 'Get the current weather in a given location',
parameters: {
type: 'object',
properties: {
location: {
type: 'string',
description: 'The city and state, e.g. San Francisco, CA',
},
unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
},
required: ['location'],
},
},
},
],
function_handler: (functionsDetails) => {
return functionsDetails.map((functionDetails) => {
return {
response: getCurrentWeather(functionDetails.arguments),
};
});
},
},
};
// mock weather lookup used by the function handler above
function getCurrentWeather(location) {
location = location.toLowerCase();
if (location.includes('tokyo')) {
return JSON.stringify({location, temperature: '10', unit: 'celsius'});
} else if (location.includes('san francisco')) {
return JSON.stringify({location, temperature: '72', unit: 'fahrenheit'});
} else {
return JSON.stringify({location, temperature: '22', unit: 'celsius'});
}
}
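The handler can also bypass the model and display a result directly by returning a JSON with a text property, as described above. A minimal sketch:
// minimal sketch - returning {text} displays the message in the chat immediately
// instead of feeding tool responses back into the model for a final answer
chatElementRef.directConnection = {
  ollama: {
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather in a given location',
          parameters: {
            type: 'object',
            properties: {location: {type: 'string', description: 'The city and state'}},
            required: ['location'],
          },
        },
      },
    ],
    function_handler: () => {
      // short-circuit: respond in the chat without another model round-trip
      return {text: 'The weather service is currently unavailable.'};
    },
  },
};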
Prerequisites
To use Ollama with Deep Chat, you need to:
- Install Ollama on your machine from ollama.com
- Download a model: run ollama pull llama3.2 (or any other model)
- Start Ollama: the service should be running on http://localhost:11434
You can list available models with ollama list and see running models with ollama ps.
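If you want to verify from the browser that Ollama is reachable before wiring up the component, its API also exposes a model listing endpoint. A small sketch:
// quick check that Ollama is running and which models are installed (GET /api/tags)
fetch('http://localhost:11434/api/tags')
  .then((response) => response.json())
  .then((result) => console.log(result.models.map((model) => model.name)))
  .catch(() => console.error('Ollama does not appear to be running on localhost:11434'));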