This agent is used to interact with a language model running locally or remotely via the Ollama API. Before using this agent locally, you need to have Ollama installed and running.
- Install Ollama.
- Install an Ollama model; we suggest the `phi3` model, as it is set as the default model in the code.
- Start the Ollama API server.
To configure the agent, run `/agent config ollama` to open the settings file in your default editor, then update the file based on the following example.
```jsonc
{
  // To use the Ollama API service:
  // 1. Install Ollama: `winget install Ollama.Ollama`
  // 2. Start the Ollama API server: `ollama serve`
  // 3. Install an Ollama model: `ollama pull phi3`

  // Declare predefined model configurations
  "Presets": [
    {
      "Name": "PowerShell Expert",
      "Description": "An Ollama agent with expertise in PowerShell scripting and command line utilities.",
      "ModelName": "phi3",
      "SystemPrompt": "You are a helpful and friendly assistant with expertise in PowerShell scripting and command line."
    }
  ],

  // Declare the Ollama endpoint
  "Endpoint": "http://localhost:11434",

  // Enable Ollama response streaming
  "Stream": false,

  // Specify the default preset to use
  "DefaultPreset": "PowerShell Expert"
}
```
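To illustrate how these settings map onto a request, here is a minimal Python sketch that assembles a call to Ollama's `/api/generate` endpoint from a preset like the one above. This is a simplified illustration, not the agent's actual implementation; the helper name `build_generate_request` is our own.

```python
import json

# A preset mirroring the "PowerShell Expert" entry in the settings
# file above; field names follow that example.
preset = {
    "Name": "PowerShell Expert",
    "ModelName": "phi3",
    "SystemPrompt": "You are a helpful and friendly assistant with "
                    "expertise in PowerShell scripting and command line.",
}
endpoint = "http://localhost:11434"  # the "Endpoint" value from the settings

def build_generate_request(preset, prompt, stream=False):
    """Build the URL and JSON body for a call to Ollama's /api/generate
    endpoint, using the model and system prompt from a preset."""
    url = f"{endpoint}/api/generate"
    body = {
        "model": preset["ModelName"],    # "ModelName" setting
        "system": preset["SystemPrompt"],
        "prompt": prompt,
        "stream": stream,                # mirrors the "Stream" setting
    }
    return url, json.dumps(body)

url, body = build_generate_request(preset, "How do I list files in PowerShell?")
```

With `"Stream": false`, the server returns the full completion in a single JSON response; with streaming enabled, it sends a sequence of partial responses instead.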