
Add support for Ollama as a local inference backend #1

@Rage997

Description


It would be great to add Ollama as an alternative backend to OpenAI. This would let users run GitNexus with local LLMs, improving privacy, avoiding API costs, and enabling offline use.
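One way to sketch this: Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1`, so a single backend-selection helper could point the existing OpenAI client code at either server. The function and config keys below are illustrative, not GitNexus's actual API.

```python
import os

def backend_config(backend: str, model: str) -> dict:
    """Return connection settings for the chosen backend.

    Hypothetical helper: the keys mirror what an OpenAI-style client
    needs (base_url, api_key, model), not any existing GitNexus config.
    """
    if backend == "ollama":
        return {
            # Ollama's OpenAI-compatible endpoint; runs fully locally.
            "base_url": "http://localhost:11434/v1",
            # Ollama ignores the key, but OpenAI clients require a non-empty one.
            "api_key": "ollama",
            "model": model,  # e.g. "llama3"
        }
    if backend == "openai":
        return {
            "base_url": "https://api.openai.com/v1",
            "api_key": os.environ.get("OPENAI_API_KEY", ""),
            "model": model,
        }
    raise ValueError(f"unknown backend: {backend}")
```

Because both backends speak the same protocol, the rest of the chat-completion code could stay unchanged; only the config differs.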
