
Ollama Provider Integration #15


Open
omeraplak opened this issue Apr 20, 2025 · 2 comments · May be fixed by #93
@omeraplak (Member)

1. Overview:

Integrate Ollama as a supported LLM provider within the voltagent framework. This lets developers run open-source large language models (such as Llama or Mistral) locally via an Ollama instance and use them directly in their voltagent applications, gaining privacy, offline capability, and cost savings.

2. Goals:

  • Implement a new OllamaProvider class adhering to the LLMProvider interface.
  • Support core text generation functionalities (generateText, streamText) using models served by a local or remote Ollama instance.
  • Communicate with the Ollama REST API (typically running on localhost:11434).
  • Map voltagent generation options (model name, temperature, etc.) to the corresponding Ollama API parameters (see the mapping sketch after this list).
  • Handle connection errors and Ollama-specific API responses/errors.
  • Allow users to easily configure the Ollama endpoint URL.
  • (Optional) Support generateObject and streamObject if Ollama's API offers a reliable JSON mode.
  • (Optional) Explore mapping voltagent tools if Ollama develops a standardized tool/function calling mechanism.
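A minimal sketch of the option mapping, assuming hypothetical voltagent-side option names (`temperature`, `topP`, `maxTokens`, `stopSequences` are placeholders, not the framework's actual interface); the target names (`temperature`, `top_p`, `num_predict`, `stop`) are documented Ollama parameters passed under the request's `options` object:

```ts
// Hypothetical voltagent-side generation options; the field names here
// are assumptions for illustration, not the real framework types.
interface GenerationOptions {
  temperature?: number;
  topP?: number;
  maxTokens?: number;
  stopSequences?: string[];
}

// Map to Ollama's `options` request field. Only set keys the caller
// actually provided, so Ollama's model defaults apply otherwise.
function toOllamaOptions(opts: GenerationOptions): Record<string, unknown> {
  return {
    ...(opts.temperature !== undefined && { temperature: opts.temperature }),
    ...(opts.topP !== undefined && { top_p: opts.topP }),
    ...(opts.maxTokens !== undefined && { num_predict: opts.maxTokens }),
    ...(opts.stopSequences !== undefined && { stop: opts.stopSequences }),
  };
}
```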

3. Proposed Architecture & Components:

  • OllamaProvider: A new class within agent/providers implementing LLMProvider. This class will:
    • Make direct fetch calls to the configured Ollama API endpoint (e.g., /api/generate, /api/chat).
    • Handle API client initialization (primarily setting the base URL).
    • Implement generateText, streamText, translating voltagent's BaseMessage format and options into Ollama's API request structure (handling different request formats like /api/generate vs /api/chat if necessary).
    • Parse Ollama API responses (including streaming newline-delimited JSON) back into the voltagent format (a skeleton sketch follows this list).
  • Provider Registration: Update logic to recognize and instantiate OllamaProvider.
  • Configuration: Allow users to select 'ollama' as the provider, specify the model name (which must be pulled/available in their Ollama instance), and configure the Ollama API base URL.
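A rough skeleton of what the provider could look like. The surface shown here (`generateText`, `streamText`, `BaseMessage`) is assumed from the description above, not copied from the codebase; the `/api/chat` request/response shapes follow Ollama's documented API, where a non-streaming call returns a single JSON object and a streaming call emits one JSON object per line with a final `done: true` object:

```ts
// Assumed shapes; the real LLMProvider/BaseMessage types live in agent/types.
interface BaseMessage { role: "system" | "user" | "assistant"; content: string; }

interface OllamaProviderOptions {
  baseUrl?: string; // defaults to the standard local Ollama endpoint
}

class OllamaProvider {
  private baseUrl: string;

  constructor(options: OllamaProviderOptions = {}) {
    this.baseUrl = options.baseUrl ?? "http://localhost:11434";
  }

  // Non-streaming generation via POST /api/chat with stream: false.
  async generateText(model: string, messages: BaseMessage[]): Promise<string> {
    const res = await fetch(`${this.baseUrl}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.message.content;
  }

  // Streaming generation: Ollama emits newline-delimited JSON objects.
  async *streamText(model: string, messages: BaseMessage[]): AsyncGenerator<string> {
    const res = await fetch(`${this.baseUrl}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, stream: true }),
    });
    if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let buffer = "";
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });
      // Split the buffer on newlines; each complete line is one JSON chunk.
      let newline: number;
      while ((newline = buffer.indexOf("\n")) !== -1) {
        const line = buffer.slice(0, newline).trim();
        buffer = buffer.slice(newline + 1);
        if (!line) continue;
        const chunk = JSON.parse(line);
        if (chunk.message?.content) yield chunk.message.content;
        if (chunk.done) return;
      }
    }
  }
}
```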

4. Affected Core Modules:

  • agent/providers: New OllamaProvider class.
  • agent/types: Minor adjustments might be needed for Ollama parameters.
  • Agent/Agent Options: Configuration for provider selection, model name, and the Ollama endpoint (see the configuration sketch after this list).
  • Relies on built-in fetch.
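A possible configuration shape, again with field names that are assumptions rather than the existing Agent options:

```ts
// Hypothetical agent-level configuration; field names are illustrative only.
interface OllamaAgentConfig {
  provider: "ollama";
  model: string;    // must already be pulled into the user's Ollama instance
  baseUrl?: string; // Ollama API endpoint, default http://localhost:11434
}
```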

5. Acceptance Criteria (Initial MVP):

  • Users can configure an Agent to use the OllamaProvider with a specified model name (e.g., 'llama3') and the Ollama API endpoint (a usage sketch follows this list).
  • The agent assumes Ollama is running and the specified model is available locally.
  • agent.generateText() successfully calls the Ollama API and returns a text response.
  • agent.streamText() successfully streams text chunks from the Ollama API.
  • Basic parameters like temperature are passed correctly.
  • Documentation includes setup instructions for the Ollama provider and prerequisites (installing Ollama, pulling models).
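End to end, the MVP could look roughly like this. It is shown against the provider sketch above rather than the Agent API, since this issue does not pin down the exact Agent constructor options; the prerequisite is that Ollama is running and the model has been pulled (e.g. `ollama pull llama3`):

```ts
// Hypothetical wiring against the OllamaProvider sketch above.
const provider = new OllamaProvider({ baseUrl: "http://localhost:11434" });

// One-shot generation.
const reply = await provider.generateText("llama3", [
  { role: "user", content: "Say hello in one sentence." },
]);
console.log(reply);

// Streaming variant: print chunks as they arrive.
for await (const chunk of provider.streamText("llama3", [
  { role: "user", content: "Count to five." },
])) {
  process.stdout.write(chunk);
}
```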

6. Potential Challenges & Considerations:

  • Ensuring the provider works correctly with different Ollama API versions.
  • Handling various Ollama errors (model not found, connection refused, etc.); see the error-handling sketch after this list.
  • The lack of a standardized tool/function-calling mechanism in Ollama itself makes tool integration difficult.
  • Performance depends heavily on the user's local hardware running Ollama.
  • Users are responsible for managing the Ollama instance and downloading models.
  • Differences in output format or behavior between models served via Ollama.
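One way to surface the common failure modes. The `{ error: string }` body shape matches how Ollama reports API errors, while the error class names here are made up for illustration:

```ts
// Illustrative error mapping; the class names are hypothetical.
class OllamaConnectionError extends Error {}
class OllamaModelNotFoundError extends Error {}

async function chatOrThrow(baseUrl: string, body: unknown): Promise<Response> {
  let res: Response;
  try {
    res = await fetch(`${baseUrl}/api/chat`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    });
  } catch (err) {
    // fetch rejects on network-level failures, e.g. Ollama not running.
    throw new OllamaConnectionError(`Cannot reach Ollama at ${baseUrl}: ${err}`);
  }
  if (!res.ok) {
    // Ollama returns a JSON error body of the form { "error": "..." }.
    const detail = await res.json().catch(() => ({ error: res.statusText }));
    if (res.status === 404) {
      throw new OllamaModelNotFoundError(detail.error); // e.g. model not pulled
    }
    throw new Error(`Ollama error (${res.status}): ${detail.error}`);
  }
  return res;
}
```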
@omeraplak added the good first issue and provider labels on Apr 20, 2025
@Dhruv1969Karnwal

Hi @omeraplak, I'd like to work on this issue. If no one is currently assigned, could you please assign it to me?

@omeraplak (Member, Author)

Hey @Dhruv1969Karnwal ! That would be awesome. Feel free to pick it up 🙌
Let me know if you need anything to get started ⚡

Also, feel free to join us on Discord. We hang out there a lot:
👉 https://s.voltagent.dev/discord
