feat(ollama): implement Ollama provider integration #93
Conversation
Hi! The CI failed, but it seems unrelated to my changes — the same test command passes for me locally. Could someone take a look when you get a chance? Let me know if there's anything I should adjust. Thanks!
Hey @Dhruv1969Karnwal,
🦋 Changeset detected. Latest commit: 542259d. The changes in this PR will be included in the next version bump. This PR includes changesets to release 2 packages.
Force-pushed from e3429cd to 389e574.
Hi @omeraplak, I've updated the Ollama provider documentation — let me know if you'd like to review it now or if there's anything specific you'd like me to add or change.
## Available Providers

VoltAgent offers built-in providers for various popular services and SDKs, including Vercel AI, Google AI (Gemini/Vertex), Groq, and any OpenAI-compatible API endpoint (via xsAI).

**For detailed information on each available provider, including specific installation, configuration, and usage instructions, please see the [Providers Overview](../providers/overview.md).**
## Error Handling

The provider includes several specialized error classes for better error handling:

```typescript
// Example error handling
try {
  const response = await agent.generateText("Hello");
} catch (error) {
  if (error instanceof OllamaModelError) {
    console.error("Model-related error:", error.message);
  } else if (error instanceof OllamaConnectionError) {
    console.error("Connection error:", error.message);
  } else if (error instanceof OllamaValidationError) {
    console.error("Validation error:", error.message);
  }
}
```
To keep things consistent, can we structure the Ollama provider docs the same way as the Vercel AI docs? Right now, the Ollama docs have more sections, and it would be great to align them. Thank you
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Sure, will update you
Any updates on this?
Thanks for your patience! Until this PR lands, you can use Ollama with our Vercel AI provider:
Hi @omeraplak, sorry for the delay, and thank you for your patience! I've completed the update — the Ollama provider docs are now structured to match the Vercel AI docs. Let me know if there's anything you'd like me to adjust. Thanks!
🛡️ Consolidates PRs #45, VoltAgent#88, VoltAgent#90, VoltAgent#91, VoltAgent#93 into a unified error handling system

## ✨ Features

- **Intelligent Error Classification**: Auto-categorizes errors by type, severity, retryability
- **Advanced Retry Strategies**: Exponential backoff, jitter, category-aware delays
- **Circuit Breaker Pattern**: Prevents cascading failures with configurable thresholds
- **Error Recovery System**: Automatic parameter adjustment and fallback strategies
- **Error Escalation**: Configurable escalation based on frequency and severity
- **Comprehensive Metrics**: Detailed monitoring and analytics
- **Easy Integration**: Drop-in replacement with backward compatibility

## 🏗️ Architecture

- ErrorClassifier: Intelligent error categorization with provider-specific rules
- RetryManager: Multiple retry strategies (fixed, exponential, linear, custom)
- CircuitBreaker: Resilience pattern with state management
- RecoverySystem: Pluggable recovery strategies for different error types
- EscalationSystem: Configurable escalation with custom handlers
- ErrorHandlingManager: Unified coordinator for all components

## 🔧 Components

- Comprehensive type definitions
- Error classification system
- Advanced retry strategies
- Circuit breaker implementation
- Error recovery mechanisms
- Error escalation management
- Main coordinator
- Integration utilities
- Public API exports

## 📊 Error Categories

- TRANSIENT, RATE_LIMIT, AUTH, VALIDATION, RESOURCE
- TOOL, MODEL, NETWORK, PERMANENT, UNKNOWN

## 🎯 Key Benefits

- Zero duplication in error handling logic
- Consistent error classification patterns
- Removal of redundant retry mechanisms
- Single cohesive resilience system
- Clear contracts for error recovery
- Comprehensive monitoring and metrics

## 🔄 Backward Compatibility

- Maintains existing VoltAgentError interface
- Integrates seamlessly with current provider system
- Preserves existing error handling patterns

## 📚 Usage

Resolves: ZAM-786
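The "exponential backoff with jitter" retry strategy mentioned in that description could be sketched roughly as below. This is an illustrative, self-contained example — the names (`backoffDelay`, `withRetry`, `RetryOptions`) are hypothetical and not the actual API of the PR's RetryManager.

```typescript
// Hedged sketch of exponential backoff with jitter (illustrative names only).
interface RetryOptions {
  baseMs: number; // initial delay
  maxMs: number;  // cap on any single delay
  jitter: number; // 0..1, fraction of the delay randomized
}

// Delay for the given attempt (0-based): base * 2^attempt, capped at maxMs,
// with up to `jitter` fraction randomized to avoid synchronized retries.
function backoffDelay(attempt: number, opts: RetryOptions): number {
  const exp = Math.min(opts.baseMs * 2 ** attempt, opts.maxMs);
  const spread = exp * opts.jitter;
  return exp - spread + Math.random() * spread;
}

// Retries `fn` up to `retries` times, sleeping a backoff delay between tries.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  opts: RetryOptions = { baseMs: 100, maxMs: 5_000, jitter: 0.2 },
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;
      await new Promise((r) => setTimeout(r, backoffDelay(attempt, opts)));
    }
  }
}
```

A real implementation would additionally consult the error category (e.g. skip retries for VALIDATION or AUTH errors and only retry TRANSIENT or RATE_LIMIT ones), which is what "category-aware delays" refers to.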
PR Checklist
Please check if your PR fulfills the following requirements:
Description
This PR implements the Ollama provider integration as discussed in #15, enabling users to leverage local LLM models via Ollama within VoltAgent applications.
Features

- `OllamaProvider` class adhering to the `LLMProvider` interface

Checklist

- Testing
What is the current behavior?
No Direct Implementation for Ollama
What is the new behavior?
Allows users to use `OllamaProvider` as the LLM provider when creating an Agent.
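To illustrate what "adhering to the `LLMProvider` interface" might look like, here is a minimal, self-contained sketch. The interface and class below (`LLMProviderLike`, `OllamaProviderSketch`) are simplified stand-ins, not VoltAgent's actual API; the only assumption taken from Ollama itself is its standard `/api/generate` endpoint on `http://localhost:11434`.

```typescript
// Minimal provider-style interface (illustrative, not VoltAgent's real one).
interface LLMProviderLike {
  generateText(prompt: string): Promise<string>;
}

// Narrow fetch signature so a stub can be injected for testing.
type FetchLike = (
  url: string,
  init: { method: string; body: string },
) => Promise<{ ok: boolean; json(): Promise<any> }>;

// Sketch of an Ollama-backed provider: POSTs to /api/generate on a local
// Ollama server and returns the `response` field of the JSON reply.
class OllamaProviderSketch implements LLMProviderLike {
  constructor(
    private model: string,
    private baseUrl = "http://localhost:11434",
    private fetchImpl: FetchLike = fetch as unknown as FetchLike,
  ) {}

  async generateText(prompt: string): Promise<string> {
    const res = await this.fetchImpl(`${this.baseUrl}/api/generate`, {
      method: "POST",
      body: JSON.stringify({ model: this.model, prompt, stream: false }),
    });
    if (!res.ok) throw new Error("Ollama request failed");
    const data = await res.json();
    return data.response;
  }
}
```

Injecting `fetchImpl` keeps the sketch testable without a running Ollama server; the real provider in this PR presumably adds streaming, tool calls, and the specialized error classes shown in the review above.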
Related Issue
Fixes #15