Leverage OpenAPI Spec for Client/Model Generation? #44
Replies: 4 comments 4 replies
-
First of all, I love this opener with the effort to add all the links to further information — I just checked them. And yes, I'm totally in for getting something like this rolling. I guess I need to say it here, though personally I think it's obvious: I don't need OllamaSharp to be hand-rolled, and I'm not voting for the current code base. The intention has always been to serve the most convenient (maybe highly opinionated) way to use the Ollama API from a .NET app — mostly because I think it's a good way to host the API separately from the application host. Most Ollama libraries required Ollama to be running locally. That being said, I'm not sure the OpenAI spec can cover all the endpoints required to build powerful Ollama API apps, like pulling models, getting the currently running models, etc. So my first question would be: would this make an OpenAI spec library plus Ollama specifics? Or are there no specifics left? Also, if you can manage to get the Ollama team to host the specs, I think we could link it from their repository to ours directly with a git submodule, for example.
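To make the "Ollama specifics" concrete: endpoints like model pulling have no counterpart in an OpenAI-style spec and would need to be described separately. A minimal, purely illustrative OpenAPI 3.1 fragment for such an endpoint might look like this (path, field names, and response shape are assumptions for the sake of the example, not the actual Ollama spec):

```yaml
# Illustrative sketch only — not the real Ollama API definition.
openapi: 3.1.0
info:
  title: Ollama API (excerpt)
  version: 0.0.0
paths:
  /api/pull:
    post:
      summary: Pull a model from the registry
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                model:
                  type: string
                stream:
                  type: boolean
      responses:
        "200":
          description: Streamed pull status updates
```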
-
I'm a little bit confused — is this about the OpenAPI Spec or about the OpenAI Spec?
-
Wow man, this was my typo at first, but it seems I went down the same rabbit hole while writing. It's OpenAPI, of course 🙈
-
I went ahead and opened a PR on the Ollama repo with the OpenAPI 3.1 spec. |
-
Today, the OllamaSharp library is largely hand-rolled. This was done because the Ollama team does not furnish any OpenAPI or Swagger specifications. There is an open issue in the Ollama repository to try to remedy this. The Ollama contributors have stated a few times that they're not fans of the existing tooling for automatically generating OpenAPI specs.
I had started working on my own OpenAPI spec. It's not fully complete or accurate, but it's close. There's also the spec furnished by @HavenDV here.
HavenDV has asked that we combine efforts a bit to create a standard Ollama wrapper. I'm inclined to agree.
Ultimately, I'd like to use something like OllamaSharp or HavenDV's Ollama NuGet package to underpin broader packages, like a Semantic Kernel connector. There are some out there today, but they either utilize older versions of OllamaSharp or rely on hand-rolled implementations of their own.
I'm using some cycles to try to bring the OpenAPI spec fully in line with the current Ollama code. Further, I'm going to push hard to get the Ollama team to include the spec in the repository for ongoing releases. I'm working to script out and automate as much of this as possible, so that CI can at least emit warnings if the server API and the OpenAPI spec start to drift.
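The drift check described above could be sketched as a simple set comparison between the paths documented in the spec and the routes the server actually exposes. This is a hypothetical illustration — the route lists and function name below are assumptions, not the actual Ollama API surface or any existing tooling:

```python
# Hypothetical drift check: compare the paths documented in an OpenAPI
# spec against the routes the server actually registers. Route names
# here are illustrative examples, not the authoritative Ollama API.

def find_drift(spec_paths, server_routes):
    """Return (undocumented, stale) sets of route paths.

    undocumented: served by the server but missing from the spec
    stale: present in the spec but no longer served
    """
    documented = set(spec_paths)
    served = set(server_routes)
    return served - documented, documented - served

if __name__ == "__main__":
    spec_paths = ["/api/generate", "/api/tags", "/api/pull"]
    server_routes = ["/api/generate", "/api/tags", "/api/pull", "/api/ps"]

    undocumented, stale = find_drift(spec_paths, server_routes)
    for route in sorted(undocumented):
        print(f"WARNING: {route} is served but not documented in the spec")
    for route in sorted(stale):
        print(f"WARNING: {route} is documented but no longer served")
```

In CI, a script along these lines could exit non-zero (or just warn, as mentioned) whenever either set is non-empty.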
@awaescher What are your thoughts?