@@ -12,19 +12,36 @@ The format is based on [Keep a Changelog][] and this project adheres to
 [heading__unreleased]: #unreleased
 
 
-- [ ] `doc/proompter.txt` is correct and up-to-date?
 - [ ] Proxy traffic between Vim `channel` and Ollama API
+  - [ ] HTTP Response parser handles non-`200` status codes
   - [ ] releases listening port
     > Note; `kill -SIGINT <PID>` works for interactive sessions, but doesn't
     > when backgrounded within Vader test runner script?!
   - [ ] tested with system-level SystemD
     > Note; the `Wants`, `Requires`, and other bindings to `ollama.service` may
     > need additional testing
 - [ ] Allow [python-hot-reload][] of `scripts/proompter-channel-proxy.py`?
-- [ ] Passing mock tests?!
-- [ ] `autoload/proompter.vim` needs mock tests for; `proompter#SendPromptToGenerate`, `proompter#SendPrompt`, `proompter#SendHighlightedText`, `proompter#cancel`, `proompter#load`, `proompter#unload`
+- [ ] Refactor code and configurations to allow connections to other LLM APIs?
+  - [vllm][]
 
 [python-hot-reload]: https://stackoverflow.com/questions/29960634/reload-the-currently-running-python-script
+[vllm]: https://docs.vllm.ai/en/latest/getting_started/quickstart.html
+
+______
+
+
+## [0.0.6] - 2024-10-13
+
+
+- [X] Passing mock tests for `autoload/proompter.vim`
+  - [X] `proompter#SendPromptToGenerate`
+  - [X] `proompter#SendPrompt`
+  - [X] `proompter#SendHighlightedText`
+  - [X] `proompter#Cancel`
+  - [X] `proompter#Load`
+  - [X] `proompter#Unload`
+- [X] `doc/proompter.txt` is correct and up-to-date?
+- [X] Add `pre-commit` hook to help mitigate known committing bugs
 
 
 ______
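
The unreleased checklist flags handling of non-`200` status codes in the HTTP Response parser. A minimal sketch of what that branch could look like inside `scripts/proompter-channel-proxy.py`, assuming a hypothetical `parse_response` helper (the name and return shape are illustrative, not the plugin's actual API):

```python
def parse_response(raw: bytes) -> dict:
    """Split a raw HTTP response and branch on the status code.

    Hypothetical helper for illustration; not the plugin's actual API.
    """
    head, _, body = raw.partition(b"\r\n\r\n")
    # First line of the head looks like: b"HTTP/1.1 200 OK"
    status_line = head.split(b"\r\n", 1)[0]
    status_code = int(status_line.split()[1])

    if status_code != 200:
        # Report the failure to the Vim `channel` side instead of
        # forwarding a body that is not a streaming JSON payload
        return {
            "ok": False,
            "status": status_code,
            "body": body.decode("utf-8", errors="replace"),
        }

    return {
        "ok": True,
        "status": status_code,
        "body": body.decode("utf-8", errors="replace"),
    }
```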
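The [python-hot-reload][] reference points at re-executing the currently running script; a minimal sketch of that pattern for `scripts/proompter-channel-proxy.py`, assuming reloads are triggered by `SIGHUP` (the signal choice is an assumption, not current proxy behavior):

```python
import os
import signal
import sys


def _reexec(signum, frame):
    # Replace this process image with a fresh interpreter running the
    # same script and arguments, picking up any on-disk source changes
    os.execv(sys.executable, [sys.executable] + sys.argv)


# Assumption: `kill -SIGHUP <PID>` is the reload trigger (POSIX only)
signal.signal(signal.SIGHUP, _reexec)
```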
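For the [vllm][] item, the linked quickstart serves an OpenAI-compatible HTTP API, so one refactor target could be making the request path and payload configurable rather than Ollama-specific; a sketch against a locally served model (host, port, and model name follow the quickstart defaults and are assumptions about any eventual configuration):

```python
import json
import urllib.request

# Defaults from vLLM's quickstart; a refactor would read these from
# plugin configuration instead of hard-coding them
request = urllib.request.Request(
    "http://localhost:8000/v1/completions",
    data=json.dumps({
        "model": "facebook/opt-125m",
        "prompt": "Proompter says hello",
        "max_tokens": 16,
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    print(json.load(response))
```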