Issues: simonw/llm
#971 Enable mac clipboard via pngpaste (Support unseekable attachments) - opened May 4, 2025 by tjltjl
#963 Can't get AsyncModel from extra-openai-models.yaml? [enhancement] - opened Apr 28, 2025 by industrial-sloth
#959 IDEA: spotted(?) first enshittification of openai models: artificial follow up questions - opened Apr 26, 2025 by rdslw
#958 Markdown serialization/deserialization of llm conversations - opened Apr 26, 2025 by davidgasquez
#948 Ability to specify fragment placement in templates [enhancement, fragments] - opened Apr 23, 2025 by simonw
#943 Ability to calculate token-per-second speeds, including recording time to first token [enhancement] - opened Apr 20, 2025 by simonw
#938 Reconsider llm.Conversation in favor of allowing prompts to be in reply to responses [design, tools] - opened Apr 20, 2025 by simonw
#937 Ability to "reply" to a tool-response with a prompt carrying those tool results [tools] - opened Apr 20, 2025 by simonw
#936 Mechanism for attaching tool execution requests to a Response [tools] - opened Apr 20, 2025 by simonw
#928 Feature: Provide JSON formatted output for llm models list --options - opened Apr 17, 2025 by rpdelaney
#918 Add documentation clarifying the difference between fragments and templates [documentation, fragments, templates] - opened Apr 14, 2025 by simonw