Feature Request: Fragment support in llm chat command
Currently, the llm prompt command supports fragments with the -f/--fragment option, but the llm chat command does not.
Use Case
When engaging in an ongoing chat with a model, it would be useful to include fragments (documentation, code, context) in the conversation. This would let users maintain context throughout the conversation, similar to how a Claude Workspace allows users to pin documents to a conversation.
Proposed Solution
Add -f/--fragment and --sf/--system-fragment options to the llm chat command, similar to how they work in the llm prompt command.
Benefits
Consistent experience between llm prompt and llm chat
Ability to maintain reference documentation throughout a chat session
Better support for complex, context-rich conversations
Users can easily add context from files, URLs or saved fragments into ongoing conversations
Implementation Recommendation
Update the chat function signature in cli.py to include fragment parameters:
```python
@cli.command()
@click.option("-s", "--system", help="System prompt to use")
@click.option("model_id", "-m", "--model", help="Model to use")
@click.option(
    "fragments",
    "-f",
    "--fragment",
    multiple=True,
    help="Fragment (alias, URL, hash or file path) to add to the prompt",
)
@click.option(
    "system_fragments",
    "--sf",
    "--system-fragment",
    multiple=True,
    help="Fragment to add to system prompt",
)
# existing parameters...
def chat(
    system,
    model_id,
    fragments,
    system_fragments,
    # existing parameters...
):
    # ...
```
Add fragment handling logic similar to the prompt command:
```python
# Load fragments content
fragment_content = []
for fragment in fragments:
    fragment_content.append(load_fragment(fragment))

system_fragment_content = []
for fragment in system_fragments:
    system_fragment_content.append(load_fragment(fragment))

# Add fragment content to the prompt
if fragment_content:
    if prompt_text:
        prompt_text = prompt_text + "\n\n" + "\n\n".join(fragment_content)
    else:
        prompt_text = "\n\n".join(fragment_content)

# Add system fragment content to system prompt
if system_fragment_content:
    if system_prompt:
        system_prompt = system_prompt + "\n\n" + "\n\n".join(system_fragment_content)
    else:
        system_prompt = "\n\n".join(system_fragment_content)
```
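Since the same join-or-append branching is applied to both the prompt and the system prompt, it could be factored into one helper. A minimal sketch — merge_fragments is a hypothetical name, not an existing llm function:

```python
def merge_fragments(base, fragment_content):
    # Hypothetical helper: join fragment texts with blank-line separators
    # and append them to an optional base prompt.
    if not fragment_content:
        return base
    joined = "\n\n".join(fragment_content)
    return base + "\n\n" + joined if base else joined
```

The chat command could then call merge_fragments(prompt_text, fragment_content) and merge_fragments(system_prompt, system_fragment_content) instead of duplicating the branches.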
Ensure fragments persist between chat messages:
Store fragment references in the chat session
Re-apply fragment content for each new message in the conversation
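The persistence idea above can be sketched as a small wrapper that loads fragment content once at session start and prepends it to each outgoing message. This is illustrative only — ChatSession and build_prompt are hypothetical names, not llm's actual internals:

```python
class ChatSession:
    # Illustrative sketch: fragment content is loaded once when the
    # session starts, then re-applied to every message in the chat.
    def __init__(self, fragment_content):
        self.fragment_content = list(fragment_content)

    def build_prompt(self, message):
        # Prepend pinned fragment content to each new chat message.
        if not self.fragment_content:
            return message
        return "\n\n".join(self.fragment_content) + "\n\n" + message
```

Whether fragments should be re-sent with every message or only attached to the first one is a design decision for the maintainers; the sketch shows the re-apply variant.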
Current Workaround
Currently, users need to run llm prompt with the fragments and then manually continue the conversation, which breaks the natural chat flow.