Add fragment support to llm chat command #951

Open · dguido opened this issue Apr 23, 2025 · 0 comments

dguido commented Apr 23, 2025

Feature Request: Fragment support in llm chat command

Currently, the llm prompt command supports fragments via the -f/--fragment and --sf/--system-fragment options, but the llm chat command does not.
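
For example, the existing prompt command can pull fragments from files, URLs, or saved aliases (the file path, URL, and style-guide alias below are just placeholders):

llm prompt -f README.md -f https://example.com/docs.md "Summarize these documents"
llm prompt --sf style-guide "Write a commit message for this diff"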

Use Case

When engaging in an ongoing chat with a model, it would be useful to include fragments (documentation, code, context) in the conversation. This would make it possible to maintain that context throughout the session, similar to how a Claude Workspace lets users pin documents to a conversation.

Proposed Solution

Add -f/--fragment and --sf/--system-fragment options to the llm chat command, similar to how they work in the llm prompt command:

llm chat -f fragment1 -f fragment2 --sf system-fragment

Benefits

  • Consistent experience between llm prompt and llm chat
  • Ability to maintain reference documentation throughout a chat session
  • Better support for complex, context-rich conversations
  • Users can easily add context from files, URLs or saved fragments into ongoing conversations

Implementation Recommendation

  1. Update the chat function signature in cli.py to include fragment parameters:
@cli.command()
@click.option("-s", "--system", help="System prompt to use")
@click.option("model_id", "-m", "--model", help="Model to use")
@click.option(
    "fragments",
    "-f",
    "--fragment",
    multiple=True,
    help="Fragment (alias, URL, hash or file path) to add to the prompt",
)
@click.option(
    "system_fragments",
    "--sf",
    "--system-fragment",
    multiple=True,
    help="Fragment to add to system prompt",
)
# existing parameters...
def chat(
    system,
    model_id,
    fragments,
    system_fragments,
    # existing parameters...
):
    # ...
  2. Add fragment handling logic similar to the prompt command:
# Resolve fragment references into their content. (load_fragment here
# stands in for whatever helper the prompt command uses to resolve
# aliases, URLs, hashes, and file paths.)
fragment_content = [load_fragment(fragment) for fragment in fragments]
system_fragment_content = [load_fragment(fragment) for fragment in system_fragments]

# Append fragment content to the user prompt
if fragment_content:
    joined = "\n\n".join(fragment_content)
    prompt_text = f"{prompt_text}\n\n{joined}" if prompt_text else joined

# Append system fragment content to the system prompt
if system_fragment_content:
    joined = "\n\n".join(system_fragment_content)
    system_prompt = f"{system_prompt}\n\n{joined}" if system_prompt else joined
  3. Ensure fragments persist between chat messages (see the sketch below):
    • Store fragment references in the chat session
    • Re-apply fragment content for each new message in the conversation
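
A minimal sketch of how this could work, assuming the hypothetical load_fragment helper from above and a simplified REPL loop (the real chat loop in cli.py is more involved). It uses the documented model.conversation() API from llm's Python library:

# Illustrative sketch only: load_fragment and this loop structure are
# assumptions, not the actual cli.py internals.
fragment_content = [load_fragment(fragment) for fragment in fragments]

conversation = model.conversation()
while True:
    user_input = input("> ")
    if user_input in ("exit", "quit"):
        break
    # Re-apply the pinned fragment content so each new message
    # carries the same context
    message = "\n\n".join(fragment_content + [user_input])
    response = conversation.prompt(message, system=system_prompt)
    print(response.text())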

Current Workaround

Currently, users need to use llm prompt with the fragments and manually continue the conversation, which breaks the natural chat flow.
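
For reference, the workaround looks roughly like this, combining llm prompt -f with the existing -c/--continue flag (the file path is a placeholder); this picks up the most recent conversation but does not re-apply the fragments to subsequent messages:

llm prompt -f docs/api.md "Explain the auth flow"
llm chat -c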
