read chat prompts from a template file #1196
Conversation
I think the nice thing about this example is that it basically creates a chat bot out of the standard LLaMA model. If you have the official models, there's nothing else to download or figure out to have a back-and-forth chat. I probably prefer a single-file example people can edit that also works with a base model.

That said, I'm not against breaking out the prompt and passing it in separately, but it's going to be a bit of work keeping the same chat in 3 different formats, and then maybe adding more formats as new models come along. If we did want to support prompts for different chat models, we might want to just have the chat script also convert regular chat-style conversations to alpaca/vicuna-1.0 style prompting (with something like …).

Edit: Keep in mind there's also a …
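The conversion idea above could look something like the following rough sketch. The speaker names and the vicuna-1.0 "USER:/ASSISTANT:" tags here are my assumptions for illustration, not anything specified in the PR:

```shell
# Hypothetical sketch: rewrite a plain "Name: text" chat transcript into
# vicuna-1.0 style prompting with a couple of sed substitutions.
cat > chat.txt <<'EOF'
User: Hello
Bob: Hi there
EOF

sed -e 's/^User:/USER:/' -e 's/^Bob:/ASSISTANT:/' chat.txt
# prints:
# USER: Hello
# ASSISTANT: Hi there
```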
This has now been rebased onto master.

N.B. occasionally with this patch, encountering the reverse prompt does not halt text generation. I believe this is triggered by the trailing whitespace I've added to the reverse prompt. Is this expected? If so, I can roll back the whitespace addition. Including the trailing whitespace is a slightly better user experience, but not if it causes the aforementioned issue. I tested this PR with the following invocations:
Yes, this is expected. There should be no whitespace at the end.
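A toy illustration of why the trailing space can defeat the reverse prompt (this is my own sketch, not llama.cpp's actual matcher): halting on a reverse prompt amounts to a suffix check on the text generated so far, and a trailing space typically arrives glued to the *next* token, after the check has already run.

```shell
# Simulate the suffix check a generator might run after each token.
ends_with() { case "$1" in *"$2") echo match ;; *) echo no-match ;; esac; }

ends_with "some reply
User:" "User:"    # prints "match": generation can halt here

ends_with "some reply
User:" "User: "   # prints "no-match": the trailing space has not been
                  # generated yet, so the reverse prompt is never seen
```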
@ggerganov thank you, I've fixed the whitespace in the reverse prompt. I think this is ready to merge now.
sed -e "s/\[\[USER_NAME\]\]/$USER_NAME/g" \
    -e "s/\[\[AI_NAME\]\]/$AI_NAME/g" \
    -e "s/\[\[DATE_TIME\]\]/$DATE_TIME/g" \
    -e "s/\[\[DATE_YEAR\]\]/$DATE_YEAR/g" \
    "$PROMPT_TEMPLATE" > "$PROMPT_FILE"
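As a self-contained sketch of the substitution above (the template text and values here are made up for illustration):

```shell
# Create a toy template using the [[PLACEHOLDER]] convention from the diff.
cat > prompt.tmpl <<'EOF'
Transcript of a dialog between [[USER_NAME]] and [[AI_NAME]].
EOF

USER_NAME="User"
AI_NAME="ChatLLaMa"

# Replace each placeholder with the corresponding shell variable.
sed -e "s/\[\[USER_NAME\]\]/$USER_NAME/g" \
    -e "s/\[\[AI_NAME\]\]/$AI_NAME/g" \
    prompt.tmpl
# prints: Transcript of a dialog between User and ChatLLaMa.
```

Note this breaks if a value contains the sed delimiter `/`, which is one motivation for the `envsubst` alternative suggested in review.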
export USER_NAME AI_NAME DATE_TIME DATE_YEAR

# Protect escaped backslashes (\\) and dollar signs (\$) in the template from
# envsubst by rewriting them to helper variable references, then let envsubst
# expand the exported variables and restore the literal characters.
configure() {
    local file="$1"
    sed 's/\\\\/\${__envsubst_b}/g;s/\\\$/\${__envsubst_d}/g' "${file}" | __envsubst_b=\\ __envsubst_d=\$ envsubst
}

configure "$PROMPT_TEMPLATE" > "$PROMPT_FILE"
This is a simple improvement to the chat-13B example that reads prompts from a text file instead of inlining them in the script. I've used this with Vicuna 13B with excellent results.