add save_load_state example #1150


Merged (3 commits) on Apr 24, 2023
Conversation

@xaedes (Collaborator) commented Apr 24, 2023

This is the save_load_state example script from #730 (comment)

It loads the model with the params read from the command line.
It then evaluates and prints the initial prompt and saves the state.
n_predict tokens are generated and printed.
The model is freed and loaded in a new context.
The state is loaded, and n_predict tokens are generated and printed again.
The n_predict generated tokens should be the same in both runs.

@@ -0,0 +1,133 @@
#include <vector>
#include <iostream>
Member

Suggested change:
- #include <iostream>
+ #include <cstdio>

Is better

auto n_prompt_tokens = llama_tokenize(ctx, params.prompt.c_str(), tokens.data(), tokens.size(), true);

if (n_prompt_tokens < 1) {
cout << "Failed to tokenize prompt" << endl;
Member

Suggested change:
- cout << "Failed to tokenize prompt" << endl;
+ fprintf(stderr, "%s : failed to tokenize prompt\n", __func__);

etc.

@xaedes (Collaborator, Author) commented Apr 24, 2023

I changed cout to printf / fprintf in the last commit.

@ggerganov (Member) commented Apr 24, 2023

Sorry, one more thing: all filenames use dashes (-) instead of underscores (_).

examples/save_load_state/save_load_state.cpp -> examples/save-load-state/save-load-state.cpp

@xaedes (Collaborator, Author) commented Apr 24, 2023

Sure, I changed it :)
