
webui: add server info to chat message #14065


Open · wants to merge 1 commit into master

Conversation

@am17an am17an commented Jun 8, 2025

Fixes #13570. Tested this change locally; it adds a model card similar to the timing card, along with a similar checkbox to toggle the behavior.

[screenshot: model_card]

Also checked that it is getting added to IndexedDB:

[screenshot: storage]

I purposefully did not store the whole serverProps object because it is quite large.

@am17an (Author)
Sorry, this file needs to be removed.

@ngxson ngxson (Collaborator) left a comment

As said in the last comment, I think the web UI code has reached a state where we should put more thought into edge cases and the UX, rather than "just make it work".

Comment on lines +208 to +210
<br />- Path: {msg.serverProps.model_path}
<br />- Build: {msg.serverProps.build_info}
<br />- Context: {msg.serverProps.n_ctx}
Collaborator

use <ul> <li> instead
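
For illustration only, a minimal sketch of the list-based markup being suggested (styling classes omitted):

    <ul>
      <li>Path: {msg.serverProps.model_path}</li>
      <li>Build: {msg.serverProps.build_info}</li>
      <li>Context: {msg.serverProps.n_ctx}</li>
    </ul>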

@am17an am17an (Author) Jun 8, 2025
I did it this way to maintain homogeneity with the other dropdown. Let me know if we should still use <ul> <li> here:

<div className="dropdown-content bg-base-100 z-10 w-64 p-2 shadow mt-4">
                    <b>Prompt</b>
                    <br />- Tokens: {timings.prompt_n}
                    <br />- Time: {timings.prompt_ms} ms
                    <br />- Speed: {timings.prompt_per_second.toFixed(1)} t/s
                    <br />
                    <b>Generation</b>
                    <br />- Tokens: {timings.predicted_n}
                    <br />- Time: {timings.predicted_ms} ms
                    <br />- Speed: {timings.predicted_per_second.toFixed(1)} t/s
                    <br />
                  </div>
                  

className="cursor-pointer font-semibold text-sm opacity-60"
>
Model:{' '}
{msg.serverProps.model_path?.split(/(\\|\/)/).pop()}
Collaborator

this logic should be abstracted into a function called modelNameFromPath()
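
For illustration, a minimal sketch of such a helper (assuming it would live in a shared utils module); the regex simply mirrors the inline expression quoted above:

    // Extract the file name from a model path, handling both "/" and "\" separators.
    export function modelNameFromPath(modelPath?: string): string | undefined {
      return modelPath?.split(/(\\|\/)/).pop();
    }

    // Usage in the JSX above:
    // Model: {modelNameFromPath(msg.serverProps.model_path)}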

Comment on lines +191 to +197
serverProps: serverProps
  ? {
      model_path: serverProps.model_path,
      build_info: serverProps.build_info,
      n_ctx: serverProps.n_ctx,
    }
  : null,
Collaborator

This is quite fragile for two reasons:

  • If the user changes the model but forgets to reload the frontend, this serverProps object will be out of sync
  • This design is not future-proof. If we decide to add a new key to serverProps in the future, it will get quite messy. The serverProps object is not supposed to be saved and reused later

@am17an (Author)

@ngxson thanks for your review!

getServerProps is only called once on mount, so at the moment there is no way to keep these in sync if the backend changes but the frontend has not reloaded. One imprecise solution is to periodically poll the backend /health API and reload once the server comes back online. Let me know if that works.

As for future-proofing the design, do you want to decouple these things? I think it is still fine to read these values from the serverProps object, but store them as separate typed fields in the message.
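
For illustration, a rough sketch of that polling idea (not part of this PR). It assumes the server's /health endpoint and a hypothetical fetchServerProps() helper that re-fetches /props and updates the stored serverProps:

    // Poll /health periodically; once the server comes back online,
    // re-fetch server props so new messages record the current model.
    const POLL_INTERVAL_MS = 10_000;
    let serverWasDown = false;

    setInterval(async () => {
      try {
        const res = await fetch('/health');
        if (!res.ok) {
          serverWasDown = true;
        } else if (serverWasDown) {
          // Server recovered; refresh props so they are not stale.
          serverWasDown = false;
          await fetchServerProps(); // hypothetical helper that re-reads /props
        }
      } catch {
        serverWasDown = true; // unreachable; retry on the next tick
      }
    }, POLL_INTERVAL_MS);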

Successfully merging this pull request may close these issues.

Feature Request: Save Model Name in Conversation Chats (WebUI)