VRAM leak when extension is installed #13
Comments
oh
you mean also the normal lora?
Yes, even just using regular loras. At first I thought the medvram/xformers settings didn't know how to flush locon/lycoris, but even with only regular loras the VRAM eventually fills and won't empty. I haven't tried generating without any loras at all, though. It's also somewhat random when this happens: sometimes I can generate 100 images before it breaks, sometimes it happens within the first couple of images. The GPU is an 8GB 3060 Ti.
Tried this extension again and I'm no longer having this issue with the latest version of webui, the latest version of this extension, pytorch 2, xformers 0.18, and all my extensions updated. I don't know which of those fixed it, but it's fixed for me.
When this extension is installed, after generating a few images with either lora or locon the VRAM fills up, and there is no way to free it without restarting. Uninstalling the extension fixes the problem.
COMMANDLINE_ARGS= --listen --medvram --xformers --api
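For anyone hitting the same symptom, a minimal diagnostic sketch of how one might check whether the VRAM is held by PyTorch's caching allocator (reclaimable with `torch.cuda.empty_cache()`) or by live tensor references the extension failed to drop. The helper names here are hypothetical, not part of webui; `torch` is assumed available, and on a machine without CUDA the helpers just report zero:

```python
# Hedged sketch: inspect and try to release cached CUDA memory in a
# PyTorch session. If allocated memory stays high after flush_vram(),
# something (e.g. an extension) is still holding tensor references.
import gc


def report_vram():
    """Return (allocated_bytes, reserved_bytes), or (0, 0) without CUDA."""
    try:
        import torch
        if torch.cuda.is_available():
            return torch.cuda.memory_allocated(), torch.cuda.memory_reserved()
    except ImportError:
        pass
    return (0, 0)


def flush_vram():
    """Drop unreachable Python objects, then release cached CUDA blocks."""
    gc.collect()  # free tensors that are only kept alive by reference cycles
    try:
        import torch
        if torch.cuda.is_available():
            torch.cuda.empty_cache()  # return cached blocks to the driver
    except ImportError:
        pass


allocated, reserved = report_vram()
print(f"allocated={allocated} reserved={reserved}")
flush_vram()
```

`empty_cache()` only returns memory the allocator has cached but no tensor is using, which is why a true leak (live references inside an extension) survives it and only a restart helps.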