[Bug]: xformers always generate different results, seed is the same! #3967

Closed
1 task done
mykeehu opened this issue Oct 30, 2022 · 6 comments
Labels
bug-report Report of a bug, yet to be confirmed

Comments

@mykeehu
Contributor

mykeehu commented Oct 30, 2022

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

With the same parameters I get a different result for every generation if xformers is enabled. This is a big problem!
If xformers is disabled, the problem does not occur.

Steps to reproduce the problem

  1. I used these parameters for generation with xformers enabled:
     wood table top
     Steps: 25, Sampler: Euler a, CFG scale: 7, Seed: 1057871485, Size: 512x512, Model hash: 7460a6fa
  2. Generate multiple images, one by one.
  3. Compare the images.
  4. Do the same without xformers.
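For anyone who wants to sanity-check this outside the web UI, here is a minimal sketch using the diffusers library instead (my own illustration, not the web UI's code; the checkpoint id is an assumption, since only the hash 7460a6fa is given above): render the same prompt and seed twice with xformers attention enabled and compare the images.

```python
# Standalone determinism check with diffusers (illustrative sketch, not webui code).
import numpy as np
import torch
from diffusers import StableDiffusionPipeline, EulerAncestralDiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",      # assumed checkpoint, stands in for hash 7460a6fa
    torch_dtype=torch.float16,
).to("cuda")
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)  # "Euler a"
pipe.enable_xformers_memory_efficient_attention()  # comment out for the non-xformers run

def render(seed: int):
    g = torch.Generator("cuda").manual_seed(seed)
    return pipe("wood table top", num_inference_steps=25, guidance_scale=7.0,
                height=512, width=512, generator=g).images[0]

img_a, img_b = render(1057871485), render(1057871485)
# With deterministic attention these two renders should be pixel-identical.
print(bool((np.array(img_a) == np.array(img_b)).all()))
```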

What should have happened?

With the same seed and parameters, the result should be the same with xformers enabled as without, and should never change between generations.

Commit where the problem happens

d699720

What platforms do you use to access UI ?

Windows

What browsers do you use to access the UI ?

Google Chrome

Command Line Arguments

With xformers:
--xformers --ui-config-file 1my/ui-config-my.json --ui-settings-file 1my/config-my.json --autolaunch --gradio-img2img-tool color-sketch


Without xformers:
--ui-config-file 1my/ui-config-my.json --ui-settings-file 1my/config-my.json --autolaunch --gradio-img2img-tool color-sketch

Additional information, context and logs

No response

mykeehu added the bug-report label on Oct 30, 2022
@mykeehu
Contributor Author

mykeehu commented Oct 30, 2022

Here are some samples; the parameters are the same:
00123-1057871485-wood table top
00122-1057871485-wood table top
00121-1057871485-wood table top

I also tested with commit 17a2076, but the error persists. --reinstall-xformers did not help either.

@ClashSAN
Collaborator

I recall this always being a thing. Search the other related issues.

@ClashSAN
Collaborator

#3967

@0xdevalias

0xdevalias commented Nov 1, 2022

A friend mentioned that using xformers could make things non-deterministic, and that there were a lot of references to this in the issues on this repo. Wanting to understand it a bit better, and to link the potentially related issues together, I tried to find as many issues as I could that seem to relate to xformers causing non-deterministic / unstable / inconsistent results:

The following may potentially be related (ordered by issue number):

Issues:

Discussions:

I also came across this thread in the xformers repo; while I can't guarantee it's related, I wonder if it might be:

And a question I raised on a PR in the diffusers repo:
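To poke at the attention op in isolation (my own rough probe, not something taken from the linked threads), one can call xformers' memory_efficient_attention twice on identical inputs and compare; whether the two outputs match bit-for-bit can depend on the xformers version, the kernel it dispatches to, and the GPU:

```python
# Rough probe: does xformers attention return identical output for identical input?
import torch
import xformers.ops as xops

torch.manual_seed(0)
# [batch, seq_len, heads, head_dim] in fp16 on CUDA, roughly SD 1.x sized.
q = torch.randn(2, 4096, 8, 40, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

out1 = xops.memory_efficient_attention(q, k, v)
out2 = xops.memory_efficient_attention(q, k, v)

# Any mismatch here (even in the last bits) compounds over 25 denoising steps
# and would explain images drifting despite a fixed seed.
print(torch.equal(out1, out2), (out1 - out2).abs().max().item())
```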

@shirooo39

ah... no wonder I'm always getting very slightly different results from TheLastBen's implementation and my own simple implementation, even though I'm also using xformers

@MrKrzYch00

MrKrzYch00 commented Jun 3, 2024

When using xformers, I changed "Random number generator source" from GPU to NV in the settings, and the results became the same as when xformers was not used.
After changing it to CPU, the result matched (or came very close to) what I had gotten earlier when running the model on the CPU only, which I had done for exactly that purpose: comparing results.

So it seems that changing the random number generator source is all that is required to pin down a specific generation behaviour, rather than switching xformers on or off or running the model on the CPU specifically.

In other words, GPU RNG without xformers behaved the same as NV RNG with xformers in my particular case - this would need confirmation. The question then is: why does using xformers cause the GPU RNG to behave differently, while not using it makes it act like the NV RNG setting? Put that way, the GPU setting looks somewhat like undefined behaviour (if it acts differently on a slightly different software setup while the hardware stays the same).
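For context on why that setting changes anything (my reading, not something stated above): the RNG source decides which device's generator samples the initial latent noise, and PyTorch's CPU and CUDA generators produce different sequences for the same seed. A toy illustration, assuming the latent shape behind a 512x512 image:

```python
# Same seed, different generator device -> different starting noise -> different image.
import torch

seed, shape = 1057871485, (1, 4, 64, 64)  # assumed latent shape for a 512x512 image

cpu_noise = torch.randn(shape, generator=torch.Generator("cpu").manual_seed(seed))
gpu_noise = torch.randn(shape, generator=torch.Generator("cuda").manual_seed(seed),
                        device="cuda")

# The CPU and CUDA RNGs are different algorithms, so the tensors do not match
# even with an identical seed (the web UI's "NV" option is a separate source
# not shown here).
print(torch.allclose(cpu_noise.cuda(), gpu_noise))  # expected: False
```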
