
Add a workaround for broken sliced attention on MPS with torch 2.4.1 #7066


Merged
merged 2 commits into from
Oct 10, 2024

Conversation

RyanJDick (Contributor)

Summary

This PR adds a workaround for broken sliced attention on MPS with torch 2.4.1. The workaround keeps things working on MPS at the cost of increased peak memory utilization. Users who are unhappy with this should manually downgrade to torch==2.2.2.
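The shape of such a workaround can be sketched as a version gate: skip sliced attention only on the affected device/version combination and fall back to full attention there. This is a minimal illustration only, not Invoke's actual code; the helper name and the exact version check are assumptions.

```python
def use_sliced_attention(device_type: str, torch_version: str) -> bool:
    """Decide whether sliced attention is safe to use.

    Hypothetical helper: sliced attention is assumed broken on the "mps"
    device with torch 2.4.1, so we fall back to full (unsliced) attention
    there, accepting higher peak memory utilization.
    """
    if device_type == "mps" and torch_version.startswith("2.4.1"):
        return False
    return True
```

A caller on MPS with torch 2.4.1 would then compute attention in a single pass instead of in slices, which is exactly why peak memory rises under this workaround.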

Related Issues / Discussions

Bug report: #7049

QA Instructions

  • Test text-to-image on MPS

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • Documentation added / updated (if applicable)

@github-actions github-actions bot added python PRs that change python files backend PRs that change backend files labels Oct 8, 2024
@psychedelicious psychedelicious force-pushed the ryan/fix-mps-attn-torch241 branch from 4b63bfc to 2678b4c on October 10, 2024 23:13
@hipsterusername hipsterusername merged commit ac08c31 into main Oct 10, 2024
14 checks passed
@hipsterusername hipsterusername deleted the ryan/fix-mps-attn-torch241 branch October 10, 2024 23:23