remove scaled_mm fallback #1746


Open · wants to merge 1 commit into base: main
Conversation

yuchengliu1
Contributor

scaled_mm is now supported in PyTorch (pytorch/pytorch#140972).
This fallback would cause a duplicate registration.
Remove this fallback once pytorch/pytorch#140972 is merged.
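To illustrate why the stale fallback has to go: a dispatcher typically rejects a second kernel registered for the same operator and backend. The sketch below is a minimal, hypothetical registry (not PyTorch's actual C++ dispatcher); the names `OpRegistry` and `aten::_scaled_mm`'s lambda kernels are assumptions for illustration only.

```python
# Minimal sketch of duplicate-registration failure, assuming a registry that
# rejects a second kernel for the same (operator, backend) pair -- the same
# class of error the PR description refers to.

class OpRegistry:
    def __init__(self):
        self._kernels = {}

    def register(self, op_name, kernel):
        # PyTorch's dispatcher similarly errors out when two libraries
        # register a kernel for the same operator on the same dispatch key.
        if op_name in self._kernels:
            raise RuntimeError(f"duplicate registration for {op_name}")
        self._kernels[op_name] = kernel

registry = OpRegistry()
# PyTorch core now provides the kernel (pytorch/pytorch#140972):
registry.register("aten::_scaled_mm", lambda *args: "core kernel")
try:
    # The out-of-tree XPU fallback would then be a second registration:
    registry.register("aten::_scaled_mm", lambda *args: "fallback kernel")
except RuntimeError as e:
    print(e)  # prints "duplicate registration for aten::_scaled_mm"
```

Removing the fallback from the XPU backend leaves only the core registration, so the conflict never occurs.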

Copilot AI review requested due to automatic review settings · June 16, 2025 07:40

Copilot AI left a comment


Pull Request Overview

This PR removes the "_scaled_mm" fallback registration from the XPU backend, preventing the duplicate-registration error described in the PR description now that PyTorch core registers the operator.

  • Remove the "_scaled_mm" fallback entry to prevent duplicate registrations
  • Align the XPU fallback registration with the updates in PyTorch core
