
LoRA Wrapper for Existing Models #2943


Open
Glar35 opened this issue Mar 22, 2025 · 1 comment
Labels: feature

Comments


Glar35 commented Mar 22, 2025

Does a LoRA (Low-Rank Adaptation) wrapper for existing models already exist? If not, is it technically possible to construct one?

laggui (Member) commented Mar 24, 2025

There is no official implementation at this time 🙂 LoRA is a simple re-parametrization of the linear layer weights, so it should be possible to implement one on top of an existing model.

We'd like users to be able to leverage LoRA easily with Burn, so it's probably going to appear at some point!

laggui added the feature label on Mar 24, 2025