SuperLoRA: Parameter-Efficient Unified Adaptation of Multi-Layer Attention Modules

Published 18 Mar 2024 in cs.CV, cs.AI, and cs.LG (arXiv:2403.11887v1)

Abstract: Low-rank adaptation (LoRA) and its variants are widely employed to fine-tune large models, including LLMs for natural language processing and diffusion models for computer vision. This paper proposes SuperLoRA, a generalized framework that unifies and extends the different LoRA variants, each of which can be recovered under a particular hyper-parameter setting. By introducing grouping, folding, shuffling, projecting, and tensor factoring, SuperLoRA offers high flexibility compared with other LoRA variants and demonstrates superior performance on transfer learning tasks, especially in the extremely few-parameter regime.
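To make the abstract's mechanisms concrete, below is a minimal PyTorch sketch, not taken from the paper, of the grouping and folding ideas: instead of giving every weight matrix its own low-rank adapter as plain LoRA does, the weight updates for a whole group of layers are generated by a single low-rank product over a folded (roughly square) matrix and then split back per layer. The function name `grouped_lora_deltas`, the square-ish folding heuristic, and all other identifiers are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

def grouped_lora_deltas(layer_shapes, rank=4):
    """Illustrative SuperLoRA-style grouping + folding (assumed API).

    A single rank-`rank` product generates the weight updates for a
    whole group of layers, e.g. all attention projections in a block.

    layer_shapes: list of (out_features, in_features) per grouped layer.
    Returns the per-layer delta-W tensors and the trainable factors.
    """
    total = sum(o * i for o, i in layer_shapes)
    # Fold the group's flattened update into a roughly square matrix so a
    # rank-r factorization needs only r * (rows + cols) parameters.
    rows = max(1, int(total ** 0.5))
    cols = -(-total // rows)  # ceiling division; padding is dropped below
    A = nn.Parameter(torch.randn(rank, cols) * 0.01)
    B = nn.Parameter(torch.zeros(rows, rank))  # zero init => update starts at 0
    flat = (B @ A).reshape(-1)[:total]
    # Split the shared update back into one delta-W per layer in the group.
    deltas, offset = [], 0
    for out_f, in_f in layer_shapes:
        deltas.append(flat[offset:offset + out_f * in_f].reshape(out_f, in_f))
        offset += out_f * in_f
    return deltas, (A, B)

# Example: group the Q, K, V, O projections of one attention block.
shapes = [(768, 768)] * 4
deltas, (A, B) = grouped_lora_deltas(shapes, rank=4)
print([d.shape for d in deltas])  # four 768x768 updates from one factor pair
```

For these four 768x768 projections the folded matrix is 1536x1536, so the grouped adapter trains rank * (1536 + 1536) = 12,288 parameters, versus 4 * rank * (768 + 768) = 24,576 for four independent rank-4 LoRA adapters; this kind of saving illustrates the extremely few-parameter regime the abstract highlights.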
