
Compatibility of LoRA-GA with other LoRA variants

Determine whether LoRA-GA retains its performance gains when combined with other LoRA architectures or improvements, such as ReLoRA, which periodically merges learned adapters into the frozen base weights; that is, assess its compatibility and efficacy within modified LoRA frameworks.


Background

Because LoRA-GA only alters the adapter initialization, it can, in principle, be combined with other LoRA variants; however, the paper does not implement or evaluate such combinations.

The authors specifically mention ReLoRA as an example and note that they cannot ascertain whether LoRA-GA would perform equally well alongside other LoRA architectures or improvements.
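Since LoRA-GA changes only how the adapters are initialized, its composition with ReLoRA-style merge-and-restart training can be sketched directly. The snippet below is a minimal, illustrative sketch, not the authors' implementation: the gradient-SVD initialization is a simplified rendering of LoRA-GA's idea that omits the paper's exact scaling scheme, and the class name, method names, and hyperparameters are hypothetical choices for illustration.

```python
# Hedged sketch: an initialization-only, LoRA-GA-style gradient init
# composed with a ReLoRA-style merge-and-restart step. Simplified;
# not the official LoRA-GA or ReLoRA code.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Frozen base weight W0 (out_features x in_features).
        self.weight = nn.Parameter(torch.empty(out_features, in_features),
                                   requires_grad=False)
        nn.init.kaiming_uniform_(self.weight)
        self.r, self.scale = r, alpha / r
        # Trainable low-rank adapter: update = scale * B @ A.
        self.A = nn.Parameter(torch.zeros(r, in_features))
        self.B = nn.Parameter(torch.zeros(out_features, r))

    def forward(self, x):
        return x @ (self.weight + self.scale * self.B @ self.A).T

    @torch.no_grad()
    def gradient_init(self, grad_W):
        # Gradient-approximation init (LoRA-GA-style, simplified):
        # align the initial low-rank update with the top singular
        # directions of an estimated full-weight gradient grad_W.
        # Assumes 2 * r <= min(out_features, in_features).
        U, S, Vh = torch.linalg.svd(grad_W, full_matrices=False)
        self.A.copy_(Vh[: self.r])               # top-r right singular vectors
        self.B.copy_(U[:, self.r : 2 * self.r])  # next-r left singular vectors
        # Keep the layer's output unchanged at initialization by
        # absorbing the now-nonzero adapter product into the frozen weight.
        self.weight -= self.scale * self.B @ self.A

    @torch.no_grad()
    def merge_and_reset(self):
        # ReLoRA-style step: fold the learned adapter into the frozen
        # weight, then restart the adapter (here with the standard
        # zero-update init; reapplying gradient_init instead is exactly
        # the untested combination the open question asks about).
        self.weight += self.scale * self.B @ self.A
        nn.init.kaiming_uniform_(self.A)
        self.B.zero_()

# Illustrative use with a stand-in gradient estimate (in practice the
# gradient would be estimated on sampled batches before training).
layer = LoRALinear(64, 64, r=4)
layer.gradient_init(torch.randn(64, 64))
# ... train the adapter ...
layer.merge_and_reset()  # ReLoRA-style restart
```

The sketch makes the open question concrete: because ReLoRA repeatedly reinitializes the adapter, it is unclear whether LoRA-GA's one-time gradient-based initialization would still confer its reported benefits after the first merge, or whether it would need to be reapplied at every restart.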

References

Additionally, we did not implement our method on other LoRA variants that are orthogonal to our improvements (e.g., ReLoRA). Therefore, we cannot ascertain whether LoRA-GA would perform equally well with other LoRA architectures/improvements.

LoRA-GA: Low-Rank Adaptation with Gradient Approximation (arXiv:2407.05000, Wang et al., 6 Jul 2024), Section: Limitations