Compatibility of LoRA-GA with other LoRA variants
Determine whether LoRA-GA achieves equally strong performance when combined with other LoRA architectures or improvements that are orthogonal to it, such as ReLoRA, which periodically merges the learned low-rank adapters into the frozen base weights, in order to assess LoRA-GA's compatibility and efficacy within modified LoRA frameworks.
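The merge-and-restart step that ReLoRA relies on can be sketched as below. This is a minimal illustration of the mechanism, not the paper's implementation: the shapes, initialization scales, and the `merge_and_reset` helper are assumptions introduced here. The key invariant is that folding `B @ A` into `W` and zeroing `B` leaves the effective weight `W + B @ A` unchanged, so training can continue from a fresh adapter.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # hypothetical weight dimension and adapter rank

# Frozen base weight and a trained low-rank adapter (effective weight: W + B @ A).
W = rng.standard_normal((d, d))
B = rng.standard_normal((d, r)) * 0.01  # stands in for a trained adapter factor
A = rng.standard_normal((r, d)) * 0.01

def merge_and_reset(W, B, A, rng):
    """ReLoRA-style restart: fold the learned low-rank update into the
    frozen weights, then re-initialize the adapter for the next cycle."""
    W = W + B @ A                       # merge the adapter into the base weight
    B = np.zeros_like(B)                # zero one factor so B @ A starts at 0
    A = rng.standard_normal(A.shape) * 0.01  # fresh init for the other factor
    return W, B, A

effective_before = W + B @ A
W, B, A = merge_and_reset(W, B, A, rng)
effective_after = W + B @ A

# The merge preserves the model's function while resetting adapter capacity.
assert np.allclose(effective_before, effective_after)
```

The open question above is whether LoRA-GA's gradient-based initialization remains beneficial when each restart re-initializes `A` and `B` in this way.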
References
Additionally, we did not implement our method on other LoRA variants that are orthogonal to our improvements (e.g., ReLoRA). Therefore, we cannot ascertain whether LoRA-GA would perform equally well with other LoRA architectures/improvements.
— LoRA-GA: Low-Rank Adaptation with Gradient Approximation
(2407.05000 - Wang et al., 6 Jul 2024) in Section: Limitations