One Reflection Suffice
Published 30 Sep 2020 in cs.LG and stat.ML | (2009.14554v1)
Abstract: Orthogonal weight matrices are used in many areas of deep learning. Much previous work attempts to reduce the additional computational cost required to constrain weight matrices to be orthogonal. One popular approach uses a product of many Householder reflections; its main practical drawback is that many reflections cause low GPU utilization. We mitigate this drawback by proving that one reflection is sufficient, if the reflection is computed by an auxiliary neural network.
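To make the mechanism in the abstract concrete, below is a minimal sketch of a single Householder reflection whose reflection vector is produced by an auxiliary network. This is an illustrative reading of the abstract, assuming PyTorch; the class names (`HouseholderLayer`, the `aux` sub-network) and the choice to condition the auxiliary network on the layer input are assumptions, not the paper's exact construction.

```python
import torch
import torch.nn as nn

class HouseholderLayer(nn.Module):
    """Sketch: apply one Householder reflection H = I - 2 v v^T / ||v||^2,
    where v is output by an auxiliary network (assumed to take the layer
    input; the paper may condition it differently)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.aux = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim)
        )

    def forward(self, x):
        v = self.aux(x)                                # (batch, dim)
        v = v / (v.norm(dim=-1, keepdim=True) + 1e-8)  # unit reflection vector
        # H x = x - 2 v (v^T x); for any unit v this map preserves the norm of x.
        return x - 2.0 * v * (v * x).sum(dim=-1, keepdim=True)

# Usage: the transform is norm-preserving per sample, as expected of a reflection.
layer = HouseholderLayer(dim=16)
x = torch.randn(8, 16)
y = layer(x)
print(torch.allclose(x.norm(dim=-1), y.norm(dim=-1), atol=1e-5))  # True
```

Because the reflection vector is recomputed per input by the auxiliary network, a single reflection can stand in for a long product of fixed Householder reflections, which is the GPU-utilization bottleneck the abstract refers to.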