
A Unified Scheme of ResNet and Softmax (2309.13482v1)

Published 23 Sep 2023 in cs.LG and stat.ML

Abstract: LLMs have brought significant changes to human society. Softmax regression and residual neural networks (ResNet) are two important techniques in deep learning: they not only serve as significant theoretical components supporting the functionality of LLMs but also relate to many other fields of machine learning and theoretical computer science, including but not limited to image classification, object detection, semantic segmentation, and tensors. Previous research has studied these two concepts separately. In this paper, we provide a theoretical analysis of the regression problem $\| \langle \exp(Ax) + Ax, {\bf 1}_n \rangle^{-1} ( \exp(Ax) + Ax ) - b \|_2^2$, where $A$ is a matrix in $\mathbb{R}^{n \times d}$, $b$ is a vector in $\mathbb{R}^n$, and ${\bf 1}_n$ is the $n$-dimensional vector whose entries are all $1$. This regression problem is a unified scheme that combines softmax regression and ResNet, which has not been done before. We derive the gradient, Hessian, and Lipschitz properties of the loss function. The Hessian is shown to be positive semidefinite, and its structure is characterized as the sum of a low-rank matrix and a diagonal matrix, which enables an efficient approximate Newton method. This unified scheme thus connects two fields previously thought to be unrelated and provides novel insight into the loss landscape and optimization of over-parameterized neural networks, which is meaningful for future research on deep learning models.
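For concreteness, here is a minimal NumPy sketch of the unified loss stated in the abstract. It follows the paper's notation ($A$, $x$, $b$, ${\bf 1}_n$), but the function name `unified_loss` and the random test data are illustrative assumptions, not code from the paper:

```python
import numpy as np

def unified_loss(A, x, b):
    """Sketch of the unified ResNet + softmax regression loss
    L(x) = || <exp(Ax) + Ax, 1_n>^{-1} (exp(Ax) + Ax) - b ||_2^2.
    Assumes the normalizer <exp(Ax) + Ax, 1_n> is nonzero."""
    u = np.exp(A @ x) + A @ x            # exp(Ax) + Ax: softmax-style term plus residual (ResNet) term
    alpha = np.dot(u, np.ones(len(u)))   # <u, 1_n>: normalizing inner product
    f = u / alpha                        # normalized prediction vector
    return np.sum((f - b) ** 2)          # squared l2 error against target b

# Illustrative usage with random data
rng = np.random.default_rng(0)
n, d = 8, 3
A = rng.standard_normal((n, d))
x = rng.standard_normal(d)
b = rng.standard_normal(n)
print(unified_loss(A, x, b))
```

Dropping the residual term `A @ x` from `u` recovers ordinary softmax regression, which is the sense in which this loss unifies the two schemes.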

Authors (3)
  1. Zhao Song (253 papers)
  2. Weixin Wang (14 papers)
  3. Junze Yin (26 papers)
Citations (5)