Article overview
Title: Neural Collapse in Deep Linear Network: From Balanced to Imbalanced Data

Authors: Hien Dang; Tan Nguyen; Tho Tran; Hung Tran; Nhat Ho

Date: 1 Jan 2023

Abstract: Modern deep neural networks have achieved superhuman performance on tasks ranging from image classification to game playing. Surprisingly, these complex systems with massive numbers of parameters exhibit the same remarkable structural properties in their last-layer features and classifiers across canonical datasets. This phenomenon, known as "Neural Collapse," was discovered empirically by Papyan et al. (2020). Recent papers have shown theoretically that the global solutions of the network training problem, under a simplified "unconstrained feature model," exhibit this phenomenon. We take a step further and prove that Neural Collapse occurs in deep linear networks under the popular mean squared error (MSE) and cross-entropy (CE) losses. Furthermore, we extend our analysis to imbalanced data for the MSE loss and present the first geometric analysis of Neural Collapse in this setting.

Source: arXiv, 2301.00437
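The "structural properties" the abstract refers to include the simplex equiangular tight frame (ETF) geometry described in the Neural Collapse literature: the class means of the last-layer features become unit-norm vectors whose pairwise cosine is exactly -1/(K-1) for K classes. A minimal NumPy sketch (illustrative only, not code from the paper) constructs such a frame and checks these two properties via its Gram matrix:

```python
import numpy as np

def simplex_etf(K: int) -> np.ndarray:
    """Return a K x K matrix whose columns form a K-simplex ETF:
    unit-norm vectors with pairwise cosine exactly -1/(K-1)."""
    return np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

K = 4
M = simplex_etf(K)
G = M.T @ M  # Gram matrix of the frame vectors

# Diagonal entries equal 1 (unit norm); off-diagonal entries equal
# -1/(K-1), the maximally separated configuration that the collapsed
# class means are observed to form.
print(np.round(G, 4))
```

The frame spans only a (K-1)-dimensional subspace (the rows of M sum to zero), which matches the standard description of the simplex ETF living in K-1 dimensions.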