Event Details
Speaker Name
Dayal Singh Kalra
Speaker Institution
Condensed Matter Theory Center (CMTC)
Start Date & Time
2024-03-15 12:00 pm
End Date & Time
2024-03-15 12:00 pm
Semester
QuICS Event Type
Event Details

In the gradient descent dynamics of neural networks, the top eigenvalue of the Hessian of the loss (the sharpness) displays a variety of robust phenomena throughout training. These include an early regime in which the sharpness may decrease (sharpness reduction), and later-time behavior such as progressive sharpening and the edge of stability. We demonstrate that a simple $2$-layer linear network (the UV model) trained on a single training example exhibits all of the essential sharpness phenomenology observed in real-world scenarios. By analyzing the structure of dynamical fixed points in function space and the vector field of function updates, we uncover the underlying mechanisms behind these sharpness trends. Our analysis reveals (i) the mechanism behind early sharpness reduction and progressive sharpening, (ii) the conditions required for the edge of stability, and (iii) a period-doubling route to chaos on the edge-of-stability manifold as the learning rate is increased. Finally, we demonstrate that several predictions from this simplified model generalize to real-world scenarios and discuss its limitations.

Preprint: https://arxiv.org/abs/2311.02076
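A minimal numerical sketch of the kind of experiment the abstract describes: gradient descent on a scalar UV model with loss $L = \tfrac{1}{2}(uv - y)^2$ on a single example, tracking the top Hessian eigenvalue (sharpness) at each step. The loss form, initialization, and hyperparameters here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def hessian(u, v, y):
    """Hessian of the one-example loss L = 0.5*(u*v - y)**2 w.r.t. (u, v)."""
    r = u * v - y                          # residual
    return np.array([[v**2, r + u * v],
                     [r + u * v, u**2]])

def train(u, v, y=1.0, lr=0.05, steps=200):
    """Gradient descent on the scalar UV model; returns the sharpness trace."""
    sharpness = []
    for _ in range(steps):
        # top eigenvalue of the (symmetric) Hessian = sharpness
        sharpness.append(np.linalg.eigvalsh(hessian(u, v, y))[-1])
        r = u * v - y
        u, v = u - lr * r * v, v - lr * r * u   # simultaneous update of both layers
    return np.array(sharpness)
```

With a small balanced initialization (e.g. `train(0.1, 0.1)`), the sharpness rises toward its value $2y$ at the balanced minimum, a toy instance of progressive sharpening; pushing the learning rate toward $2/(2y)$ moves the dynamics into the edge-of-stability regime discussed in the talk.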

Pizza and drinks will be served after the seminar in ATL 2117.

Location
ATL 2324