
Notes on Deep Learning for NLP
My notes on Deep Learning for NLP....

Lecture Notes: Selected topics on robust statistical learning theory
These notes gather recent results on robust statistical learning theory....

The Rise and Fall of the Note: Changing Paper Lengths in ACM CSCW, 2000–2018
In this note, I quantitatively examine various trends in the lengths of ...

Lecture Notes on Support Preconditioning
These are lecture notes on support preconditioning, originally written ...

Lecture notes on descriptional complexity and randomness
A didactical survey of the foundations of Algorithmic Information Theory...

Notes on computational-to-statistical gaps: predictions using statistical physics
In these notes we describe heuristics to predict computational-to-statis...

Machine Assisted Authentication of Paper Currency: an Experiment on Indian Banknotes
This work addresses automatic authentication of paper currency. Indian bank n...
Notes on Deep Learning Theory
These are notes for the lectures I gave during Fall 2020 at the Moscow Institute of Physics and Technology (MIPT) and at the Yandex School of Data Analysis (YSDA). The notes cover some aspects of initialization, loss landscape, generalization, and neural tangent kernel theory. While many other topics (e.g. expressivity, mean-field theory, the double descent phenomenon) are missing from the current version, we plan to add them in future revisions.
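The abstract mentions neural tangent kernel theory. As a hedged illustration (not taken from the notes themselves), here is a minimal numpy sketch of the *empirical* NTK of a toy one-hidden-layer network: the kernel entry for inputs x and x' is the inner product of the parameter gradients of the network output at those inputs. The network shape, the 1/sqrt(width) output scaling, and the finite-difference gradient are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: scalar input, tanh hidden layer of width m,
# scalar output with 1/sqrt(m) scaling (common in NTK-style analyses).
m = 32
params = [rng.normal(size=(m,)),                 # input-to-hidden weights
          rng.normal(size=(m,)) / np.sqrt(m)]    # hidden-to-output weights

def f(x, params):
    w1, w2 = params
    return float(w2 @ np.tanh(w1 * x))

def param_grad(x, params, eps=1e-5):
    """Forward finite-difference gradient of f(x) w.r.t. all parameters, flattened."""
    flat = np.concatenate([p.ravel() for p in params])
    shapes = [p.shape for p in params]
    g = np.empty_like(flat)
    base = f(x, params)
    for i in range(flat.size):
        bumped = flat.copy()
        bumped[i] += eps
        # Unflatten the bumped vector back into the parameter list.
        ps, k = [], 0
        for s in shapes:
            n = int(np.prod(s))
            ps.append(bumped[k:k + n].reshape(s))
            k += n
        g[i] = (f(x, ps) - base) / eps
    return g

# Empirical NTK Gram matrix: Theta(x, x') = <df/dtheta(x), df/dtheta(x')>.
xs = [-1.0, 0.5, 2.0]
grads = [param_grad(x, params) for x in xs]
K = np.array([[gi @ gj for gj in grads] for gi in grads])
```

Because K is a Gram matrix of gradient vectors, it is symmetric and positive semidefinite by construction; at large width and suitable scaling, NTK theory studies the regime in which this kernel stays approximately fixed during training.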