AdaGrad
Title : Adaptive subgradient methods for online learning and stochastic optimization
Author : J. Duchi, E. Hazan, and Y. Singer
Description : One of the most widely used adaptive optimization algorithms in machine learning; it scales each parameter's learning rate by the inverse square root of that parameter's accumulated squared gradients (see the sketch after this entry).
Publication : Journal of Machine Learning Research, 12:2121–2159, July 2011.
Used in : Wide & Deep Learning for Recommender Systems
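A minimal sketch of the per-coordinate AdaGrad update, assuming a NumPy setting; the names adagrad_update, accum, lr, and eps are illustrative, not taken from the paper.

```python
import numpy as np

def adagrad_update(w, grad, accum, lr=0.1, eps=1e-8):
    # Accumulate the squared gradient per coordinate and scale the step
    # by the inverse square root of that accumulator.
    accum += grad ** 2
    w -= lr * grad / (np.sqrt(accum) + eps)
    return w, accum

# Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is simply w.
w = np.array([1.0, -2.0, 3.0])
accum = np.zeros_like(w)
for _ in range(100):
    grad = w.copy()
    w, accum = adagrad_update(w, grad, accum)
print(w)  # w has moved toward the minimizer at the origin
```

Because frequently updated coordinates accumulate large squared gradients, their effective learning rate shrinks faster than that of rare coordinates, which is what makes AdaGrad well suited to sparse features.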
FTRL : Follow the regularized leader
Title : Follow-the-regularized-leader and mirror descent: Equivalence theorems and L1 regularization
Author : H. B. McMahan
Description : Establishes equivalence theorems between follow-the-regularized-leader and mirror descent for online learning, and analyzes FTRL with L1 regularization for producing sparse models (a sketch follows this entry).
Publication : In Proc. AISTATS, 2011.
Used in : Wide & Deep Learning for Recommender Systems
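A minimal per-coordinate sketch in the spirit of FTRL-Proximal with L1/L2 regularization, the style of FTRL used to train the wide component of Wide & Deep; the names z, n, alpha, beta, l1, and l2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ftrl_proximal_update(w, g, z, n, alpha=0.1, beta=1.0, l1=1.0, l2=0.1):
    # Per-coordinate accumulators: z holds the adjusted gradient sum,
    # n holds the sum of squared gradients.
    sigma = (np.sqrt(n + g ** 2) - np.sqrt(n)) / alpha
    z += g - sigma * w
    n += g ** 2
    # Closed-form minimizer of the regularized-leader objective:
    # coordinates with |z| <= l1 are set exactly to zero.
    w_new = np.where(
        np.abs(z) <= l1,
        0.0,
        -(z - np.sign(z) * l1) / ((beta + np.sqrt(n)) / alpha + l2),
    )
    return w_new, z, n
```

The thresholding step is the practical payoff of the L1 term: coordinates whose accumulated signal stays below l1 are held at exactly zero, which keeps the learned weight vector sparse.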