Zygmunt Zając at fastml.com has a great post titled “13 NIPS Papers that caught our eye,” with short, readable summaries of his favorite papers from NIPS 2013, which ended just last week. The papers are:
- Understanding Dropout by Baldi and Sadowski
- Training and Analysing Deep Recurrent Neural Networks by Hermans and Schrauwen
- RNADE: The real-valued neural autoregressive density-estimator by Uria, Murray, and Larochelle
- Predicting Parameters in Deep Learning by Denil, Shakibi, Dinh, Ranzato, and de Freitas
- Pass-efficient unsupervised feature selection by Maung and Schweitzer
- Multi-Prediction Deep Boltzmann Machines by Goodfellow, Mirza, Courville, and Bengio
- Memoized Online Variational Inference for Dirichlet Process Mixture Models by Hughes and Sudderth
- Learning word embeddings efficiently with noise-contrastive estimation by Mnih and Kavukcuoglu
- Learning Stochastic Feedforward Neural Networks by Tang and Salakhutdinov
- Distributed Representations of Words and Phrases and their Compositionality by Mikolov, Sutskever, Chen, Corrado, and Dean
- Correlated random features for fast semi-supervised learning by McWilliams, Balduzzi, and Buhmann
- Convex Two-Layer Modeling by Aslan, Cheng, Zhang, and Schuurmans
- Approximate inference in latent Gaussian-Markov models from continuous time observations by Cseke, Opper, and Sanguinetti
I suggest reading Zając’s summaries before diving in.