My main research interest concerns the compression of machine learning models. Besides the direct gain in memory, compressed models offer several advantages: faster predictions, implicit regularization that can help prevent overfitting, and better interpretability.
In the context of big data, model compression also helps contain the storage explosion that comes with ever-growing datasets.
In addition, small models enable embedded ML without the need for a central processing node, sidestepping the technical and privacy issues that centralization entails.
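As a concrete illustration of one simple compression technique (not a method from my own work), the sketch below performs magnitude pruning: zeroing out the smallest weights of a layer so the model can be stored sparsely. The function name and the 90% sparsity level are illustrative choices.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the given fraction of smallest-magnitude weights
    (illustrative helper: one basic model-compression technique)."""
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value acts as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) > threshold, weights, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256))      # a dense weight matrix
pw = magnitude_prune(w, sparsity=0.9)
print(np.mean(pw == 0))              # fraction of pruned weights, about 0.9
```

A matrix that is 90% zeros can be stored in a sparse format at roughly a tenth of the dense memory cost, which is the kind of direct gain mentioned above.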
After a first paper on the pre-pruning of random forests, I switched to deep learning. The impressive performance of deep networks on image and speech recognition begs for them to be deployed more widely!
Other research interests include generative models, the interconnection of neural networks, (style) transfer, locality-sensitive hashing and information retrieval, probabilistic programming, etc.