About Me - I'm a final-year B.E. CSE student at Thapar University, Patiala, currently working as a Research Intern at IIT Madras. I'm working on problems related to recommender systems along with Avijit Saha (a graduate student at IIT Madras). My LinkedIn profile - Rishabh Misra
Mentor - Dr. Balaraman Ravindran
Areas of Interest - Machine Learning, Recommender Systems, Artificial Intelligence
Problems working on -
1 - Variational Bayesian Factorization Machine - The Factorization Machine (FM) combines the advantages of SVMs and factorization models. An FM models all pairwise interactions between variables using latent factors, so it can estimate interactions even in problems with huge sparsity, where SVMs fail. An FM can also mimic various factorization models just by an appropriate choice of input features, which makes it easily applicable even for users without expert knowledge of factorization models. The simplest way to learn an FM is SGD, which is scalable due to its online nature but prone to over-fitting. An alternative is the Bayesian Factorization Machine, which achieves state-of-the-art performance; however, since it is a batch method, scaling to larger problems such as the Yahoo! Music dataset is an issue, and for its Markov chain Monte Carlo inference it is harder to assess convergence. Our aim is to introduce a Variational Bayesian approach to FM, along with an online version that scales well to large datasets.
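To make the FM model concrete, here is a minimal sketch of the second-order FM prediction in NumPy (the function name and toy parameters are illustrative, not part of our implementation). It uses the standard identity that rewrites the pairwise-interaction sum in O(kn) instead of O(kn^2):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order Factorization Machine prediction for one feature vector x.

    y(x) = w0 + <w, x> + sum_{i<j} <V[i], V[j]> * x_i * x_j

    The pairwise term is computed in O(k*n) via the identity
    0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ].
    """
    linear = w0 + w @ x
    s = V.T @ x                    # per-factor weighted sums, shape (k,)
    s2 = (V ** 2).T @ (x ** 2)     # per-factor sums of squares
    pairwise = 0.5 * np.sum(s ** 2 - s2)
    return linear + pairwise
```

With sparse one-hot user/item features this reduces to classic matrix factorization, which is what makes the FM a generalization of those models.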
2 - Scalable Probabilistic Matrix Factorization using Markov chain Monte Carlo - A standard way to solve the matrix completion problem is low-rank factorization. A simple approach is SGD; though SGD for matrix factorization scales well, it is prone to over-fitting. Another approach, Bayesian PMF, provides state-of-the-art performance on the Netflix data, but its O(K^3) complexity, where K is the low-rank factor, prohibits the method from being applied to very large datasets like the KDD music dataset. Our motivation is to provide an algorithm with O(K) complexity that still gives good results.
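For reference, the SGD baseline mentioned above can be sketched as follows (a toy version, assuming dense rating triples; the function names, learning rate, and L2 penalty `reg` are illustrative choices, with `reg` standing in for the regularization that plain SGD needs to limit over-fitting):

```python
import numpy as np

def sgd_mf(ratings, n_users, n_items, k=2, lr=0.05, reg=0.01, epochs=200, seed=0):
    """Fit R ~= U @ V.T by SGD over observed (user, item, rating) triples.

    Each factor update costs O(k), which is what makes SGD scale well;
    reg is the L2 penalty that counters over-fitting.
    """
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n_users, k))
    V = 0.1 * rng.standard_normal((n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - U[u] @ V[i]                 # prediction error on this entry
            du = err * V[i] - reg * U[u]          # gradient steps for both factors
            dv = err * U[u] - reg * V[i]
            U[u] += lr * du
            V[i] += lr * dv
    return U, V

def rmse(ratings, U, V):
    errs = [r - U[u] @ V[i] for u, i, r in ratings]
    return float(np.sqrt(np.mean(np.square(errs))))
```

Bayesian PMF instead samples the factors with MCMC, and its per-user Gaussian sampling step involves a K x K covariance, which is where the O(K^3) cost comes from.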