Bamdev Mishra

Applied Machine Learning Researcher

Publications

[PhD thesis]   [Preprints]   [Journals]   [Conferences]   [Workshop papers]   [Talks]   [Posters]   [Patents]   [Miscellaneous]

A list of my publications is also available on Google Scholar.

Please contact me if you cannot find what you are looking for.

Preprints

M. Meghwanshi, P. Jawanpuria, A. Kunchukuttan, H. Kasai, and BM, “McTorch, a manifold optimization library for deep learning”, arXiv preprint arXiv:1810.01811, 2018 [arXiv:1810.01811][GitHub]. Also accepted at the MLOSS workshop at NeurIPS 2018.

A. R. Yelundur, S. H. Sengamedu, and BM, “E-commerce anomaly detection: a Bayesian semi-supervised tensor decomposition approach using natural gradients”, arXiv preprint arXiv:1804.03836, 2018 [arXiv:1804.03836].

BM, H. Kasai, P. Jawanpuria, and A. Saroop, “A Riemannian gossip approach to subspace learning on Grassmann manifold”, Technical report, 2017 [arXiv:1705.00467][Matlab codes]. It is an extension of the technical report [arXiv:1605.06968]. A shorter version was earlier accepted to the 9th NeurIPS workshop on optimization for machine learning (OPT2016) held in Barcelona.

H. Sato, H. Kasai, and BM, “Riemannian stochastic variance reduced gradient”, arXiv preprint arXiv:1702.05594, 2017 [arXiv:1702.05594]. A shorter version is “Riemannian stochastic variance reduced gradient on Grassmann manifold”, Technical report, 2016 [arXiv:1605.07367] [Matlab codes]. A still shorter version was accepted to the 9th NeurIPS workshop on optimization for machine learning (OPT2016) held in Barcelona.

Journal articles

P. Jawanpuria, B. Arjun, A. Kunchukuttan, and BM, “Learning multilingual word embeddings in latent metric space: a geometric approach”, accepted to TACL [arXiv:1808.08773][GitHub].

Y. Shi, BM, and W. Chen, “Topological interference management with user admission control via Riemannian optimization”, accepted to IEEE Transactions on Wireless Communications, 2017 [arXiv:1607.07252][Publisher’s pdf].

BM and R. Sepulchre, “Riemannian preconditioning”, SIAM Journal on Optimization, 26(1), pp. 635 – 660, 2016 [arXiv:1405.6055] [Publisher’s pdf].

Y. Sun, J. Gao, X. Hong, BM, and B. Yin, “Heterogeneous tensor decomposition for clustering via manifold optimization”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(3), pp. 476 – 489, 2016 [arXiv:1504.01777] [Publisher’s pdf].

N. Boumal, BM, P.-A. Absil, and R. Sepulchre, “Manopt, a Matlab toolbox for optimization on manifolds”, Journal of Machine Learning Research 15(Apr), pp. 1455 – 1459, 2014 [Publisher’s pdf] [arXiv:1308.5200] [Webpage].
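For readers new to the toolbox, here is a minimal usage sketch in the spirit of the dominant-eigenvector example from the Manopt documentation (it assumes Manopt is installed and on the Matlab path):

    % Minimal Manopt sketch: leading eigenvector of a symmetric matrix A,
    % obtained by minimizing -x'*A*x over the unit sphere in R^n.
    n = 100;
    A = randn(n); A = 0.5*(A + A');            % random symmetric test matrix
    problem.M = spherefactory(n);              % manifold: unit sphere in R^n
    problem.cost  = @(x) -x'*(A*x);            % cost to minimize
    problem.egrad = @(x) -2*A*x;               % Euclidean gradient; Manopt converts it to the Riemannian one
    [x, xcost, info] = trustregions(problem);  % Riemannian trust-region solver

Only the manifold, the cost, and its Euclidean gradient are specified; the factory supplies the geometry and the solver handles the rest.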

BM, G. Meyer, S. Bonnabel, and R. Sepulchre, “Fixed-rank matrix factorizations and Riemannian low-rank optimization”, Computational Statistics 29(3 – 4), pp. 591 – 621, 2014 [Publisher’s pdf] [arXiv:1209.0430] [Matlab codes supplied with the qGeomMC package].

BM, G. Meyer, F. Bach, and R. Sepulchre, “Low-rank optimization with trace norm penalty”, SIAM Journal on Optimization 23(4), pp. 2124 – 2149, 2013 [Publisher’s pdf] [arXiv:1112.2318] [Matlab code webpage].

Conference proceedings


H. Kasai and BM, “Inexact trust-region algorithms on Riemannian manifolds”, accepted to NeurIPS 2018 [GitHub][Personal copy][Publisher’s copy].

M. Nimishakavi, P. Jawanpuria, and BM, “A dual framework for low-rank tensor completion”, accepted to NeurIPS 2018 [arXiv:1712.01193][Matlab codes page][Publisher’s copy]. A shorter version was accepted to the NeurIPS workshop on Synergies in Geometric Data Analysis, 2017.

M. Nimishakavi, BM, M. Gupta, and P. Talukdar, “Inductive framework for multi-aspect streaming tensor completion with side information”, accepted to CIKM 2018 [arXiv:1802.06371][Personal copy][GitHub].

S. Mahadevan, BM, and S. Ghosh, “A unified framework for domain adaptation using metric learning on manifolds”, accepted to ECML-PKDD 2018 [arXiv:1804.10834][GitHub].

H. Kasai and BM, “Riemannian joint dimensionality reduction and dictionary learning on symmetric positive definite manifold”, accepted to EUSIPCO 2018.

H. Kasai, H. Sato, and BM, “Riemannian stochastic recursive gradient algorithm”, accepted to ICML, 2018 [Publisher’s pdf][HK’s presentation video].

P. Jawanpuria and BM, “A unified framework for structured low-rank matrix learning”, accepted to ICML, 2018 [Publisher’s pdf][arXiv:1704.07352][Matlab codes page]. A shorter version was accepted to the 10th NeurIPS Workshop on Optimization for Machine Learning, 2017.

H. Kasai, H. Sato, and BM, “Riemannian stochastic quasi-Newton algorithm with variance reduction and its convergence analysis”, accepted to AISTATS, 2018 [arXiv:1703.04890][Publisher’s pdf].

Y. Shi, BM, and W. Chen, “A sparse and low-rank optimization framework for network topology control in dense fog-RAN”, accepted to the IEEE 85th Vehicular Technology Conference, 2017. The earlier title was “A sparse and low-rank optimization framework for index coding via Riemannian optimization”, Technical report, 2016 [arXiv:1604.04325] [Matlab codes]. A shorter version of the work presenting a unified approach, titled “Sparse and low-rank decomposition for big data systems via smoothed Riemannian optimization”, also appeared in the 9th NeurIPS workshop on optimization for machine learning (OPT2016) held in Barcelona.

BM and R. Sepulchre, “Scaled stochastic gradient descent for low-rank matrix completion”, accepted to the 55th IEEE Conference on Decision and Control, 2016 [Publisher’s copy] [arXiv:1603.04989] [Matlab codes]. A different version was accepted to the internal Amazon machine learning conference (AMLC) 2016. AMLC is a platform for internal Amazon researchers to present their work.

H. Kasai and BM, “Low-rank tensor completion: a Riemannian manifold preconditioning approach”, accepted to ICML 2016 [Publisher’s copy] [Supplementary material] [arXiv:1605.08257]. A shorter version appeared at the 8th NeurIPS workshop on optimization for machine learning (OPT2015). An earlier version was titled “Riemannian preconditioning for tensor completion”, 2015 [arXiv:1506.02159] [Matlab code webpage].

R. Liégeois, BM, M. Zorzi, and R. Sepulchre, “Sparse plus low-rank autoregressive identification in neuroimaging time series”, Proceedings of the 54th IEEE Conference on Decision and Control, 2015 [Publisher’s pdf] [arXiv:1503.08639] [Matlab code webpage].

BM and R. Sepulchre, “R3MC: A Riemannian three-factor algorithm for low-rank matrix completion”, Proceedings of the 53rd IEEE Conference on Decision and Control, pp. 1137 – 1142, 2014 [arXiv:1306.2672] [R3MC Matlab code webpage] [Publisher’s pdf].

BM and B. Vandereycken, “A Riemannian approach to low-rank algebraic Riccati equations”, Proceedings of the 21st International Symposium on Mathematical Theory of Networks and Systems, Extended abstract, pp. 965 – 968, 2014 [Publisher’s pdf][arXiv:1312.4883] [Matlab code webpage].

BM, G. Meyer, and R. Sepulchre, “Low-rank optimization for distance matrix completion”, Proceedings of the 50th IEEE Conference on Decision and Control, pp. 4455 – 4460, 2011 [Publisher’s pdf] [arXiv:1304.6663] [Matlab code webpage].

Workshop papers and technical reports


M. Bhutani, P. Jawanpuria, H. Kasai, and BM, “Low-rank geometric mean metric learning”, accepted to the Geometry in Machine Learning (GiMLi) workshop at ICML 2018 [arXiv:1806.05454].

M. Bhutani and BM, “A two-dimensional decomposition approach for matrix completion through gossip”, arXiv preprint arXiv:1711.07684, 2017 [arXiv:1711.07684]. A shorter version was accepted to the NeurIPS workshop on Emergent Communication, 2017.

V. Badrinarayanan, BM, and R. Cipolla, “Symmetry-invariant optimization in deep networks”, Technical report, arXiv:1511.01754, 2015 [arXiv:1511.01754] [Matlab code webpage]. A shorter version appears as “Understanding symmetries in deep networks” in the 8th NeurIPS workshop on optimization for machine learning (OPT2015) [arXiv:1511.01029].

BM, K. Adithya Apuroop, and R. Sepulchre, “A Riemannian geometry for low-rank matrix completion”, Technical report, 2012 [arXiv:1211.1550] [qGeomMC Matlab code webpage].

PhD thesis


BM, “A Riemannian approach to large-scale constrained least-squares with symmetries”, PhD thesis, University of Liège, 2014 [Orbi:173257] [Thesis presentation]. The PhD jury included Lieven De Lathauwer, Quentin Louveaux, Louis Wehenkel, Philippe Toint, Francis Bach (co-supervisor), and Rodolphe Sepulchre (supervisor).

Seminars, workshops, and symposiums

Gave a talk on manifold optimization: applications and algorithms at IIIT Hyderabad, 2018.

Gave a talk on the role of machine learning in Office India at Microsoft IDC, 2018.

Gave a seminar on dimensionality reduction techniques using matrix factorization at IIT Bhubaneswar on 16 Jan 2018.

Gave a seminar in the numerical analysis group at the University of Geneva on 07 October 2016.

Gave a lecture at the IFCAM summer school on large-scale optimization at IISc on 07 July 2016. The talk focused on manifold optimization: applications and algorithms.

On 30 September 2015, gave a seminar talk on Manopt: a Matlab toolbox for optimization on manifolds in the department of systems and control engineering at IIT Bombay. Gave a second seminar talk on the same topic at the School of Electrical Sciences, IIT Bhubaneswar on 12 October 2015.

In Workshop on low-rank optimization and applications, Hausdorff Center for Mathematics, Bonn, June 2015, presented Riemannian preconditioning.

On 04 June 2015, gave a seminar talk on a Riemannian approach to large-scale constrained least-squares with symmetries at UCLouvain, Belgium as part of the big data seminar series.

In TCMM Workshop 2014, together with Nicolas Boumal, presented the Manopt toolbox.

In Dolomites Workshop 2013, presented the work on “Fixed-rank optimization on Riemannian quotient manifolds”.

In Benelux meeting on systems and control 2013, presented the work on “Tuning metrics for low-rank matrix completion” [Book of Abstracts: Page 157].

In ISMP 2012, presented the work on “Fixed-rank matrix factorizations and the design of invariant optimization algorithms” [Book of Abstracts: Page 145].

In Benelux meeting on systems and control 2012, presented the work on “Low-rank optimization on the set of low-rank non-symmetric matrices” [Book of Abstracts: Page 103].

In Benelux meeting on systems and control 2011, presented the work on “Manifold based optimization techniques for low-rank distance matrix completion” [Book of Abstracts: Page 75].

Posters

Presented three posters on structured low-rank learning, a dual framework for tensor completion, and a two-dimensional gossip approach to matrix completion at NeurIPS workshops, 2017.

Presented three posters on RSVRG, gossip for matrix completion, and low-rank and sparse decomposition at the optimization for machine learning workshop (OPT2016) at NeurIPS, Barcelona, 2016.

Along with Hiroyuki Kasai, presented a poster on “Low-rank tensor completion: a Riemannian manifold preconditioning approach” at ICML 2016.

Presented a poster on scaled SGD for matrix completion at the internal Amazon machine learning conference, Seattle, 2016.

Presented two posters at the NeurIPS workshop on optimization for machine learning, 2015: one on tensor completion and the other on symmetry-invariant optimization in deep networks.

Riemannian preconditioning for tensor completion in Low-rank Optimization and Applications workshop, Hausdorff Center for Mathematics, Bonn, 2015.

Riemannian preconditioning in IAP DySCO study day, 2014.

Trace norm minimization in IAP DySCO study day, 2012.

Patents


A patent filed on computing competitive reference prices of third-party unique products.

A patent filed on a two-dimensional distributed computing approach for matrix completion through gossip.

Miscellaneous


I maintain a list of fixed-rank algorithms and geometries here. The list is out of date as of 2018.