Lam M. Nguyen
IBM Research AI, Thomas J. Watson Research Center
Verified email at ibm.com
Title · Cited by · Year
SARAH: A novel method for machine learning problems using stochastic recursive gradient
LM Nguyen, J Liu, K Scheinberg, M Takáč
The 34th International Conference on Machine Learning (ICML 2017), 2017
200 · 2017
SGD and Hogwild! Convergence Without the Bounded Gradients Assumption
LM Nguyen, PH Nguyen, M van Dijk, P Richtárik, K Scheinberg, M Takáč
The 35th International Conference on Machine Learning (ICML 2018), 2018
70 · 2018
Stochastic recursive gradient algorithm for nonconvex optimization
LM Nguyen, J Liu, K Scheinberg, M Takáč
arXiv preprint arXiv:1705.07261, 2017
53 · 2017
ProxSARAH: An efficient algorithmic framework for stochastic composite nonconvex optimization
NH Pham, LM Nguyen, DT Phan, Q Tran-Dinh
Journal of Machine Learning Research 21 (110), 1-48, 2020
37* · 2020
Finite-Sum Smooth Optimization with SARAH
LM Nguyen, M van Dijk, DT Phan, PH Nguyen, TW Weng, ...
arXiv preprint arXiv:1901.07648, 2019
26* · 2019
CEO Compensation: Does Financial Crisis Matter?
P Vemala, L Nguyen, D Nguyen, A Kommasani
International Business Research 7 (4), 125-131, 2014
26 · 2014
Inexact SARAH algorithm for stochastic optimization
LM Nguyen, K Scheinberg, M Takáč
Optimization Methods and Software, 2020
16 · 2020
New convergence aspects of stochastic gradient algorithms
LM Nguyen, PH Nguyen, P Richtárik, K Scheinberg, M Takáč, M van Dijk
Journal of Machine Learning Research 20 (176), 1-49, 2019
13 · 2019
When does stochastic gradient algorithm work well?
LM Nguyen, NH Nguyen, DT Phan, JR Kalagnanam, K Scheinberg
arXiv preprint arXiv:1801.06159, 2018
11 · 2018
PROVEN: Verifying Robustness of Neural Networks with a Probabilistic Approach
TW Weng, PY Chen, LM Nguyen, MS Squillante, A Boopathy, I Oseledets, ...
The 36th International Conference on Machine Learning (ICML 2019), 2019
10 · 2019
Hybrid Stochastic Gradient Descent Algorithms for Stochastic Nonconvex Optimization
Q Tran-Dinh, NH Pham, DT Phan, LM Nguyen
arXiv preprint arXiv:1905.05920, 2019
10 · 2019
Tight Dimension Independent Lower Bound on the Expected Convergence Rate for Diminishing Step Sizes in SGD
PH Nguyen, LM Nguyen, M van Dijk
The 33rd Annual Conference on Neural Information Processing Systems (NeurIPS …, 2019
8* · 2019
A service system with randomly behaving on-demand agents
LM Nguyen, AL Stolyar
SIGMETRICS 2016, ACM SIGMETRICS Performance Evaluation Review 44 (1), 365-366, 2016
8 · 2016
A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization
Q Tran-Dinh, NH Pham, DT Phan, LM Nguyen
arXiv preprint arXiv:1907.03793, 2019
7 · 2019
A queueing system with on-demand servers: local stability of fluid limits
L Nguyen, A Stolyar
Queueing Systems: Theory and Applications 89 (3-4), 243–268, 2017
4 · 2017
Hybrid Variance-Reduced SGD Algorithms For Nonconvex-Concave Minimax Problems
Q Tran-Dinh, D Liu, LM Nguyen
arXiv preprint arXiv:2006.15266, 2020
3 · 2020
A Unified Convergence Analysis for Shuffling-Type Gradient Methods
LM Nguyen, Q Tran-Dinh, DT Phan, PH Nguyen, M van Dijk
arXiv preprint arXiv:2002.08246, 2020
3 · 2020
Convergence Rates of Accelerated Markov Gradient Descent with Applications in Reinforcement Learning
TT Doan, LM Nguyen, NH Pham, J Romberg
arXiv preprint arXiv:2002.02873, 2020
3 · 2020
Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD
M van Dijk, LM Nguyen, PH Nguyen, DT Phan
The 36th International Conference on Machine Learning (ICML 2019), 2019
3 · 2019
Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization
Q Tran-Dinh, NH Pham, LM Nguyen
The 37th International Conference on Machine Learning (ICML 2020), 2020
1 · 2020