Jianhao Ma
Global convergence of sub-gradient method for robust matrix recovery: Small initialization, noisy measurements, and over-parameterization
J Ma, S Fattahi
Journal of Machine Learning Research, 2022
Cited by: 38
Sign-RIP: A robust restricted isometry property for low-rank matrix recovery
J Ma, S Fattahi
2021 NeurIPS Workshop on Optimization for Machine Learning, 2021
Cited by: 22*
Blessing of Depth in Linear Regression: Deeper Models Have Flatter Landscape Around the True Solution
J Ma, S Fattahi
Advances in Neural Information Processing Systems, 2022
Cited by: 14*
Behind the Scenes of Gradient Descent: A Trajectory Analysis via Basis Function Decomposition
J Ma, L Guo, S Fattahi
ICLR 2023
Cited by: 8
Towards Understanding Generalization via Decomposing Excess Risk Dynamics
J Teng, J Ma, Y Yuan
ICLR 2022
Cited by: 7
Can Learning Be Explained By Local Optimality In Low-rank Matrix Recovery?
J Ma, S Fattahi
arXiv preprint arXiv:2302.10963, 2023
Cited by: 5*
Convergence of gradient descent with small initialization for unregularized matrix completion
J Ma, S Fattahi
The Thirty-Seventh Annual Conference on Learning Theory (COLT), 3683-3742, 2024
Cited by: 2
Robust Sparse Mean Estimation via Incremental Learning
J Ma, RR Chen, Y He, S Fattahi, W Hu
arXiv preprint arXiv:2305.15276, 2023