Research Interests

My general research interests lie in nonlinear optimization algorithms, machine learning, and statistical signal processing. My research topics include:
Large-Scale Optimization Algorithms

In the era of "big data", we are witnessing rapid development of data acquisition techniques. New data features, such as massive volume and dimension, heterogeneous structure, and decentralized storage, challenge traditional optimization methods, most of which rely on centralized information and computation. We are interested in developing parallel and decentralized algorithms capable of solving large-scale optimization problems by leveraging multiple computing units, equipped with the following desirable features:
An overview of our algorithmic framework can be found in the following book chapter:

- Parallel and distributed successive convex approximation methods for big-data optimization

Related works:

- Flexible distributed successive convex approximation
- Distributed optimization based on gradient-tracking revisited: enhancing convergence rate via surrogation
- Distributed big-data optimization via block-wise gradient tracking
- Distributed nonconvex constrained optimization over time-varying digraphs

Distributed algorithm design under heterogeneity

- Tackling data heterogeneity: a new unified framework for decentralized SGD with sample-induced topology
- Hybrid local SGD for federated learning with heterogeneous communications

Asynchronous distributed optimization

- Achieving linear convergence in distributed asynchronous multi-agent optimization

Computational Statistics and Data Analytics

Statistics and optimization exhibit a close interplay in data analytics. Sophisticated statistical models that produce high-quality solutions often lead to complex, highly nonconvex optimization problems. However, traditional optimization tools applied to these problems yield, in theory, only local solutions without any statistical guarantee. Moreover, employing a black-box algorithm can be inefficient, as it ignores the problem structure and the computational resources at hand. We are interested in developing problem-driven, low-complexity algorithms for statistical learning with provable guarantees.
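Several of the distributed-optimization works listed above build on the gradient-tracking idea: each agent mixes its iterate with neighbors while a second variable tracks the average gradient of all local costs. The following is a minimal sketch of the basic iteration, not a reproduction of any specific method above; the network (a 5-agent ring), the synthetic least-squares costs, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim = 5, 3

# Synthetic local least-squares costs f_i(x) = 0.5 * ||A_i x - b_i||^2
A = [rng.standard_normal((10, dim)) / np.sqrt(10) for _ in range(n_agents)]
b = [rng.standard_normal(10) for _ in range(n_agents)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a 5-agent ring (uniform weights)
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    for j in (i - 1, i, i + 1):
        W[i, j % n_agents] = 1.0 / 3.0

alpha = 0.05                                      # constant step size (assumed)
x = np.zeros((n_agents, dim))                     # one local iterate per row
y = np.stack([grad(i, x[i]) for i in range(n_agents)])  # trackers, y_i^0 = grad f_i(x_i^0)

for _ in range(20000):
    x_new = W @ x - alpha * y                     # mix with neighbors, step along tracked gradient
    y = W @ y + np.stack([grad(i, x_new[i]) - grad(i, x[i])
                          for i in range(n_agents)])
    x = x_new

# Every agent should converge to the minimizer of the *sum* of the local costs
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
err = np.max(np.abs(x - x_star))
```

With a constant step size, all agents agree on the global least-squares solution, which is the qualitative behavior (linear convergence to the sum-cost minimizer) that the gradient-tracking literature above establishes in far more general settings.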
Decentralized learning from multiple sources

- Distributed sparse regression via penalization
- Distributed (ATC) gradient descent for high dimension sparse regression
- Decentralized dictionary learning over time-varying digraphs

Majorization-minimization algorithms

- Majorization-minimization algorithms in signal processing, communications, and machine learning

Structured robust covariance estimation

- Low-complexity algorithms for low rank clutter parameters estimation in radar systems
- Robust estimation of structured covariance matrix for heavy-tailed elliptical distributions
- Regularized robust estimation of mean and covariance matrix under heavy-tailed distributions
- Regularized Tyler's scatter estimator: existence, uniqueness, and algorithms

Sparse principal component analysis

- Orthogonal Sparse PCA and Covariance Estimation via Procrustes Reformulation
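As one concrete illustration of the robust covariance estimation line above: Tyler's scatter estimator admits a well-known fixed-point iteration and is invariant to per-sample scaling, which is what makes it robust for heavy-tailed elliptical data. Below is a minimal numpy sketch; the dimension, sample size, true shape matrix, and iteration count are all illustrative assumptions, not values from any of the papers listed.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 500

# Elliptical heavy-tailed samples: Gaussian directions scaled by Cauchy radii
shape_true = np.array([[2.0, 0.5, 0.0],
                       [0.5, 1.0, 0.3],
                       [0.0, 0.3, 0.5]])
L = np.linalg.cholesky(shape_true)
X = (rng.standard_normal((n, d)) @ L.T) * rng.standard_cauchy((n, 1))

# Tyler fixed-point iteration: Sigma <- (d/n) * sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i)
Sigma = np.eye(d)
for _ in range(200):
    Sinv = np.linalg.inv(Sigma)
    q = np.einsum('ni,ij,nj->n', X, Sinv, X)    # quadratic forms x_i^T Sigma^{-1} x_i
    Sigma = (d / n) * (X / q[:, None]).T @ X
    Sigma *= d / np.trace(Sigma)                # the shape matrix is defined only up to scale

# Compare against the (trace-normalized) true shape matrix
S0 = shape_true * d / np.trace(shape_true)
rel_err = np.linalg.norm(Sigma - S0) / np.linalg.norm(S0)
```

Even though the radii are Cauchy-distributed (so the sample covariance does not even exist), the scale-invariant iteration recovers the underlying shape matrix up to statistical error, which is the behavior characterized in the existence/uniqueness and regularization papers above.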