Randomized algorithms for Tikhonov regularization in linear least squares

arXiv (2022)

Abstract
We describe two sketching-based algorithms that efficiently solve regularized linear least squares problems. The algorithms compute preconditioners for $\min \|Ax-b\|^2_2 + \lambda \|x\|^2_2$, where $A\in\mathbb{R}^{m\times n}$ and $\lambda>0$ is a regularization parameter, such that LSQR converges in $\mathcal{O}(\log(1/\epsilon))$ iterations for $\epsilon$ accuracy. We focus on the context where the optimal regularization parameter is unknown, and the system must be solved for a number of parameters $\lambda$. Our algorithms are applicable in both the underdetermined $m\ll n$ and the overdetermined $m\gg n$ setting. First, we propose a Cholesky-based sketch-to-precondition algorithm that uses a `partly exact' sketch, and only requires one sketch for a set of $N$ regularization parameters $\lambda$. The complexity of solving for $N$ parameters is $\mathcal{O}(mn\log(\max(m,n)) +N(\min(m,n)^3 + mn\log(1/\epsilon)))$. Second, we introduce an algorithm that uses a sketch of size $\mathcal{O}(\text{sd}_{\lambda}(A))$ for the case where the statistical dimension $\text{sd}_{\lambda}(A)\ll\min(m,n)$. The scheme we propose does not require the computation of the Gram matrix, resulting in a more stable scheme than existing algorithms in this context. We can solve for $N$ values of $\lambda_i$ in $\mathcal{O}(mn\log(\max(m,n)) + \min(m,n)\,\text{sd}_{\min\lambda_i}(A)^2 + Nmn\log(1/\epsilon))$ operations.
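The sketch-to-precondition idea in the abstract can be illustrated with a minimal NumPy/SciPy sketch for the overdetermined case $m\gg n$: sketch $A$ once, build a Cholesky preconditioner $R$ from the sketched Gram matrix plus the exact regularization term, and run preconditioned LSQR on the augmented system $[A;\sqrt{\lambda}I]$. The plain Gaussian sketch, the sketch size $s=4n$, and the dense direct solve used for verification are illustrative assumptions here, not the paper's exact construction (the paper's complexity bounds rely on fast, e.g. subsampled-trigonometric-transform, sketches and also cover $m\ll n$).

```python
import numpy as np
from scipy.linalg import cholesky, solve, solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(0)
m, n = 2000, 100                      # overdetermined setting, m >> n
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 1e-2                            # regularization parameter lambda

# Sketch A once (Gaussian sketch of size s x m with s = O(n)); the same SA
# can be reused to build a preconditioner for each new value of lambda.
s = 4 * n
S = rng.standard_normal((s, m)) / np.sqrt(s)
SA = S @ A

# 'Partly exact' flavour: only A is sketched, while the regularization
# block enters exactly through the lam * I term.
G = SA.T @ SA + lam * np.eye(n)
R = cholesky(G)                       # upper triangular, G = R^T R

sqlam = np.sqrt(lam)

# Preconditioned augmented operator  [A; sqrt(lam) I] R^{-1}.
def matvec(y):
    v = solve_triangular(R, y)        # v = R^{-1} y
    return np.concatenate([A @ v, sqlam * v])

def rmatvec(z):
    w = A.T @ z[:m] + sqlam * z[m:]
    return solve_triangular(R, w, trans='T')   # R^{-T} w

Aprec = LinearOperator((m + n, n), matvec=matvec, rmatvec=rmatvec)
baug = np.concatenate([b, np.zeros(n)])

# LSQR on the well-conditioned preconditioned system, then undo the
# change of variables x = R^{-1} y.
result = lsqr(Aprec, baug, atol=1e-12, btol=1e-12)
y, itn = result[0], result[2]
x = solve_triangular(R, y)

# Reference solution from the normal equations, for verification only.
x_direct = solve(A.T @ A + lam * np.eye(n), A.T @ b)
print(f"LSQR iterations: {itn}, "
      f"rel. error: {np.linalg.norm(x - x_direct) / np.linalg.norm(x_direct):.2e}")
```

Because the sketched Gram matrix is a good spectral approximation of $A^TA + \lambda I$, the preconditioned operator has a small condition number and LSQR converges in a few dozen iterations regardless of the conditioning of $A$ itself.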
Keywords
tikhonov regularization, least