Blind Deconvolution Using Convex Programming

IEEE Transactions on Information Theory (2014)

Abstract
We consider the problem of recovering two unknown vectors, ${\mbi w}$ and ${\mbi x}$, of length $L$ from their circular convolution. We make the structural assumption that the two vectors are members of known subspaces, one with dimension $N$ and the other with dimension $K$. Although the observed convolution is nonlinear in both ${\mbi w}$ and ${\mbi x}$, it is linear in the rank-1 matrix formed by their outer product ${\mbi w}{\mbi x}^{\ast}$. This observation allows us to recast the deconvolution problem as a low-rank matrix recovery problem from linear measurements, whose natural convex relaxation is a nuclear norm minimization program. We prove the effectiveness of this relaxation by showing that, for “generic” signals, the program can deconvolve ${\mbi w}$ and ${\mbi x}$ exactly when the maximum of $N$ and $K$ is almost on the order of $L$. That is, we show that if ${\mbi x}$ is drawn from a random subspace of dimension $N$, and ${\mbi w}$ is a vector in a subspace of dimension $K$ whose basis vectors are spread out in the frequency domain, then nuclear norm minimization recovers ${\mbi w}{\mbi x}^{\ast}$ without error. We discuss this result in the context of blind channel estimation in communications. If we have a message of length $N$, which we code using a random $L\times N$ coding matrix, and the encoded message travels through an unknown linear time-invariant channel of maximum length $K$, then the receiver can recover both the channel response and the message when $L\gtrsim N+K$, to within constant and log factors.
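The key observation above — that the circular convolution is nonlinear in $({\mbi w}, {\mbi x})$ but linear in the lifted rank-1 matrix ${\mbi w}{\mbi x}^{\ast}$ — can be checked numerically. The sketch below (illustrative only, not the paper's recovery algorithm) verifies that each DFT coefficient of the convolution equals a linear measurement $\hat{y}_\ell = {\mbi f}_\ell^{T} W {\mbi f}_\ell$ of the outer product $W = {\mbi w}{\mbi x}^{T}$, where ${\mbi f}_\ell$ is the $\ell$-th row of the DFT matrix; all variable names are our own.

```python
import numpy as np

L = 8
rng = np.random.default_rng(0)
w = rng.standard_normal(L)
x = rng.standard_normal(L)

# Circular convolution computed directly via the FFT (convolution theorem).
y = np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real

# Lifted view: the same data as linear measurements of the rank-1
# matrix W = w x^T. Row l of the DFT matrix F satisfies
#   fft(w)[l] * fft(x)[l] = F[l] @ W @ F[l].
F = np.fft.fft(np.eye(L))        # DFT matrix, rows f_l
W = np.outer(w, x)               # rank-1 lifting of (w, x)
yhat = np.array([F[l] @ W @ F[l] for l in range(L)])
y_lifted = np.fft.ifft(yhat).real

assert np.allclose(y, y_lifted)  # convolution is linear in W
```

Because the measurements are linear in $W$, recovering $(\mbi w, \mbi x)$ reduces to finding a rank-1 (hence low-rank) matrix consistent with them, which the paper relaxes to nuclear norm minimization.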
Keywords
convolution, minimization, blind source separation, circular convolution, channel coding, blind deconvolution, vectors, deconvolution, convex programming, low rank matrix, government, compressed sensing, frequency domain analysis