Relative Error RKHS Embeddings for Gaussian Kernels.

arXiv: Learning (2018)

Abstract
We show how to obliviously embed into the reproducing kernel Hilbert space associated with Gaussian kernels, so that distance in this space (the kernel distance) has only $(1+\varepsilon)$-relative error. This holds only when comparing point sets at a kernel distance of at least $\alpha$; this parameter appears only as a poly-logarithmic factor in the dimension of an intermediate embedding, but not in the final embedding. The main insight is to modify the well-traveled random Fourier features to be slightly biased and have higher variance, but so that they can be defined as a convolution over the function space. This result provides the first guaranteed algorithmic results for LSH of kernel distance on point sets and low-dimensional shapes and distributions, and for relative error bounds on the kernel two-sample test.
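The abstract builds on the standard random Fourier features construction for the Gaussian kernel, which the paper then modifies. As background, a minimal sketch of the unmodified baseline (Rahimi–Recht random Fourier features; the function name and parameters here are illustrative, not from the paper):

```python
# Minimal sketch of standard random Fourier features for the Gaussian
# kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)). This is the baseline
# the paper modifies; it gives additive, not relative, error guarantees.
import numpy as np

def rff_embed(X, D=2000, sigma=1.0, seed=0):
    """Map each row of X to a D-dimensional random Fourier feature vector."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the Fourier transform of the Gaussian kernel.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Inner products of embeddings approximate the kernel value.
x = np.array([[0.0, 0.0]])
y = np.array([[1.0, 0.5]])
Z = rff_embed(np.vstack([x, y]), D=20000, sigma=1.0)
approx = float(Z[0] @ Z[1])
exact = float(np.exp(-np.sum((x - y) ** 2) / 2.0))
```

Because the approximation error here is additive in the kernel value, small kernel distances can be swamped by noise; that is the gap the paper's biased, higher-variance modification addresses to obtain $(1+\varepsilon)$-relative error.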