Parameterized k-Clustering: The distance matters!

arXiv: Data Structures and Algorithms (2019)

Abstract
We consider the $k$-Clustering problem, which is, for a given multiset of $n$ vectors $X \subset \mathbb{Z}^d$ and a nonnegative number $D$, to decide whether $X$ can be partitioned into $k$ clusters $C_1, \dots, C_k$ such that the cost \[\sum_{i=1}^k \min_{c_i \in \mathbb{R}^d} \sum_{x \in C_i} \|x - c_i\|_p^p \leq D,\] where $\|\cdot\|_p$ is the Minkowski ($L_p$) norm of order $p$. For $p=1$, $k$-Clustering is the well-known $k$-Median. For $p=2$, the case of the Euclidean distance, $k$-Clustering is $k$-Means. We show that the parameterized complexity of $k$-Clustering strongly depends on the distance order $p$. In particular, we prove that for every $p \in (0,1]$, $k$-Clustering is solvable in time $2^{O(D \log D)} (nd)^{O(1)}$, and hence is fixed-parameter tractable when parameterized by $D$. On the other hand, we prove that for distances of orders $p=0$ and $p=\infty$, no such algorithm exists, unless FPT=W[1].
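To make the objective concrete, here is a minimal sketch (not the paper's algorithm) that evaluates the $k$-Clustering cost for a given partition and checks it against the budget $D$. The function name `clustering_cost` and the fallback to the coordinate-wise mean for general $p$ are illustrative assumptions; only $p=1$ (coordinate-wise median) and $p=2$ (mean) use the exact optimal center here.

```python
import numpy as np

def clustering_cost(X, labels, p):
    """Evaluate sum_i min_c sum_{x in C_i} ||x - c||_p^p for a given partition.

    The optimal center is the coordinate-wise median for p = 1 and the mean
    for p = 2; for other p the mean is used only as a heuristic, so the value
    returned is an upper bound on the true cluster cost in that case.
    """
    X = np.asarray(X, dtype=float)
    total = 0.0
    for cluster in np.unique(labels):
        pts = X[labels == cluster]
        if p == 1:
            center = np.median(pts, axis=0)  # minimizes sum |x_j - c_j| per coordinate
        else:
            center = pts.mean(axis=0)        # exact for p = 2, heuristic otherwise
        # ||x - c||_p^p is separable across coordinates: sum_j |x_j - c_j|^p
        total += np.sum(np.abs(pts - center) ** p)
    return total

# Example: two well-separated clusters in Z^2, checked against a budget D.
X = np.array([[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [10, 11]])
labels = np.array([0, 0, 0, 1, 1, 1])
D = 5
print(clustering_cost(X, labels, p=1) <= D)  # k-Median-style cost: 4 <= 5 -> True
print(clustering_cost(X, labels, p=2) <= D)  # k-Means-style cost: 8/3 <= 5 -> True
```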