A Mazing 2+ε Approximation for Unsplittable Flow on a Path

Proceedings of the Twenty-Fifth Annual ACM-SIAM Symposium on Discrete Algorithms (2014)

Abstract
We study the unsplittable flow on a path problem (UFP), which arises naturally in many applications such as bandwidth allocation, job scheduling, and caching. Here we are given a path with nonnegative edge capacities and a set of tasks, which are characterized by a subpath, a demand, and a profit. The goal is to find the most profitable subset of tasks whose total demand does not violate the edge capacities. Not surprisingly, this problem has received a lot of attention in the research community. If the demand of each task is at most a small enough fraction δ of the capacity along its subpath (δ-small tasks), then it has been known for a long time [Chekuri et al., ICALP 2003] how to compute a solution of value arbitrarily close to the optimum via LP rounding. However, much remains unknown for the complementary case, that is, when the demand of each task is at least some fraction δ > 0 of the smallest capacity of its subpath (δ-large tasks). For this setting a constant factor approximation, improving on an earlier logarithmic approximation, was found only recently [Bonsma et al., FOCS 2011]. In this paper we present a PTAS for δ-large tasks, for any constant δ > 0. Key to this result is a complex geometrically inspired dynamic program. Each task is represented as a segment underneath the capacity curve, and we identify a proper maze-like structure so that each corridor of the maze is crossed by only O(1) tasks in the optimal solution. The maze has a tree topology, which guides our dynamic program. Our result implies a 2 + ε approximation for UFP, for any constant ε > 0, improving on the previously best 7 + ε approximation by Bonsma et al. We remark that our improved approximation algorithm matches the best known approximation ratio for the considerably easier special case of uniform edge capacities.
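To make the problem definition concrete, here is a minimal sketch (not the paper's algorithm): a brute-force solver for unsplittable flow on a path, intended only to illustrate the instance format and the feasibility constraint on tiny examples. The function names, data layout, and sample instance are illustrative assumptions.

```python
# Minimal UFP-on-a-path sketch: exhaustive search over task subsets.
# Only for illustrating the problem; exponential time, not the paper's PTAS.

from itertools import combinations

def feasible(capacities, tasks, subset):
    """Check that the chosen tasks respect every edge capacity.

    capacities: list of nonnegative edge capacities along the path.
    tasks: list of (first_edge, last_edge, demand, profit) tuples,
           where a task uses edges first_edge..last_edge inclusive.
    subset: indices into `tasks`.
    """
    load = [0.0] * len(capacities)
    for i in subset:
        first, last, demand, _profit = tasks[i]
        for e in range(first, last + 1):
            load[e] += demand
            if load[e] > capacities[e]:
                return False
    return True

def brute_force_ufp(capacities, tasks):
    """Return the most profitable feasible subset of tasks (exhaustive search)."""
    best_profit, best_subset = 0.0, ()
    for k in range(len(tasks) + 1):
        for subset in combinations(range(len(tasks)), k):
            if feasible(capacities, tasks, subset):
                profit = sum(tasks[i][3] for i in subset)
                if profit > best_profit:
                    best_profit, best_subset = profit, subset
    return best_profit, best_subset

if __name__ == "__main__":
    capacities = [2, 3, 2, 4]  # a path with 4 edges
    tasks = [(0, 1, 2, 5), (1, 2, 1, 3), (2, 3, 2, 4), (0, 3, 1, 6)]
    print(brute_force_ufp(capacities, tasks))
```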
Keywords
algorithms,design,graph algorithms,path and circuit problems,theory