Disaggregated Multi-Tower: Topology-aware Modeling Technique for Efficient Large-Scale Recommendation
CoRR (2024)
Abstract
We study a mismatch between the flat architecture of deep learning
recommendation models, the common distributed training paradigm, and the
hierarchical data center topology. To address the associated
inefficiencies, we propose Disaggregated
Multi-Tower (DMT), a modeling technique that consists of (1)
Semantic-preserving Tower Transform (SPTT), a novel training paradigm that
decomposes the monolithic global embedding lookup process into disjoint towers
to exploit data center locality; (2) Tower Module (TM), a synergistic dense
component attached to each tower to reduce model complexity and communication
volume through hierarchical feature interaction; and (3) Tower Partitioner
(TP), a feature partitioner to systematically create towers with meaningful
feature interactions and load balanced assignments to preserve model quality
and training throughput via learned embeddings. We show that DMT can achieve up
to 1.9x speedup compared to the state-of-the-art baselines without losing
accuracy across multiple generations of hardware at large data center scales.
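The Tower Partitioner described above assigns embedding tables to towers so that lookups stay local while load stays balanced across towers. The paper derives its partitions from learned embeddings; the sketch below is only a minimal, hypothetical illustration of the load-balancing aspect, using a simple greedy largest-first heuristic (all names and the API are illustrative, not taken from the paper):

```python
def partition_into_towers(table_rows, num_towers):
    """Greedy load-balanced assignment of embedding tables to towers:
    place the largest remaining table on the currently lightest tower.

    table_rows: dict mapping table id -> number of embedding rows
    num_towers: how many disjoint towers to create
    Returns (towers, loads): per-tower table-id lists and row totals.
    """
    towers = [[] for _ in range(num_towers)]
    loads = [0] * num_towers
    # Largest tables first, so small tables can even out the remainder.
    for tid in sorted(table_rows, key=table_rows.get, reverse=True):
        t = loads.index(min(loads))  # lightest tower so far
        towers[t].append(tid)
        loads[t] += table_rows[tid]
    return towers, loads


# Usage: six hypothetical feature tables split across two towers.
rows = {"user": 500, "item": 400, "ad": 300,
        "geo": 200, "device": 100, "page": 100}
towers, loads = partition_into_towers(rows, 2)
```

Each tower then performs its embedding lookups within one locality domain of the data center, which is the communication pattern SPTT exploits; the actual TP additionally groups features with meaningful interactions, which this size-only heuristic does not capture.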