Mixtape: Breaking the Softmax Bottleneck Efficiently

Neural Information Processing Systems (NeurIPS), 2019

Abstract
The softmax bottleneck has been shown to limit the expressiveness of neural language models. Mixture of Softmaxes (MoS) is an effective approach to address this theoretical limitation, but it is expensive compared to softmax in terms of both memory and time. We propose Mixtape, an output layer that breaks the softmax bottleneck more efficiently with three novel techniques: logit space vector gating, sigmoid tree decomposition, and gate sharing. On four benchmarks including language modeling and machine translation, the Mixtape layer substantially improves efficiency over the MoS layer, by 3.5x to 10.5x, while obtaining similar performance. A network equipped with Mixtape is only 20% to 34% slower than a softmax-based network with vocabulary sizes of 10K to 30K, and it outperforms softmax in perplexity and translation quality.
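As a rough illustration of how the three techniques named above fit together, here is a minimal PyTorch sketch of a Mixtape-style output layer. It uses K = 4 mixture components, derives the four gate priors from three sigmoids arranged in a small binary tree (sigmoid tree decomposition), and mixes component logits rather than component softmax distributions (logit space vector gating). For simplicity the gates here depend only on the context, i.e. the gate-sharing shortcut the paper applies to rare tokens is used for every token. The class name, dimensions, and projections are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtapeOutput(nn.Module):
    """Minimal Mixtape-style output layer sketch with K = 4 mixture components."""

    def __init__(self, d_model: int, vocab_size: int):
        super().__init__()
        self.K = 4
        # Token embedding matrix shared by all mixture components.
        self.token_emb = nn.Parameter(torch.randn(vocab_size, d_model) * 0.02)
        # One context projection per component; mixing happens in logit space.
        self.ctx_proj = nn.ModuleList(nn.Linear(d_model, d_model) for _ in range(self.K))
        # Sigmoid tree decomposition: K - 1 = 3 sigmoid "decisions" yield 4 gates.
        # Gate-sharing simplification: gates depend on the context only.
        self.gate_proj = nn.Linear(d_model, self.K - 1)

    def gate_priors(self, g: torch.Tensor) -> torch.Tensor:
        # g: (batch, d_model) -> gate priors (batch, 4) that sum to 1,
        # built from a depth-2 binary tree of sigmoids instead of a softmax.
        s1, s2, s3 = torch.sigmoid(self.gate_proj(g)).unbind(dim=-1)
        return torch.stack(
            [s1 * s2, s1 * (1 - s2), (1 - s1) * s3, (1 - s1) * (1 - s3)], dim=-1
        )

    def forward(self, g: torch.Tensor) -> torch.Tensor:
        # g: (batch, d_model) decoder context; returns log-probs over the vocabulary.
        pi = self.gate_priors(g)                                  # (batch, K)
        logits = torch.zeros(g.size(0), self.token_emb.size(0), device=g.device)
        for k in range(self.K):
            h_k = torch.tanh(self.ctx_proj[k](g))                 # (batch, d_model)
            logits = logits + pi[:, k:k + 1] * (h_k @ self.token_emb.t())
        return F.log_softmax(logits, dim=-1)


# Usage: log-probabilities for a batch of 8 contexts over a 10K-token vocabulary.
layer = MixtapeOutput(d_model=256, vocab_size=10_000)
print(layer(torch.randn(8, 256)).shape)  # torch.Size([8, 10000])
```

Because the gates multiply logits rather than full softmax distributions, only one softmax over the vocabulary is needed per step, which is where the efficiency gain over MoS comes from in this sketch.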