Evolving modular neural sequence architectures with genetic programming.

GECCO (Companion), 2018

Citations: 23
Abstract
Automated architecture search has demonstrated significant success on image data, where reinforcement-learning and evolutionary approaches now outperform the best human-designed networks ([12], [8]). These successes have not transferred to models for sequential data, such as language modeling and translation tasks. While there have been several attempts to evolve improved recurrent cells for sequence data [7], none has achieved significant gains over the standard LSTM. Recent work has introduced high-performing alternatives to recurrent neural networks, such as the Transformer [11] and WaveNet [4], but these models are the result of manual human tuning.
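To make the idea of evolving recurrent cells concrete, the sketch below shows a minimal genetic-programming loop: candidate cells are small expression trees combining an input x and a hidden state h through elementwise operations and activations, and the population is improved by selection and mutation. This is a hypothetical illustration, not the paper's actual search space or fitness function; the toy fitness (tracking a scaled input signal) and all function names are assumptions for the example.

```python
import math
import random

# Primitive set for the toy cell search space (an assumption for this sketch).
OPS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
}
ACTS = {
    "tanh": math.tanh,
    "sigmoid": lambda v: 1.0 / (1.0 + math.exp(-v)),
}

def random_tree(depth=2):
    """Sample a random expression tree over terminals x and h."""
    if depth <= 0:
        return random.choice(["x", "h"])
    act = random.choice(list(ACTS))
    op = random.choice(list(OPS))
    return (act, op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x, h):
    """Compute the scalar output of a cell tree for one time step."""
    if tree == "x":
        return x
    if tree == "h":
        return h
    act, op, left, right = tree
    return ACTS[act](OPS[op](evaluate(left, x, h), evaluate(right, x, h)))

def fitness(tree):
    """Toy surrogate task: how well the cell tracks a scaled input signal.

    A real search would instead train and validate on a sequence task
    such as language modeling.
    """
    h, err = 0.0, 0.0
    for t in range(20):
        x = math.sin(t / 3.0)
        h = evaluate(tree, x, h)
        err += (h - 0.5 * x) ** 2
    return -err  # higher is better

def mutate(tree, depth=2):
    """Replace a random subtree, or recurse into the children."""
    if random.random() < 0.3:
        return random_tree(depth)
    if isinstance(tree, str):
        return tree
    act, op, left, right = tree
    return (act, op, mutate(left, depth - 1), mutate(right, depth - 1))

def evolve(pop_size=20, generations=10, seed=0):
    """Truncation selection plus mutation over a population of cell trees."""
    random.seed(seed)
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)
```

The key design point mirrored from genetic programming is that architectures are represented as trees, so mutation (and, in a full system, crossover) can recombine cell structures while every candidate remains a valid, evaluable cell.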