Basic Information
Biography
I have invented much of the current revolution in large language models. Some of my inventions include:
Transformer (2017)
(personally designed the multi-head attention, the residual architecture, and coded up the first better-than-SOTA working implementation)
Sparsely-gated Mixture of Experts (2016)
Mesh-TensorFlow (2018) - the first practical system for training giant Transformers on supercomputers.
T5 (2019)
Major contributor to Google's LaMDA dialog system, a project led by Daniel De Freitas, now my co-founder at Character AI.
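For readers unfamiliar with the mechanism named above, a minimal sketch of multi-head attention follows. This is an illustrative reconstruction in NumPy, not the original implementation; all shapes, weight names, and dimensions here are assumptions chosen for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, Wq, Wk, Wv, Wo):
    """x: (seq_len, d_model); Wq/Wk/Wv/Wo: (d_model, d_model) projections.

    Illustrative sketch only -- parameter names and shapes are assumptions.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project the input into queries, keys, and values.
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    def split_heads(t):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split_heads(q), split_heads(k), split_heads(v)
    # Scaled dot-product attention, computed independently per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v  # (num_heads, seq_len, d_head)
    # Concatenate the heads back together and apply the output projection.
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 16, 4, 5
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1
                  for _ in range(4))
y = multi_head_attention(rng.standard_normal((seq_len, d_model)),
                         num_heads, Wq, Wk, Wv, Wo)
print(y.shape)  # (5, 16)
```

The key idea is that each head attends over the sequence in its own lower-dimensional subspace, and the results are concatenated and mixed by a final projection.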
Research Interests
Papers (87 in total)
JOURNAL OF MACHINE LEARNING RESEARCH (2023): 240:1-240:113
Annual Conference on Neural Information Processing Systems (2022): 6010-6022 (citations: 59)
arxiv (2022)
AUTOMATIC SHARDING, Dmitry Lepikhin, HyoukJoong Lee, Yuanzhong Xu, Dehao Chen, Orhan Firat, Yanping Huang, Maxim Krikun, Noam Shazeer, Zhifeng Chen
semanticscholar (2021) (citations: 0)
Data Disclaimer
The data on this page comes from publicly available internet sources, partner publishers, and automated AI analysis. We make no promise or guarantee as to the validity, accuracy, correctness, reliability, completeness, or timeliness of the page data. For questions, contact us by email: report@aminer.cn