Basic Information
Career Trajectory
Biography
Implemented a model-parallel transformer training framework using JAX and Google Cloud TPU pods. With data-movement and model-structure optimizations, the framework achieves 61% of peak theoretical performance. Trained a 6B-parameter GPT model to completion in 4 weeks on a TPU v3-256, reaching downstream performance comparable to GPT-3 models of similar size.
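The throughput and wall-clock figures above can be sanity-checked with the common 6·N·D approximation for transformer training FLOPs. In the sketch below, only the 6B parameter count and the 61% utilization come from the text; the token count, chip count, and per-chip peak throughput are illustrative assumptions, not figures from the profile.

```python
def training_days(n_params, n_tokens, n_chips, peak_flops_per_chip, mfu):
    """Estimate wall-clock training time from the 6*N*D FLOPs rule of thumb.

    total_flops ~ 6 * params * tokens covers the forward and backward passes;
    mfu is the fraction of peak hardware throughput actually sustained.
    """
    total_flops = 6 * n_params * n_tokens
    sustained_flops_per_s = n_chips * peak_flops_per_chip * mfu
    return total_flops / sustained_flops_per_s / 86_400  # seconds -> days

# Assumed inputs for illustration: 6B params, 400e9 training tokens,
# 128 chips, an assumed ~1.23e14 peak bf16 FLOP/s per chip, 61% utilization.
days = training_days(6e9, 400e9, 128, 1.23e14, 0.61)
```

Under these assumed inputs the estimate lands in the low tens of days, the same order of magnitude as the stated 4 weeks on a TPU v3-256.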
Research Interests
Papers (103)
CoRR (2024)
Visualized Cancer Medicine (2024): 2
2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 28619-28630 (2024)
Yuxuan Gu, Yi Jin, Ben Wang, Zhixiang Wei, Xiaoxiao Ma, Pengyang Ling, Haoxuan Wang, Huaian Chen, Enhong Chen
IEEE Transactions on Circuits and Systems for Video Technology, pp. 1-1 (2024)
THERANOSTICS, no. 1 (2024): 220-248
INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, no. 1-2 (2024): 491-510
IEEE Transactions on Instrumentation and Measurement (2023): 1-12
ACS Nano, no. 8 (2023): 7498-7510
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, no. 38 (2023)
Journal of Controlled Release: Official Journal of the Controlled Release Society (2023): 283-296
Author Statistics
#Papers: 103
#Citation: 5297
H-Index: 25
G-Index: 66
Sociability: 6
Diversity: 3
Activity: 71
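The H-Index above is derived from per-paper citation counts: the largest h such that at least h papers each have at least h citations. A minimal sketch of that standard definition (the citation lists below are made-up examples, not this author's data):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

h_index([10, 8, 5, 4, 3])  # -> 4: the top 4 papers each have >= 4 citations
```

The same ranked list also underlies the G-Index, which instead asks for the largest g such that the top g papers together have at least g² citations.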
Collaborating Scholars
Collaborating Institutions
D-Core
- Collaborator
- Student
- Advisor
Data Disclaimer
The data on this page comes from publicly available internet sources, partner publishers, and automated AI-based analysis. We make no commitments or guarantees as to the validity, accuracy, correctness, reliability, completeness, or timeliness of the page data. For questions, contact us by email: report@aminer.cn