Variation of the Clinical Spectrum and Genotype-Phenotype Associations in Coenzyme Q10 Deficiency Associated Glomerulopathy.
Kidney International (2022) | SCI Q1
Giannina Gaslini Childrens Hosp | Med Univ Gdansk | Hacettepe Univ | Bambino Gesu Pediat Hosp | Necker Enfants Malad Hosp | Heidelberg Univ | Univ Gdansk | Fudan Univ | Tech Univ Munich | Natl Med & Res Ctr Childrens Hlth | Natl Univ Singapore | NAMS Ukraine | Padua Univ Hosp | Univ Childrens Hosp | Pirogov Russian Natl Res Med Univ | Radboud Univ Nijmegen | Belarusian State Med Univ | Zhejiang Univ | Anhui Prov Childrens Hosp | La Timone Univ Hosp Marseille | Polish Mothers Mem Hosp | Montpellier Univ Hosp | Childrens Hosp Westmead | Univ Med Ctr | Erciyes Univ | Shahid Beheshti Univ Med Sci | Univ Hosp Heidelberg | Oslo Univ Hosp | Cukurova Univ | Pediat Med Dachau | Iran Univ Med Sci | Paris Univ | Univ Hosp Vall dHebron | Univ Hosp Ludwig Maximilian Univ | Univ Hosp Bonn | Henan Childrens Hosp | Shandong Prov Hosp | Wuhan Childrens Hosp
- Pretraining has recently driven rapid progress in natural language processing (NLP).
- We show that M6 outperforms the baselines on multimodal downstream tasks, and that the large M6 with 10 billion parameters reaches even better performance.
- We propose a method called M6 that is able to process information from multiple modalities and perform both single-modal and cross-modal understanding and generation (see the sketch after this list).
- The model is scaled up to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese.
- Experimental results show that our proposed M6 outperforms the baselines on a number of downstream tasks involving both single and multiple modalities. We will continue the pretraining of extremely large models on more data to explore the limits of their performance.
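As background for the third bullet, the sketch below illustrates one common way such unified multimodal models are built: image patch features and text token embeddings are projected into a shared space and fed through a single transformer, so one network can serve both single-modal and cross-modal tasks. This is a minimal toy example, not M6's actual architecture; the class and parameter names (MiniMultimodalLM, patch_dim, etc.) are hypothetical, and positional embeddings, masking, and training code are omitted for brevity.

```python
import torch
import torch.nn as nn

class MiniMultimodalLM(nn.Module):
    """Toy unified model: image patches and text tokens share one transformer.

    Hypothetical sketch; not the M6 implementation. Positional embeddings,
    attention masking, and the generation decoder are omitted for brevity.
    """

    def __init__(self, vocab_size=32000, d_model=512, n_heads=8,
                 n_layers=4, patch_dim=768):
        super().__init__()
        self.text_embed = nn.Embedding(vocab_size, d_model)
        # Project precomputed image-patch features into the text embedding space.
        self.patch_proj = nn.Linear(patch_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, patch_feats, token_ids):
        # patch_feats: (batch, n_patches, patch_dim); token_ids: (batch, seq_len)
        img = self.patch_proj(patch_feats)
        txt = self.text_embed(token_ids)
        # One shared sequence carries both modalities through the same layers.
        x = torch.cat([img, txt], dim=1)
        h = self.encoder(x)
        # Predict text tokens conditioned on the image (cross-modal generation).
        return self.lm_head(h[:, img.size(1):])

# Usage: 2 samples, 16 image patches, 10 text tokens.
model = MiniMultimodalLM()
logits = model(torch.randn(2, 16, 768), torch.randint(0, 32000, (2, 10)))
print(logits.shape)  # torch.Size([2, 10, 32000])
```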
