Principal Component Analysis as an Efficient Method for Capturing Multivariate Brain Signatures of Complex Disorders: ENIGMA Study in People with Bipolar Disorders and Obesity
Journal of Sleep Research (2024), SCI Zone 3
Dalhousie Univ | Czech Acad Sci | Natl Inst Mental Hlth | Karolinska Inst | Philipps Univ Marburg | Univ Vita Salute San Raffaele | Deakin Univ | Oslo Univ Hosp | Univ Munster | FIDMAG Germanes Hospitalaries Res Fdn | Univ Galway | Univ Minnesota | Univ Antioquia | Univ Oslo | Univ Calif San Diego | Univ Barcelona | Neurosci Res Australia | Stanford Univ | Univ Groningen | Univ Cape Town | Gothenburg Univ | Laureate Inst Brain Res | Univ Vermont | Univ New South Wales | UCLA | Inst Invest Biomed August Pi i Sunyer IDIBAPS | Inst Alta Tecnol Med | Harvard Med Sch | Inst Mental Hlth | Monash Univ | Univ Southern Calif | Erasmus Univ | Univ British Columbia
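As a rough, hypothetical illustration of the approach named in the title, the sketch below uses PCA to compress many correlated regional brain measures into a few per-subject component scores. The simulated data, the 68-region layout, and all variable names are assumptions for illustration only, not the authors' actual ENIGMA pipeline.

```python
# Minimal sketch: summarize multivariate regional brain measures
# (e.g., cortical thickness per region) with a few principal components.
# Simulated data and names are illustrative, not the ENIGMA pipeline.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_subjects, n_regions = 500, 68          # e.g., 68 Desikan-Killiany regions (assumed)
thickness = rng.normal(2.5, 0.2, size=(n_subjects, n_regions))

# Standardize each region, then project onto the leading components.
X = (thickness - thickness.mean(axis=0)) / thickness.std(axis=0)
pca = PCA(n_components=5)
scores = pca.fit_transform(X)            # per-subject component scores

print(pca.explained_variance_ratio_)     # variance captured by each component
# scores[:, 0] could then serve as a single multivariate "signature"
# in downstream association models (e.g., with diagnosis or BMI).
```

In such an analysis, the leading component scores, rather than dozens of individual regional measures, would typically be carried into downstream statistical models.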
- Pretraining has recently driven major advances in natural language processing (NLP)
- We show that M6 outperforms the baselines on multimodal downstream tasks, and the large M6 with 10 billion parameters reaches even better performance
- We propose a method called M6 that can process information from multiple modalities and perform both single-modal and cross-modal understanding and generation (see the sketch after this list)
- The model is scaled to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that the proposed M6 outperforms the baselines on a number of downstream tasks covering both single and multiple modalities. We will continue pretraining extremely large models on more data to explore the limits of their performance
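The bullets above describe a single model that handles several modalities. As a minimal conceptual sketch, under assumed dimensions and layer counts that are not M6's actual architecture or code, image patch embeddings and text token embeddings can be concatenated into one sequence and processed by a shared Transformer stack:

```python
# Conceptual, hypothetical sketch of a single-stream multimodal model:
# image patch features and text tokens share one Transformer encoder.
# All sizes and names are toy values, not M6's real configuration.
import torch
import torch.nn as nn

d_model = 256
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True),
    num_layers=2,
)

patch_proj = nn.Linear(2048, d_model)        # project image patch features
tok_embed = nn.Embedding(30000, d_model)     # text token embeddings

img_feats = torch.randn(1, 16, 2048)         # 16 image patches (e.g., a 4x4 grid)
text_ids = torch.randint(0, 30000, (1, 12))  # 12 text tokens

seq = torch.cat([patch_proj(img_feats), tok_embed(text_ids)], dim=1)
out = encoder(seq)                           # one shared stack sees both modalities
print(out.shape)                             # torch.Size([1, 28, 256])
```

A single shared stack attending over both modalities is what makes cross-modal understanding and generation possible in this style of model; M6's real layer counts, hidden size, and vocabulary differ from the toy numbers here.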
