Resonances in E+e− Annihilation Near 2.2 GeV
Physical Review D (2020) | SCI Q2
Univ Savoie | Univ Barcelona | INFN | Univ Bergen | Lawrence Berkeley Natl Lab | Ruhr Univ Bochum | Univ British Columbia | Inst Particle Phys | RAS | Univ Calif Irvine | Univ Calif Riverside | Univ Calif Santa Cruz | CALTECH | Univ Cincinnati | Univ Colorado | Ecole Polytech | Harvey Mudd Coll | Humboldt Univ | Indian Inst Technol Guwahati | Univ Iowa | Iowa State Univ | Johns Hopkins Univ | CNRS | Lawrence Livermore Natl Lab | Univ Liverpool | SUNY Albany | Queen Mary Univ London | Univ London Royal Holloway & Bedford New Coll | Univ Louisville | Johannes Gutenberg Univ Mainz | Univ Manchester | Univ Maryland | MIT | McGill Univ | Univ Mississippi | Univ Montreal | Natl Inst Nucl & High Energy Phys | Univ Notre Dame | Ohio State Univ | Sorbonne Univ | Princeton Univ | Univ Rostock | Rutherford Appleton Lab | Univ Paris Saclay | SLAC Natl Accelerator Lab | Univ South Carolina | Southern Methodist Univ | St Francis Xavier Univ | Stanford Univ | Tel Aviv Univ | Univ Tennessee | Univ Texas Austin | Univ Texas Dallas | Univ Valencia | Univ Victoria | Univ Warwick | Univ Wisconsin
- Pretraining has recently greatly promoted the development of natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and that the large M6 with 10 billion parameters reaches even better performance
- We propose a model called M6 that can process information from multiple modalities and perform both single-modal and cross-modal understanding and generation
- The model is scaled to 10 billion parameters through sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baselines on a number of downstream tasks concerning both single and multiple modalities. We will continue pretraining extremely large models on increasing amounts of data to explore the limits of their performance
