Measurement of the inclusive tt̄ production cross section in proton-proton collisions at √s = 5.02 TeV
Journal of High Energy Physics (2022)
Yerevan Phys Inst | Inst Hochenergiephys | Inst Nucl Problems | Univ Antwerp | Vrije Univ Brussel | Univ Libre Bruxelles | Univ Ghent | Catholic Univ Louvain | Ctr Brasileiro Pesquisas Fis | Univ Estado Rio de Janeiro | Univ Estadual Paulista | Bulgarian Acad Sci | Univ Sofia | Beihang Univ | Tsinghua Univ | Inst High Energy Phys | Peking Univ | Sun Yat Sen Univ | Fudan Univ | Zhejiang Univ | Univ Los Andes | Univ Antioquia | Univ Split | Inst Rudjer Boskovic | Univ Cyprus | Charles Univ Prague | Escuela Politec Nacl | Univ San Francisco Quito | Egyptian Network High Energy Phys | Fayoum Univ | NICPB | Univ Helsinki | Helsinki Inst Phys | Lappeenranta Univ Technol | Univ Paris Saclay | Ecole Polytech | Univ Strasbourg | Inst Phys 2 Infinis Lyon IP2I | Georgian Tech Univ | Rhein Westfal TH Aachen | DESY | Univ Hamburg | Karlsruher Inst Technol | NCSR Demokritos | Natl & Kapodistrian Univ Athens | Natl Tech Univ Athens | Univ Ioannina | Eotvos Lorand Univ | Wigner Res Ctr Phys | Inst Nucl Res ATOMKI | Univ Debrecen | MATE Inst Technol | Indian Inst Sci IISc | HBNI | Panjab Univ | Univ Delhi | Indian Inst Technol Madras | Bhabha Atom Res Ctr | Tata Inst Fundamental Res A | Tata Inst Fundamental Res B | Indian Inst Sci Educ & Res IISER | Isfahan Univ Technol | Inst Res Fundamental Sci IPM | Univ Coll Dublin | Ist Nazl Fis Nucl | Univ Napoli Federico II | Univ Perugia | Univ Torino | Kyungpook Natl Univ | Chonnam Natl Univ | Hanyang Univ | Korea Univ | Kyung Hee Univ | Sejong Univ | Seoul Natl Univ | Univ Seoul | Yonsei Univ | Sungkyunkwan Univ | Amer Univ Middle East AUM | Riga Tech Univ | Vilnius Univ | Univ Malaya | Univ Sonora UNISON | IPN | Univ Iberoamer | Benemerita Univ Autonoma Puebla | Univ Montenegro | Univ Auckland | Univ Canterbury | Quaid I Azam Univ | AGH Univ Sci & Technol | Natl Ctr Nucl Res | Univ Warsaw | Lab Instrumentacao & Fis Expt Particulas | Joint Inst Nucl Res | Petersburg Nucl Phys Inst | Inst Nucl Res | NRC Kurchatov Inst | Moscow Inst Phys & Technol | Natl Res Nucl Univ | PN Lebedev Phys Inst | Lomonosov Moscow State Univ | Novosibirsk State Univ NSU | Natl Res Ctr | Ctr Invest Energet Medioambientales & Tecnol | Natl Res Tomsk Polytech Univ | Tomsk State Univ | Univ Belgrade | Univ Autonoma Madrid | Univ Oviedo | Univ Cantabria | Univ Colombo | Univ Ruhuna | CERN | Paul Scherrer Inst | Swiss Fed Inst Technol | Univ Zurich | Natl Cent Univ | Natl Taiwan Univ NTU | Chulalongkorn Univ | Cukurova Univ | Middle East Tech Univ | Bogazici Univ | Istanbul Tech Univ | Istanbul Univ | Natl Acad Sci Ukraine | Natl Sci Ctr | Univ Bristol | Rutherford Appleton Lab | Imperial Coll | Brunel Univ | Baylor Univ | Catholic Univ Amer | Univ Alabama | Boston Univ | Brown Univ | Univ Calif Davis | Univ Calif Los Angeles | Univ Calif Riverside | Univ Calif San Diego | Univ Calif Santa Barbara | CALTECH | Carnegie Mellon Univ | Univ Colorado | Cornell Univ | Fermilab Natl Accelerator Lab | Univ Florida | Florida State Univ | Florida Inst Technol | Univ Illinois Chicago UIC | Univ Iowa | Johns Hopkins Univ | Univ Kansas | Kansas State Univ | Lawrence Livermore Natl Lab | Univ Maryland | MIT | Univ Minnesota | Univ Nebraska | SUNY Buffalo | Northeastern Univ | Northwestern Univ | Univ Notre Dame | Ohio State Univ | Princeton Univ | Univ Puerto Rico | Purdue Univ | Purdue Univ Northwest | Rice Univ | Univ Rochester | Rutgers State Univ | Univ Tennessee | Texas A&M Univ | Texas Tech Univ | Vanderbilt Univ | Univ Virginia | Wayne State Univ | Univ Wisconsin | TU Wien
- Pretraining has recently greatly advanced the development of natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and that the large M6 model with 10 billion parameters reaches even better performance
- We propose a method called M6 that can process information from multiple modalities and perform both single-modal and cross-modal understanding and generation
- The model is scaled up to 10 billion parameters with sophisticated deployment, and this 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baselines in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models with more data to explore the limits of their performance
