Study of Space Weather Electromagnetic Parameters of the Ionosphere in the «Obstanovka 1-Step» Experiment on the Russian Segment of the ISS
Space Engineering and Technology (2021)
Space Research Institute of the Russian Academy of Sciences (IKI RAS) | Wigner Research Centre for Physics | Institute of Information and Communication Technologies of the Bulgarian Academy of Sciences (IICT BAS) | Lviv Center of Institute for Space Research of the National Academy of Sciences of Ukraine National Space Agency of Ukraine (LC ISR NASU NSAU) | Space Research Institute of the Royal Swedish Academy of Sciences | Eötvös Loránd University | Space Research and Technology Institute of the Bulgarian Academy of Sciences (SRTI BAS) | University of Sussex | S.P. Korolev Rocket and Space Corporation Energia (RSC Energia) | Space Research Centre of the Polish Academy of Sciences (SRC PAS) | BL-Electronics
- Pretraining has recently greatly advanced the development of natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and that the large M6 with 10 billion parameters reaches even better performance
- We propose a method called M6 that can process information of multiple modalities and perform both single-modal and cross-modal understanding and generation
- The model is scaled to 10 billion parameters with sophisticated deployment, and this 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that the proposed M6 outperforms the baselines in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models on more data to explore the limits of their performance
