Abstract P1-12-04: Factors Influencing Discontinuation of Adjuvant Anastrozole in Postmenopausal Japanese Breast Cancer Patients: Results from a Prospective Multicenter Cohort Study of Patient-Reported Outcomes
Cancer Research (2015)
1Kansai Rosai Hospital | 2Hyogo Cancer Center | 3Shinko Hospital | 4Kohnan Hospital | 5Itami City Hospital | 6Rokko Island Hospital | 7Hyogo Prefectural Nishinomiya Hospital | 8Kobe City Hospital Organization Kobe City Medical Center West Hospital | 9Kobe Kyodo Hospital | 10Nishi-Kobe Medical Center | 11Miyauchi Clinic | 12Chayamachi Breast Clinic | 13Hyogo Prefectural Tsukaguchi Hospital | 14Hashimoto Clinic | 15Takarazuka Municipal Hospital | 16Kinki Central Hospital | 17Kobe Century Memorial Hospital | 18Kobe Urban Breast Clinic | 19Hyogo Prefectural Kakogawa Medical Center | 20Kuma Hospital | 21Meiwa Hospital | 22Nishikawa Clinic | 23Kobe University School of Medicine | 24Kokufu Breast Clinic | 25Sakita Clinic | 26Kitatsuji Clinic | 27Kobe Adventist Hospital | 28Hyogo College of Medicine
- Pretraining has recently driven rapid progress in natural language processing (NLP).
- We show that M6 outperforms the baselines on multimodal downstream tasks, and that the large M6 with 10 billion parameters reaches even better performance.
- We propose M6, a method that processes information from multiple modalities and performs both single-modal and cross-modal understanding and generation (a rough sketch of this unified-encoder idea follows this list).
- The model is scaled up to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese.
- Experimental results show that our proposed M6 outperforms the baselines on a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models with more data to explore the limits of their performance.
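The bullets above describe a single model that handles both image and text inputs within one transformer. As a minimal illustration of that unified-encoder idea (this is not M6's actual architecture: the class name `TinyMultimodalEncoder`, all dimensions, and the fusion-by-concatenation scheme are assumptions made for the sketch), the following PyTorch snippet projects image patch features and text tokens into a shared embedding space and runs one transformer encoder over the concatenated sequence.

```python
# Hypothetical sketch of a unified multimodal encoder: image patches and text
# tokens share one embedding space and one transformer. NOT M6's architecture;
# all sizes and the concatenation-based fusion are illustrative assumptions.
import torch
import torch.nn as nn

class TinyMultimodalEncoder(nn.Module):
    def __init__(self, vocab_size=30000, patch_dim=768, d_model=512,
                 n_heads=8, n_layers=4):
        super().__init__()
        self.text_embed = nn.Embedding(vocab_size, d_model)  # text tokens -> vectors
        self.patch_proj = nn.Linear(patch_dim, d_model)      # patch features -> same space
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, token_ids, patch_feats):
        # token_ids: (batch, text_len); patch_feats: (batch, n_patches, patch_dim)
        text = self.text_embed(token_ids)
        image = self.patch_proj(patch_feats)
        fused = torch.cat([image, text], dim=1)  # one sequence over both modalities
        return self.encoder(fused)

# Usage: one forward pass over a toy batch of 2 samples,
# each with 49 image patches and 16 text tokens.
model = TinyMultimodalEncoder()
out = model(torch.randint(0, 30000, (2, 16)), torch.randn(2, 49, 768))
print(out.shape)  # torch.Size([2, 65, 512])
```

A real pretraining setup would add positional embeddings, attention masking, and objectives (e.g., masked language modeling and image-conditioned text generation) on top of this fused representation; the sketch stops at producing it.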