Telemonitoring of Active Inflammatory Bowel Disease Using the App TECCU: Short-Term Results of a Multicenter Trial of GETECCU
Journal of Medical Internet Research (2024)
Gastroenterology Department, La Fe University and Polytechnic Hospital, Valencia, Spain | Miguel Servet University Hospital, Zaragoza, Spain | University Clinical Hospital, Santiago, Spain | Nuestra Señora de la Candelaria University Hospital, Tenerife, Spain | Dr Balmis General University Hospital, ISABIAL, Alicante, Spain | Lozano Blesa Clinic University Hospital, Zaragoza, Spain | Burgos University Hospital, Burgos, Spain | CIBERehd, Instituto de Salud Carlos III, Madrid, Spain | La Paz University Hospital, Faculty of Medicine, Universidad Autónoma de Madrid, Madrid, Spain | Infanta Sofía University Hospital, Madrid, Spain | Ramón y Cajal University Hospital, Madrid, Spain | San Cecilio Clinic University Hospital, Parque Tecnológico de la Salud, Granada, Spain | Morales Meseguer General University Hospital, Murcia, Spain | Clinic University Hospital, Valencia, Spain | Reina Sofía University Hospital, Córdoba, Spain | Hospital Alvaro Cunqueiro, Vigo, Spain | Puerta de Hierro University Hospital, Madrid, Spain
- Pretraining has recently driven major advances in natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and the large M6 with 10 billion parameters can reach better performance
- We propose a method called M6 that can process information of multiple modalities and perform both single-modal and cross-modal understanding and generation
- The model is scaled to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baseline in a number of downstream tasks concerning both single and multiple modalities. We will continue the pretraining of extremely large models by increasing the data to explore the limits of their performance
