Impact of Smoking on Recurrence Rates among Wide-Neck Intracranial Aneurysms Treated with Woven EndoBridge: a Multicenter Retrospective Study.
Journal of Neurosurgery (2025)
- Pretraining has recently greatly advanced the development of natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and the large M6 with 10 billion parameters reaches better performance
- We propose a method called M6 that can process information from multiple modalities and perform both single-modal and cross-modal understanding and generation
- The model is scaled up to 10 billion parameters through sophisticated deployment, and this 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baseline in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models on increasing amounts of data to explore the limits of their performance
