Altered EBV Specific Immune Control in Multiple Sclerosis
Journal of Neuroimmunology (2024)
Abstract
Since the 1980s it has been known that immune responses to the Epstein-Barr virus (EBV) are elevated in multiple sclerosis (MS) patients. Recent seroepidemiological data have shown that this alteration after primary EBV infection identifies individuals with a more than 30-fold increased risk of developing MS. The mechanisms by which EBV infection might erode tolerance for the central nervous system (CNS) in these individuals, years prior to clinical MS onset, remain unclear. In this review I will discuss altered frequencies of EBV life cycle stages and their tissue distribution, EBV immune responses that cross-react with CNS autoantigens, and loss of immune control over autoreactive B and T cells as possible mechanisms. This discussion is intended to stimulate future studies of these mechanisms, with the aim of identifying candidate interventions that might correct EBV-specific immune control and/or the resulting cross-reactivities with CNS autoantigens in MS patients and thereby ameliorate disease activity.
Key words
Multiple sclerosis, EBNA1, T cells, NK cells, Infectious mononucleosis