
Dendritic Cells Exposed in Vitro to TGF-beta1 Ameliorate Experimental Autoimmune Myasthenia Gravis

Clinical and Experimental Immunology (2002), SCI Zone 3

Karolinska Inst

Cited 58 | Views 1
Abstract
Experimental autoimmune myasthenia gravis (EAMG) is an animal model for human myasthenia gravis (MG), characterized by an autoaggressive T-cell-dependent antibody-mediated immune response directed against the acetylcholine receptor (AChR) of the neuromuscular junction. Dendritic cells (DC) are unique antigen-presenting cells which control T- and B-cell functions and induce immunity or tolerance. Here, we demonstrate that DC exposed to TGF-beta1 in vitro mediate protection against EAMG. Freshly prepared DC from spleen of healthy rats were exposed to TGF-beta1 in vitro for 48 h, and administered subcutaneously to Lewis rats (2 × 10^6 DC/rat) on day 5 post immunization with AChR in Freund's complete adjuvant. Control EAMG rats were injected in parallel with untreated DC (naive DC) or PBS. Lewis rats receiving TGF-beta1-exposed DC developed very mild symptoms of EAMG without loss of body weight compared with control EAMG rats receiving naive DC or PBS. This effect of TGF-beta1-exposed DC was associated with augmented spontaneous and AChR-induced proliferation, IFN-gamma and NO production, and decreased levels of anti-AChR antibody-secreting cells. Autologous DC exposed in vitro to TGF-beta1 could represent a new opportunity for DC-based immunotherapy of antibody-mediated autoimmune diseases.
Keywords
experimental autoimmune myasthenia gravis, dendritic cells, TGF-beta1
Related Papers

Vaccines Against Myasthenia Gravis. Expert Opinion on Biological Therapy, 2005. Cited 4.

Suppression of EAMG in Lewis Rats by IL-10-Exposed Dendritic Cells. Annals of the New York Academy of Sciences, 2003. Cited 5.

Current and Emerging Treatments for the Management of Myasthenia Gravis. Therapeutics and Clinical Risk Management, 2011. Cited 31.

The Role of Innate Immunity in Induction of Tolerance. Biochemistry (Moscow) Supplement Series B: Biomedical Chemistry, 2015. Cited 0.

Tolerising Cellular Therapies: What is Their Promise for Autoimmune Disease? Chijioke H. Mosanya, John D. Isaacs. Annals of the Rheumatic Diseases, 2018. Cited 52.
