
Influence of Host Genetic Variation on Rubella-Specific T Cell Cytokine Responses Following Rubella Vaccination.

Vaccine (2009), SCI Zone 3

Mayo Clinic

Cited 29 | Views 5
Abstract
Variability in immune response, modulated by polymorphisms in immune response genes, is a significant factor in the protective effect of vaccines. We studied the association between cellular (cytokine) immunity and HLA genes among 738 schoolchildren (396 males and 342 females) between the ages of 11 and 19 years who received two doses of rubella vaccine (Merck). Cytokine secretion levels in response to rubella virus stimulation were determined in PBMC cultures by ELISA. Cell supernatants were assayed for Th1 (IFN-γ, IL-2, and IL-12p40), Th2 (IL-4, IL-5, and IL-10), and innate/proinflammatory (TNF-α, GM-CSF, and IL-6) cytokines. We found a strong association between multiple alleles of the HLA-DQA1 (global p-value 0.022) and HLA-DQB1 (global p-value 0.007) loci and variations in rubella-specific IL-2 secretion. Additionally, the relationships between alleles of the HLA-A (global p-value 0.058), HLA-B (global p-value 0.035), and HLA-C (global p-value 0.023) loci and TNF-α secretion suggest the importance of HLA class I molecules in the innate/proinflammatory immune response. Better characterization of these genetic profiles could help predict immune responses at the individual and population levels, provide data on mechanisms of immune response development, and further inform vaccine development and vaccination policies.
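The abstract reports locus-wide ("global") p-values for associations between HLA alleles and cytokine secretion, but this excerpt does not specify the study's exact statistical method. As a loose, hypothetical sketch of that kind of test, the Python snippet below compares ELISA-measured IL-2 secretion across allele groups at one HLA locus using a Kruskal-Wallis test as the global test. All data, column names, and the function name are invented for illustration and are not from the paper.

```python
# Hypothetical sketch only: NOT the paper's actual analysis. It illustrates
# a single locus-wide ("global") test of association between HLA alleles
# and a rubella-specific cytokine level. Data and names are invented.
import pandas as pd
from scipy.stats import kruskal


def global_allele_association(df: pd.DataFrame, locus: str, cytokine: str) -> float:
    """Return a global p-value for differences in `cytokine` secretion
    across allele groups at `locus`.

    Assumes one row per subject, a categorical column `locus` holding the
    subject's allele, and a numeric column `cytokine` holding the ELISA
    secretion level (e.g., pg/mL). Simplification: a real subject carries
    two alleles per HLA locus; this toy model assigns one.
    """
    groups = [g[cytokine].dropna().values for _, g in df.groupby(locus)]
    # Kruskal-Wallis: nonparametric one-way test across all allele groups,
    # yielding a single locus-wide p-value rather than per-allele tests.
    _, p_value = kruskal(*groups)
    return p_value


# Invented toy data for demonstration.
df = pd.DataFrame({
    "HLA_DQB1": ["*02:01", "*03:01", "*02:01", "*05:01", "*03:01", "*05:01"] * 20,
    "IL2_pg_ml": [12.0, 35.5, 10.2, 22.1, 40.3, 18.7] * 20,
})
print(global_allele_association(df, "HLA_DQB1", "IL2_pg_ml"))
```

A real analysis of this design would account for each subject carrying two alleles per locus (e.g., counting a subject once per carried allele), adjust for covariates, and correct for testing multiple loci and cytokines.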
Key words
Rubella vaccine, HLA alleles, Cellular responses, Cytokines, ELISA, ELISPOT