Diabetes is an Independent Risk Factor for Cancer after Heart And/or Lung Transplantation
Journal of Clinical Medicine (2022)
Abstract
Background: De novo cancers are feared complications after heart or lung transplantation. Recent data suggest that diabetes mellitus (DM) might also be a risk factor for cancer. We hypothesized that transplanted diabetic patients are at greater risk of developing cancer than non-diabetic patients. Methods: We reviewed 353 patients who underwent heart and/or lung transplantation at our center between October 1999 and June 2021. Patients with follow-up <180 days (n = 87) were excluded from the analysis. The remaining 266 patients were divided into patients who had preoperative DM (n = 88) or developed it during follow-up (n = 40) and patients without DM (n = 138). Results: The diabetic cohort showed higher rates of malignancies in all patients (30.33 vs. 15.97%, p = 0.005) and in the matched population (31.9 vs. 16.1%, p < 0.001). There were also significantly more solid tumors (17.9 vs. 9.4%, p = 0.042; matched: 16.6 vs. 9.1%, p = 0.09). The presence of diabetes was associated with a 13% increased risk of cancer compared with non-diabetic patients. New-onset post-transplant diabetes doubled the likelihood of cancer development. Conclusions: Pre-transplant diabetes mellitus increases the risk of cancer after heart and/or lung transplantation. However, new-onset diabetes after transplantation is associated with a much greater cancer risk. This information is relevant for screening during follow-up.
Key words
heart transplantation, lung transplantation, diabetes mellitus, cancer
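The Results section reports between-group comparisons of malignancy rates with p-values. As a rough illustration only, the sketch below shows how such a 2x2 comparison could be tested with a chi-square test and a crude relative risk in Python. The cell counts are hypothetical placeholders consistent with the reported group sizes (128 diabetic, 138 non-diabetic) and approximate percentages; they are not the study's raw data, and this is not the authors' analysis code.

```python
# Illustrative sketch only: a 2x2 comparison of malignancy incidence by diabetes status.
# Cell counts are HYPOTHETICAL, back-of-envelope values based on the reported group
# sizes and rates in the abstract; they are not the study's raw data.
from scipy.stats import chi2_contingency

#               cancer  no cancer
table = [[39,  89],    # diabetic patients (~30% with malignancy, assumed)
         [22, 116]]    # non-diabetic patients (~16% with malignancy, assumed)

chi2, p, dof, expected = chi2_contingency(table)

# Crude relative risk of malignancy, diabetic vs. non-diabetic
risk_dm = table[0][0] / sum(table[0])
risk_no_dm = table[1][0] / sum(table[1])
relative_risk = risk_dm / risk_no_dm

print(f"chi2 = {chi2:.2f}, p = {p:.4f}, relative risk = {relative_risk:.2f}")
```

In practice, the matched-population results mentioned in the abstract would additionally require a matching or adjustment step (e.g., propensity-score matching) before such a comparison; the sketch above only illustrates the unadjusted contingency-table test.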