Inverse Relationship Between Platelet Akt Activity and Hippocampal Atrophy: A Pilot Case-Control Study in Patients with Diabetes Mellitus.
World Journal of Clinical Cases (2024)
National Center for Geriatrics and Gerontology | Gifu University
Abstract
BACKGROUND: Akt plays diverse roles in humans. It is involved in the pathogenesis of type 2 diabetes mellitus (T2DM), a disease driven by insulin resistance, and it also plays a vital role in human platelet activation. The hippocampus is closely associated with memory and learning, and a decrease in hippocampal volume is reportedly associated with an insulin-resistant phenotype in T2DM patients without dementia.

AIM: To investigate the relationship between Akt phosphorylation in unstimulated platelets and hippocampal volume in patients with T2DM.

METHODS: Platelet-rich plasma (PRP) was prepared from the venous blood of patients with T2DM and of age-matched controls. The pellet lysate of the centrifuged PRP was subjected to western blotting to analyse the phosphorylation of Akt, p38 mitogen-activated protein (MAP) kinase and glyceraldehyde 3-phosphate dehydrogenase (GAPDH). Phosphorylation levels were quantified by densitometric analysis. Hippocampal volume was analysed on magnetic resonance imaging using a voxel-based specific regional analysis system for Alzheimer's disease, which provides a Z-score as a parameter that reflects hippocampal volume.

RESULTS: The levels of phosphorylated Akt normalized to phosphorylated p38 MAP kinase were inversely correlated with the Z-scores in the T2DM subjects, whereas the levels of phosphorylated Akt normalized to GAPDH were not. This relationship was not observed in the control subjects.

CONCLUSION: These results suggest that an inverse relationship may exist between platelet Akt activation and hippocampal atrophy in T2DM patients. Our findings provide insight into the molecular mechanisms underlying hippocampal atrophy in T2DM.
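The correlation step described in RESULTS (densitometric phospho-Akt ratio versus hippocampal Z-score) can be sketched as below. This is a minimal illustration only: all numeric values are invented, the choice of Pearson correlation and the variable names are assumptions, and none of it reproduces data from the study.

```python
# Sketch of the abstract's correlation analysis: correlate the densitometric
# pAkt/p-p38 ratio with the MRI-derived Z-score. All numbers are illustrative.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical band intensities from densitometry (arbitrary units):
p_akt = [1200.0, 980.0, 1500.0, 1100.0, 1750.0]
p_p38 = [1000.0, 1000.0, 1000.0, 1000.0, 1000.0]
ratio = [a / b for a, b in zip(p_akt, p_p38)]  # pAkt normalized to p-p38

# Hypothetical Z-scores (assumed here: higher Z-score = more atrophy):
z_score = [2.1, 2.6, 1.4, 2.3, 0.9]

r = pearson_r(ratio, z_score)
print(f"r = {r:.2f}")  # a negative r mirrors the reported inverse relationship
```

With these made-up values the correlation is strongly negative, which is the shape of the relationship the abstract reports for the T2DM group; the actual study's statistics and sample are, of course, its own.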
Key words
Akt, Platelet, Hippocampal atrophy, Magnetic resonance imaging, Diabetes mellitus