Research progress on the effects of genetic polymorphisms of metabolic enzymes and transporters on the efficacy and adverse reactions of erlotinib and gefitinib
Central South Pharmacy(2020)
Abstract
Erlotinib and gefitinib, first-generation small-molecule tyrosine kinase inhibitor (TKI) targeted drugs, are now widely used in the clinical treatment of advanced non-small cell lung cancer (NSCLC). Compared with the traditional standard platinum-doublet chemotherapy regimens, erlotinib and gefitinib significantly prolong progression-free survival (PFS) in NSCLC patients, with fewer adverse reactions. In clinical use, however, substantial inter-individual variability has been observed: at the standard dose, some patients show no response or a poor response to treatment, while others develop severe adverse reactions. Numerous studies have shown that genetic polymorphisms of metabolic enzymes and transporters are an important cause of the pharmacokinetic variability of small-molecule TKI targeted drugs. Individualized dosing of erlotinib and gefitinib based on metabolic-enzyme and transporter polymorphisms is therefore of great significance. This article reviews the effects of genetic polymorphisms in metabolic enzymes (CYP3A4, CYP3A5, CYP1A1, CYP1A2, CYP2D6) and transporters (ABCB1, ABCG2) on the efficacy and adverse reactions of erlotinib and gefitinib in NSCLC patients, laying a foundation for individualized treatment of NSCLC.