
Expert Consensus on the Management of Adverse Events of Lorlatinib in the Treatment of ALK+ Advanced Non-small Cell Lung Cancer

Clinical Drug Investigation (2024)

Hospital del Mar | La Paz University Hospital | A Coruña University Hospital | Santiago de Compostela University Clinical Hospital | Virgen del Rocío University Hospital | Bellvitge University Hospital | Fundación Jiménez Díaz University Hospital | University Hospital Lozano Blesa | La Fe University and Polytechnic Hospital | Catalan Institute of Oncology | Insular-Maternity and Pediatric University Hospital Complex of Gran Canaria | Pfizer Oncology | Vall d’Hebron University Hospital and Vall d’Hebron Institute of Oncology

Abstract
The use of anaplastic lymphoma kinase (ALK) tyrosine kinase inhibitors (TKIs), such as lorlatinib, for the treatment of patients with ALK gene rearrangement (ALK-positive) non-small cell lung cancer (NSCLC) has been shown to improve the overall survival and quality of life of these patients. However, lorlatinib is not free of potential adverse events. Adequate monitoring and management of these adverse events are critical for increasing patient adherence to lorlatinib, thereby maximizing the benefits of treatment and minimizing the risks associated with treatment discontinuation. Because the adverse events of lorlatinib can affect different organs and systems, the participation of a multidisciplinary team, including cardiologists, neurologists, internal medicine specialists, and oncology pharmacists, is needed. This article presents specific and pragmatic strategies for identifying and treating the most relevant adverse events associated with lorlatinib in patients with advanced ALK-positive NSCLC, based on the clinical experience of a multidisciplinary panel of experts.