Synergistic Effect of the CDC7 Inhibitor, Monzosertib (AS-0141) with Current Therapies in AML Models
Cancer Research (2024)
1Carna Biosciences | 2CarnaBio USA
Abstract
Introduction: Cell division cycle 7 (CDC7) is a highly conserved serine/threonine kinase that plays an important role in the initiation of DNA replication and cell cycle progression. Aberrant expression of CDC7 has been reported to cause uncontrolled proliferation in many cancer types, suggesting that CDC7 inhibitors hold great potential for the development of novel cancer therapies. Monzosertib (AS-0141) is a potent, selective, orally bioavailable small-molecule inhibitor of CDC7 and is currently being evaluated in a Phase I study for the treatment of solid tumors. In a preclinical cancer cell panel study, monzosertib showed strong antiproliferative activity against a variety of cancer types, and acute myeloid leukemia (AML) cell lines were found to be the most sensitive to monzosertib. As a single agent, monzosertib causes DNA damage to accumulate in cancer cells and induces cell death. In this study, we aimed to investigate the antitumor effects of monzosertib alone and in combination with other anticancer drugs in AML models.

Method: The antiproliferative activity of monzosertib was examined against a panel of 35 human cancer cell lines of various cancer types. DNA methyltransferase (DNMT) inhibitors (azacitidine or decitabine) and a BCL2 inhibitor (venetoclax) were evaluated for their synergistic/antagonistic effects in combination with monzosertib against human AML cell lines (THP-1, HL-60, MV4-11, MOLM-14, TF-1, U-937 and NOMO-1). The combination index (CI) was calculated using the Chou-Talalay method. Apoptosis was analyzed by flow cytometry. To evaluate in vivo efficacy, tumor-bearing mice were treated with monzosertib alone or in combination with venetoclax.

Results: The combination of monzosertib with azacitidine, decitabine, or venetoclax produced significant synergistic antiproliferative effects against AML cell lines in vitro. Flow cytometry indicated that the azacitidine combination induced apoptosis and increased cell death in THP-1 cells. In vivo, oral administration of monzosertib demonstrated robust antitumor efficacy in an MV4-11 tumor-bearing xenograft mouse model, both as a single agent and in combination with venetoclax.

Conclusions: Monzosertib, a selective CDC7 inhibitor, demonstrated strong antiproliferative activity against human AML cell lines, both as a single agent and in combination with standard therapies. Monzosertib exerts a synergistic antitumor effect with venetoclax in a human AML xenograft mouse model. These results suggest that monzosertib has the potential to enhance the antitumor efficacy of standard-of-care agents for AML patients.

Citation Format: Hiroko Endo, Hatsuo Furuichi, Akinori Arimura, Yu Nishioka, Masaaki Sawa. Synergistic effect of the CDC7 inhibitor, monzosertib (AS-0141) with current therapies in AML models [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2024; Part 1 (Regular Abstracts); 2024 Apr 5-10; San Diego, CA. Philadelphia (PA): AACR; Cancer Res 2024;84(6_Suppl):Abstract nr 5714.
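The Methods section scores drug combinations with the Chou-Talalay combination index (CI), where CI < 1 indicates synergy, CI = 1 additivity, and CI > 1 antagonism. The sketch below illustrates that calculation in minimal form; the median-effect parameters (`dm`, `m`) and doses are hypothetical illustrative values, not data from this study.

```python
# Minimal sketch of the Chou-Talalay combination index (CI).
# All parameter values here are hypothetical, for illustration only.

def median_effect_dose(fa, dm, m):
    """Dose of a single agent needed to reach fraction affected `fa`,
    from the median-effect equation: D = Dm * (fa / (1 - fa)) ** (1 / m),
    where Dm is the median-effect dose and m the slope."""
    return dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(d1, d2, fa, dm1, m1, dm2, m2):
    """CI = d1/Dx1 + d2/Dx2, where (d1, d2) are the doses used in
    combination to reach effect `fa`, and Dx1, Dx2 are the doses of
    each drug alone producing that same effect.
    CI < 1 synergy; CI = 1 additive; CI > 1 antagonism."""
    dx1 = median_effect_dose(fa, dm1, m1)
    dx2 = median_effect_dose(fa, dm2, m2)
    return d1 / dx1 + d2 / dx2

# Illustrative call: combination doses reaching 50% inhibition (fa = 0.5).
# At fa = 0.5 the median-effect dose equals Dm, so CI = 0.2 + 0.3 = 0.5.
ci = combination_index(d1=0.2, d2=0.3, fa=0.5, dm1=1.0, m1=1.0, dm2=1.0, m2=1.0)
print(ci)  # 0.5 -> synergy
```

In practice, Dm and m for each drug would be fitted from single-agent dose-response data before computing CI at each combination dose tested.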