Tu1685 Loss of Cohesin is a Common Early Event in Colorectal Cancer: Potential Mediator of Chromosomal Instability
Gastroenterology (2014), SCI Q1
Abstract
[…] in the presence or absence of FOLFOX (50 μM 5-FU and 1.25, 2.5, or 5 μM oxaliplatin; 50 μM 5-FU / 2.5 μM oxaliplatin being considered the circulating blood concentration of patients treated with FOLFOX) (luminescence viability assay). ABCG2 mRNA expression in 2D vs. 3D HT29 cultures (HT29 being more resistant than HCT116; see results) was also studied (Q-PCR). FOLFOX significantly decreased HT29 and HCT116 viability without any dose dependency. HCT116 cells were more affected than HT29 cells (e.g., 20±2% vs. 43±2% for FOLFOX 50/2.5; p<0.05). In 3D cultures, FOLFOX decreased HCT116 viability dose-dependently but to a lesser extent than in 2D (72±1% for FOLFOX 50/2.5; p<0.01). By contrast, the viability of HT29 colonospheres was not affected; these cells therefore appear completely resistant to FOLFOX when cultured in 3D. CFE was significantly decreased in both HCT116 and HT29 cells. In HT29 cells, basal ABCG2 mRNA expression was significantly higher in 2D than in 3D cultures (p=0.001). However, whereas FOLFOX increased ABCG2 mRNA expression two-fold in 2D HT29 cultures, it increased it six-fold in HT29 colonospheres (p<0.01 vs. untreated colonospheres). In conclusion, HCT116 cells, and to a much greater extent HT29 cells, are significantly more resistant to FOLFOX in 3D than in 2D cultures, suggesting that 3D culture is more appropriate for studying in vitro resistance to anticancer drugs in CC. This chemoresistance appears to be at least partly explained by increased expression of the ABCG2 drug transporter. Acknowledgments: F.P. was supported by an academic grant from the French Research Ministry (Année Recherche). This work was also supported by grants from Alsace contre le Cancer and from Association Coeur-Cancer from the Département de la Manche, France.
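The two-fold and six-fold ABCG2 mRNA changes above come from Q-PCR. The abstract does not state how relative expression was computed; a common choice is the 2^(-ΔΔCt) (Livak) method, sketched below as a minimal illustration. All Ct values in the example are hypothetical, chosen only so that the result lands near the six-fold increase reported for colonospheres.

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^(-ddCt) method.

    Each Ct is a Q-PCR cycle-threshold value; the reference gene
    (e.g. a housekeeping gene) normalizes loading differences.
    """
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize treated sample
    d_ct_control = ct_target_control - ct_ref_control   # normalize control sample
    dd_ct = d_ct_treated - d_ct_control                 # treated relative to control
    return 2 ** (-dd_ct)

# Hypothetical Ct values (not from the abstract): a ddCt of about -2.6
# corresponds to roughly a six-fold increase in expression.
fc = fold_change(ct_target_treated=24.0, ct_ref_treated=18.0,
                 ct_target_control=26.6, ct_ref_control=18.0)
print(round(fc, 1))
```

Note that fold change is exponential in ΔΔCt: each unit decrease in ΔΔCt doubles the apparent expression, so small Ct differences translate into large fold changes.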