Long Non-Coding RNA MAGI2-AS3 Inactivates the STAT3 Pathway to Inhibit Prostate Cancer Cell Proliferation by Acting as a MicroRNA-424-5p Sponge.
Journal of Cancer (2022)
Abstract
Aberrant expression of long non-coding RNAs (lncRNAs) that results in sustained activation of growth-promoting pathways is an important mechanism driving prostate cancer progression. In the present study, we explored differentially expressed lncRNAs in two microarray datasets of benign and malignant prostate tissues. We found that MAGI2-AS3 was one of the most downregulated lncRNAs in prostate tumors, which was further confirmed in our collected clinical samples. Functional assays showed that MAGI2-AS3 overexpression decreased cell viability and induced marked apoptosis in PC-3 and DU145 prostate cancer cells. Elevation of MAGI2-AS3 decreased STAT3 activity in PC-3 and DU145 cells. In addition, microRNA-424-5p (miR-424-5p), a positive regulator of the STAT3 pathway, was predicted as a target of MAGI2-AS3; the interaction between MAGI2-AS3 and miR-424-5p was confirmed by reverse transcription quantitative polymerase chain reaction (RT-qPCR), dual-luciferase reporter assay, and RNA immunoprecipitation (RIP). MAGI2-AS3 upregulated miR-424-5p and downregulated COP1 in PC-3 and DU145 cells. More importantly, IL-6-induced activation of the STAT3 pathway attenuated the biological effects of MAGI2-AS3 in PC-3 and DU145 cells. In clinical samples, MAGI2-AS3 levels were negatively correlated with miR-424-5p expression and positively correlated with COP1 mRNA expression. Altogether, the current study reveals MAGI2-AS3 as a novel negative regulator of prostate cancer development.
Key words
MAGI2-AS3, prostate cancer, miR-424-5p, STAT3