Antiretroviral Drug-Resistance Mutations on the Gag Gene: Mutation Dynamics During Analytic Treatment Interruption among Individuals Experiencing Virologic Failure.
Pathogens (2022)
Univ Fed Sao Paulo | Univ Calif San Francisco
Abstract
We describe drug-resistance mutation dynamics of the gag gene among individuals experiencing virologic failure on antiretroviral therapy who underwent analytical treatment interruption (ATI). These mutations occur in and around the Gag cleavage sites whose proteolytic processing produces the mature HIV-1 virion. The study involved a 12-week interruption of antiretroviral therapy (ART) and sequencing of the gag gene in 38 individuals experiencing virologic failure and harboring triple-class-resistant HIV strains. Regions of the gag gene surrounding the NC-p2 and p1-p6 cleavage sites were sequenced from plasma HIV RNA by population-based Sanger sequencing at baseline, before ATI, and after 12 weeks. Fourteen of the sixteen patients sequenced presented at least one mutation in the gag gene at baseline, with an average of 4.93 mutations per patient. All of these mutations had reverted to the wild type by the end of the study. Mutations in the gag gene complement mutations in the pol gene to restore HIV fitness; those located around cleavage sites and within protease substrates contribute to protease-inhibitor resistance and to the difficulty of re-establishing effective virologic suppression. ART interruption in the presence of antiretroviral-resistant HIV strains was used here as a practical means of allowing better-adapted HIV profiles to emerge in the absence of ART selective pressure.
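To illustrate the kind of comparison described above (baseline versus week-12 consensus sequences against a wild-type reference), here is a minimal sketch, not taken from the paper; the function names, the toy reference string, and the sample sequences are hypothetical placeholders used only to show how baseline mutations and their reversion to wild type could be tallied.

```python
# Minimal sketch (assumptions, not the authors' pipeline): compare aligned
# amino-acid consensus sequences against a wild-type reference, list baseline
# mutations, and check which of them reverted to wild type at week 12.

def call_mutations(sample_aa: str, reference_aa: str) -> set[int]:
    """Return 0-based positions where the sample differs from the reference."""
    return {i for i, (s, r) in enumerate(zip(sample_aa, reference_aa)) if s != r}

def reverted_positions(baseline_aa: str, week12_aa: str, reference_aa: str) -> set[int]:
    """Baseline mutations that match the wild-type reference again at week 12."""
    return call_mutations(baseline_aa, reference_aa) - call_mutations(week12_aa, reference_aa)

# Hypothetical toy data: a short aligned stretch around a gag cleavage site.
reference = "ERQANFLGKIWPSYK"   # placeholder wild-type residues
baseline  = "ERQVNFLGKIWPSYK"   # one substitution at position 3 before ATI
week12    = "ERQANFLGKIWPSYK"   # back to wild type after 12 weeks off ART

print(sorted(call_mutations(baseline, reference)))               # [3]
print(sorted(reverted_positions(baseline, week12, reference)))   # [3]
```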
Keywords
antiretroviral virologic failure, antiretroviral resistance, analytical treatment interruption, fitness cost, gag gene