Introduction and Establishment of SARS-CoV-2 Gamma Variant in New York City in Early 2021
JOURNAL OF INFECTIOUS DISEASES (2022)
University of California San Diego | New York City Department of Health and Mental Hygiene | Scripps Research | University of Oxford | University of California Los Angeles
Abstract
Even though international travel restrictions between the United States and Brazil, the country where the Gamma variant was first detected, had been in place since May 2020, they did not prevent the lineage from becoming established in New York City in early 2021.
Background: Monitoring the emergence and spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variants is an important public health objective. We investigated how the Gamma variant became established in New York City (NYC) in early 2021 despite travel restrictions intended to prevent viral spread from Brazil, the country where the variant was first identified.
Methods: We performed phylogeographic analysis on 15,967 Gamma sequences sampled between 10 March and 1 May 2021 to identify geographic sources of Gamma lineages introduced into NYC. We identified locally circulating Gamma transmission clusters and inferred the timing of their establishment in NYC.
Results: We identified 16 phylogenetically distinct Gamma clusters established in NYC (cluster sizes ranged from 2 to 108 genomes); most were introduced from Florida and Illinois, and only 1 directly from Brazil. By the time the first Gamma case was reported by genomic surveillance in NYC on 10 March, the majority (57%) of circulating Gamma lineages had already been established in the city for at least 2 weeks.
Conclusions: Although travel from Brazil to the United States was restricted from May 2020 through the end of the study period, this restriction did not prevent Gamma from becoming established in NYC, as most introductions occurred from domestic locations.
Key words
SARS-CoV-2, Gamma variant, New York City, travel restrictions, public health
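As an illustration of the cluster-identification step described in the Methods, the sketch below flags maximal clades whose tips were all sampled in NYC as candidate locally circulating transmission clusters. It is a simplified stand-in for the paper's phylogeographic analysis, which infers geographic sources rather than relying on tip labels alone; the file name gamma.nwk and the "NYC|..." tip-label convention are hypothetical, and Biopython is assumed to be available.

# Minimal sketch (not the authors' pipeline): treat each maximal clade whose
# tips are all NYC samples as one candidate locally circulating cluster.
# Assumes a hypothetical Newick tree "gamma.nwk" whose tip labels begin with
# the sampling location, e.g. "NYC|EPI_ISL_XXXX|2021-03-15".
from Bio import Phylo  # Biopython

def is_nyc(leaf):
    # Hypothetical naming convention: location prefix before the first "|".
    return leaf.name is not None and leaf.name.split("|")[0] == "NYC"

tree = Phylo.read("gamma.nwk", "newick")

clusters = []
# get_nonterminals() visits internal clades in preorder, so a parent is seen
# before its children; keeping only clades not nested inside an earlier hit
# leaves just the maximal NYC-only clades.
for clade in tree.get_nonterminals():
    if all(is_nyc(t) for t in clade.get_terminals()):
        if not any(clade in kept.get_nonterminals() for kept in clusters):
            clusters.append(clade)

sizes = sorted(len(c.get_terminals()) for c in clusters)
print(f"{len(clusters)} candidate NYC clusters; sizes: {sizes}")

This gives only a rough first-pass cluster list from tip labels; the study additionally inferred when each cluster became established in NYC, which this sketch does not attempt.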