Uncertainty in Parameterized Convection Remains a Key Obstacle for Estimating Surface Fluxes of Carbon Dioxide
ATMOSPHERIC CHEMISTRY AND PHYSICS (2023)
Colorado State University | University of Colorado
Abstract
The analysis of observed atmospheric trace-gas mole fractions to infer surface sources and sinks of chemical species relies heavily on simulated atmospheric transport. The chemical transport models (CTMs) used in flux-inversion systems are commonly configured to reproduce the atmospheric transport of a general circulation model (GCM) as closely as possible. CTMs generally have the dual advantages of computational efficiency and improved tracer conservation compared to their parent GCMs, but they usually simplify the representations of important processes. This is especially the case for high-frequency vertical motions associated with diffusion and convection. Using common-flux experiments, we quantify the importance of parameterized vertical processes for explaining systematic differences in tracer transport between two commonly used CTMs. We find that differences in modeled column-average CO2 are strongly correlated with the differences in the models' convection. The parameterization of diffusion is more important near the surface due to its role in representing planetary-boundary-layer (PBL) mixing. Accordingly, simulated near-surface in situ measurements are more strongly impacted by this process than are simulated total-column averages. Both diffusive and convective vertical mixing tend to ventilate the lower atmosphere, so near-surface measurements may only constrain the net vertical mixing and not the balance between these two processes. Remote-sensing-based retrievals of total-column CO2, with their increased sensitivity to convection, may provide important new constraints on parameterized vertical motions.
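The abstract contrasts near-surface mole fractions with column-average CO2 (commonly written XCO2), which is a pressure-weighted mean over the model layers: layers containing more air mass contribute proportionally more to the column value. The sketch below illustrates that standard weighting for a hypothetical three-layer column; it is not the authors' code, and the layer values and pressure edges are invented for illustration.

```python
import numpy as np

def column_average(co2_ppm, p_edges_hpa):
    """Pressure-weighted column mean of layer CO2 mole fractions.

    co2_ppm     : CO2 mole fraction in each model layer (ppm)
    p_edges_hpa : layer-edge pressures (hPa), length len(co2_ppm) + 1
    """
    dp = np.abs(np.diff(p_edges_hpa))   # air mass per layer is proportional to dp
    weights = dp / dp.sum()             # normalized pressure weights
    return float(np.sum(weights * co2_ppm))

# Hypothetical column: CO2 enhanced near the surface, decaying with height.
p_edges = np.array([1000.0, 800.0, 500.0, 100.0])   # hPa
co2 = np.array([420.0, 415.0, 412.0])               # ppm
xco2 = column_average(co2, p_edges)
```

Because the deep layers aloft carry most of the weight here, the column average sits closer to the free-tropospheric values than to the surface-enhanced layer, which is one way to see why total-column retrievals are more sensitive to convective redistribution than near-surface measurements are.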
Key words
Chemical Transport Model, Emission Modeling