The Discovery Space of ELT-ANDES. Stars and Stellar Populations
Experimental Astronomy (2024)
University of Michigan | Leibniz Institute for Astrophysics Potsdam (AIP) | Instituto de Astrofísica de Canarias | Instituto de Astrofísica e Ciências do Espaço | Instituto de Astrofísica de Andalucía (IAA-CSIC) | Lund University | Université Côte d’Azur | Goethe University Frankfurt | Uppsala University | INAF | Kavli Institute for Cosmology | Università degli Studi di Firenze | Universidade Federal do Rio Grande do Norte | Alma Mater Studiorum | European Southern Observatory | Institut für Astrophysik und Geophysik | Sorbonne Universités | University of Victoria | Royal Military College of Canada
Abstract
The ArmazoNes high Dispersion Echelle Spectrograph (ANDES) is the optical and near-infrared high-resolution echelle spectrograph envisioned for the Extremely Large Telescope (ELT). We present a selection of science cases, supported by new calculations and simulations, in which ANDES could enable major advances in the fields of stars and stellar populations. We focus on three key areas: the physics of stellar atmospheres, structure, and evolution; stars of the Milky Way, the Local Group, and beyond; and the star-planet connection. The key features of ANDES are its wide wavelength coverage at high spectral resolution and its access to the large collecting area of the ELT. These features position ANDES to address the most compelling questions and to enable potentially transformative advances in stellar astrophysics in the decades ahead, including questions that cannot be anticipated today.
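To make the abstract's collecting-area point concrete, the sketch below works through a back-of-the-envelope comparison. It is not from the paper: the 39 m ELT primary, the 8.2 m VLT-class reference aperture, and a resolving power of R ≈ 100,000 are assumed, commonly quoted design values.

```python
import math

# Assumed values (see lead-in); not taken from the abstract itself.
ELT_DIAMETER_M = 39.0   # approximate ELT primary mirror diameter
REF_DIAMETER_M = 8.2    # 8 m-class reference telescope (e.g. one VLT unit)

def collecting_area(diameter_m: float) -> float:
    """Geometric collecting area of a circular aperture (ignores central obstruction)."""
    return math.pi * (diameter_m / 2.0) ** 2

# Photon-collection gain scales with aperture area: (39.0 / 8.2)^2 ~ 23.
gain = collecting_area(ELT_DIAMETER_M) / collecting_area(REF_DIAMETER_M)
print(f"Light-gathering gain over an 8.2 m telescope: ~{gain:.0f}x")

# Resolving power R = lambda / delta_lambda gives the width of one
# resolved spectral element at a given wavelength.
R = 100_000                 # assumed ANDES-class resolving power
wavelength_nm = 500.0
print(f"Resolved element at {wavelength_nm:.0f} nm: {wavelength_nm / R:.4f} nm")
```

Under these assumptions the ELT collects over twenty times more light per exposure than an 8 m-class telescope, which is what makes high-resolution (sub-0.01 nm) spectroscopy feasible for targets far too faint for current facilities.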
Keywords
Star clusters (1567), Stellar atmospheres (1584), Stellar evolution (1599), Stellar physics (1621), Stellar populations (1622), High resolution spectroscopy (2096), Galactic archaeology (2178)