Probing Spin-Dependent Dark Matter Interactions with Li
European Physical Journal C (2022), SCI Q2
Max-Planck-Institut für Physik | INFN | Comenius University | Institut für Hochenergiephysik der Österreichischen Akademie der Wissenschaften | Technische Universität München | Eberhard-Karls-Universität Tübingen | University of Oxford
Abstract
CRESST is one of the most prominent direct detection experiments for dark matter particles with sub-GeV/c^2 masses. One of the advantages of the CRESST experiment is the possibility to include a large variety of nuclides in the target material used to probe dark matter interactions. In this work, we discuss in particular the interactions of dark matter particles with the protons and neutrons of ^6Li. This is now possible thanks to new calculations of the nuclear matrix elements of this specific lithium isotope. To show the potential of this nuclide for probing dark matter interactions, we used data collected previously by a CRESST prototype based on LiAlO_2 and operated in an above-ground test facility at the Max-Planck-Institut für Physik in Munich, Germany. In particular, the inclusion of ^6Li in the limit calculation drastically improves the result obtained for spin-dependent interactions with neutrons over the whole mass range. The improvement is significant, greater than two orders of magnitude for dark matter masses below 1 GeV/c^2, compared to the limit previously published with the same data.
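The gain from including ^6Li can be understood from the standard zero-momentum-transfer expression for the spin-dependent dark-matter-nucleus cross section commonly used in direct-detection analyses; the notation below is generic and not taken from this paper:

```latex
\sigma_{\mathrm{SD}} \;=\; \frac{32}{\pi}\, G_F^2\, \mu^2\, \frac{J+1}{J}
\left( a_p \langle S_p \rangle + a_n \langle S_n \rangle \right)^2
```

Here \mu is the dark-matter-nucleus reduced mass, J the nuclear spin, a_{p,n} the effective couplings to protons and neutrons, and \langle S_{p,n} \rangle the proton and neutron spin expectation values — the nuclear matrix elements whose new calculation for ^6Li this work relies on. A nuclide with a sizeable \langle S_n \rangle, as is the case for ^6Li, therefore directly enhances sensitivity to spin-dependent neutron couplings.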
Related Papers
Testing Spin-Dependent Dark Matter Interactions with Lithium Aluminate Targets in CRESST-III
Physical Review D 2022
Cited 14
Improving ANAIS-112 Sensitivity to DAMA/LIBRA Signal with Machine Learning Techniques
Journal of Cosmology and Astroparticle Physics 2022
Cited 3
Search for New Physics in Low-Energy Electron Recoils from the First LZ Exposure
Physical Review D 2023
Cited 7
Rare event searches with cryogenic detectors
Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 2024
Cited 2