Discovery of a Young, Highly Scattered Pulsar PSR J1032-5804 with the Australian Square Kilometre Array Pathfinder
The Astrophysical Journal (2024)
Curtin University | University of Wisconsin | ATNF | University of Toronto | University of Sydney | ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav)
Abstract
We report the discovery of a young, highly scattered pulsar in a search for highly circularly polarized radio sources as part of the Australian Square Kilometre Array Pathfinder Variables and Slow Transients survey. In follow-up observations with the Parkes radio telescope, Murriyang, we identified PSR J1032−5804 and measured a period of 78.7 ms, a dispersion measure of 819 ± 4 pc cm⁻³, a rotation measure of −2000 ± 1 rad m⁻², and a characteristic age of 34.6 kyr. We found a pulse scattering timescale at 3 GHz of ∼22 ms, implying a timescale at 1 GHz of ∼3845 ms, making it the third most scattered pulsar known and explaining its nondetection in previous pulsar surveys. We discuss the identification of a possible pulsar wind nebula and supernova remnant in the pulsar's local environment by analyzing the pulsar's spectral energy distribution and the surrounding extended emission in multiwavelength images. Our result highlights the possibility of identifying extremely scattered pulsars from radio continuum images. Ongoing and future large-scale radio continuum surveys will offer us an unprecedented opportunity to find more extreme pulsars (e.g., highly scattered, highly intermittent, and highly accelerated), which will enhance our understanding of the characteristics of pulsars and the interstellar medium.
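The quoted measurements fix several derived quantities through standard pulsar relations. As a minimal sketch (not code from the paper), the snippet below back-computes the scattering spectral index implied by the two quoted timescales, assuming a power law τ_sc ∝ ν⁻ᵅ; the spin-down rate implied by the characteristic age via τ_c = P/(2Ṗ); and the mean line-of-sight magnetic field from the usual ⟨B∥⟩ ≈ 1.232 RM/DM μG relation. All inputs are the values quoted in the abstract.

```python
import math

# Quantities quoted in the abstract for PSR J1032-5804
P = 78.7e-3          # spin period (s)
DM = 819.0           # dispersion measure (pc cm^-3)
RM = -2000.0         # rotation measure (rad m^-2)
tau_c_kyr = 34.6     # characteristic age (kyr)
tau_3ghz = 22e-3     # scattering timescale at 3 GHz (s)
tau_1ghz = 3845e-3   # implied scattering timescale at 1 GHz (s)

# Scattering index implied by the two quoted timescales,
# assuming a single power law tau_sc(nu) ∝ nu^-alpha
alpha = math.log(tau_1ghz / tau_3ghz) / math.log(3.0 / 1.0)
print(f"implied scattering index alpha ≈ {alpha:.1f}")    # ~4.7

# Spin-down rate from the characteristic age tau_c = P / (2 Pdot),
# which assumes a braking index of 3 and a negligible birth period
tau_c = tau_c_kyr * 1e3 * 3.156e7                         # kyr -> s
Pdot = P / (2.0 * tau_c)
print(f"implied spin-down rate Pdot ≈ {Pdot:.1e} s/s")    # ~3.6e-14

# Mean line-of-sight magnetic field from RM and DM (standard relation)
B_par = 1.232 * RM / DM                                   # microgauss
print(f"mean <B_par> ≈ {B_par:.1f} uG (negative: directed away)")
```

Note that the implied scattering index (~4.7) is steeper than the Kolmogorov expectation of 4.4, consistent with the extreme scattering the abstract describes; the exact index the authors fit is in the paper itself.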
Key words
Neutron stars, Galactic radio sources, Radio pulsars, Interstellar scattering