Structural, Vibrational and Luminescence Properties of Solid Solution Based on the (1-X/2) Ce2(WO4)3 + (x/2) Sm2(WO4)3 System
Journal of Molecular Structure (2022), SCI Zone 3
Aix Marseille Univ | CeSigma SA | CEA Cadarache
Abstract
A series of 14 polycrystalline cerium samarium tungstates belonging to the system [(1−x/2)·Ce2(WO4)3 + (x/2)·Sm2(WO4)3] was synthesized by a coprecipitation method followed by thermal treatment at 1000 °C. The polycrystalline samples were characterized by X-ray diffraction, scanning electron microscopy and Raman spectroscopy. Rietveld refinement of their crystal structures showed a quasi-linear variation of the cell parameters with composition, suggesting the formation of a solid solution Ce(2−x)Smx(WO4)3 (0 ≤ x ≤ 2). Raman spectroscopy confirmed the formation of the disordered solid solution. Analysis of the Bragg peak and Raman emission profiles revealed the formation of structural defects. Scanning electron microscopy images show relatively well-crystallized grains. Photoluminescence experiments were performed under polychromatic X-ray excitation delivered by a copper source. The characteristic luminescence of Ce3+ cations was not observed for any composition x. The resulting emissions were ascribed to Sm3+ cations together with additional emission from structural defects. Chromaticity diagrams show that the color coordinates vary with Sm content and structural defects within the orange-red range. © 2022 Elsevier B.V. All rights reserved.
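The quasi-linear variation of cell parameters with composition is the classic Vegard's-law signature of a substitutional solid solution: each cell parameter interpolates linearly between the two endmember values. A minimal sketch of that relation (the endmember a-axis values below are hypothetical placeholders for illustration, not the refined parameters reported in this study):

```python
# Vegard's law for the solid solution Ce(2-x)Smx(WO4)3, 0 <= x <= 2:
# a cell parameter varies linearly between the Ce2(WO4)3 and Sm2(WO4)3
# endmembers, with substitution fraction f = x/2.
def vegard(a_ce: float, a_sm: float, x: float) -> float:
    """Linearly interpolate a cell parameter for composition x in [0, 2]."""
    f = x / 2.0  # fraction of the Sm2(WO4)3 endmember
    return (1.0 - f) * a_ce + f * a_sm

# Hypothetical endmember values in angstroms (illustration only).
a_ce, a_sm = 7.80, 7.68
for x in (0.0, 1.0, 2.0):
    print(f"x = {x}: a = {vegard(a_ce, a_sm, x):.3f} A")
```

Deviations of refined parameters from this straight line (or anomalous Bragg-peak broadening) are what point to the structural defects discussed in the abstract.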
Key words
Samarium cerium tungstates, Chemical substitution, Structural defects, Photoluminescence