
Molecular Dynamics Simulation of Polymer Nanocomposites with Supramolecular Network Constructed via Functionalized Polymer End-Grafted Nanoparticles

Polymers (2023)

Beijing Technol & Business Univ | Beijing Univ Chem Technol

Abstract
Since the proposal of self-healing materials, numerous researchers have focused on exploring their potential applications in flexible sensors, bionic robots, satellites, etc. However, there have been few studies on the relationship between the morphology of the dynamic crosslink network and the comprehensive properties of self-healing polymer nanocomposites (PNCs). In this study, we designed a series of modified nanoparticles with different sphericity (η) to establish a supramolecular network, which provides the self-healing ability of PNCs. We analyzed the relationship between the morphology of the supramolecular network and the mechanical performance and self-healing behavior. We observed that as η increased, the distribution of the supramolecular network became more uniform in most cases. Examination of the segment dynamics of polymer chains showed that the completeness of the supramolecular network significantly hindered the mobility of polymer matrix chains. The mechanical performance and self-healing behavior of the PNCs showed that the supramolecular network mainly contributed to the mechanical performance, while the self-healing efficiency was dominated by the variation of η. We also observed that choosing an appropriate grafting density effectively enhances the mechanical and self-healing performance of PNCs. This study provides a unique guideline for designing and fabricating self-healing PNCs with modified nanoparticles (NPs).
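The abstract characterizes the modified nanoparticles by their sphericity η. The paper's exact definition is not given here; as a hedged illustration, the standard Wadell sphericity (ratio of the surface area of a volume-equivalent sphere to the particle's actual surface area, equal to 1 for a perfect sphere and smaller for less spherical shapes) can be computed as:

```python
import math

def sphericity(volume: float, surface_area: float) -> float:
    """Wadell sphericity: surface area of a sphere with the same volume
    as the particle, divided by the particle's actual surface area.
    Equals 1.0 for a perfect sphere, < 1.0 otherwise."""
    return (math.pi ** (1 / 3)) * (6 * volume) ** (2 / 3) / surface_area

# A sphere of radius r = 2 has sphericity exactly 1.
r = 2.0
v_sphere = (4 / 3) * math.pi * r ** 3
a_sphere = 4 * math.pi * r ** 2
print(round(sphericity(v_sphere, a_sphere), 6))  # 1.0

# A unit cube (V = 1, A = 6) is less spherical: (pi/6)^(1/3) ~ 0.806.
print(round(sphericity(1.0, 6.0), 3))
```

Note this is only the conventional geometric definition; for grafted nanoparticles in a simulation, the authors may define η over the particle core or the grafted corona.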
Key words
self-healing materials, supramolecular crosslink network, molecular dynamics simulation