Fine-tuning the Architecture of Microgels by Varying the Initiator Addition Time

Soft Matter (2025)

Univ Ferrara | Lund Univ | Univ Barcelona | Inst Laue Langevin

Abstract
Poly-N-isopropylacrylamide (PNIPAM) microgels are versatile colloidal-scale polymer networks that exhibit unique responsiveness to external stimuli, such as temperature. While the synthesis of PNIPAM microgels is well-documented, there is limited exploration of how their structural properties can be modified by subtle changes in the polymerization process. In this work, we carefully investigate how varying the time of addition of a common initiator, such as potassium persulfate, during the polymerization process allows a precise control over microgel architecture. Our findings, based on a combination of numerical simulations, scattering, and rheology experiments, reveal that delayed initiator addition results in a more heterogeneous network, characterized by a less extended corona. In contrast, more homogeneous microgels are obtained by adding the initiator all at the start of the synthesis. In this way, the internal mass distribution of the particles can be tuned, highlighting the importance of synthesis timing for optimizing microgel conformation and functionality in tailored applications.
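The abstract does not state which model is used to relate the scattering data to the internal mass distribution. A common choice for PNIPAM microgels in the literature is the fuzzy-sphere form factor, in which a surface-smearing width distinguishes a sharply bounded, homogeneous particle from one with an extended corona. The sketch below is only an illustration under that assumption; the function name and the values of `R` and `sigma` are hypothetical, not parameters from this study.

```python
import numpy as np

def fuzzy_sphere_form_factor(q, R, sigma):
    """P(q) for a sphere of radius R whose surface is smeared over a width sigma.

    A small sigma corresponds to a compact, homogeneous particle; a large sigma
    mimics an extended, fuzzy corona. Illustrative model only (not taken from
    the paper).
    """
    qR = q * R
    hard_sphere_amplitude = 3.0 * (np.sin(qR) - qR * np.cos(qR)) / qR**3
    surface_smearing = np.exp(-(sigma * q) ** 2 / 2.0)
    return (hard_sphere_amplitude * surface_smearing) ** 2

# Compare a homogeneous particle with one carrying an extended corona.
q = np.linspace(1e-3, 0.5, 1000)                                  # scattering vector, nm^-1
homogeneous = fuzzy_sphere_form_factor(q, R=100.0, sigma=5.0)     # nm
fuzzy_corona = fuzzy_sphere_form_factor(q, R=100.0, sigma=20.0)   # nm

# The stronger surface smearing damps the high-q oscillations, which is how
# small-angle scattering distinguishes sharp from fuzzy particle peripheries.
print(homogeneous[:3], fuzzy_corona[:3])
```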
Chat Paper

[Key points]: The study fine-tunes the architecture of PNIPAM microgels by varying the initiator addition time, allowing their internal mass distribution and functionality to be optimized.

[Methods]: A combination of numerical simulations, scattering, and rheology experiments is used to study how the initiator addition time affects the microgel structure.

[Experiments]: By varying the time at which the initiator (e.g., potassium persulfate) is added during polymerization, changes in the internal structure of the microgels were investigated; no specific dataset is named in the text, but the results reveal that delayed initiator addition increases network heterogeneity and reduces the extension of the corona.