
Reframing Family-School Partnerships to Disrupt Disenfranchisement of Black Families and Promote Reciprocity in Collaboration

Journal of School Psychology (2024)

University of Wisconsin | University of Wisconsin–Madison | University of California, Los Angeles

Abstract
Research has long demonstrated the benefits of family-school partnerships. However, these benefits often fail to generalize to all families, especially Black families. A historical and ongoing pattern of discrimination and exclusion has limited the benefits Black families derive from family-school partnerships. A major contributing factor is the narrow way in which schools define family engagement. Such narrow definitions often marginalize families from non-dominant backgrounds, particularly Black families, and reinforce harmful narratives that Black parents and families are uninvolved in their children's education. The combination of continued discrimination and exclusion with these harmful narratives has undermined Black family-school partnering. However, schools can work to repair harm and rebuild partnerships with Black families. In this article, we advance a framework for such work. After grounding the need for this framework in historical context, we emphasize three essential components of equitable Black family-school partnerships: (a) grounding relationship building in social justice, (b) integrating reciprocity into family-school relationships, and (c) using multiple and non-dominant methods and modalities to build relationships.
Keywords
Black families, family-school partnerships