Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression
ACM Transactions on Asian and Low-Resource Language Information Processing (2024)
Key words: Explanation, knowledge distillation, model compression