
Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression

ACM Transactions on Asian and Low-Resource Language Information Processing (2024)

Key words
Explanation, knowledge distillation, model compression