
Deep Convolutional Neural Networks Based on Knowledge Distillation for Offline Handwritten Chinese Character Recognition

Journal of Advanced Computational Intelligence and Intelligent Informatics (2024)

Abstract
Handwritten Chinese character recognition (HCCR) is a challenging research area in computer vision. Deep convolutional neural networks (DNNs) have achieved outstanding performance in this field, but they require a large number of parameters and consume substantial memory. To address these issues, this paper proposes an approach based on an attention mechanism and knowledge distillation: the attention mechanism improves feature extraction, while knowledge distillation reduces the number of parameters. The experimental results show that ResNet18 achieves a recognition accuracy of 97.63% on the HCCR dataset with 11.25 million parameters. Compared with other methods, this study improves performance on HCCR.
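The abstract does not include implementation details, but the knowledge-distillation objective it refers to is commonly formulated as a weighted sum of a softened teacher-student KL-divergence term and the usual cross-entropy on ground-truth labels. The sketch below illustrates that standard formulation in PyTorch; the temperature `T` and weight `alpha` are illustrative assumptions, not the paper's reported settings.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Standard knowledge-distillation loss (a sketch, not the paper's exact setup):
    a weighted sum of (1) KL divergence between the teacher's and student's
    temperature-softened class distributions and (2) cross-entropy on labels."""
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # Scale the KL term by T^2 so its gradient magnitude stays comparable
    # to the cross-entropy term, as in the original distillation formulation.
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In this setup, a larger pretrained teacher network produces `teacher_logits`, and the compact student (e.g., ResNet18 in this paper) is trained with this combined loss so it approximates the teacher's output distribution while using far fewer parameters.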
Key words
deep convolutional neural networks, handwritten Chinese character recognition, attention mechanism, knowledge distillation