Top-tuning: A study on transfer learning for an efficient alternative to fine tuning for image classification with fast kernel methods

Image and Vision Computing (2024)

Abstract
The impressive performance of deep learning architectures comes with a massive increase in model complexity. Millions of parameters need to be tuned, with training and inference time, as well as energy consumption, scaling accordingly. But is massive fine-tuning always necessary? In this paper, focusing on image classification, we consider a simple transfer learning approach that exploits pre-trained convolutional features as input to a fast-to-train kernel method. We refer to this approach as top-tuning, since only the kernel classifier is trained on the target dataset. In our study, we perform more than 3000 training processes on 32 small to medium-sized target datasets, a typical situation in which transfer learning is necessary. We show that the top-tuning approach achieves accuracy comparable to fine-tuning, with training time between one and two orders of magnitude smaller. These results suggest that top-tuning is an effective alternative to fine-tuning on small/medium datasets, and is especially useful when training-time efficiency and computational-resource savings are crucial.
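The pipeline described in the abstract can be illustrated with a minimal sketch: a frozen pre-trained backbone produces features, and only a kernel classifier is trained on top of them. The snippet below is illustrative only; it assumes a torchvision ResNet-18 as the feature extractor and a Nystroem-approximated RBF kernel with a linear SVM from scikit-learn as a stand-in for the fast kernel method, which may differ from the specific solver used by the authors.

# Illustrative top-tuning sketch (assumed components, not the authors' exact setup).
import torch
import torch.nn as nn
from torchvision import models
from sklearn.kernel_approximation import Nystroem
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# 1. Frozen convolutional backbone: replace the final classification layer
#    with an identity so the network outputs pooled convolutional features.
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = nn.Identity()   # keep the 512-d feature vector
backbone.eval()

@torch.no_grad()
def extract_features(images):
    # images: (N, 3, 224, 224) tensor, already normalized for ImageNet
    return backbone(images).cpu().numpy()

# 2. Fast-to-train kernel classifier on top of the frozen features.
top_classifier = make_pipeline(
    Nystroem(kernel="rbf", n_components=1000, random_state=0),
    LinearSVC(),
)

# Training touches only the kernel classifier, never the backbone.
# X_train, y_train denote (hypothetical) target-dataset images and labels:
# features = extract_features(X_train)
# top_classifier.fit(features, y_train)

Since the backbone is never updated, feature extraction can be done once per dataset and cached, which is what makes the training time of the classifier on top so small compared with full fine-tuning.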
Keywords
Fast kernel methods, Training on a budget, Fast training, Transfer learning, Image classification