Regularization Through Simultaneous Learning: A Case Study for Hop Classification

Pedro Henrique Nascimento Castro, Gabriel Cássia Fortuna, Rafael Alves Bonfim de Queiroz, Gladston Juliano Prates Moreira

CoRR (2023)

Abstract
Overfitting remains a prevalent challenge in deep neural networks, leading to suboptimal real-world performance. Employing regularization techniques is a common strategy to counter this challenge, improving model generalization. This paper proposes Simultaneous Learning, a novel regularization approach drawing on Transfer Learning and Multi-task Learning principles, applied specifically to the classification of hop varieties - an integral component of beer production. Our approach harnesses the power of auxiliary datasets in synergy with the target dataset to amplify the acquisition of highly relevant features. Through a strategic modification of the model's final layer, we enable the simultaneous classification of both datasets without the necessity to treat them as disparate tasks. To realize this, we formulate a loss function that includes an inter-group penalty. We conducted experimental evaluations using the InceptionV3 and ResNet50 models, designating the UFOP-HVD hop leaf dataset as the target and ImageNet and PlantNet as auxiliary datasets. Our proposed method exhibited a substantial performance advantage over models without regularization and those adopting dropout regularization, with accuracy improvements ranging from 5 to 22 percentage points. Additionally, we introduce a technique for interpretability devised to assess the quality of features by analyzing correlations among class features in the network's convolutional layers.
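To make the idea concrete, below is a minimal PyTorch-style sketch of how a single classification head covering both the target and auxiliary label groups, together with an inter-group penalty, might be set up. The class counts, the penalty weight `lambda_group`, and the specific form of the penalty (probability mass leaking into the wrong dataset's class group) are illustrative assumptions inferred from the abstract, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

# Hypothetical setup: one classifier head whose output units cover the target
# classes (e.g. UFOP-HVD hop varieties) followed by the auxiliary classes
# (e.g. a PlantNet subset). The counts below are assumptions for illustration.
N_TARGET_CLASSES = 12
N_AUX_CLASSES = 100

backbone = models.resnet50(weights="IMAGENET1K_V1")
backbone.fc = nn.Linear(backbone.fc.in_features,
                        N_TARGET_CLASSES + N_AUX_CLASSES)

def simultaneous_loss(logits, labels, is_target, lambda_group=0.1):
    """Cross-entropy over the joint label space plus an inter-group penalty.

    `labels` index the joint space (auxiliary labels offset by
    N_TARGET_CLASSES); `is_target` is a boolean tensor marking samples from
    the target dataset. The penalty discourages probability mass assigned to
    the other dataset's class group -- one plausible reading of the
    inter-group penalty described in the abstract.
    """
    ce = F.cross_entropy(logits, labels)
    probs = F.softmax(logits, dim=1)
    target_mass = probs[:, :N_TARGET_CLASSES].sum(dim=1)
    aux_mass = probs[:, N_TARGET_CLASSES:].sum(dim=1)
    # For each sample, the probability mass that leaked into the wrong group.
    wrong_mass = torch.where(is_target, aux_mass, target_mass)
    return ce + lambda_group * wrong_mass.mean()
```

In this reading, batches can mix samples from both datasets, so no separate task heads or alternating training phases are needed; the shared backbone and single head learn features useful for both label groups while the penalty keeps the two groups from interfering at the output.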