
Towards efficient feature sharing in MIMO architectures

IEEE Conference on Computer Vision and Pattern Recognition (2022)

Abstract
Multi-input multi-output architectures propose to train multiple subnetworks within one base network and then average the subnetwork predictions to benefit from ensembling for free. Despite some relative success, these architectures are wasteful in their use of parameters. Indeed, we highlight in this paper that the learned subnetworks fail to share even generic features, which limits their applicability on smaller mobile and AR/VR devices. We posit this behavior stems from an ill-posed part of the multi-input multi-output framework. To solve this issue, we propose a novel unmixing step in MIMO architectures that allows subnetworks to properly share features. Preliminary experiments on CIFAR-100 show our adjustments enable feature sharing and improve model performance for small architectures.
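For readers unfamiliar with the MIMO setup described above, the following is a minimal sketch of the general idea, not the authors' implementation: M independent inputs are concatenated along the channel axis, a single shared backbone processes them, and M classification heads produce one prediction each. All names (MIMONet, feat_dim, num_subnets) are hypothetical, and the tiny backbone is only a stand-in for the base network used in the paper.

```python
import torch
import torch.nn as nn

class MIMONet(nn.Module):
    # Hypothetical MIMO classifier: M inputs share one backbone, each has its own head.
    def __init__(self, num_classes=100, num_subnets=3, in_channels=3, feat_dim=128):
        super().__init__()
        self.num_subnets = num_subnets
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels * num_subnets, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, num_classes) for _ in range(num_subnets)]
        )

    def forward(self, xs):
        # xs: list of M tensors of shape (B, C, H, W), one per subnetwork.
        feats = self.backbone(torch.cat(xs, dim=1))
        return [head(feats) for head in self.heads]

# Training pairs each head with its own independently shuffled (input, label) stream.
# At inference, the same image fills every slot and the head logits are averaged,
# which is the "ensembling for free" the abstract refers to.
model = MIMONet()
x = torch.randn(4, 3, 32, 32)
logits = model([x, x, x])                   # same image in all three slots
ensemble = torch.stack(logits).mean(dim=0)  # (4, 100) averaged ensemble prediction
```

The unmixing step proposed in the paper modifies how subnetwork features are separated inside such an architecture; it is not reproduced here since the abstract gives no implementation details.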
Key words
efficient feature sharing