Grouped Multi-Task Learning with Hidden Tasks Enhancement

Jiachun Jin, Jiankun Wang, Lu Sun, Jie Zheng, Mineichi Kudo

ECAI 2023 (2023)

Abstract
In multi-task learning (MTL), multiple prediction tasks are learned jointly, so that generalization performance is improved by transferring information across tasks. However, not all tasks are related, and training unrelated tasks together can degrade prediction performance through negative transfer. To overcome this problem, we propose a novel MTL method that robustly groups correlated tasks into clusters and allows useful information to be transferred only within clusters. The proposed method is based on the assumption that task clusters lie in low-rank subspaces of the parameter space, where both the number of subspaces and their dimensions are unknown. By applying subspace clustering to the task parameters, parameter learning and task grouping are performed in a unified framework. To reduce the error induced by the basic linear learner and robustify the model, the effect of hidden tasks is exploited. Moreover, the framework is extended to a multi-layer architecture that progressively extracts hierarchical subspace structures of tasks, which further improves generalization. An optimization algorithm is proposed, and its effectiveness is validated by experimental results on both synthetic and real-world datasets.
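The core idea — tasks whose parameter vectors lie in the same low-rank subspace should be grouped together — can be illustrated with a minimal sketch. This is not the paper's actual algorithm (which learns parameters, hidden tasks, and the grouping jointly); it is a toy stand-in, assuming task parameters are already given as columns of a matrix and using a simple cosine-affinity graph in place of full subspace clustering.

```python
import numpy as np

def group_tasks_by_subspace(W, threshold=0.2):
    """Cluster task parameter vectors (columns of W) by subspace affinity.

    Toy stand-in for the subspace clustering step: tasks whose parameter
    vectors lie in the same low-rank subspace have high |cosine| affinity,
    while tasks in (near-)orthogonal subspaces have affinity close to zero.
    Clusters are the connected components of the thresholded affinity graph.
    """
    num_tasks = W.shape[1]
    Wn = W / np.linalg.norm(W, axis=0, keepdims=True)
    affinity = np.abs(Wn.T @ Wn)           # |cos| similarity between tasks
    adjacency = affinity > threshold
    # Connected components via depth-first search.
    labels = -np.ones(num_tasks, dtype=int)
    cluster = 0
    for start in range(num_tasks):
        if labels[start] >= 0:
            continue
        stack = [start]
        labels[start] = cluster
        while stack:
            i = stack.pop()
            for j in range(num_tasks):
                if adjacency[i, j] and labels[j] < 0:
                    labels[j] = cluster
                    stack.append(j)
        cluster += 1
    return labels

# Hypothetical setup: parameters of tasks 0-3 lie in span{e1, e2},
# tasks 4-7 in the orthogonal span{e3, e4} of a 10-dim parameter space.
rng = np.random.default_rng(0)
W = np.zeros((10, 8))
W[0:2, 0:4] = rng.uniform(0.5, 1.5, size=(2, 4))
W[2:4, 4:8] = rng.uniform(0.5, 1.5, size=(2, 4))
labels = group_tasks_by_subspace(W)
print(labels)  # tasks 0-3 fall in one cluster, tasks 4-7 in another
```

In the paper's framework this grouping is not a post-hoc step: subspace clustering is coupled with parameter learning (and hidden tasks) in one objective, so the clusters and the per-task parameters are refined together.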
Keywords
hidden, learning, multi-task