Robust Temporal Smoothness in Multi-Task Learning.

AAAI (2023)

Abstract
Multi-task learning models based on the temporal smoothness assumption, in which each time point in a sequence corresponds to a prediction task, assume that adjacent tasks are similar to each other. However, the effect of outliers is not taken into account. In this paper, we show that even a single outlier task can destroy the performance of the entire model. To address this problem, we propose two Robust Temporal Smoothness (RoTS) frameworks. Compared with existing models based on temporal relations, our methods not only capture temporal smoothness information but also identify outlier tasks, without increasing the computational complexity. Detailed theoretical analyses are presented to evaluate the performance of our methods. Experimental results on synthetic and real-life datasets demonstrate the effectiveness of our frameworks. We also discuss several potential applications and extensions of our RoTS frameworks.
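For context, a common way to encode the temporal smoothness assumption in multi-task learning is to penalize differences between the weight vectors of adjacent tasks; this is a generic illustration, not the paper's RoTS formulation, which is not given in this abstract:

\min_{W} \; \sum_{t=1}^{T} L(w_t; X_t, y_t) \;+\; \lambda \sum_{t=2}^{T} \lVert w_t - w_{t-1} \rVert_2^2

Here w_t denotes the model for the task at time point t. A squared penalty of this kind is sensitive to a single aberrant task, which is consistent with the abstract's claim that one outlier task can degrade the entire model; robust variants typically replace the squared norm with a penalty that is less sensitive to outliers.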
Keywords
robust temporal smoothness, multi-task learning