Abstract

Proceedings Abstracts of the Twenty-Fifth International Joint Conference on Artificial Intelligence

Nonparametric Risk and Stability Analysis for Multi-Task Learning Problems / 2146
Xuezhi Wang, Junier B. Oliva, Jeff Schneider, Barnabás Póczos

Multi-task learning attempts to leverage data from multiple domains simultaneously in order to estimate related functions on each domain. For example, transfer learning, a special case of multi-task learning, is often employed when one has a good estimate of a function on a source domain but is unable to estimate a related function well on a target domain using only target data. Multi-task and transfer learning problems are usually solved by imposing some kind of "smooth" relationship among or between tasks. In this paper, we study how different smoothness assumptions on task relations affect the risk upper bounds of algorithms proposed for these problems under different settings. For general multi-task learning, we study a family of algorithms that use a reweighting matrix on task weights to capture the smooth relationship among tasks; this family has many instantiations in the existing literature. Furthermore, for multi-task learning in a transfer learning framework, we study recently proposed algorithms for the "model shift" setting, where the conditional distribution $P(Y|X)$ is allowed to change across tasks but the change is assumed to be smooth. In addition, we illustrate our results with experiments on both simulated and real data.
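To make the "reweighting matrix on task weights" idea concrete, the following is a minimal sketch of one common instantiation from this algorithm family (not necessarily the exact formulation analyzed in the paper): jointly fitting one linear weight vector per task with a graph-Laplacian penalty that couples the task weights according to an assumed task-similarity matrix. The similarity matrix `A`, the penalty strengths, and the function name are illustrative choices.

```python
import numpy as np

def multitask_ridge(Xs, ys, A, lam=1.0, gamma=0.1):
    """Jointly fit one linear weight vector per task (illustrative sketch).

    Xs, ys : lists of per-task design matrices and target vectors.
    A      : (T, T) symmetric nonnegative task-similarity matrix
             (a hypothetical choice of the reweighting structure).
    lam    : strength of the cross-task smoothness penalty
             sum_{s,t} A[s,t] * ||w_s - w_t||^2.
    gamma  : ordinary ridge penalty on each task's weights.
    """
    T = len(Xs)
    d = Xs[0].shape[1]
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian over tasks
    # Joint (T*d, T*d) system: per-task data terms on the block diagonal,
    # plus the Laplacian coupling lam * (L kron I_d) and ridge gamma * I.
    H = lam * np.kron(L, np.eye(d)) + gamma * np.eye(T * d)
    b = np.zeros(T * d)
    for t in range(T):
        sl = slice(t * d, (t + 1) * d)
        H[sl, sl] += Xs[t].T @ Xs[t]
        b[sl] = Xs[t].T @ ys[t]
    W = np.linalg.solve(H, b).reshape(T, d)
    return W                                 # row t holds the weights for task t
```

Different choices of the coupling matrix (e.g., a fully connected task graph, a chain over ordered tasks, or a learned similarity) recover different instantiations of this family; the smoothness assumption enters through how strongly `A` ties related tasks together.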
