Local Rademacher Complexity-Based Learning Guarantees For Multi-Task Learning
Keywords
Excess Risk Bounds; Local Rademacher Complexity; Multi-task Learning
Abstract
We show a Talagrand-type concentration inequality for Multi-Task Learning (MTL), with which we establish sharp excess risk bounds for MTL in terms of the Local Rademacher Complexity (LRC). We also give a new bound on the LRC for norm-regularized hypothesis classes, which applies not only to MTL, but also to the standard Single-Task Learning (STL) setting. By combining both results, one can easily derive fast-rate bounds on the excess risk for many prominent MTL methods, including, as we demonstrate, Schatten-norm, group-norm, and graph-regularized MTL. The derived bounds reflect a relationship akin to a conservation law of asymptotic convergence rates. When compared to the rates obtained via a traditional, global Rademacher analysis, this relationship allows for trading off slower rates with respect to the number of tasks for faster rates with respect to the number of available samples per task.
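For readers unfamiliar with the central quantity named in the abstract, the LaTeX sketch below gives standard formulations of the multi-task empirical Rademacher complexity and its local restriction, in the spirit of the LRC literature (cf. Bartlett, Bousquet, and Mendelson, 2005). The notation (T tasks, n samples per task, Rademacher variables sigma_{ti}) is illustrative and not necessarily the paper's own.

% Multi-task empirical Rademacher complexity of a vector-valued
% hypothesis class \mathcal{F}, with f = (f_1, \dots, f_T):
\[
  \hat{\mathcal{R}}(\mathcal{F})
  = \mathbb{E}_{\sigma} \sup_{f \in \mathcal{F}}
    \frac{1}{T n} \sum_{t=1}^{T} \sum_{i=1}^{n}
    \sigma_{ti}\, f_t(x_{ti}).
\]
% The local version restricts the supremum to hypotheses of small
% variance; it is this restriction that enables rates faster than
% the global O(1/\sqrt{n}) analysis:
\[
  \mathcal{R}(\mathcal{F}; r)
  = \mathbb{E} \sup_{\substack{f \in \mathcal{F} \\ \frac{1}{T}\sum_{t=1}^{T} \mathbb{E}[f_t^2] \le r}}
    \frac{1}{T n} \sum_{t=1}^{T} \sum_{i=1}^{n}
    \sigma_{ti}\, f_t(x_{ti}).
\]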
Publication Date
8-1-2018
Publication Title
Journal of Machine Learning Research
Volume
19
Document Type
Article
Personal Identifier
scopus
Copyright Status
Unknown
Scopus ID
85053376688 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/85053376688
STARS Citation
Yousefi, Niloofar; Lei, Yunwen; Kloft, Marius; Mollaghasemi, Mansooreh; and Anagnostopoulos, Georgios C., "Local Rademacher Complexity-Based Learning Guarantees For Multi-Task Learning" (2018). Scopus Export 2015-2019. 8416.
https://stars.library.ucf.edu/scopus2015/8416