Title

Pareto-Path Multitask Multiple Kernel Learning

Authors

C. Li; M. Georgiopoulos; G. C. Anagnostopoulos

Abbreviated Journal Title

IEEE Trans. Neural Netw. Learn. Syst.

Keywords

Machine learning; optimization methods; pattern recognition; supervised learning; support vector machines (SVM); Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic

Abstract

A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with a (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving better classification performance when compared with similar MTL approaches.
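
The following is a minimal sketch of the scalarization view the abstract alludes to; the notation (f_t for the per-task SVM objective, theta for the shared variables, mu_t for the conic weights) is assumed for illustration and is not taken from the paper.

% Sketch only: averaging the task objectives is one fixed weighted-sum
% scalarization of the multiobjective problem described in the abstract.
\[
  \min_{\theta}\; \frac{1}{T}\sum_{t=1}^{T} f_t(\theta)
  \quad\Longleftrightarrow\quad
  \min_{\theta}\; \sum_{t=1}^{T} \mu_t\, f_t(\theta),
  \qquad \mu_t = \tfrac{1}{T},
\]
% whose minimizer is a single point on the Pareto front of
% (f_1(\theta), ..., f_T(\theta)). Sweeping the conic weights \mu_t > 0
% instead yields a family of Pareto-optimal trade-offs, which is the general
% idea behind tracing a path on the Pareto front rather than fixing one point.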

Journal Title

IEEE Transactions on Neural Networks and Learning Systems

Volume

26

Issue/Number

1

Publication Date

1-1-2015

Document Type

Article

Language

English

First Page

51

Last Page

61

WOS Identifier

WOS:000348854800005

ISSN

2162-237X
