Title

A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems

Authors

C. Li; M. Georgiopoulos; G. C. Anagnostopoulos

Comments

Authors: contact us about adding a copy of your work at STARS@ucf.edu

Abbreviated Journal Title

IEEE Trans. Neural Netw. Learn. Syst.

Keywords

Machine learning; optimization methods; pattern recognition; supervised learning; support vector machines (SVMs); SUPPORT; SPARSITY; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic

Abstract

Over the past few years, multiple kernel learning (MKL) has received significant attention among data-driven feature selection techniques in the context of kernel-based learning. MKL formulations have been devised and solved for a broad spectrum of machine learning problems, including multitask learning (MTL). Solving different MKL formulations usually involves designing algorithms that are tailored to the problem at hand, which is, typically, a nontrivial accomplishment. In this paper we present a general multitask multiple kernel learning (MT-MKL) framework that subsumes well-known MT-MKL formulations, as well as several important MKL approaches on single-task problems. We then derive a simple algorithm that can solve the unifying framework. To demonstrate the flexibility of the proposed framework, we formulate a new learning problem, namely partially-shared common space MT-MKL, and demonstrate its merits through experimentation.
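For readers unfamiliar with the MKL setting the abstract refers to, the sketch below illustrates the standard idea of learning with a convex combination of base kernels. The data, bandwidths, and combination weights are illustrative assumptions, and the weights are fixed by hand rather than learned; this is not the paper's MT-MKL framework or algorithm, only a minimal example of the kernel-combination concept it builds on.

```python
# Minimal sketch of the generic MKL kernel combination: K = sum_m theta_m * K_m,
# with theta_m >= 0 and sum_m theta_m = 1. Weights are fixed here for
# illustration; MKL methods (including the paper's MT-MKL framework) learn them.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)

# Toy binary classification data (hypothetical).
X_train = rng.normal(size=(100, 5))
y_train = np.sign(X_train[:, 0] + 0.5 * X_train[:, 1])
X_test = rng.normal(size=(20, 5))

# Two base Gaussian kernels with different bandwidths (assumed values).
gammas = [0.1, 1.0]
thetas = [0.7, 0.3]  # fixed convex combination weights, summing to 1

def combined_kernel(A, B):
    """Convex combination of the base RBF kernels evaluated between A and B."""
    return sum(theta * rbf_kernel(A, B, gamma=g)
               for theta, g in zip(thetas, gammas))

# Train an SVM on the precomputed combined kernel.
clf = SVC(kernel="precomputed", C=1.0)
clf.fit(combined_kernel(X_train, X_train), y_train)

# Predict using kernel values between test and training points.
y_pred = clf.predict(combined_kernel(X_test, X_train))
print(y_pred)
```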

Volume

25

Issue/Number

7

Publication Date

1-1-2014

Document Type

Article

Language

English

First Page

1287

Last Page

1297

WOS Identifier

WOS:000337906300004

ISSN

2162-237X
