Title

A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems

Keywords

Machine learning; optimization methods; pattern recognition; supervised learning; support vector machines (SVMs)

Abstract

Over the past few years, multiple kernel learning (MKL) has received significant attention among data-driven feature selection techniques in the context of kernel-based learning. MKL formulations have been devised and solved for a broad spectrum of machine learning problems, including multitask learning (MTL). Solving different MKL formulations usually requires designing an algorithm tailored to the problem at hand, which is typically a nontrivial task. In this paper, we present a general multitask multiple kernel learning (MT-MKL) framework that subsumes well-known MT-MKL formulations, as well as several important MKL approaches to single-task problems. We then derive a simple algorithm that can solve the unifying framework. To demonstrate the flexibility of the proposed framework, we formulate a new learning problem, namely partially-shared common space MT-MKL, and show its merits through experimentation. © 2013 IEEE.
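For readers unfamiliar with the MKL idea the abstract refers to, the sketch below illustrates the generic setup: a kernel machine is trained on a convex combination of base kernels whose weights are themselves learned, here via a simple alternating update in the spirit of ℓ1-norm MKL wrapper methods. This is a minimal, hypothetical example using NumPy and scikit-learn with synthetic data; it is not the unified MT-MKL algorithm derived in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel, polynomial_kernel

# Toy binary classification data (hypothetical): two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, size=(40, 5)),
               rng.normal(+1.0, 1.0, size=(40, 5))])
y = np.array([-1] * 40 + [+1] * 40)

# A small dictionary of base kernels; MKL learns how to combine them.
base_kernels = [
    linear_kernel(X),
    rbf_kernel(X, gamma=0.1),
    polynomial_kernel(X, degree=2),
]

# Start from uniform kernel weights on the simplex.
theta = np.ones(len(base_kernels)) / len(base_kernels)

for _ in range(10):
    # Step 1: with the weights fixed, train an SVM on the combined kernel.
    K = sum(t * Km for t, Km in zip(theta, base_kernels))
    svm = SVC(kernel="precomputed", C=1.0).fit(K, y)

    # Step 2: with the SVM fixed, update the weights using the standard
    # closed-form l1-MKL rule: theta_m proportional to ||w_m||, where
    # ||w_m||^2 = theta_m^2 * alpha^T diag(y) K_m diag(y) alpha.
    sv = svm.support_
    dual = svm.dual_coef_.ravel()          # entries are y_i * alpha_i
    q = np.array([dual @ Km[np.ix_(sv, sv)] @ dual for Km in base_kernels])
    w_norms = theta * np.sqrt(np.maximum(q, 0.0))
    theta = w_norms / w_norms.sum()

print("learned kernel weights:", np.round(theta, 3))
```

In a multitask variant of this idea, several such learners would share all or part of the kernel combination across tasks; the paper's framework covers a range of such sharing schemes, including the partially-shared common space formulation mentioned in the abstract.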

Publication Date

1-1-2014

Publication Title

IEEE Transactions on Neural Networks and Learning Systems

Volume

25

Issue

7

Pages

1287-1297

Document Type

Article

Personal Identifier

scopus

DOI Link

https://doi.org/10.1109/TNNLS.2013.2291772

Scopus ID

84902279861

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/84902279861
