Pareto-Path Multitask Multiple Kernel Learning
Abbreviated Journal Title: IEEE Trans. Neural Netw. Learn. Syst.
Machine learning; optimization methods; pattern recognition; supervised learning; support vector machines (SVM); Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) approach is to optimize the sum (and, thus, the average) of the objective functions with a (partially) shared kernel function, which allows information sharing among the tasks. We point out that the solution obtained in this way corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem that considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this observation, and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of the objective functions as a special case. Using the algorithms we derive, we demonstrate through a series of experiments that the framework achieves better classification performance than other similar MTL approaches.
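The abstract's core observation can be illustrated on a toy problem. The sketch below is not the paper's MT-MKL algorithm; it only demonstrates, for two hypothetical convex "task objectives" with a closed-form minimizer, how each conic (here, convex) combination of objectives yields one point on the Pareto front, and how sweeping the combination weights traces a path along that front, with equal weighting (the "average of objectives") recovering just a single point.

```python
# Toy illustration (not the paper's method): weighted-sum scalarization of two
# convex task objectives f1(x) = x^2 and f2(x) = (x - 1)^2. These objectives
# and their closed-form minimizer are assumptions made for the sketch.

def scalarized_minimizer(lam):
    """Minimize lam * f1(x) + (1 - lam) * f2(x).

    Setting the derivative 2*lam*x + 2*(1 - lam)*(x - 1) to zero
    gives the closed-form solution x* = 1 - lam.
    """
    return 1.0 - lam

def pareto_point(lam):
    """Return (f1(x*), f2(x*)) for the minimizer of the lam-weighted sum."""
    x = scalarized_minimizer(lam)
    return (x ** 2, (x - 1.0) ** 2)

# Sweeping the weight traces a path of solutions along the Pareto front ...
path = [pareto_point(lam) for lam in (0.0, 0.25, 0.5, 0.75, 1.0)]

# ... whereas optimizing the plain average of the objectives (lam = 0.5)
# corresponds to only one of those points.
avg_point = pareto_point(0.5)
```

Each weight setting trades the two objectives off differently; the equal-weight solution `avg_point` sits at `(0.25, 0.25)`, one interior point of the five-point path computed above.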
IEEE Transactions on Neural Networks and Learning Systems
"Pareto-Path Multitask Multiple Kernel Learning" (2015). Faculty Bibliography 2010s. 6657.