Multiple Kernel Learning Algorithms and an Efficient Bayesian Formulation

Supervisor
Mehmet Gönen - Helsinki Institute for Information Technology HIIT
Date and time
Tuesday, September 25, 2012 -- 4:45 p.m. refreshments; 5:00 p.m. seminar start -- Sala Verde
Place
Ca' Vignal - Piramide, Floor 0, Sala Verde
Programme Director
Umberto Castellani
Publication date
September 13, 2012
Department
Computer Science  

Summary

In recent years, several methods have been proposed to combine multiple kernels instead of using a single one. These different kernels may correspond to different notions of similarity, or they may use information coming from multiple sources (different representations or different feature subsets). To organize these approaches and highlight their similarities and differences, I will give a taxonomy of multiple kernel learning algorithms and review several of them. A minimal sketch of the kind of kernel combination these algorithms operate on is given after this paragraph.
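As an illustration only (not any specific algorithm from the talk), the sketch below builds three base kernels on the same data and forms a fixed-weight convex combination. The base kernels and the weights `eta` are hypothetical placeholders for whatever similarities and combination weights a given multiple kernel learning method would use or learn.

```python
import numpy as np

def linear_kernel(X):
    """Linear kernel: inner products between all pairs of samples."""
    return X @ X.T

def rbf_kernel(X, gamma=0.1):
    """Gaussian (RBF) kernel with bandwidth parameter gamma."""
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T
    return np.exp(-gamma * sq_dists)

def polynomial_kernel(X, degree=2, coef0=1.0):
    """Polynomial kernel of a given degree."""
    return (X @ X.T + coef0) ** degree

# Toy data standing in for one feature representation.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))

# Base kernels: different notions of similarity on the same samples.
kernels = [linear_kernel(X), rbf_kernel(X), polynomial_kernel(X)]

# Hypothetical combination weights (an MKL algorithm would learn these);
# a convex combination of valid kernels is again a valid kernel.
eta = np.array([0.2, 0.5, 0.3])
K_combined = sum(w * K for w, K in zip(eta, kernels))
```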

Most previous research on such methods has focused on computational efficiency. However, it is still not feasible to combine many kernels using existing Bayesian approaches because of their high time complexity. We propose a fully conjugate Bayesian formulation and derive a deterministic variational approximation, which allows us to combine hundreds or thousands of kernels very efficiently. Experiments with large numbers of kernels on benchmark data sets show that our inference method is quite fast, requiring less than a minute. On one bioinformatics and three image recognition data sets, our method achieves better generalization performance than previously reported results.
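The following sketch is not the proposed Bayesian formulation; it only illustrates where a combined, precomputed kernel such as the one above is ultimately used, namely inside a standard kernel classifier. The use of scikit-learn's SVC with a precomputed kernel and the random labels are assumptions made purely for a self-contained example.

```python
import numpy as np
from sklearn.svm import SVC

# Assume K_combined (an n x n Gram matrix) is available,
# e.g. from the combination sketch above; labels here are random toy labels.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=K_combined.shape[0])

# Train a classifier directly on the precomputed combined kernel.
clf = SVC(kernel="precomputed", C=1.0)
clf.fit(K_combined, y)

# Scoring on the training samples reuses the same Gram matrix;
# for new samples one would pass their kernel values against the training set.
train_accuracy = clf.score(K_combined, y)
```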




