Talk

Using $\cal H$-matrices to approximate large scale kernel machines

  • Jochen Garcke (Australian National University, Canberra)
G3 10 (Lecture hall)

Abstract

After a short introduction to machine learning and data mining, we will describe how the solution of a number of machine learning methods is represented by means of kernel functions. Typically, these functions have global support, and the computation of the exact solution involves a densely populated matrix requiring ${\cal O}(N^2)$ units of storage for $N$ data points. We will present first results on how ${\cal H}$-matrices can be applied to find data-sparse approximations of this matrix, making large scale machine learning tractable.
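
The following is a minimal Python sketch, not from the talk itself, illustrating the storage issue the abstract refers to: a dense Gaussian kernel matrix needs ${\cal O}(N^2)$ entries, while an off-diagonal block between two point clusters has low numerical rank and can be stored in factored form, which is the kind of block-wise compression that ${\cal H}$-matrices apply recursively. The kernel, bandwidth, tolerance, and cluster split used here are illustrative assumptions.

    import numpy as np

    def gaussian_kernel(X, Y, sigma=1.0):
        """Dense kernel block K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
        sq = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Y**2, axis=1)[None, :]
              - 2.0 * X @ Y.T)
        return np.exp(-sq / (2.0 * sigma**2))

    rng = np.random.default_rng(0)
    N = 1000
    # 1D data points, sorted so that index ranges correspond to spatial clusters
    X = np.sort(rng.uniform(0.0, 10.0, size=(N, 1)), axis=0)

    # Full kernel matrix: N^2 entries of storage
    K = gaussian_kernel(X, X)
    print("dense storage:", K.size)

    # Off-diagonal block between the two halves of the point set
    block = gaussian_kernel(X[:N // 2], X[N // 2:])

    # Truncated SVD gives a rank-k factorization A @ B.T with k << N/2,
    # reducing storage for this block from (N/2)^2 to 2 * (N/2) * k entries
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    k = int(np.sum(s > 1e-8 * s[0]))        # numerical rank at tolerance 1e-8
    A = U[:, :k] * s[:k]
    B = Vt[:k, :].T
    print("block rank:", k, "low-rank storage:", A.size + B.size)
    print("relative error:",
          np.linalg.norm(block - A @ B.T) / np.linalg.norm(block))

An ${\cal H}$-matrix organizes the index set into a cluster tree and applies this kind of low-rank representation only to admissible blocks, keeping small near-diagonal blocks dense; the sketch above compresses a single block to show the effect.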