Spatial pyramid match kernels for brain image classification

Jonathan Young*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

1 Citation (Scopus)

Abstract

The most widely used techniques for whole-brain image classification rely on kernel machines such as support vector machines and Gaussian processes, owing to their computational efficiency, accurate prediction, and suitability for the combination of small sample sizes and high dimensionality that makes neuroimaging data challenging. Such methods generally use linear kernels, which assume an exact correspondence between the voxels of two brain images. This paper introduces spatial pyramid matching kernels from the computer vision literature to this problem, allowing that assumption to be relaxed to compensate for registration errors. The kernel formulation is compared against linear kernels on the model problems of gender prediction for classification and age prediction for regression, using a nested cross-validation procedure to robustly select the optimal kernel parameters and assess the results. The spatial pyramid matching kernel outperforms the linear one in both tasks.
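The spatial pyramid matching kernel compares two images by intersecting feature histograms computed over increasingly fine spatial grids, with finer levels weighted more heavily. Below is a minimal sketch of this idea for 3D volumes, assuming simple intensity histograms as the per-cell features and the standard pyramid-level weighting from the computer vision literature; the paper's actual feature extraction, weighting, and parameter choices may differ.

```python
import numpy as np

def spatial_pyramid_match_kernel(img_a, img_b, levels=2, n_bins=16, value_range=(0.0, 1.0)):
    """Illustrative spatial pyramid match kernel between two 3D volumes.

    At pyramid level l the volume is split into 2**l cells per axis; an
    intensity histogram is computed per cell (an assumption made here for
    simplicity), and corresponding cells are compared with histogram
    intersection. Finer levels receive larger weights.
    """
    total = 0.0
    for level in range(levels + 1):
        cells = 2 ** level
        # Standard pyramid weights: coarsest level gets 1 / 2**levels,
        # level l >= 1 gets 1 / 2**(levels - l + 1).
        weight = 1.0 / (2 ** levels) if level == 0 else 1.0 / (2 ** (levels - level + 1))
        intersection = 0.0
        for ix, iy, iz in np.ndindex(cells, cells, cells):
            cell_a = _cell(img_a, cells, ix, iy, iz)
            cell_b = _cell(img_b, cells, ix, iy, iz)
            h_a, _ = np.histogram(cell_a, bins=n_bins, range=value_range)
            h_b, _ = np.histogram(cell_b, bins=n_bins, range=value_range)
            # Histogram intersection: overlap between matched cells.
            intersection += np.minimum(h_a, h_b).sum()
        total += weight * intersection
    return total

def _cell(img, cells, ix, iy, iz):
    """Return the (ix, iy, iz)-th block of a volume split into `cells` pieces per axis."""
    slices = []
    for dim, idx in zip(img.shape, (ix, iy, iz)):
        edges = np.linspace(0, dim, cells + 1).astype(int)
        slices.append(slice(edges[idx], edges[idx + 1]))
    return img[tuple(slices)]
```

A Gram matrix built with such a kernel can be supplied to any kernel machine that accepts precomputed kernels, for example scikit-learn's SVC(kernel='precomputed'), which is one way the comparison against a linear kernel could be carried out in practice.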

Original language: English
Title of host publication: PRNI 2016 - 6th International Workshop on Pattern Recognition in Neuroimaging
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Print): 9781467365307
DOIs
Publication status: Published - 24 Aug 2016
Event: 6th International Workshop on Pattern Recognition in Neuroimaging, PRNI 2016 - Trento, Italy
Duration: 22 Jun 2016 - 24 Jun 2016

Conference

Conference: 6th International Workshop on Pattern Recognition in Neuroimaging, PRNI 2016
Country/Territory: Italy
City: Trento
Period: 22/06/2016 - 24/06/2016

Keywords

  • classification
  • feature histograms
  • kernels
  • MRI
  • regression
  • SVM
