Scalable Hierarchical Mixture of Gaussian Processes for Pattern Classification

T. N. A. Nguyen, A. Bouzerdoum, S. L. Phung

Research output: Contribution to conference › Paper › peer-review

3 Citations (Scopus)

Abstract

This paper introduces a novel Gaussian process (GP) classification method that combines the advantages of global and local GP approximations through a two-layer hierarchical model. The upper layer consists of a global sparse GP that coarsely models the entire dataset. The lower layer is a mixture of GP experts that uses local information to learn a fine-grained model. A variational inference algorithm is developed to learn the global GP, the experts, and the gating network simultaneously; stochastic optimization can be employed for large-scale problems. Experiments on benchmark binary classification datasets demonstrate the advantages of the method in terms of scalability and classification accuracy.
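The abstract's two-layer structure (a coarse global GP plus gated local GP experts) can be illustrated with a minimal sketch in scikit-learn. Note this is only a schematic analogue: the paper trains the global sparse GP, the experts, and the gating network jointly by variational inference, whereas here each piece is fit separately for clarity, the "sparse" global GP is crudely approximated by a random data subset, and the gating network is replaced by soft k-means cluster weights. All variable names and the 0.5/0.5 blending weight are hypothetical choices, not from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy binary dataset: label = 1 outside the unit circle.
X = rng.normal(size=(400, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)

# Upper layer: a "global" GP fit on a small random subset of the data,
# a crude stand-in for a sparse GP with inducing points.
subset = rng.choice(len(X), size=50, replace=False)
global_gp = GaussianProcessClassifier(kernel=1.0 * RBF(1.0)).fit(X[subset], y[subset])

def fit_expert(Xk, yk):
    # Guard: a region containing a single class gets a constant expert.
    if len(np.unique(yk)) < 2:
        p = float(yk.mean())

        class Const:
            def predict_proba(self, Xs):
                ones = np.full(len(Xs), p)
                return np.column_stack([1.0 - ones, ones])

        return Const()
    return GaussianProcessClassifier(kernel=1.0 * RBF(1.0)).fit(Xk, yk)

# Lower layer: partition the input space and fit one local GP expert per
# region; the gating network is approximated by soft cluster memberships.
K = 4
km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X)
experts = [fit_expert(X[km.labels_ == k], y[km.labels_ == k]) for k in range(K)]

def predict_proba(Xs):
    # Gating weights: softmax over negative distances to cluster centres.
    d = km.transform(Xs)  # shape (n, K): distance to each centre
    w = np.exp(-d) / np.exp(-d).sum(axis=1, keepdims=True)
    local = np.stack([e.predict_proba(Xs)[:, 1] for e in experts], axis=1)
    # Blend the coarse global prediction with the gated local experts.
    return 0.5 * global_gp.predict_proba(Xs)[:, 1] + 0.5 * (w * local).sum(axis=1)

acc = ((predict_proba(X) > 0.5).astype(int) == y).mean()
```

The intuition the sketch preserves is the paper's division of labour: the global model captures coarse structure cheaply, while the experts refine predictions in their local regions, with the gate deciding how much each expert contributes at a given input.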
Original language: English
Pages: 2466-2470
Number of pages: 5
Publication status: Published - 13 Sept 2018
Event: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) - Calgary, Canada
Duration: 15 Apr 2018 - 20 Apr 2018

Conference

Conference: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Country/Territory: Canada
City: Calgary
Period: 15/04/18 - 20/04/18

Keywords

  • Gaussian processes
  • Pattern classification
  • Variational inference

