
Mixtures of experts estimate a posteriori probabilities

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

The mixtures of experts (ME) model offers a modular structure suitable for a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation in terms of a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that for classification problems the minimization of this ME error function leads to ME outputs estimating the a posteriori probabilities of class membership of the input vector.
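The abstract's central claim can be illustrated with a small sketch. The following is a minimal, hypothetical mixture-of-experts forward pass (not the paper's implementation): a softmax gating network produces mixing weights over the experts, each expert produces class posteriors via a softmax, and the ME output is the gate-weighted mixture. All parameter names and the linear form of the gate and experts are illustrative assumptions; the point is that the mixture output is itself a valid probability vector, the quantity the paper shows converges to the a posteriori class probabilities under the ME error function.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d, n_experts, n_classes = 4, 3, 2

# Hypothetical linear parameters for the gate and the experts (illustrative only)
W_gate = rng.normal(size=(d, n_experts))
W_experts = rng.normal(size=(n_experts, d, n_classes))

def me_output(x):
    """ME output: sum_i g_i(x) * p_i(class | x), a mixture of expert posteriors."""
    g = softmax(x @ W_gate)                            # gating probabilities, shape (n_experts,)
    p = softmax(np.einsum('idc,d->ic', W_experts, x))  # each expert's class posteriors, (n_experts, n_classes)
    return g @ p                                       # mixture output, shape (n_classes,)

x = rng.normal(size=d)
y = me_output(x)
# y is non-negative and sums to one, as a posterior-probability estimate must
```

Because both the gate and the experts emit probability vectors, the convex combination `g @ p` is again a probability vector over classes; training with the mixture-model error function described in the paper is what makes these outputs estimates of the true class posteriors.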

Original language: English
Title of host publication: Artificial Neural Networks - ICANN 1997 - 7th International Conference, Proceedings
Publisher: Springer Verlag
Pages: 499-504
Number of pages: 6
ISBN (Print): 3540636315, 9783540636311
DOIs
Publication status: Published - 1997
Event: 7th International Conference on Artificial Neural Networks, ICANN 1997 - Lausanne, Switzerland
Duration: 8 Oct 1997 - 10 Oct 1997

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1327
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 7th International Conference on Artificial Neural Networks, ICANN 1997
Country/Territory: Switzerland
City: Lausanne
Period: 08/10/1997 - 10/10/1997
