Research Records
Conference communications:
Learning conditional linear Gaussian classifiers with probabilistic class labels
Year: 2013

Research Areas
  • Artificial intelligence

Information
Abstract
We study the problem of learning Bayesian classifiers (BCs) when the true class label of the training instances is not known and is replaced, for each instance, by a probability distribution over the class labels. This scenario can arise, e.g., when a group of experts is asked to individually provide a class label for each instance. We particularize the generalized expectation maximization (GEM) algorithm of (Côme et al., 2009, Pattern Recognition 42: 334-348) to learn BCs with different structural complexities: naive Bayes, averaged one-dependence estimators, or general conditional linear Gaussian classifiers. An evaluation on eight datasets shows that BCs learned with GEM perform better than those learned using either the classical expectation maximization algorithm or the potentially wrong class labels. Moreover, these BCs achieve results similar to those of the multivariate Gaussian classifier without having to estimate full covariance matrices.
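As an illustration of the setting described in the abstract (not the authors' implementation), the sketch below fits a Gaussian naive Bayes classifier, the simplest of the structures mentioned, by an EM-style procedure when each training instance carries a probability distribution over class labels instead of a hard label. All names are hypothetical; the step that multiplies the given soft labels by the model likelihood follows the spirit of the GEM scheme of Côme et al. (2009).

```python
import numpy as np

def fit_soft_label_gnb(X, soft_labels, n_iter=20, eps=1e-9):
    """Fit a Gaussian naive Bayes classifier from probabilistic class labels.

    X           : (n, d) array of continuous predictors.
    soft_labels : (n, k) array; row i is a probability distribution over
                  the k class labels for instance i.
    Returns class priors, per-class means and per-class diagonal variances.
    (Minimal sketch under the assumptions stated above.)
    """
    n, d = X.shape
    k = soft_labels.shape[1]
    resp = soft_labels.copy()  # responsibilities start at the given soft labels
    for _ in range(n_iter):
        # M-step: weighted maximum-likelihood estimates of the parameters
        nk = resp.sum(axis=0) + eps                              # effective class counts
        prior = nk / nk.sum()                                    # class priors
        mu = (resp.T @ X) / nk[:, None]                          # per-class means
        var = (resp.T @ X ** 2) / nk[:, None] - mu ** 2 + eps    # per-class variances
        # E-step (GEM-style): combine the soft labels with the model likelihood
        log_lik = np.empty((n, k))
        for c in range(k):
            log_lik[:, c] = np.log(prior[c]) - 0.5 * np.sum(
                np.log(2 * np.pi * var[c]) + (X - mu[c]) ** 2 / var[c], axis=1)
        resp = soft_labels * np.exp(log_lik - log_lik.max(axis=1, keepdims=True))
        resp /= resp.sum(axis=1, keepdims=True) + eps
    return prior, mu, var
```

A new instance would then be classified by evaluating the same per-class log-likelihoods with the fitted parameters and choosing the class that maximizes them.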
International
No
Congress
XV Conferencia de la Asociación Española para la Inteligencia Artificial
Place
Madrid
Reviewers
Yes
ISBN/ISSN
978-3-642-40642-3
DOI
10.1007/978-3-642-40643-0_15
Start Date
17/09/2013
End Date
20/09/2013
From page
139
To page
148
Book
Advances in Artificial Intelligence, Proceedings of the 15th MultiConference of the Spanish Association for Artificial Intelligence, volume 8109 of Lecture Notes in Computer Science
Participants

Related Research Groups, Departments and Institutes
  • Creator: Research Group: COMPUTATIONAL INTELLIGENCE GROUP
  • Department: Inteligencia Artificial