Cost functions to estimate a posteriori probabilities in multiclass problems
Year of Publication:
Authors: Cid-Sueiro, J., J. I. Arribas, S. Urbán-Muñoz, and A. R. Figueiras-Vidal
Journal: IEEE Transactions on Neural Networks
Keywords: Cost functions, Estimation, Functions, Learning algorithms, Multiclass problems, Neural networks, Pattern recognition, Probability, Problem solving, Random processes, Stochastic gradient learning rule
The problem of designing cost functions to estimate a posteriori probabilities in multiclass problems is addressed in this paper. We establish necessary and sufficient conditions that these costs must satisfy in one-class, one-output networks whose outputs are consistent with probability laws. We focus our attention on a particular subset of these cost functions: those satisfying two commonly desirable properties, symmetry and separability (well-known cost functions, such as the quadratic cost and the cross-entropy, are particular cases in this subset). Finally, we present a stochastic gradient learning rule for single-layer networks that is universal in the sense that it minimizes a general version of these cost functions for a wide family of nonlinear activation functions.
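
As a rough illustration of the setting (not the paper's general learning rule), the following minimal Python sketch trains a single-layer network by stochastic gradient descent using one particular member of the symmetric, separable family named in the abstract, the cross-entropy cost with a softmax activation. Under such a cost the network outputs can be read as estimates of the class posterior probabilities. All function and variable names here are illustrative assumptions, not taken from the paper.

# Minimal sketch: single-layer softmax network trained by stochastic gradient
# descent on the cross-entropy cost, a particular case of the symmetric,
# separable cost functions discussed above. After training, the outputs
# approximate the a posteriori probabilities P(class | x).
import numpy as np

def softmax(z):
    z = z - z.max()              # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sgd_posterior_estimator(X, labels, n_classes, lr=0.1, epochs=20, seed=0):
    """X: (N, d) input patterns; labels: (N,) integer class indices."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    W = np.zeros((n_classes, d))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(N):
            y = softmax(W @ X[i] + b)      # network outputs, nonnegative and summing to one
            t = np.zeros(n_classes)
            t[labels[i]] = 1.0             # 1-of-c target coding
            err = y - t                    # gradient of the cross-entropy w.r.t. the pre-activations
            W -= lr * np.outer(err, X[i])  # stochastic gradient update
            b -= lr * err
    return W, b

# Example usage (hypothetical data): after training, softmax(W @ x + b) estimates P(class | x).
# X = np.random.randn(200, 2); labels = (X[:, 0] > 0).astype(int)
# W, b = sgd_posterior_estimator(X, labels, n_classes=2)
# print(softmax(W @ X[0] + b))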