Authors: Diaa Al Mohamad and Michel Broniatowski
Abstract: Minimum divergence estimators are derived through the dual form of the divergence in parametric models. These estimators generalize the classical maximum likelihood ones. Models with unobserved data, such as mixture models, can be estimated with EM algorithms, which are proved to converge to stationary points of the likelihood function under general assumptions. This paper presents an extension of the EM algorithm based on minimization of the dual approximation of the divergence between the empirical measure and the model using a proximal-type algorithm. The algorithm converges to the stationary points of the empirical criterion under general conditions pertaining to the divergence and the model. Robustness properties of this algorithm are also presented. We provide another proof of convergence of the EM algorithm in a two-component Gaussian mixture. Simulations on Gaussian and Weibull mixtures are performed to compare the results with the MLE.
Keywords: EM algorithm, Divergences, Proximal point algorithm, Optimization