File: GUO-THESIS-2017.pdf (application/pdf, 4 MB)

Title: Bayesian Expectation-Maximization-Maximization: a latent-mixture-modeling-based Bayesian algorithm for the three-parameter logistic model
Author(s): Guo, Shaoyang
Advisor(s): Zhang, Jinming
Contributor(s): Chang, Hua-Hua; Anderson, Carolyn J.
Department / Program: Educational Psychology
Discipline: Educational Psychology
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Bayesian Expectation-Maximization-Maximization (BEMM); Three-parameter logistic model (3PLM)
Abstract: This study proposes Bayesian Expectation-Maximization-Maximization (Bayesian EMM, or BEMM), a feasible alternative Bayesian algorithm for the three-parameter logistic model (3PLM). The Bayesian EMM combines the advantages of the EMM algorithm and the Bayesian approach: it resolves the EMM algorithm's issue of inaccurate estimates for a few items, and it alleviates the negative effect of different priors in the traditional Bayesian EM. Simulation studies and real-data examples indicate that: (1) the Bayesian EMM produces more accurate and stable item estimates; (2) standard errors (SEs) yielded by the Bayesian EMM tend to be smaller than those from the traditional Bayesian EM; and (3) the Bayesian EMM is insensitive to priors, so the negative influence of different prior choices is minimized.
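For context, the three-parameter logistic model (3PLM) named in the abstract gives the probability of a correct response as P(θ) = c + (1 − c) / (1 + exp(−a(θ − b))). A minimal sketch of that item response function follows; the parameter values used below are illustrative only and are not taken from the thesis:

```python
import math

def irt_3pl(theta, a, b, c):
    """3PL item response function: probability of a correct response
    given ability theta, discrimination a, difficulty b, and
    pseudo-guessing (lower asymptote) c."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Illustrative parameter values (not from the thesis):
p = irt_3pl(theta=0.0, a=1.2, b=-0.5, c=0.2)

# As theta decreases the probability approaches the guessing floor c;
# as theta increases it approaches 1.
```

The lower asymptote c is what distinguishes the 3PLM from the two-parameter model and is a known source of the estimation difficulties that Bayesian approaches such as the BEMM are designed to address.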
Issue Date: 2017-04-05
Rights Information: Copyright 2017 Shaoyang Guo
Date Available in IDEALS: 2017-08-10
Date Deposited: 2017-05
