
Sun-Joo Cho

Associate Professor of Psychology and Human Development

Research topics include generalized latent variable models, generalized linear mixed-effects models, and parameter estimation, with a focus on item response modeling.

The data complexities Dr. Cho has dealt with include (1) multiple manifest person categories, such as a control group versus an experimental group in an experimental design; (2) multiple latent person categories (mixtures or latent classes), such as a mastery group versus a non-mastery group on a cognitive test; (3) multiple manifest item groups that may lead to multidimensionality, such as number-operation, measurement, and representation item groups on a math test; (4) multiple manifest person groups, such as schools in which students are nested in a multilevel (hierarchical) data structure; (5) multiple time points, such as pretest and posttest in intervention studies; (6) intensive time series with many time points (e.g., from eye tracking, fMRI, and emotional responses); (7) response processes; and (8) spatial information.
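
Many of these data structures are handled by viewing item response models as generalized linear mixed models. As a minimal sketch (a generic Rasch formulation, not the model of any particular paper below), the probability that person $p$ answers binary item $i$ correctly can be written as

$$\Pr(Y_{pi} = 1 \mid \theta_p) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}, \qquad \theta_p \sim N(0, \sigma^2),$$

where the person ability $\theta_p$ enters as a random intercept and the item difficulty $\beta_i$ as a fixed effect. The complexities above extend this base model: latent person classes make parameters class-specific, manifest item groups add dimensions to $\theta_p$, and nesting of students in schools adds higher-level random effects.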

Dr. Cho has collaborated with researchers from a variety of disciplines, including reading education, math education, special education, psycholinguistics, clinical psychology, cognitive psychology, neuropsychology, and audiology. She serves on the editorial boards of Behavior Research Methods, International Journal of Testing, Journal of Educational Measurement, and Psychological Methods. She is also a Chancellor Faculty Fellow (2019-2021). Dr. Cho has current research projects funded by the National Science Foundation (NSF), the Institute of Education Sciences (IES), and the National Institute of Mental Health (NIMH).

Representative Publications

* denotes co-authors at Vanderbilt University.

Methodological Papers in Peer-Reviewed Journals  

 

Substantive Papers in Peer-Reviewed Journals 

  • Spencer, M.*, Cho, S.-J., & Cutting, L. E.* (2019). Item response theory analyses of the Delis-Kaplan Executive Function System card sorting subtest. Child Neuropsychology, 25, 198-216. [Multidimensional graded response models, IRT DIF, and explanatory item response models were applied.]
  • Levin, D. T.*, Seiffert, A.*, Cho, S.-J., & Carter, K.* (2018). Are failures to look, to represent, or to learn associated with change blindness during screen-capture video learning? Cognitive Research: Principles and Implications, 3, 49. [Mixed-effects logistic regression models were applied.]
  • Nick, E. A.*, Cole, D. A.*, Cho, S.-J., Smith, D. K.*, Carter, T. G.*, & Zelkowitz, R.* (2018). The online social support scale: Measure development and validation. Psychological Assessment, 30, 1127-1143. [For IRT analyses, multidimensional graded response models and IRT DIF were applied.]
  • Cole, D. A.*, Goodman, S., Garber, J.*, Cullum, K. A., Cho, S.-J., Rights, J. D.*, Felton, J. W., Jacquez, F. M., Korelitz, K. E.*, & Simon, H. F. M.* (2018). Validating parent and child forms of the parent perception inventory. Psychological Assessment, 30, 1065-1081. [For IRT analyses, multidimensional graded response models and IRT DIF were applied.]
  • Hornsby, B. W.*, Gustafson, S.*, Lancaster, H., Cho, S.-J., Camarata, S.*, & Bess, F. H.* (2017). Subjective fatigue in children with hearing loss using self- and parent-proxy reports. American Journal of Audiology, 26, 393-407. [Nonparametric ANOVA for a between-within design was applied.]
  • Goodwin, A. P.*, & Cho, S.-J. (2016). Unraveling vocabulary learning: Reader and item-level predictors of vocabulary learning within comprehension instruction for fifth- and sixth-graders. Scientific Studies of Reading, 20, 490-514. [Generalized linear mixed modeling for doubly multilevel binary longitudinal data (Cho & Goodwin, 2017) was applied.]
  • Goodwin, A. P.*, Cho, S.-J., & Nichols, S.* (2016). Ways to 'WIN' at word learning. The Reading Teacher. [Generalized linear mixed modeling for doubly multilevel binary longitudinal data (Cho & Goodwin, 2017) was applied.]
  • Lee, W.-y.*, Cho, S.-J., McGugin, R. W.*, Van Gulick, A. B.*, & Gauthier, I.* (2015). Differential item functioning analysis of the Vanderbilt Expertise Test for Cars (VETcar). Journal of Vision, 15. http://jov.arvojournals.org/article.aspx?articleid=2449199. [IRT DIF detection methods and multigroup item response models were applied.]
  • Cho, S.-J., Wilmer, J., Herzmann, G., McGugin, R.*, Fiset, D., Van Gulick, A. B.*, Ryan, K.*, & Gauthier, I.* (2015). Item response theory analyses of the Cambridge Face Memory Test (CFMT). Psychological Assessment, 27, 552-566. [Exploratory bi-factor item response models, explanatory item response models, and IRT DIF detection methods were applied.]
  • Bottge, B. A., Ma, X., Gassaway, L., Toland, M. D., Butler, M., & Cho, S.-J. (2014). Effects of blended instructional models on math performance. Exceptional Children, 80, 423-437. [Three-level hierarchical linear models for repeated measures were applied.]
  • Goodwin, A. P.*, Gilbert, J. K.*, Cho, S.-J., & Kearns, D. M. (2014). Probing lexical representations: Simultaneous modeling of word and reader contributions to multidimensional lexical representations. Journal of Educational Psychology, 106, 448-468. [Explanatory multidimensional multilevel random item response models (Cho, Gilbert, & Goodwin, 2013) were applied.]
  • Miller, A. C., Davis, N.*, Gilbert, J. K.*, Cho, S.-J., Toste, J. R., Street, J.*, & Cutting, L. E.* (2014). Novel approaches to examine passage, student, and question effects on reading comprehension. Learning Disabilities Research & Practice, 29, 25-35. [Linear and nonlinear models with nested and crossed random effects were applied.]
  • Bottge, B. A., & Cho, S.-J. (2013). Effects of enhanced anchored instruction on skills aligned to common core math standards. Learning Disabilities: A Multidisciplinary Journal, 19, 73-83. [Multilevel longitudinal item response models were applied.]
  • Goodwin, A. P.*, Gilbert, J. K.*, & Cho, S.-J. (2013). Morphological contributions to adolescent word reading: An item response approach. Reading Research Quarterly, 48, 39-60. [Random item response models and explanatory item response models were applied.]
  • Cole, D. A.*, Cho, S.-J., Martin, N. C.*, Youngstrom, E. A., Curry, J. F., Findling, R. L., Compas, B. E.*, Goodyer, I. M., Rohde, P., Weissman, M., Essex, M. J., Hyde, J. S., Forehand, R., Slattery, M. J., Felton, J. W.*, & Maxwell, M. A.* (2012). Are increased weight and appetite useful indicators of depression in children and adolescents? Journal of Abnormal Psychology, 121, 838-851. [Exploratory, explanatory, and multiple-group multidimensional graded response models were applied.]
  • Cho, S.-J., Bottge, B. A., Cohen, A. S., & Kim, S.-H. (2011). Detecting cognitive change in the math skills of low-achieving adolescents. Journal of Special Education, 45, 67-76. [A mixture longitudinal item response model was applied.]

 

Book Chapters

  • De Boeck, P., & Cho, S.-J. (forthcoming). IRTree modeling of cognitive processes based on outcome and intermediate data. Maryland Assessment Research Center (MARC).
  • Cho, S.-J., Brown-Schmidt, S.*, Naveiras, M.*, & De Boeck, P. (forthcoming). A dynamic generalized mixed effect model for intensive binary spatio-temporal data from an eye-tracking technique. Maryland Assessment Research Center (MARC).
  • De Boeck, P., Cho, S.-J., & Wilson, M. (2016). Explanatory item response models: An approach to cognitive assessment. In A. Rupp & J. Leighton (Eds.), Handbook of cognition and assessment (pp. 249-266). Hoboken, NJ: Wiley Blackwell.
  • Cohen, A. S., & Cho, S.-J. (2016). Information criteria. In W. J. van der Linden (Ed.), Handbook of item response theory: Vol. 2. Statistical tools (pp. 363-378). Boca Raton, FL: Chapman & Hall/CRC Press.

 


Honors

  • Vanderbilt University Chancellor Faculty Fellow (2019-2021)
  • Vanderbilt University Provost Research Studios (PRS) Award (2018)
  • Vanderbilt University Trans-Institutional Program (TIPs) Award (co-PI) (2016-2018)
    Study Title: Understanding digital dominance in teaching and learning: An interdisciplinary approach
  • Vanderbilt University Research Scholar Grant Award (2016)
  • National Council on Measurement in Education (NCME) Bradley Hanson Award for Contributions to Educational Measurement (2016)
    Study Title: Multilevel reliability measures in a multilevel item response theory framework
  • National Council on Measurement in Education (NCME) Award for an Outstanding Example of an Application of Educational Measurement Technology to a Specific Problem (2014)
    Study Title: An application to simultaneous investigation of word and person contributions to word reading and lexical representations using random item response models
  • National Academy of Education/Spencer Postdoctoral Fellowship (9/2013 - 6/2015)
    Study Title: Evaluating educational programs with a new item response theory perspective
  • National Council on Measurement in Education (NCME) Award for an Outstanding Example of an Application of Educational Measurement Technology to a Specific Problem (2011)
    Study Title: Latent transition analysis with a mixture IRT measurement model
  • State of the Art Lecturer, Psychometric Society (2010)
    Study Title: Random item response models