Sun-Joo Cho
Professor of Psychology and Human Development
Vanderbilt Data Science Institute Affiliate Faculty; LIVE Learning Innovation Incubator Affiliate Faculty
Research topics include generalized latent variable models, generalized linear and nonlinear mixed-effects models, generalized additive mixed models, mixed-effects machine learning, parameter estimation, model assessment and selection, and model diagnostics, with a focus on item response, multilevel, and longitudinal/time-series modeling.
The data complexity Dr. Cho has addressed includes:
1. multiple manifest person categories, such as a control group versus a treatment group in an experimental design;
2. multiple latent person categories (mixtures or latent classes), such as phenogroups;
3. multiple item groups that may lead to multidimensionality, such as number operation, measurement, and representation item groups in a math test;
4. multiple groups, such as hospitals, in which patients are nested in a multilevel (or hierarchical) data structure;
5. repeated measures, such as pretest and posttest in intervention studies;
6. intensive (many time points) binary, ordinal, nominal, and count time series (e.g., from ambulatory physiological recording, wearable devices, eye-tracking, emotional responses, experience sampling methods, ecological momentary assessment, dynamic treatment regimes, and N-of-1 or single-case trials);
7. response processes (e.g., multinomial processing);
8. spatial dependence;
9. multiple sequences or multivariate time series from multi-sourced big process data;
10. nonlinear interactions;
11. multiway categorical data;
12. functional response time effects (e.g., in signal detection theory and item response theory).
Dr. Cho has collaborated with researchers from a wide variety of disciplines, including reading education, math education, special education, psycholinguistics, clinical psychology, cognitive psychology, neuropsychology, medicine, and computer science (machine learning, deep learning, and AI applications). She is the Editor-in-Chief of the British Journal of Mathematical and Statistical Psychology, an associate editor of the Journal of Educational Measurement and Psychometrika, and a consulting editor of Behavior Research Methods, Psychological Methods, and the International Journal of Testing. She was also named a National Academy of Education/Spencer Postdoctoral Fellow (2013), a Vanderbilt Chancellor Faculty Fellow (2019–2021), and an Association for Psychological Science (APS) Fellow (Quantitative Field, 2020). Dr. Cho has had research projects funded by the National Science Foundation (NSF), the National Institutes of Health (NIH) (e.g., NIMH), and the U.S. Department of Education Institute of Education Sciences (IES).
Representative Publications
* denotes co-authors at Vanderbilt University or Vanderbilt University Medical Center.
Methodological Papers in Peer-Reviewed Journals
- Cho, S.-J., Goodwin, A. P.*, Naveiras, M., & De Boeck, P. (in press). Modeling nonlinear effects of person-by-item covariates in explanatory item response models: Exploratory plots and modeling using smooth functions. Journal of Educational Measurement. [Funding was supported by the U.S. Department of Education Institute of Education Sciences (IES); The data and the R code used in the illustration are available on the Open Science Framework.]
- Cho, S.-J., Brown-Schmidt, S.*, Clough, S.*, & Duff, M.* (in press). Comparing functional trend and learning among groups in intensive binary longitudinal eye-tracking data using by-variable smooth functions of GAMM. Psychometrika. [This paper is part of a special issue on "Model Identification and Estimation for Longitudinal Data in Practice" in Psychometrika; Funding was supported by the NIDCD grant R01 NIH DC017926; The data and the R code used in the illustration are available on the Open Science Framework.]
- Cho, S.-J., Goodwin, A. P.*, Naveiras, M., & Salas, J.* (2024). Differential and functional response time item analysis: An application to understanding paper versus digital reading processes. Journal of Educational Measurement, 61, 219–251. [Funding was supported by the U.S. Department of Education Institute of Education Sciences (IES); The data and the R code used in the illustration are available on the Open Science Framework.]
- Cho, S.-J.#, Wu, H.*#, & Naveiras, M. (2024). The effective sample size in Bayesian information criterion for level-specific fixed and random effects selection in a two-level nested model. British Journal of Mathematical and Statistical Psychology, 77, 289–315. #The first and second authors contributed equally to this work. Preprint
- Cho, S.-J., Preacher, K. J.*, Yaremych, H.*, Naveiras, M.*, Fuchs, D.*, & Fuchs, L. S.* (2024). Modeling variability in treatment effects for cluster randomized controlled trials using by-variable smooth functions in a generalized additive mixed model. Behavior Research Methods, 56, 2094–2113. [R code for parameter estimation and data visualization can be found in the supplementary materials.]
- Cho, S.-J. (2024). Modelling change processes in multivariate interrupted time series data using a multivariate dynamic additive model: An application to heart rate and blood pressure self-monitoring in heart failure with drug changes. Journal of the Royal Statistical Society Series C: Applied Statistics, 73, 123–142. [R code and an example data set to fit the model can be found here.]
- Naveiras, M.*, & Cho, S.-J. (2023). Using auxiliary item information in the item parameter estimation of a graded response model for a small to medium sample size: Empirical versus hierarchical Bayes estimation. Applied Psychological Measurement, 47, 478–495. [Supplementary materials and R functions for Bayesian estimation can be found here.]
- Cho, S.-J., Brown-Schmidt, S.*, De Boeck, P., Naveiras, M.*, Yoon, S. O., & Benjamin, A. (2023). Incorporating functional response time effects into a signal detection theory model. Psychometrika, 88, 1056–1086. [Funding was supported by the National Science Foundation (SES 1851690); Data and R code can be found here.]
- Cho, S.-J., De Boeck, P., Naveiras, M.*, & Ervin, H.* (2022). Level-specific residuals and diagnostic measures, plots, and tests for random effects selection in multilevel and mixed models. Behavior Research Methods, 54, 2178–2220. [Funding was supported by the National Science Foundation (SES 1851690); R code for level-specific residual calculations and diagnostic measures, plots, and tests can be found here.]
- Cho, S.-J., Preacher, K. J.*, Yaremych, H.*, Naveiras, M.*, Fuchs, D.*, & Fuchs, L. S.* (2022). Modeling multilevel nonlinear treatment-by-covariate interactions in cluster randomized controlled trials using a generalized additive mixed model. British Journal of Mathematical and Statistical Psychology, 75, 493–521. [R code for parameter estimation and data visualization can be found in the supplementary materials.]
- Cho, S.-J., Brown-Schmidt, S.*, De Boeck, P., & Naveiras, M.* (2022). Space-time modeling of intensive binary time series eye-tracking data using a generalized additive logistic regression model. Psychological Methods, 27, 307–346. [Funding was supported by the National Science Foundation (SES 1851690); R code for parameter estimation and data visualization can be found in the paper.]
  - A tutorial on fitting a generalized additive logistic model to intensive binary time-series eye-tracking data using R can be found here. Illustrative data can be found here.
- Cho, S.-J., Naveiras, M.*, & Barton, E. E.* (2022). Modeling multivariate count time series data with a vector Poisson lognormal additive model: Applications to testing intervention effects in single-case designs. Multivariate Behavioral Research, 57, 422–440. [Code for Bayesian implementation of a cubic regression spline in a vector Poisson lognormal additive model can be found in the paper; Supplementary materials are here.]
- De Boeck, P., & Cho, S.-J. (2021). Not all DIF is shaped similarly. Psychometrika, 86, 712–716. [Patient-Reported Outcomes Measurement Information System (PROMIS) Special Section.]
- Cho, S.-J., Watson, D. G.*, Jacobs, C., & Naveiras, M.* (2021). A Markov mixed-effect multinomial logistic regression model for nominal repeated measures with an application to syntactic self-priming effects. Multivariate Behavioral Research, 56, 476–495. [Code for Bayesian analysis can be found in the paper.]
- Cho, S.-J., Brown-Schmidt, S.*, De Boeck, P., & Shen, J.* (2020). Modeling intensive polytomous time series eye-tracking data: A dynamic tree-based item response model. Psychometrika, 85, 154–184. [Funding was supported in part by the National Science Foundation (SES 1851690).]
  - A tutorial on fitting a dynamic tree-based item response model using R (Laplace approximation) can be found here.
  - Stan code for Bayesian analysis can be found here.
  - A poster presented at the (virtual) International Meeting of the Psychometric Society 2020 can be found here.
  - Researchers in substantive areas may be interested in reading the following book chapter to apply a dynamic tree-based item response model: Brown-Schmidt, S.*, Naveiras, M.*, De Boeck, P., & Cho, S.-J. (2020). Statistical modeling of intensive categorical time series eye-tracking data using dynamic generalized linear mixed-effect models with crossed random effects. In a special issue "Gazing toward the future: Advances in eye movement theory and applications," Psychology of Learning and Motivation series (Volume 73).
  - Some extensions of dynamic tree-based item response models are described in the following book chapter: De Boeck, P., & Cho, S.-J. (2020). IRTree modeling of cognitive processes based on outcome and intermediate data. In H. Jiao & R. W. Lissitz (Eds.), Innovative psychometric modeling and methods (pp. 91–104). Charlotte, NC: Information Age Publishing.
- Cho, S.-J., Shen, J., & Naveiras, M.* (2019). Multilevel reliability measures of latent scores within an item response theory framework. Multivariate Behavioral Research, 54, 856–881.
- Kim, S.-H., Cohen, A. S., Cho, S.-J., & Eom, H. J. (2019). Use of information criteria in the study of group differences in trace lines. Applied Psychological Measurement, 43, 95–112.
- Rights, J. D.*, Sterba, S. K.*, Cho, S.-J., & Preacher, K. J.* (2018). Addressing model uncertainty in item response theory person scores through model averaging. Behaviormetrika, 45, 495–503.
- Cho, S.-J., Brown-Schmidt, S.*, & Lee, W.-y.* (2018). Autoregressive generalized linear mixed effect models with crossed random effects: An application to intensive binary time series eye-tracking data. Psychometrika, 83, 751–771. [Supplemental materials can be found on the Open Science Framework website.]
- Cho, S.-J., & De Boeck, P. (2018). [Brief Reports] A note on N in Bayesian information criterion (BIC) for item response models. Applied Psychological Measurement, 42, 169–172. [The derivation for N in the paper is applicable to generalized linear mixed models with crossed random effects.]
- Lee, W.-y.*, Cho, S.-J., & Sterba, S. K.* (2018). Ignoring a multilevel structure in mixture item response models: Impact on parameter recovery and model selection. Applied Psychological Measurement, 42, 136–154.
- Suh, Y., Cho, S.-J., & Bottge, B. (2018). A multilevel longitudinal nested logit model for measuring changes in correct response and error types. Applied Psychological Measurement, 42, 73–88.
- Lee, W.-y.*, & Cho, S.-J. (2017). Detecting differential item discrimination (DID) and the consequences of ignoring DID in multilevel item response models. Journal of Educational Measurement, 54, 364–393.
- Cho, S.-J., & Goodwin, A. P.* (2017). Modeling learning in doubly multilevel binary longitudinal data using generalized linear mixed models: An application to measuring and explaining word learning. Psychometrika, 82, 846–870.
- Cho, S.-J., De Boeck, P., & Lee, W.-y.* (2017). Evaluating testing, profile likelihood confidence interval estimation, and model comparisons for item covariate effects in linear logistic test models. Applied Psychological Measurement, 41, 353–371.
- Cho, S.-J., & Suh, Y. (2017). [Software Notes] Obtaining fixed effects for between-within designs in explanatory longitudinal item response models using Mplus. Applied Psychological Measurement, 41, 155–157.
- Lee, W.-y.*, & Cho, S.-J. (2017). Consequences of ignoring measurement invariance in longitudinal item response models. Applied Measurement in Education, 30, 129–146.
- Cho, S.-J., Suh, Y., & Lee, W.-y.* (2016). After DIF items are detected: IRT calibration and scoring in the presence of DIF. Applied Psychological Measurement, 40, 573–591. [Confirmatory multigroup multidimensional or bifactor item response modeling was presented for DIF.]
- Cho, S.-J., & Preacher, K. J.* (2016). Measurement error correction formula for cluster-level group differences in cluster randomized and observational studies. Educational and Psychological Measurement, 76, 771–786.
- Cho, S.-J., Suh, Y., & Lee, W.-y.* (2016). An NCME instructional module on latent DIF analysis using mixture item response models. Educational Measurement: Issues and Practice, 35, 48–61.
- Cho, S.-J., Preacher, K. J.*, & Bottge, B. A. (2015). Detecting intervention effects in a cluster randomized design using multilevel structural equation modeling for binary responses. Applied Psychological Measurement, 39, 627–642. [The first author received the following financial support for the research, authorship, and publication of this article: National Academy of Education/Spencer Postdoctoral Fellowship.]
- Cho, S.-J., & Bottge, B. A. (2015). Multilevel multidimensional item response model with a multilevel latent covariate. British Journal of Mathematical and Statistical Psychology, 68, 410–433. [The first author received the following financial support for the research, authorship, and publication of this article: National Academy of Education/Spencer Postdoctoral Fellowship.]
- Paek, I., & Cho, S.-J. (2015). A note on parameter estimate comparability across latent classes in mixture IRT modeling. Applied Psychological Measurement, 39, 135–143.
- Suh, Y., & Cho, S.-J. (2014). Chi-square difference tests for detecting differential functioning in a multidimensional IRT model: A Monte Carlo study. Applied Psychological Measurement, 38, 359–375.
- Cho, S.-J., De Boeck, P., Embretson, S., & Rabe-Hesketh, S. (2014). Additive multilevel item structure models with random residuals: Item modeling for explanation and item generation. Psychometrika, 79, 84–104. [An alternating imputation posterior algorithm with adaptive quadrature was developed for multilevel crossed random effects, such as a random item effect across items and a random item group effect across item groups in 2-parameter item response models.]
- Cho, S.-J., Cohen, A. S., & Kim, S.-H. (2014). A mixture group bifactor model for binary responses. Structural Equation Modeling: A Multidisciplinary Journal, 21, 375–395.
- Cho, S.-J., Gilbert, J. K.*, & Goodwin, A. P.* (2013). Explanatory multidimensional multilevel random item response model: An application to simultaneous investigation of word and person contributions to multidimensional lexical quality. Psychometrika, 78, 830–855.
- Cho, S.-J., Athay, M.*, & Preacher, K. J.* (2013). Measuring change for a multidimensional test using a generalized explanatory longitudinal item response model. British Journal of Mathematical and Statistical Psychology, 66, 353–381. [Supplementary results, lmer script, and data are posted on the website: http://quantpsy.org/pubs.htm.]
- Cho, S.-J., Cohen, A. S., & Kim, S.-H. (2013). Markov chain Monte Carlo estimation of a mixture item response theory model. Journal of Statistical Computation and Simulation, 83, 278–306.
- Cho, S.-J., Cohen, A. S., & Bottge, B. A. (2013). Detecting intervention effects using a multilevel latent transition analysis with a mixture IRT model. Psychometrika, 78, 576–600.
- Suh, Y., Cho, S.-J., & Wollack, J. A. (2012). A comparison of item calibration procedures in the presence of test speededness. Journal of Educational Measurement, 49, 285–311.
- Cho, S.-J., Partchev, I., & De Boeck, P. (2012). Parameter estimation of multiple item profiles models. British Journal of Mathematical and Statistical Psychology, 65, 438–466. [An alternating imputation posterior algorithm with adaptive quadrature was developed for 1-parameter multidimensional random item response models.]
- Cho, S.-J., & Suh, Y. (2012). [Software Notes] Bayesian analysis of item response models using WinBUGS 1.4.3. Applied Psychological Measurement, 36, 147–148.
- De Boeck, P., Cho, S.-J., & Wilson, M. (2011). Explanatory secondary dimension modelling of latent DIF. Applied Psychological Measurement, 35, 583–603.
- Cho, S.-J., & Rabe-Hesketh, S. (2011). Alternating imputation posterior estimation of models with crossed random effects. Computational Statistics and Data Analysis, 55, 12–25.
- Cho, S.-J., Cohen, A. S., Kim, S.-H., & Bottge, B. A. (2010). Latent transition analysis with a mixture IRT measurement model. Applied Psychological Measurement, 34, 583–604.
- Cho, S.-J., & Cohen, A. S. (2010). A multilevel mixture IRT model with an application to DIF. Journal of Educational and Behavioral Statistics, 35, 336–370.
- Cho, S.-J., Li, F., & Bandalos, D. L. (2009). Accuracy of the parallel analysis procedure using polychoric correlations. Educational and Psychological Measurement, 69, 748–759.
- Li, F., Cohen, A. S., Kim, S.-H., & Cho, S.-J. (2009). Model selection methods for mixture dichotomous IRT models. Applied Psychological Measurement, 33, 353–373.
Substantive Papers in Peer-Reviewed Journals
- Clough, S.*, Brown-Schmidt, S.*, Cho, S.-J., & Duff, M.* (accepted). Reduced online speech gesture integration during multimodal language processing in adults with moderate-severe traumatic brain injury: Evidence from eye-tracking. Cortex. [Dynamic GLMM and IRTree models were applied.]
- Bean, C. A. L.*, Mueller, S. B.*, Abitante, G.*, Ciesla, J. A., Cho, S.-J., & Cole, D. A.* (in press). Improved scoring of the Center for Epidemiologic Studies Depression Scale – Revised: An item response theory analysis. Journal of Psychopathology and Behavioral Assessment. [Graded response models were applied.]
- Hornsby, B. W.*, Camarata, S.*, Cho, S.-J., Davis, H.*, McGarrigle, R., & Bess, F. H.* (2023). Development and validation of a brief version of the Vanderbilt Fatigue Scale for Adults: The VFS-A-10. Ear and Hearing, 44, 1251–1261. [Graded response models and IRT DIF analyses were applied.]
- Hornsby, B. W.*, Camarata, S.*, Cho, S.-J., Davis, H.*, McGarrigle, R., & Bess, F. H.* (2022). Development and evaluation of pediatric versions of the Vanderbilt Fatigue Scale (VFS-Peds) for children with hearing loss. Journal of Speech, Language, and Hearing Research, 65, 2343–2363. [Graded response models and IRT DIF analyses were applied.]
- Sunday, M. A.*, Tomarken, A.*, Cho, S.-J., & Gauthier, I.* (2022). Novel and familiar object recognition rely on the same ability. Journal of Experimental Psychology: General, 151, 676–694.
- Hornsby, B. W.*, Camarata, S.*, Cho, S.-J., Davis, H.*, McGarrigle, R., & Bess, F. H.* (2021). Development and validation of the Vanderbilt Fatigue Scales for Adults (VFS-A). Psychological Assessment, 33, 777–788. [Graded response models and IRT DIF analyses were applied.]
- Brown-Schmidt, S.*, Cho, S.-J., Nozari, N., Klooster, N., & Duff, M.* (2021). The limited role of hippocampal declarative memory in transient semantic activation during online language processing. Neuropsychologia, 152, 107730. [Funding was supported in part by the National Science Foundation (SES 1851690); Dynamic generalized linear mixed-effects models were applied.]
- Goodwin, A. P.*, Cho, S.-J., Reynolds, D., Silverman, R., & Nunn, S. (2021). Explorations of classroom talk and links to reading achievement in upper elementary classrooms. Journal of Educational Psychology, 113, 27–48. [Multilevel factor models for complex multilevel designs and multilevel multivariate linear models were applied to data involving 745 teachers and 18,844 students from the Measures of Effective Teaching (MET) study.]
- Goodwin, A. P.*, Cho, S.-J., Reynolds, D., Brady, K.*, & Salas, J. A.* (2020). Digital versus paper reading processes and links to comprehension for middle school students. American Educational Research Journal, 57, 1837–1867. [Explanatory item response models were applied.]
- Jacobs, C. L., Cho, S.-J., & Watson, D. G.* (2019). Self-priming in production: Evidence for a hybrid model of syntactic priming. Cognitive Science, 43, e12749. [A Markov mixed-effect multinomial logistic regression model for nominal repeated measures was applied.]
- Spencer, M.*, Cho, S.-J., & Cutting, L. E.* (2019). Item response theory analyses of the Delis-Kaplan Executive Function System card sorting subtest. Child Neuropsychology, 25, 198–216. [Multidimensional graded response models, IRT DIF, and explanatory item response models were applied.]
- Levin, D. T.*, Seiffert, A.*, Cho, S.-J., & Carter, K.* (2018). Are failures to look, to represent, or to learn associated with change blindness during screen-capture video learning? Cognitive Research: Principles and Implications, 3, 49. [Mixed-effects logistic regression models were applied.]
- Nick, E. A.*, Cole, D. A.*, Cho, S.-J., Smith, D. K.*, Cater, T. G.*, & Zelkowitz, R.* (2018). The Online Social Support Scale: Measure development and validation. Psychological Assessment, 30, 1127–1143. [For IRT analyses, multidimensional graded response models and IRT DIF were applied.]
- Cole, D. A.*, Goodman, S., Garber, J.*, Cullum, K. A., Cho, S.-J., Rights, J. D.*, Felton, J. W., Jacquez, F. M., Korelitz, K. E.*, & Simon, H. F. M.* (2018). Validating parent and child forms of the Parent Perception Inventory. Psychological Assessment, 30, 1065–1081. [For IRT analyses, multidimensional graded response models and IRT DIF were applied.]
- Hornsby, B. W.*, Gustafson, S.*, Lancaster, H., Cho, S.-J., Camarata, S.*, & Bess, F. H.* (2017). Subjective fatigue in children with hearing loss using self- and parent-proxy reports. American Journal of Audiology, 26, 393–407. [Nonparametric ANOVA for a between-within design was applied.]
- Goodwin, A. P.*, & Cho, S.-J. (2016). Unraveling vocabulary learning: Reader and item-level predictors of vocabulary learning within comprehension instruction for fifth and sixth graders. Scientific Studies of Reading, 20, 490–514. [Generalized linear mixed modeling for doubly multilevel binary longitudinal data (Cho & Goodwin, 2017) was applied.]
- Goodwin, A. P.*, Cho, S.-J., & Nichols, S.* (2016). Ways to 'WIN' at word learning. The Reading Teacher. [Generalized linear mixed modeling for doubly multilevel binary longitudinal data (Cho & Goodwin, 2017) was applied.]
- Lee, W.-y.*, Cho, S.-J., McGugin, R. W.*, Van Gulick, A. B.*, & Gauthier, I.* (2015). Differential item functioning analysis of the Vanderbilt Expertise Test for Cars (VETcar). Journal of Vision, 15. [IRT DIF detection methods and multigroup item response models were applied.]
- Cho, S.-J., Wilmer, J., Herzmann, G., McGugin, R.*, Fiset, D., Van Gulick, A. B.*, Ryan, K.*, & Gauthier, I.* (2015). Item response theory analyses of the Cambridge Face Memory Test (CFMT). Psychological Assessment, 27, 552–566. [Exploratory bifactor item response models, explanatory item response models, and IRT DIF detection methods were applied.]
- Bottge, B. A., Ma, X., Gassaway, L., Toland, M. D., Butler, M., & Cho, S.-J. (2014). Effects of blended instructional models on math performance. Exceptional Children, 80, 423–437. [Three-level hierarchical linear models for repeated measures were applied.]
- Goodwin, A. P.*, Gilbert, J. K.*, Cho, S.-J., & Kearns, D. M. (2014). Probing lexical representations: Simultaneous modeling of word and reader contributions to multidimensional lexical representations. Journal of Educational Psychology, 106, 448–468. [Explanatory multidimensional multilevel random item response models (Cho, Gilbert, & Goodwin, 2013) were applied.]
- Miller, A. C., Davis, N.*, Gilbert, J. K.*, Cho, S.-J., Toste, J. R., Street, J.*, & Cutting, L. E.* (2014). Novel approaches to examine passage, student, and question effects on reading comprehension. Learning Disabilities Research & Practice, 29, 25–35. [Linear and nonlinear models with nested and crossed random effects were applied.]
- Bottge, B. A., & Cho, S.-J. (2013). Effects of enhanced anchored instruction on skills aligned to Common Core math standards. Learning Disabilities: A Multidisciplinary Journal, 19, 73–83. [Multilevel longitudinal item response models were applied.]
- Goodwin, A. P.*, Gilbert, J. K.*, & Cho, S.-J. (2013). Morphological contributions to adolescent word reading: An item response approach. Reading Research Quarterly, 48, 39–60. [Random item response models and explanatory item response models were applied.]
- Cole, D. A.*, Cho, S.-J., Martin, N. C.*, Youngstrom, E. A., Curry, J. F., Findling, R. L., Compas, B. E.*, Goodyer, I. M., Rohde, P., Weissman, M., Essex, M. J., Hyde, J. S., Forehand, R., Slattery, M. J., Felton, J. W.*, & Maxwell, M. A.* (2012). Are increased weight and appetite useful indicators of depression in children and adolescents? Journal of Abnormal Psychology, 121, 838–851. [Exploratory, explanatory, and multiple-group multidimensional graded response models were applied.]
- Cho, S.-J., Bottge, B. A., Cohen, A. S., & Kim, S.-H. (2011). Detecting cognitive change in the math skills of low-achieving adolescents. Journal of Special Education, 45, 67–76. [A mixture longitudinal item response model was applied.]
Book Chapters
- Brown-Schmidt, S.*, Naveiras, M.*, De Boeck, P., & Cho, S.-J. (2020). Statistical modeling of intensive categorical time series eye-tracking data using dynamic generalized linear mixed-effect models with crossed random effects. In a special issue "Gazing toward the future: Advances in eye movement theory and applications," Psychology of Learning and Motivation series (Volume 73). [Funding was supported in part by the National Science Foundation (SES 1851690).]
- De Boeck, P., & Cho, S.-J. (2020). IRTree modeling of cognitive processes based on outcome and intermediate data. In H. Jiao & R. W. Lissitz (Eds.), Innovative psychometric modeling and methods (pp. 91–104). Charlotte, NC: Information Age Publishing.
- Cho, S.-J., Brown-Schmidt, S.*, Naveiras, M.*, & De Boeck, P. (2020). A dynamic generalized mixed effect model for intensive binary temporal-spatio data from an eye tracking technique. In H. Jiao & R. W. Lissitz (Eds.), Innovative psychometric modeling and methods (pp. 45–68). Charlotte, NC: Information Age Publishing.
- De Boeck, P., Cho, S.-J., & Wilson, M. (2016). Explanatory item response models: An approach to cognitive assessment. In A. Rupp & J. Leighton (Eds.), Handbook of cognition and assessment (pp. 249–266). Wiley Blackwell.
- Cohen, A. S., & Cho, S.-J. (2016). Information criteria. In W. J. van der Linden (Ed.), Handbook of item response theory: Models, statistical tools, and applications (Vol. 2, pp. 363–378). Boca Raton, FL: Chapman & Hall/CRC Press.
Honors
- Association for Psychological Science (APS) Fellow (2020)
- Vanderbilt University Chancellor Faculty Fellow (2019–2021)
- Vanderbilt University Provost Research Studios (PRS) Award (2018)
- Vanderbilt University Trans-Institutional Program (TIPs) Award (co-PI) (2016–2018)
  Study Title: Understanding digital dominance in teaching and learning: An interdisciplinary approach
- Vanderbilt University Research Scholar Grant Award (2016)
  Study Title: Multilevel reliability measures in a multilevel item response theory framework
  Study Title: An application to simultaneous investigation of word and person contributions to word reading and lexical representations using random item response models
- National Academy of Education/Spencer Postdoctoral Fellow (9/2013–6/2015)
  Study Title: Evaluating educational programs with a new item response theory perspective
- National Council on Measurement in Education (NCME) Award for an Outstanding Example of an Application of Educational Measurement Technology to a Specific Problem (2011)
  Study Title: Latent transition analysis with a mixture IRT measurement model
- State-of-the-Art Lecturer, Psychometric Society (2010)
  Study Title: Random item response models