gpt: Generalized Processing Tree Models

gpt    R Documentation

Generalized Processing Tree Models

Description

Fits GPT models for multivariate data (one discrete and one or more continuous responses per trial). The distribution of the continuous variable(s) is assumed to be a mixture distribution, with the MPT core structure defining the mixture probabilities.

Details

The GPT structure is implemented by the S4 class gpt, which contains the MPT structure (the S4 class mpt); a vector mapping the MPT branches to the underlying continuous distributions (mapvec); a list of univariate or multivariate basis distributions (each an S4 class contin with information about its parameter space); the parameter labels for theta and eta; and a vector of constant parameter values.

It is advisable to first check that a GPT model file is valid using read_gpt. Next, one can either generate simulated data using gpt_gen or fit observed data using gpt_fit. The fitting algorithm first runs an EM algorithm before maximizing the full likelihood by gradient descent. Restrictions on the parameter space are automatically taken into account (e.g., variances must be positive).
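The workflow above might be sketched as follows; note that the model-file name and all function arguments shown here are hypothetical placeholders, so the help pages of read_gpt, gpt_gen, and gpt_fit should be consulted for the actual argument names:

```r
library(gpt)  # assumes the danheck/gpt package is installed

## 1. Check that a GPT model file is a valid specification
##    ("model.txt" is a hypothetical file name):
# model <- read_gpt("model.txt")

## 2a. Either generate simulated data from the model
##     (arguments elided; see ?gpt_gen):
# sim <- gpt_gen(...)

## 2b. ... or fit the model to observed data
##     (arguments elided; see ?gpt_fit):
# fit <- gpt_fit(...)
```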

Author(s)

Daniel W. Heck, dheck@uni-marburg.de

References

Heck, D. W., Erdfelder, E., & Kieslich, P. J. (2018). Generalized processing tree models: Jointly modeling discrete and continuous variables. Psychometrika, 83, 893–918. https://doi.org/10.1007/s11336-018-9622-0


danheck/gpt documentation built on March 29, 2025, 1:17 p.m.