binaryRL-package                                            R Documentation

binaryRL: Reinforcement Learning Tools for Two-Alternative Forced Choice Tasks

Description

Tools for building Rescorla-Wagner models for Two-Alternative Forced Choice tasks, which are commonly employed in psychological research. Most concepts and ideas in this R package are drawn from Sutton and Barto (2018) (ISBN 9780262039246). The package allows RL models to be defined intuitively with simple if-else statements; the three basic models built into the package follow Niv et al. (2012) doi:10.1523/JNEUROSCI.5498-10.2012. The approach to constructing and evaluating these computational models follows the guidelines proposed by Wilson & Collins (2019) doi:10.7554/eLife.49547. The example datasets included with the package come from Mason et al. (2024) doi:10.3758/s13423-023-02415-x.
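
For orientation, the core idea can be sketched in a few lines of plain R. This is a conceptual illustration only; the function and argument names (update_value, value, choice, reward, eta) are hypothetical and are not part of the package's interface.

    # One Rescorla-Wagner update for a two-option task: only the value of the
    # chosen option moves toward the obtained reward, at learning rate eta.
    update_value <- function(value, choice, reward, eta) {
      if (choice == 1L) {
        value[1] <- value[1] + eta * (reward - value[1])
      } else {
        value[2] <- value[2] + eta * (reward - value[2])
      }
      value
    }

    v <- c(0, 0)                                      # initial option values
    v <- update_value(v, choice = 1L, reward = 1, eta = 0.3)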

Example Data

  • Mason_2024_G1: Group 1 of Mason et al. (2024)

  • Mason_2024_G2: Group 2 of Mason et al. (2024)

Steps

  • run_m: Step 1: Building a reinforcement learning model

  • rcv_d: Step 2: Generating fake data for parameter and model recovery (the logic behind this step is sketched after this list)

  • fit_p: Step 3: Optimizing parameters to fit real data

  • rpl_e: Step 4: Replaying the experiment with optimal parameters
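
The logic behind Steps 2 and 3 can be sketched in plain R: simulate choices from an agent with known parameters, then recover those parameters by minimizing the negative log-likelihood. This is a minimal conceptual sketch and does not call the package's own run_m(), rcv_d(), or fit_p() interface; all names below are hypothetical.

    set.seed(1)

    # Simulate a Rescorla-Wagner + soft-max agent on a two-armed bandit.
    simulate_agent <- function(n_trials, eta, tau, p_reward = c(0.7, 0.3)) {
      v <- c(0, 0)
      choice <- reward <- integer(n_trials)
      for (t in seq_len(n_trials)) {
        p1 <- exp(tau * v[1]) / sum(exp(tau * v))       # soft-max choice rule
        choice[t] <- if (runif(1) < p1) 1L else 2L
        reward[t] <- rbinom(1, 1, p_reward[choice[t]])
        c_t <- choice[t]
        v[c_t] <- v[c_t] + eta * (reward[t] - v[c_t])   # Rescorla-Wagner update
      }
      data.frame(choice = choice, reward = reward)
    }

    # Negative log-likelihood of the observed choices under candidate parameters.
    neg_logl <- function(par, dat) {
      eta <- par[1]; tau <- par[2]
      v <- c(0, 0); nll <- 0
      for (t in seq_len(nrow(dat))) {
        p <- exp(tau * v) / sum(exp(tau * v))
        nll <- nll - log(p[dat$choice[t]])
        c_t <- dat$choice[t]
        v[c_t] <- v[c_t] + eta * (dat$reward[t] - v[c_t])
      }
      nll
    }

    dat <- simulate_agent(500, eta = 0.3, tau = 3)      # data with known parameters
    fit <- optim(c(0.5, 1), neg_logl, dat = dat, method = "L-BFGS-B",
                 lower = c(0.01, 0.1), upper = c(1, 10))
    fit$par                                             # recovered estimates of eta and tau

If the recovered estimates track the generating values across many simulated subjects, the parameters are considered recoverable; model recovery repeats the same exercise across competing models and compares fit indices such as AIC or BIC.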

Models

  • TD: TD Model (temporal-difference learning with a single learning rate)

  • RSTD: RSTD Model (risk-sensitive TD with separate learning rates for positive and negative prediction errors)

  • Utility: Utility Model (nonlinear subjective utility of reward; all three update rules are sketched after this list)
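
As a rough guide (assuming the same parameterization as Niv et al., 2012), the three models differ only in how the chosen option's value v is updated after receiving reward r. The function and argument names below are hypothetical sketches, not the package's API.

    # TD: a single learning rate eta.
    td_update <- function(v, r, eta) {
      v + eta * (r - v)
    }

    # RSTD: separate learning rates for positive and negative prediction errors.
    rstd_update <- function(v, r, eta_pos, eta_neg) {
      pe <- r - v
      eta <- if (pe > 0) eta_pos else eta_neg
      v + eta * pe
    }

    # Utility: learn from a nonlinear (here, power-transformed) subjective utility.
    utility_update <- function(v, r, eta, gamma) {
      u <- sign(r) * abs(r)^gamma
      v + eta * (u - v)
    }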

Functions

  • func_gamma: Utility Function

  • func_eta: Learning Rate

  • func_epsilon: Epsilon (random exploration)

  • func_pi: Upper-Confidence-Bound

  • func_tau: Soft-Max

  • func_logl: Loss Function (the choice rules and loss function are sketched after this list)
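
A conceptual sketch of the ingredients these helpers correspond to (argument names are hypothetical, and the exact forms used by the package may differ):

    # Soft-max: choice probabilities from option values v and inverse temperature tau.
    p_softmax <- function(v, tau) {
      exp(tau * v) / sum(exp(tau * v))
    }

    # Epsilon-greedy: with probability epsilon choose at random, otherwise greedily.
    choose_epsilon_greedy <- function(v, epsilon) {
      if (runif(1) < epsilon) sample(seq_along(v), 1) else which.max(v)
    }

    # Upper-confidence-bound: add an exploration bonus to rarely chosen options.
    # n = times each option was chosen so far, t = current trial, pi = bonus weight.
    ucb_values <- function(v, n, t, pi) {
      v + pi * sqrt(log(t) / pmax(n, 1))
    }

    # Loss function: negative log-likelihood of the observed choices, given the
    # model's trial-by-trial probability of each chosen option.
    neg_logl <- function(p_chosen) {
      -sum(log(pmax(p_chosen, 1e-10)))
    }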

Processes

  • optimize_para: Optimizing free parameters

  • simulate_list: Simulating fake datasets

  • recovery_data: Parameter and model recovery

Summary

  • summary.binaryRL: summary(binaryRL.res)

Author(s)

Maintainer: YuKi <hmz1969a@gmail.com>
