View source: R/effort_summary.R
Description

Summarizes the number of studies screened, those identified for inclusion in or exclusion from the project, and those with conflicting inclusion/exclusion decisions. If a dual (paired) design was used to screen references, it also provides an inter-reviewer agreement estimate following Cohen (1960), which describes the agreement (or repeatability) of screening/coding decisions. The magnitude of this inter-reviewer agreement estimate is then interpreted following Landis & Koch (1977).
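As background for the agreement statistic described above, the following is a minimal base-R sketch of Cohen's (1960) kappa for two reviewers' screening decisions, with a rough Landis & Koch (1977) interpretation. The reviewer vectors are made-up data and the sketch is illustrative only; it is not the internal implementation of effort_summary.

    # Two hypothetical reviewers' include/exclude decisions on the same 8 references
    reviewer_A <- c("YES", "YES", "NO", "NO", "YES", "NO", "YES", "NO")
    reviewer_B <- c("YES", "NO",  "NO", "NO", "YES", "NO", "YES", "YES")

    tab <- table(reviewer_A, reviewer_B)                          # 2 x 2 agreement table
    p_observed <- sum(diag(tab)) / sum(tab)                       # proportion of matching decisions
    p_expected <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2   # agreement expected by chance
    kappa <- (p_observed - p_expected) / (1 - p_expected)         # Cohen's kappa

    # Rough interpretation bands following Landis & Koch (1977)
    cut(kappa, breaks = c(-Inf, 0, 0.2, 0.4, 0.6, 0.8, 1),
        labels = c("poor", "slight", "fair", "moderate", "substantial", "almost perfect"))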
Usage

effort_summary(
  aDataFrame,
  column_reviewers = "REVIEWERS",
  column_effort = "INCLUDE",
  dual = FALSE,
  quiet = FALSE
)
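For example, a typical call on a team's merged screening effort might look like the sketch below; merged_effort is a hypothetical data.frame assumed to have been produced by effort_merge from a dual (paired) screening design, using the default column labels shown above.

    # Summarize the pooled screening effort and, because screening was paired,
    # also request inter-reviewer agreement (Cohen's kappa)
    screening_summary <- effort_summary(merged_effort, dual = TRUE)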
Arguments

aDataFrame
    A data.frame containing the titles and abstracts that were screened by a team. The default assumes that the data.frame is the merged effort across the team using effort_merge.

column_reviewers
    Changes the default label of the "REVIEWERS" column that contains the screening efforts of each team member.

column_effort
    Changes the default label of the "INCLUDE" column that contains the screening decisions (coded references) of each team member.

dual
    When TRUE, the screening effort is summarized assuming a dual (paired) screening design, and inter-reviewer agreement estimates are also provided.

quiet
    When TRUE, suppresses printing of the summary to the console.
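If the screening columns were given non-default labels during the project, these can be passed explicitly. In the hypothetical sketch below the columns are named "SCREENERS" and "DECISIONS", and quiet = TRUE is assumed to suppress the console printout so that only the returned summary table is kept.

    screen_stats <- effort_summary(my_refs,
                                   column_reviewers = "SCREENERS",
                                   column_effort = "DECISIONS",
                                   quiet = TRUE)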
Value

A data frame with summary information on the screening tasks of a reviewing team.
References

Cohen, J. 1960. A coefficient of agreement for nominal scales. Educational and Psychological Measurement 20: 37-46.

Landis, J.R., and Koch, G.G. 1977. The measurement of observer agreement for categorical data. Biometrics 33: 159-174.
See Also

effort_initialize, effort_distribute, effort_merge