Part of the analytic development process is ensuring that the algorithms you develop are reproducible and portable for an end user. An end user's machine is often not configured the same way as the developer's, which can lead to unexpected side effects or errors. To ensure that your methods are reproducible and portable, your analytic data products will be reviewed on multiple 'new' machines representing those used by potential end users. In lieu of a final exam, each student will be tasked with reviewing two analytic projects developed by students in the course. Using the information presented in the GitHub repository README file and any documentation files provided with the analytic, the reviewer will assess the analytic in several areas.

Reviewers will capture their findings by (1) opening an issue on the analytic project's GitHub page for each finding and (2) presenting all findings in a final report (an .Rmd file rendered as an html_document). Both the report and the GitHub issues should include a brief discussion of each problem along with one or more reproducible examples.
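One convenient way to produce such reproducible examples is the reprex package, which renders a minimal snippet together with its output in a format ready to paste into a GitHub issue. A minimal sketch follows; the package and function names are placeholders for whatever produced the finding:

```r
# Sketch: building a reproducible example for a GitHub issue.
# install.packages("reprex")  # one-time setup, if needed
reprex::reprex({
  library(somePackage)    # placeholder: the analytic package under review
  some_function("input")  # placeholder: a minimal call demonstrating the finding
})
```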

After reviewing the analytic, the reviewer will then assign it one of the following grades. Note: I will perform a second review of all projects to ensure that grades are assigned fairly. If my final grade differs greatly from the grade provided by the reviewer, it will be taken as an indication of poor engagement by the reviewer.

```r
# Define a function to randomly assign two distinct reviewers to
# each of N students' projects, ensuring that no student reviews
# their own project and that each project gets two different reviewers.

reviews <- function(n.choices = 13) {

  stopifnot(n.choices >= 3)  # two distinct non-self reviewers require at least 3 students

  base <- seq_len(n.choices)

  # Draw random permutations until no student is assigned their own
  # project (rejection sampling of a derangement).
  repeat {
    rev1 <- sample(base)
    if (all(rev1 != base)) break
  }

  # The second reviewer must differ from both the developer and the
  # first reviewer for every project.
  repeat {
    rev2 <- sample(base)
    if (all(rev2 != base) && all(rev2 != rev1)) break
  }

  data.frame(base = base,
             rev1 = rev1,
             rev2 = rev2)
}
```
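A quick sanity check of the draw (the seed is arbitrary, chosen only so the result is repeatable):

```r
set.seed(42)            # arbitrary seed so the draw can be reproduced
reviews(n.choices = 4)  # returns a data frame with columns base, rev1, rev2
```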
```r
# Developers, in the order their projects are listed below
students <- c('Uhorchak',
              'Munson',
              'Ramirez',
              'Smith',
              'Trigo',
              'Kalhoff',
              'Butt',
              'Elliott',
              'Johnson',
              'Sevier',
              'Small',
              'Gallagher',
              'Stuntz',
              'Meyer')

# GitHub URL for each developer's project, matched by position
git_url <- c('https://github.com/nuhorchak/RClean',
             'https://github.com/evan-l-munson/SAoTD',
             'https://github.com/RachelRamirez/NPS',
             'https://github.com/JSmith146/CoRpEx',
             'https://github.com/citation891/MCAC',
             'https://github.com/AFIT-R/instaExtract',
             'https://github.com/SpencerButt/IDPS-LAAD',
             'https://github.com/jtelliott/attrition-forecast',
             'https://github.com/slackliner33/Yahoo_DFS_Optimizer',
             'https://github.com/williamcsevier/TextML',
             'https://github.com/msmall318/Boots',
             'https://github.com/gallagherj2008/SWAT',
             '',  # no repository URL provided
             'https://github.com/AFIT-R/MODA')
```
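Because the two vectors are matched by position, a defensive check before building the table catches accidental misalignment (note that one URL is currently an empty placeholder):

```r
# The developer and URL vectors must stay aligned element-for-element.
stopifnot(length(students) == length(git_url))
```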

The table below lists each developer, the URL of their respective project, and the assigned reviewers.

```r
# Draw the assignments and build the display table
df <- reviews(length(students))

review_table <- data.frame(Developer = students,
                           Git_URL   = git_url,
                           Reviewer1 = students[df$rev1],
                           Reviewer2 = students[df$rev2])

knitr::kable(review_table,
             caption = 'List of developers, projects, and assigned reviewers for OPER782 (Winter 2018)')
```

The assigned reviewers are to:

  1. Download the project using `devtools::install_github('devname/projectname')` (see the example after this list).

  2. Review the documentation provided.

  3. Test the project in accordance with the documentation.

  4. Write a one-page review of the project using the rubric provided above.

  5. Fork the project repository.

  6. Add your review to the `inst/` directory of the project.

  7. Submit a pull request to the project developer so they can include the review as part of their project.
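For instance, a reviewer assigned the RClean project would begin with something like the following (assuming the installed package shares the repository's name):

```r
# Step 1, using the first project in the table as an example;
# substitute the repository you were assigned.
devtools::install_github('nuhorchak/RClean')
library(RClean)  # assumes the package name matches the repo name
```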


