setwd(dir = "vignettes/slides")
rmarkdown::render(input = "Presentation.Rmd", output_file = "Presentation.html") # render the HTML output into the folder where you want it
knitr::include_graphics(here("images/hcdc2021_09.png"))
knitr::opts_chunk$set(echo = FALSE)
library(here) # set up working directory
here::here() # set working directory for images
knitr::include_graphics(here("images/prelim_subplots.png"))
knitr::include_graphics(here("images/Inset_Study_Site.png"))
knitr::include_graphics(here("images/ALL_Trends.png"))
knitr::include_graphics(here("images/Sea__Ice_vs_Clouds.png"))
knitr::include_graphics(here("images/kendall_test_results.png"))
Gradient boosting is a machine learning technique used for regression and classification. Like random forest, it uses an ensemble of decision trees to learn how the independent variables predict the dependent variable. Unlike random forest, the trees are built sequentially through boosting, so each new tree iteratively improves on the errors of the previous ones rather than the predictions of many independent trees being averaged. This package extracts information from several raster datasets and builds a gradient boosting model to determine the most important variables influencing low cloud cover concentration over the Chukchi Sea.
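As a rough illustration of the raster-to-model step described above, the sketch below stacks several rasters and converts their cells into a modelling data frame. The file names (`low_cloud_cover.tif`, `sea_ice.tif`, `sst.tif`) and the use of the terra package are assumptions for illustration only, not the package's actual inputs.

```r
# Minimal sketch, assuming hypothetical raster files: stack predictors and the
# response, then convert cells to a data frame suitable for model fitting.
library(terra)

predictors <- rast(c("sea_ice.tif", "sst.tif"))   # independent variables (placeholder files)
response   <- rast("low_cloud_cover.tif")         # dependent variable (placeholder file)

model_data <- as.data.frame(c(response, predictors), na.rm = TRUE)
names(model_data)[1] <- "lcc"                     # label the response column
```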
To run the gradient boosting model (XGBoost), the data were partitioned into training and testing sets, a model was trained, and an accuracy assessment was used to find the optimal number of trees for tuning the final XGBoost model.
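The sketch below outlines that partition-train-tune sequence with the xgboost package, continuing from the hypothetical `model_data` data frame above. The 70/30 split, learning rate, tree depth, and round counts are illustrative assumptions, not the values used for the results shown in the figures.

```r
# Minimal sketch of the train/test split and tree-count tuning; all parameter
# values here are illustrative placeholders.
library(xgboost)

set.seed(42)
train_idx <- sample(nrow(model_data), size = 0.7 * nrow(model_data))
train <- model_data[train_idx, ]
test  <- model_data[-train_idx, ]

dtrain <- xgb.DMatrix(data = as.matrix(train[, -1]), label = train$lcc)
dtest  <- xgb.DMatrix(data = as.matrix(test[, -1]),  label = test$lcc)

# Cross-validate to find a reasonable number of boosting rounds (trees)
cv <- xgb.cv(data = dtrain, nrounds = 500, nfold = 5,
             objective = "reg:squarederror", eta = 0.1, max_depth = 6,
             early_stopping_rounds = 20, verbose = 0)
best_nrounds <- cv$best_iteration

# Fit the final model with the tuned tree count and assess it on the test set
fit  <- xgb.train(data = dtrain, nrounds = best_nrounds,
                  objective = "reg:squarederror", eta = 0.1, max_depth = 6)
pred <- predict(fit, dtest)
rmse <- sqrt(mean((pred - test$lcc)^2))

# Variable importance, as summarised in the figure below
imp <- xgb.importance(model = fit)
```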
knitr::include_graphics(here("images/XGB_LCC_vs_predicted_testset.png"))
knitr::include_graphics(here("images/XGB_VariableImportance.png"))