transferLearning: [Under development] Transfer learning

Description Usage Arguments Examples

View source: R/DeepNN.R

Description

[Under development] Transfer learning

Usage

transferLearning(
  plpResult,
  plpData,
  population,
  fixLayers = T,
  includeTop = F,
  addLayers = c(100, 10),
  layerDropout = c(T, T),
  layerActivation = c("relu", "softmax"),
  outcomeWeight = 1,
  batchSize = 10000,
  epochs = 20
)

Arguments

plpResult

The plp result from training a keras deep learning model on big data

plpData

The new data to fine tune the model on

population

The population for the new data

fixLayers

Boolean specifying whether to fix the weights of the layers in the model being transferred

includeTop

If TRUE, the final layer of the model being transferred is removed

addLayers

Vector specifying the number of nodes in each new layer, e.g. c(100, 10) will add a layer with 100 nodes followed by a final layer with 10

layerDropout

Whether to add dropout to each new layer (logical vector with the same length as addLayers)

layerActivation

Activation function for each new layer (character vector with the same length as addLayers); see the sketch after this argument list

outcomeWeight

The weight to assign to class 1 when training the model

batchSize

Size of each batch for updating layers

epochs

Number of epochs to run
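
The new-layer arguments addLayers, layerDropout and layerActivation are parallel vectors: position i of each describes the i-th layer added on top of the transferred model. A minimal sketch of consistent settings for three added layers (the sizes, dropout flags and activations below are illustrative, not defaults):

addLayers       <- c(200, 50, 2)                 # three new layers with 200, 50 and 2 nodes
layerDropout    <- c(TRUE, TRUE, FALSE)          # dropout after the first two new layers only
layerActivation <- c("relu", "relu", "softmax")  # activation for each corresponding new layer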

Examples

## Not run: 
modelSet <- setDeepNN()
plpResult <- runPlp(plpData, population, modelSettings = modelSet, ...)

transferLearning(...)

## End(Not run)
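
For illustration, a fuller call spelling out the documented arguments. Here plpResultBig, plpDataNew and populationNew are hypothetical objects standing in for a previously trained keras plp model and the new data and population to fine tune on; the remaining values simply restate the defaults from Usage.

## Not run: 
# Hypothetical objects: plpResultBig from an earlier runPlp() fit,
# plpDataNew and populationNew for the data to fine tune on.
newResult <- transferLearning(
  plpResult = plpResultBig,
  plpData = plpDataNew,
  population = populationNew,
  fixLayers = TRUE,                        # keep the transferred weights fixed
  includeTop = FALSE,
  addLayers = c(100, 10),                  # add a 100-node layer and a final 10-node layer
  layerDropout = c(TRUE, TRUE),            # dropout on both new layers
  layerActivation = c("relu", "softmax"),
  outcomeWeight = 1,
  batchSize = 10000,
  epochs = 20
)

## End(Not run)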
