View source: R/downscalePredict.keras.R
downscalePredict.keras | R Documentation

Downscale data to local scales using deep learning models previously obtained with downscaleTrain.keras.
downscalePredict.keras(
newdata,
model,
C4R.template,
clear.session = FALSE,
loss = NULL
)
Arguments:

newdata: The grid data. It should be an object as returned by prepareNewData.keras.

model: An object containing the statistical model as returned by downscaleTrain.keras.

C4R.template: A climate4R grid that serves as a template for the returned prediction object.
clear.session: A logical value indicating whether to destroy the current TensorFlow graph and clear the model from memory after predicting (i.e., whether to call k_clear_session()). Default is FALSE.
loss: Defaults to NULL. Otherwise, a string indicating the loss function used to train the model. This is only relevant when one of the two custom loss functions of this library has been used: "gaussianLoss" or "bernouilliGammaLoss".
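As a hedged sketch of how the loss argument would be used (assuming a model previously trained with downscaleTrain.keras and the custom "bernouilliGammaLoss" loss, e.g. for precipitation), the same loss string passed at training time is supplied at prediction time so the function can interpret the network outputs accordingly:

```r
# Hypothetical sketch: predicting with a model trained using the custom
# "bernouilliGammaLoss" loss. Passing the same loss string here lets
# downscalePredict.keras interpret the network outputs as the parameters
# of the predictive distribution rather than as deterministic values.
pred <- downscalePredict.keras(newdata = xy.t,
                               model = model,
                               C4R.template = yT,
                               clear.session = TRUE,
                               loss = "bernouilliGammaLoss")
```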
Details:

This function relies on keras, a high-level neural networks API capable of running on top of TensorFlow, CNTK, or Theano. There are official keras tutorials on how to build deep learning models; we suggest that users, especially beginners, consult these tutorials before using downscalePredict.keras.
Value:

A regular/irregular grid object with the predictions.
Author(s):

J. Bano-Medina
See Also:

downscaleTrain.keras for training a downscaling deep model with keras
downscalePredict.keras for predicting with a keras model
prepareNewData.keras for predictor preparation with new (test) data
downscaleR.keras Wiki

Other downscaling functions: downscaleTrain.keras(), relevanceMaps()
Examples:

# Loading data
require(climate4R.datasets)
require(transformeR)
require(magrittr)
require(keras)
data("VALUE_Iberia_tas")
y <- VALUE_Iberia_tas
data("NCEP_Iberia_hus850", "NCEP_Iberia_psl", "NCEP_Iberia_ta850")
x <- makeMultiGrid(NCEP_Iberia_hus850, NCEP_Iberia_psl, NCEP_Iberia_ta850)
# We divide in train and test data and standardize the predictors
# using transformeR functions subsetGrid and scaleGrid, respectively.
xT <- subsetGrid(x,years = 1983:1995)
xt <- subsetGrid(x,years = 1996:2002) %>% scaleGrid(base = xT, type = "standardize")
xT <- scaleGrid(xT,type = "standardize")
yT <- subsetGrid(y,years = 1983:1995)
yt <- subsetGrid(y,years = 1996:2002)
# Preparing the predictors
xy.T <- prepareData.keras(x = xT, y = yT,
first.connection = "conv",
last.connection = "dense",
channels = "last")
# Defining the keras model...
# We define 3 hidden layers consisting of
# 2 convolutional layers followed by a dense connection.
input_shape <- dim(xy.T$x.global)[-1]
output_shape <- dim(xy.T$y$Data)[2]
inputs <- layer_input(shape = input_shape)
hidden <- inputs %>%
layer_conv_2d(filters = 25, kernel_size = c(3,3), activation = 'relu') %>%
layer_conv_2d(filters = 10, kernel_size = c(3,3), activation = 'relu') %>%
layer_flatten() %>%
layer_dense(units = 10, activation = "relu")
outputs <- layer_dense(hidden,units = output_shape)
model <- keras_model(inputs = inputs, outputs = outputs)
# We can print the model in the console to inspect its configuration
model
# Training the deep learning model
model <- downscaleTrain.keras(xy.T,
model = model,
compile.args = list("loss" = "mse",
"optimizer" = optimizer_adam(lr = 0.01)),
fit.args = list("epochs" = 30, "batch_size" = 100))
# Predicting on the test set...
xy.t <- prepareNewData.keras(newdata = xt,data.structure = xy.T)
pred <- downscalePredict.keras(newdata = xy.t,
model = model,
clear.session = TRUE,
C4R.template = yT)
# We can now apply the visualizeR functions to the prediction
# as it preserves the climate4R template.
require(visualizeR)
temporalPlot(yt,pred)
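Because the prediction preserves the climate4R template, simple verification can also be done directly on the Data arrays. A minimal sketch, assuming pred and yt share the same (time x station) dimensions:

```r
# Minimal verification sketch: per-station RMSE between the observed
# test series (yt) and the downscaled predictions (pred), assuming
# both grids carry matching (time x station) Data arrays.
rmse <- sqrt(colMeans((pred$Data - yt$Data)^2, na.rm = TRUE))
round(rmse, 2)
```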