Migrating from Catalyst

knitr::opts_chunk$set(echo = TRUE, eval = FALSE)


Intro

Catalyst is a PyTorch framework for Deep Learning research and development. It focuses on reproducibility, rapid experimentation, and codebase reuse, so you can create something new rather than write yet another regular train loop. Break the cycle - use the Catalyst!

Catalyst with fastai

Specify the loaders from the Catalyst dict:

library(fastai)
library(magrittr)

# loaders() returns the Catalyst-style dict of train/valid DataLoaders
loaders = loaders()

# wrap the Catalyst loaders into fastai DataLoaders and move them to the GPU
data = Data_Loaders(loaders['train'], loaders['valid'])$cuda()
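
In the Catalyst tutorial this example follows, loaders is simply a named dict of plain PyTorch DataLoaders over MNIST. A hand-built equivalent might look roughly like the sketch below; it assumes torch and torchvision are importable through reticulate, and the dataset path and batch size are illustrative rather than part of the package API:

# sketch: build a Catalyst-style loaders dict by hand via reticulate
torch = reticulate::import("torch")
torchvision = reticulate::import("torchvision")

tfm = torchvision$transforms$ToTensor()
train_ds = torchvision$datasets$MNIST(".", train = TRUE, download = TRUE, transform = tfm)
valid_ds = torchvision$datasets$MNIST(".", train = FALSE, download = TRUE, transform = tfm)

loaders = reticulate::dict(
  train = torch$utils$data$DataLoader(train_ds, batch_size = 32L),
  valid = torch$utils$data$DataLoader(valid_ds, batch_size = 32L)
)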

nn = nn()
# a simple MNIST classifier: flatten the 28x28 images, then one linear layer
model = nn$Sequential() +
  nn$Flatten() +
  nn$Linear(28L * 28L, 10L)

Output:

Sequential(
  (0): Flatten()
  (1): Linear(in_features=784, out_features=10, bias=True)
)
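
The + operator chains layers into the underlying torch Sequential container. Since nn here is the reticulate-backed torch.nn module, the same model can also be built by passing the layers directly to nn$Sequential(), just as in Python. This is an equivalent sketch, not a required step:

# equivalent construction: pass the layers straight to Sequential
model = nn$Sequential(
  nn$Flatten(),
  nn$Linear(28L * 28L, 10L)
)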

Fit

metrics = list(accuracy, top_k_accuracy)
learn = Learner(data, model, loss_func = nn$functional$cross_entropy, opt_func = Adam,
                metrics = metrics)

learn %>% fit_one_cycle(1, 0.02)
epoch     train_loss  valid_loss  accuracy  top_k_accuracy  time    
0         0.269411    0.336529    0.910200  0.993700        00:08   
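
After training, the validation loss and metrics can be recomputed without another training pass. Assuming the object returned by Learner() exposes the underlying fastai Learner's Python methods through $ (as the reticulate binding normally does), a quick check looks like this:

# re-evaluate on the validation loader: returns the validation loss
# followed by the metric values (accuracy, top_k_accuracy)
res = learn$validate()
res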

