A callback for fastai
from fastai.vision import *
import torch
from torch.utils.data import DataLoader
TorchEmber tracks each module's input/output tensors by enriching the module's forward function.
To track the weights and gradients as well, we also need a fastai callback class.
from torchember.fastai import EmberCallback
from torchember.core import torchEmber  # torchEmber is used further below; torchember.core is the assumed import path
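To make the forward-function enrichment more concrete: the rough idea can be imitated in plain PyTorch with forward hooks, which record statistics of every module's output on each pass. The sketch below is only an illustration of that idea, not torchember's actual code; attach_output_trackers and the particular statistics it records are invented for the example.
# Illustration only (not torchember's implementation): record output stats via forward hooks
def attach_output_trackers(model, log):
    "Append simple output statistics for every submodule on each forward pass"
    handles = []
    for name, module in model.named_modules():
        def hook(mod, inputs, output, name=name):
            if torch.is_tensor(output):
                log.append({"module": name, "mean": output.mean().item(), "std": output.std().item()})
        handles.append(module.register_forward_hook(hook))
    return handles  # call handle.remove() on each to detach the trackers
After a forward pass, log would hold one record per module per batch; torchember manages its own, richer recording behind the same kind of instrumentation.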
Sample download and data preprocessing
path = untar_data(URLs.MNIST)                               # download the full MNIST dataset
il = ImageList.from_folder(path, convert_mode='L')          # load images in single-channel greyscale mode
defaults.cmap='binary'                                      # display greyscale images with the 'binary' colormap
sd = il.split_by_folder(train='training', valid='testing')  # split on the dataset's folder names
ll = sd.label_from_folder()                                 # labels come from each image's parent folder
x,y = ll.train[0]
x.show()
print(y,x.shape)
tfms = ([*rand_pad(padding=3, size=28, mode='zeros')], [])  # random padding for training images only
ll = ll.transform(tfms)
bs = 128
data = ll.databunch(bs=bs).normalize()
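Before adding any tracking, it is worth a quick look at the batches the DataBunch produces:
data.show_batch(rows=3, figsize=(5,5))   # preview a few padded, normalized training digits
print(data.c, data.classes)              # 10 classes, labelled '0'-'9'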
Sample Model
def conv(ni,nf): return nn.Conv2d(ni, nf, kernel_size=3, stride=2, padding=1)
model = nn.Sequential(
conv(1, 8), # 14
nn.BatchNorm2d(8),
nn.ReLU(),
conv(8, 16), # 7
nn.BatchNorm2d(16),
nn.ReLU(),
conv(16, 32), # 4
nn.BatchNorm2d(32),
nn.ReLU(),
conv(32, 16), # 2
nn.BatchNorm2d(16),
nn.ReLU(),
conv(16, 10), # 1
nn.BatchNorm2d(10),
Flatten() # remove (1,1) grid
)
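The comments above track the spatial grid: each stride-2 convolution halves it, 28 → 14 → 7 → 4 → 2 → 1, so after Flatten() the output is exactly the 10 class scores. A quick shape check (the dummy batch of two images is just for illustration):
# sanity check: two dummy 28x28 greyscale images in, 10 logits per image out
with torch.no_grad():
    print(model(torch.randn(2, 1, 28, 28)).shape)  # torch.Size([2, 10])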
This part is the usual torchember practice for recording the input/output tensors:
te = torchEmber(model)  # ======== torchember arming the model's forward function ========
Use torchember with a fastai Learner
learn = Learner(data, model, loss_func=nn.CrossEntropyLoss(), metrics=accuracy)
learn.fit_one_cycle(1, callbacks=[EmberCallback(learn, te)])  # ======== torchember callback here ========
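For a rough picture of what a weight/gradient-tracking callback looks like in fastai v1, consider the sketch below. This is not EmberCallback's implementation; WeightGradLogger and the statistics it stores are invented for illustration.
from fastai.callback import Callback

class WeightGradLogger(Callback):
    "Toy callback: record per-parameter weight and gradient means after every backward pass"
    def __init__(self, learn):
        self.learn, self.records = learn, []
    def on_backward_end(self, **kwargs):
        it = kwargs.get("iteration")
        for name, p in self.learn.model.named_parameters():
            if p.grad is None: continue
            self.records.append({"iteration": it, "param": name,
                                 "weight_mean": p.data.mean().item(),
                                 "grad_mean": p.grad.mean().item()})

# usage sketch:
# logger = WeightGradLogger(learn)
# learn.fit_one_cycle(1, callbacks=[logger])
EmberCallback plays the same structural role, except that it feeds its measurements back into the torchEmber tracker (the te object) rather than into a plain list.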
Note that we have no affiliation with fast.ai, though we are immensely inspired by the course.