Using TPUs with PyTorch

Currently, it's not possible to use Cloud TPU with PyTorch, since Cloud TPU is designed specifically for TensorFlow.

However, according to this product announcement posted three days ago on the Google Cloud blog, "engineers on Google’s TPU team are actively collaborating with core PyTorch developers to connect PyTorch to Cloud TPUs".


Check out the pytorch/xla repository, where you can start training PyTorch models on TPUs.
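
To give a feel for the API, here is a minimal single-core training sketch using torch_xla (a sketch only, not taken from the repository; it assumes torch_xla is installed and a TPU runtime is attached):

import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                      # grab a TPU core as a torch device
model = torch.nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()

for step in range(5):
    x = torch.randn(8, 10, device=device)     # dummy batch for illustration
    y = torch.randn(8, 2, device=device)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # xm.optimizer_step applies the update; barrier=True forces execution
    # of the lazily built XLA graph at this point
    xm.optimizer_step(optimizer, barrier=True)
    print(step, loss.item())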

Also, you can even use the free TPUs on Colab with PyTorch, via these Colab notebooks.
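
Once one of those notebooks has installed torch_xla, you can sanity-check that the Colab runtime actually exposes TPU cores (again just a sketch; the call below comes from torch_xla itself, not from the notebooks):

import torch_xla.core.xla_model as xm

# List the XLA devices the runtime exposes; on a Colab TPU runtime this
# should print eight cores, e.g. ['xla:0', 'xla:1', ..., 'xla:7']
print(xm.get_xla_supported_devices())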


As of today, PyTorch Lightning makes it trivial to run PyTorch code on TPUs (you will need the XLA library installed). From their demo notebook on Colab:

from pytorch_lightning import Trainer

# CoolSystem is a LightningModule defined earlier in the demo notebook
model = CoolSystem()

# most basic trainer, uses good defaults
trainer = Trainer(num_tpu_cores=8)  # train across all 8 TPU cores
trainer.fit(model)
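
Note that later PyTorch Lightning releases renamed this Trainer argument, so depending on your installed version you may need something like:

trainer = Trainer(tpu_cores=8)  # replaces num_tpu_cores in newer Lightning versions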