Do I have to install the packages I need every time I start Google Colab?

You'll need to install them each time. From the FAQ:

Where is my code executed? What happens to my execution state if I close the browser window? Code is executed in a virtual machine dedicated to your account. Virtual machines are recycled when idle for a while, and have a maximum lifetime enforced by the system.
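For Python packages, the usual workaround is to keep the install commands in the first cell of the notebook and rerun that cell at the start of every session. A minimal sketch (the package name is just a placeholder, use whatever your notebook needs):

# First cell of the notebook: reinstall anything the fresh VM is missing
!pip install pandasql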


If you connect to a local runtime, you can install your packages once and they will persist across sessions.

Here is how: https://research.google.com/colaboratory/local-runtimes.html


EDIT: I incorrectly assumed you wanted to install R packages, but I'll leave this here in case it's useful to someone. I'm not familiar enough with Python to know whether a solution like this would be feasible.

The accepted answer is indeed correct: you will need to install your packages on the virtual machine every time you run it. However, you can use the lib and lib.loc arguments of install.packages() and library() to your advantage.

I've managed to circumvent this issue somewhat by creating a library of packages in my Google Drive.

I then connect to Drive at the beginning of the notebook and load the packages from there. Here's how I did it:

  1. Load R into your Python NB
%reload_ext rpy2.ipython
  2. Connect your notebook to Drive (only available in Python NBs).
from google.colab import drive
drive.mount('/content/mydrive')
  3. Install your packages into a folder on your Drive.
%%R
lib_loc <- "/content/mydrive/r-lib"
dir.create(lib_loc, showWarnings = FALSE)  # make sure the library folder exists
install.packages("data.table", lib = lib_loc)
  4. Flush and unmount Drive so the newly installed packages are actually written back to your Drive files.
drive.flush_and_unmount()
  5. The next time you run the notebook, you don't need to reinstall the packages; just repeat steps 1 and 2, then load them from your new library.
%%R
lib_loc <- "/content/mydrive/r-lib"
library(data.table, lib.loc = lib_loc)

In case you're wondering, %%R is a cell magic from rpy2 that runs the cell's contents in the R engine inside a Python notebook.
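Putting steps 1, 2 and 5 together, the cells you actually rerun in a fresh session look something like this (same paths as above):

# Cell 1: load the rpy2 extension so %%R cells work
%reload_ext rpy2.ipython

# Cell 2: remount Google Drive
from google.colab import drive
drive.mount('/content/mydrive')

# Cell 3 (an R cell): load the packages from the Drive library
%%R
lib_loc <- "/content/mydrive/r-lib"
library(data.table, lib.loc = lib_loc)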

Hope this helps.