colab: https://colab.research.google.com (i think people use non-google alternatives now, but this is the url i know)
after switching the runtime to gpu, running !nvidia-smi shows the T4 has ~15GB of vram. a 13b model in float16 needs ~26GB just for the weights (2 bytes per parameter), so i'd need a quantized version to fit it in gpu ram
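quick sanity check of that arithmetic — weights only, ignoring activation and kv-cache overhead, which would push the real requirement higher:

```python
# rough vram needed for the weights of a 13B-parameter model
# at different precisions (weights only; overhead not counted)
PARAMS = 13e9
T4_VRAM_GB = 15  # what !nvidia-smi reports on a colab T4

for name, bytes_per_param in [("float16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = PARAMS * bytes_per_param / 1e9
    verdict = "fits" if gb <= T4_VRAM_GB else "doesn't fit"
    print(f"{name}: ~{gb:.1f} GB -> {verdict} in a T4's ~{T4_VRAM_GB} GB")
```

so float16 (~26 GB) is out, but int8 (~13 GB) or int4 (~6.5 GB) quantization should fit, with headroom left over for activations in the 4-bit case.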
