How to clear Colab Tensorflow TPU memory
I use tf.tpu.experimental.initialize_tpu_system(hw_accelerator_handle)
when I perform hyperparameter tuning on a TPU and want to release memory between two training sessions. It resets the TPU while maintaining the connection to it. In my use case I start training from scratch each time, so this works for me; it will probably work for your use case too.
hw_accelerator_handle is the object returned by tf.distribute.cluster_resolver.TPUClusterResolver()
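A minimal sketch of that pattern, assuming a Colab-style environment; the reset_tpu_memory helper and the tuning loop are illustrative names, not part of TensorFlow's API:

```python
import tensorflow as tf

def reset_tpu_memory():
    """Re-initialize the TPU system to release memory from a previous
    training run, while keeping the connection to the TPU alive."""
    # hw_accelerator_handle from the answer above: the cluster resolver.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
    tf.config.experimental_connect_to_cluster(resolver)
    # Resets the TPU, clearing memory left over from the prior session.
    tf.tpu.experimental.initialize_tpu_system(resolver)
    return resolver

# Hypothetical tuning loop: reset the TPU, then train from scratch per trial.
for trial_params in [{"lr": 1e-3}, {"lr": 1e-4}]:
    try:
        resolver = reset_tpu_memory()
        # ... build the model and train with trial_params here ...
    except ValueError:
        # No TPU attached (e.g. running outside Colab); skip the reset.
        break
```

Note that this starts each trial with a fresh TPU state, so any weights or optimizer state from the previous run are discarded, which is exactly what you want when each trial trains from scratch.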
I personally wouldn't try to clear TPU memory. If you hit an OOM on a Google Colab TPU, either use a smaller batch size, use a smaller model, or switch to a Kaggle TPU, which has twice the memory of a Colab TPU.