Release GPU Memory(VRAM) after tf.keras.backend.clear_session() #39535
Labels: comp:gpu, stat:awaiting response, TF 2.0, type:performance
System information
Describe the current behavior
There is no command that frees the previously used VRAM: `tf.keras.backend.clear_session()` leaves it allocated, and even deleting the model and the data has no effect on VRAM usage.
Describe the expected behavior
Any of these commands (`clear_session()`, deleting the model, deleting the data) should release the VRAM so it can be reused.
Standalone code to reproduce the issue
I used `nvidia-smi` to check GPU memory usage.
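A minimal reproduction sketch, assuming TensorFlow 2.x and an NVIDIA GPU. The model shape and the `used_vram_mib` helper are illustrative, not from the original report; the helper parses the "used / total" MiB column out of `nvidia-smi`'s text output and runs on any machine, while the TensorFlow portion is commented out because it requires a GPU to observe the leak:

```python
import re
import subprocess

def used_vram_mib(smi_output: str) -> list:
    """Extract per-GPU used memory (MiB) from `nvidia-smi` text output.

    nvidia-smi prints memory as e.g. "|  4042MiB / 11019MiB |"; we grab the
    number immediately before the "MiB /" separator (the "used" figure).
    """
    return [int(m) for m in re.findall(r"(\d+)MiB\s*/", smi_output)]

# Reproduction steps (hypothetical sketch; run manually on a GPU machine):
#
# import tensorflow as tf
# model = tf.keras.Sequential([tf.keras.layers.Dense(1024, input_shape=(1024,))])
# model.predict(tf.random.normal((256, 1024)))   # allocates VRAM
# del model
# tf.keras.backend.clear_session()               # expected to release VRAM
#
# smi = subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout
# print(used_vram_mib(smi))                      # usage is unchanged after clear_session()
```

Comparing `used_vram_mib` before model creation and after `clear_session()` makes the behavior measurable instead of eyeballing the `nvidia-smi` table.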