
how to clear tpu memory in pytorch xla #8520

Open
chaowenguo opened this issue Dec 24, 2024 · 1 comment
Comments

@chaowenguo

❓ Questions and Help

I create a diffusers pipeline, move it to the XLA device, and assign it to the variable `pipe`. Next I need to reassign `pipe` to a different pipeline, so I call `pipe.to('cpu')` and then `pipe = diffusers.pipeline`. I see that `torch.cuda.empty_cache()` can be used to clear GPU memory; how do I clear TPU memory?

@radna0

radna0 commented Dec 25, 2024

@chaowenguo Speaking from experience, I believe torch_xla has no direct equivalent of `torch.cuda.empty_cache()`. However, calling `pipe.to('cpu')` effectively offloads the pipeline from the XLA device's memory to the CPU. This triggers execution of the pending XLA graph, which in turn frees the memory the pipeline was using.

If you are still experiencing memory issues, you may also consider explicitly deleting the pipeline object (`del pipe`) and calling `xm.mark_step()` to finalize the memory cleanup. But moving to CPU should be enough.
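A minimal sketch of that cleanup sequence, wrapped in a hypothetical helper (`release_xla_pipeline` is not part of diffusers or torch_xla; the pipeline is passed in a one-element list so the function can drop the last Python reference itself, and the `torch_xla` import is guarded so the sketch also runs in CPU-only environments):

```python
import gc

def release_xla_pipeline(pipe_ref):
    """Free device memory held by a pipeline previously moved to the XLA device.

    `pipe_ref` is a one-element list holding the pipeline, so this helper
    can drop the reference after offloading.
    """
    pipe = pipe_ref.pop()
    pipe.to("cpu")          # offload weights to host; forces the XLA graph to execute
    del pipe                # drop the last Python reference to the pipeline
    gc.collect()            # let Python reclaim the wrapper objects
    try:
        import torch_xla.core.xla_model as xm
        xm.mark_step()      # flush pending XLA ops so device buffers can be freed
    except ImportError:
        pass                # torch_xla not installed (e.g. CPU-only environment)
```

After this, assigning the new pipeline (`pipe = diffusers.DiffusionPipeline.from_pretrained(...)` or similar) should not compete with the old one for TPU memory.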
