
[BC breaking change in torch] weights_only default flip for torch.load #1443

Open
mikaylagawarecki opened this issue Nov 15, 2024 · 0 comments
We've flipped the default of the `weights_only` argument in `torch.load` to `True` in pytorch/pytorch (see the linked details and documentation). This change ships in torch 2.6.

This is expected to be quite BC-breaking, especially for any `torch.load` calls that load something other than state_dicts of plain tensors.

We should make sure that all the `torch.load` calls in captum still work. I see 11 of them that don't set `weights_only` (mostly in tutorials).

I'm happy to open a PR that explicitly sets `weights_only` on these, but I'm not sure who to ping for review.
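For reference, a minimal sketch of what "explicitly setting `weights_only`" looks like (assumes torch is installed; the file path and state_dict here are illustrative, not from captum):

```python
import os
import tempfile

import torch

# A state_dict of plain tensors -- the common, safe case.
sd = {"weight": torch.zeros(2, 3)}

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "ckpt.pt")
    torch.save(sd, path)

    # Explicit weights_only=True: uses the restricted unpickler, works today,
    # and matches the new default in torch 2.6.
    loaded = torch.load(path, weights_only=True)
    assert torch.equal(loaded["weight"], sd["weight"])

    # Checkpoints that pickle arbitrary Python objects will fail under
    # weights_only=True; loading them requires an explicit opt-out, which
    # should only be done for files from a trusted source:
    loaded_trusted = torch.load(path, weights_only=False)
```

Passing the argument explicitly in both cases keeps behavior identical across torch versions, instead of silently changing when the default flips.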

@mikaylagawarecki mikaylagawarecki changed the title weights_only default flip for torch.load [BC breaking change in torch] weights_only default flip for torch.load Nov 15, 2024