ludwigwinkler changed the title to "[BUG] Random 'serialization' when calling keys() vs flatten_keys().keys()" on Aug 12, 2024
Describe the bug
The order in which key sequences are generated via `keys()` and `flatten_keys().keys()` differs (in some cases). I need to extract the tensors from the tensordict with `tensordict.values(True, True)` and pass them into `torch.autograd.grad` as a tuple. I kept running into tensordict-reconstruction issues in which the keys and the tensors no longer matched previous incarnations of the "same" tensordict, so subsequent operations were wrong.
Here, `state.x` and `state.R`, and `gradients.state.x` and `gradients.state.R`, are generated in different orders. A simple alphanumerical ordering would be appreciated.
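A minimal sketch of why the reconstruction breaks and how a canonical (sorted) ordering fixes it. Plain nested dicts stand in for tensordicts here, and the `flatten_keys` helper is a hypothetical re-implementation for illustration, not the tensordict API:

```python
# Stand-in for a nested tensordict: leaf values play the role of tensors.
nested = {"state": {"x": 1.0, "R": 2.0},
          "gradients": {"state": {"x": 0.1, "R": 0.2}}}

def flatten_keys(d, prefix=()):
    """Yield (dotted_key, leaf_value) pairs in insertion order."""
    for k, v in d.items():
        if isinstance(v, dict):
            yield from flatten_keys(v, prefix + (k,))
        else:
            yield ".".join(prefix + (k,)), v

# Sorting the flattened items pins down one canonical order, so the
# extracted values always line up with the same keys on every call.
items = sorted(flatten_keys(nested))
keys = [k for k, _ in items]
values = tuple(v for _, v in items)

# Reconstruction: zip the canonical keys back with the extracted
# (or gradient) outputs; the pairing can no longer drift.
rebuilt = dict(zip(keys, values))
```

If both the flattened key list and the value tuple are produced through the same sorted traversal, the tuple handed to `torch.autograd.grad` and the keys used for reconstruction cannot fall out of sync.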
To Reproduce
This is hard, as I still haven't figured out why this occurs in my case.
For all my base test cases, it behaves normally, but once I use a large model to test things, I keep running into this problem.
Nevertheless, I think it would be a useful addition.
System info
- tensordict: 0.4.0
- numpy: 1.26.4
- Python: 3.11.9 (main, Apr 19 2024, 11:43:47) [Clang 14.0.6]
- platform: darwin
- torch: 2.3.1
Reason and Possible fixes
Can we squeeze a `sorted` option into `tensordict.items()`? Most tensordicts have a small number of keys (<10), so maybe it would even be useful as the default?
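A sketch of what such an option could look like. The `items_sorted` helper and its `sort` flag are made-up names for illustration (with a plain dict standing in for the tensordict), not an existing tensordict API:

```python
def items_sorted(mapping, sort=True):
    """Return (key, value) pairs, alphabetically sorted when sort=True.

    With a small number of keys (<10 in most tensordicts), the sort
    cost is negligible, which is why defaulting to sort=True is cheap.
    """
    items = list(mapping.items())
    return sorted(items) if sort else items

# Flattened dotted keys, inserted in an arbitrary order.
td = {"state.x": 1, "state.R": 2, "gradients.state.x": 3}
ordered = items_sorted(td)
```

Keys then come back in the same deterministic order regardless of insertion history, which is exactly what the reconstruction above needs.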