❓ Questions and Help
I tried to run a small model with a sigmoid activation, but it is actually tested here that this does not work. Is there a specific reason for that? In my opinion, since sigmoid is a scalar (element-wise) operation, it should work analogously to ReLU and Tanh, which can be used with LRP.
Simply adding sigmoid here yields the expected result, so why not just do so?
I would be willing to create a PR and add a test for this if there is no reason not to.
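To make the argument concrete, here is a minimal sketch (my own illustration, not the library's code) of why an element-wise activation such as sigmoid would need no special LRP rule: relevance can simply be passed through it unchanged, exactly as for ReLU or Tanh, while the epsilon rule is applied only to the linear layers. The model, layer names, and `lrp_linear` helper are hypothetical examples for this issue.

```python
import torch
import torch.nn as nn

def lrp_linear(layer: nn.Linear, a: torch.Tensor, relevance: torch.Tensor,
               eps: float = 1e-6) -> torch.Tensor:
    """Epsilon rule for a linear layer: redistribute relevance onto its inputs."""
    z = layer(a)                    # pre-activation output
    z = z + eps * torch.sign(z)     # stabilizer
    s = relevance / z
    c = s @ layer.weight            # back-propagate the ratios through the weights
    return a * c

# Small model with a sigmoid activation, as described above.
model = nn.Sequential(nn.Linear(4, 8), nn.Sigmoid(), nn.Linear(8, 2))

with torch.no_grad():
    x = torch.randn(1, 4)
    a0 = x
    a1 = model[0](a0)
    a2 = model[1](a1)               # sigmoid output
    out = model[2](a2)

    # Walk backwards from the output relevance.
    R = out.clone()
    R = lrp_linear(model[2], a2, R)  # linear layer: epsilon rule
    # sigmoid layer: element-wise, so relevance passes through unchanged
    R = lrp_linear(model[0], a0, R)  # linear layer: epsilon rule
    print(R)                         # per-input relevance scores
```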