
Why is sigmoid activation for LRP not allowed? #1361

Open
CloseChoice opened this issue Oct 1, 2024 · 0 comments
❓ Questions and Help

I tried to run a small model with a sigmoid activation, but it's actually tested here that this does not work. Is there a specific reason for that? IMO, since sigmoid is a scalar (elementwise) operation, it should work analogously to ReLU and Tanh, which can be used with LRP.

Simply adding sigmoid here yields the expected result. So why not just do so?
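To illustrate the point, here is a minimal epsilon-LRP sketch over a two-layer net (the `lrp_linear` helper and all names here are hypothetical, not taken from any library's API). The elementwise activation is handled with the usual pass-through rule, so swapping ReLU, Tanh, or sigmoid changes the numbers but not the propagation logic:

```python
import torch

def lrp_linear(a, weight, relevance, eps=1e-6):
    # Epsilon rule for a linear layer: R_i = a_i * sum_j w_ji * R_j / (z_j + eps)
    z = a @ weight.t()                    # pre-activations z_j
    s = relevance / (z + eps * z.sign())  # stabilized quotient R_j / z_j
    return a * (s @ weight)               # redistribute relevance to the inputs

torch.manual_seed(0)
w1, w2 = torch.randn(4, 3), torch.randn(2, 4)
x = torch.randn(1, 3)

for act in (torch.relu, torch.tanh, torch.sigmoid):
    a1 = act(x @ w1.t())                  # hidden activations
    out = a1 @ w2.t()                     # network output, used as initial relevance
    r_hidden = lrp_linear(a1, w2, out)    # back through layer 2
    # The elementwise activation is treated as identity for relevance,
    # exactly as it would be for ReLU or Tanh; nothing sigmoid-specific is needed.
    r_input = lrp_linear(x, w1, r_hidden)  # back through layer 1
    print(f"{act.__name__}: input relevance sum = {r_input.sum().item():.4f}, "
          f"output sum = {out.sum().item():.4f}")
```

In each case the input relevances approximately sum to the output score, as the epsilon rule is designed to (approximately) conserve relevance regardless of which elementwise activation is used.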

I would be willing to open a PR and add a test for this if there is no reason not to.
