Investigate adding rectifier transformations to improve stability in sampling #467

Open

SamuelBrand1 opened this issue Oct 3, 2024 · 1 comment
SamuelBrand1 (Collaborator) commented Oct 3, 2024

Pertinent to the above and @seabbs's comment on using an upjitter: using the softplus transform instead of an upjitter (or a clamp implementing a ReLU) stabilises the sampling. So does using LogExpFunctions.xexpy to implement the effect of the AR process on the observations in the stochastic model.
After these changes, an experiment that starts NUTS without supplied initial values still converges to the correct posterior distributions, so at least in this context this is a big improvement in the stability of the model. I know @damonbayer has wrestled with this in Bayesian inference of ODEs, so I wonder what your thoughts are on using special functions to stabilise sampling?

Originally posted by @SamuelBrand1 in #464 (comment)
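As a minimal sketch of the two tricks quoted above, in Julia (the helper names `hard_rectify`, `smooth_rectify`, and `expected_obs` are illustrative, not part of EpiAware; `log1pexp` and `xexpy` are real LogExpFunctions exports):

```julia
using LogExpFunctions  # provides log1pexp (a stable softplus) and xexpy

# Hard rectifier (upjitter/clamp, i.e. a ReLU with a floor): the kink gives
# a discontinuous gradient, which can destabilise NUTS.
hard_rectify(x; lb = 1e-6) = max(x, lb)

# Smooth rectifier: log1pexp(x) = log(1 + exp(x)) is the softplus, evaluated
# without overflow for large x.
smooth_rectify(x) = log1pexp(x)

# Multiplicative effect of a latent AR value z on an expected observation m:
# xexpy(m, z) == m * exp(z), computed in a numerically careful way.
expected_obs(m, z) = xexpy(m, z)
```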

This finding really makes me wonder if we should have some kind of (preferably smooth) rectifier transformation as a default in a large chunk of the modelling. Maybe via TransformLatentModel etc.? A sketch of what that could look like is below.
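For concreteness, here is one shape such a default could take: a wrapper that pushes any latent process through an elementwise smooth rectifier. This is only a hypothetical sketch; `RectifiedLatentModel` and the `generate_latent(model, n)` method below are assumed names, not the existing TransformLatentModel interface.

```julia
using LogExpFunctions

# Hypothetical wrapper in the spirit of TransformLatentModel: draw the
# wrapped latent process, then apply a smooth rectifier elementwise so
# downstream rates stay positive without a non-smooth clamp.
struct RectifiedLatentModel{M, F}
    model::M
    rectifier::F
end

# Default to softplus (log1pexp) as the smooth rectifier.
RectifiedLatentModel(model) = RectifiedLatentModel(model, log1pexp)

# Assumed interface: `generate_latent(model, n)` returns the latent vector
# for n time points; the real EpiAware API may differ.
function generate_latent(rect::RectifiedLatentModel, n)
    z = generate_latent(rect.model, n)
    return rect.rectifier.(z)
end
```

With a default like this, models would get smooth rectification for free while still allowing a hard clamp to be passed in explicitly where that is genuinely wanted.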

seabbs (Collaborator) commented Oct 3, 2024

Step one is being able to do all the transforms we have in the new vignette using Transform, right?
