Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
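The techniques named above are independent add-ons layered on a base Adam/AMSGrad step. As a rough, illustrative sketch (not this repository's implementation; the function and parameter names are made up for the example), here is how two of them, gradient centralization and decoupled (AdamW-style) weight decay, can be wrapped around a plain SGD update in PyTorch:

```python
# Illustrative sketch only, not the repository's code: gradient centralization
# plus decoupled weight decay around a plain SGD update.
import torch

def centralize_gradient(grad: torch.Tensor) -> torch.Tensor:
    """Gradient Centralization: remove the mean over all dims except the output dim."""
    if grad.dim() > 1:
        grad = grad - grad.mean(dim=tuple(range(1, grad.dim())), keepdim=True)
    return grad

@torch.no_grad()
def sgd_step(params, lr=1e-2, weight_decay=1e-2):
    for p in params:
        if p.grad is None:
            continue
        g = centralize_gradient(p.grad)
        p.mul_(1.0 - lr * weight_decay)  # decoupled weight decay acts on the weights, not the gradient
        p.add_(g, alpha=-lr)             # plain gradient step on the centralized gradient
```

Lookahead, AdaMod, rectification, and DEMON momentum decay plug into such an inner step in a similarly modular way.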
This repository contains the code and models for our paper "Investigating and Mitigating Failure Modes in Physics-informed Neural Networks (PINNs)".
AFOF was developed to help MATLAB users obtain the optimal adaptive filters and their parameters for a specific application. Running the function requires the Signal Processing and DSP System Toolboxes. See the AFOF_user_guide PDF for instructions.
An adaptive stochastic gradient method based on the universal gradient method. The universal method adjusts the Lipschitz constant of the gradient at each step so that the loss function is majorized by a quadratic model.
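As a minimal sketch of that idea (illustrative only, not this repository's code): a Nesterov-style universal gradient method doubles its local Lipschitz estimate L until the quadratic model majorizes the loss at the trial point, then relaxes L for the next iteration. The callables `f` and `grad_f` below are assumed to be supplied by the user.

```python
# Illustrative backtracking rule of a universal gradient method; not the
# repository's implementation. f and grad_f are user-supplied callables.
import numpy as np

def universal_gradient_step(f, grad_f, x, L, eps=1e-3, max_doublings=50):
    """One step: grow L until f is majorized by the quadratic model at the trial point."""
    g = grad_f(x)
    for _ in range(max_doublings):
        y = x - g / L                                       # trial gradient step with current L
        quad = f(x) + g @ (y - x) + 0.5 * L * np.dot(y - x, y - x)
        if f(y) <= quad + 0.5 * eps:                        # majorization holds (up to eps/2 slack)
            return y, L / 2                                 # accept step, relax L for next iteration
        L *= 2                                              # quadratic model too loose; increase L
    return y, L
```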
A novel optimizer that leverages the trend observed in the gradients (https://arxiv.org/pdf/2109.03820.pdf)
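The update rule itself is given in the linked paper; purely as a hypothetical illustration of what "using the gradient trend" can mean (this is not the paper's method), one could track an exponential moving average of consecutive gradient differences and fold it into a momentum step:

```python
# Hypothetical "gradient trend" term: an EMA of consecutive gradient differences.
# This is an illustration only, NOT the algorithm from the linked paper.
import torch

@torch.no_grad()
def trend_momentum_step(p, state, lr=1e-2, beta=0.9, gamma=0.9):
    g = p.grad
    prev_g = state.setdefault("prev_grad", torch.zeros_like(g))
    trend = state.setdefault("trend", torch.zeros_like(g))
    m = state.setdefault("momentum", torch.zeros_like(g))
    trend.mul_(gamma).add_(g - prev_g, alpha=1.0 - gamma)  # EMA of gradient changes (the "trend")
    m.mul_(beta).add_(g + trend)                            # momentum on the trend-augmented gradient
    p.add_(m, alpha=-lr)
    state["prev_grad"] = g.clone()
```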
We introduce the new concept of (α,L,δ)-relative smoothness (see https://arxiv.org/pdf/2107.05765.pdf), which covers both the concept of relative smoothness and that of relative Lipschitz continuity. For the corresponding class of problems, we propose adaptive and universal methods with optimal convergence-rate estimates.
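For context, the classical notion of relative smoothness that this definition extends (not the paper's new (α,L,δ) condition itself) reads as follows, with V the Bregman divergence of a convex reference function d:

```latex
% Classical relative smoothness: f is L-smooth relative to d if, for all x, y,
\[
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + L\, V(y, x),
  \qquad
  V(y, x) = d(y) - d(x) - \langle \nabla d(x),\, y - x \rangle .
\]
```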