ASDL: Automatic Second-order Differentiation (for Fisher, Gradient covariance, Hessian, Jacobian, and Kernel) Library

floatingbigcat/asdfghjkl

ASD(FGHJK)L (alpha release)

The library is called "ASDL", which stands for Automatic Second-order Differentiation (for Fisher, Gradient covariance, Hessian, Jacobian, and Kernel) Library. ASDL is a PyTorch extension for computing 1st/2nd-order metrics and performing 2nd-order optimization of deep neural networks.

You can import asdfghjkl by sliding your finger on a QWERTY keyboard 😇

```python
import asdfghjkl
```

ADL vs ASDL

Basic metrics supported by a standard automatic differentiation library (ADL):

| metric | definition |
|--------|------------|
| neural network | $f_\theta(x)$ |
| loss | $L(\theta) = \frac{1}{N}\sum_{i=1}^{N} \ell(f_\theta(x_i), y_i)$ |
| (averaged) gradient | $\nabla_\theta L(\theta)$ |
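As a quick illustration of these basic metrics, here is a minimal plain-PyTorch sketch (not ASDL's API; the tiny model and random data are made up for the example) computing the loss and the averaged gradient:

```python
import torch

# Tiny regression setup: the "basic" metrics an AD library gives you.
torch.manual_seed(0)
N, D = 8, 3
X, y = torch.randn(N, D), torch.randn(N, 1)

# neural network f_theta(x): a single linear layer
model = torch.nn.Linear(D, 1)

# loss L(theta): squared error averaged over the N examples
loss = torch.nn.functional.mse_loss(model(X), y)

# (averaged) gradient dL/dtheta via reverse-mode AD
loss.backward()
grad = model.weight.grad  # shape (1, D)
```

Everything beyond this point (the F, G, H, J, K metrics below) requires extra machinery on top of these primitives, which is what ASDL provides.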

Advanced 1st/2nd-order metrics (FGHJK) supported by ASDL:

| metric | definition |
|--------|------------|
| Fisher information matrix | expectation, under the model's predictive distribution, of the outer product of log-likelihood gradients |
| Fisher information matrix (MC estimation) | the same expectation, estimated with Monte Carlo samples drawn from the model |
| empirical Fisher | average outer product of per-example gradients evaluated at the observed labels |
| Gradient covariance | covariance of the per-example gradients |
| Hessian | second derivative of the loss with respect to the parameters, $\nabla^2_\theta L(\theta)$ |
| Jacobian (per example) | derivative of the network output for one example with respect to the parameters |
| Jacobian | per-example Jacobians stacked over the whole batch |
| Kernel | Gram matrix of per-example Jacobians (e.g., the neural tangent kernel) |
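To make one of these concrete, the following sketch computes the empirical Fisher of a tiny model in plain PyTorch (this illustrates the definition only, it is not ASDL's API): per-example gradients are collected with one backward pass each, then averaged as outer products.

```python
import torch

# Empirical Fisher: F_emp = (1/N) * sum_i g_i g_i^T, where g_i is the
# gradient of the i-th example's loss w.r.t. the parameters.
torch.manual_seed(0)
N, D = 8, 3
X, y = torch.randn(N, D), torch.randn(N, 1)
model = torch.nn.Linear(D, 1, bias=False)

grads = []
for i in range(N):
    loss_i = torch.nn.functional.mse_loss(model(X[i:i+1]), y[i:i+1])
    (g,) = torch.autograd.grad(loss_i, model.weight)
    grads.append(g.flatten())

G = torch.stack(grads)   # (N, D) matrix of per-example gradients
F_emp = G.T @ G / N      # (D, D) empirical Fisher
```

The per-example loop is the naive approach; libraries like ASDL exist precisely because doing this efficiently (and for larger networks) needs specialized machinery.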

Matrix approximations

Supported operations

  • matrix-vector product
    • power method
    • conjugate gradient method
  • preconditioning gradient
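The matrix-free operations above can be illustrated with standard autograd techniques. This sketch (plain PyTorch, not ASDL's API) builds a Hessian-vector product via double backpropagation and runs the power method on a quadratic loss whose Hessian is known by construction:

```python
import torch

# Quadratic loss 0.5 * w^T A w has Hessian exactly A (A symmetric),
# so the power method should recover A's largest eigenvalue, 3.0.
torch.manual_seed(0)
w = torch.randn(3, requires_grad=True)
A = torch.diag(torch.tensor([3.0, 2.0, 1.0]))
loss = 0.5 * w @ A @ w

# First backward pass with create_graph=True so the gradient is differentiable
(g,) = torch.autograd.grad(loss, w, create_graph=True)

def hvp(v):
    # Hessian-vector product: differentiate g.v once more w.r.t. w
    (Hv,) = torch.autograd.grad(g @ v, w, retain_graph=True)
    return Hv

v = torch.randn(3)
for _ in range(100):       # power iteration on the Hessian
    v = hvp(v)
    v = v / v.norm()
top_eig = v @ hvp(v)       # Rayleigh quotient, converges to 3.0
```

The same matrix-free pattern underlies conjugate-gradient solves and gradient preconditioning: only products with the curvature matrix are ever needed, never the matrix itself.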
