Scientific Computing in Lean


šŸš§ Differentiation

We will cover the following topics:

  1. symbolic differentiation

  2. automatic differentiation: forward- and reverse-mode AD

  3. function transformations and how AD works in SciLean

    • exercises: define new function transformations (see the sketch after this list):

      • vectorize : (X → Y) → (X^[n] → Y^[n])

      • a vectorized version of fwdFDeriv, fwdFDerivVec : (X → Y) → (X×X^[n] → Y×Y^[n])

  4. working with user-defined functions and structures

    • polymorphic functions

    • higher order functions

    • recursive functions

  5. differentiating tensor expressions

    • explain the problem

    • current solution

    • sparse update and structure sharing problem

  6. differentiating imperative and monadic code

  7. variational calculus
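
As a taste of the exercises in topic 3, here is a minimal Lean sketch of the two requested function transformations. It assumes SciLean's X^[n] arrays, the ⊞ i => … array-builder notation, and fwdFDeriv; the typeclass assumptions are a guess and may differ from SciLean's current API, and the names vectorize and fwdFDerivVec come from the exercise statement rather than from SciLean itself.

  import SciLean
  open SciLean

  variable {n : Nat} {X Y : Type}
    [NormedAddCommGroup X] [NormedSpace ℝ X] [PlainDataType X]
    [NormedAddCommGroup Y] [NormedSpace ℝ Y] [PlainDataType Y]

  -- Apply `f` to every element of an array (exercise name, not SciLean API).
  def vectorize (f : X → Y) (x : X^[n]) : Y^[n] :=
    ⊞ i => f x[i]

  -- Vectorized forward derivative: one point, `n` direction vectors.
  -- Returns `f x` together with the directional derivative along each direction.
  noncomputable
  def fwdFDerivVec (f : X → Y) (xdx : X × X^[n]) : Y × Y^[n] :=
    (f xdx.1, ⊞ i => (fwdFDeriv ℝ f xdx.1 xdx.2[i]).2)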

  1. Symbolic Differentiation
    1. Notation
      1. Exercises
    2. Examples
      1. Newton's solver
        1. Exercises
      2. Kinematics
        1. Exercises
    3. Gradient
      1. Exercises
    4. Derivative Rules
    5. Differentiating Division, Log, Sqrt, ...
      1. Exercises
    6. Abstract Vector Spaces
  2. Automatic Differentiation
    1. Forward Mode
      1. autodiff vs fun_trans
      2. Relation to Dual Numbers
      3. Exercises
    2. Reverse Mode
      1. Exercises
    3. Derivatives of Neural Network Layers
  3. Derivative Rules
    1. Basic Rules
    2. Rules with Conditions
    3. Higher Order Functions
    4. Recursive Functions
    5. Custom Structures
    6. Polymorphic Functions
  4. Differentiating Array Expressions
  5. Imperative and Monadic Code
    1. Control flow
    2. Differentiable Monad
    3. Forward Derivative Monad
    4. Reverse Derivative Monad
  6. Variational Calculus