Beautiful differentiation (Conal Elliott)
Automatic differentiation, but:
- generalized to arbitrary dimensionality
- generalized to arbitrary order
- using lazy evaluation, which avoids obfuscating the code
The representation is a pair of a value and its derivative.
Some other literature calls such pairs "jets".
The implementation bears some similarity to HOAS (higher-order abstract syntax).
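The value-plus-derivative pairing can be sketched as dual numbers in Haskell. This is a minimal illustration, not Elliott's actual code; the names (`D`, `var`, `diff`) are made up for this sketch:

```haskell
-- A value paired with its derivative: a first-order "jet" / dual number.
data D a = D a a deriving Show

-- Extract the two components.
value, deriv :: D a -> a
value (D v _) = v
deriv (D _ d) = d

-- Arithmetic propagates derivatives by the usual calculus rules.
instance Num a => Num (D a) where
  D v d + D w e  = D (v + w) (d + e)
  D v d * D w e  = D (v * w) (d * w + v * e)   -- product rule
  negate (D v d) = D (negate v) (negate d)
  abs (D v d)    = D (abs v) (d * signum v)
  signum (D v _) = D (signum v) 0
  fromInteger n  = D (fromInteger n) 0         -- constants have derivative 0

-- The variable of differentiation: value x, derivative 1.
var :: Num a => a -> D a
var x = D x 1

-- Differentiate f at x by running it on a dual number.
diff :: Num a => (D a -> D a) -> a -> a
diff f = deriv . f . var
```

For example, `diff (\x -> x*x + 3*x) 2` evaluates the derivative 2x + 3 at x = 2.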
Concrete output at the end
The infinite sequence of multidimensional derivatives
gives a tower of tensors of increasing rank.
For a scalar-valued function of several variables:
- 0th derivative: scalar
- 1st derivative: vector (gradient)
- 2nd derivative: matrix (Hessian matrix)
- 3rd derivative: cube of numbers
- 4th derivative: hypercube of numbers
- …
For a vector-valued function of several variables:
- 0th derivative: vector
- 1st derivative: matrix (Jacobian matrix)
- 2nd derivative: cube of numbers
- 3rd derivative: hypercube of numbers
- …
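In the one-variable case the tower idea can be sketched with a lazy, infinite list of derivatives; the multidimensional version in the paper replaces the tail with a linear map. The names here (`D`, `dVar`, `derivs`) are illustrative, not the paper's:

```haskell
-- An infinite derivative tower: value, then the tower of all higher
-- derivatives.  Laziness lets us build it without ever finishing it.
data D a = D a (D a)

-- A constant: every derivative is zero.
dConst :: Num a => a -> D a
dConst x = D x (dConst 0)

-- The variable: derivative 1, then zeros.
dVar :: Num a => a -> D a
dVar x = D x (dConst 1)

instance Num a => Num (D a) where
  D v v' + D w w'         = D (v + w) (v' + w')
  p@(D v v') * q@(D w w') = D (v * w) (v' * q + p * w')  -- lazy product rule
  negate (D v v')         = D (negate v) (negate v')
  fromInteger             = dConst . fromInteger
  abs    = error "abs: not needed for this sketch"
  signum = error "signum: not needed for this sketch"

-- The first n entries of the tower: f x, f' x, f'' x, ...
derivs :: Int -> D a -> [a]
derivs 0 _        = []
derivs n (D v v') = v : derivs (n - 1) v'
</imports>
```

Asking for a maximum order is just taking a prefix: `derivs 4 ((\x -> x*x*x) (dVar 2))` yields the values of x³ and its first three derivatives at x = 2.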
With "beautiful differentiation" all of that falls out naturally.
Just ask for a particular derivative, or for all derivatives up to a maximum order, and it is produced.
Use cases
In constructive solid geometry, when volumes are represented via F-Rep,
finding points on the surfaces for triangulation involves
finding zeros of an implicit function (an algebraic variety).
This calls for root-finding algorithms, which in turn need the function's derivatives.
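As a sketch of that use case, here is Newton's method where the derivative is obtained automatically from dual numbers rather than written by hand. The dual-number definitions are repeated so the block is self-contained; all names and the tolerance are illustrative assumptions:

```haskell
-- Dual numbers over Double: value and derivative.
data D = D Double Double

instance Num D where
  D v d + D w e  = D (v + w) (d + e)
  D v d * D w e  = D (v * w) (d * w + v * e)   -- product rule
  negate (D v d) = D (negate v) (negate d)
  abs (D v d)    = D (abs v) (d * signum v)
  signum (D v _) = D (signum v) 0
  fromInteger n  = D (fromInteger n) 0

instance Fractional D where
  D v d / D w e  = D (v / w) ((d * w - v * e) / (w * w))  -- quotient rule
  fromRational r = D (fromRational r) 0

-- Newton iteration: x_{k+1} = x_k - f x_k / f' x_k, where one call to f
-- on a dual number yields both f x and f' x.
newton :: (D -> D) -> Double -> Double
newton f = go (20 :: Int)
  where
    go 0 x = x
    go n x =
      let D fx f'x = f (D x 1)
      in if abs fx < 1e-12 then x else go (n - 1) (x - fx / f'x)
```

For example, `newton (\x -> x*x - 2) 1` converges to the positive root of x² − 2, i.e. √2.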
There is also a need for nontrivial differentiation in machine learning (e.g. computing gradients for training).
Related
External links
Central page linking to all relevant material:
Actually usable implementation (Haskell library):
vector-space provides classes and generic operations for vector spaces and affine spaces.
It also defines a type of infinite towers of generalized derivatives.
A generalized derivative is a linear transformation rather than one of the common concrete representations (scalars, vectors, matrices, ...).
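To give the flavor of such classes, here is a self-contained sketch of a vector-space-style hierarchy. This is not the library's code (the real definitions live in `Data.VectorSpace` and differ in detail); it only illustrates the kind of generic operations the classes enable:

```haskell
{-# LANGUAGE TypeFamilies #-}

-- Additive groups: things that can be added and negated.
class AdditiveGroup v where
  zeroV   :: v
  (^+^)   :: v -> v -> v
  negateV :: v -> v

-- Vector spaces: additive groups with scalar multiplication.
class AdditiveGroup v => VectorSpace v where
  type Scalar v
  (*^) :: Scalar v -> v -> v

instance AdditiveGroup Double where
  zeroV   = 0
  (^+^)   = (+)
  negateV = negate

instance VectorSpace Double where
  type Scalar Double = Double
  (*^) = (*)

-- Pairs of vectors form a vector space componentwise.
instance (AdditiveGroup a, AdditiveGroup b) => AdditiveGroup (a, b) where
  zeroV             = (zeroV, zeroV)
  (a, b) ^+^ (c, d) = (a ^+^ c, b ^+^ d)
  negateV (a, b)    = (negateV a, negateV b)

instance (VectorSpace a, VectorSpace b, Scalar a ~ Scalar b)
      => VectorSpace (a, b) where
  type Scalar (a, b) = Scalar a
  s *^ (a, b) = (s *^ a, s *^ b)

-- A generic operation valid in any vector space: linear interpolation.
lerp :: VectorSpace v => v -> v -> Scalar v -> v
lerp a b t = a ^+^ (t *^ (b ^+^ negateV a))
```

`lerp` then works uniformly on scalars, pairs, nested pairs, and so on, which is the point of programming against the abstract classes.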
- Underlying data storage method:
MemoTrie: Trie-based memo functions
Wikipedia: