Beautiful differentiation (Conal Elliott)
Automatic differentiation, but:
- generalized to arbitrary dimensionality
- generalized to arbitrary order
- employing lazy evaluation, which keeps the code free of obfuscation
The representation is a pair of a value and its derivative.
Some other literature calls such pairs "jets".
The implementation bears some similarity to HOAS (higher-order abstract syntax).
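
As a minimal sketch of the idea (one-dimensional only, with hypothetical names; the library itself generalizes the derivative to a linear map), a lazy, infinite tower of derivatives can be written like this:

```haskell
-- A lazy, infinite tower of derivatives: value, then the tower of the
-- first derivative, and so on (1-D sketch; names are illustrative).
data D a = D a (D a)

-- Lift a constant: every derivative is zero (an infinite tower of zeros).
constD :: Num a => a -> D a
constD x = D x (constD 0)

-- The identity function sampled at x: value x, first derivative 1.
idD :: Num a => a -> D a
idD x = D x (constD 1)

instance Num a => Num (D a) where
  fromInteger             = constD . fromInteger
  D x x' + D y y'         = D (x + y) (x' + y')
  -- Product rule, applied to whole towers: (fg)' = f'g + fg'.
  p@(D x x') * q@(D y y') = D (x * y) (x' * q + p * y')
  negate (D x x')         = D (negate x) (negate x')
  abs    = error "abs: not differentiable at zero (omitted in this sketch)"
  signum = error "signum: not differentiable (omitted in this sketch)"
```

For example, `(\ x -> x*x + 3*x) (idD 2)` evaluates to a tower beginning 10, 7, 2, 0, …, i.e. f(2), f′(2), f″(2), … for f(x) = x² + 3x.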
Concrete output at the end
The infinite sequence of multidimensional derivatives
gives a tower of tensors of increasing rank.

For a scalar-valued function:
- 0th derivative: scalar
- 1st derivative: vector (gradient)
- 2nd derivative: matrix (Hessian matrix)
- 3rd derivative: cube of numbers
- 4th derivative: hypercube of numbers
- …

For a vector-valued function:
- 0th derivative: vector
- 1st derivative: matrix (Jacobian matrix)
- 2nd derivative: cube of numbers
- 3rd derivative: hypercube of numbers
- …
With "beautiful differentiation" all that falls out naturally.
Just ask for a certain derivative or max order of derivative and it gets spat out.
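
Continuing the sketch above, asking for derivatives up to a given order is just taking a prefix of the lazy tower:

```haskell
-- Flatten the tower into the list [f x, f' x, f'' x, ...].
derivs :: D a -> [a]
derivs (D x x') = x : derivs x'

-- First four derivatives of x^3 at x = 2: [8.0, 12.0, 12.0, 6.0].
cubeDerivs :: [Double]
cubeDerivs = take 4 (derivs ((\ x -> x * x * x) (idD 2)))
```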
Use cases
In constructive solid geometry, when volumes are represented via F-Rep (function
representation), finding points on the surfaces for triangulation involves
finding zeros of an implicit function (an algebraic variety).
This calls for root-finding algorithms, such as Newton's method, which in turn need the function's derivatives.
Machine learning also needs nontrivial differentiation: gradient-based training differentiates a loss function with respect to large numbers of parameters.
Notes
Automatic differentiation in general is …
- neither symbolic math as in a computer algebra system,
- nor numerical approximation using a small step size to estimate the derivative.

It computes derivatives exactly (up to floating-point rounding) by propagating derivative values alongside ordinary ones; compare the sketch below.
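
For contrast (a small sketch reusing `derivs` and `idD` from above; the step size and example function are arbitrary), a finite-difference estimate carries truncation and rounding error that the tower does not:

```haskell
-- Forward finite difference: an *approximation* of f'(x).
finiteDiff :: Double -> (Double -> Double) -> Double -> Double
finiteDiff h f x = (f (x + h) - f x) / h

-- finiteDiff 1e-6 (\ x -> x * x) 2      ~> 4.000001...  (approximate)
-- derivs ((\ x -> x * x) (idD 2)) !! 1  == 4.0          (exact)
```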
Related
External links
Central page linking to all relevant material:
Actually usable implementation (Haskell library):
vector-space provides classes and generic operations for vector spaces and affine spaces.
It also defines a type of infinite towers of generalized derivatives.
A generalized derivative is a linear transformation rather than one of the common concrete representations (scalars, vectors, matrices, ...).
- Underlying dependency (used to represent linear maps):
MemoTrie: Trie-based memo functions
Wikipedia: