Dual numbers, auto-differentiation, and gradient descent

Dr. Aaron Nielsen will be speaking on a fast (aka automatic) way to compute derivatives. In machine learning and AI, gradient descent is a standard technique used during the training phase of a neural network. However, gradient descent requires computing the derivative of a potentially complicated function at every step of the training process. Working out the functional form of the derivative by hand can be tricky and error prone, but a technique called auto-differentiation allows the derivative to be computed automatically, without knowing the functional form of the derivative. Auto-differentiation uses a class of numbers known as dual numbers to compute these derivatives automatically. This talk will discuss dual numbers and their special properties, and then walk through how these properties enable auto-differentiation. A simple gradient descent example using auto-differentiation will be presented, along with some Python packages that perform auto-differentiation.
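To give a flavor of what the talk covers, here is a minimal sketch (not the speaker's actual material) of forward-mode auto-differentiation with dual numbers in plain Python. A dual number has the form a + b·ε with ε² = 0; carrying the pair (value, derivative) through ordinary arithmetic makes the chain rule happen automatically. The class name `Dual`, the helper `derivative`, and the example functions are illustrative choices, not anything from the talk.

```python
class Dual:
    """A dual number value + deriv*eps, where eps**2 == 0.

    `value` carries the function value; `deriv` carries the derivative,
    propagated automatically by the arithmetic rules below.
    """

    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value - other.value, self.deriv - other.deriv)

    def __rsub__(self, other):
        return Dual(other) - self

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule falls out of (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f'(x) by seeding the dual part with 1."""
    return f(Dual(x, 1.0)).deriv


def gradient_descent(f, x0, lr=0.1, steps=100):
    """Minimize f by repeatedly stepping against the auto-computed derivative."""
    x = x0
    for _ in range(steps):
        x -= lr * derivative(f, x)
    return x


# f(x) = x^2 + 3x, so f'(x) = 2x + 3 and f'(2) = 7 -- no symbolic work needed
f = lambda x: x * x + 3 * x
print(derivative(f, 2.0))  # 7.0

# Minimize g(x) = (x - 3)^2; gradient descent converges near x = 3
g = lambda x: (x - 3) * (x - 3)
print(gradient_descent(g, 0.0))
```

In a real project one would reach for a library such as JAX or PyTorch rather than hand-rolling this, but the toy version shows why no derivative formula is ever written down: the derivative rides along with every arithmetic operation.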