A suite of tools written in Python for automatic numerical differentiation of functions of one or more variables. Finite differences are used adaptively, coupled with Romberg extrapolation, to maximize the accuracy of the result. Many options are configurable: the user can change the order of the method or of the extrapolation, and can specify whether central, forward, or backward differences are used.
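To illustrate the underlying idea, here is a minimal sketch (not the toolbox's actual implementation) of a central difference combined with one Richardson/Romberg extrapolation step, returning both a derivative estimate and a crude error estimate. The function name and step size are illustrative assumptions.

```python
import numpy as np

def derivative(f, x, h=1e-2):
    """Central-difference derivative with one Richardson extrapolation step.

    Illustrative sketch only; the toolbox's adaptive scheme is more elaborate.
    """
    # Central differences at step sizes h and h/2.
    d1 = (f(x + h) - f(x - h)) / (2 * h)
    d2 = (f(x + h / 2) - f(x - h / 2)) / h
    # Central differences have O(h^2) truncation error, so the combination
    # (4*d2 - d1) / 3 cancels the leading error term, leaving O(h^4).
    d = (4 * d2 - d1) / 3
    # The spread between the two raw estimates serves as a rough error estimate.
    err = abs(d2 - d1)
    return d, err

val, err = derivative(np.sin, 0.0)  # exact derivative is cos(0) = 1
```

The same cancellation trick, applied repeatedly over a sequence of shrinking step sizes, is the essence of Romberg extrapolation.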
The methods provided are:
Derivative: Computes derivatives of order 1 through 4 of any scalar function.
Gradient: Computes the gradient vector of a scalar function of one or more variables.
Jacobian: Computes the Jacobian matrix of a vector valued function of one or more variables.
Hessian: Computes the Hessian matrix of all 2nd partial derivatives of a scalar function of one or more variables.
Hessdiag: Computes only the diagonal elements of the Hessian matrix.
All of these methods also produce error estimates on the result.
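As a sketch of how the multivariate methods extend the one-dimensional idea, the following hypothetical helpers (not the toolbox's API) compute a gradient by central differences and the Hessian diagonal by second central differences, one coordinate at a time:

```python
import numpy as np

def gradient(f, x, h=1e-6):
    """Central-difference gradient of a scalar function; illustrative helper."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hessdiag(f, x, h=1e-4):
    """Diagonal of the Hessian via second central differences; illustrative helper."""
    x = np.asarray(x, dtype=float)
    d = np.zeros_like(x)
    fx = f(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        d[i] = (f(x + e) - 2 * fx + f(x - e)) / h**2
    return d

# f(x, y) = x^2 + 3*y^2 has gradient (2x, 6y) and Hessian diagonal (2, 6).
f = lambda v: v[0]**2 + 3 * v[1]**2
g = gradient(f, [1.0, 2.0])
d = hessdiag(f, [1.0, 2.0])
```

The Jacobian and full Hessian follow the same pattern, perturbing one (or two) coordinates at a time; the toolbox additionally applies the adaptive step selection and extrapolation described above to each such difference.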
A PDF file explaining the theory behind these tools is also provided. Download the toolbox here.