Project Status: WIP – Initial development is in progress, but there has not yet been a stable, usable release suitable for the public.

pnd

An R package for computing fast and accurate numerical derivatives.

Parallel numerical derivatives in R

In the past, I used numDeriv to compute numerical gradients. However, the results were not stable for some functions, and I could not investigate the source of this instability: different step sizes yielded different results, and small step sizes were sometimes better, sometimes worse.

The pnd package was designed to offer a comprehensive toolkit of popular algorithms for finite differences, numerical gradients, Jacobians, and Hessians.

Optimal step sizes and parallel evaluation of numerical derivatives translate directly to faster numerical optimisation and statistical inference.
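For example, spreading function evaluations across cores pays off whenever a single evaluation is slow. A minimal sketch, assuming the cores argument of pnd::Grad (the slowf function below is made up for illustration, and timings will vary by machine):

slowf <- function(x) { Sys.sleep(0.1); sum(sin(x)) }  # each call takes ~0.1 s
x <- 1:4

# Serial: with central differences, 2 evaluations per coordinate = 8 slow calls
system.time(pnd::Grad(slowf, x))

# Parallel: the same calls spread across 4 cores (if available)
system.time(pnd::Grad(slowf, x, cores = 4))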

Features

Getting started

This package has numDeriv-compatible syntax: simply capitalise the first letter of the numDeriv commands to get their improved counterparts Grad, Jacobian, and Hessian.

Here is how to compute the gradient of f(x) = sum(sin(x)) at the point x = (1, 2, 3, 4).

f <- function(x) sum(sin(x))
x <- 1:4
names(x) <- c("Jan", "Feb", "Mar", "Apr")

numDeriv::grad(f, x)
#> [1]  0.5403023 -0.4161468 -0.9899925 -0.6536436

pnd::Grad(f, x)
#> Estimated gradient:
#>      Jan      Feb      Mar      Apr  
#>   0.5403  -0.4161  -0.9900  -0.6536  
#> (default step size: 6.1e-06, 1.2e-05, 1.8e-05, 2.4e-05).

The output contains diagnostic information about the chosen step size, and Grad preserves the names of the input vector, unlike numDeriv::grad.

In many implementations, the default step size is proportional to the argument value, which is reflected in the output above. A fixed step size can be requested via the extra argument h:

pnd::Grad(f, x, h = c(1e-5, 1e-5, 1e-5, 2e-5))
#> Estimated gradient:
#>      Jan      Feb      Mar      Apr  
#>   0.5403  -0.4161  -0.9900  -0.6536  
#> (user-supplied step size: 1.0e-05, 1.0e-05, 1.0e-05, 2.0e-05).

Finally, it is easy to request an algorithmically chosen optimal step size – here is how to do it with the Stepleman–Winarsky (1979) rule, named "SW", which works well in practice:

pnd::Grad(f, x, h = "SW")
#> Estimated gradient:
#>      Jan      Feb      Mar      Apr  
#>   0.5403  -0.4161  -0.9900  -0.6536  
#> (SW step size: 5.0e-06, 1.0e-05, 7.5e-06, 1.0e-05).
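The same drop-in syntax extends to Jacobians of vector-valued functions and to Hessians. A minimal sketch using positional arguments (the function g below is made up for illustration):

# Jacobian of a vector-valued function g: R^3 -> R^2
g <- function(x) c(prod(x), sum(x^2))
pnd::Jacobian(g, c(1, 2, 3))

# Hessian of the scalar function from the example above
f <- function(x) sum(sin(x))
pnd::Hessian(f, 1:4)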

Extensive diagnostics can be requested at any time: the step-search trace is saved in the attr(pnd::Grad(...), "step.search") attribute, which has an $iterations element. Numerical gradients and Jacobians are simple numeric vectors and matrices with attributes that facilitate printing – feel free to handle them like any other numeric object, as the sketch below shows.
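A minimal sketch of inspecting those diagnostics, reusing f and x from the gradient example above:

gr <- pnd::Grad(f, x, h = "SW")
ss <- attr(gr, "step.search")  # step-search diagnostics
str(ss$iterations)             # trace of the step-size search
as.vector(gr)                  # attributes dropped: a plain numeric vector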

Learning resources

Literature

This package is supported by three vignettes:

The following articles provide the theory behind the methods implemented in this package:

Installation

The stable version is on CRAN. To install it, run the following line:

install.packages("pnd")

The development version is available on GitHub. To install it, run the following two commands:

install.packages("devtools")
devtools::install_github("Fifis/pnd")

To load this package, include this line in your code:

library(pnd)

This package is almost dependency-free: the parallel package is part of base R and is included in most R distributions.

Licence

This software is released under the free/open-source EUPL 1.2 licence.