Chemistry - Boys function for Gaussian integrals in ab-initio calculations

Solution 1:

I am not aware of any existing Fortran code for direct numerical quadrature of this problem, but it is worth pointing out that Mathematica can perform this integral symbolically:

Integrate[t^(2 n) Exp[-x t^2], {t, 0, 1}]

(* 1/2 x^(-(1/2) - n) (Gamma[1/2 + n] - Gamma[1/2 + n, x]) *)

where the complete $\Gamma$ function can be computed by exponentiating easy-to-find $\ln \Gamma$ routines, often called lngamma(). (Of the two functions above, one is the Euler gamma function $\Gamma(z)$ and the other is the upper incomplete gamma function $\Gamma(a,z)$, which needs its own routine.)

As a check:

With[
  {x = 1, n = 2},
  1/2 x^(-(1/2) - n) (Gamma[1/2 + n] - Gamma[1/2 + n, x])
  ] // N

(* 0.100269  *)

and numerical quadrature yields:

With[
 {x = 1, n = 2},
 NIntegrate[t^(2 n) Exp[-x t^2], {t, 0, 1}]
 ]

(* 0.100269  *)

One has to be a little careful: the $\Gamma$ function is closely related to the factorial, so it can overflow once the arguments get large. I would experiment and see whether you can get by with this analytic solution, which essentially replaces numerical quadrature with the evaluation of special functions.
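For illustration, here is a minimal sketch of that special-function route in Julia with SpecialFunctions.jl. The name boys_loggamma is my own, and working with loggamma plus the regularized incomplete gamma function is just one way to sidestep the overflow:

using SpecialFunctions

# F_n(x) = γ(n + 1/2, x) / (2 x^(n + 1/2)), evaluated in log space.
# gamma_inc(a, x, 0) returns the regularized pair (P, Q), with
# P(a, x) = γ(a, x) / Γ(a), so Γ(n + 1/2) and x^(n + 1/2) never have
# to be formed explicitly.
function boys_loggamma(n, x)
    x <= 0 && return 1.0 / (2n + 1)   # F_n(0) = 1 / (2n + 1)
    a = n + 0.5
    p, _ = gamma_inc(a, x, 0)
    exp(loggamma(a) - a * log(x)) * p / 2
end

For $n = 2$, $x = 1$ this reproduces the value 0.100269 from the check above.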

The $\Gamma$ function approach is faster than the quadrature approach by over two orders of magnitude, which matters when evaluating molecular integrals. I have seen implementations where the Boys function is interpolated from a table rather than evaluated directly, and the interpolation error is reported to be almost insignificant when suitable interpolation basis functions are used.
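To make the interpolation idea concrete, here is a minimal sketch of one possible tabulation scheme (my own illustration, not taken from any particular code): pretabulate $F_n, \dots, F_{n+K}$ on a uniform grid and evaluate $F_n(x)$ by a short Taylor expansion about the nearest grid point, using the exact relation $\mathrm{d}F_m/\mathrm{d}x = -F_{m+1}$. In Julia with SpecialFunctions.jl:

using SpecialFunctions

# Reference values via the lower incomplete gamma function.
boys_ref(n, x) = x > 0 ? gamma(n + 0.5) * gamma_inc(n + 0.5, x, 0)[1] / (2x^(n + 0.5)) :
                         1.0 / (2n + 1)

# Pretabulated F_{n+k}(x_j) for k = 0..K on the uniform grid x_j = j * Δ.
struct BoysTable
    n::Int
    K::Int
    Δ::Float64
    table::Matrix{Float64}   # table[k + 1, j + 1] = F_{n+k}(j * Δ)
end

BoysTable(n; K = 6, Δ = 0.1, xmax = 30.0) =
    BoysTable(n, K, Δ, [boys_ref(n + k, x) for k in 0:K, x in 0.0:Δ:xmax])

# Evaluate F_n(x) by Taylor expansion about the nearest grid point:
#   F_n(x_j + δ) ≈ Σ_{k=0}^{K} F_{n+k}(x_j) (-δ)^k / k!
function (t::BoysTable)(x)
    j = clamp(round(Int, x / t.Δ), 0, size(t.table, 2) - 1)
    δ = x - j * t.Δ
    s, term = 0.0, 1.0
    for k in 0:t.K
        s += t.table[k + 1, j + 1] * term
        term *= -δ / (k + 1)
    end
    s
end

With $\Delta = 0.1$ and $K = 6$, the truncation error of the Taylor step is bounded by roughly $F_{n+K+1}(0)\,(\Delta/2)^{K+1}/(K+1)! \approx 10^{-14}$ for $n = 1$, consistent with the claim that the interpolation error can be made negligible.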

Solution 2:

The Boys function $F_n(x)$ is a special case of the Kummer confluent hypergeometric function $M(a,b,x) = {_1}F_1(a,b,x)$, which can be found in many special-function libraries, such as scipy.special. According to equation (9) of this paper, the relationship is

$F_n(x) = \frac{{_1}F_1(n+\frac{1}{2},n+\frac{3}{2},-x)}{2n +1}$

If you use SciPy, ${_1}F_1$ is available as scipy.special.hyp1f1.

Otherwise, you can grab a FORTRAN implementation here. I believe the routine you want is CHGM. There are probably other implementations.


Solution 3:

I know this is an old question, but I would like to give a small comparison regarding efficiency when evaluating the Boys function $F_n(x)$. Below are some implementations (in Julia) with simple benchmarks.

In the end, I also give some useful approximations to the Boys function, both for small and large values of $x$.


For benchmarking, I'll use BenchmarkTools to measure the time it takes a particular implementation to calculate $F_1(x)$ for all $x \in \tilde{x}$, where $\tilde{x}$ is the 10000-point range below:

using BenchmarkTools

n = 1
x̃ = range(1e-15, 15, length=10000)

The implementations are listed in order, from fastest to slowest. (Of course, the benchmarks are only meaningful relative to one another; different computers will produce different absolute timings.)

Evaluations

Using the incomplete gamma function

As mentioned by Eric above, $F_n(x)$ can be written in terms of the lower incomplete gamma function $\gamma$ as

$$F_n(x) = \frac{\gamma\left(n + \frac{1}{2}, x\right)}{2 x^{n + \frac{1}{2}}} = \frac{\Gamma\left(n + \frac{1}{2}\right) P\left(n + \frac{1}{2}, x\right)}{2 x^{n + \frac{1}{2}}}$$

where $P(a, x) = \gamma(a, x) / \Gamma(a)$ is the regularized form returned by gamma_inc below.

using SpecialFunctions

# Incomplete gamma function implementation.
# gamma_inc(a, x, 0) returns the regularized pair (P(a, x), Q(a, x)).
boys2(n, x) = gamma(0.5 + n) * gamma_inc(0.5 + n, x, 0)[1] / (2x^(0.5 + n))
@benchmark boys2.(n, x̃)
BenchmarkTools.Trial: 
  memory estimate:  406.52 KiB
  allocs estimate:  1005
  --------------
  minimum time:     1.317 ms (0.00% GC)
  median time:      1.345 ms (0.00% GC)
  mean time:        1.380 ms (0.33% GC)
  maximum time:     2.880 ms (0.00% GC)
  --------------
  samples:          3614
  evals/sample:     1

Using quadratures

The simplest option is a quadrature of

$$\int_0^1 t^{2n} e^{-x t^2} dt$$

using QuadGK

# Adaptive Gauss-Kronrod quadrature implementation.
boys1(n, x) = quadgk(t -> t^(2n) * exp(-x * t^2), 0.0, 1.0)[1]
@benchmark boys1.(n, x̃)
BenchmarkTools.Trial: 
  memory estimate:  8.32 MiB
  allocs estimate:  326405
  --------------
  minimum time:     23.747 ms (0.00% GC)
  median time:      24.629 ms (0.00% GC)
  mean time:        25.839 ms (3.78% GC)
  maximum time:     45.680 ms (0.00% GC)
  --------------
  samples:          194
  evals/sample:     1

Using the confluent hypergeometric function

Yet another relation, mentioned by Joshua above, uses the Kummer confluent hypergeometric function:

$$F_n(x) = \frac{{_1}F_1 \left( \frac{1}{2} + n, \frac{3}{2} + n, -x \right)}{2n + 1}$$

using HypergeometricFunctions

# Confluent hypergeometric function implementation.
boys3(n, x) = pFq([0.5 + n], [1.5 + n], -x) / (2n + 1)
@benchmark boys3.(n, x̃)
BenchmarkTools.Trial: 
  memory estimate:  18.03 MiB
  allocs estimate:  162806
  --------------
  minimum time:     58.511 ms (0.00% GC)
  median time:      59.524 ms (0.00% GC)
  mean time:        60.007 ms (1.19% GC)
  maximum time:     65.223 ms (0.00% GC)
  --------------
  samples:          84
  evals/sample:     1

Approximations

Small values of $x$

We can expand the integrand $t^{2n} e^{-x t^2}$ as a series around $x = 0$ and obtain successive polynomial approximations to the Boys function. Below is a systematic way of doing that (using SymPy in Julia):

using SymPy

@syms x t n  # declare symbolic variables (note: this rebinds the Julia n from above)

integrand = t^(2n) * exp(-x * t^2)

order = 2  # Expansion order: the error term is O(x^order).
integrand_taylor = series(integrand, x, 0, order).removeO()
B = SymPy.integrate(integrand_taylor, (t, 0, 1)) |> simplify

$$B_n^{(2)}(x) = \frac{2n + 3 - (2n + 1)\, x}{(2n + 1)(2n + 3)} + O(x^2)$$

Such approximations can be used for small values of $x$ (they are safer for $x \approx 0$ than some implementations above). An approximation of order 10 is compared below with a quadrature of the Boys function.

(Figure: the Boys function compared with its order-10 small-$x$ approximation.)
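As a quick numerical sanity check of the order-2 expression (my own check, reusing the quadrature implementation from above):

using QuadGK

# Order-2 small-x approximation derived above.
boys_small2(n, x) = (2n + 3 - (2n + 1) * x) / ((2n + 1) * (2n + 3))

n, x = 1, 0.05
boys_small2(n, x), quadgk(t -> t^(2n) * exp(-x * t^2), 0.0, 1.0)[1]

# (≈ 0.32333, ≈ 0.32351): the difference is of order x^2, as expected.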

Large values of $x$

Since, for large $x$, the integrand is strongly damped well before $t = 1$, we can extend the upper limit to infinity and approximate

$$\int_0^1 t^{2n} e^{-x t^2} \, dt \approx \int_0^\infty t^{2n} e^{-x t^2} \, dt = \frac{\Gamma\left(n + \frac{1}{2}\right)}{2 x^{n + \frac{1}{2}}} = \frac{(2n - 1)!!}{2^{n + 1}} \sqrt{\frac{\pi}{x^{2n + 1}}}$$

(I learned this trick here.) Compare this to the Boys function below.

(Figure: the Boys function compared with its large-$x$ approximation.)
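And a similar check for the asymptotic formula (again my own numbers): at $x = 25$ it already agrees with the quadrature to about the size of the neglected tail, which is of order $e^{-x}$:

using SpecialFunctions, QuadGK

# Large-x approximation: F_n(x) ≈ Γ(n + 1/2) / (2 x^(n + 1/2)).
boys_large(n, x) = gamma(n + 0.5) / (2x^(n + 0.5))

n, x = 1, 25.0
boys_large(n, x), quadgk(t -> t^(2n) * exp(-x * t^2), 0.0, 1.0)[1]

# (≈ 0.0035449, ≈ 0.0035449): the neglected tail is about exp(-x) / (2x) ≈ 3e-13 here.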


Further reading

The above are obviously toy examples. Here are some nice reads:

  • Journal of Mathematical Chemistry 54, 2022-2047 (2016)
  • Journal of Computational Chemistry 36, 1390-1398 (2015)
  • Journal of Mathematical Chemistry 40, 179-183 (2006)