
Enable Gauss-Kronrod rules #15

Closed

jwscook opened this issue Sep 8, 2022 · 6 comments

Comments

@jwscook

jwscook commented Sep 8, 2022

I'd be interested to see if this package could compose with the QuadGK.jl package for Gauss-Kronrod rules. Such a rule embeds the Gauss points within a larger set of Kronrod points, each with their own weights. The extra Kronrod points both improve the accuracy of the overall scheme and provide an estimate of the error.

julia> FastGaussQuadrature.gausslegendre(2)
([-0.5773502691896258, 0.5773502691896258], [1.0, 1.0])

julia> QuadGK.kronrod(2)
([-0.9258200997725514, -0.5773502691896257, 0.0], [0.19797979797979798, 0.4909090909090911, 0.6222222222222223], [1.0000000000000002])
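
For reference, here is a minimal 1D sketch (names are my own) of how the embedded Gauss estimate falls out of the nonpositive halves returned by QuadGK.kronrod, with the Gauss/Kronrod difference acting as the error estimate:

using QuadGK

function gk_estimates(f, n)
  x, w, wg = QuadGK.kronrod(n)  # nonpositive half of the (2n + 1)-point rule; x[end] == 0
  # full Kronrod estimate: mirror every node except the one at zero
  kron = w[end] * f(x[end])
  for i in 1:length(x)-1
    kron += w[i] * (f(x[i]) + f(-x[i]))
  end
  # embedded Gauss estimate: the Gauss nodes sit at x[2:2:end]
  gauss = 0.0
  for (j, i) in enumerate(2:2:length(x))
    gauss += x[i] == 0 ? wg[j] * f(0.0) : wg[j] * (f(x[i]) + f(-x[i]))
  end
  return kron, abs(kron - gauss)  # value and error estimate
end

gk_estimates(x -> exp(-x^2), 7)  # integral over [-1, 1] plus an error estimate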

How easy would it be to accommodate this functionality?

@robertdj
Owner

robertdj commented Sep 8, 2022

It's been a while since I worked on this package, so I don't have an answer off the top of my head.
Does it work if you provide the nodes and weights from QuadGK.kronrod to sparsegrid?
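
Something like this is what I have in mind (just a sketch; kronrodrule is a placeholder name, and the nonpositive half returned by QuadGK.kronrod is mirrored into a full symmetric rule):

using QuadGK, SparseGrids

function kronrodrule(order)
  x, w, _ = QuadGK.kronrod(order)  # nonpositive Kronrod nodes and weights; x[end] == 0
  return vcat(x, -reverse(x[1:end-1])), vcat(w, reverse(w[1:end-1]))
end

nodes, weights = SparseGrids.sparsegrid(2, 3, kronrodrule)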

@jwscook
Author

jwscook commented Sep 8, 2022

I've given that a go, but it's both slower and less accurate than HCubature.jl:

using QuadGK, SparseGrids, HCubature

# Reconstruct the full order-point Gauss rule embedded in QuadGK's Kronrod rule.
# QuadGK.kronrod(order) returns only the nonpositive halves of the symmetric
# rule: Kronrod nodes, Kronrod weights and the embedded Gauss weights.
function gaussrule(order)
  allnodes, _, partialweights = QuadGK.kronrod(order)
  nodes = vcat(allnodes[2:2:end], -reverse(allnodes[2:2:end-isodd(order)]))
  weights = vcat(partialweights, reverse(partialweights[1:end-isodd(order)]))
  return nodes, weights
end

# Split the full Kronrod rule into two symmetric node sets carrying Kronrod
# weights: the embedded Gauss nodes (start index a = 2) and the Kronrod-only
# nodes (a = 1).
function _kronrodrule(order, a, b=1)
  allnodes, partialweights, _ = QuadGK.kronrod(order)
  mask1 = a:2:(order + 1 + isodd(order) * b)
  mask2 = a:2:(order - isodd(order))
  nodes = vcat(allnodes[mask1], -reverse(allnodes[mask2]))
  weights = vcat(partialweights[mask1], reverse(partialweights[mask2]))
  return nodes, weights
end

kronrodrulekronrod(order) = _kronrodrule(order, 1, 0)
kronrodrulegauss(order) = _kronrodrule(order, 2)

# Sparse-grid Gauss-Kronrod cubature of f over the box [xmin, xmax]: the
# difference between the Kronrod and embedded Gauss estimates serves as the
# error estimate. (rtol and atol are currently unused.)
function cubaturegk(f::F, xmin, xmax, order=5, rtol=sqrt(eps()), atol=eps()
    ) where F
  dims = length(xmin)

  gn, gw = SparseGrids.sparsegrid(dims, order, gaussrule)
  knk, kwk = SparseGrids.sparsegrid(dims, order, kronrodrulekronrod)
  _, kwg = SparseGrids.sparsegrid(dims, order, kronrodrulegauss)
  # the nodes from kronrodrulegauss coincide with gn, so they are discarded

  xlen = xmax .- xmin

  f1 = f(gn[1] .* xlen .+ xmin)

  gresult = f1 .* 0
  kresult = deepcopy(gresult)
  # accumulate the Gauss and Kronrod estimates over the shared Gauss nodes
  for i in eachindex(gn)
    fx = i == 1 ? f1 : f(gn[i] .* xlen .+ xmin)
    gresult += gw[i] * fx
    kresult += kwg[i] * fx
  end
  # the Kronrod-only nodes contribute to the Kronrod estimate alone
  for i in eachindex(kwk)
    fx = f(knk[i] .* xlen .+ xmin)
    kresult += kwk[i] * fx
  end

  changeofvariables = prod(xmax .- xmin) * 2^dims
  gresult .*= changeofvariables / 2
  kresult .*= changeofvariables / 2
  @show gresult, kresult, kresult - gresult

  return (kresult, abs.(kresult .- gresult))
end

f(x) = [exp(-sum(abs2, x))/sqrt(pi)^length(x), 1]

@show "3D"
I, E = cubaturegk(f, (-6,-6,-6), (6,6,6), 50)
@show I, E
@show HCubature.hcubature(f, (-6,-6,-6), (6,6,6))

@show "2D"
I, E = cubaturegk(f, (-6, -6), (6, 6), 100)
@show I, E
@show HCubature.hcubature(f, (-6, -6), (6, 6))

@show "1D"
I, E = cubaturegk(f, (-6,), (6,), 50)
@show I, E
@show HCubature.hcubature(f, (-6,), (6,))

Maybe it's not worth doing?

@robertdj
Owner

Sorry -- I've been caught up with other things.

What part is slower here: computing the nodes for the sparse grid, or evaluating the function? The former should only be done once, so it might be okay if that is slow.

Which quadrature rule requires fewer function evaluations is probably difficult to say a priori.
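
For instance (a hypothetical sketch, reusing the gaussrule from your snippet), the two costs could be timed separately:

using SparseGrids

@time nodes, weights = SparseGrids.sparsegrid(3, 5, gaussrule)  # setup: done once per grid
g(x) = exp(-sum(abs2, x))                                       # example integrand
@time sum(weights[i] * g(nodes[i]) for i in eachindex(nodes))   # evaluation: done per integral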

@jwscook
Author

jwscook commented Sep 17, 2022

I don't think there's anything wrong with this composition of QuadGK and SparseGrids, but the Genz-Malik cubature in HCubature is better suited to this problem.

@robertdj
Owner

I agree.

In general, I would expect the adaptive HCubature to be more precise. The benefit of SparseGrids is that for a given accuracy it may use fewer function evaluations. But that only helps if it is feasible for you to compute the weights and nodes in advance.

@jwscook
Author

jwscook commented Sep 18, 2022

QuadGK caches the nodes and weights, so they're pretty cheap to come by. Anyway, it was a nice experiment, but I don't think it should be taken any further. Thanks for your input.

@jwscook closed this as completed Sep 18, 2022