Implement more generic distributions on Manifolds #57
Comments
We can also add utilities to aid in implementing custom distributions. As an example, consider However, Of course, we could also consider a traits-based approach.
A great topic for discussion, although I'll probably have more questions than answers here. How do you sample from a general
In general you can sample exactly via rejection sampling. However, if the submanifold is a shell, like a sphere, this will essentially never produce a valid sample. It does work for bounded volumes like a ball, though, provided the density is sufficiently high. But if the distribution in the ambient space has a log density, and you have an MCMC technique for sampling points on the manifold, then you can get a correlated sample. We could have a utility function that designates whether rejection sampling is a suitable default.
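To make the distinction concrete, here is a hedged sketch (not Manifolds.jl API; all names are illustrative) of rejection sampling from the uniform distribution on the unit ball. It works because the ball has positive volume in the ambient space; for a measure-zero submanifold like the sphere, where `norm(x) == 1` exactly, the acceptance probability is zero and the loop would never terminate.

```julia
using LinearAlgebra, Random

# Illustrative sketch: rejection-sample the uniform distribution on the
# unit ball by drawing uniformly from the enclosing cube [-1, 1]^n and
# accepting points that land inside the ball.
function sample_ball_rejection(rng::AbstractRNG, n::Int)
    while true
        x = 2 .* rand(rng, n) .- 1   # uniform on the cube [-1, 1]^n
        norm(x) <= 1 && return x     # accept iff inside the ball
    end
end

x = sample_ball_rejection(MersenneTwister(42), 3)
```

Note that even here the acceptance rate is the ratio of the ball's volume to the cube's, which shrinks rapidly with dimension, matching the caveat above that the density must be sufficiently high.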
My intuition says that, more generally, if a constraint that defines a submanifold can be written with an equals sign, then rejection sampling is not applicable.
Right, but almost all manifolds we have are of measure zero in the ambient space. It wouldn't be a particularly useful distribution type here.
Yes, that's true for our current manifolds, but not necessarily for every manifold we or others will implement. 🙂 And that's only relevant for drawing exact samples. Because we can write down a log density up to proportionality, we can still sample from them with MCMC and therefore use them as priors in Bayesian inference using just the defaults. That's powerful enough to include them. And the most widely used distributions on our existing manifolds are restrictions, e.g. von Mises-Fisher and Bingham on

Each of these generic distributions will have trade-offs. We can exactly sample from

As long as the limitations are clearly documented, I don't see a downside to including them.
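A hedged sketch of the "restriction" idea (illustrative names, not the Manifolds.jl API): the unnormalized log density of a restricted distribution is just the ambient log density evaluated on the manifold. Restricting an isotropic ambient normal with mean `κ*μ` to the unit sphere drops the constant `-norm(x)^2/2` term and leaves the von Mises-Fisher kernel `κ*dot(μ, x)`, known only up to an additive constant, which is exactly what MCMC needs.

```julia
using LinearAlgebra

# Illustrative sketch: log density of an ambient isotropic normal
# N(κμ, I), restricted to the unit sphere, up to an additive constant.
# On the sphere, norm(x)^2 == 1 is constant, so only κ⟨μ, x⟩ survives.
unnormalized_logpdf_vmf(μ::AbstractVector, κ::Real, x::AbstractVector) =
    κ * dot(μ, x)

μ = [1.0, 0.0, 0.0]
unnormalized_logpdf_vmf(μ, 2.0, μ)   # maximized at the mean direction
```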
I have a short question on this topic: what's the easiest way to generate a random point on a manifold? Should we define a “default distribution” for certain manifolds? Today I noticed that while
I guess it depends on what you mean by "random". For a compact embedded manifold, the easiest way will be to sample with a multivariate normal in the ambient space and project to the manifold. For an abstract manifold, it's trickier. If you already have a point, you can sample from a multivariate normal in the tangent space. But neither of these is probably what one would naively expect when one uses

All that to say, if the point is just to give the user a point on the manifold, I think a function like
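The first option, sampling in the ambient space and projecting, can be sketched as follows (illustrative names, not the Manifolds.jl API). For the unit sphere the projection is `x / norm(x)`, and the rotational invariance of the standard normal makes the result exactly uniform on the sphere.

```julia
using LinearAlgebra, Random

# Illustrative sketch of a "projected" sampler: draw from a standard
# multivariate normal in the ambient space R^n, then project onto the
# unit sphere S^{n-1} by normalizing.
sample_sphere_projected(rng::AbstractRNG, n::Int) = normalize(randn(rng, n))

p = sample_sphere_projected(MersenneTwister(1), 3)
```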
Yes, I think I am mainly asking for the ones you mentioned; I just ran into this after 5 minutes of starting to port

For the real-valued case

Edit: Just for completeness,

When starting an algorithm I was often using some random initialisation (not to be confused with our allocation, really a valid point on
I am not saying my old approach should be transferred directly, but it would be neat (if possible for a manifold) to have some default random point generator. |
First proposal for a default, since I just stumbled upon it:

```julia
rand(M::Sphere) = rand(uniform_distribution(M, zeros(representation_size(M))))
```

I am not claiming that this is easy or straightforward, for sure. It would just be nice (for optimization algorithms, for example) to just have some point on `M` to start with (either uniform or from some Gaussian).
My more recent thinking on this is that for now at least, each manifold should have some default uniform measure/distribution, chosen to correspond to the most common notion of uniformity for that manifold, and normalized to be a probability measure whenever possible. This is important for providing
We can dispatch on all three of these only if a manifold distribution is somehow typed on its uniform distribution, but then we're departing from how Distributions usually works and moving more towards Measures.jl. In Distributions, every support implicitly assumes a base measure (see JuliaStats/Distributions.jl#224 (comment)), so for consistency we could just continue that behavior here until something fancier like Measures.jl is complete or there's high demand for different base measures. For every
And one more note, in order to have

edit: And @kellertuer, then we can get the function you want with the default
After further thought, I think it makes more sense to explicitly have |
Coming back to distributions (or my |
I think this is no longer relevant to Manifolds.jl since it's going to be solved in ManifoldMeasures.jl? |
It maybe should be moved to ManifoldMeasures.jl, which is carrying out the plan discussed here, but given how experimental that package is and that it itself could be discontinued, I'd hold off until it is released. |
This issue narrows the discussion in #3 to implementing distributions on Manifolds. It also continues some discussion from https://discourse.julialang.org/t/rfc-taking-directional-orientational-statistics-seriously/31951 and on the Slack channel.
There are a few generic distributions on manifolds that we'll want, with examples in directional statistics noted where available and not obvious:
- `Dirac`: Dirac measure; has 0 density everywhere but at a fixed point.
- `Mixture`: alias of `Distributions.MixtureModel` typed on our supports and variates
- `Uniform`: similar to `Distributions.Uniform` but defined on manifolds.
- `Product`: product distribution analogous to `Distributions.Product` on `ProductManifold`
- `Power`: product distribution on sub-manifolds of the power manifold; like `Product` but on a `PowerManifold`. Useful for cases like the matrix von Mises-Fisher on the Stiefel manifold, which can be composed from a product of von Mises-Fisher distributions with different means and concentrations on the spherical submanifolds
- `Projected`: distribution on the ambient space projected to the submanifold
- `Restricted`, `Conditioned`, or `Intersection`: distribution on the manifold resulting from the restriction of a distribution on the ambient space to the manifold
- `Retracted`: distribution on the tangent space at a point, retracted to the manifold
- `Riemannian`??: similar to `Retracted` but with an adjustment for curvature (name chosen because "Riemannian normal" is the only well-studied example of this I know of).
- `IsotropicDiffusion` or `Brownian`: distribution resulting from Brownian motion for a fixed time, aka the solution to the heat equation on the manifold

There are plenty of Lie-group-specific ones that we can address when we've finished implementing Lie groups, i.e. distributions resulting from quotients, actions, etc.
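As a sketch of what one of these generics might look like, here is a hedged `Retracted`-style sampler on the unit sphere (all names illustrative, not the Manifolds.jl API): sample a tangent vector at a base point `p` from an isotropic normal in the tangent space, then map it to the sphere with the exponential map (the geodesic retraction).

```julia
using LinearAlgebra, Random

# Exponential map on the unit sphere: for unit p and tangent X ⟂ p,
# exp_p(X) = cos(‖X‖) p + sin(‖X‖) X/‖X‖.
function exp_sphere(p::AbstractVector, X::AbstractVector)
    θ = norm(X)
    θ == 0 && return copy(p)
    return cos(θ) .* p .+ sin(θ) .* (X ./ θ)
end

# Illustrative "Retracted" sampler: isotropic normal in the tangent
# space at p (obtained by projecting an ambient draw), mapped back to
# the sphere via the exponential map.
function sample_retracted(rng::AbstractRNG, p::AbstractVector, σ::Real)
    v = σ .* randn(rng, length(p))
    X = v .- dot(v, p) .* p   # project the ambient draw onto the tangent space at p
    return exp_sphere(p, X)
end

q = sample_retracted(MersenneTwister(3), [0.0, 0.0, 1.0], 0.1)
```

A `Riemannian` variant would additionally reweight the density to account for the curvature-induced volume distortion of the exponential map.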
I've intentionally abbreviated the names above. While descriptive names like the ones we currently have (e.g. `ProductPointDistribution`) provide clarity, they're a major eyesore if one wants to use the distribution, e.g. in PPLs. We could have a `Distributions` submodule that contains these types so we can drop `Distribution` from the name, and we could use types to drop `Point`, etc., but I've not put much thought into this.

It's worth noting that implementing these generic distributions will handle nearly every case covered in the above Discourse post on directional statistics, and will also make it a lot easier to construct, for research purposes, new distributions that to my knowledge haven't been tested (e.g. a Riemannian Cauchy distribution).