Implement a forward_max_backward_argmax() recipe #576

Draft
wants to merge 3 commits into base: master
Changes from 1 commit
Start to support Gaussian.reduce(ops.max, ...)
fritzo committed Oct 19, 2021

Verified: This commit was created on GitHub.com and signed with GitHub’s verified signature. The key has expired.
commit dffb4a7661dd92b4649031b06f2f9ff88cad50b9
29 changes: 29 additions & 0 deletions funsor/gaussian.py
@@ -965,6 +965,35 @@ def eager_reduce(self, op, reduced_vars):

             return Gaussian(white_vec, prec_sqrt, inputs)
 
+        elif op is ops.max:
+            # Marginalize out real variables, but keep mixtures lazy.
+            assert all(v in self.inputs for v in reduced_vars)
+            real_vars = frozenset(
+                k for k, d in self.inputs.items() if d.dtype == "real"
+            )
+            reduced_reals = reduced_vars & real_vars
+            reduced_ints = reduced_vars - real_vars
+            if reduced_ints:
+                raise NotImplementedError("TODO argmax over Gaussian mixtures")
+
+            inputs = OrderedDict(
+                (k, d) for k, d in self.inputs.items() if k not in reduced_reals
+            )
+            int_inputs = OrderedDict(
+                (k, v) for k, v in inputs.items() if v.dtype != "real"
+            )
+            if reduced_reals == real_vars:
+                if self.rank <= self.prec_sqrt.shape[-2]:
+                    return 0.0
+                # Otherwise compress.
+                white_vec, prec_sqrt, shift = _compress_rank(
+                    self.white_vec, self.prec_sqrt
+                )
+                return Tensor(shift, int_inputs)
+
+            # FIXME
+            raise NotImplementedError("TODO partial max")
+
         return None  # defer to default implementation
 
     def _sample(self, sampled_vars, sample_inputs, rng_key):
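
For readers unfamiliar with funsor, here is a minimal usage sketch of what this draft branch enables. It is not part of the commit; it assumes funsor's public API (set_backend, Bint, Reals, funsor.ops, Funsor.reduce) and the test helper funsor.testing.random_gaussian. The idea is that maximizing a Gaussian funsor over all of its real inputs collapses the quadratic-form log density to its per-batch peak (0.0 when the residual can be driven to zero, otherwise the shift from _compress_rank), while partial maxima and argmax over integer mixtures still raise NotImplementedError.

    # Usage sketch (illustrative, not part of this commit). Assumes funsor's
    # public API and the test helper funsor.testing.random_gaussian.
    from collections import OrderedDict

    import funsor
    import funsor.ops as ops
    from funsor import Bint, Reals
    from funsor.testing import random_gaussian

    funsor.set_backend("torch")  # or "numpy"/"jax", whichever is installed

    # A Gaussian funsor over one integer input "i" and one real input "x".
    g = random_gaussian(OrderedDict(i=Bint[3], x=Reals[2]))

    # Maximize out every real input. With the draft branch above this returns
    # either 0.0 or Tensor(shift, int_inputs), i.e. the per-"i" peak of the
    # quadratic form, instead of deferring to the default implementation.
    peak = g.reduce(ops.max, "x")

    # Maximizing over only some real inputs, or over the integer input "i",
    # still raises NotImplementedError in this draft.

The forward_max_backward_argmax() recipe named in the PR title would presumably combine such max reductions with a backward pass that recovers the argmax; that part is not in this commit.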