
Fix bug due to __props__ in OpFromGraph subclasses #981

Merged
merged 2 commits into from
Aug 24, 2024

Conversation

ricardoV94 (Member) commented Aug 24, 2024

When specified, Ops with identical __props__ are considered identical: they can be swapped and given the original inputs to obtain the same output.

Description

Distinct OpFromGraph instances were being reused incorrectly for different nodes during inner-graph cloning of a function. When __props__ is provided, Ops with the same __props__ are considered identical and can be swapped or recycled. It would be fine for OpFromGraph to define __props__, but it would then need to include the inner fgraph as well; and because __props__ is also used for hashing, that fgraph would have to be frozen, since hash keys can't be mutable. For now I removed __props__ from the two use cases.
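As a hedged illustration of the mechanism described above (using a hypothetical stand-in class, not the real pytensor Op), __props__-based equality makes two instances with the same props compare and hash equal even when their inner graphs differ, which is exactly what lets a cache keyed on the Op hand back the wrong clone:

```python
# Hypothetical stand-in for __props__-based equality (a sketch of the
# mechanism, not the real pytensor Op class).
class PropsOp:
    __props__ = ("name",)

    def __init__(self, name, inner_graph):
        self.name = name
        self.inner_graph = inner_graph  # NOT part of __props__

    def __eq__(self, other):
        return (
            type(self) is type(other)
            and tuple(getattr(self, p) for p in self.__props__)
            == tuple(getattr(other, p) for p in other.__props__)
        )

    def __hash__(self):
        return hash((type(self), tuple(getattr(self, p) for p in self.__props__)))


op_a = PropsOp("ofg", inner_graph="graph A")
op_b = PropsOp("ofg", inner_graph="graph B")  # different inner graph!

clone_d = {op_a: "clone of A"}
# Because __props__ ignores the inner graph, op_b hits op_a's cache entry:
print(clone_d.get(op_b))  # -> "clone of A", even though op_b is distinct
```

This is why an OpFromGraph with __props__ would also need to include its (frozen) fgraph in them: otherwise two different inner graphs collapse to the same cache key.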

The relevant function is

from typing import Optional, Union, cast

from pytensor.graph.basic import Apply, Variable
from pytensor.graph.op import Op


def clone_node_and_cache(
    node: Apply,
    clone_d: dict[Union[Apply, Variable, "Op"], Union[Apply, Variable, "Op"]],
    clone_inner_graphs=False,
    **kwargs,
) -> Apply | None:
    """Clone an `Apply` node and cache the results in `clone_d`.

    This function handles `Op` clones that are generated by inner-graph
    cloning.

    Returns
    -------
    ``None`` if all of `node`'s outputs are already in `clone_d`; otherwise,
    return the clone of `node`.
    """
    if all(out in clone_d for out in node.outputs):
        # If all of `node`'s outputs already have replacements or clones in
        # `clone_d`, then there's likely no need to clone it
        return None

    # Use a cached `Op` clone when available
    new_op: Op | None = cast(Optional["Op"], clone_d.get(node.op))

    cloned_inputs: list[Variable] = [cast(Variable, clone_d[i]) for i in node.inputs]

    new_node = node.clone_with_new_inputs(
        cloned_inputs,
        # Only clone inner-graph `Op`s when there isn't a cached clone (and
        # when `clone_inner_graphs` is enabled)
        clone_inner_graph=clone_inner_graphs if new_op is None else False,
        **kwargs,
    )

    if new_op:
        # If we didn't clone the inner-graph `Op` above, because there was a
        # cached version, set the cloned `Apply` to use the cached clone `Op`
        new_node.op = new_op

    clone_d[node] = new_node

    if new_node.op is not node.op:
        clone_d.setdefault(node.op, new_node.op)

    for old_o, new_o in zip(node.outputs, new_node.outputs):
        clone_d.setdefault(old_o, new_o)

    return new_node
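The caching pattern at the end of the function relies on dict.setdefault: a clone is recorded only the first time an Op (or output) is seen, so every later node whose Op hashes to the same key reuses that first clone. A minimal sketch (with placeholder string keys standing in for Ops):

```python
# setdefault records a value only if the key is absent, so the first
# clone registered for a given key "wins" on all later lookups.
clone_d = {}
clone_d.setdefault("op1", "clone_1")
clone_d.setdefault("op1", "clone_2")  # ignored; the first clone is kept
print(clone_d["op1"])  # -> clone_1
```

Combined with __props__-based hashing, this is the path by which two distinct OpFromGraph instances could end up sharing a single cached clone.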

@ricardoV94 ricardoV94 marked this pull request as ready for review August 24, 2024 11:49

codecov bot commented Aug 24, 2024

Codecov Report

Attention: Patch coverage is 75.00000% with 1 line in your changes missing coverage. Please review.

Project coverage is 81.72%. Comparing base (1509cee) to head (fbd1e4d).
Report is 110 commits behind head on main.

Files with missing lines     | Patch % | Lines
pytensor/tensor/einsum.py    | 50.00%  | 1 Missing ⚠️
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##             main     #981      +/-   ##
==========================================
+ Coverage   81.70%   81.72%   +0.02%     
==========================================
  Files         182      182              
  Lines       47596    47598       +2     
  Branches    11589    11589              
==========================================
+ Hits        38889    38901      +12     
+ Misses       6518     6508      -10     
  Partials     2189     2189              
Files with missing lines   | Coverage Δ
pytensor/tensor/basic.py   | 91.42% <100.00%> (+<0.01%) ⬆️
pytensor/tensor/einsum.py  | 96.41% <50.00%> (-0.50%) ⬇️

... and 4 files with indirect coverage changes


@ricardoV94 ricardoV94 merged commit 18d73db into pymc-devs:main Aug 24, 2024
59 of 60 checks passed