
Change LinalgExt::EncodingAttr from enum to structured. #14336

Merged
merged 2 commits into from
Jul 11, 2023

Conversation

bjacob
Contributor

@bjacob bjacob commented Jul 7, 2023

This has long been discussed: the enum was only an initial design shortcut, and the number of enum cases was already getting uncomfortable due to the multiple dimensions involved.

This is a step in a chain towards fixing #11632. In order to properly specialize for narrow matmul cases in `MaterializeEncoding`, selecting adequately narrow matmul kernels that avoid widening the entire matmul problem at hand, we will need to know the original (pre-padding) matrix shape. Since `MaterializeEncoding` is a type conversion, not just an ordinary rewrite pattern, this information will need to be encoded in types --- we won't be able to walk from a value to its defining op to find the pre-padding value, because there simply are no values at that point. So I will want to add the pre-padding shape (or type) to `EncodingAttr`. This is a step towards that: once `EncodingAttr` is a data structure, it is easy to add another field. By contrast, if it stays an enum, the combinatorics get out of hand.
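To illustrate the direction (a hypothetical TableGen sketch with illustrative names, not the exact code in this PR): an enum needs one case per combination of dimensions, while a structured attribute gets a named field per dimension, so adding something like a pre-padding type later is just one more field:

```tablegen
// Enum style: one case per (user op) x (element-type tuple) combination;
// every new dimension multiplies the number of cases.
def MATMUL_F32F32F32   : I32EnumAttrCase<"MATMUL_F32F32F32", 0>;
def MATMUL_BF16BF16F32 : I32EnumAttrCase<"MATMUL_BF16BF16F32", 4>;

// Structured style: named parameters; a new dimension is one new field.
// Names below (user, role, originalType) are illustrative assumptions.
def EncodingAttr : AttrDef<IREELinalgExt_Dialect, "Encoding"> {
  let mnemonic = "encoding";
  let parameters = (ins
    "EncodingUserAttr":$user,   // which op consumes the tensor
    "EncodingRoleAttr":$role,   // operand position, e.g. LHS / RHS / RESULT
    "TypeAttr":$originalType    // pre-padding type, addable without
                                // touching any other field
  );
  let assemblyFormat = "`<` struct(params) `>`";
}
```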

@bjacob bjacob marked this pull request as ready for review July 7, 2023 19:05
@bjacob bjacob force-pushed the bjacob-encoding-structured branch from 1d2d79a to 23bd44c Compare July 7, 2023 19:35
@ScottTodd
Member

We had an issue with CI triggering. Can you sync this past 68350cb to run presubmits?

@bjacob bjacob force-pushed the bjacob-encoding-structured branch 3 times, most recently from b822caa to ba1ec4a Compare July 10, 2023 15:02
@bjacob
Contributor Author

bjacob commented Jul 10, 2023

@MaheshRavishankar @hanhanW do you want to take a look?

Contributor

@hanhanW hanhanW left a comment


Overall looks good to me, just a few style nits.

@bjacob bjacob force-pushed the bjacob-encoding-structured branch from ba1ec4a to 5da9f45 Compare July 10, 2023 18:04
@MaheshRavishankar
Contributor

> @MaheshRavishankar @hanhanW do you want to take a look?

I'll take a look. Could you wait till EoD for me to review? This is on top of my stack.

@bjacob
Contributor Author

bjacob commented Jul 10, 2023

> @MaheshRavishankar @hanhanW do you want to take a look?
>
> I'll take a look. Could you wait till EoD for me to review? This is on top of my stack.

@MaheshRavishankar - sure, no problem.

@bjacob bjacob force-pushed the bjacob-encoding-structured branch from 5da9f45 to 034c1c9 Compare July 10, 2023 18:39
Contributor

@MaheshRavishankar MaheshRavishankar left a comment


Posting some intermediate comments... The diff updated while I was commenting. I marked this as changes requested, but I am fine with landing it as is for the most part.

def MATMUL_BF16BF16F32 : I32EnumAttrCase<"MATMUL_BF16BF16F32", 4>;
def MATMUL_BF16BF16BF16 : I32EnumAttrCase<"MATMUL_BF16BF16BF16", 5>;

def EncodingOpKind
Contributor


Would it be better to say something like EncodingUserKind? Or just UserInfo, since it encodes information about the user of the tensor (in this case, the linalg.matmul)?

Contributor Author

@bjacob bjacob Jul 10, 2023


I have updated this PR with a tentative renaming along these lines, as a separate commit c658894, so it's easy to change to anything else, particularly since it was generated by a command line:

```shell
git grep --name-only -e '\(\bop_kind\b\|[oO]pKind\b\)' | grep -v dispatch_profiler | xargs sed -i 's/\bop_kind\b/user/g;s/opKind/user/g;s/OpKind/User/g'
```
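As a quick sanity check, the three sed substitutions cover the snake_case, camelCase, and PascalCase spellings. Here they are applied to a made-up sample line (not taken from the actual tree; GNU sed is assumed, since `\b` word boundaries are a GNU extension):

```shell
# Demonstrate the three substitutions on a hypothetical sample line.
echo 'EncodingAttr op_kind opKind OpKind' \
  | sed 's/\bop_kind\b/user/g;s/opKind/user/g;s/OpKind/User/g'
# prints: EncodingAttr user user User
```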

So this uses User rather than UserKind, but we can do that as well. See my comment at https://discord.com/channels/689900678990135345/690274711523164166/1128034873718878290

Contributor Author


Note: avoiding the Info terminology because that is already how we refer to the result of `MaterializeEncoding` (`MaterializeEncodingInfo` is a struct holding materialized tile sizes).

@bjacob bjacob force-pushed the bjacob-encoding-structured branch from 034c1c9 to c658894 Compare July 10, 2023 20:40
@bjacob bjacob requested a review from MaheshRavishankar July 10, 2023 20:58
@MaheshRavishankar
Contributor

Cool! Thanks Benoit for doing this!

@bjacob bjacob merged commit 0061355 into main Jul 11, 2023
@bjacob bjacob deleted the bjacob-encoding-structured branch July 11, 2023 00:32
nhasabni pushed a commit to plaidml/iree that referenced this pull request Aug 24, 2023
bjacob added a commit that referenced this pull request Oct 16, 2023
This has been discussed for a while: ever since #14336 made encodings a
data structure, it was an odd remnant that we were still encoding the
element types tuple in the user enum. This was cumbersome, and
resurfaced in every design discussion as it looked like something that
wasn't scaling with new data types. Concretely, this is good to fix
ahead of adding `i16xi16` and `i16xi4` data-tiling support (#15158).
bjacob added a commit to bjacob/iree that referenced this pull request Oct 16, 2023
ramiro050 pushed a commit to ramiro050/iree that referenced this pull request Dec 19, 2023