Adding cumsum_ #10487

Merged: 9 commits into ivy-llc:master on Feb 16, 2023
Conversation

@soma2000-lang (Contributor) commented on Feb 11, 2023
@ivy-leaves added the PyTorch Frontend label (Developing the PyTorch Frontend, checklist triggered by commenting add_frontend_checklist) on Feb 11, 2023.
@guillesanbri (Contributor) left a review:

Hi, thanks for the contribution! I've left a couple of comments requesting changes; let me know if you need anything!

@@ -525,6 +525,11 @@ def flatten(self, start_dim, end_dim):
     def cumsum(self, dim, dtype):
         return torch_frontend.cumsum(self._ivy_array, dim, dtype=dtype)

+    @with_unsupported_dtypes({"1.11.0 and below": ("float16",)}, "torch")
+    def cumsum_(self, dim, dtype):
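The hunk is truncated above. As a hedged sketch, an in-place body that mirrors the out-of-place cumsum directly above it and writes the result back might look like the following (an assumed pattern, not necessarily the merged code):

@with_unsupported_dtypes({"1.11.0 and below": ("float16",)}, "torch")
def cumsum_(self, dim, dtype):
    # In-place variant (sketch): compute the cumulative sum, then overwrite
    # this tensor's backing ivy array with the result and return self.
    self._ivy_array = torch_frontend.cumsum(self._ivy_array, dim, dtype=dtype).ivy_array
    return self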
def test_torch_instance_cumsum_(
    dtype_value,
    dim,
    dtypes,
@guillesanbri (Contributor): Seems like this is already generated in dtype_value.

@soma2000-lang (Author): @guillesanbri So we need to remove dtype_value from the arguments, right?

@guillesanbri (Contributor), Feb 15, 2023: Actually, dtypes has to be removed from the args. You are generating dtype_value and dim in the decorator (which is fine), but the function is expecting dtype_value, dim, and dtypes.
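In other words, the parameter list should contain only what the strategies in the decorator actually generate. A minimal sketch of the corrected signature (decorator and any extra test fixtures elided; the (dtypes, values) shape of dtype_value is assumed from the usual output of ivy's helpers.dtype_and_values):

def test_torch_instance_cumsum_(
    dtype_value,  # (input_dtype, x) pair generated by helpers.dtype_and_values
    dim,          # generated by the dimension strategy in the decorator
    # "dtypes" removed: nothing in the decorator generates it
):
    input_dtype, x = dtype_value  # unpack the generated dtypes and values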

method_input_dtypes=dtypes,
method_all_as_kwargs_np={
    "dim": dim,
    "dtype": dtypes[0],
@guillesanbri (Contributor): dtypes is not being generated; instead, use input_dtype.
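A sketch of the corrected fragment, mirroring the snippet above (input_dtype is assumed to be the dtype list unpacked from dtype_value, as in the signature sketch earlier):

# inside the surrounding frontend-method test call:
method_input_dtypes=input_dtype,
method_all_as_kwargs_np={
    "dim": dim,
    "dtype": input_dtype[0],
},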

@soma2000-lang (Author): @guillesanbri Sorry for the trouble; made the changes but it's still failing!

@guillesanbri (Contributor) left a review:

Just left a couple more changes; should be good to go after this, thanks!

init_tree="torch.tensor",
method_name="cumsum_",
dtype_value=helpers.dtype_and_values(
    available_dtypes=helpers.get_dtypes("valid"),
@guillesanbri (Contributor): Seems like the ground truth does not support bool; you can change "valid" to "numeric" here.
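The suggested change, as a fragment of the decorator (helpers.get_dtypes is the same ivy helper already used above; only the dtype class is swapped, and the strategy's other arguments are elided):

dtype_value=helpers.dtype_and_values(
    # "numeric" excludes bool, which the ground-truth torch
    # implementation reportedly does not support here
    available_dtypes=helpers.get_dtypes("numeric"),
),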

@@ -525,6 +525,11 @@ def flatten(self, start_dim, end_dim):
     def cumsum(self, dim, dtype):
         return torch_frontend.cumsum(self._ivy_array, dim, dtype=dtype)

+    @with_unsupported_dtypes({"1.11.0 and below": ("float16",)}, "torch")
+    def cumsum_(self, dim, dtype=None):
@guillesanbri (Contributor): dtype has to be a keyword argument; in order for the test to detect this automatically, you should add *, between dim and dtype (i.e. def cumsum_(self, dim, *, dtype=None):).
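For illustration, the bare * makes every parameter after it keyword-only, which is what lets the test machinery detect dtype as a keyword argument. A minimal self-contained demo (toy class, not ivy code):

class T:
    def cumsum_(self, dim, *, dtype=None):
        return (dim, dtype)

t = T()
t.cumsum_(0, dtype="float32")  # OK: dtype passed by keyword
# t.cumsum_(0, "float32")      # TypeError: takes 2 positional arguments but 3 were given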

@soma2000-lang (Author): @guillesanbri Done! Thanks!

@guillesanbri (Contributor): LGTM! I'll merge it now, thanks for the contribution! 😄

@guillesanbri merged commit 513f15d into ivy-llc:master on Feb 16, 2023.
vedpatwardhan pushed a commit to vedpatwardhan/ivy that referenced this pull request on Feb 23, 2023, and two more on Feb 26, 2023.