
Adding multiple Statistical functions to paddle backend #10869

Merged: 11 commits into ivy-llc:PaddlePaddle on Feb 25, 2023

Conversation

KevinUnadkat
Contributor

Added prod from the Statistical functions to the paddle backend.

KevinUnadkat (Contributor, Author) commented Feb 23, 2023

Related issues:
- prod #10868
- var #10875
- sum #10874
- std #10873

KevinUnadkat (Contributor, Author)

Hi @MahmoudAshraf97, please review this file; it is probably ready to merge.

@KevinUnadkat changed the title from "prod" to "Adding multiple Statistical functions to paddle backend" on Feb 23, 2023
@@ -72,7 +72,7 @@ def std(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.std(x, dim=axis, unbiased=False, keepdim=keepdims)
Contributor:

Please review the input arguments and make sure that the output adheres to them.

@@ -84,7 +84,7 @@ def sum(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return torch.sum(input=x, dim=axis, dtype=dtype, keepdim=keepdims)
Contributor:

Use the paddle implementation, and check that the args and kwargs are correct.
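As a minimal sketch of the paddle-native version being requested (the signature is taken from the stub above, `out` handling is omitted, and `axis`/`keepdim` are paddle.sum's documented keywords; this is not the PR's final code):

```python
import paddle
from typing import Optional, Sequence, Union


def sum(
    x: paddle.Tensor,
    /,
    *,
    axis: Optional[Union[int, Sequence[int]]] = None,
    dtype: Optional[paddle.dtype] = None,
    keepdims: bool = False,
) -> paddle.Tensor:
    # paddle.sum takes `axis` and `keepdim`, unlike torch.sum's `input`/`dim`.
    return paddle.sum(x, axis=axis, dtype=dtype, keepdim=keepdims)
```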

@@ -96,7 +96,7 @@ def var(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.var(x, dim=axis, unbiased=False, keepdim=keepdims)
Contributor:

Same comments as above.

KevinUnadkat (Contributor, Author)

@MahmoudAshraf97 I made the changes; everything seems good now.

@@ -72,7 +72,7 @@ def std(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.std(x, axis=axis, unbiased=True, keepdim=keepdims)
Contributor:

You need to implement the correction parameter. The unbiased parameter in paddle sets correction = 1, but correction needs to be a float; check the torch backend implementation, as it's the closest one to paddle.
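For illustration, a minimal sketch of how a float correction could be layered on top of paddle.std (the signature mirrors the stub above; the sqrt rescaling of the uncorrected std is an assumption about the intended approach, not the PR's final code):

```python
import math
import paddle
from typing import Optional, Sequence, Union


def std(
    x: paddle.Tensor,
    /,
    *,
    axis: Optional[Union[int, Sequence[int]]] = None,
    correction: Union[int, float] = 0.0,
    keepdims: bool = False,
) -> paddle.Tensor:
    # Start from the uncorrected (population) std, since paddle's
    # `unbiased` flag only supports correction == 1.
    ret = paddle.std(x, axis=axis, unbiased=False, keepdim=keepdims)
    if correction:
        # Count the elements reduced over.
        if axis is None:
            n = math.prod(x.shape)
        else:
            axes = (axis,) if isinstance(axis, int) else axis
            n = math.prod(x.shape[a] for a in axes)
        # std_corrected = std_uncorrected * sqrt(n / (n - correction)).
        # (Guarding n - correction <= 0 is discussed later in this thread.)
        ret = ret * math.sqrt(n / (n - correction))
    return ret
```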

@@ -96,7 +96,7 @@ def var(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.var(x, axis, unbiased=true, keepdim=keepdims)
Contributor:

Same comment as std.

@@ -112,7 +112,7 @@ def cumprod(
     dtype: Optional[paddle.dtype] = None,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.cumprod(x, dim=None, dtype=None, name=None)
Contributor:

You need to implement the exclusive and reverse arguments here.
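For illustration, a minimal sketch of one way exclusive and reverse could be built on top of paddle.cumprod (the flip/concat approach and the shape handling are assumptions, not the PR's final code; paddle.cumprod, paddle.flip, paddle.slice, and paddle.concat are documented paddle ops):

```python
import paddle
from typing import Optional


def cumprod(
    x: paddle.Tensor,
    /,
    *,
    axis: int = 0,
    exclusive: bool = False,
    reverse: bool = False,
    dtype: Optional[paddle.dtype] = None,
) -> paddle.Tensor:
    if dtype is not None:
        x = paddle.cast(x, dtype)
    axis = axis % len(x.shape)  # normalize negative axes
    if reverse:
        # Accumulate from the far end by flipping before and after.
        x = paddle.flip(x, axis=[axis])
    ret = paddle.cumprod(x, dim=axis)
    if exclusive:
        # Shift the running products one step and seed with ones, so
        # position i holds the product of elements strictly before i.
        dim = x.shape[axis]
        pad_shape = list(x.shape)
        pad_shape[axis] = 1
        ones = paddle.ones(pad_shape, dtype=ret.dtype)
        head = paddle.slice(ret, axes=[axis], starts=[0], ends=[dim - 1])
        ret = paddle.concat([ones, head], axis=axis)
    if reverse:
        ret = paddle.flip(ret, axis=[axis])
    return ret
```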

@@ -124,12 +124,12 @@ def cumsum(
     dtype: Optional[paddle.dtype] = None,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.cumsum(x, axis=axis, dtype=dtype)
Contributor:

Same comment as cumprod.



 def einsum(
     equation: str,
     *operands: paddle.Tensor,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.einsum(equation, *operands)
Contributor:

Please verify the results against another backend to make sure the implementation is correct.
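As a sketch of the kind of cross-backend check being requested (ivy.set_backend, ivy.einsum, and ivy.to_numpy are ivy's documented API; the concrete equation and shapes here are arbitrary):

```python
import numpy as np
import ivy

x = np.random.rand(2, 3).astype("float32")
y = np.random.rand(3, 4).astype("float32")

results = {}
for backend in ("paddle", "numpy"):
    ivy.set_backend(backend)
    # Route the same einsum call through each backend.
    res = ivy.einsum("ij,jk->ik", ivy.array(x), ivy.array(y))
    results[backend] = ivy.to_numpy(res)
    ivy.unset_backend()

assert np.allclose(results["paddle"], results["numpy"], atol=1e-6)
```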

MahmoudAshraf97 (Contributor)

I recommend reading both the Ivy documentation and the paddle documentation when implementing a function, so you know the differences in implementation, because some functions work differently; also check other backends' implementations for reference.

    for a in axis:
        size *= x.shape[a]
    if size - correction <= 0:
        ret = paddle.var(x, axis==axis, unbiased=False, keepdim=keepdims)
Contributor:

Syntax error: `axis==axis` is a comparison, not the keyword argument `axis=axis`.

        ret = paddle.var(x, axis==axis, unbiased=False, keepdim=keepdims)
        ret = ivy.full(ret.shape, float("nan"), dtype=ret.dtype)
        return ret
    ret = torch.mul(
Contributor:

Please correct this.
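For illustration, a corrected sketch of the guard logic the snippet above seems to be attempting (the signature and the size/(size - correction) rescaling are assumptions, not the PR's final code; paddle.full replaces ivy.full, and the stray torch.mul is dropped since torch has no place in the paddle backend):

```python
import paddle
from typing import Optional, Sequence, Union


def var(
    x: paddle.Tensor,
    /,
    *,
    axis: Optional[Union[int, Sequence[int]]] = None,
    correction: Union[int, float] = 0.0,
    keepdims: bool = False,
) -> paddle.Tensor:
    if axis is None:
        axis = list(range(len(x.shape)))
    elif isinstance(axis, int):
        axis = [axis]
    # Number of elements reduced over.
    size = 1
    for a in axis:
        size *= x.shape[a]
    ret = paddle.var(x, axis=axis, unbiased=False, keepdim=keepdims)
    if size - correction <= 0:
        # The variance is undefined for this correction; return NaNs of
        # the reduced shape, as the snippet above intends.
        return paddle.full(ret.shape, float("nan"), dtype=ret.dtype)
    # Rescale the uncorrected variance: var * size / (size - correction).
    return ret * (size / (size - correction))
```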

@@ -112,7 +151,28 @@ def cumprod(
     dtype: Optional[paddle.dtype] = None,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    dtype = ivy.as_native_dtype(dtype)
Contributor:

Please rewrite this function from scratch and ensure that it's free from syntax errors and that arguments are passed correctly, or remove it from the PR.

@MahmoudAshraf97 merged commit dabafba into ivy-llc:PaddlePaddle on Feb 25, 2023