Adding multiple Statistical functions to paddle backend #10869
Conversation
@@ -72,7 +72,7 @@ def std(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.std(x, dim=axis, unbiased=False, keepdim=keepdims)
please review the input arguments and make sure that the output adheres to them
@@ -84,7 +84,7 @@ def sum(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return torch.sum(input=x, dim=axis, dtype=dtype, keepdim=keepdims)
use paddle implementation, and check that the args and kwargs are correct
@@ -96,7 +96,7 @@ def var(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.var(x, dim=axis, unbiased=False, keepdim=keepdims)
same comments as above
@MahmoudAshraf97
@@ -72,7 +72,7 @@ def std(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.std(x, axis=axis, unbiased=True, keepdim=keepdims)
you need to implement the correction parameter, the unbiased parameter in paddle sets correction = 1, but it needs to be a float, check the torch backend implementation as it's the closest one to paddle
@@ -96,7 +96,7 @@ def var(
     keepdims: bool = False,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.var(x, axis, unbiased=true, keepdim=keepdims)
same comment as std
@@ -112,7 +112,7 @@ def cumprod(
     dtype: Optional[paddle.dtype] = None,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.cumprod(x, dim=None, dtype=None, name=None)
you need to implement the `exclusive` and `reverse` arguments here
@@ -124,12 +124,12 @@ def cumsum(
     dtype: Optional[paddle.dtype] = None,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.cumsum(x, axis=axis, dtype=dtype)
same comment as cumprod
def einsum(
    equation: str,
    *operands: paddle.Tensor,
    out: Optional[paddle.Tensor] = None,
) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    return paddle.einsum(equation, *operands)
Please verify the results with another backend to make sure the implementation is correct
I recommend reading both the Ivy documentation and the paddle documentation when implementing a function, since some functions behave differently between frameworks; also check the other backends' implementations for reference.
    for a in axis:
        size *= x.shape[a]
    if size - correction <= 0:
        ret = paddle.var(x, axis==axis, unbiased=False, keepdim=keepdims)
syntax error
        ret = paddle.var(x, axis==axis, unbiased=False, keepdim=keepdims)
        ret = ivy.full(ret.shape, float("nan"), dtype=ret.dtype)
        return ret
    ret = torch.mul(
please correct this
@@ -112,7 +151,28 @@ def cumprod(
     dtype: Optional[paddle.dtype] = None,
     out: Optional[paddle.Tensor] = None,
 ) -> paddle.Tensor:
-    raise IvyNotImplementedException()
+    dtype = ivy.as_native_dtype(dtype)
please rewrite this function from scratch and ensure that it's free from syntax errors and arguments are passed correctly, or remove it from the PR
Added prod from Statistical to paddle backend