
[SPARK-39476][SQL] Disable Unwrap cast optimize when casting from Long to Float/ Double or from Integer to Float #36873

Closed
wants to merge 3 commits

Conversation

WangGuangxin
Contributor

What changes were proposed in this pull request?

Cast from Integer to Float or from Long to Double/Float may lose precision if the Integer/Long value exceeds the significant digits of a Double (15 or 16 digits) or a Float (7 or 8 digits).

For example, `select *, cast(a as int) from (select cast(33554435 as float) a)` gives `33554436` instead of `33554435`.
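The rounding behind this example can be reproduced in plain Scala (SQL `FLOAT` is a 32-bit IEEE 754 float):

```scala
// A Float keeps 24 significand bits, so not every Int above 2^24 (16777216)
// is representable. 33554435 (2^25 + 3) sits exactly between two representable
// neighbors, and round-half-to-even picks 33554436.
val f: Float = 33554435
println(f.toInt) // 33554436, not 33554435
```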

When the optimization rule `UnwrapCastInBinaryComparison` applies, it may produce an incorrect result.
We can reproduce it with the following script.

spark.range(10).map(i => 64707595868612313L).createOrReplaceTempView("tbl")
val df = sql("select * from tbl where cast(value as double) = cast('64707595868612313' as double)")
df.explain(true)
df.show()

If we disable this optimization rule, the query returns 10 records.
But if we enable it, the query returns an empty result, since the SQL is optimized to

select * from tbl where value = 64707595868612312L
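The rewritten literal comes from round-tripping the Long through Double. A Double keeps 53 significand bits while 64707595868612313 needs 56, so the cast rounds it to the nearest multiple of 8, and the unwrapped Long comparison no longer matches:

```scala
val v = 64707595868612313L
// Casting to Double rounds away the low 3 bits (53-bit significand vs 56 needed).
println(v.toDouble.toLong) // 64707595868612312

// The original predicate compares Doubles, where both sides round identically:
println(v.toDouble == "64707595868612313".toDouble) // true

// The unwrapped predicate compares Longs, where the rounded literal differs:
println(v == v.toDouble.toLong) // false
```

Before this fix, one possible workaround was to exclude the rule via `spark.sql.optimizer.excludedRules`, assuming `org.apache.spark.sql.catalyst.optimizer.UnwrapCastInBinaryComparison` as the rule's fully qualified name.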

Why are the changes needed?

Fix behavior that may confuse users (arguably a bug).

Does this PR introduce any user-facing change?

No

How was this patch tested?

Add a new UT

@github-actions github-actions bot added the SQL label Jun 15, 2022
@WangGuangxin WangGuangxin force-pushed the SPARK-24994-followup branch from 8ff6461 to cce6c87 Compare June 15, 2022 03:59
@ulysses-you
Contributor

You should file a new ticket since SPARK-24994 is already released.

@WangGuangxin WangGuangxin changed the title [SPARK-24994][SQL][FOLLOW-UP] Disable Unwrap cast optimize when casting from Long to Float/ Double or from Integer to Float [SPARK-39476][SQL] Disable Unwrap cast optimize when casting from Long to Float/ Double or from Integer to Float Jun 15, 2022
@WangGuangxin
Contributor Author

You should file a new ticket since SPARK-24994 is already released.

Thanks for the reminder, updated. @ulysses-you @sunchao @dongjoon-hyun @cloud-fan please help review this.


private def canUnwrapCast(from: DataType, to: DataType): Boolean = (from, to) match {
case (BooleanType, _) => true
case (IntegerType, FloatType) => false
Contributor

can we add some code comments to explain the reason?
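For illustration, the requested comment might read like this. This is a self-contained sketch with stand-in types, not the merged code; the `LongType` cases and the fallback are assumptions based on the PR title (the real method lives in `UnwrapCastInBinaryComparison` and defers to Spark's cast rules):

```scala
// Stand-in type tags for this sketch; Spark defines its own DataType hierarchy.
sealed trait DataType
case object BooleanType extends DataType
case object IntegerType extends DataType
case object LongType extends DataType
case object FloatType extends DataType
case object DoubleType extends DataType

// Unwrapping `cast(e AS to) = literal` into `e = castBack(literal)` is only
// safe when `from -> to` is lossless for every value. Int -> Float and
// Long -> Float/Double are not: a Float keeps 24 significand bits and a
// Double keeps 53, so large Int/Long values are rounded and the cast-back
// literal can differ from the value the user compared against.
def canUnwrapCast(from: DataType, to: DataType): Boolean = (from, to) match {
  case (BooleanType, _) => true
  case (IntegerType, FloatType) => false
  case (LongType, FloatType) | (LongType, DoubleType) => false
  case _ => true // simplified; the real rule applies Spark's up-cast check here
}
```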

Contributor

@cloud-fan cloud-fan left a comment

good catch!

@AmplabJenkins

Can one of the admins verify this patch?

Member

@sunchao sunchao left a comment

Good catch! Thanks @WangGuangxin!

@cloud-fan cloud-fan closed this in 9612db3 Jun 16, 2022
cloud-fan pushed a commit that referenced this pull request Jun 16, 2022
…g to Float/ Double or from Integer to Float

### What changes were proposed in this pull request?
Cast from Integer to Float or from Long to Double/Float may lose precision if the Integer/Long value exceeds the **significant digits** of a Double (15 or 16 digits) or a Float (7 or 8 digits).

For example, ```select *, cast(a as int) from (select cast(33554435 as float) a)``` gives `33554436` instead of `33554435`.

When the optimization rule `UnwrapCastInBinaryComparison` applies, it may produce an incorrect result.
We can reproduce it with the following script.
```
spark.range(10).map(i => 64707595868612313L).createOrReplaceTempView("tbl")
val df = sql("select * from tbl where cast(value as double) = cast('64707595868612313' as double)")
df.explain(true)
df.show()
```

If we disable this optimization rule, the query returns 10 records.
But if we enable it, the query returns an empty result, since the SQL is optimized to
```
select * from tbl where value = 64707595868612312L
```

### Why are the changes needed?
Fix behavior that may confuse users (arguably a bug).

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Add a new UT

Closes #36873 from WangGuangxin/SPARK-24994-followup.

Authored-by: wangguangxin.cn <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
(cherry picked from commit 9612db3)
Signed-off-by: Wenchen Fan <[email protected]>
cloud-fan pushed a commit that referenced this pull request Jun 16, 2022
…g to Float/ Double or from Integer to Float

(cherry picked from commit 9612db3)
Signed-off-by: Wenchen Fan <[email protected]>
cloud-fan pushed a commit that referenced this pull request Jun 16, 2022
…g to Float/ Double or from Integer to Float

(cherry picked from commit 9612db3)
Signed-off-by: Wenchen Fan <[email protected]>
@cloud-fan
Contributor

thanks, merging to master/3.3/3.2/3.1!

sunchao pushed a commit to sunchao/spark that referenced this pull request Jun 2, 2023
…g to Float/ Double or from Integer to Float

Closes apache#36873 from WangGuangxin/SPARK-24994-followup.

(cherry picked from commit 9612db3)
Signed-off-by: Wenchen Fan <[email protected]>