[SPARK-39476][SQL] Disable Unwrap cast optimize when casting from Long to Float/Double or from Integer to Float

Casting from Integer to Float or from Long to Double/Float may lose precision if the number of digits in the Integer/Long value exceeds the **significant digits** of a Double (which is 15 or 16 digits) or a Float (which is 7 or 8 digits).

For example, ```select *, cast(a as int) from (select cast(33554435 as float) a)``` gives `33554436` instead of `33554435`.
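
To see the same precision loss outside of SQL, here is a minimal sketch in plain Scala (illustration only, not part of this patch):
```
// 33554435 is larger than 2^24, so it does not fit in a Float's 24-bit significand
// and rounds to the nearest representable value.
val f = 33554435.toFloat
println(f)        // 3.3554436E7
println(f.toInt)  // 33554436, not 33554435
```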

When combined with the optimization rule `UnwrapCastInBinaryComparison`, this may produce an incorrect (confusing) result.
We can reproduce it with the following script.
```
spark.range(10).map(i => 64707595868612313L).createOrReplaceTempView("tbl")
val df = sql("select * from tbl where cast(value as double) = cast('64707595868612313' as double)")
df.explain(true)
df.show()
```

If we disable this optimization rule, it returns 10 records.
But with the rule enabled, it returns an empty result, since the SQL is optimized to
```
select * from tbl where value = 64707595868612312L
```
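
A minimal sketch in plain Scala (illustration only, not part of this patch) shows why the rewritten predicate can never match:
```
// 64707595868612313 needs more significant digits than a Double can hold,
// so it rounds when cast; the stored Long therefore never equals the
// rewritten literal, even though both collapse to the same Double.
val stored = 64707595868612313L
val unwrappedLiteral = stored.toDouble.toLong
println(unwrappedLiteral)                              // 64707595868612312
println(stored.toDouble == unwrappedLiteral.toDouble)  // true:  the original cast-based predicate matches
println(stored == unwrappedLiteral)                    // false: the unwrapped predicate matches nothing
```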

Fix the behavior that may confuse users (or is arguably a bug).

No user-facing change.

Added a new unit test.

Closes #36873 from WangGuangxin/SPARK-24994-followup.

Authored-by: wangguangxin.cn <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
(cherry picked from commit 9612db3)
Signed-off-by: Wenchen Fan <[email protected]>
WangGuangxin authored and cloud-fan committed Jun 16, 2022
1 parent f23a544 commit 380177d
Showing 2 changed files with 42 additions and 1 deletion.
```
@@ -359,7 +359,17 @@ object UnwrapCastInBinaryComparison extends Rule[LogicalPlan] {
       !fromExp.foldable &&
       fromExp.dataType.isInstanceOf[NumericType] &&
       toType.isInstanceOf[NumericType] &&
-      Cast.canUpCast(fromExp.dataType, toType)
+      canUnwrapCast(fromExp.dataType, toType)
   }
 
+  private def canUnwrapCast(from: DataType, to: DataType): Boolean = (from, to) match {
+    // SPARK-39476: It's not safe to unwrap cast from Integer to Float or from Long to Float/Double,
+    // since the length of Integer/Long may exceed the significant digits of Float/Double.
+    case (IntegerType, FloatType) => false
+    case (LongType, FloatType) => false
+    case (LongType, DoubleType) => false
+    case _ if from.isInstanceOf[NumericType] => Cast.canUpCast(from, to)
+    case _ => false
+  }
+
   private[optimizer] def getRange(dt: DataType): Option[(Any, Any)] = dt match {
```
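
A side note (my own reading of the change, not from the commit itself): casts from Integer to Double are still unwrapped because every 32-bit Integer fits exactly in a Double's 53-bit significand, so only the three lossy pairs above are disabled. A quick plain-Scala check:
```
// Int -> Double always round-trips exactly; Int -> Float does not once the
// value needs more than 24 significant bits.
val i = 33554433                  // 2^25 + 1, needs 26 significant bits
println(i.toDouble.toInt == i)    // true
println(i.toFloat.toInt == i)     // false (rounds to 33554432)
```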
```
@@ -190,5 +190,36 @@ class UnwrapCastInComparisonEndToEndSuite extends QueryTest with SharedSparkSess
     }
   }
 
+  test("SPARK-39476: Should not unwrap cast from Long to Double/Float") {
+    withTable(t) {
+      Seq((6470759586864300301L))
+        .toDF("c1").write.saveAsTable(t)
+      val df = spark.table(t)
+
+      checkAnswer(
+        df.where("cast(c1 as double) == cast(6470759586864300301L as double)")
+          .select("c1"),
+        Row(6470759586864300301L))
+
+      checkAnswer(
+        df.where("cast(c1 as float) == cast(6470759586864300301L as float)")
+          .select("c1"),
+        Row(6470759586864300301L))
+    }
+  }
+
+  test("SPARK-39476: Should not unwrap cast from Integer to Float") {
+    withTable(t) {
+      Seq((33554435))
+        .toDF("c1").write.saveAsTable(t)
+      val df = spark.table(t)
+
+      checkAnswer(
+        df.where("cast(c1 as float) == cast(33554435 as float)")
+          .select("c1"),
+        Row(33554435))
+    }
+  }
+
   private def decimal(v: BigDecimal): Decimal = Decimal(v, 5, 2)
 }
```
