[SPARK-37507][SQL] Add a new SQL function to_binary #35415
Conversation
Looks fine from a cursory look. cc @cloud-fan, @MaxGekk and @gengliangwang in case you guys find some time to review.
```scala
@ExpressionDescription(
  usage = """
    _FUNC_(str[, fmt]) - Converts the input `str` to a binary value based on the supplied `fmt`.
      By default, the binary format for conversion is "hex" if `fmt` is omitted.
```
Setting `hex` as the default format, per the reference.
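To illustrate the documented default, here is a minimal Python sketch (not Spark code; the helper name `to_binary_default` is hypothetical) of interpreting the input as hex digits when `fmt` is omitted:

```python
def to_binary_default(s: str) -> bytes:
    # With fmt omitted, the documented default format is "hex":
    # the input string is read as hexadecimal digits.
    return bytes.fromhex(s)
```

For example, `to_binary_default("616263")` yields the three bytes of the ASCII string "abc".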
```
struct<>
-- !query output
org.apache.spark.sql.AnalysisException
cannot resolve 'to_binary('abc', 'invalidFormat')' due to data type mismatch: Unsupported encoding format: Some(invalidFormat). The format has to be a case-insensitive string literal of 'hex', 'utf-8', 'base2', or 'base64'; line 1 pos 7
```
Intentionally failed query for testing.
Sorry I rebased the PR so the commit history looks confusing, but only the two commits below are new. CC @HyukjinKwon

Merged to master.
```scala
val value = lit.eval()
if (value == null) Literal(null, BinaryType)
else {
  value.asInstanceOf[UTF8String].toString.toLowerCase(Locale.ROOT) match {
```
hmm, shall we check the type of `format` first? What happens to `to_binary('abc', 1)`?
Added the input check in #35533.
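The idea of that fix can be pictured with a small Python sketch (illustrative only, not the actual Scala change in #35533): validate that the format is a string before normalizing it, so a non-string format is reported as unsupported rather than failing with a cast error.

```python
def check_format(fmt):
    # Reject non-string formats up front, mirroring the idea of the
    # follow-up fix: to_binary('abc', 1) should report an unsupported
    # format instead of failing with a class-cast error.
    if not isinstance(fmt, str):
        raise ValueError(f"Unsupported encoding format: {fmt!r}")
    return fmt.lower()  # format matching is case-insensitive
```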
```sql
-- !query
select to_binary('abc', 'base2')
```
Should this be an error instead? For base2 it is expecting a string of 0s and 1s.
Please correct me if I'm wrong: my understanding is that `base2` (`binary`) should use the default encoding/decoding format, which happens to be `utf-8` here. We may also exclude `base2`, following what Snowflake's to_binary does.
Also CC @cloud-fan @HyukjinKwon @gengliangwang
let's exclude `base2` as its behavior is a bit arguable.
```scala
case "hex" => Unhex(expr)
case "utf-8" => Encode(expr, Literal("UTF-8"))
case "base64" => UnBase64(expr)
case "base2" => Cast(expr, BinaryType)
```
I think this is the same as `Encode(expr, Literal("UTF-8"))`, not sure it works for base2.
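The concern can be made concrete with a Python sketch (illustrative, not Spark internals): casting a string to binary yields its UTF-8 bytes, whereas a genuine base2 decoder would parse the '0'/'1' digits themselves.

```python
def cast_to_binary(s: str) -> bytes:
    # Roughly what Cast(expr, BinaryType) does for a string: its UTF-8 bytes.
    return s.encode("utf-8")

def base2_decode(bits: str) -> bytes:
    # What a true base2 decoder would do: parse groups of 8 binary digits.
    if len(bits) % 8 != 0 or set(bits) - {"0", "1"}:
        raise ValueError("expected a multiple-of-8 string of 0s and 1s")
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

For the input "01100001", the cast returns the eight ASCII bytes of the digit characters, while a real base2 decode returns the single byte 0x61 ('a'), which is why the `base2` behavior is arguable.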
Thank you @cloud-fan @entong! I will open a new PR to resolve the above comments.
### What changes were proposed in this pull request?
Adjust input `format` of function `to_binary`:
- gracefully fail for the non-string `format` parameter
- remove arguable `base2` format support

### Why are the changes needed?
Currently, function to_binary doesn't deal with the non-string `format` parameter properly. For example, `spark.sql("select to_binary('abc', 1)")` raises a casting error, rather than hinting that the encoding format is unsupported. In addition, the `base2` format is arguable as discussed [here](#35415 (comment)). We may exclude it following what Snowflake [to_binary](https://docs.snowflake.com/en/sql-reference/functions/to_binary.html) does for now.

### Does this PR introduce _any_ user-facing change?
Yes.
- Better error messages for a non-string `format` parameter. For example, from:
```
scala> spark.sql("select to_binary('abc', 1)")
org.apache.spark.sql.AnalysisException: class java.lang.Integer cannot be cast to class org.apache.spark.unsafe.types.UTF8String (java.lang.Integer is in module java.base of loader 'bootstrap'; org.apache.spark.unsafe.types.UTF8String is in unnamed module of loader 'app'); line 1 pos 7
```
to:
```
scala> spark.sql("select to_binary('abc', 1)")
org.apache.spark.sql.AnalysisException: cannot resolve 'to_binary('abc', 1)' due to data type mismatch: Unsupported encoding format: Some(1). The format has to be a case-insensitive string literal of 'hex', 'utf-8', 'base2', or 'base64'; line 1 pos 7;
```
- Removed `base2` format support:
```
scala> spark.sql("select to_binary('abc', 'base2')").show()
org.apache.spark.sql.AnalysisException: cannot resolve 'to_binary('abc', 'base2')' due to data type mismatch: Unsupported encoding format: Some(base2). The format has to be a case-insensitive string literal of 'hex', 'utf-8', or 'base64'; line 1 pos 7;
```

### How was this patch tested?
Unit test.

Closes #35533 from xinrong-databricks/to_binary_followup.

Authored-by: Xinrong Meng <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
### What changes were proposed in this pull request?
Introduce a SQL function `to_binary`: converts the input string to a binary value based on the supplied format (of how to interpret the string).

Syntax:
```
to_binary(str[, fmt])
```
where `fmt` can be a case-insensitive string literal of "hex", "utf-8", "base2", or "base64". By default, the binary format for conversion is "hex" if `fmt` is omitted.

### Why are the changes needed?
`to_binary` is a common function available in many DBMSes. Introducing it improves compatibility and the ease of migration.

In addition, `to_binary` can unify the existing Spark functions `encode`, `unhex`, `unbase64`, and `binary`, which makes the API easier to remember and use.

### Does this PR introduce any user-facing change?
Yes, a new function for the string to binary conversion with a specified format.

### How was this patch tested?
Unit test.
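The unified behavior described above can be sketched in Python (a hypothetical stand-in for the SQL function's semantics, not Spark's implementation or test suite): one function covering what `unhex`, `encode`, and `unbase64` do individually, with "hex" as the default and a case-insensitive format name.

```python
import base64

def to_binary(s: str, fmt: str = "hex") -> bytes:
    # Hypothetical stand-in for the SQL function's documented semantics.
    fmt = fmt.lower()               # format is case-insensitive
    if fmt == "hex":
        return bytes.fromhex(s)     # like unhex
    if fmt == "utf-8":
        return s.encode("utf-8")    # like encode(..., 'UTF-8')
    if fmt == "base64":
        return base64.b64decode(s)  # like unbase64
    raise ValueError(f"Unsupported encoding format: {fmt}")
```

Note that `base2` is omitted here, matching the follow-up PR #35533 that removed it.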