
[SPARK-47977] DateTimeUtils.timestampDiff and DateTimeUtils.timestampAdd should not throw INTERNAL_ERROR exception #46210

Closed
wants to merge 6 commits

Conversation

vitaliili-db
Contributor

What changes were proposed in this pull request?

Convert the `INTERNAL_ERROR` thrown by `timestampAdd` and `timestampDiff` to an error with a proper error class, reusing the `INVALID_PARAMETER_VALUE.DATETIME_UNIT` class already used when parsing expressions.

The change is needed since `timestampDiff` and `timestampAdd` expressions can be constructed without going through the parser; e.g., PySpark creates `timestampDiff` through `PythonSQLUtils`.
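The fix follows a simple pattern: catch the `MatchError` raised by an unhandled unit and rethrow a classified exception instead of an internal error. Below is a minimal, self-contained sketch of that pattern, not Spark's actual internals: `ClassifiedException`, the unit handlers, and the seconds-based arithmetic are simplified stand-ins for Spark's error-class machinery and `LocalDateTime`-based logic.

```scala
object TimestampAddSketch {
  // Hypothetical stand-in for Spark's error-class exceptions; the real code
  // goes through QueryExecutionErrors.invalidDatetimeUnitError.
  final case class ClassifiedException(errorClass: String, message: String)
      extends RuntimeException(s"[$errorClass] $message")

  // Seconds-based arithmetic keeps the sketch self-contained; Spark's
  // DateTimeUtils dispatches on LocalDateTime values per unit.
  def timestampAdd(unit: String, quantity: Long, ts: Long): Long =
    try {
      unit.toUpperCase match {
        case "HOUR" => ts + quantity * 3600L
        case "DAY"  => ts + quantity * 86400L
      }
    } catch {
      // Before the PR an unknown unit surfaced as an opaque INTERNAL_ERROR;
      // after it, the caller gets a user-facing error class instead.
      case _: MatchError =>
        throw ClassifiedException(
          "INVALID_PARAMETER_VALUE.DATETIME_UNIT",
          s"Invalid unit '$unit' for TIMESTAMPADD.")
    }
}
```

This matters for callers that bypass the parser (such as PySpark): they hit the runtime dispatch directly, so the runtime path must raise a classified error on its own.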

Why are the changes needed?

Correct classification of errors

Does this PR introduce any user-facing change?

No

How was this patch tested?

Existing unit tests were updated with the new error class.

Was this patch authored or co-authored using generative AI tooling?

No

@github-actions github-actions bot added the SQL label Apr 24, 2024
@vitaliili-db
Contributor Author

@MaxGekk please take a look

@vitaliili-db
Contributor Author

@cloud-fan please review

```diff
@@ -698,7 +698,7 @@ object DateTimeUtils extends SparkDateTimeUtils {
       }
     } catch {
       case _: scala.MatchError =>
-        throw SparkException.internalError(s"Got the unexpected unit '$unit'.")
+        throw QueryExecutionErrors.invalidDatetimeUnitError("timestampAdd", unit)
```
Member

nit: TIMESTAMPADD

Contributor Author

Done

```diff
@@ -736,7 +736,7 @@ object DateTimeUtils extends SparkDateTimeUtils {
       val endLocalTs = getLocalDateTime(endTs, zoneId)
       timestampDiffMap(unitInUpperCase)(startLocalTs, endLocalTs)
     } else {
-      throw SparkException.internalError(s"Got the unexpected unit '$unit'.")
+      throw QueryExecutionErrors.invalidDatetimeUnitError("timestampDiff", unit)
```
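Unlike `timestampAdd`, this path guards a map lookup with an explicit else-branch rather than catching a `MatchError`. A minimal sketch of that shape, with simplified stand-ins (the exception type, the unit table, and the seconds-based diff below are hypothetical; the real code dispatches on `LocalDateTime` pairs via `timestampDiffMap`):

```scala
object TimestampDiffSketch {
  // Hypothetical classified exception carrying the error class in its message.
  final case class DatetimeUnitException(unit: String)
      extends RuntimeException(
        s"[INVALID_PARAMETER_VALUE.DATETIME_UNIT] Invalid unit '$unit' for TIMESTAMPDIFF.")

  // Simplified unit table over epoch seconds.
  private val diffFns: Map[String, (Long, Long) => Long] = Map(
    "SECOND" -> ((start, end) => end - start),
    "MINUTE" -> ((start, end) => (end - start) / 60L)
  )

  def timestampDiff(unit: String, startTs: Long, endTs: Long): Long = {
    val unitInUpperCase = unit.toUpperCase
    if (diffFns.contains(unitInUpperCase)) {
      diffFns(unitInUpperCase)(startTs, endTs)
    } else {
      // Before the PR this branch threw an internal error; now it raises a
      // classified, user-facing error instead.
      throw DatetimeUnitException(unit)
    }
  }
}
```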
Member

nit: TIMESTAMPDIFF

Contributor Author

Done

Contributor

@sigmod sigmod left a comment

thanks @vitaliili-db !

@cloud-fan
Contributor

it has merge conflicts, @vitaliili-db can you fix it?

@vitaliili-db
Contributor Author

@cloud-fan done!

@gengliangwang
Member

Thanks, merging to master

riyaverm-db pushed a commit to riyaverm-db/spark that referenced this pull request Jun 7, 2024
Closes apache#46210 from vitaliili-db/SPARK-47977.

Authored-by: Vitalii Li <[email protected]>
Signed-off-by: Gengliang Wang <[email protected]>