feat: add to_string method to SparkLikeExprDateTimeNamespace #1842
base: main
Conversation
…into feat/add-to_string-to-spark-like-dt
tests/conftest.py
Outdated
```diff
@@ -159,6 +159,7 @@ def pyspark_lazy_constructor() -> Callable[[Any], IntoFrame]:  # pragma: no cover
         .config("spark.sql.shuffle.partitions", "2")
         # common timezone for all tests environments
         .config("spark.sql.session.timeZone", "UTC")
+        .config("spark.sql.legacy.timeParserPolicy", "LEGACY")
```
what does this do?
I faced the date migration issue from Spark 2.0 to 3.0 (basically, this matches the datetime parsing behaviour of Spark versions <3). I referred to this issue: https://stackoverflow.com/questions/62602720/string-to-date-migration-from-spark-2-0-to-3-0-gives-fail-to-recognize-eee-mmm
Instead of setting this, I suggest using a valid pattern for Spark 3.0: https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
Otherwise we need to ask users to also set this config
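As an illustration of the suggested approach (a hypothetical helper, not the PR's actual code), translating common `strftime` directives into Spark 3.x datetime pattern letters can be done with a simple substitution table, following the Spark datetime pattern guide linked above:

```python
# Hypothetical sketch: map common strftime directives to Spark 3.x
# datetime pattern letters. Not exhaustive; see the Spark guide for
# the full set of pattern letters.
_STRPTIME_TO_SPARK = {
    "%Y": "yyyy",  # calendar year (NOT "YYYY", which is week-based year)
    "%m": "MM",
    "%d": "dd",
    "%H": "HH",
    "%M": "mm",
    "%S": "ss",
}


def strptime_to_spark_format(fmt: str) -> str:
    """Translate a strftime-style format into a Spark 3.x pattern."""
    for directive, pattern in _STRPTIME_TO_SPARK.items():
        fmt = fmt.replace(directive, pattern)
    return fmt


print(strptime_to_spark_format("%Y-%m-%d %H:%M:%S"))
# yyyy-MM-dd HH:mm:ss
```

With a mapping like this in place, users never write Spark pattern letters directly, so the `YYYY`/`yyyy` pitfall and the legacy-parser config are both avoided.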
When we remove the config we get the error:

```
pyspark.errors.exceptions.captured.SparkUpgradeException: [INCONSISTENT_BEHAVIOR_CROSS_VERSION.DATETIME_PATTERN_RECOGNITION] You may get a different result due to the upgrading to Spark >= 3.0:
Fail to recognize 'YYYY' pattern in the DateTimeFormatter. 1) You can set "spark.sql.legacy.timeParserPolicy" to "LEGACY" to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from 'https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html'.
```
we need to substitute `YYYY` with `yyyy` (as we do in [strptime_to_pyspark_format](https://github.com/narwhals-dev/narwhals/blob/main/narwhals/_spark_like/expr_str.py#L136)?)
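For context (not from the PR itself): in Spark's patterns, `YYYY` is the week-based year while `yyyy` is the calendar year, and the two disagree around New Year. Python's `strftime` has the same distinction (`%G` vs `%Y`), which this small example uses to show why the substitution matters:

```python
from datetime import date

# 2021-01-01 falls in ISO week 53 of 2020, so the week-based year
# (%G, analogous to Spark's "YYYY") differs from the calendar year
# (%Y, analogous to Spark's "yyyy").
d = date(2021, 1, 1)

print(d.strftime("%Y"))  # 2021  (calendar year)
print(d.strftime("%G"))  # 2020  (ISO week-based year)
```

So silently using `YYYY` where `yyyy` was meant would produce off-by-one years for dates near the year boundary.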
Thanks @Dhanunjaya-Elluri! @EdAbati, does this PR look good to you?
…into feat/add-to_string-to-spark-like-dt
narwhals/_spark_like/expr_dt.py
Outdated
```python
    def to_string(self: Self, format: str) -> SparkLikeExpr:  # noqa: A002
        def _format_iso_week_with_day(_input: Column) -> Column:
            """Format datetime as ISO week string with day."""
            year = F.date_format(_input, "YYYY")
```
```diff
-            year = F.date_format(_input, "YYYY")
+            year = F.date_format(_input, "yyyy")
```
narwhals/_spark_like/expr_dt.py
Outdated
```python
        def _format_iso_week(_input: Column) -> Column:
            """Format datetime as ISO week string."""
            year = F.date_format(_input, "YYYY")
```
```diff
-            year = F.date_format(_input, "YYYY")
+            year = F.date_format(_input, "yyyy")
```
Thanks @Dhanunjaya-Elluri for this PR! Unfortunately, when I run these tests on my machine (I am not in the UTC timezone), I get all these errors:
I didn't have time to look more into this; I'll leave this resource that could be useful: https://spark.apache.org/docs/latest/api/python/user_guide/sql/arrow_pandas.html#timestamp-with-time-zone-semantics
Just to confirm, the rest of the tests all pass on your machine?
Apparently, this line already sets the session timezone to UTC, but somehow it doesn't take effect. Line 161 in 0f38b77
I did a workaround to set the timezone: the simplest setup is to use `export TZ=UTC`.
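Instead of asking every contributor to export `TZ=UTC` in their shell, the same effect can be achieved from within the test suite. This is a hypothetical sketch (the helper name and placement are assumptions, and `time.tzset` is Unix-only) of pinning the process timezone before the SparkSession is created, e.g. in a pytest conftest:

```python
import os
import time


def pin_utc_timezone() -> None:
    """Force this process's local timezone to UTC (Unix only).

    Equivalent to running the tests with `export TZ=UTC`; a hypothetical
    helper that could be called from conftest.py before building the
    SparkSession, so JVM-side timestamp conversions see a UTC local zone.
    """
    os.environ["TZ"] = "UTC"
    time.tzset()  # re-read the TZ variable; not available on Windows


pin_utc_timezone()
print(time.strftime("%z", time.localtime()))  # +0000
```

Note that this must run before the JVM starts, since the JVM captures the default timezone at startup.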
…into feat/add-to_string-to-spark-like-dt