feat: add to_string method to SparkLikeExprDateTimeNamespace #1842

Open · wants to merge 8 commits into main

Conversation

@Dhanunjaya-Elluri (Contributor) commented Jan 20, 2025:

What type of PR is this? (check all applicable)

  • πŸ’Ύ Refactor
  • ✨ Feature
  • πŸ› Bug Fix
  • πŸ”§ Optimization
  • πŸ“ Documentation
  • βœ… Test
  • 🐳 Other

Related issues

Checklist

  • Code follows style guide (ruff)
  • Tests added
  • Documented the changes

If you have comments or can explain your changes, please do so below

@@ -159,6 +159,7 @@ def pyspark_lazy_constructor() -> Callable[[Any], IntoFrame]: # pragma: no cove
         .config("spark.sql.shuffle.partitions", "2")
         # common timezone for all tests environments
         .config("spark.sql.session.timeZone", "UTC")
+        .config("spark.sql.legacy.timeParserPolicy", "LEGACY")
Member:

what does this do?

@Dhanunjaya-Elluri (Contributor, Author) replied, Jan 20, 2025:

I faced the date migration issue from Spark 2.0 to 3.0 (basically, to match the datetime parsing behaviour of Spark versions < 3.0). I referred to this issue: https://stackoverflow.com/questions/62602720/string-to-date-migration-from-spark-2-0-to-3-0-gives-fail-to-recognize-eee-mmm

Collaborator:

Instead of setting this, I suggest using a valid pattern for Spark 3.0: https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
Otherwise we would need to ask users to also set this config.

When we remove the config we get the error:

pyspark.errors.exceptions.captured.SparkUpgradeException: [INCONSISTENT_BEHAVIOR_CROSS_VERSION.DATETIME_PATTERN_RECOGNITION] You may get a different result due to the upgrading to Spark >= 3.0:
    Fail to recognize 'YYYY' pattern in the DateTimeFormatter. 1) You can set "spark.sql.legacy.timeParserPolicy" to "LEGACY" to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from 'https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html'.

We need to substitute YYYY with yyyy (as we do in [strptime_to_pyspark_format](https://github.com/narwhals-dev/narwhals/blob/main/narwhals/_spark_like/expr_str.py#L136))?
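
For reference, a minimal sketch of the kind of directive-to-pattern mapping such a helper performs; this is an illustrative hand-rolled version, not the library's actual implementation, and the function name and the set of replacements here are assumptions:

# Illustrative sketch only: translate strptime directives into Spark 3.x
# datetime pattern letters, so "%Y" maps to the calendar year "yyyy"
# rather than the week-based "YYYY" that Spark >= 3.0 rejects.
def strptime_to_spark_pattern(fmt: str) -> str:
    replacements = {
        "%Y": "yyyy",  # four-digit calendar year
        "%m": "MM",    # zero-padded month
        "%d": "dd",    # zero-padded day of month
        "%H": "HH",    # 24-hour clock hour
        "%M": "mm",    # minute
        "%S": "ss",    # second
    }
    for strptime_token, spark_token in replacements.items():
        fmt = fmt.replace(strptime_token, spark_token)
    return fmt

# e.g. strptime_to_spark_pattern("%Y-%m-%d %H:%M:%S") -> "yyyy-MM-dd HH:mm:ss"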

@MarcoGorelli (Member) left a comment:

Thanks @Dhanunjaya-Elluri! @EdAbati, does this PR look good to you?

def to_string(self: Self, format: str) -> SparkLikeExpr:  # noqa: A002
    def _format_iso_week_with_day(_input: Column) -> Column:
        """Format datetime as ISO week string with day."""
        year = F.date_format(_input, "YYYY")
Collaborator:

Suggested change:
-        year = F.date_format(_input, "YYYY")
+        year = F.date_format(_input, "yyyy")


    def _format_iso_week(_input: Column) -> Column:
        """Format datetime as ISO week string."""
        year = F.date_format(_input, "YYYY")
Collaborator:

Suggested change:
-        year = F.date_format(_input, "YYYY")
+        year = F.date_format(_input, "yyyy")
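
With the suggested change applied in both helpers, they could look roughly like the sketch below. This is a hedged reconstruction rather than the exact PR code: the week and day expressions (F.weekofyear, F.dayofweek) are assumptions, F.dayofweek is Sunday-based so it is remapped to the ISO convention (1 = Monday), and pairing the calendar year "yyyy" with weekofyear can still disagree at year boundaries, where the ISO week-year differs from the calendar year.

from pyspark.sql import Column
from pyspark.sql import functions as F

def _format_iso_week_with_day(_input: Column) -> Column:
    """Format datetime as an ISO week string with day, e.g. '2020-W02-4'."""
    year = F.date_format(_input, "yyyy")  # calendar year; Spark 3-safe
    week = F.lpad(F.weekofyear(_input).cast("string"), 2, "0")
    # F.dayofweek: 1 = Sunday .. 7 = Saturday; shift to ISO 1 = Monday .. 7 = Sunday
    day = ((F.dayofweek(_input) + 5) % 7) + 1
    return F.concat(year, F.lit("-W"), week, F.lit("-"), day.cast("string"))

def _format_iso_week(_input: Column) -> Column:
    """Format datetime as an ISO week string, e.g. '2020-W02'."""
    year = F.date_format(_input, "yyyy")
    week = F.lpad(F.weekofyear(_input).cast("string"), 2, "0")
    return F.concat(year, F.lit("-W"), week)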

@EdAbati (Collaborator) commented Jan 23, 2025:

Thanks @Dhanunjaya-Elluri for this PR! πŸ™

Unfortunately when I run these tests on my machine (I am not in UTC timezone), I get all these errors:

FAILED tests/expr_and_series/dt/to_string_test.py::test_dt_to_string_iso_local_date_expr[pyspark-data0-2020-01-09] - AssertionError: Mismatch at index 0: 2020-01-08 23:00:00 != 2020-01-09 00:00:00
FAILED tests/expr_and_series/dt/to_string_test.py::test_dt_to_string_expr[pyspark-%Y/%m/%d %H:%M:%S] - AssertionError: Mismatch at index 0: 2020/01/02 01:04:14 != 2020/01/02 02:04:14
FAILED tests/expr_and_series/dt/to_string_test.py::test_dt_to_string_expr[pyspark-%Y-%m-%d %H:%M:%S] - AssertionError: Mismatch at index 0: 2020-01-02 01:04:14 != 2020-01-02 02:04:14
FAILED tests/expr_and_series/dt/to_string_test.py::test_dt_to_string_iso_local_datetime_expr[pyspark-data2-2020-01-09T12:34:56.123456] - AssertionError: Mismatch at index 0: 2020-01-09 11:34:56.123456 != 2020-01-09 12:34:56.123456
FAILED tests/expr_and_series/dt/to_string_test.py::test_dt_to_string_iso_local_datetime_expr[pyspark-data1-2020-01-09T12:34:56.000123] - AssertionError: Mismatch at index 0: 2020-01-09 11:34:56.000123 != 2020-01-09 12:34:56.000123
FAILED tests/expr_and_series/dt/to_string_test.py::test_dt_to_string_iso_local_datetime_expr[pyspark-data0-2020-01-09T12:34:56.000000] - AssertionError: Mismatch at index 0: 2020-01-09 11:34:56 != 2020-01-09 12:34:56
FAILED tests/expr_and_series/dt/datetime_attributes_test.py::test_datetime_attributes[pyspark-hour-expected4] - AssertionError: Mismatch at index 0: 1 != 2

I didn't have time to look into this further; I'll leave this resource, which could be useful: https://spark.apache.org/docs/latest/api/python/user_guide/sql/arrow_pandas.html#timestamp-with-time-zone-semantics

@MarcoGorelli (Member):

> Unfortunately when I run these tests on my machine (I am not in UTC timezone), I get all these errors:

Just to confirm, the rest of the tests all pass on your machine?

@Dhanunjaya-Elluri (Contributor, Author):

> Just to confirm, the rest of the tests all pass on your machine?

Apparently the session timezone is already set to UTC by this line, but somehow it doesn't take effect:

.config("spark.sql.session.timeZone", "UTC")

I did a workaround to set the timezone:

export TZ=UTC

The simplest setup is to use os.environ["TZ"] = "UTC". I'll push these changes now.
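
A minimal sketch of that workaround; placing it in tests/conftest.py before the SparkSession is built is an assumption, and time.tzset() is Unix-only:

import os
import time

# Pin the process timezone to UTC before pyspark starts the JVM, so that
# local-time conversions agree with spark.sql.session.timeZone = "UTC".
os.environ["TZ"] = "UTC"
time.tzset()  # re-read TZ for the current process (not available on Windows)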
