[SPARK-24131][PYSPARK][Followup] Add majorMinorVersion API to PySpark for determining Spark versions #21211
Conversation
Seems fine otherwise but let me leave it to @jkbradley.
python/pyspark/util.py (outdated)
"""
m = re.search('^(\d+)\.(\d+)(\..*)?$', version)
if m is None:
    raise ValueError("invalid version string: " + version)
Shall we match the message with the Scala side?
throw new IllegalArgumentException(s"Spark tried to parse '$sparkVersion' as a Spark" +
  s" version string, but it could not find the major and minor version numbers.")
python/pyspark/util.py (outdated)
Provides utility method to determine Spark versions with given input string.
"""
@staticmethod
def majorMinorVersion(version):
version -> sparkVersion.
python/pyspark/util.py (outdated)
return (int(m.group(1)), int(m.group(2)))
"""
m = re.search('^(\d+)\.(\d+)(\..*)?$', version)
if m is None:
I'd do if m is not None to match the order with the Scala side.
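A variant of the same sketch with the suggested branch order, handling the successful match first so the control flow follows the Scala version; again illustrative rather than the merged code:
import re

def majorMinorVersion(sparkVersion):
    # Sketch only: check the successful match first, mirroring the Scala control flow
    m = re.search(r'^(\d+)\.(\d+)(\..*)?$', sparkVersion)
    if m is not None:
        return (int(m.group(1)), int(m.group(2)))
    raise ValueError("invalid version string: " + sparkVersion)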
python/pyspark/util.py (outdated)
@staticmethod
def majorMinorVersion(version):
    """
    Get major and minor version numbers for given Spark version string.
* Given a Spark version string, return the (major version number, minor version number).
* E.g., for 2.0.1-SNAPSHOT, return (2, 0).
python/pyspark/tests.py (outdated)
def test_parsing_version_string(self):
    from pyspark.util import VersionUtils

    (major, minor) = VersionUtils.majorMinorVersion("2.4.0")
I think we don't need this since it's tested in doctests.
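For context, PySpark modules commonly rely on doctests embedded in docstrings as their tests, executed via a small __main__ hook. A rough sketch of that pattern for this utility; the class layout, docstring wording, and error message are illustrative, not the exact merged file:
import doctest
import re


class VersionUtils(object):
    """Utility for determining Spark versions from a version string (sketch)."""

    @staticmethod
    def majorMinorVersion(sparkVersion):
        """
        Given a Spark version string, return the (major version number, minor version number).
        E.g., for 2.0.1-SNAPSHOT, return (2, 0).

        >>> VersionUtils.majorMinorVersion("2.4.0")
        (2, 4)
        >>> VersionUtils.majorMinorVersion("2.3.0-SNAPSHOT")
        (2, 3)
        """
        m = re.search(r'^(\d+)\.(\d+)(\..*)?$', sparkVersion)
        if m is not None:
            return (int(m.group(1)), int(m.group(2)))
        raise ValueError("invalid version string: " + sparkVersion)


if __name__ == "__main__":
    # Running the module directly executes the doctests above, which is why a
    # separate unit test covering the same cases can be unnecessary.
    (failures, _) = doctest.testmod()
    if failures:
        raise SystemExit(-1)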
Test build #90030 has finished for PR 21211 at commit
@HyukjinKwon Thanks! I forgot to change the doctests. I will update and address your comments once I get access to my laptop.
@HyukjinKwon Thanks for the comments above. I think I addressed them all.
Test build #90064 has finished for PR 21211 at commit
@jkbradley, have you had a chance to take a look? If there are no more comments, I will just merge it in a few days.
retest this please
Test build #90348 has finished for PR 21211 at commit
Merged to master.
Thanks @HyukjinKwon
What changes were proposed in this pull request?
Match the Scala API behavior more closely by throwing an exception when the input cannot be parsed. Add tests.
How was this patch tested?
Added tests.
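A brief usage sketch of the resulting API, assuming a PySpark build that includes this change; the exact error text printed in the failure case may differ:
from pyspark.util import VersionUtils

# Release and snapshot version strings both parse down to (major, minor)
print(VersionUtils.majorMinorVersion("2.4.0"))           # (2, 4)
print(VersionUtils.majorMinorVersion("2.3.1-SNAPSHOT"))  # (2, 3)

# An unparseable string now raises ValueError instead of returning None
try:
    VersionUtils.majorMinorVersion("not-a-version")
except ValueError as e:
    print(e)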