[SPARK-46692][BUILD] Load `inputs.envs` always in Python and Upload-related steps #44698

Conversation
…ariable transmission `PYTHON_TO_TEST`
For greater stability, we can fix this issue first, and then I will observe whether the environment variables are passed correctly.
I found that the environment variables are loaded in the step at spark/.github/workflows/build_and_test.yml, line 446 (commit efa891c).
After this PR is merged, I will observe whether the value of the environment variable `PYTHON_TO_TEST` in the steps is set as expected.
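The fix under discussion amounts to loading the caller-supplied `inputs.envs` JSON into the `env` of every Python- and upload-related step, so variables such as `PYTHON_TO_TEST` are always visible. A minimal illustrative sketch of the pattern (not the exact Spark workflow; the step name and input shape are assumptions):

```yaml
# Illustrative reusable-workflow fragment: the caller passes extra
# environment variables as a JSON string input, and each step that
# needs them loads the whole map via fromJSON.
on:
  workflow_call:
    inputs:
      envs:
        description: JSON-encoded map of extra environment variables
        required: false
        type: string
        default: '{}'

jobs:
  pyspark:
    runs-on: ubuntu-latest
    steps:
      - name: List Python packages
        env: ${{ fromJSON(inputs.envs) }}  # load envs in this step too
        run: |
          echo "PYTHON_TO_TEST=$PYTHON_TO_TEST"
```

If a step omits the `env` line, any script it runs sees an empty `PYTHON_TO_TEST` and silently falls back to defaults, which matches the symptom described in this PR.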
Ya, this looks like the root cause.
- Are these all the instances?
- Given that the change affects not only `PYTHON_TO_TEST` but all environment variables, please revise the PR title (previously: "Fix potential issues with environment variable transmission `PYTHON_TO_TEST` in pyspark job GA").
@dongjoon-hyun
+1, LGTM.
Title changed: "Fix potential issues with environment variable transmission `PYTHON_TO_TEST` in pyspark job GA" → "Load `inputs.envs` always in Python and Upload steps" → "Load `inputs.envs` always in Python and Upload-related steps"
@dongjoon-hyun If tomorrow's scheduled task meets expectations, I will provide a PR to bring it back.
Ya, +1 for the plan. Thank you.
@dongjoon-hyun Perhaps we need to install the package referenced at line 95 in a3266b4.
We should, but I think that's not available. If grpc is not installed, it should skip the tests properly.
Okay, let's continue to observe.
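The skip-when-missing behavior mentioned above is a common pattern in Python test suites; a minimal sketch (hypothetical test class, not PySpark's actual code), assuming the optional dependency is the `grpc` module:

```python
import importlib.util
import unittest

# Detect the optional dependency at import time; if it is absent, the
# whole test case is reported as skipped instead of failing with an
# ImportError during collection.
have_grpc = importlib.util.find_spec("grpc") is not None

@unittest.skipIf(not have_grpc, "grpc was not installed; skipping Connect tests")
class ConnectDependencyTest(unittest.TestCase):
    def test_smoke(self):
        import grpc  # safe: this only runs when grpc is installed
        self.assertTrue(hasattr(grpc, "__name__"))

# Run the case programmatically so the skip shows up in the result
# instead of an error.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ConnectDependencyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Whether `grpc` is present or not, `result.wasSuccessful()` is true: the test either runs or is counted as skipped, which is the behavior the reviewers expect from the CI job.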
…logics out

### What changes were proposed in this pull request?
This PR factors Connect/non-Connect specific logics out into dedicated test classes. This PR is a followup of #40785.

### Why are the changes needed?
In order to avoid test failures such as #44698 (comment) caused by missing dependencies.

### Does this PR introduce _any_ user-facing change?
No, test-only.

### How was this patch tested?
CI in this PR should verify it.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #44715 from HyukjinKwon/SPARK-42960-followup.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
@HyukjinKwon Do we need to install pyarrow?
pyarrow in PyPy isn't available, to the best of my knowledge.
I tried to install `pyarrow` in pypy3 over the weekend, but one UT failed. So should we ignore it?
For those, let's skip them for now, and file a JIRA for each test.
Okay.
### What changes were proposed in this pull request?
The PR aims to upgrade the upload-artifact action from v3 to v4. After PR #44698, our environment variable (`PYTHON_TO_TEST`) is correctly passed and assigned a value. We will bring back this PR: #44662.

### Why are the changes needed?
- v4.0.0 release notes: https://github.com/actions/upload-artifact/releases/tag/v4.0.0 — they contain numerous performance and behavioral improvements.
- v3 vs v4: actions/upload-artifact@v3...v4.0.0

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Pass GA.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #44728 from panbingkun/SPARK-46474_GO_AHEAD.

Authored-by: panbingkun <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
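For reference, a v4 upload step might look like the following sketch (step name, artifact name, and path are illustrative, not Spark's actual workflow configuration):

```yaml
# actions/upload-artifact@v4 requires artifact names to be unique per
# run, so a matrix value is commonly appended to the name.
- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: test-results-${{ matrix.python }}
    path: '**/target/test-reports/*.xml'
```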
What changes were proposed in this pull request?
The PR aims to fix a potential environment variable (e.g. `PYTHON_TO_TEST`) transfer issue in the pyspark job GA.
Why are the changes needed?
https://github.com/apache/spark/actions/workflows/build_python.yml
Let's take a successful `build_python` run as an example: https://github.com/apache/spark/actions/runs/7476792796/job/20348090667
1. python 3.10
(screenshot of the job log)
Obviously, it is meaningless to enumerate the packages of Python 3.9 when building PySpark based on Python 3.10.
https://github.com/apache/spark/actions/runs/7476792796/job/20348079659
2. python 3.11
(screenshot of the job log)
https://github.com/apache/spark/actions/runs/7476792796/job/20348091081
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Pass GA.
Was this patch authored or co-authored using generative AI tooling?
No.