Commit 3b38062
Bump sarplus version to 0.6.6 (#1721)

* Bump sarplus version to 0.6.6

* Update sarplus docs

* Update DEVELOPMENT.md

simonzhaoms authored May 25, 2022
1 parent edf825f commit 3b38062
Showing 3 changed files with 11 additions and 9 deletions.

contrib/sarplus/DEVELOPMENT.md (8 changes: 5 additions & 3 deletions)

@@ -17,7 +17,9 @@ Steps to package and publish (also described in
 cp ../VERSION ./pysarplus/ # copy version file
 python -m build --sdist
 MINOR_VERSION=$(python --version | cut -d '.' -f 2)
-CIBW_BUILD="cp3${MINOR_VERSION}-manylinux_x86_64" python -m cibuildwheel --platform linux --output-dir dist
+for MINOR_VERSION in {6..10}; do
+  CIBW_BUILD="cp3${MINOR_VERSION}-manylinux_x86_64" python -m cibuildwheel --platform linux --output-dir dist
+done
 python -m twine upload dist/*
 ```
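
Note: the removed invocation built a single wheel for whichever Python interpreter ran the build, while the new loop covers CPython 3.6 through 3.10. Purely as an illustration (not part of the commit), the loop expands to:

```bash
# illustrative expansion of the loop above: one manylinux wheel per CPython version
CIBW_BUILD="cp36-manylinux_x86_64"  python -m cibuildwheel --platform linux --output-dir dist
CIBW_BUILD="cp37-manylinux_x86_64"  python -m cibuildwheel --platform linux --output-dir dist
CIBW_BUILD="cp38-manylinux_x86_64"  python -m cibuildwheel --platform linux --output-dir dist
CIBW_BUILD="cp39-manylinux_x86_64"  python -m cibuildwheel --platform linux --output-dir dist
CIBW_BUILD="cp310-manylinux_x86_64" python -m cibuildwheel --platform linux --output-dir dist
```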

@@ -103,11 +105,11 @@ on **Spark 3.2**, which adds an extra function `path()`, so an
 additional package called [Sarplus Spark 3.2
 Plus](https://search.maven.org/artifact/com.microsoft.sarplus/sarplus-spark-3-2-plus_2.12)
 (with Maven coordinate such as
-`com.microsoft.sarplus:sarplus-spark-3-2-plus_2.12:0.6.5`) should be
+`com.microsoft.sarplus:sarplus-spark-3-2-plus_2.12:0.6.6`) should be
 used if running on Spark 3.2 instead of
 [Sarplus](https://search.maven.org/artifact/com.microsoft.sarplus/sarplus_2.12)
 (with Maven coordinate like
-`com.microsoft.sarplus:sarplus_2.12:0.6.5`).
+`com.microsoft.sarplus:sarplus_2.12:0.6.6`).

 In addition to `spark.sql.crossJoin.enabled true`, extra
 configurations are required when running on Spark 3.x:
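
For illustration, a hypothetical shell helper for picking the right coordinate; `SPARK_VERSION` is an assumed variable holding the cluster's Spark version, and the helper itself is not part of the sarplus docs:

```bash
# hypothetical helper (not from the sarplus docs); assumes SPARK_VERSION
# is already set, e.g. SPARK_VERSION="3.2.1"
if [[ "${SPARK_VERSION}" == 3.2* ]]; then
  # Spark 3.2 needs the variant built with the extra path() function
  SARPLUS_MVN_COORDINATE="com.microsoft.sarplus:sarplus-spark-3-2-plus_2.12:0.6.6"
else
  SARPLUS_MVN_COORDINATE="com.microsoft.sarplus:sarplus_2.12:0.6.6"
fi
```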

contrib/sarplus/README.md (10 changes: 5 additions & 5 deletions)

@@ -157,7 +157,7 @@ Insert this cell prior to the code above.
 ```python
 import os

-SARPLUS_MVN_COORDINATE = "com.microsoft.sarplus:sarplus_2.12:0.6.5"
+SARPLUS_MVN_COORDINATE = "com.microsoft.sarplus:sarplus_2.12:0.6.6"
 SUBMIT_ARGS = f"--packages {SARPLUS_MVN_COORDINATE} pyspark-shell"
 os.environ["PYSPARK_SUBMIT_ARGS"] = SUBMIT_ARGS

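This cell works because PySpark forwards `PYSPARK_SUBMIT_ARGS` to spark-submit when the JVM is first launched, so it must run before any `SparkSession` is created. An equivalent shell-side sketch (not part of the commit) exports the variable before Python starts:

```bash
# equivalent shell form: export before launching python/jupyter, so the
# --packages option is applied when the first SparkSession starts the JVM
export PYSPARK_SUBMIT_ARGS="--packages com.microsoft.sarplus:sarplus_2.12:0.6.6 pyspark-shell"
```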

@@ -180,7 +180,7 @@ spark = (
 ### PySpark Shell

 ```bash
-SARPLUS_MVN_COORDINATE="com.microsoft.sarplus:sarplus_2.12:0.6.5"
+SARPLUS_MVN_COORDINATE="com.microsoft.sarplus:sarplus_2.12:0.6.6"

 # Install pysarplus
 pip install pysarplus

@@ -201,14 +201,14 @@ pyspark --packages "${SARPLUS_MVN_COORDINATE}" \
 1. Create Library
 1. Under `Library Source` select `Maven`
 1. Enter into `Coordinates`:
-   * `com.microsoft.sarplus:sarplus_2.12:0.6.5`
-   * or `com.microsoft.sarplus:sarplus-spark-3-2-plus_2.12:0.6.5` (if
+   * `com.microsoft.sarplus:sarplus_2.12:0.6.6`
+   * or `com.microsoft.sarplus:sarplus-spark-3-2-plus_2.12:0.6.6` (if
     you're on Spark 3.2+)
 1. Hit `Create`
 1. Attach to your cluster
 1. Create 2nd library
 1. Under `Library Source` select `PyPI`
-1. Enter `pysarplus==0.6.5`
+1. Enter `pysarplus==0.6.6`
 1. Hit `Create`

 This will install C++, Python and Scala code on your cluster. See
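
The Maven jar and the PyPI wheel are versioned in lockstep in this commit, so both library entries should pin the same release (0.6.6 here). For scripted cluster setup instead of the UI, a sketch using the legacy `databricks` CLI; the flags are assumptions to verify against your CLI version:

```bash
# sketch only: flags assumed from the legacy databricks-cli, verify locally
CLUSTER_ID="<your-cluster-id>"  # placeholder
databricks libraries install --cluster-id "${CLUSTER_ID}" \
  --maven-coordinates "com.microsoft.sarplus:sarplus_2.12:0.6.6"
databricks libraries install --cluster-id "${CLUSTER_ID}" \
  --pypi-package "pysarplus==0.6.6"
```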

contrib/sarplus/VERSION (2 changes: 1 addition & 1 deletion)

@@ -1 +1 @@
-0.6.5
+0.6.6
