Run python tests #27

Closed
wants to merge 1 commit into from
14 changes: 14 additions & 0 deletions .zuul.yaml
@@ -0,0 +1,14 @@
- project:
name: theopenlab/spark
check:
jobs:
- spark-build-and-python-test-arm64

- job:
name: spark-build-and-python-test-arm64
parent: init-test
description: |
Build Spark and run the Python tests in the OpenLab cluster.
run: .zuul/playbooks/spark-build/run_python_tests.yaml
nodeset: ubuntu-xenial-arm64
timeout: 86400
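In the `.zuul.yaml` above, the project's `check` pipeline must reference a job name that a top-level `- job:` stanza actually defines. A minimal sketch of that consistency check, assuming the YAML has already been parsed into Python lists and dicts (the structure below simply mirrors the file added in this PR):

```python
# Parsed form of the .zuul.yaml added above: a list of one-key dicts.
parsed = [
    {"project": {
        "name": "theopenlab/spark",
        "check": {"jobs": ["spark-build-and-python-test-arm64"]},
    }},
    {"job": {
        "name": "spark-build-and-python-test-arm64",
        "parent": "init-test",
        "nodeset": "ubuntu-xenial-arm64",
        "timeout": 86400,  # 24 hours, in seconds
    }},
]

# Collect every defined job name, then check each job the project references.
defined_jobs = {d["job"]["name"] for d in parsed if "job" in d}
for d in parsed:
    if "project" in d:
        for job in d["project"]["check"]["jobs"]:
            assert job in defined_jobs, f"undefined job: {job}"
print("config consistent")
```

Zuul performs this resolution itself at config-load time; the sketch only illustrates the relationship between the two stanzas.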
53 changes: 53 additions & 0 deletions .zuul/playbooks/spark-build/run_python_tests.yaml
@@ -0,0 +1,53 @@
- hosts: all
tasks:
- name: Build spark master using mvn with hadoop 2.7
shell:
cmd: |
set -exo pipefail
sudo apt-get update -y

# Install java
sudo apt-get install default-jre -y
sudo apt-get install default-jdk -y
java_home=$(dirname $(dirname $(update-alternatives --list javac)))
echo "export JAVA_HOME=${java_home}" >> ~/.profile
echo "export PATH=${java_home}/bin:$PATH" >> ~/.profile
source ~/.profile

# Install maven
wget http://www.us.apache.org/dist/maven/maven-3/3.6.2/binaries/apache-maven-3.6.2-bin.tar.gz
tar -xvf apache-maven-3.6.2-bin.tar.gz
export PATH=$PWD/apache-maven-3.6.2/bin:$PATH

# fix kafka authfail tests
sudo sed -i "s|127.0.0.1 $(hostname) localhost|127.0.0.1 localhost $(hostname)|" /etc/hosts

cd {{ ansible_user_dir }}/{{ zuul.project.src_dir }}

./build/mvn install -DskipTests -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pmesos

# use leveldbjni arm supporting jar
wget https://repo1.maven.org/maven2/org/openlabtesting/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar
mvn install:install-file -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni-all -Dversion=1.8 -Dpackaging=jar -Dfile=leveldbjni-all-1.8.jar

# install python3.6
sudo add-apt-repository ppa:jonathonf/python-3.6 -y
sudo apt-get update -y
sudo apt-get install python3.6 -y
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.6 1
sudo apt-get install python3.6-dev -y

# install pip(pip3)
curl https://bootstrap.pypa.io/get-pip.py | sudo python3.6

# install packages needed
sudo pip2 install coverage numpy
sudo pip install coverage numpy

# run python tests
python/run-tests --python-executables=python2.7,python3.6

chdir: '/home/zuul/src'
executable: /bin/bash
environment: '{{ global_env }}'
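The `sed` line in the playbook reorders the `127.0.0.1` aliases in `/etc/hosts` so that `localhost` resolves ahead of the machine's hostname, which the Kafka auth-failure tests depend on. A hypothetical pure-Python rendering of the same transformation (the hostname and exact line format are assumptions for illustration):

```python
def reorder_hosts_line(line: str, hostname: str) -> str:
    """Move `localhost` ahead of the machine's hostname on a
    127.0.0.1 /etc/hosts entry, mirroring the playbook's sed command."""
    target = f"127.0.0.1 {hostname} localhost"
    if line.strip() == target:
        return f"127.0.0.1 localhost {hostname}"
    return line  # any other line is left untouched

print(reorder_hosts_line("127.0.0.1 node-1 localhost", "node-1"))
# -> 127.0.0.1 localhost node-1
```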
3 changes: 2 additions & 1 deletion common/kvstore/pom.xml
@@ -45,8 +45,9 @@
<artifactId>guava</artifactId>
</dependency>
<dependency>
-      <groupId>org.fusesource.leveldbjni</groupId>
+      <groupId>${leveldbjni.group}</groupId>
       <artifactId>leveldbjni-all</artifactId>
+      <version>1.8</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
2 changes: 1 addition & 1 deletion common/network-common/pom.xml
@@ -52,7 +52,7 @@
</dependency>

<dependency>
-      <groupId>org.fusesource.leveldbjni</groupId>
+      <groupId>${leveldbjni.group}</groupId>
<artifactId>leveldbjni-all</artifactId>
<version>1.8</version>
</dependency>
9 changes: 8 additions & 1 deletion pom.xml
@@ -241,6 +241,7 @@
<spark.test.home>${session.executionRootDirectory}</spark.test.home>

<CodeCacheSize>1g</CodeCacheSize>
+    <leveldbjni.group>org.fusesource.leveldbjni</leveldbjni.group>
</properties>
<repositories>
<repository>
@@ -527,7 +528,7 @@
<version>${commons.httpcore.version}</version>
</dependency>
<dependency>
-        <groupId>org.fusesource.leveldbjni</groupId>
+        <groupId>${leveldbjni.group}</groupId>
<artifactId>leveldbjni-all</artifactId>
<version>1.8</version>
</dependency>
@@ -3073,5 +3074,11 @@
<profile>
<id>sparkr</id>
</profile>
+    <!--profile>
+      <id>aarch64</id>
+      <properties>
+        <leveldbjni.group>org.openlabtesting.leveldbjni</leveldbjni.group>
+      </properties>
+    </profile-->
</profiles>
</project>
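Factoring the groupId into a `${leveldbjni.group}` property lets a single profile swap in the AArch64-capable artifact without editing every module POM. A toy model of how Maven-style property resolution with a profile override behaves (this is an illustration, not Maven itself):

```python
# Default value from <properties>, optionally overridden by an
# activated profile (the commented-out "aarch64" profile above).
DEFAULTS = {"leveldbjni.group": "org.fusesource.leveldbjni"}
PROFILES = {
    "aarch64": {"leveldbjni.group": "org.openlabtesting.leveldbjni"},
}

def resolve(prop, active_profiles=()):
    """Resolve a property: start from the default, let each active
    profile override it in order."""
    value = DEFAULTS[prop]
    for name in active_profiles:
        value = PROFILES.get(name, {}).get(prop, value)
    return value

print(resolve("leveldbjni.group"))               # default group
print(resolve("leveldbjni.group", ["aarch64"]))  # ARM64 override
```

With the profile enabled (e.g. `mvn -Paarch64 ...` once the block is uncommented), every `${leveldbjni.group}` reference resolves to the `org.openlabtesting.leveldbjni` artifact installed earlier in the playbook.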
8 changes: 4 additions & 4 deletions python/pyspark/mllib/tests/test_streaming_algorithms.py
@@ -33,14 +33,14 @@
class MLLibStreamingTestCase(unittest.TestCase):
def setUp(self):
self.sc = SparkContext('local[4]', "MLlib tests")
-        self.ssc = StreamingContext(self.sc, 1.0)
+        self.ssc = StreamingContext(self.sc, 3.0)

def tearDown(self):
self.ssc.stop(False)
self.sc.stop()

@staticmethod
-    def _eventually(condition, timeout=30.0, catch_assertions=False):
+    def _eventually(condition, timeout=120.0, catch_assertions=False):
"""
Wait a given amount of time for a condition to pass, else fail with an error.
This is a helper utility for streaming ML tests.
@@ -289,7 +289,7 @@ def condition():
return True

# We want all batches to finish for this test.
-        self._eventually(condition, 60.0, catch_assertions=True)
+        self._eventually(condition, catch_assertions=True)

t_models = array(models)
diff = t_models[1:] - t_models[:-1]
@@ -364,7 +364,7 @@ def condition():
return True
return "Latest errors: " + ", ".join(map(lambda x: str(x), errors))

-        self._eventually(condition, timeout=60.0)
+        self._eventually(condition)


class StreamingLinearRegressionWithTests(MLLibStreamingTestCase):
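The timeout bumps above all flow through the `_eventually` helper, which polls a condition until it returns `True` or the deadline passes — on slower ARM64 nodes the defaults simply need more headroom. A simplified standalone sketch of that polling pattern (the real helper also supports `catch_assertions`):

```python
import time

def eventually(condition, timeout=120.0, interval=0.01):
    """Poll `condition` until it returns True or `timeout` elapses,
    raising AssertionError with the last result on failure."""
    deadline = time.time() + timeout
    last = None
    while time.time() < deadline:
        last = condition()
        if last is True:
            return
        time.sleep(interval)
    raise AssertionError(f"condition not met within {timeout}s: {last}")

# Usage: a condition that only starts passing on the third poll.
calls = {"n": 0}
def cond():
    calls["n"] += 1
    return calls["n"] >= 3

eventually(cond, timeout=5.0)
print("condition met after", calls["n"], "polls")
```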