Merge pull request apache#100 from mesosphere/spark-382-document-versions

[SPARK-382] Document versions
mgummelt authored Nov 15, 2016
2 parents c4eab28 + e318bdf commit 8957b19
Showing 1 changed file with 15 additions and 1 deletion.
16 changes: 15 additions & 1 deletion docs/run-job.md
@@ -12,7 +12,7 @@ more][13].

    $ dcos spark run --submit-args="--class MySampleClass http://external.website/mysparkapp.jar 30"


$ dcos spark run --submit-args="--py-files mydependency.py http://external.website/mysparkapp.py 30"

`dcos spark run` is a thin wrapper around the standard Spark
@@ -62,6 +62,20 @@ To set Spark properties with a configuration file, create a
`spark-defaults.conf` file and set the environment variable
`SPARK_CONF_DIR` to the containing directory. [Learn more][15].
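For example, a minimal sketch of this setup; the property names are standard Spark settings, but the values, paths, and app URL are placeholders:

```shell
# Sketch: point SPARK_CONF_DIR at a directory containing spark-defaults.conf.
# The property values below are illustrative placeholders, not recommendations.
mkdir -p ./conf
cat > ./conf/spark-defaults.conf <<'EOF'
spark.executor.memory  2g
spark.cores.max        4
EOF
export SPARK_CONF_DIR="$PWD/conf"
# Then submit as usual:
# dcos spark run --submit-args="http://external.website/mysparkapp.jar 30"
```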

# Versioning

The DC/OS Spark docker image contains OpenJDK 8 and Python 2.7.6.

DC/OS Spark distributions 1.X are compiled with Scala 2.10, and DC/OS
Spark distributions 2.X are compiled with Scala 2.11. Scala is not
binary compatible across minor versions, so your Spark job must be
compiled with the same Scala version as your DC/OS Spark distribution.
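In an sbt build, this means pinning `scalaVersion` accordingly; a minimal `build.sbt` sketch (the project details and Spark version are illustrative):

```scala
// build.sbt -- hypothetical project; the key point is matching scalaVersion
// to the DC/OS Spark distribution (2.10 for 1.X, 2.11 for 2.X).
scalaVersion := "2.11.8"  // use a 2.10.x release when targeting DC/OS Spark 1.X

// %% appends the Scala binary version (e.g. _2.11) to the artifact name,
// so the dependency stays consistent with scalaVersion.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1" % "provided"
```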

The default DC/OS Spark distribution is compiled against Hadoop 2.6
libraries. To use a different Hadoop version, follow the instructions
in the "Customize Spark Distribution" section of the [installation docs](install.md).


[13]: http://spark.apache.org/docs/latest/submitting-applications.html
[14]: http://spark.apache.org/docs/latest/configuration.html#spark-properties
[15]: http://spark.apache.org/docs/latest/configuration.html#overriding-configuration-directory
