
Commit

[SPARK-17964][SPARKR] Enable SparkR with Mesos client mode and cluster mode

## What changes were proposed in this pull request?

Enabled SparkR with Mesos client mode and cluster mode. Only a few changes were required to get this working on Mesos: (1) removed the SparkR-on-Mesos error checks and (2) no longer require "--class" to be specified for R apps. The logic to check spark.mesos.executor.home was already in place.
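With those checks removed, an R script can be submitted to Mesos like any other application. A minimal client-mode sketch (the master address `mesos://mesos-master:5050` is a placeholder, not part of this patch; note that no `--class` argument is needed for R scripts):

```shell
# Client mode: the driver runs locally and registers with the Mesos master.
# mesos://mesos-master:5050 is a hypothetical address for illustration.
./bin/spark-submit \
  --master mesos://mesos-master:5050 \
  --deploy-mode client \
  examples/src/main/R/dataframe.R
```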

sun-rui

## How was this patch tested?

1. SparkSubmitSuite
2. On local mesos cluster (on laptop): ran SparkR shell, spark-submit client mode, and spark-submit cluster mode, with the "examples/src/main/R/dataframe.R" example application.
3. On multi-node mesos cluster: ran SparkR shell, spark-submit client mode, and spark-submit cluster mode, with the "examples/src/main/R/dataframe.R" example application. I tested with the following --conf values set: spark.mesos.executor.docker.image and spark.mesos.executor.home
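For reference, a cluster-mode submission of the same example goes through the MesosClusterDispatcher. This is an illustrative sketch: the dispatcher address and Docker image name are placeholders, while the two `--conf` keys are the ones exercised in testing above:

```shell
# Cluster mode: the driver itself is launched on the Mesos cluster via the
# MesosClusterDispatcher (mesos-dispatcher:7077 is a placeholder address).
./bin/spark-submit \
  --master mesos://mesos-dispatcher:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.executor.docker.image=my-org/spark-r:latest \
  --conf spark.mesos.executor.home=/opt/spark \
  examples/src/main/R/dataframe.R
```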

This contribution is my original work and I license the work to the project under the project's open source license.

Author: Susan X. Huynh <[email protected]>

Closes #15700 from susanxhuynh/susan-r-branch.
susanxhuynh authored and srowen committed Nov 5, 2016
1 parent fb0d608 commit 9a87c31
Showing 2 changed files with 7 additions and 8 deletions.
1 change: 0 additions & 1 deletion core/src/main/scala/org/apache/spark/api/r/RUtils.scala
```diff
@@ -84,7 +84,6 @@ private[spark] object RUtils {
       }
     } else {
       // Otherwise, assume the package is local
-      // TODO: support this for Mesos
       val sparkRPkgPath = localSparkRPackagePath.getOrElse {
         throw new SparkException("SPARK_HOME not set. Can't locate SparkR package.")
       }
```
14 changes: 7 additions & 7 deletions core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
```diff
@@ -322,17 +322,14 @@ object SparkSubmit {
     }

     // Require all R files to be local
-    if (args.isR && !isYarnCluster) {
+    if (args.isR && !isYarnCluster && !isMesosCluster) {
       if (Utils.nonLocalPaths(args.primaryResource).nonEmpty) {
         printErrorAndExit(s"Only local R files are supported: ${args.primaryResource}")
       }
     }

     // The following modes are not supported or applicable
     (clusterManager, deployMode) match {
-      case (MESOS, CLUSTER) if args.isR =>
-        printErrorAndExit("Cluster deploy mode is currently not supported for R " +
-          "applications on Mesos clusters.")
       case (STANDALONE, CLUSTER) if args.isPython =>
         printErrorAndExit("Cluster deploy mode is currently not supported for python " +
           "applications on standalone clusters.")
@@ -410,9 +407,9 @@ object SparkSubmit {
       printErrorAndExit("Distributing R packages with standalone cluster is not supported.")
     }

-    // TODO: Support SparkR with mesos cluster
-    if (args.isR && clusterManager == MESOS) {
-      printErrorAndExit("SparkR is not supported for Mesos cluster.")
+    // TODO: Support distributing R packages with mesos cluster
+    if (args.isR && clusterManager == MESOS && !RUtils.rPackages.isEmpty) {
+      printErrorAndExit("Distributing R packages with mesos cluster is not supported.")
     }

     // If we're running an R app, set the main class to our specific R runner
@@ -598,6 +595,9 @@ object SparkSubmit {
       if (args.pyFiles != null) {
         sysProps("spark.submit.pyFiles") = args.pyFiles
       }
+    } else if (args.isR) {
+      // Second argument is main class
+      childArgs += (args.primaryResource, "")
     } else {
       childArgs += (args.primaryResource, args.mainClass)
     }
```
