Merge pull request apache#442 from pwendell/standalone
Workers should use working directory as spark home if it's not specified

If users don't set SPARK_HOME in their environment file when launching an application, the standalone cluster should default to the spark home of the worker.
(cherry picked from commit 59f475c)

Signed-off-by: Patrick Wendell <[email protected]>
pwendell committed Jan 15, 2014
1 parent 29c76d9 commit e3fa36f
Showing 1 changed file with 4 additions and 1 deletion.
@@ -209,8 +209,11 @@ private[spark] class Worker(
         logWarning("Invalid Master (" + masterUrl + ") attempted to launch executor.")
       } else {
         logInfo("Asked to launch executor %s/%d for %s".format(appId, execId, appDesc.name))
+        // TODO (pwendell): We should make sparkHome an Option[String] in
+        // ApplicationDescription to be more explicit about this.
+        val effectiveSparkHome = Option(execSparkHome_).getOrElse(sparkHome.getAbsolutePath)
         val manager = new ExecutorRunner(appId, execId, appDesc, cores_, memory_,
-          self, workerId, host, new File(execSparkHome_), workDir, akkaUrl, ExecutorState.RUNNING)
+          self, workerId, host, new File(effectiveSparkHome), workDir, akkaUrl, ExecutorState.RUNNING)
         executors(appId + "/" + execId) = manager
         manager.start()
         coresUsed += cores_
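
The patch relies on Scala's Option(...) factory, which turns a null reference into None, so getOrElse can supply the worker's own Spark home whenever the application did not send one. A minimal, self-contained sketch of that fallback idiom (the helper name resolveSparkHome and the sample paths are illustrative, not part of the patch):

    import java.io.File

    object SparkHomeFallback {
      // Option(x) is None when x is null, so getOrElse picks the default.
      // Hypothetical helper; the real patch inlines this in the Worker.
      def resolveSparkHome(execSparkHome: String, workerSparkHome: File): String =
        Option(execSparkHome).getOrElse(workerSparkHome.getAbsolutePath)

      def main(args: Array[String]): Unit = {
        val workerHome = new File("/opt/spark")
        // The application supplied an explicit Spark home: it wins.
        println(resolveSparkHome("/custom/spark", workerHome)) // /custom/spark
        // SPARK_HOME was never set, so the field arrives as null: fall back.
        println(resolveSparkHome(null, workerHome))            // /opt/spark
      }
    }

As the TODO in the diff notes, modelling the field as Option[String] in ApplicationDescription would make this contract explicit instead of relying on a null check.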
