[SPARK-23941][MESOS] Mesos task failed on specific spark app name
## What changes were proposed in this pull request?
Shell-escape the name passed to spark-submit and change how conf attributes are shell-escaped: each `--conf key=value` option is now escaped as a whole, instead of wrapping an escaped value in an extra pair of literal quotes.

## How was this patch tested?
This change has been tested manually with Hive-on-Spark on Mesos, and with the use case described in the issue: the SparkPi application with a custom name that contains illegal shell characters.

With this PR, Hive-on-Spark on Mesos works like a charm with Hive 3.0.0-SNAPSHOT.

I state that this contribution is my original work and that I license the work to the project under the project’s open source license.

Author: Bounkong Khamphousone <[email protected]>

Closes #21014 from tiboun/fix/SPARK-23941.

(cherry picked from commit 6782359)
Signed-off-by: Marcelo Vanzin <[email protected]>
BounkongK authored and Marcelo Vanzin committed May 1, 2018
1 parent 52a420f commit 682f05d
Showing 1 changed file with 2 additions and 2 deletions.
```diff
@@ -530,9 +530,9 @@ private[spark] class MesosClusterScheduler(
       .filter { case (key, _) => !replicatedOptionsBlacklist.contains(key) }
       .toMap
     (defaultConf ++ driverConf).foreach { case (key, value) =>
-      options ++= Seq("--conf", s""""$key=${shellEscape(value)}"""".stripMargin) }
+      options ++= Seq("--conf", s"${key}=${value}") }
 
-    options
+    options.map(shellEscape)
   }
 
   /**
```
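For illustration, here is a small, self-contained Scala sketch of why escaping only the value inside a hand-built quoted string breaks for app names containing shell metacharacters, while escaping the complete `--conf` option works. The `escape` helper and the `ConfEscapeDemo` object below are made up for this example and only approximate the behaviour of Spark's `shellEscape`; they are not the actual Spark code.

```scala
// Standalone sketch: `escape` only approximates Spark's shellEscape
// (return the string unchanged when it is shell-safe, otherwise quote it
// and backslash-escape ", `, $ and \); it is not the actual Spark code.
object ConfEscapeDemo {

  def escape(value: String): String =
    if (value.matches("[A-Za-z0-9._/=+-]+")) value
    else "\"" + value.replaceAll("([\"`\\$\\\\])", "\\\\$1") + "\""

  def main(args: Array[String]): Unit = {
    val key = "spark.app.name"
    val value = "SparkPi (illegal shell chars)"

    // Old behaviour: escape the value, then wrap the whole pair in another
    // set of literal double quotes -- the quoting ends up nested and broken.
    val before = "\"" + s"$key=${escape(value)}" + "\""

    // New behaviour: build the plain key=value pair and escape the complete
    // --conf argument exactly once.
    val after = escape(s"$key=$value")

    println(before) // "spark.app.name="SparkPi (illegal shell chars)""
    println(after)  // "spark.app.name=SparkPi (illegal shell chars)"
  }
}
```

When the old string reaches the shell, the inner quotes terminate the outer ones, so the space and parentheses in the app name end up unquoted and the driver command fails; with the single-pass escaping of the whole option, each `--conf` argument stays one well-formed shell token.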
