[SPARK-23464][MESOS] Fix mesos cluster scheduler options double-escaping #20641

Closed
MesosClusterScheduler.scala
@@ -530,7 +530,7 @@ private[spark] class MesosClusterScheduler(
.filter { case (key, _) => !replicatedOptionsBlacklist.contains(key) }
.toMap
(defaultConf ++ driverConf).foreach { case (key, value) =>
- options ++= Seq("--conf", s""""$key=${shellEscape(value)}"""".stripMargin) }
+ options ++= Seq("--conf", s"$key=${shellEscape(value)}") }

options
}
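
The one-line change removes a hard-coded second layer of quotes around a value that shellEscape has already quoted. A standalone sketch of the effect (not code from this PR; the object name is made up, and this shellEscape only approximates Spark's Utils.shellEscape, assumed to double-quote values containing shell-special characters and to backslash-escape embedded quotes, dollar signs, backticks, and backslashes):

```scala
object EscapeDemo {
  // Rough approximation (an assumption) of Spark's Utils.shellEscape:
  // double-quote the value if it contains shell-special characters,
  // backslash-escaping any embedded ", `, $ or \.
  def shellEscape(value: String): String =
    if (value.exists(" \"$`\\".contains(_))) {
      "\"" + value.replaceAll("""(["`$\\])""", """\\$1""") + "\""
    } else {
      value
    }

  def main(args: Array[String]): Unit = {
    val key = "spark.driver.extraJavaOptions"
    val value = "-XX+PrintGC -Dparam1=val1 -Dparam2=val2"
    // Old interpolation: wraps the already-quoted result in a second
    // pair of quotes, producing the doubled quoting visible in the
    // failing-test output quoted further down.
    println(s""""$key=${shellEscape(value)}"""")
    // => "spark.driver.extraJavaOptions="-XX+PrintGC -Dparam1=val1 -Dparam2=val2""
    // Fixed interpolation: shellEscape alone decides whether to quote.
    println(s"$key=${shellEscape(value)}")
    // => spark.driver.extraJavaOptions="-XX+PrintGC -Dparam1=val1 -Dparam2=val2"
  }
}
```
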
MesosClusterSchedulerSuite.scala
@@ -199,6 +199,38 @@ class MesosClusterSchedulerSuite extends SparkFunSuite with LocalSparkContext wi
})
}

test("properly wraps and escapes parameters passed to driver command") {
Contributor: Does this test fail with the old code?

Author (@krcz): It does.

[info] - properly wraps and escapes parameters passed to driver command *** FAILED *** (154 milliseconds)
[info]   "test/./bin/spark-submit --name test --master mesos://mesos://localhost:5050 --driver-cores 1.0 --driver-memory 1000M --class mainClass --conf "spark.app.name=test" --conf "spark.mesos.executor.home=tes
t" --conf "spark.executor.extraJavaOptions="-Dparam1=\"value 1\" -Dparam2=value\\ 2 -Dpath=\$PATH"" --conf "spark.driver.extraJavaOptions="-XX+PrintGC -Dparam1=val1 -Dparam2=val2"" ./jar arg" did not contain "--
conf spark.driver.extraJavaOptions="-XX+PrintGC -Dparam1=val1 -Dparam2=val2"" (MesosClusterSchedulerSuite.scala:227)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
[info]   at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
[info]   at org.apache.spark.scheduler.cluster.mesos.MesosClusterSchedulerSuite$$anonfun$16.apply(MesosClusterSchedulerSuite.scala:227)
[info]   at org.apache.spark.scheduler.cluster.mesos.MesosClusterSchedulerSuite$$anonfun$16.apply(MesosClusterSchedulerSuite.scala:202)


This looks ready to merge; anything else needed @skonto or @susanxhuynh ?


It would be great to merge if you think it's ready.

Contributor: Sorry for the delay. I was going to test this in DC/OS and haven't gotten a chance to do so.

Contributor: @susanxhuynh I can run a test for this.

setScheduler()

val mem = 1000
val cpu = 1

val response = scheduler.submitDriver(
new MesosDriverDescription("d1", "jar", mem, cpu, true,
command,
Map("spark.mesos.executor.home" -> "test",
"spark.app.name" -> "test",
// no special characters, wrap only
"spark.driver.extraJavaOptions" ->
"-XX+PrintGC -Dparam1=val1 -Dparam2=val2",
// special characters, to be escaped
"spark.executor.extraJavaOptions" ->
"""-Dparam1="value 1" -Dparam2=value\ 2 -Dpath=$PATH"""),
Contributor: I'm not sure the second param (param2) would be parsed correctly if you actually ran the command. Doesn't there need to be quotes around the space? Have you tested it and checked whether the executor gets the correct value for param2?

Author (@krcz), Mar 12, 2018: @susanxhuynh I have checked it and it worked. There is no need for quotes, since the space is escaped: the backslash stops it from being interpreted as a boundary between arguments and makes it a literal space within the value. This is standard bash (and sh) behaviour.
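
This is easy to check outside Spark. A standalone sketch (not part of the PR; the object name and command string are made up) showing sh treating the escaped space as part of a single argument:

```scala
import scala.sys.process._

object BackslashSpaceDemo {
  def main(args: Array[String]): Unit = {
    // printf '%s\n' prints each shell word on its own line, so any word
    // splitting would be visible in the output.
    val out = Seq("/bin/sh", "-c", """printf '%s\n' -Dparam2=value\ 2""").!!
    print(out) // => -Dparam2=value 2   (one word, space preserved)
  }
}
```
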

"s1",
new Date()))
assert(response.success)

val offer = Utils.createOffer("o1", "s1", mem, cpu)
scheduler.resourceOffers(driver, List(offer).asJava)
val tasks = Utils.verifyTaskLaunched(driver, "o1")
val driverCmd = tasks.head.getCommand.getValue
assert(driverCmd.contains(
"""--conf spark.driver.extraJavaOptions="-XX+PrintGC -Dparam1=val1 -Dparam2=val2""""))
assert(driverCmd.contains(
"""--conf spark.executor.extraJavaOptions="""
+ """"-Dparam1=\"value 1\" -Dparam2=value\\ 2 -Dpath=\$PATH""""))
}
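
As a sanity check on the second assertion, the escaped value can be round-tripped through sh to confirm it decodes back to the original submission value. A standalone sketch (not part of the suite; the object name is made up):

```scala
import scala.sys.process._

object RoundTripDemo {
  def main(args: Array[String]): Unit = {
    // The escaped form asserted above, exactly as it appears in driverCmd.
    val escaped = """"-Dparam1=\"value 1\" -Dparam2=value\\ 2 -Dpath=\$PATH""""
    val out = Seq("/bin/sh", "-c", s"printf '%s' $escaped").!!
    print(out) // => -Dparam1="value 1" -Dparam2=value\ 2 -Dpath=$PATH
  }
}
```
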

test("supports spark.mesos.driverEnv.*") {
setScheduler()
