[SPARK-5437] Fix DriverSuite and SparkSubmitSuite timeout issues #4230
```diff
@@ -28,31 +28,30 @@ import org.apache.spark.util.Utils

 class DriverSuite extends FunSuite with Timeouts {

+  // Regression test for SPARK-530: "Spark driver process doesn't exit after finishing"
   test("driver should exit after finishing") {
     val sparkHome = sys.props.getOrElse("spark.test.home", fail("spark.test.home is not set!"))
-    // Regression test for SPARK-530: "Spark driver process doesn't exit after finishing"
-    val masters = Table(("master"), ("local"), ("local-cluster[2,1,512]"))
+    val masters = Table("master", "local", "local-cluster[2,1,512]")
     forAll(masters) { (master: String) =>
-      failAfter(60 seconds) {
-        Utils.executeAndGetOutput(
-          Seq(s"$sparkHome/bin/spark-class", "org.apache.spark.DriverWithoutCleanup", master),
-          new File(sparkHome),
-          Map("SPARK_TESTING" -> "1", "SPARK_HOME" -> sparkHome))
-      }
+      val process = Utils.executeCommand(
+        Seq(s"$sparkHome/bin/spark-class", "org.apache.spark.DriverWithoutCleanup", master),
+        new File(sparkHome),
+        Map("SPARK_TESTING" -> "1", "SPARK_HOME" -> sparkHome))
+      failAfter(60 seconds) { process.waitFor() }
+      // Ensure we still kill the process in case it timed out
+      process.destroy()
     }
   }
 }
```

Review comments on `process.destroy()`:

- @andrewor14 shouldn't the … (I know this was committed a while back, and the test is even ignored now -- but I was looking at some flaky tests and this just happened to catch my eye, so I was curious)
- yes, it would be good to fix it, though I don't think it will solve the flakiness that we currently encounter
- by the way I'm addressing this at #6886
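The review thread questions whether `process.destroy()` should run even when `failAfter` throws. A minimal sketch of that pattern, written in Java for a runnable illustration (the `waitOrKill` helper and the `sleep` child process are hypothetical stand-ins, not part of the Spark test):

```java
import java.util.concurrent.TimeUnit;

public class TimeoutDemo {
    // Wait up to timeoutSeconds for the child to exit, then always destroy it.
    // Putting destroy() in finally mirrors the reviewer's suggestion: the child
    // is killed even if the wait times out or throws, not only on the happy path.
    static boolean waitOrKill(Process process, long timeoutSeconds) throws InterruptedException {
        try {
            // Analogous to failAfter(60 seconds) { process.waitFor() } in the Scala test.
            return process.waitFor(timeoutSeconds, TimeUnit.SECONDS);
        } finally {
            process.destroy();
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical stand-in for the spark-class driver launch: a child that
        // outlives the 1-second budget, so the wait times out and we destroy it.
        Process slow = new ProcessBuilder("sleep", "10").start();
        System.out.println(waitOrKill(slow, 1) ? "exited in time" : "timed out; destroyed");
    }
}
```

The diff itself calls `destroy()` unconditionally after `failAfter`, which is why the reviewer notes a timeout exception would skip it; #6886 is where that was later addressed.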
```diff
 /**
- * Program that creates a Spark driver but doesn't call SparkContext.stop() or
- * Sys.exit() after finishing.
+ * Program that creates a Spark driver but doesn't call SparkContext#stop() or
+ * sys.exit() after finishing.
  */
 object DriverWithoutCleanup {
   def main(args: Array[String]) {
     Utils.configTestLog4j("INFO")
-    val conf = new SparkConf
+    // Bind the web UI to an ephemeral port in order to avoid conflicts with other tests running on
+    // the same machine (we shouldn't just disable the UI here, since that might mask bugs):
+    val conf = new SparkConf().set("spark.ui.port", "0")
     val sc = new SparkContext(args(0), "DriverWithoutCleanup", conf)
     sc.parallelize(1 to 100, 4).count()
   }
 }
```
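Setting `spark.ui.port` to `0` asks the OS for any free port instead of a fixed one, so concurrent test runs on one machine cannot collide. The same mechanism can be sketched directly with a socket bind, again in Java for a runnable illustration (`bindEphemeral` is a hypothetical helper, not a Spark API):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class EphemeralPortDemo {
    // Binding to port 0 lets the OS pick any free port -- the idea behind
    // spark.ui.port=0 in the diff: parallel tests never fight over a fixed port.
    static int bindEphemeral() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("OS-assigned port: " + bindEphemeral());
    }
}
```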
Review comments on the moved `// Regression test for SPARK-530` comment:

- Since you're touching this, I kinda like the approach of adding the bug number to the `test()` call that I've seen in other places.
- I just moved this line, but I can move it there too