Commit
add dir check before delete
sparkFilesDir points to the current working directory on executors, and we only need to delete the temp dir created by the driver, so to be safe we check it first.
Sephiroth-Lin committed Feb 7, 2015
1 parent d7ccc64 commit b38e0f0
Showing 1 changed file with 14 additions and 6 deletions: core/src/main/scala/org/apache/spark/SparkEnv.scala
@@ -94,12 +94,20 @@ class SparkEnv (

     // Note that blockTransferService is stopped by BlockManager since it is started by it.

-    // If we only stop sc, but the driver process still run as a services then we need to delete
-    // the tmp dir, if not, it will create too many tmp dirs
-    try {
-      Utils.deleteRecursively(new File(sparkFilesDir))
-    } catch {
-      case e: Exception => logError(s"Exception while deleting Spark temp dir: $sparkFilesDir", e)
+    /**
+     * If we only stop sc, but the driver process still run as a services then we need to delete
+     * the tmp dir, if not, it will create too many tmp dirs.
+     *
+     * We only need to delete the tmp dir create by driver, so we need to check the sparkFilesDir,
+     * because sparkFilesDir is point to the current working dir in executor.
+     */
+    if("." != sparkFilesDir){
+      try {
+        Utils.deleteRecursively(new File(sparkFilesDir))
+      } catch {
+        case e: Exception =>
+          logWarning(s"Exception while deleting Spark temp dir: $sparkFilesDir", e)
+      }
     }
   }
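The guard this commit adds can be sketched in isolation as follows. This is a minimal stand-alone sketch, not Spark code: `TempDirCleanup`, `shouldDelete`, and the local `deleteRecursively` are hypothetical stand-ins for the real `Utils.deleteRecursively` and `logWarning`. The point it illustrates is that on executors `sparkFilesDir` is `"."` (the current working directory), so an unconditional recursive delete would wipe the executor's working dir; only a driver-created temp dir should be removed.

```scala
import java.io.File

// Hypothetical helper mirroring the guard added in this commit (not Spark API).
object TempDirCleanup {
  // Executors set sparkFilesDir to ".", the current working dir;
  // only a real driver-side temp dir path should be deleted.
  def shouldDelete(sparkFilesDir: String): Boolean = sparkFilesDir != "."

  def cleanup(sparkFilesDir: String): Unit = {
    if (shouldDelete(sparkFilesDir)) {
      try {
        // Stand-in for Utils.deleteRecursively(new File(sparkFilesDir))
        deleteRecursively(new File(sparkFilesDir))
      } catch {
        case e: Exception =>
          // Stand-in for logWarning(...): failure to clean up is not fatal.
          Console.err.println(s"Exception while deleting Spark temp dir: $sparkFilesDir ($e)")
      }
    }
  }

  // Simple recursive delete: remove children first, then the entry itself.
  private def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) f.listFiles().foreach(deleteRecursively)
    if (!f.delete()) throw new java.io.IOException(s"Failed to delete: $f")
  }
}
```

With this guard, `cleanup(".")` is a no-op on executors, while a driver path such as `/tmp/spark-1234` is still removed.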

