Commit

don't check sparkFilesDir, check executorId
Sephiroth-Lin committed Feb 7, 2015
1 parent dd9686e commit f48a3c6
Showing 1 changed file with 5 additions and 8 deletions.
13 changes: 5 additions & 8 deletions core/src/main/scala/org/apache/spark/SparkEnv.scala
@@ -94,14 +94,11 @@ class SparkEnv (
 
   // Note that blockTransferService is stopped by BlockManager since it is started by it.
 
-  /**
-   * If we only stop sc, but the driver process still run as a services then we need to delete
-   * the tmp dir, if not, it will create too many tmp dirs.
-   *
-   * We only need to delete the tmp dir create by driver, so we need to check the sparkFilesDir,
-   * because sparkFilesDir is point to the current working dir in executor.
-   */
-  if ("." != sparkFilesDir) {
+  // If we only stop sc, but the driver process still run as a services then we need to delete
+  // the tmp dir, if not, it will create too many tmp dirs.
+  // We only need to delete the tmp dir create by driver, because sparkFilesDir is point to the
+  // current working dir in executor which we do not need to delete.
+  if (SparkContext.DRIVER_IDENTIFIER == executorId) {
     try {
       Utils.deleteRecursively(new File(sparkFilesDir))
     } catch {
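The change above replaces a fragile check on the value of sparkFilesDir with an explicit check of the executor ID: only the driver should delete the temp directory, since on executors sparkFilesDir points at the current working directory. A minimal standalone sketch of that guard is below; DriverIdentifier here is a hypothetical stand-in for SparkContext.DRIVER_IDENTIFIER (whose exact value is Spark-version-dependent), and deleteRecursively stands in for Utils.deleteRecursively.

```scala
import java.io.File
import java.nio.file.Files

object DriverTmpDirCleanup {
  // Hypothetical stand-in for SparkContext.DRIVER_IDENTIFIER; the real
  // constant is defined in SparkContext and its value varies by version.
  val DriverIdentifier = "driver"

  // Delete sparkFilesDir only when running in the driver, mirroring the
  // executorId check introduced by this commit. Returns true if deleted.
  def cleanup(executorId: String, sparkFilesDir: File): Boolean = {
    if (executorId == DriverIdentifier) {
      deleteRecursively(sparkFilesDir)
      true
    } else {
      // On executors sparkFilesDir is the current working dir; leave it alone.
      false
    }
  }

  // Simplified analogue of Utils.deleteRecursively.
  private def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) f.listFiles().foreach(deleteRecursively)
    f.delete()
  }

  def main(args: Array[String]): Unit = {
    val tmp = Files.createTempDirectory("spark-files-").toFile
    new File(tmp, "dep.jar").createNewFile()
    println(cleanup("executor-1", tmp))       // an executor ID: dir is kept
    println(tmp.exists())
    println(cleanup(DriverIdentifier, tmp))   // the driver: dir is removed
    println(tmp.exists())
  }
}
```

The point of keying on the executor ID rather than on the directory path is that it states the intent directly: "am I the driver?" instead of inferring it from whether sparkFilesDir happens to equal ".".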
