check sparkFilesDir before delete
Change the guard from `if (SparkContext.DRIVER_IDENTIFIER == executorId)` to `if (sparkFilesDir != ".")`, and add a comment where sparkFilesDir is created.
Sephiroth-Lin committed Feb 9, 2015
1 parent f48a3c6 commit b2018a5
Showing 1 changed file with 3 additions and 1 deletion.
core/src/main/scala/org/apache/spark/SparkEnv.scala
@@ -98,7 +98,7 @@ class SparkEnv (
     // the tmp dir, if not, it will create too many tmp dirs.
     // We only need to delete the tmp dir created by the driver, because sparkFilesDir points to
     // the current working dir in the executor, which we do not need to delete.
-    if (SparkContext.DRIVER_IDENTIFIER == executorId) {
+    if (sparkFilesDir != ".") {
       try {
         Utils.deleteRecursively(new File(sparkFilesDir))
       } catch {
@@ -351,6 +351,8 @@ object SparkEnv extends Logging {
     // Set the sparkFiles directory, used when downloading dependencies. In local mode,
     // this is a temporary directory; in distributed mode, this is the executor's current
     // working directory.
+    // We use this value in stop() to decide whether the tmp dir needs to be deleted, so be
+    // careful if you change this code.
     val sparkFilesDir: String = if (isDriver) {
       Utils.createTempDir(Utils.getLocalDir(conf), "userFiles").getAbsolutePath
     } else {
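The logic behind the new guard can be sketched in isolation: on the driver, sparkFilesDir is a freshly created temporary directory, while on an executor it is ".", the current working directory, which must never be deleted. The object and method names below are hypothetical illustrations, not Spark's actual code; only the `dir != "."` decision mirrors the commit.

```scala
import java.nio.file.Files

// Hypothetical sketch of the sparkFilesDir lifecycle described in this
// commit. SparkFilesDirSketch, sparkFilesDir, and shouldDeleteOnStop are
// illustrative names, not part of Spark's API.
object SparkFilesDirSketch {
  // Drivers get a temp dir; executors use their current working directory.
  def sparkFilesDir(isDriver: Boolean): String =
    if (isDriver) Files.createTempDirectory("userFiles").toFile.getAbsolutePath
    else "."

  // stop() should delete the directory only when it is a driver-created
  // temp dir, never the executor's working directory ".".
  def shouldDeleteOnStop(dir: String): Boolean = dir != "."
}
```

Checking the directory string itself (rather than comparing executorId against SparkContext.DRIVER_IDENTIFIER) ties the deletion decision directly to how the directory was created, which is why the commit also adds a warning comment at the creation site.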