[SPARK-5644] [Core] Delete tmp dir when sc is stopped #4412
Changes from 4 commits (SparkEnv.scala):

@@ -93,6 +93,14 @@ class SparkEnv (
     // actorSystem.awaitTermination()

     // Note that blockTransferService is stopped by BlockManager since it is started by it.
+
+    // If we only stop sc but the driver process keeps running as a service, we need to
+    // delete the tmp dir; otherwise it will create too many tmp dirs.
+    try {
+      Utils.deleteRecursively(new File(sparkFilesDir))
+    } catch {
+      case e: Exception => logError(s"Exception while deleting Spark temp dir: $sparkFilesDir", e)
+    }
   }

   private[spark]

Review thread on the Utils.deleteRecursively(new File(sparkFilesDir)) line:

Reviewer: I'm not sure this one is safe. Looking at the code,

Reviewer: I agree; this seems unsafe. It would be a disaster if we accidentally deleted directories that we didn't create, so we can't delete any path that could point to the CWD. Instead, we might be able to either ensure that the CWD is a subfolder of a Spark local directory (so it will be cleaned up as part of our baseDir cleanup) or just change

Old versions of the
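To make the reviewers' concern concrete, here is a minimal sketch of the kind of guard they describe: skip the delete when the path resolves to the current working directory. This is illustrative only and not the PR's code; sparkFilesDir, Utils.deleteRecursively, and logWarning are taken from the diff above, and everything else is an assumption.

import java.io.File

// Illustrative guard only (not the PR's code): never delete a path that could
// be the CWD, since sparkFilesDir might resolve there and a blind delete could
// destroy files Spark did not create.
val target = new File(sparkFilesDir).getCanonicalFile
val cwd = new File(".").getCanonicalFile
if (target != cwd) {
  Utils.deleteRecursively(target)
} else {
  logWarning(s"Not deleting $sparkFilesDir: it resolves to the current working directory")
}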
Author: @srowen @JoshRosen you are right. Since we only need to delete the tmp dir created by the driver, the code now checks executorId first, so that only the driver deletes the tmp dir. Thank you.
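A minimal sketch of the driver-only check described in the reply above, assuming SparkEnv retains the executorId it was constructed with and that SparkContext.DRIVER_IDENTIFIER names the driver; neither detail appears in this diff, so treat those names as assumptions rather than the PR's final code.

import java.io.File

// Sketch of the driver-only cleanup inside SparkEnv.stop(). "executorId" and
// SparkContext.DRIVER_IDENTIFIER are assumed names, not taken from this diff.
if (executorId == SparkContext.DRIVER_IDENTIFIER) {
  try {
    Utils.deleteRecursively(new File(sparkFilesDir))
  } catch {
    case e: Exception =>
      logError(s"Exception while deleting Spark temp dir: $sparkFilesDir", e)
  }
}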
Reviewer: This seems OK to me. I would make the error log a warning, though, as it's non-fatal. Consider using an interpolated string here as you do below.
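Applying both suggestions, the catch block would look roughly like this — a sketch that assumes the Logging trait's logWarning overload accepting a throwable:

import java.io.File

// Same cleanup with both review suggestions applied: log at warning level,
// since the failure is non-fatal, and use an interpolated string for the message.
try {
  Utils.deleteRecursively(new File(sparkFilesDir))
} catch {
  case e: Exception =>
    logWarning(s"Exception while deleting Spark temp dir: $sparkFilesDir", e)
}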