SPARK-1305: Support persisting RDD's directly to Tachyon #158
```diff
@@ -19,14 +19,13 @@ package org.apache.spark

 import java.io._
 import java.net.URI
-import java.util.{Properties, UUID}
 import java.util.concurrent.atomic.AtomicInteger
+import java.util.{Properties, UUID}
+import java.util.UUID.randomUUID

 import scala.collection.{Map, Set}
 import scala.collection.generic.Growable
 import scala.collection.mutable.{ArrayBuffer, HashMap}
 import scala.reflect.{ClassTag, classTag}

 import org.apache.hadoop.conf.Configuration
 import org.apache.hadoop.fs.Path
 import org.apache.hadoop.io.{ArrayWritable, BooleanWritable, BytesWritable, DoubleWritable, FloatWritable, IntWritable, LongWritable, NullWritable, Text, Writable}
```

> **Review comment** (on the moved imports): nit: incorrect import order

```diff
@@ -130,6 +129,11 @@ class SparkContext(
   val master = conf.get("spark.master")
   val appName = conf.get("spark.app.name")

+  // Generate the random name for a temp folder in Tachyon
+  // Add a timestamp as the suffix here to make it more safe
+  val tachyonFolderName = "spark-" + randomUUID.toString()
+  conf.set("spark.tachyonStore.folderName", tachyonFolderName)
+
   val isLocal = (master == "local" || master.startsWith("local["))

   if (master == "yarn-client") System.setProperty("SPARK_YARN_MODE", "true")
```
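The folder name gives each application its own namespace under the Tachyon store. A standalone sketch of that naming scheme (the `spark-` prefix and use of `randomUUID` come from the diff; the `TachyonFolderName` object and its `generate` helper are hypothetical, for illustration only):

```scala
import java.util.UUID.randomUUID

// Hypothetical standalone object illustrating the folder-name scheme
// from the diff; Spark itself does this inline in SparkContext.
object TachyonFolderName {
  // "spark-" plus a random UUID, e.g. "spark-f47ac10b-58cc-4372-a567-0e02b2c3d479".
  // Uniqueness keeps concurrently running applications from sharing
  // (and clobbering) each other's temp folders in Tachyon.
  def generate(): String = "spark-" + randomUUID.toString

  def main(args: Array[String]): Unit = {
    val name = generate()
    assert(name.startsWith("spark-"))
    assert(name.length == 42)        // 6-char prefix + 36-char UUID string
    assert(generate() != generate()) // fresh UUID each call
  }
}
```

Note that the code comment in the diff mentions a timestamp suffix while the implementation actually appends a random UUID; either way the effect is a unique per-application folder name.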
```diff
@@ -41,13 +41,22 @@ object ExecutorExitCode {
   /** DiskStore failed to create a local temporary directory after many attempts. */
   val DISK_STORE_FAILED_TO_CREATE_DIR = 53

+  /** TachyonStore failed to initialize after many attempts. */
+  val TACHYON_STORE_FAILED_TO_INITIALIZE = 54
+
+  /** TachyonStore failed to create a local temporary directory after many attempts. */
+  val TACHYON_STORE_FAILED_TO_CREATE_DIR = 55
+
   def explainExitCode(exitCode: Int): String = {
     exitCode match {
       case UNCAUGHT_EXCEPTION => "Uncaught exception"
       case UNCAUGHT_EXCEPTION_TWICE => "Uncaught exception, and logging the exception failed"
       case OOM => "OutOfMemoryError"
       case DISK_STORE_FAILED_TO_CREATE_DIR =>
         "Failed to create local directory (bad spark.local.dir?)"
+      case TACHYON_STORE_FAILED_TO_INITIALIZE => "TachyonStore failed to initialize."
+      case TACHYON_STORE_FAILED_TO_CREATE_DIR =>
+        "TachyonStore failed to create a local temporary directory."
       case _ =>
         "Unknown executor exit code (" + exitCode + ")" + (
           if (exitCode > 128) {
```

> **Review comment** (on `explainExitCode`): Because you added two new special exit codes above, you should also modify this method to explain them. That's why we have the named exit codes here, to give users a meaningful message if the executor crashes.
>
> **Reply:** Done
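The named constants only pay off because `explainExitCode` maps them to human-readable messages when an executor dies. A self-contained sketch of that mapping (the constants and message strings are taken from the diff; the trimmed-down `ExitCodes` object name and `explain` signature are illustrative):

```scala
// Illustrative, trimmed-down version of the exit-code lookup in the diff;
// the real ExecutorExitCode object has more cases (OOM, uncaught exceptions, ...).
object ExitCodes {
  val DISK_STORE_FAILED_TO_CREATE_DIR = 53
  val TACHYON_STORE_FAILED_TO_INITIALIZE = 54
  val TACHYON_STORE_FAILED_TO_CREATE_DIR = 55

  // Translate a numeric executor exit code into an actionable message.
  def explain(code: Int): String = code match {
    case DISK_STORE_FAILED_TO_CREATE_DIR =>
      "Failed to create local directory (bad spark.local.dir?)"
    case TACHYON_STORE_FAILED_TO_INITIALIZE =>
      "TachyonStore failed to initialize."
    case TACHYON_STORE_FAILED_TO_CREATE_DIR =>
      "TachyonStore failed to create a local temporary directory."
    case other =>
      "Unknown executor exit code (" + other + ")"
  }

  def main(args: Array[String]): Unit = {
    assert(explain(54) == "TachyonStore failed to initialize.")
    assert(explain(99) == "Unknown executor exit code (99)")
  }
}
```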
> **Review comment** (on the build dependency changes): The exclusions here don't seem to match the exclusions in the sbt build (https://github.com/RongGu/spark-1/blob/master/project/SparkBuild.scala#L325) -- is there a reason for this difference?
>
> **Reply:** Seems this one excludes more than the sbt one: `excludeAll(excludeHadoop, excludeCurator, excludeEclipseJetty, excludePowermock)`. This one also excluded junit; there is no particular reason to do so.
>
> **Review comment:** Will Powermock and JUnit even be included in the tachyon-client artifact?
>
> **Reply:** No, they won't. So, from Tachyon 0.5.0, we use tachyon-client.
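The exclusions under discussion are sbt-style transitive-dependency exclusions. A hedged sketch of what such an exclusion list looks like in a build definition (the coordinates and excluded organizations are illustrative, not the PR's exact build change):

```scala
// build.sbt fragment -- illustrative only, not the PR's actual change.
// Excludes conflicting or test-only transitive dependencies that the
// tachyon-client artifact should not drag into Spark's classpath.
libraryDependencies += "org.tachyonproject" % "tachyon-client" % "0.5.0" excludeAll(
  ExclusionRule(organization = "org.apache.hadoop"),
  ExclusionRule(organization = "org.powermock"),
  ExclusionRule(organization = "junit")
)
```

Keeping the sbt and Maven exclusion lists in sync matters because a mismatch means the two builds ship different transitive dependencies, which is exactly the discrepancy the reviewer flagged.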