Revert "[SPARK-24418][BUILD] Upgrade Scala to 2.11.12 and 2.12.6" #22160

Status: Closed · wants to merge 1 commit
2 changes: 1 addition & 1 deletion LICENSE
@@ -258,4 +258,4 @@ data/mllib/images/kittens/29.5.a_b_EGDP022204.jpg
data/mllib/images/kittens/54893.jpg
data/mllib/images/kittens/DP153539.jpg
data/mllib/images/kittens/DP802813.jpg
-data/mllib/images/multi-channel/chr30.4.184.jpg
+data/mllib/images/multi-channel/chr30.4.184.jpg
10 changes: 5 additions & 5 deletions dev/deps/spark-deps-hadoop-2.6
@@ -118,7 +118,7 @@ jersey-media-jaxb-2.22.2.jar
jersey-server-2.22.2.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
-jline-2.14.6.jar
+jline-2.12.1.jar
joda-time-2.9.3.jar
jodd-core-3.5.2.jar
jpam-1.1.jar
@@ -169,10 +169,10 @@ parquet-jackson-1.10.0.jar
protobuf-java-2.5.0.jar
py4j-0.10.7.jar
pyrolite-4.13.jar
-scala-compiler-2.11.12.jar
-scala-library-2.11.12.jar
-scala-parser-combinators_2.11-1.1.0.jar
-scala-reflect-2.11.12.jar
+scala-compiler-2.11.8.jar
+scala-library-2.11.8.jar
+scala-parser-combinators_2.11-1.0.4.jar
+scala-reflect-2.11.8.jar
scala-xml_2.11-1.0.5.jar
shapeless_2.11-2.3.2.jar
slf4j-api-1.7.16.jar
10 changes: 5 additions & 5 deletions dev/deps/spark-deps-hadoop-2.7
@@ -119,7 +119,7 @@ jersey-server-2.22.2.jar
jetty-6.1.26.jar
jetty-sslengine-6.1.26.jar
jetty-util-6.1.26.jar
-jline-2.14.6.jar
+jline-2.12.1.jar
joda-time-2.9.3.jar
jodd-core-3.5.2.jar
jpam-1.1.jar
@@ -171,10 +171,10 @@ parquet-jackson-1.10.0.jar
protobuf-java-2.5.0.jar
py4j-0.10.7.jar
pyrolite-4.13.jar
-scala-compiler-2.11.12.jar
-scala-library-2.11.12.jar
-scala-parser-combinators_2.11-1.1.0.jar
-scala-reflect-2.11.12.jar
+scala-compiler-2.11.8.jar
+scala-library-2.11.8.jar
+scala-parser-combinators_2.11-1.0.4.jar
+scala-reflect-2.11.8.jar
scala-xml_2.11-1.0.5.jar
shapeless_2.11-2.3.2.jar
slf4j-api-1.7.16.jar
10 changes: 5 additions & 5 deletions dev/deps/spark-deps-hadoop-3.1
@@ -118,7 +118,7 @@ jersey-media-jaxb-2.22.2.jar
jersey-server-2.22.2.jar
jetty-webapp-9.3.24.v20180605.jar
jetty-xml-9.3.24.v20180605.jar
-jline-2.14.6.jar
+jline-2.12.1.jar
joda-time-2.9.3.jar
jodd-core-3.5.2.jar
jpam-1.1.jar
@@ -189,10 +189,10 @@ protobuf-java-2.5.0.jar
py4j-0.10.7.jar
pyrolite-4.13.jar
re2j-1.1.jar
-scala-compiler-2.11.12.jar
-scala-library-2.11.12.jar
-scala-parser-combinators_2.11-1.1.0.jar
-scala-reflect-2.11.12.jar
+scala-compiler-2.11.8.jar
+scala-library-2.11.8.jar
+scala-parser-combinators_2.11-1.0.4.jar
+scala-reflect-2.11.8.jar
scala-xml_2.11-1.0.5.jar
shapeless_2.11-2.3.2.jar
slf4j-api-1.7.16.jar
8 changes: 4 additions & 4 deletions pom.xml
@@ -155,7 +155,7 @@
<commons.math3.version>3.4.1</commons.math3.version>
<!-- managed up from 3.2.1 for SPARK-11652 -->
<commons.collections.version>3.2.2</commons.collections.version>
-<scala.version>2.11.12</scala.version>
+<scala.version>2.11.8</scala.version>
<scala.binary.version>2.11</scala.binary.version>
<codehaus.jackson.version>1.9.13</codehaus.jackson.version>
<fasterxml.jackson.version>2.6.7</fasterxml.jackson.version>
@@ -740,13 +740,13 @@
<dependency>
<groupId>org.scala-lang.modules</groupId>
<artifactId>scala-parser-combinators_${scala.binary.version}</artifactId>
-<version>1.1.0</version>
+<version>1.0.4</version>
</dependency>
<!-- SPARK-16770 affecting Scala 2.11.x -->
<dependency>
<groupId>jline</groupId>
<artifactId>jline</artifactId>
-<version>2.14.6</version>
+<version>2.12.1</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
@@ -2756,7 +2756,7 @@
<profile>
<id>scala-2.12</id>
<properties>
-<scala.version>2.12.6</scala.version>
+<scala.version>2.12.4</scala.version>
A Member commented on this line:
Oh, I get your point @dbtsai -- do you need to revert anything related to 2.12? I think we do need 2.12.6 for the 2.12 fixes to work, but then again, this support isn't released yet. So I figure there's nothing to 'fix' by reverting this?

The author (Member) replied:
Since 2.12.6 removes the hacks we were using to initialize the Spark context, if we want to revert SPARK-24418 we have to use an older 2.12.x version.

My 2 cents: reverting SPARK-24418 will make the 2.12 work harder, since we would have to deal with the Scala shell issue as part of those tasks.

Alternatively, instead of reverting SPARK-24418, we could consider merging #21749, which fixes the message-printing issue.

<scala.binary.version>2.12</scala.binary.version>
</properties>
<build>
SparkILoop.scala
@@ -36,7 +36,7 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
def this() = this(None, new JPrintWriter(Console.out, true))

override def createInterpreter(): Unit = {
-    intp = new SparkILoopInterpreter(settings, out, initializeSpark)
+    intp = new SparkILoopInterpreter(settings, out)
}

val initializationCommands: Seq[String] = Seq(
@@ -73,15 +73,11 @@ class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
"import org.apache.spark.sql.functions._"
)

-  def initializeSpark(): Unit = {
-    if (!intp.reporter.hasErrors) {
-      // `savingReplayStack` removes the commands from session history.
-      savingReplayStack {
-        initializationCommands.foreach(intp quietRun _)
+  def initializeSpark() {
+    intp.beQuietDuring {
+      savingReplayStack { // remove the commands from session history.
+        initializationCommands.foreach(processLine)
       }
-    } else {
-      throw new RuntimeException(s"Scala $versionString interpreter encountered " +
-        "errors during initialization")
     }
   }

@@ -105,6 +101,16 @@
/** Available commands */
override def commands: List[LoopCommand] = standardCommands

+  /**
+   * We override `loadFiles` because we need to initialize Spark *before* the REPL
+   * sees any files, so that the Spark context is visible in those files. This is a bit of a
+   * hack, but there isn't another hook available to us at this point.
+   */
+  override def loadFiles(settings: Settings): Unit = {
+    initializeSpark()
+    super.loadFiles(settings)
+  }

override def resetCommand(line: String): Unit = {
super.resetCommand(line)
initializeSpark()
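The restored `loadFiles` override hinges on call ordering: the REPL driver invokes `loadFiles` before evaluating any user-supplied init files, so running `initializeSpark()` there guarantees the Spark context exists by the time those files execute. A minimal sketch of that ordering, using hypothetical `ToyRepl`/`ToySparkLoop` classes rather than Spark's real REPL types:

```scala
import scala.collection.mutable.ListBuffer

// Toy stand-in for the REPL driver: it processes init files via loadFiles.
class ToyRepl {
  val log = ListBuffer.empty[String]
  def loadFiles(files: Seq[String]): Unit =
    files.foreach(f => log += s"load:$f")
}

// Same trick as the restored SparkILoop.loadFiles: initialize Spark
// first, then defer to the superclass so files see the context.
class ToySparkLoop extends ToyRepl {
  def initializeSpark(): Unit = log += "spark-init"
  override def loadFiles(files: Seq[String]): Unit = {
    initializeSpark()
    super.loadFiles(files)
  }
}

object HookDemo {
  def run(): List[String] = {
    val repl = new ToySparkLoop
    repl.loadFiles(Seq("init.scala"))
    repl.log.toList  // "spark-init" precedes every "load:..." entry
  }
}
```

Spark's real override delegates to `scala.tools.nsc.interpreter.ILoop.loadFiles`; the toy `super.loadFiles` merely stands in for that.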
SparkILoopInterpreter.scala
@@ -21,22 +21,8 @@ import scala.collection.mutable
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter._

-class SparkILoopInterpreter(settings: Settings, out: JPrintWriter, initializeSpark: () => Unit)
-  extends IMain(settings, out) { self =>
-
-  /**
-   * We override `initializeSynchronous` to initialize Spark *after* `intp` is properly initialized
-   * and *before* the REPL sees any files in the private `loadInitFiles` functions, so that
-   * the Spark context is visible in those files.
-   *
-   * This is a bit of a hack, but there isn't another hook available to us at this point.
-   *
-   * See the discussion in Scala community https://github.com/scala/bug/issues/10913 for detail.
-   */
-  override def initializeSynchronous(): Unit = {
-    super.initializeSynchronous()
-    initializeSpark()
-  }
+class SparkILoopInterpreter(settings: Settings, out: JPrintWriter) extends IMain(settings, out) {
+  self =>

override lazy val memberHandlers = new {
val intp: self.type = self
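By contrast, the code being removed here wired initialization in the other direction: `SparkILoop` passed an `initializeSpark` callback into the interpreter's constructor, and the interpreter fired it from `initializeSynchronous` once its own setup had finished. A minimal sketch of that shape, with hypothetical `ToyIMain`/`ToyInterpreter` classes standing in for Spark's real types:

```scala
import scala.collection.mutable.ListBuffer

// Toy stand-in for IMain: initializeSynchronous completes interpreter setup.
class ToyIMain(val log: ListBuffer[String]) {
  def initializeSynchronous(): Unit = log += "interpreter-ready"
}

// Mirrors the override this revert deletes: the callback fires only after
// the interpreter's own initialization, but still before any files load.
class ToyInterpreter(log: ListBuffer[String], initializeSpark: () => Unit)
    extends ToyIMain(log) {
  override def initializeSynchronous(): Unit = {
    super.initializeSynchronous()
    initializeSpark()
  }
}

object CallbackDemo {
  def run(): List[String] = {
    val log = ListBuffer.empty[String]
    val intp = new ToyInterpreter(log, () => { log += "spark-init"; () })
    intp.initializeSynchronous()
    log.toList  // interpreter setup first, then the Spark callback
  }
}
```

The constructor-injected callback is what lets initialization run inside the interpreter rather than the loop; reverting it is why `createInterpreter` above goes back to the two-argument `SparkILoopInterpreter`.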