Remove SPARK_LIBRARY_PATH
pwendell committed Apr 13, 2014
1 parent 6eaf7d0 commit 4982331
Showing 3 changed files with 3 additions and 5 deletions.
bin/run-example: 1 change (0 additions, 1 deletion)
@@ -75,7 +75,6 @@ fi
 
 # Set JAVA_OPTS to be able to load native libraries and to set heap size
 JAVA_OPTS="$SPARK_JAVA_OPTS"
-JAVA_OPTS="$JAVA_OPTS -Djava.library.path=$SPARK_LIBRARY_PATH"
 # Load extra JAVA_OPTS from conf/java-opts, if it exists
 if [ -e "$FWDIR/conf/java-opts" ] ; then
   JAVA_OPTS="$JAVA_OPTS `cat $FWDIR/conf/java-opts`"
bin/spark-class: 1 change (0 additions, 1 deletion)
@@ -98,7 +98,6 @@ fi
 
 # Set JAVA_OPTS to be able to load native libraries and to set heap size
 JAVA_OPTS="$OUR_JAVA_OPTS"
-JAVA_OPTS="$JAVA_OPTS -Djava.library.path=$SPARK_LIBRARY_PATH"
 JAVA_OPTS="$JAVA_OPTS -Xms$OUR_JAVA_MEM -Xmx$OUR_JAVA_MEM"
 # Load extra JAVA_OPTS from conf/java-opts, if it exists
 if [ -e "$FWDIR/conf/java-opts" ] ; then
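Both script changes drop the same line: JAVA_OPTS no longer picks up `-Djava.library.path=$SPARK_LIBRARY_PATH` automatically. A minimal migration sketch for conf/spark-env.sh, with an illustrative native-library directory (run-example reads SPARK_JAVA_OPTS, as the hunk above shows; that the same variable reaches spark-class via OUR_JAVA_OPTS is an assumption here):

# Before this commit, the launch scripts read this variable directly:
# export SPARK_LIBRARY_PATH=/opt/hadoop/lib/native   # no longer honored

# After it, fold the same JVM flag into the options variable the scripts
# still consume (the directory is illustrative):
export SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Djava.library.path=/opt/hadoop/lib/native"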
docs/configuration.md: 6 changes (3 additions, 3 deletions)
@@ -650,8 +650,9 @@ Apart from these, the following properties are also available, and may be useful
   <td>spark.executor.extraJavaOptions</td>
   <td>(none)</td>
   <td>
-    A string of extra JVM options to pass to executors. For instance, GC settings. Note that
-    it is illegal to set Spark properties or heap size settings with this flag.
+    A string of extra JVM options to pass to executors. For instance, GC settings or custom
+    paths for native code. Note that it is illegal to set Spark properties or heap size
+    settings with this option.
   </td>
 </tr>
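As a usage illustration of the reworded property, a hedged sketch combining the two use cases the new text names, a GC setting and a native-code path; the flag values are examples only, and a Spark version that reads conf/spark-defaults.conf is assumed:

# Append an example value to a Spark properties file (illustrative flags;
# note the docs forbid spark.* properties and heap sizes such as -Xmx here):
echo 'spark.executor.extraJavaOptions -XX:+UseConcMarkSweepGC -Djava.library.path=/opt/native/lib' \
  >> "$SPARK_HOME/conf/spark-defaults.conf"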

@@ -678,7 +679,6 @@ The following variables can be set in `spark-env.sh`:
 * `JAVA_HOME`, the location where Java is installed (if it's not on your default `PATH`)
 * `PYSPARK_PYTHON`, the Python binary to use for PySpark
 * `SPARK_LOCAL_IP`, to configure which IP address of the machine to bind to.
-* `SPARK_LIBRARY_PATH`, to add search directories for native libraries.
 * `SPARK_CLASSPATH`, to add elements to Spark's classpath that you want to be present for _all_ applications.
   Note that applications can also add dependencies for themselves through `SparkContext.addJar` -- we recommend
   doing that when possible.
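A short sketch of a conf/spark-env.sh consistent with the updated variable list; every value below is illustrative:

#!/usr/bin/env bash
# Illustrative conf/spark-env.sh after this change.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk      # Java location, if not on PATH
export PYSPARK_PYTHON=/usr/bin/python2.7          # Python binary for PySpark
export SPARK_LOCAL_IP=192.168.1.42                # IP address to bind to
export SPARK_CLASSPATH=/opt/jars/extra.jar        # classpath entries for all apps
# SPARK_LIBRARY_PATH is removed by this commit; pass -Djava.library.path
# through the JVM-options settings (e.g. spark.executor.extraJavaOptions).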
