
Commit

Minor changes
andrewor14 committed May 13, 2014
1 parent 336bbd9 commit a8c39c5
Showing 2 changed files with 6 additions and 6 deletions.
10 changes: 5 additions & 5 deletions docs/configuration.md
@@ -18,9 +18,9 @@ Spark provides three locations to configure the system:
Spark properties control most application settings and are configured separately for each
application. The preferred way is to set them through
[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) and passing it as an argument to your
SparkContext. SparkConf lets you configure most of the common properties to initialize a cluster
(e.g., master URL and application name), as well as arbitrary key-value pairs through the `set()`
method. For example, we could initialize an application as follows:
SparkContext. SparkConf allows you to configure most of the common properties to initialize a
cluster (e.g. master URL and application name), as well as arbitrary key-value pairs through the
`set()` method. For example, we could initialize an application as follows:

{% highlight scala %}
val conf = new SparkConf
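// A hedged sketch of how such an initialization might continue (the rest of this
// example is collapsed by the diff); the master URL, application name, and memory
// setting below are illustrative assumptions, not part of this commit.
conf.setMaster("local")                    // where to connect, e.g. a local run
    .setAppName("MyApp")                   // name shown in the UI and in logs
    .set("spark.executor.memory", "1g")    // an arbitrary key-value pair via set()
val sc = new SparkContext(conf)
{% endhighlight %}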
@@ -45,7 +45,7 @@ key and a value separated by whitespace. For example,

Any values specified in the file will be passed on to the application, and merged with those
specified through SparkConf. If the same configuration property exists in both `spark-defaults.conf`
and SparkConf, then the latter will take precedence as it is most application-specific.
and SparkConf, then the latter will take precedence as it is the most application-specific.
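
To make the precedence concrete, here is a minimal sketch (the property name and values are illustrative assumptions): if `spark-defaults.conf` sets `spark.executor.memory` to `512m` and the application also sets the same key through SparkConf, the application sees its own value.

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// Suppose conf/spark-defaults.conf contains (illustrative):
//   spark.executor.memory   512m
// Setting the same key programmatically overrides the file-based value,
// because SparkConf is the most application-specific source.
val conf = new SparkConf()
  .setMaster("local")
  .setAppName("MyApp")
  .set("spark.executor.memory", "1g")   // this value wins over 512m from the file
val sc = new SparkContext(conf)
{% endhighlight %}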

## All Configuration Properties

@@ -203,7 +203,7 @@ Apart from these, the following properties are also available, and may be useful
Comma separated list of filter class names to apply to the Spark web ui. The filter should be a
standard javax servlet Filter. Parameters to each filter can also be specified by setting a
java system property of spark.<class name of filter>.params='param1=value1,param2=value2'
(e.g.-Dspark.ui.filters=com.test.filter1 -Dspark.com.test.filter1.params='param1=foo,param2=testing')
(e.g. -Dspark.ui.filters=com.test.filter1 -Dspark.com.test.filter1.params='param1=foo,param2=testing')
</td>
</tr>
<tr>
2 changes: 1 addition & 1 deletion docs/spark-standalone.md
@@ -70,7 +70,7 @@ Once you've set up this file, you can launch or stop your cluster with the follo
- `sbin/start-slaves.sh` - Starts a slave instance on each machine specified in the `conf/slaves` file.
- `sbin/start-all.sh` - Starts both a master and a number of slaves as described above.
- `sbin/stop-master.sh` - Stops the master that was started via the `bin/start-master.sh` script.
- `sbin/stop-slaves.sh` - Stops all slave instances the machines specified in the `conf/slaves` file.
- `sbin/stop-slaves.sh` - Stops all slave instances on the machines specified in the `conf/slaves` file.
- `sbin/stop-all.sh` - Stops both the master and the slaves as described above.

Note that these scripts must be executed on the machine you want to run the Spark master on, not your local machine.
