Add an example run command for spark

pull/1/head
Baohua Yang 2015-09-24 10:43:24 +08:00
parent 0a7a37060f
commit 8bcced517b
2 changed files with 7 additions and 1 deletion


@@ -45,7 +45,11 @@ Use nginx as a proxy with authentication for backend application.
docker registry mirror, with redis as the backend cache.
## spark_cluster
Spark cluster with master and worker nodes
Spark cluster with master and worker nodes.
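To try the cluster, first bring the services up in the background; a minimal sketch, assuming you run it from the directory containing this compose file:
```sh
# Start the master and worker services defined in the compose file
docker-compose up -d
```
Then scale out the worker nodes: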
```sh
docker-compose scale worker=2
```
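To confirm the scaled containers are running, list the services (nothing assumed beyond docker-compose itself):
```sh
# Show the master container and the two worker containers with their state
docker-compose ps
```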
Try submitting the SparkPi example application with the spark-submit command:
```sh
/usr/local/spark/bin/spark-submit --master spark://master:7077 --class org.apache.spark.examples.SparkPi /usr/local/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000
```
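The spark-submit path above lives inside the Spark installation shipped in the container image, so the command is most naturally run from a shell in the master container. A minimal sketch, assuming a Compose version that supports `exec` (otherwise use `docker exec -it` with the container name shown by `docker ps`):
```sh
# Open an interactive shell in the running master service,
# then run the spark-submit command shown above from inside it
docker-compose exec master bash
```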


@@ -2,6 +2,8 @@
# This compose file will start spark master node and the worker node.
# All nodes will become a cluster automatically.
# You can run: docker-compose scale worker=2
# After startup, try submitting a Pi calculation application:
# /usr/local/spark/bin/spark-submit --master spark://master:7077 --class org.apache.spark.examples.SparkPi /usr/local/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000
master:
image: sequenceiq/spark:1.4.0