Docker Compose Files
===

Some typical docker compose examples.

If you're not familiar with Docker, have a look at these books (in Chinese):

* [Docker Practice](https://github.com/yeasy/docker_practice)
# Install Docker & Docker Compose
```bash
$ curl -sSL https://get.docker.com/ | sh
$ sudo pip install docker-compose
```
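
After the install finishes, a quick version check confirms both tools are on the PATH (the exact versions will differ):

```bash
$ docker --version
$ docker-compose --version
```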
# Docker-compose Usage
See [Docker Compose Documentation](https://docs.docker.com/compose/).
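
As a quick reference, the typical workflow inside any of the example directories below looks roughly like this:

```sh
$ cd haproxy_web        # or any other example directory
$ docker-compose up -d  # create and start all services in the background
$ docker-compose ps     # list the running containers
$ docker-compose logs   # show the aggregated service output
$ docker-compose down   # stop and remove the containers again
```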
# Example files
## [consul-discovery](consul-discovery)
Use consul to build a service-discovery architecture.
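
As a rough sanity check (assuming the consul server publishes its default HTTP port 8500 on the host; adjust to the actual compose file):

```sh
$ docker-compose up -d
# consul's HTTP catalog API lists the services that have registered themselves
$ curl http://localhost:8500/v1/catalog/services
```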
## [elk_netflow](elk_netflow)
ELK cluster, with netflow support.
```sh
docker-compose scale es=3
```
## [haproxy_web](haproxy_web)
A simple haproxy and web application cluster.
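
A rough way to watch the load balancing in action (assuming haproxy publishes port 80 on the host; check the compose file for the real port):

```sh
$ docker-compose up -d
# repeated requests should be answered by different web containers
$ curl http://localhost:80
$ curl http://localhost:80
```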
## [hyperledger_fabric](hyperledger_fabric)
Quickly boot up a hyperledger fabric cluster with several validator nodes, without vagrant or any manual configuration.

Versions from v0.6 to v1.0.x are currently supported.

See [hyperledger_fabric](hyperledger_fabric) for more details.
## [kafka](kafka)
Start a simple kafka service for testing.
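
A minimal smoke test, assuming the service is named `kafka` and the broker listens on 9092 (adjust to the actual compose file); the console scripts ship with the standard Kafka distribution:

```sh
# produce a few messages to a test topic (type some lines, then Ctrl-C)
$ docker-compose exec kafka \
    kafka-console-producer.sh --broker-list localhost:9092 --topic test

# read them back from the beginning
$ docker-compose exec kafka \
    kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
```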
## [mongo_cluster](mongo_cluster)
Start 3 mongo instances to form a replica set.
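
If the replica set is not initiated automatically, a sketch of that one-time step, assuming the services are named mongo1/mongo2/mongo3 (adjust to the actual compose file):

```sh
# run the initiation against one member, listing all three as replica set members
$ docker-compose exec mongo1 mongo --eval \
    'rs.initiate({_id: "rs0", members: [
        {_id: 0, host: "mongo1:27017"},
        {_id: 1, host: "mongo2:27017"},
        {_id: 2, host: "mongo3:27017"}]})'

# check that one PRIMARY and two SECONDARY members show up
$ docker-compose exec mongo1 mongo --eval 'rs.status()'
```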
## [mongo-elasticsearch](mongo-elasticsearch)
Start mongo (as a cluster) and elasticsearch, with mongo-connector syncing data from mongo to elasticsearch.
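
For reference, the sync side boils down to a mongo-connector invocation along these lines (the hostnames are placeholders; the compose file wires up the real ones):

```sh
# -m: the mongo node whose oplog is tailed
# -t: the elasticsearch endpoint that receives the documents
# -d: the doc manager that translates documents for the target store
$ mongo-connector -m mongo:27017 -t elasticsearch:9200 -d elastic_doc_manager
```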
## [mongo_webui](mongo_webui)
Start 1 mongo instance and a mongo-express web tool to watch it.

The mongo instance will store data in the local /opt/data/mongo_home directory.

The web UI will listen on local port 8081.
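
A quick check after startup:

```sh
$ docker-compose up -d
# mongo-express should answer on the published port
$ curl -I http://localhost:8081
```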
## [nginx_auth](nginx_auth)
Use nginx as a proxy with authentication for the backend application.
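
A quick way to see the authentication in front of the backend (the published port and the demo credentials are assumptions; check the nginx config in this directory for the real values):

```sh
$ curl -I http://localhost                    # expect 401 Unauthorized without credentials
$ curl -I -u user:password http://localhost   # expect 200 OK once authenticated
```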
## [packetbeat_ek](packetbeat_ek)
Demo of packetbeat, elasticsearch and kibana.

Some kibana [dashboard config](https://github.com/elastic/beats-dashboards) files are included.

To import them, after all containers start up, go inside the kibana container and run:
```sh
$ cd /kibana/beats-dashboards-1.0.1 && ./load.sh http://elasticsearch:9200
```
## [registry_mirror](registry_mirror)
Docker registry mirror, with redis as the backend cache.
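
To actually use the mirror, point the local docker daemon at it and restart the daemon (assuming the mirror publishes port 5000 on this host; adjust to the compose file):

```sh
# register the mirror in the daemon configuration
$ echo '{ "registry-mirrors": ["http://localhost:5000"] }' | sudo tee /etc/docker/daemon.json
$ sudo service docker restart
```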
## [spark_cluster](spark_cluster)
Spark cluster with master and worker nodes.
```sh
docker-compose scale worker=2
```
Try submitting a test pi application using the spark-submit command.
```sh
/usr/local/spark/bin/spark-submit --master spark://master:7077 \
    --class org.apache.spark.examples.SparkPi \
    /usr/local/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000
```