Some typical docker compose templates.

Latest commit e545919108 (Naoya Horiguchi, 2017-12-25): Fix error messages in make clean
"make clean" emits the following error message:

  root@ubuntu:~/docker-compose-files/hyperledger_fabric/v1.0.5# make clean
  Clean all HLF containers and fabric cc images
  awk: cmd. line:1: { print , }
  awk: cmd. line:1:         ^ syntax error
  awk: cmd. line:1: { print , }
  awk: cmd. line:1:           ^ syntax error
  awk: cmd. line:1: { print , }
  awk: cmd. line:1:            ^ unexpected newline or end of string

This is simply because $1 and $2 are not properly escaped: in a Makefile recipe, a literal $ must be written as $$, so make expands $1 and $2 to empty strings before awk ever sees them.
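
For illustration, a stripped-down recipe line showing the problem and the fix (a sketch, not the repo's exact Makefile):

  # Broken: make expands $1/$2 to empty strings before the shell runs,
  # so awk receives '{ print , }'
  docker ps -a | awk '{ print $1, $2 }'

  # Fixed: $$ escapes the dollar sign from make, so awk sees '{ print $1, $2 }'
  docker ps -a | awk '{ print $$1, $$2 }'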

There are further error messages like the ones below:

  "docker rm" requires at least 1 argument(s).
  See 'docker rm --help'.

  Usage:  docker rm [OPTIONS] CONTAINER [CONTAINER...]

  Remove one or more containers
  Makefile:138: recipe for target 'clean' failed
  make: [clean] Error 1 (ignored)
  "docker rmi" requires at least 1 argument(s).
  See 'docker rmi --help'.

  Usage:  docker rmi [OPTIONS] IMAGE [IMAGE...]

  Remove one or more images
  Makefile:138: recipe for target 'clean' failed
  make: [clean] Error 1 (ignored)

If there are no target containers or images, we don't want to call docker
rm/rmi at all. xargs has the -r (--no-run-if-empty) option for exactly this
purpose, so let's turn it on.
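
The resulting pattern looks roughly like this (a sketch, not the repo's exact recipe; -r is a GNU xargs extension):

  # Without -r, an empty pipeline still invokes "docker rm" once with no
  # arguments, producing the usage errors above.
  docker ps -qa | xargs docker rm -f

  # With -r, xargs skips the command entirely when it reads no input.
  docker ps -qa | xargs -r docker rm -f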

Signed-off-by: Naoya Horiguchi <n-horiguchi@ah.jp.nec.com>
| Path | Last commit | Date |
| --- | --- | --- |
| .github | Start reconfiguration demo | 2017-11-26 19:50:49 +08:00 |
| consul-discovery | Add consul discoverable architecture | 2015-08-18 10:47:49 +08:00 |
| elk_netflow | Add more charts | 2015-11-12 09:56:36 +08:00 |
| haproxy_web | Use lf instead of crlf | 2016-04-15 16:45:21 +08:00 |
| hyperledger_fabric | Fix error messages in make clean | 2017-12-25 09:59:09 +09:00 |
| kafka | Update image tag to use fix number | 2017-09-15 13:39:12 +08:00 |
| mongo-elasticsearch | Use mongosetup image | 2015-08-21 17:03:19 +08:00 |
| mongo_cluster | Use lf instead of crlf | 2016-04-15 16:45:21 +08:00 |
| mongo_webui | add more files | 2015-08-10 16:14:51 +08:00 |
| nginx_auth | Use new nginx auth image | 2015-08-12 14:14:39 +08:00 |
| packetbeat_ek | Use lf instead of crlf | 2016-04-15 16:45:21 +08:00 |
| registry_mirror | Use lf instead of crlf | 2016-04-15 16:45:21 +08:00 |
| spark_cluster | Add example running command to spark | 2015-09-24 10:43:24 +08:00 |
| .gitignore | Add pycharm .idea dir | 2016-04-21 13:53:48 +08:00 |
| README.md | Update hyperledger fabric name in README | 2017-10-25 22:16:13 -07:00 |

README.md

# Docker Compose Files

Some typical docker compose examples.

If you're not familiar with Docker, you can have a look at these books (in Chinese):

## Install Docker & Docker Compose

$ curl -sSL https://get.docker.com/ | sh
$ sudo pip install docker-compose
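
To sanity-check the installation:

$ docker --version
$ docker-compose --version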

## Docker Compose Usage

See the Docker Compose documentation: https://docs.docker.com/compose/.

## Example files

### consul-discovery

Uses Consul to build a service-discoverable architecture.
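
As a rough sketch of the classic pattern (the progrium/consul and gliderlabs/registrator images below were the era's popular choices and are illustrative; the actual files in consul-discovery may differ):

    consul:
      image: progrium/consul
      command: -server -bootstrap
      ports:
        - "8500:8500"                            # consul HTTP API / web UI
    registrator:
      image: gliderlabs/registrator
      command: consul://consul:8500              # registry to publish services into
      volumes:
        - /var/run/docker.sock:/tmp/docker.sock  # watch container start/stop events
      links:
        - consul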

### elk_netflow

An ELK (Elasticsearch, Logstash, Kibana) cluster with NetFlow support. Scale the Elasticsearch nodes with:

$ docker-compose scale es=3

### haproxy_web

A simple cluster of web applications behind an haproxy load balancer.
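
A minimal sketch of the topology in classic compose syntax (service and image names are illustrative, not the repo's actual file):

    web:
      image: nginx          # stand-in for the web application
    haproxy:
      image: haproxy:1.6    # load balancer in front of the web services
      links:
        - web
      volumes:
        - ./haproxy.cfg:/usr/local/etc/haproxy/haproxy.cfg:ro  # backend list lives here
      ports:
        - "80:80"           # only the proxy is exposed to the host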

### hyperledger_fabric

Quickly boot up a Hyperledger Fabric cluster with several validator nodes, without Vagrant or any manual configuration.

Versions from v0.6 through v1.0.x are currently supported.

See hyperledger_fabric for more details.

### kafka

Start a simple Kafka service for testing.
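
For reference, a single-broker sketch for local testing (the wurstmeister images and variables below are a common community choice, not necessarily what this repo uses):

    zookeeper:
      image: wurstmeister/zookeeper
      ports:
        - "2181:2181"
    kafka:
      image: wurstmeister/kafka
      links:
        - zookeeper
      ports:
        - "9092:9092"
      environment:
        KAFKA_ADVERTISED_HOST_NAME: localhost    # clients connect via localhost
        KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181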

### mongo_cluster

Start 3 MongoDB instances to form a replica set.
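
Schematically, a 3-member replica set looks like this (a sketch; the repo's service names and versions may differ). Note that the set still has to be initiated once, e.g. by running rs.initiate() in the mongo shell on one member:

    mongo1:
      image: mongo:3.2
      command: mongod --replSet rs0   # all members join the same set name
    mongo2:
      image: mongo:3.2
      command: mongod --replSet rs0
    mongo3:
      image: mongo:3.2
      command: mongod --replSet rs0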

### mongo-elasticsearch

Start MongoDB (as a cluster) and Elasticsearch, with mongo-connector syncing data from MongoDB to Elasticsearch.
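
The sync side is typically a one-liner; a hedged sketch of the usual mongo-connector invocation (host names are illustrative, and the doc-manager name depends on the mongo-connector and Elasticsearch versions):

$ mongo-connector -m mongo:27017 -t http://elasticsearch:9200 -d elastic_doc_manager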

### mongo_webui

Start one MongoDB instance and a mongo-express web tool to inspect it.

The MongoDB instance stores its data under /opt/data/mongo_home on the host.

The web UI listens on local port 8081.
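
Put together, the wiring described above corresponds roughly to (a sketch, not the actual file):

    mongo:
      image: mongo
      volumes:
        - /opt/data/mongo_home:/data/db   # persist data on the host
    mongo-express:
      image: mongo-express                # web UI; finds the database via the "mongo" link
      links:
        - mongo
      ports:
        - "8081:8081"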

### nginx_auth

Use nginx as an authenticating proxy in front of a backend application.
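
Schematically (a sketch with illustrative names; the repo's image bundles its own configuration), the proxy container ships an nginx config combining auth_basic with proxy_pass to the backend, and only the proxy is published:

    web:
      image: nginx          # stand-in for the backend application
    auth-proxy:
      image: nginx          # would carry an auth_basic + proxy_pass config
      links:
        - web
      ports:
        - "80:80"           # clients reach the backend only through the proxy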

### packetbeat_ek

A demo of Packetbeat, Elasticsearch, and Kibana.

Some Kibana dashboard config files are included.

To import them after all the containers have started, go inside the kibana container and run:

$ cd /kibana/beats-dashboards-1.0.1 && ./load.sh http://elasticsearch:9200

### registry_mirror

A Docker registry mirror, with Redis as the backend cache.
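
With the official registry:2 image, this pattern can be expressed through environment variables (a sketch under that assumption; the repo's actual setup may differ):

    redis:
      image: redis
    registry:
      image: registry:2
      links:
        - redis
      ports:
        - "5000:5000"
      environment:
        REGISTRY_PROXY_REMOTEURL: https://registry-1.docker.io  # act as a pull-through mirror
        REGISTRY_STORAGE_CACHE_BLOBDESCRIPTOR: redis            # cache blob metadata in redis
        REGISTRY_REDIS_ADDR: redis:6379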

### spark_cluster

A Spark cluster with master and worker nodes. Scale the workers with:

$ docker-compose scale worker=2

Try submitting a test Pi application with the spark-submit command:

$ /usr/local/spark/bin/spark-submit --master spark://master:7077 --class org.apache.spark.examples.SparkPi /usr/local/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000