docs: update patterns containing version number

Update doc regarding last changes (new branch, ansible version, ...)

Signed-off-by: Guillaume Abrioux <gabrioux@redhat.com>
pull/2123/head
Guillaume Abrioux 2017-10-30 15:49:51 +01:00
parent 97b1cb0258
commit c9c278de7d
5 changed files with 23 additions and 20 deletions

@@ -66,9 +66,9 @@ If a change should be backported to a ``stable-*`` Git branch:
 - Determine the latest available stable branch:
 ``git branch -r --list "origin/stable-[0-9].[0-9]" | sort -r | sed 1q``
 - Create a new local branch for your PR, based on the stable branch:
-``git checkout --no-track -b my-backported-change origin/stable-2.2``
+``git checkout --no-track -b my-backported-change origin/stable-3.0``
 - Cherry-pick your change: ``git cherry-pick -x (your-sha1)``
-- Create a new pull request against the ``stable-2.2`` branch.
+- Create a new pull request against the ``stable-3.0`` branch.
 - Ensure that your PR's title has the prefix "backport:", so it's clear
 to reviewers what this is about.
 - Add a comment in your backport PR linking to the original (master) PR.
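Taken together, the backport steps in this hunk can be exercised end-to-end. The sketch below does so in a throwaway repository; the file name, commit messages, and the ``my-backported-change`` branch name are invented for the demo (only the ``--no-track``/``cherry-pick -x`` pattern comes from the doc):

```shell
# Demonstrate the documented backport flow in a throwaway repo:
# branch from the stable branch without tracking it, then
# cherry-pick with -x so the original sha is recorded.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email demo@example.com
git config user.name demo
echo base > file.txt
git add file.txt && git commit -qm 'base'
git branch stable-3.0                 # stands in for origin/stable-3.0
echo fix >> file.txt
git commit -aqm 'fix: the change to backport'
sha=$(git rev-parse HEAD)
git checkout -q --no-track -b my-backported-change stable-3.0
git cherry-pick -x "$sha" >/dev/null
# -x appended a "(cherry picked from commit ...)" line:
git log -1 --pretty=%B | grep "cherry picked from commit"
```

The ``-x`` flag is what makes the backport traceable back to the original master commit, which is also why the doc asks for a comment linking the two PRs.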
@@ -79,7 +79,7 @@ regressions.
 Once this is done, one of the project maintainers will tag the tip of the
 stable branch with your change. For example::
-git checkout stable-2.2
+git checkout stable-3.0
 git pull --ff-only
-git tag v2.2.5
-git push origin v2.2.5
+git tag v3.0.12
+git push origin v3.0.12
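As an aside, the branch-listing one-liner shown in the first hunk works because a reverse lexical sort puts the highest ``stable-X.Y`` name first. A small sketch with invented branch names (note that a purely lexical sort would misorder a hypothetical future ``stable-10.0``):

```shell
# With sample (invented) remote branch names, `sort -r | sed 1q`
# keeps only the highest-versioned stable branch -- the same trick
# as the `git branch -r --list` pipeline in the doc.
printf '%s\n' origin/stable-2.1 origin/stable-3.0 origin/stable-2.2 \
  | sort -r | sed 1q
# prints: origin/stable-3.0
```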

@@ -62,10 +62,13 @@ The ``master`` branch should be considered experimental and used with caution.
 - ``stable-2.1`` Support for ceph version ``jewel``. This branch supports ansible versions
 ``2.1`` and ``2.2.1``.
-- ``stable-2.2`` Support for ceph versions ``jewel`` and ``kraken``. This branch supports ansible versions
+- ``stable-2.2`` Support for ceph versions ``jewel`` and ``luminous``. This branch supports ansible versions
 ``2.1`` and ``2.2.2``.
-- ``master`` Support for ceph versions ``jewel``, ``kraken`` and ``luminous``. This branch supports ansible version ``2.3.1``.
+- ``stable-3.0`` Support for ceph versions ``jewel`` and ``luminous``. This branch supports ansible versions
+``2.2`` and ``2.4.1``.
+- ``master`` Support for ceph versions ``jewel`` and ``luminous``. This branch supports ansible version ``2.4.1``.

 Configuration and Usage
 =======================

@@ -17,7 +17,7 @@ has the following required configuration options:
 This scenario has the following optional configuration options:
 - ``osd_objectstore``: defaults to ``filestore`` if not set. Available options are ``filestore`` or ``bluestore``.
-You can only select ``bluestore`` with the ceph release is Luminous or greater.
+You can only select ``bluestore`` if the ceph release is Luminous or greater.
 - ``dmcrypt``: defaults to ``false`` if not set.
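The constraint in this hunk (bluestore requires Luminous or newer) could be enforced by a small guard. The function, variable names, and release numbers below are invented for illustration and are not part of ceph-ansible itself:

```shell
# Hypothetical guard mirroring the documented rule: bluestore is only
# valid when the ceph release is Luminous (release number 12) or newer.
check_objectstore() {
  objectstore=${1:-filestore}   # documented default is filestore
  release_num=$2                # e.g. 10=jewel, 11=kraken, 12=luminous
  case "$objectstore" in
    filestore) echo ok ;;
    bluestore)
      if [ "$release_num" -ge 12 ]; then echo ok
      else echo "bluestore requires Luminous or greater" >&2; return 1; fi ;;
    *) echo "unknown objectstore: $objectstore" >&2; return 1 ;;
  esac
}

check_objectstore bluestore 12                 # prints: ok
check_objectstore bluestore 10 || echo rejected  # prints: rejected
```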

@@ -36,15 +36,15 @@ To run a single scenario, make sure it is available (should be defined from
 tox -l
-In this example, we will use the ``kraken-ansible2.2-xenial_cluster`` one. The
+In this example, we will use the ``luminous-ansible2.4-xenial_cluster`` one. The
 harness defaults to ``VirtualBox`` as the backend, so if you have that
 installed in your system then this command should just work::
-tox -e kraken-ansible2.2-xenial_cluster
+tox -e luminous-ansible2.4-xenial_cluster
 And for libvirt it would be::
-tox -e kraken-ansible2.2-xenial_cluster -- --provider=libvirt
+tox -e luminous-ansible2.4-xenial_cluster -- --provider=libvirt
 .. warning:: Depending on the type of scenario and resources available, running
@@ -60,9 +60,9 @@ end.
 The output would look something similar to this trimmed version::
-kraken-ansible2.2-xenial_cluster create: /Users/alfredo/python/upstream/ceph-ansible/.tox/kraken-ansible2.2-xenial_cluster
-kraken-ansible2.2-xenial_cluster installdeps: ansible==2.2.2, -r/Users/alfredo/python/upstream/ceph-ansible/tests/requirements.txt
-kraken-ansible2.2-xenial_cluster runtests: commands[0] | vagrant up --no-provision --provider=virtualbox
+luminous-ansible2.4-xenial_cluster create: /Users/alfredo/python/upstream/ceph-ansible/.tox/luminous-ansible2.4-xenial_cluster
+luminous-ansible2.4-xenial_cluster installdeps: ansible==2.4.1, -r/Users/alfredo/python/upstream/ceph-ansible/tests/requirements.txt
+luminous-ansible2.4-xenial_cluster runtests: commands[0] | vagrant up --no-provision --provider=virtualbox
 Bringing machine 'client0' up with 'virtualbox' provider...
 Bringing machine 'rgw0' up with 'virtualbox' provider...
 Bringing machine 'mds0' up with 'virtualbox' provider...
@@ -91,9 +91,9 @@ playbook(s)::
 Once the whole environment is all running the tests will be sent out to the
 hosts, with output similar to this::
-kraken-ansible2.2-xenial_cluster runtests: commands[4] | testinfra -n 4 --sudo -v --connection=ansible --ansible-inventory=/Users/alfredo/python/upstream/ceph-ansible/tests/functional/ubuntu/16.04/cluster/hosts /Users/alfredo/python/upstream/ceph-ansible/tests/functional/tests
+luminous-ansible2.4-xenial_cluster runtests: commands[4] | testinfra -n 4 --sudo -v --connection=ansible --ansible-inventory=/Users/alfredo/python/upstream/ceph-ansible/tests/functional/ubuntu/16.04/cluster/hosts /Users/alfredo/python/upstream/ceph-ansible/tests/functional/tests
 ============================ test session starts ===========================
-platform darwin -- Python 2.7.8, pytest-3.0.7, py-1.4.33, pluggy-0.4.0 -- /Users/alfredo/python/upstream/ceph-ansible/.tox/kraken-ansible2.2-xenial_cluster/bin/python
+platform darwin -- Python 2.7.8, pytest-3.0.7, py-1.4.33, pluggy-0.4.0 -- /Users/alfredo/python/upstream/ceph-ansible/.tox/luminous-ansible2.4-xenial_cluster/bin/python
 cachedir: ../../../../.cache
 rootdir: /Users/alfredo/python/upstream/ceph-ansible/tests, inifile: pytest.ini
 plugins: testinfra-1.5.4, xdist-1.15.0
@@ -122,7 +122,7 @@ hosts, with output similar to this::
 Finally the whole environment gets torn down::
-kraken-ansible2.2-xenial_cluster runtests: commands[5] | vagrant destroy --force
+luminous-ansible2.4-xenial_cluster runtests: commands[5] | vagrant destroy --force
 ==> osd0: Forcing shutdown of VM...
 ==> osd0: Destroying VM and associated drives...
 ==> mon2: Forcing shutdown of VM...
@@ -142,6 +142,6 @@ Finally the whole environment gets torn down::
 And a brief summary of the scenario(s) that ran is displayed::
 ________________________________________________ summary _________________________________________________
-kraken-ansible2.2-xenial_cluster: commands succeeded
+luminous-ansible2.4-xenial_cluster: commands succeeded
 congratulations :)
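The scenario names used throughout this output (e.g. ``luminous-ansible2.4-xenial_cluster``) are built from three dash-separated parts. A small sketch of the decomposition; the parameter-expansion splitting is purely illustrative, not how tox derives its environments:

```shell
# Split a tox scenario name into its three parts: ceph release,
# ansible version, and scenario name. Illustrative only.
name=luminous-ansible2.4-xenial_cluster
ceph=${name%%-*}        # text before the first dash
rest=${name#*-}
ansible=${rest%%-*}     # text between the two dashes
scenario=${rest#*-}     # text after the second dash
echo "$ceph / $ansible / $scenario"
# prints: luminous / ansible2.4 / xenial_cluster
```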

@@ -154,14 +154,14 @@ from the root of the project) will list them, shortened for brevity::
 $ tox -l
 ...
-jewel-ansible2.2-centos7_cluster
+luminous-ansible2.4-centos7_cluster
 ...
 These scenarios are made from different variables, in the above command there
 are 3:
 * jewel: the Ceph version to test
-* ansible2.2: the Ansible version to install
+* ansible2.4: the Ansible version to install
 * ``centos7_cluster``: the name of the scenario
 The last one is important in the *wiring up* of the scenario. It is a variable