docs - Use code-block to format code examples in Developer Guide (#75849)

Fixes #75663
pull/75891/head
Samuel Gaist 3 years ago committed by GitHub
parent 8988f8ab5d
commit 19fce0527a

@ -25,16 +25,22 @@ Creating a PR
* Create the directory ``~/dev/ansible/collections/ansible_collections/community``::
* Create the directory ``~/dev/ansible/collections/ansible_collections/community``:
.. code-block:: shell
mkdir -p ~/dev/ansible/collections/ansible_collections/community
* Clone `the community.general Git repository <https://github.com/ansible-collections/community.general/>`_ or a fork of it into the directory ``general``::
* Clone `the community.general Git repository <https://github.com/ansible-collections/community.general/>`_ or a fork of it into the directory ``general``:
.. code-block:: shell
cd ~/dev/ansible/collections/ansible_collections/community
git clone git@github.com:ansible-collections/community.general.git general
* If you clone from a fork, add the original repository as a remote ``upstream``::
* If you clone from a fork, add the original repository as a remote ``upstream``:
.. code-block:: shell
cd ~/dev/ansible/collections/ansible_collections/community/general
git remote add upstream git@github.com:ansible-collections/community.general.git

@ -13,7 +13,9 @@ A collection is a simple data structure. None of the directories are required un
Collection directories and files
================================
A collection can contain these directories and files::
A collection can contain these directories and files:
.. code-block:: shell-session
collection/
├── docs/

@ -20,7 +20,9 @@ You must always execute ``ansible-test`` from the root directory of a collection
Compile and sanity tests
------------------------
To run all compile and sanity tests::
To run all compile and sanity tests:
.. code-block:: shell-session
ansible-test sanity --docker default -v
@ -31,15 +33,21 @@ Adding unit tests
You must place unit tests in the appropriate ``tests/unit/plugins/`` directory. For example, you would place tests for ``plugins/module_utils/foo/bar.py`` in ``tests/unit/plugins/module_utils/foo/test_bar.py`` or ``tests/unit/plugins/module_utils/foo/bar/test_bar.py``. For examples, see the `unit tests in community.general <https://github.com/ansible-collections/community.general/tree/master/tests/unit/>`_.
To run all unit tests for all supported Python versions::
To run all unit tests for all supported Python versions:
.. code-block:: shell-session
ansible-test units --docker default -v
To run all unit tests only for a specific Python version::
To run all unit tests only for a specific Python version:
.. code-block:: shell-session
ansible-test units --docker default -v --python 3.6
To run only a specific unit test::
To run only a specific unit test:
.. code-block:: shell-session
ansible-test units --docker default -v --python 3.6 tests/unit/plugins/module_utils/foo/test_bar.py
@ -59,13 +67,17 @@ For examples, see the `integration tests in community.general <https://github.co
Since integration tests can install requirements, and set up, start and stop services, we recommend running them in docker containers or otherwise restricted environments whenever possible. By default, ``ansible-test`` supports Docker images for several operating systems. See the `list of supported docker images <https://github.com/ansible/ansible/blob/devel/test/lib/ansible_test/_data/completion/docker.txt>`_ for all options. Use the ``default`` image mainly for platform-independent integration tests, such as those for cloud modules. The following examples use the ``centos8`` image.
To execute all integration tests for a collection::
To execute all integration tests for a collection:
.. code-block:: shell-session
ansible-test integration --docker centos8 -v
If you want more detailed output, run the command with ``-vvv`` instead of ``-v``. Alternatively, specify ``--retry-on-error`` to automatically re-run failed tests with higher verbosity levels.
To execute only the integration tests in a specific directory::
To execute only the integration tests in a specific directory:
.. code-block:: shell-session
ansible-test integration --docker centos8 -v connection_bar

@ -360,9 +360,11 @@ Inventory script conventions
Inventory scripts must accept the ``--list`` and ``--host <hostname>`` arguments. Although other arguments are allowed, Ansible will not use them.
Such arguments might still be useful for executing the scripts directly.
When the script is called with the single argument ``--list``, the script must output to stdout a JSON object that contains all the groups to be managed. Each group's value should be either an object containing a list of each host, any child groups, and potential group variables, or simply a list of hosts::
When the script is called with the single argument ``--list``, the script must output to stdout a JSON object that contains all the groups to be managed. Each group's value should be either an object containing a list of each host, any child groups, and potential group variables, or simply a list of hosts:
.. code-block:: json
{
"group001": {
"hosts": ["host001", "host002"],
@ -383,12 +385,13 @@ When the script is called with the single argument ``--list``, the script must o
If any of the elements of a group are empty, they may be omitted from the output.
When called with the argument ``--host <hostname>`` (where <hostname> is a host from above), the script must print a JSON object, either empty or containing variables to make them available to templates and playbooks. For example::
When called with the argument ``--host <hostname>`` (where <hostname> is a host from above), the script must print a JSON object, either empty or containing variables to make them available to templates and playbooks. For example:
.. code-block:: json
{
"VAR001": "VALUE",
"VAR002": "VALUE",
"VAR002": "VALUE"
}
Printing variables is optional. If the script does not print variables, it should print an empty JSON object.
@ -404,7 +407,9 @@ The stock inventory script system mentioned above works for all versions of Ansi
To avoid this inefficiency, if the inventory script returns a top-level element called "_meta", it is possible to return all the host variables in a single script execution. When this meta element contains a value for "hostvars", the inventory script will not be invoked with ``--host`` for each host. This behavior results in a significant performance increase for large numbers of hosts.
The data to be added to the top-level JSON object looks like this::
The data to be added to the top-level JSON object looks like this:
.. code-block:: text
{
@ -424,7 +429,9 @@ The data to be added to the top-level JSON object looks like this::
}
To satisfy the requirements of using ``_meta``, and to prevent Ansible from calling your inventory with ``--host``, you must at least populate ``_meta`` with an empty ``hostvars`` object.
For example::
For example:
.. code-block:: text
{
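As a rough, hypothetical sketch (not part of the documentation above), the conventions described in this file can be tied together in one small script; the group names, host names, and variables below are made up and hard-coded rather than drawn from a real source:

.. code-block:: python

    #!/usr/bin/env python
    """Minimal example inventory script; all data is hard-coded for illustration."""

    import argparse
    import json

    HOSTVARS = {'host001': {'VAR001': 'VALUE'}, 'host002': {}}


    def main():
        parser = argparse.ArgumentParser()
        parser.add_argument('--list', action='store_true')
        parser.add_argument('--host')
        args = parser.parse_args()

        if args.list:
            # Including "_meta" with "hostvars" means Ansible will not run
            # the script again with --host for every single host.
            print(json.dumps({
                'group001': {'hosts': ['host001', 'host002']},
                '_meta': {'hostvars': HOSTVARS},
            }))
        elif args.host:
            # Variables for one host, or an empty object if there are none.
            print(json.dumps(HOSTVARS.get(args.host, {})))


    if __name__ == '__main__':
        main()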

@ -8,7 +8,9 @@ Ansible provides a number of module utilities, or snippets of shared code, that
provide helper functions you can use when developing your own modules. The
``basic.py`` module utility provides the main entry point for accessing the
Ansible library, and all Python Ansible modules must import something from
``ansible.module_utils``. A common option is to import ``AnsibleModule``::
``ansible.module_utils``. A common option is to import ``AnsibleModule``:
.. code-block:: python
from ansible.module_utils.basic import AnsibleModule
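As a minimal sketch (the module and its single ``name`` option are hypothetical), a module built on ``AnsibleModule`` typically declares its options, does its work, and finishes through ``exit_json`` or ``fail_json``:

.. code-block:: python

    #!/usr/bin/python
    from ansible.module_utils.basic import AnsibleModule


    def main():
        # Declare the options this module accepts and whether check mode is supported.
        module = AnsibleModule(
            argument_spec=dict(
                name=dict(type='str', required=True),
            ),
            supports_check_mode=True,
        )

        # A real module would inspect the system and make changes here.
        module.exit_json(changed=False, name=module.params['name'])


    if __name__ == '__main__':
        main()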

@ -325,7 +325,9 @@ EXAMPLES block
After the shebang, the UTF-8 coding, the copyright line, the license section, and the ``DOCUMENTATION`` block comes the ``EXAMPLES`` block. Here you show users how your module works with real-world examples in multi-line plain-text YAML format. The best examples are ready for the user to copy and paste into a playbook. Review and update your examples with every change to your module.
Per playbook best practices, each example should include a ``name:`` line::
Per playbook best practices, each example should include a ``name:`` line:
.. code-block:: text
EXAMPLES = r'''
- name: Ensure foo is installed
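For reference, a complete ``EXAMPLES`` block might look like the following sketch; the ``my_namespace.my_collection.my_test`` module and its options are hypothetical:

.. code-block:: python

    EXAMPLES = r'''
    - name: Ensure foo is installed
      my_namespace.my_collection.my_test:
        name: foo
        state: present

    - name: Ensure foo is absent
      my_namespace.my_collection.my_test:
        name: foo
        state: absent
    '''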
@ -371,7 +373,9 @@ Otherwise, for each value returned, provide the following fields. All fields are
:contains:
Optional. To describe nested return values, set ``type: dict``, or ``type: list``/``elements: dict``, or if you really have to, ``type: complex``, and repeat the elements above for each sub-field.
Here are two example ``RETURN`` sections, one with three simple fields and one with a complex nested field::
Here are two example ``RETURN`` sections, one with three simple fields and one with a complex nested field:
.. code-block:: text
RETURN = r'''
dest:
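As a small sketch using the fields listed above (the ``msg`` value and its sample are purely illustrative), a simple ``RETURN`` entry looks like this:

.. code-block:: python

    RETURN = r'''
    msg:
        description: Human-readable message describing what the module did.
        returned: always
        type: str
        sample: Package foo is already installed
    '''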

@ -153,7 +153,9 @@ Verifying your module code in a playbook
The next step in verifying your new module is to consume it with an Ansible playbook.
- Create a playbook in any directory: ``$ touch testmod.yml``
- Add the following to the new playbook file::
- Add the following to the new playbook file:
.. code-block:: yaml
- name: test my new module
hosts: localhost

@ -579,7 +579,9 @@ are some steps that need to be followed to set this up:
#!powershell
You can add more args to ``$complex_args`` as required by the module or define the module options through a JSON file
with the structure::
with the structure:
.. code-block:: json
{
"ANSIBLE_MODULE_ARGS": {

@ -425,7 +425,9 @@ _ansible_selinux_special_fs
List. Names of filesystems which should have a special SELinux
context. They are used by the `AnsibleModule` methods which operate on
files (changing attributes, moving, and copying). To set, add a comma separated string of filesystem names in :file:`ansible.cfg`::
files (changing attributes, moving, and copying). To set, add a comma separated string of filesystem names in :file:`ansible.cfg`:
.. code-block:: ini
# ansible.cfg
[selinux]

@ -14,17 +14,23 @@ Rebasing the branch used to create your PR will resolve both of these issues.
Configuring your remotes
========================
Before you can rebase your PR, you need to make sure you have the proper remotes configured. These instructions apply to any repository on GitHub, including collections repositories. On other platforms (bitbucket, gitlab), the same principles and commands apply but the syntax may be different. We use the ansible/ansible repository here as an example. In other repositories, the branch names may be different. Assuming you cloned your fork in the usual fashion, the ``origin`` remote will point to your fork::
Before you can rebase your PR, you need to make sure you have the proper remotes configured. These instructions apply to any repository on GitHub, including collections repositories. On other platforms (bitbucket, gitlab), the same principles and commands apply but the syntax may be different. We use the ansible/ansible repository here as an example. In other repositories, the branch names may be different. Assuming you cloned your fork in the usual fashion, the ``origin`` remote will point to your fork:
.. code-block:: shell-session
$ git remote -v
origin git@github.com:YOUR_GITHUB_USERNAME/ansible.git (fetch)
origin git@github.com:YOUR_GITHUB_USERNAME/ansible.git (push)
However, you also need to add a remote which points to the upstream repository::
However, you also need to add a remote which points to the upstream repository:
.. code-block:: shell-session
$ git remote add upstream https://github.com/ansible/ansible.git
Which should leave you with the following remotes::
Which should leave you with the following remotes:
.. code-block:: shell-session
$ git remote -v
origin git@github.com:YOUR_GITHUB_USERNAME/ansible.git (fetch)
@ -32,7 +38,9 @@ Which should leave you with the following remotes::
upstream https://github.com/ansible/ansible.git (fetch)
upstream https://github.com/ansible/ansible.git (push)
Checking the status of your branch should show your fork is up-to-date with the ``origin`` remote::
Checking the status of your branch should show your fork is up-to-date with the ``origin`` remote:
.. code-block:: shell-session
$ git status
On branch YOUR_BRANCH
@ -42,14 +50,18 @@ Checking the status of your branch should show your fork is up-to-date with the
Rebasing your branch
====================
Once you have an ``upstream`` remote configured, you can rebase the branch for your PR::
Once you have an ``upstream`` remote configured, you can rebase the branch for your PR:
.. code-block:: shell-session
$ git pull --rebase upstream devel
This will replay the changes in your branch on top of the changes made in the upstream ``devel`` branch.
If there are merge conflicts, you will be prompted to resolve those before you can continue.
After you rebase, the status of your branch changes::
After you rebase, the status of your branch changes:
.. code-block:: shell-session
$ git status
On branch YOUR_BRANCH
@ -65,7 +77,9 @@ Updating your pull request
Now that you've rebased your branch, you need to push your changes to GitHub to update your PR.
Since rebasing re-writes git history, you will need to use a force push::
Since rebasing re-writes git history, you will need to use a force push:
.. code-block:: shell-session
$ git push --force-with-lease

@ -40,7 +40,9 @@ To add new machines, there is no additional SSL signing server involved, so ther
If there's another source of truth in your infrastructure, Ansible can also connect to that. Ansible can draw inventory, group, and variable information from sources like EC2, Rackspace, OpenStack, and more.
Here's what a plain text inventory file looks like::
Here's what a plain text inventory file looks like:
.. code-block:: text
---
[webservers]
@ -62,7 +64,9 @@ Playbooks can finely orchestrate multiple slices of your infrastructure topology
Ansible's approach to orchestration is one of finely-tuned simplicity, as we believe your automation code should make perfect sense to you years down the road and there should be very little to remember about special syntax or features.
Here's what a simple playbook looks like::
Here's what a simple playbook looks like:
.. code-block:: yaml
---
- hosts: webservers

@ -60,7 +60,9 @@ Organization
When Pull Requests (PRs) are created they are tested using Azure Pipelines, a Continuous Integration (CI) tool. Results are shown at the end of every PR.
When Azure Pipelines detects an error that can be linked back to a file modified in the PR, the relevant lines will be added as a GitHub comment. For example::
When Azure Pipelines detects an error that can be linked back to a file modified in the PR, the relevant lines will be added as a GitHub comment. For example:
.. code-block:: text
The test `ansible-test sanity --test pep8` failed with the following errors:
@ -71,11 +73,15 @@ When Azure Pipelines detects an error and it can be linked back to a file that h
From the above example we can see that ``--test pep8`` and ``--test validate-modules`` have identified an issue. The commands given allow you to run the same tests locally to ensure you've fixed all issues without having to push your changes to GitHub and wait for Azure Pipelines, for example:
If you haven't already got Ansible available, use the local checkout by running::
If you haven't already got Ansible available, use the local checkout by running:
.. code-block:: shell-session
source hacking/env-setup
Then run the tests detailed in the GitHub comment::
Then run the tests detailed in the GitHub comment:
.. code-block:: shell-session
ansible-test sanity --test pep8
ansible-test sanity --test validate-modules
@ -126,8 +132,9 @@ Here's how:
other flavors, since some features (for example, package managers such as apt or yum) are specific to those OS versions.
Create a fresh area to work::
Create a fresh area to work:
.. code-block:: shell-session
git clone https://github.com/ansible/ansible.git ansible-pr-testing
cd ansible-pr-testing
@ -140,7 +147,9 @@ Next, find the pull request you'd like to test and make note of its number. It w
It is important that the pull request target be ``ansible:devel``, as we do not accept pull requests into any other branch. Dot releases are cherry-picked manually by Ansible staff.
Use the pull request number when you fetch the proposed changes and create your branch for testing::
Use the pull request number when you fetch the proposed changes and create your branch for testing:
.. code-block:: shell-session
git fetch origin refs/pull/XXXX/head:testing_PRXXXX
git checkout testing_PRXXXX
@ -156,7 +165,9 @@ The first command fetches the proposed changes from the pull request and creates
The Ansible source includes a script, frequently used by Ansible developers, that allows you to use Ansible
directly from source without requiring a full installation.
Simply source it (to use the Linux/Unix terminology) to begin using it immediately::
Simply source it (to use the Linux/Unix terminology) to begin using it immediately:
.. code-block:: shell-session
source ./hacking/env-setup

@ -3,7 +3,9 @@ no-main-display
As of Ansible 2.8, ``Display`` should no longer be imported from ``__main__``.
``Display`` is now a singleton and should be utilized like the following::
``Display`` is now a singleton and should be utilized like the following:
.. code-block:: python
from ansible.utils.display import Display
display = Display()
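Once instantiated, the shared ``display`` object is used instead of printing directly; a short usage sketch (the messages are placeholders):

.. code-block:: python

    from ansible.utils.display import Display

    display = Display()

    # Shown only when enough -v flags are passed on the command line.
    display.vvv("detailed debugging information")

    # Always shown, formatted as a warning.
    display.warning("something looks odd, but execution continues")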

@ -33,12 +33,16 @@ ansible-test command
--------------------
The example below assumes ``bin/`` is in your ``$PATH``. An easy way to achieve that
is to initialize your environment with the ``env-setup`` command::
is to initialize your environment with the ``env-setup`` command:
.. code-block:: shell-session
source hacking/env-setup
ansible-test --help
You can also call ``ansible-test`` with the full path::
You can also call ``ansible-test`` with the full path:
.. code-block:: shell-session
bin/ansible-test --help
@ -74,19 +78,27 @@ outside of those test subdirectories. They will also not reconfigure or bounce
Use the ``--docker-no-pull`` option to avoid pulling the latest container image. This is required when using custom local images that are not available for download.
Run as follows for all POSIX platform tests executed by our CI system in a fedora32 docker container::
Run as follows for all POSIX platform tests executed by our CI system in a fedora32 docker container:
.. code-block:: shell-session
ansible-test integration shippable/ --docker fedora32
You can target specific tests as well, such as those for individual modules::
You can target specific tests as well, such as those for individual modules:
.. code-block:: shell-session
ansible-test integration ping
You can use the ``-v`` option to make the output more verbose::
You can use the ``-v`` option to make the output more verbose:
.. code-block:: shell-session
ansible-test integration lineinfile -vvv
Use the following command to list all the available targets::
Use the following command to list all the available targets:
.. code-block:: shell-session
ansible-test integration --list-targets
@ -98,7 +110,9 @@ Destructive Tests
=================
These tests are allowed to install and remove some trivial packages. You will likely want to devote these
to a virtual environment, such as Docker. They won't reformat your filesystem::
to a virtual environment, such as Docker. They won't reformat your filesystem:
.. code-block:: shell-session
ansible-test integration destructive/ --docker fedora32
@ -112,16 +126,22 @@ for testing, and enable PowerShell Remoting to continue.
Running these tests may result in changes to your Windows host, so don't run
them against a production/critical Windows environment.
Enable PowerShell Remoting (run on the Windows host via Remote Desktop)::
Enable PowerShell Remoting (run on the Windows host via Remote Desktop):
.. code-block:: shell-session
Enable-PSRemoting -Force
Define Windows inventory::
Define Windows inventory:
.. code-block:: shell-session
cp inventory.winrm.template inventory.winrm
${EDITOR:-vi} inventory.winrm
Run the Windows tests executed by our CI system::
Run the Windows tests executed by our CI system:
.. code-block:: shell-session
ansible-test windows-integration -v shippable/
@ -140,12 +160,16 @@ the Ansible continuous integration (CI) system is recommended.
Running Integration Tests
-------------------------
To run all CI integration test targets for POSIX platforms in an Ubuntu 18.04 container::
To run all CI integration test targets for POSIX platforms in an Ubuntu 18.04 container:
.. code-block:: shell-session
ansible-test integration shippable/ --docker ubuntu1804
You can also run specific tests or select a different Linux distribution.
For example, to run tests for the ``ping`` module on an Ubuntu 18.04 container::
For example, to run tests for the ``ping`` module on an Ubuntu 18.04 container:
.. code-block:: shell-session
ansible-test integration ping --docker ubuntu1804

@ -41,7 +41,9 @@ In order to run cloud tests, you must provide access credentials in a file
named ``credentials.yml``. A sample credentials file named
``credentials.template`` is available for syntax help.
Provide cloud credentials::
Provide cloud credentials:
.. code-block:: shell-session
cp credentials.template credentials.yml
${EDITOR:-vi} credentials.yml
@ -85,11 +87,15 @@ Running Tests
The tests are invoked via a ``Makefile``.
If you haven't already got Ansible available, use the local checkout by doing::
If you haven't already got Ansible available, use the local checkout by doing:
.. code-block:: shell-session
source hacking/env-setup
Run the tests by doing::
Run the tests by doing:
.. code-block:: shell-session
cd test/integration/
# TARGET is the name of the test from the list at the top of this page

@ -70,7 +70,9 @@ be written. Online reports are available but only cover the ``devel`` branch (s
Add the ``--coverage`` option to any test command to collect code coverage data. If you
aren't using the ``--venv`` or ``--docker`` options, which create an isolated Python
environment, then you may have to use the ``--requirements`` option to ensure that the
correct version of the coverage module is installed::
correct version of the coverage module is installed:
.. code-block:: shell-session
ansible-test coverage erase
ansible-test units --coverage apt
@ -84,6 +86,8 @@ Reports can be generated in several different formats:
* ``ansible-test coverage html`` - HTML report.
* ``ansible-test coverage xml`` - XML report.
To clear data between test runs, use the ``ansible-test coverage erase`` command. For a full list of features see the online help::
To clear data between test runs, use the ``ansible-test coverage erase`` command. For a full list of features see the online help:
.. code-block:: shell-session
ansible-test coverage --help

@ -53,7 +53,9 @@ If you are running unit tests against things other than modules, such as module
ansible-test units --docker -v test/units/module_utils/basic/test_imports.py
For advanced usage see the online help::
For advanced usage see the online help:
.. code:: shell
ansible-test units --help
@ -104,35 +106,39 @@ Ansible drives unit tests through `pytest <https://docs.pytest.org/en/latest/>`_
means that tests can either be written as simple functions, which are included in any file
named like ``test_<something>.py``, or as classes.
Here is an example of a function::
Here is an example of a function:
.. code:: python
#this function will be called simply because it is called test_*()
def test_add()
def test_add():
a = 10
b = 23
c = 33
assert a + b = c
assert a + b == c
Here is an example of a class:
Here is an example of a class::
.. code:: python
import unittest
class AddTester(unittest.TestCase)
class AddTester(unittest.TestCase):
def SetUp()
def setUp(self):
self.a = 10
self.b = 23
# this function will be called because its name starts with test_
def test_add()
def test_add(self):
c = 33
assert self.a + self.b = c
assert self.a + self.b == c
# this function will be called because its name starts with test_
def test_subtract()
def test_subtract(self):
c = -13
assert self.a - self.b = c
assert self.a - self.b == c
Both methods work fine in most circumstances; the function-based interface is simpler and
quicker and so that's probably where you should start when you are just trying to add a

@ -168,7 +168,9 @@ Ensuring failure cases are visible with mock objects
Functions like :meth:`module.fail_json` are normally expected to terminate execution. When you
run with a mock module object this doesn't happen since the mock always returns another mock
from a function call. You can set up the mock to raise an exception as shown above, or you can
assert that these functions have not been called in each test. For example::
assert that these functions have not been called in each test. For example:
.. code-block:: python
module = MagicMock()
function_to_test(module, argument)
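A self-contained sketch of that pattern, assuming a hypothetical ``function_to_test`` that should succeed without reporting a failure:

.. code-block:: python

    from unittest.mock import MagicMock


    def function_to_test(module, argument):
        # Hypothetical code under test: it only fails on a missing argument.
        if argument is None:
            module.fail_json(msg='argument is required')
        return argument


    def test_function_does_not_fail():
        module = MagicMock()
        function_to_test(module, 'value')
        # The mock records every call, so we can assert fail_json was never used.
        module.fail_json.assert_not_called()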
@ -185,7 +187,9 @@ The setup of an actual module is quite complex (see `Passing Arguments`_ below)
isn't needed for most functions which use a module. Instead you can use a mock object as
the module and create any module attributes needed by the function you are testing. If
you do this, beware that the module exit functions need special handling as mentioned
above, either by throwing an exception or ensuring that they haven't been called. For example::
above, either by throwing an exception or ensuring that they haven't been called. For example:
.. code-block:: python
class AnsibleExitJson(Exception):
"""Exception class to be raised by module.exit_json and caught by the test case"""
@ -218,7 +222,9 @@ present in the message. This means that we can check that we use the correct
parameters and nothing else.
*Example: in rds_instance unit tests a simple instance state is defined*::
*Example: in rds_instance unit tests a simple instance state is defined*:
.. code-block:: python
def simple_instance_list(status, pending):
return {u'DBInstances': [{u'DBInstanceArn': 'arn:aws:rds:us-east-1:1234567890:db:fakedb',
@ -226,7 +232,9 @@ parameters and nothing else.
u'PendingModifiedValues': pending,
u'DBInstanceIdentifier': 'fakedb'}]}
This is then used to create a list of states::
This is then used to create a list of states:
.. code-block:: python
rds_client_double = MagicMock()
rds_client_double.describe_db_instances.side_effect = [
@ -243,7 +251,9 @@ This is then used to create a list of states::
These states are then used as returns from a mock object to ensure that the ``await`` function
waits through all of the states that would mean the RDS instance has not yet completed
configuration::
configuration:
.. code-block:: python
rds_i.await_resource(rds_client_double, "some-instance", "available", mod_mock,
await_pending=1)
@ -292,7 +302,9 @@ To pass arguments to a module correctly, use the ``set_module_args`` method whic
as its parameter. Module creation and argument processing is
handled through the :class:`AnsibleModule` object in the basic section of the utilities. Normally
this accepts input on ``STDIN``, which is not convenient for unit testing. When the special
variable is set it will be treated as if the input came on ``STDIN`` to the module. Simply call that function before setting up your module::
variable is set it will be treated as if the input came on ``STDIN`` to the module. Simply call that function before setting up your module:
.. code-block:: python
import json
from units.modules.utils import set_module_args
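A short sketch of the calling pattern; the ``name`` and ``state`` options are made up, and the argument spec mirrors them only for illustration:

.. code-block:: python

    from ansible.module_utils import basic
    from units.modules.utils import set_module_args


    def test_module_sees_arguments():
        # After this call, AnsibleModule() reads these options as if Ansible
        # had passed them on STDIN.
        set_module_args({
            'name': 'example',
            'state': 'present',
        })
        module = basic.AnsibleModule(argument_spec=dict(
            name=dict(type='str', required=True),
            state=dict(type='str', choices=['present', 'absent']),
        ))
        assert module.params['name'] == 'example'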
@ -314,7 +326,9 @@ Handling exit correctly
The :meth:`module.exit_json` function won't work properly in a testing environment since it
writes error information to ``STDOUT`` upon exit, where it
is difficult to examine. This can be mitigated by replacing it (and :meth:`module.fail_json`) with
a function that raises an exception::
a function that raises an exception:
.. code-block:: python
def exit_json(*args, **kwargs):
if 'changed' not in kwargs:
@ -322,7 +336,9 @@ a function that raises an exception::
raise AnsibleExitJson(kwargs)
Now you can ensure that the first function called is the one you expected simply by
testing for the correct exception::
testing for the correct exception:
.. code-block:: python
def test_returned_value(self):
set_module_args({
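Pulling those pieces together, here is a compact, self-contained sketch of the idea; ``ensure_present`` is a hypothetical function under test and the assertion is illustrative:

.. code-block:: python

    from unittest.mock import MagicMock

    import pytest


    class AnsibleExitJson(Exception):
        """Raised in place of a real exit so the test can inspect the result."""


    def exit_json(*args, **kwargs):
        # Mirror AnsibleModule.exit_json: a 'changed' key is always present.
        if 'changed' not in kwargs:
            kwargs['changed'] = False
        raise AnsibleExitJson(kwargs)


    def ensure_present(module):
        # Hypothetical code under test.
        module.exit_json(changed=True, msg='created')


    def test_ensure_present_reports_change():
        module = MagicMock()
        module.exit_json.side_effect = exit_json
        with pytest.raises(AnsibleExitJson) as excinfo:
            ensure_present(module)
        assert excinfo.value.args[0]['changed'] is True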
@ -342,7 +358,9 @@ Running the main function
-------------------------
If you do want to run the actual main function of a module, you must import the module, set
the arguments as above, set up the appropriate exit exception and then run the module::
the arguments as above, set up the appropriate exit exception and then run the module:
.. code-block:: python
# This test is based around pytest's features for individual test functions
import pytest
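A hedged, self-contained sketch of that flow; ``my_module``, its import path, and its single ``name`` option are assumptions made up for the example:

.. code-block:: python

    import pytest

    from ansible.module_utils import basic
    from units.modules.utils import set_module_args

    # Hypothetical module under test; the import path is an assumption.
    from ansible.modules import my_module


    class AnsibleExitJson(Exception):
        """Raised instead of exiting so the test regains control."""


    def exit_json(*args, **kwargs):
        raise AnsibleExitJson(kwargs)


    def test_main_exits_cleanly(monkeypatch):
        # Replace the real exit with the exception-raising stand-in.
        monkeypatch.setattr(basic.AnsibleModule, 'exit_json', exit_json)
        set_module_args({'name': 'example'})
        with pytest.raises(AnsibleExitJson):
            my_module.main()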
@ -364,7 +382,9 @@ Handling calls to external executables
Modules must use :meth:`AnsibleModule.run_command` in order to execute an external command. This
method needs to be mocked:
Here is a simple mock of :meth:`AnsibleModule.run_command` (taken from :file:`test/units/modules/packaging/os/test_rhn_register.py`)::
Here is a simple mock of :meth:`AnsibleModule.run_command` (taken from :file:`test/units/modules/packaging/os/test_rhn_register.py`):
.. code-block:: python
with patch.object(basic.AnsibleModule, 'run_command') as run_command:
run_command.return_value = 0, '', '' # successful execution, no output
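If the code under test runs more than one command, or needs to see a failure, ``side_effect`` can supply a sequence of ``(rc, stdout, stderr)`` tuples; a small sketch with illustrative values:

.. code-block:: python

    from unittest.mock import patch

    from ansible.module_utils import basic

    with patch.object(basic.AnsibleModule, 'run_command') as run_command:
        # Each call to run_command consumes the next (rc, stdout, stderr) tuple.
        run_command.side_effect = [
            (0, 'first call succeeded', ''),
            (1, '', 'second call failed'),
        ]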
@ -381,7 +401,9 @@ A Complete Example
------------------
The following example is a complete skeleton that reuses the mocks explained above and adds a new
mock for :meth:`AnsibleModule.get_bin_path`::
mock for :meth:`AnsibleModule.get_bin_path`:
.. code-block:: python
import json
@ -470,7 +492,9 @@ Restructuring modules to enable testing module set up and other processes
Often modules have a ``main()`` function which sets up the module and then performs other
actions. This can make it difficult to check argument processing. This can be made easier by
moving module configuration and initialization into a separate function. For example::
moving module configuration and initialization into a separate function. For example:
.. code-block:: python
argument_spec = dict(
# module function variables
@ -498,7 +522,9 @@ moving module configuration and initialization into a separate function. For exa
return_dict = run_task(module, conn)
module.exit_json(**return_dict)
This now makes it possible to run tests against the module initiation function::
This now makes it possible to run tests against the module initiation function:
.. code-block:: python
def test_rds_module_setup_fails_if_db_instance_identifier_parameter_missing():
# db_instance_identifier parameter is missing
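A compressed, hypothetical version of that idea: module creation lives in its own function, so a test can drive it directly with incomplete arguments (all names below are made up):

.. code-block:: python

    from unittest.mock import patch

    import pytest

    from ansible.module_utils import basic
    from units.modules.utils import set_module_args


    def setup_module_object():
        # Module creation isolated from the rest of main() so it can be tested directly.
        return basic.AnsibleModule(
            argument_spec=dict(
                db_instance_identifier=dict(type='str', required=True),
            ),
        )


    def test_setup_fails_if_required_parameter_missing():
        set_module_args({})
        with patch.object(basic.AnsibleModule, 'fail_json', side_effect=SystemExit) as fail_json:
            with pytest.raises(SystemExit):
                setup_module_object()
        fail_json.assert_called_once()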
