diff --git a/docs/docsite/rst/dev_guide/platforms/vmware_guidelines.rst b/docs/docsite/rst/dev_guide/platforms/vmware_guidelines.rst
index 5f41ea9d6b7..2865321e5c4 100644
--- a/docs/docsite/rst/dev_guide/platforms/vmware_guidelines.rst
+++ b/docs/docsite/rst/dev_guide/platforms/vmware_guidelines.rst
@@ -46,10 +46,10 @@ Requirements
 - `pyvmomi `_
 - `requests `_
 
-If you want to deploy your test environment in a hypervisor, both `VMware or Libvirt `_ works well.
+If you want to deploy your test environment in a hypervisor, both `VMware or Libvirt `_ work well.
 
 NFS server configuration
-~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^
 
 Your NFS server must expose the following directory structure:
@@ -182,7 +182,7 @@ Functional tests
 ----------------
 
 Writing new tests
-~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^
 
 If you are writing a new collection of integration tests, there are a few VMware-specific things to note beyond the standard Ansible :ref:`integration testing` process.
@@ -200,19 +200,19 @@ The resources defined there are automatically created by importing that role at
 This will give you a ready to use cluster, datacenter, datastores, folder, switch, dvswitch, ESXi hosts, and VMs.
 
 No need to create too much resources
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Most of the time, it's not necessary to use ``with_items`` to create multiple resources. By avoiding it, you speed up the test execution and you simplify the clean up afterwards.
 
 VM names should be predictable
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 If you need to create a new VM during your test, you can use ``test_vm1``, ``test_vm2`` or ``test_vm3``. This way it will be automatically clean up for you.
 
 Avoid the common boiler plate code in your test playbook
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 From Ansible 2.10, the test suite uses `modules_defaults`.
 This module allow us to preinitialize the following default keys of the VMware modules:
diff --git a/docs/docsite/rst/dev_guide/platforms/vmware_rest_guidelines.rst b/docs/docsite/rst/dev_guide/platforms/vmware_rest_guidelines.rst
index af7d672958b..db35559a229 100644
--- a/docs/docsite/rst/dev_guide/platforms/vmware_rest_guidelines.rst
+++ b/docs/docsite/rst/dev_guide/platforms/vmware_rest_guidelines.rst
@@ -58,7 +58,7 @@ All the modules are covered by a functional test. The tests are located in the :
 To run the tests, you will need a vcenter instance and an ESXi.
 
 black code formatter
-~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^
 
 We follow the coding style of `Black `. You can run the code formatter with the following command.
@@ -69,7 +69,7 @@ You can run the code formatter with the following command.
 
     tox -e black
 
 sanity tests
-~~~~~~~~~~~~
+^^^^^^^^^^^^
 
 Here we use Python 3.8, the minimal version is 3.6.
@@ -80,7 +80,7 @@ Here we use Python 3.8, the minimal version is 3.6.
 
 integration tests
-~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^
 
 These tests should be run against your test environment.
diff --git a/docs/docsite/rst/dev_guide/testing.rst b/docs/docsite/rst/dev_guide/testing.rst
index 6febfb5ca6b..0170f114ebb 100644
--- a/docs/docsite/rst/dev_guide/testing.rst
+++ b/docs/docsite/rst/dev_guide/testing.rst
@@ -175,7 +175,7 @@ Some ideas of what to test are:
 * Test on different operating systems, or against different library versions
 
 Run sanity tests
-````````````````
+^^^^^^^^^^^^^^^^
 
 .. code:: shell
@@ -184,7 +184,7 @@ Run sanity tests
 
 More information: :ref:`testing_sanity`
 
 Run unit tests
-``````````````
+^^^^^^^^^^^^^^
 
 .. code:: shell
@@ -193,7 +193,7 @@ Run unit tests
 
 More information: :ref:`testing_units`
 
 Run integration tests
-`````````````````````
+^^^^^^^^^^^^^^^^^^^^^
 
 .. code:: shell
@@ -220,7 +220,7 @@ If the PR does not resolve the issue, or if you see any failures from the unit/i
 | \```
 
 Code Coverage Online
-````````````````````
+^^^^^^^^^^^^^^^^^^^^
 
 `The online code coverage reports `_ are a good way to identify areas for testing improvement in Ansible. By following red colors you can
diff --git a/docs/docsite/rst/dev_guide/testing/sanity/ignores.rst b/docs/docsite/rst/dev_guide/testing/sanity/ignores.rst
index 9d7a94c018b..ea1d665b335 100644
--- a/docs/docsite/rst/dev_guide/testing/sanity/ignores.rst
+++ b/docs/docsite/rst/dev_guide/testing/sanity/ignores.rst
@@ -38,7 +38,7 @@ Ignore File Location
 
 The location of the ignore file depends on the type of content being tested.
 
 Ansible Collections
-~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^
 
 Since sanity tests change between Ansible releases, a separate ignore file is needed for each Ansible major release.
@@ -47,7 +47,7 @@ The filename is ``tests/sanity/ignore-X.Y.txt`` where ``X.Y`` is the Ansible rel
 
 Maintaining a separate file for each Ansible release allows a collection to pass tests for multiple versions of Ansible.
 
 Ansible
-~~~~~~~
+^^^^^^^
 
 When testing Ansible, all ignores are placed in the ``test/sanity/ignore.txt`` file.
diff --git a/docs/docsite/rst/dev_guide/testing_integration.rst b/docs/docsite/rst/dev_guide/testing_integration.rst
index d976a63b327..5402d644573 100644
--- a/docs/docsite/rst/dev_guide/testing_integration.rst
+++ b/docs/docsite/rst/dev_guide/testing_integration.rst
@@ -153,7 +153,7 @@ Container Images
 ----------------
 
 Python 2
-````````
+^^^^^^^^
 
 Most container images are for testing with Python 2:
@@ -162,7 +162,7 @@ Most container images are for testing with Python 2:
 - opensuse15py2
 
 Python 3
-````````
+^^^^^^^^
 
 To test with Python 3 use the following images:
diff --git a/docs/docsite/rst/dev_guide/testing_units.rst b/docs/docsite/rst/dev_guide/testing_units.rst
index 7573da6f5ba..dda2d0349df 100644
--- a/docs/docsite/rst/dev_guide/testing_units.rst
+++ b/docs/docsite/rst/dev_guide/testing_units.rst
@@ -98,7 +98,7 @@ Extending unit tests
 
 Structuring Unit Tests
-``````````````````````
+----------------------
 
 Ansible drives unit tests through `pytest `_. This means that tests can either be written a simple functions which are included in any file
@@ -151,7 +151,7 @@ directory, which is then included directly.
 
 Module test case common code
-````````````````````````````
+----------------------------
 
 Keep common code as specific as possible within the `test/units/` directory structure. Don't import common unit test code from directories outside the current or parent directories.
@@ -161,7 +161,7 @@ files that aren't themselves tests.
 
 Fixtures files
-``````````````
+--------------
 
 To mock out fetching results from devices, or provide other complex data structures that come from external libraries, you can use ``fixtures`` to read in pre-generated data.
@@ -174,7 +174,7 @@ If you are simulating APIs you may find that Python placebo is useful. See
 
 Code Coverage For New or Updated Unit Tests
-```````````````````````````````````````````
+-------------------------------------------
 
 New code will be missing from the codecov.io coverage reports (see :ref:`developing_testing`), so local reporting is needed. Most ``ansible-test`` commands allow you to collect code coverage; this is particularly useful when to indicate where to extend testing.
diff --git a/docs/docsite/rst/dev_guide/testing_units_modules.rst b/docs/docsite/rst/dev_guide/testing_units_modules.rst
index 917a3465434..9dd2ee94019 100644
--- a/docs/docsite/rst/dev_guide/testing_units_modules.rst
+++ b/docs/docsite/rst/dev_guide/testing_units_modules.rst
@@ -207,7 +207,7 @@ integration testing section, which run against the actual API. There are several
 where the unit tests are likely to work better.
 
 Defining a module against an API specification
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 This case is especially important for modules interacting with web services, which provide an API that Ansible uses but which are beyond the control of the user.
@@ -254,7 +254,7 @@ potentially unusual that it would be impossible to reliably trigger through the
 integration tests but which happen unpredictably in reality.
 
 Defining a module to work against multiple API versions
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 This case is especially important for modules interacting with many different versions of software; for example, package installation modules that might be expected to work with