Docs: Fix section header notation in Developer Guide (#75721)

Aine Riordan 3 years ago committed by GitHub
parent 8dc5516c83
commit 17122edfc5

@@ -46,10 +46,10 @@ Requirements
- `pyvmomi <https://github.com/vmware/pyvmomi/tree/master/pyVmomi>`_
- `requests <https://2.python-requests.org/en/master/>`_
-If you want to deploy your test environment in a hypervisor, both `VMware or Libvirt <https://github.com/goneri/vmware-on-libvirt>`_ works well.
+If you want to deploy your test environment in a hypervisor, both `VMware or Libvirt <https://github.com/goneri/vmware-on-libvirt>`_ work well.
NFS server configuration
-~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^
Your NFS server must expose the following directory structure:
@@ -182,7 +182,7 @@ Functional tests
----------------
Writing new tests
-~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^
If you are writing a new collection of integration tests, there are a few VMware-specific things to note beyond
the standard Ansible :ref:`integration testing<testing_integration>` process.
@@ -200,19 +200,19 @@ The resources defined there are automatically created by importing that role at
This will give you a ready-to-use cluster, datacenter, datastores, folder, switch, dvswitch, ESXi hosts, and VMs.
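The import itself can be sketched like this (the task syntax is an assumption for illustration; only the role name comes from the guide):

```yaml
# Importing the shared setup role at the top of a test playbook
# provisions the cluster, datacenter, datastores, and test VMs.
- import_role:
    name: prepare_vmware_tests
```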
No need to create too many resources
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Most of the time, it's not necessary to use ``with_items`` to create multiple resources. Avoiding it
speeds up the test execution and simplifies the cleanup afterwards.
VM names should be predictable
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you need to create a new VM during your test, you can use ``test_vm1``, ``test_vm2`` or ``test_vm3``. This
way it will be automatically cleaned up for you.
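For example (the module name and parameters are assumptions for illustration, not taken from the guide), a single task creating one predictably named VM is usually enough, and avoids both a ``with_items`` loop and any manual cleanup:

```yaml
# Sketch only: one VM named test_vm1 is created without a loop,
# and the predictable name means it is cleaned up automatically.
- name: Create one test VM
  community.vmware.vmware_guest:
    datacenter: "{{ dc1 }}"   # variable name assumed for illustration
    name: test_vm1
    guest_id: debian8_64Guest
    state: present
```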
Avoid the common boilerplate code in your test playbook
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
From Ansible 2.10, the test suite uses ``module_defaults``. This keyword
allows us to preinitialize the following default keys of the VMware modules:
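A minimal sketch of such a block (the variable names and the ``group/vmware`` action group are assumptions based on common usage, not quoted from the guide):

```yaml
- hosts: localhost
  module_defaults:
    # Preinitialize the connection keys shared by the VMware modules,
    # so individual tasks do not have to repeat them.
    group/vmware:
      hostname: "{{ vcenter_hostname }}"
      username: "{{ vcenter_username }}"
      password: "{{ vcenter_password }}"
      validate_certs: false
```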

@@ -58,7 +58,7 @@ All the modules are covered by a functional test. The tests are located in the :
To run the tests, you will need a vCenter instance and an ESXi host.
black code formatter
-~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^
We follow the coding style of `Black <https://github.com/psf/black>`_.
You can run the code formatter with the following command.
@@ -69,7 +69,7 @@ You can run the code formatter with the following command.
tox -e black
sanity tests
-~~~~~~~~~~~~
+^^^^^^^^^^^^
Here we use Python 3.8; the minimal supported version is 3.6.
@@ -80,7 +80,7 @@ Here we use Python 3.8, the minimal version is 3.6.
integration tests
-~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^
These tests should be run against your test environment.

@@ -175,7 +175,7 @@ Some ideas of what to test are:
* Test on different operating systems, or against different library versions
Run sanity tests
-````````````````
+^^^^^^^^^^^^^^^^
.. code:: shell
@@ -184,7 +184,7 @@ Run sanity tests
More information: :ref:`testing_sanity`
Run unit tests
-``````````````
+^^^^^^^^^^^^^^
.. code:: shell
@@ -193,7 +193,7 @@ Run unit tests
More information: :ref:`testing_units`
Run integration tests
-`````````````````````
+^^^^^^^^^^^^^^^^^^^^^
.. code:: shell
@@ -220,7 +220,7 @@ If the PR does not resolve the issue, or if you see any failures from the unit/i
| \```
Code Coverage Online
-````````````````````
+^^^^^^^^^^^^^^^^^^^^
`The online code coverage reports <https://codecov.io/gh/ansible/ansible>`_ are a good way
to identify areas for testing improvement in Ansible. By following red colors you can

@@ -38,7 +38,7 @@ Ignore File Location
The location of the ignore file depends on the type of content being tested.
Ansible Collections
-~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^
Since sanity tests change between Ansible releases, a separate ignore file is needed for each Ansible major release.
@@ -47,7 +47,7 @@ The filename is ``tests/sanity/ignore-X.Y.txt`` where ``X.Y`` is the Ansible rel
Maintaining a separate file for each Ansible release allows a collection to pass tests for multiple versions of Ansible.
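For illustration (the module path and check name here are hypothetical), each line of an ignore file, for example ``tests/sanity/ignore-2.10.txt``, names a file and the sanity check to skip for it:

```
plugins/modules/my_module.py validate-modules:parameter-type-not-in-doc
```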
Ansible
-~~~~~~~
+^^^^^^^
When testing Ansible, all ignores are placed in the ``test/sanity/ignore.txt`` file.

@@ -153,7 +153,7 @@ Container Images
----------------
Python 2
-````````
+^^^^^^^^
Most container images are for testing with Python 2:
@@ -162,7 +162,7 @@ Most container images are for testing with Python 2:
- opensuse15py2
Python 3
-````````
+^^^^^^^^
To test with Python 3 use the following images:

@@ -98,7 +98,7 @@ Extending unit tests
Structuring Unit Tests
-``````````````````````
+----------------------
Ansible drives unit tests through `pytest <https://docs.pytest.org/en/latest/>`_. This
means that tests can either be written as simple functions which are included in any file
@@ -151,7 +151,7 @@ directory, which is then included directly.
Module test case common code
-````````````````````````````
+----------------------------
Keep common code as specific as possible within the ``test/units/`` directory structure.
Don't import common unit test code from directories outside the current or parent directories.
@@ -161,7 +161,7 @@ files that aren't themselves tests.
Fixtures files
-``````````````
+--------------
To mock out fetching results from devices, or provide other complex data structures that
come from external libraries, you can use ``fixtures`` to read in pre-generated data.
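A minimal sketch of the idea in plain Python (the helper name and fixture layout are assumptions; real Ansible test suites have their own fixture loaders):

```python
import json
import os
import tempfile

def load_fixture(fixture_dir, name):
    """Read a pre-generated fixture file and parse it as JSON."""
    with open(os.path.join(fixture_dir, name)) as f:
        return json.load(f)

# Simulate a fixtures/ directory holding a canned device response,
# so the test never talks to a real device.
fixture_dir = tempfile.mkdtemp()
with open(os.path.join(fixture_dir, "show_version.json"), "w") as f:
    json.dump({"hostname": "sw01", "version": "1.2.3"}, f)

response = load_fixture(fixture_dir, "show_version.json")
```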
@@ -174,7 +174,7 @@ If you are simulating APIs you may find that Python placebo is useful. See
Code Coverage For New or Updated Unit Tests
-```````````````````````````````````````````
+-------------------------------------------
New code will be missing from the codecov.io coverage reports (see :ref:`developing_testing`), so
local reporting is needed. Most ``ansible-test`` commands allow you to collect code
coverage; this is particularly useful for indicating where to extend testing.
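As a sketch (exact subcommand behavior may vary between ``ansible-test`` versions):

```shell
# Run the unit tests while collecting coverage data,
# then render a local HTML report.
ansible-test units --coverage
ansible-test coverage html
```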

@@ -207,7 +207,7 @@ integration testing section, which run against the actual API. There are several
where the unit tests are likely to work better.
Defining a module against an API specification
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This case is especially important for modules interacting with web services, which provide
an API that Ansible uses but which are beyond the control of the user.
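A minimal sketch of such a unit test (the helper, client, and endpoint are assumptions for illustration, not a real module): the web-service client is replaced by a mock that returns a canned payload matching the API specification, so no live service is needed.

```python
from unittest import mock

# Hypothetical module code: a helper that asks a web-service
# client for datacenter details.
def get_datacenter_name(client):
    return client.get("/rest/vcenter/datacenter")["name"]

# Unit test: the mock stands in for the real API, returning a
# response shaped like the one the specification promises.
client = mock.Mock()
client.get.return_value = {"name": "DC0"}
result = get_datacenter_name(client)
```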
@@ -254,7 +254,7 @@ potentially unusual that it would be impossible to reliably trigger through the
integration tests but which happen unpredictably in reality.
Defining a module to work against multiple API versions
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This case is especially important for modules interacting with many different versions of
software; for example, package installation modules that might be expected to work with
