Compare commits

...

32 Commits

Author SHA1 Message Date
sivel / Matt Martz f83bccc457
New release v2.20.0rc2 (#86034) 1 month ago
Jordan Borean fb61d54216
Fix psrp - ReadTimeout exceptions now mark host as unreachable (#85974) (#85993)
* psrp - ReadTimeout exceptions now mark host as unreachable

* add try to _exec_psrp_script

* fix indent E111

* update raise format: switch to raise Exception from e

(cherry picked from commit 9fcf1f7c58)

Co-authored-by: Michał Gąsior <rogacz@gmail.com>
2 months ago
Martin Krizek 061c504e98
[stable-2.20] Avoid the ssh-agent exiting before tests end (#85979) (#86008)
There were a couple of occurrences where the hard 30-second limit on
running ssh-agent was not enough for the test to run and the ssh-agent
was killed, resulting in the test failing with "Connection refused". This
change just lets the agent run in the background and kills it
manually after the tests finish.
(cherry picked from commit 05d5b0f168)
2 months ago
sivel / Matt Martz 48789c4efc
[stable-2.20] Don't deprecate six yet (#86020) (#86021)
* [stable-2.20] Don't deprecate six yet (#86020)
(cherry picked from commit 1a3e63c)
2 months ago
sivel / Matt Martz f14923276c
Update Ansible release version to v2.20.0rc1.post0. (#85991) 2 months ago
sivel / Matt Martz 5431f258b8
New release v2.20.0rc1 (#85989) 2 months ago
sivel / Matt Martz 163a6ec526
fix urls in additional ansible-doc test (#85988)
(cherry picked from commit c02f59ca3a)
2 months ago
Matt Clay 110993bbfd
ansible-test - Update base/default/distro containers (#85985)
(cherry picked from commit 83c79240ec)
2 months ago
Matt Clay bdba82ff07
ansible-test - Upgrade coverage to 7.10.7 (#85981)
(cherry picked from commit 7c2311d547)
2 months ago
Matt Clay 405e2bf2bf
ansible-test - Update pinned pip to 25.2 (#85982)
(cherry picked from commit de7dd5bbb2)
2 months ago
Matt Clay ed60501603
ansible-test - Update sanity test requirements (#85980)
(cherry picked from commit 9ee667030f)
2 months ago
Matt Clay d0552b56ef
ansible-test - Update base/default containers (#85967)
(cherry picked from commit 82b64d4b69)
2 months ago
Matt Clay ab4d37a803
Use bcrypt < 5 for unit tests (#85969)
(cherry picked from commit 06456c68ec)
2 months ago
Sviatoslav Sydorenko 1b6bcc53b1
Mention pkg name in `package-data` sanity output
The logs were displaying a series of numbers in parens like `(66.1.0)`
at the end of each error line. It's unintuitive what that means; I had
to look into the source code to confirm my suspicion that it was the
version of `setuptools`. This patch spells it out.

(cherry picked from commit 53afc6f203)
2 months ago
Sviatoslav Sydorenko 3f7cfd961c
Use strict_optional @ ansible.galaxy.dependency_resolution
This patch drops the unnecessary default for
`CollectionDependencyProvider`'s `concrete_artifacts_manager` argument,
as it is always passed in every place across the code base where the
provider is constructed.

It was also causing MyPy violations on calls to
`_ComputedReqKindsMixin.from_requirement_dict()` in the "strict
optional" mode which is now enforced for $sbj, while remaining
disabled globally.

It is a #85545 follow-up.

(cherry picked from commit 0cd36ce6d0)
2 months ago
Sviatoslav Sydorenko bef8eece4b
Type-annotate ansible.galaxy.dependency_resolution
This patch is a combination of `pyrefly autotype` and manual
post-processing. Parts of it migrate pre-existing comment-based
annotations, fixing incorrect ones where applicable.

The change also configures MyPy to run checks against actual
`resolvelib` annotations and includes a small tweak of
`ansible.galaxy.collection._resolve_depenency_map` to make it
compatible with those.

Co-Authored-By: Jordan Borean <jborean93@gmail.com>
Co-Authored-By: Matt Clay <matt@mystile.com>
Co-Authored-By: Sloane Hertel <19572925+s-hertel@users.noreply.github.com>
(cherry picked from commit c9131aa847)
2 months ago
David Shrewsbury cc6a93f23e
Fix flakey get_url test (#85953)
(cherry picked from commit feda8fc564)
2 months ago
sivel / Matt Martz 59dc766d7d
[stable-2.20] Fix signal propagation (#85907) (#85983)
(cherry picked from commit 5a9afe4)
2 months ago
sivel / Matt Martz a4776f850c
Update Ansible release version to v2.20.0b2.post0. (#85954) 2 months ago
sivel / Matt Martz 4f1fe10921
New release v2.20.0b2 (#85950) 2 months ago
sivel / Matt Martz ff29cd4ff0
Update DataLoader.get_basedir to be an abspath (#85940)
(cherry picked from commit 6673a14a9e)
2 months ago
s-hamann 06f272129c
fetch - return file in result when changed is true (#85729)
Set the (source) file attribute in the return value if the file changed
(e.g. on initial fetch). The attribute is already set in all other
cases.

(cherry picked from commit 0c7dcb65cf)
2 months ago
Martin Krizek 0f079fd23f
Deprecate `ansible.module_utils.six` (#85934)
* Deprecate `ansible.module_utils.six`

Fixes #85920

(cherry picked from commit 686c3658ae)
2 months ago
Sloane Hertel cbeb1da98b
Remove support for resolvelib < 0.8.0 (#85936)
* Remove support for resolvelib < 0.8.0

Remove code handling differences between resolvelib 0.5.3 and 0.8.0

Drop some versions from the test to reduce the time it takes to run

Co-authored-by: Sviatoslav Sydorenko <wk@sydorenko.org.ua>

* Remove type annotation


Co-authored-by: Sviatoslav Sydorenko <wk@sydorenko.org.ua>
(cherry picked from commit cb2ecda514)
2 months ago
Martin Krizek 7db5959813
Don't special case implicit meta tasks when filtering on tags (#85805)
* Don't special case implicit meta tasks when filtering on tags

Fixes #85475

(cherry picked from commit 313c6f6b4d)
2 months ago
Abhijeet Kasurde 40b11f86fb
known_hosts: return rc and stderr in fail_json (#85871)
* When ssh-keygen fails, return rc and stderr in fail_json
  in order to help debugging.

Fixes: #85850

Signed-off-by: Abhijeet Kasurde <Akasurde@redhat.com>
(cherry picked from commit 6bee84318d)
2 months ago
Martin Krizek 580eb781dc
import_tasks processing closer to include_tasks (#85877)
Fixes #69882
Closes #83853
Fixes #85855
Fixes #85856

(cherry picked from commit c3f87b31d1)
2 months ago
Abhijeet Kasurde f23224d7b4
falsy: Update doc (#85913)
Signed-off-by: Abhijeet Kasurde <Akasurde@redhat.com>
(cherry picked from commit c5e6227bdb)
2 months ago
Luca Steinke 418746dcfc
fix description of truthy test (#85911)
There is one "not" too many here.

Maybe further examples can be found.

(cherry picked from commit eafa139f77)
2 months ago
Felix Fontein 48ef23fe4f
Make sure ansible-doc doesn't crash when scanning collections whose path contains ansible_collections twice (#85361)
Ref: https://github.com/ansible/ansible/issues/84909#issuecomment-2767335761

Co-authored-by: s-hertel <19572925+s-hertel@users.noreply.github.com>
Co-authored-by: Brian Coca <bcoca@users.noreply.github.com>
(cherry picked from commit c6d8d206af)
2 months ago
Matt Davis 00ee6040b6
Update Ansible release version to v2.20.0b1.post0. (#85902) 2 months ago
Matt Davis 21de43ab65
New release v2.20.0b1 (#85901) 2 months ago

@ -0,0 +1,207 @@
======================================================
ansible-core 2.20 "Good Times Bad Times" Release Notes
======================================================
.. contents:: Topics
v2.20.0rc2
==========
Release Summary
---------------
| Release Date: 2025-10-20
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
Bugfixes
--------
- psrp - ReadTimeout exceptions now mark host as unreachable instead of fatal (https://github.com/ansible/ansible/issues/85966)
v2.20.0rc1
==========
Release Summary
---------------
| Release Date: 2025-10-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
Minor Changes
-------------
- ansible-test - Default to Python 3.14 in the ``base`` and ``default`` test containers.
- ansible-test - Filter out pylint messages for invalid filenames and display a notice when doing so.
- ansible-test - Update astroid imports in custom pylint checkers.
- ansible-test - Update pinned ``pip`` version to 25.2.
- ansible-test - Update pinned sanity test requirements, including upgrading to pylint 4.0.0.
Bugfixes
--------
- SIGINT/SIGTERM Handling - Make SIGINT/SIGTERM handling more robust by splitting concerns between forks and the parent.
v2.20.0b2
=========
Release Summary
---------------
| Release Date: 2025-10-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
Minor Changes
-------------
- DataLoader - Update ``DataLoader.get_basedir`` to be an abspath
- known_hosts - return rc and stderr when ssh-keygen command fails for further debugging (https://github.com/ansible/ansible/issues/85850).
Removed Features (previously deprecated)
----------------------------------------
- ansible-galaxy - remove support for resolvelib >= 0.5.3, < 0.8.0.
Bugfixes
--------
- Fix issue where play tags prevented executing notified handlers (https://github.com/ansible/ansible/issues/85475)
- Fix issues with keywords being incorrectly validated on ``import_tasks`` (https://github.com/ansible/ansible/issues/85855, https://github.com/ansible/ansible/issues/85856)
- Fix traceback when trying to import non-existing file via nested ``import_tasks`` (https://github.com/ansible/ansible/issues/69882)
- ansible-doc - prevent crash when scanning collections in paths that have more than one ``ansible_collections`` in it (https://github.com/ansible/ansible/issues/84909, https://github.com/ansible/ansible/pull/85361).
- fetch - also return ``file`` in the result when changed is ``True`` (https://github.com/ansible/ansible/pull/85729).
v2.20.0b1
=========
Release Summary
---------------
| Release Date: 2025-09-23
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
Major Changes
-------------
- ansible - Add support for Python 3.14.
- ansible - Drop support for Python 3.11 on the controller.
- ansible - Drop support for Python 3.8 on targets.
Minor Changes
-------------
- Add tech preview play argument spec validation, which can be enabled by setting the play keyword ``validate_argspec`` to ``True`` or the name of an argument spec. When ``validate_argspec`` is set to ``True``, a play ``name`` is required and used as the argument spec name. When enabled, the argument spec is loaded from a file matching the pattern <playbook_name>.meta.yml. At minimum, this file should contain ``{"argument_specs": {"name": {"options": {}}}}``, where "name" is the name of the play or configured argument spec.
- Added Univention Corporate Server as a part of Debian OS distribution family (https://github.com/ansible/ansible/issues/85490).
- AnsibleModule - Add temporary internal monkeypatch-able hook to alter module result serialization by splitting serialization from ``_return_formatted`` into ``_record_module_result``.
- Python type hints applied to ``to_text`` and ``to_bytes`` functions for better type hint interactions with code utilizing these functions.
- ansible now warns if you use reserved tags that were only meant for selection and not for use in play.
- ansible-doc - Return a more verbose error message when the ``description`` field is missing.
- ansible-doc - show ``notes``, ``seealso``, and top-level ``version_added`` for role entrypoints (https://github.com/ansible/ansible/pull/81796).
- ansible-doc adds support for RETURN documentation to support doc fragment plugins
- ansible-test - Implement new authentication methods for accessing the Ansible Core CI service.
- ansible-test - Improve formatting of generated coverage config file.
- ansible-test - Removed support for automatic provisioning of obsolete instances for network-integration tests.
- ansible-test - Replace FreeBSD 14.2 with 14.3.
- ansible-test - Replace RHEL 9.5 with 9.6.
- ansible-test - Update Ubuntu containers.
- ansible-test - Update base/default containers to include Python 3.14.0.
- ansible-test - Update pinned sanity test requirements.
- ansible-test - Update test containers.
- ansible-test - Upgrade Alpine 3.21 to 3.22.
- ansible-test - Upgrade Fedora 41 to Fedora 42.
- ansible-test - Upgrade to ``coverage`` version 7.10.7 for Python 3.9 and later.
- ansible-test - Use OS packages to satisfy controller requirements on FreeBSD 13.5 during managed instance bootstrapping.
- apt_repository - use correct debug method to print debug message.
- blockinfile - add new module option ``encoding`` to support files in encodings other than UTF-8 (https://github.com/ansible/ansible/pull/85291).
- deb822_repository - Add automatic installation of the ``python3-debian`` package if it is missing by adding the parameter ``install_python_debian``
- default callback plugin - add option to configure indentation for JSON and YAML output (https://github.com/ansible/ansible/pull/85497).
- encrypt - check datatype of salt_size in password_hash filter.
- fetch_file - add ca_path and cookies parameter arguments (https://github.com/ansible/ansible/issues/85172).
- include_vars - Raise an error if 'extensions' is not specified as a list.
- include_vars - Raise an error if 'ignore_files' is not specified as a list.
- lineinfile - add new module option ``encoding`` to support files in encodings other than UTF-8 (https://github.com/ansible/ansible/pull/84999).
- regex - Document the match_type fullmatch.
- regex - Ensure that match_type is one of match, fullmatch, or search (https://github.com/ansible/ansible/pull/85629).
- replace - read/write files in text-mode as unicode chars instead of as bytes and switch regex matching to unicode chars instead of bytes. (https://github.com/ansible/ansible/pull/85785).
- service_facts - handle keyerror exceptions with warning.
- service_facts - warn user about missing service details instead of ignoring.
- setup - added new subkey ``lvs`` within each entry of ``ansible_facts['vgs']`` to provide complete logical volume data scoped by volume group. The top level ``lvs`` fact by comparison, deduplicates logical volume names across volume groups and may be incomplete. (https://github.com/ansible/ansible/issues/85632)
- six - bump six version from 1.16.0 to 1.17.0 (https://github.com/ansible/ansible/issues/85408).
- stat module - add SELinux context as a return value, and add a new option to trigger this return, which is False by default. (https://github.com/ansible/ansible/issues/85217).
- tags now warn when using reserved keywords.
- wrapt - bump version from 1.15.0 to 1.17.2 (https://github.com/ansible/ansible/issues/85407).
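The tech-preview play argument spec validation described in the first entry above could be wired up roughly as follows. This is an illustrative sketch: the play name, hosts, and option names are hypothetical; only the ``validate_argspec`` keyword, the ``<playbook_name>.meta.yml`` pattern, and the minimal ``argument_specs`` shape come from the entry itself.

```yaml
# site.yml - hypothetical playbook; the play name doubles as the spec name
- name: deploy
  hosts: webservers
  validate_argspec: true
  tasks:
    - ansible.builtin.debug:
        msg: "app_port is {{ app_port }}"
```

The spec would then live next to the playbook, per the ``<playbook_name>.meta.yml`` pattern:

```yaml
# site.meta.yml - at minimum this file needs the
# {"argument_specs": {"<name>": {"options": {}}}} shape from the entry
argument_specs:
  deploy:
    options:
      app_port:
        type: int
        required: true
```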
Breaking Changes / Porting Guide
--------------------------------
- powershell - Removed code that tried to remove quotes from paths when performing Windows operations like copying and fetching files. This should not affect normal playbooks unless a value is quoted too many times.
Deprecated Features
-------------------
- Deprecated the shell plugin's ``wrap_for_exec`` function. This API is not used in Ansible or any known collection and is being removed to simplify the plugin API. Plugin authors should wrap their command to execute within an explicit shell or other known executable.
- INJECT_FACTS_AS_VARS configuration currently defaults to ``True``; this is now deprecated and it will switch to ``False`` in Ansible 2.24. You will only get notified if you are accessing 'injected' facts (for example, ansible_os_distribution vs ansible_facts['os_distribution']).
- hash_params function in roles/__init__ is being deprecated as it is not in use.
- include_vars - Specifying 'ignore_files' as a string is deprecated.
- vars - the internal variable cache will be removed in 2.24. This cache, once used internally, exposes variables in inconsistent states; the 'vars' and 'varnames' lookups should be used instead.
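In practice, the ``INJECT_FACTS_AS_VARS`` deprecation above amounts to preferring the ``ansible_facts`` dictionary over injected top-level fact variables. A task-level sketch, using the variable names quoted in the entry:

```yaml
- ansible.builtin.debug:
    msg: "{{ ansible_facts['os_distribution'] }}"  # keeps working after the default flips
# the injected form the deprecation warns about:
#   msg: "{{ ansible_os_distribution }}"
```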
Removed Features (previously deprecated)
----------------------------------------
- Removed the option to set the ``DEFAULT_TRANSPORT`` configuration to ``smart`` that selects the default transport as either ``ssh`` or ``paramiko`` based on the underlying platform configuration.
- ``vault``/``unvault`` filters - remove the deprecated ``vaultid`` parameter.
- ansible-doc - role entrypoint attributes are no longer shown
- ansible-galaxy - removed the v2 Galaxy server API. Galaxy servers hosting collections must support v3.
- dnf/dnf5 - remove deprecated ``install_repoquery`` option.
- encrypt - remove deprecated passlib_or_crypt API.
- paramiko - Removed the ``PARAMIKO_HOST_KEY_AUTO_ADD`` and ``PARAMIKO_LOOK_FOR_KEYS`` configuration keys, which were previously deprecated.
- py3compat - remove deprecated ``py3compat.environ`` call.
- vars plugins - removed the deprecated ``get_host_vars`` or ``get_group_vars`` fallback for vars plugins that do not inherit from ``BaseVarsPlugin`` and define a ``get_vars`` method.
- yum_repository - remove deprecated ``keepcache`` option.
Bugfixes
--------
- Do not re-add ``tags`` on blocks from within ``import_tasks``.
- The ``ansible_failed_task`` variable is now correctly exposed in a rescue section, even when a failing handler is triggered by the ``flush_handlers`` task in the corresponding ``block`` (https://github.com/ansible/ansible/issues/85682)
- Windows async - Handle running PowerShell modules with trailing data after the module result
- ``ansible-galaxy collection list`` - fail when none of the configured collection paths exist.
- ``ternary`` filter - evaluate values lazily (https://github.com/ansible/ansible/issues/85743)
- ansible-doc --list/--list_files/--metadata-dump - fixed relative imports in nested filter/test plugin files (https://github.com/ansible/ansible/issues/85753).
- ansible-galaxy - Use the provided import task url, instead of parsing to get the task id and reconstructing the URL
- ansible-galaxy no longer shows the internal protomatter collection when listing.
- ansible-test - Always exclude the ``tests/output/`` directory from a collection's code coverage. (https://github.com/ansible/ansible/issues/84244)
- ansible-test - Fix a traceback that can occur when using delegation before the ansible-test temp directory is created.
- ansible-test - Limit package install retries during managed remote instance bootstrapping.
- ansible-test - Use a consistent coverage config for all collection testing.
- apt - mark dependencies installed as part of deb file installation as auto (https://github.com/ansible/ansible/issues/78123).
- argspec validation - The ``str`` argspec type treats ``None`` values as empty string for better consistency with pre-2.19 templating conversions.
- cache plugins - close temp cache file before moving it to fix error on WSL. (https://github.com/ansible/ansible/pull/85816)
- callback plugins - fix displaying the rendered ``ansible_host`` variable with ``delegate_to`` (https://github.com/ansible/ansible/issues/84922).
- callback plugins - improve consistency accessing the Task object's resolved_action attribute.
- conditionals - When displaying a broken conditional error or deprecation warning, the origin of the non-boolean result is included (if available), and the raw result is omitted.
- display - Fixed reference to undefined `_DeferredWarningContext` when issuing early warnings during startup. (https://github.com/ansible/ansible/issues/85886)
- dnf - Check if installroot is directory or not (https://github.com/ansible/ansible/issues/85680).
- failed_when - When using ``failed_when`` to suppress an error, the ``exception`` key in the result is renamed to ``failed_when_suppressed_exception``. This prevents the error from being displayed by callbacks after being suppressed. (https://github.com/ansible/ansible/issues/85505)
- import_tasks - fix templating parent include arguments.
- include_role - allow host specific values in all ``*_from`` arguments (https://github.com/ansible/ansible/issues/66497)
- pip - Fix pip module output so that it returns changed when the only operation is initializing a venv.
- plugins config, get_option_and_origin now correctly displays the value and origin of the option.
- run_command - Fixed premature selector unregistration on empty read from stdout/stderr that caused truncated output or hangs in rare situations.
- script inventory plugin will now show correct 'incorrect' type when doing implicit conversions on groups.
- ssh connection - fix documented variables for the ``host`` option. Connection options can be configured with delegated variables in general.
- template lookup - Skip finalization on the internal templating operation to allow markers to be returned and handled by, e.g. the ``default`` filter. Previously, finalization tripped markers, causing an exception to end processing of the current template pipeline. (https://github.com/ansible/ansible/issues/85674)
- templating - Avoid tripping markers within Jinja generated code. (https://github.com/ansible/ansible/issues/85674)
- templating - Ensure filter plugin result processing occurs under the correct call context. (https://github.com/ansible/ansible/issues/85585)
- templating - Fix slicing of tuples in templating (https://github.com/ansible/ansible/issues/85606).
- templating - Multi-node template results coerce embedded ``None`` nodes to empty string (instead of rendering literal ``None`` to the output).
- templating - Undefined marker values sourced from the Jinja ``getattr->getitem`` fallback are now accessed correctly, raising AnsibleUndefinedVariable for user plugins that do not understand markers. Previously, these values were erroneously returned to user plugin code that had not opted in to marker acceptance.
- tqm - use display.error_as_warning instead of display.warning_as_error.
- tqm - use display.error_as_warning instead of self.warning.
- uri - fix form-multipart file not being found when task is retried (https://github.com/ansible/ansible/issues/85009)
- validate-modules sanity test - fix handling of missing doc fragments (https://github.com/ansible/ansible/pull/85638).
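To illustrate the lazy ``ternary`` bugfix above: the branch that is not selected is no longer evaluated, so a guard pattern like the following (hypothetical variable name) should no longer trip over an undefined value in the untaken branch:

```yaml
- ansible.builtin.debug:
    msg: "{{ (maybe_missing is defined) | ternary(maybe_missing, 'fallback') }}"
```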
Known Issues
------------
- templating - Exceptions raised in a Jinja ``set`` or ``with`` block which are not accessed by the template are ignored in the same manner as undefined values.
- templating - Passing a container created in a Jinja ``set`` or ``with`` block to a method results in a copy of that container. Mutations to that container which are not returned by the method will be discarded.
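A minimal template exhibiting the second known issue above (names are hypothetical): the list is created inside a ``set`` block, so the ``append`` method call operates on a copy and the mutation can be lost.

```yaml
- ansible.builtin.debug:
    msg: "{% set bucket = [] %}{{ bucket.append(1) or '' }}{{ bucket }}"
    # 'bucket' may still render as [] because append mutated a copy
```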

@ -1,2 +1,356 @@
ancestor: 2.18.0
releases:
2.20.0b1:
changes:
breaking_changes:
- powershell - Removed code that tried to remove quotes from paths when performing
Windows operations like copying and fetching files. This should not affect
normal playbooks unless a value is quoted too many times.
bugfixes:
- Do not re-add ``tags`` on blocks from within ``import_tasks``.
- The ``ansible_failed_task`` variable is now correctly exposed in a rescue
section, even when a failing handler is triggered by the ``flush_handlers``
task in the corresponding ``block`` (https://github.com/ansible/ansible/issues/85682)
- Windows async - Handle running PowerShell modules with trailing data after
the module result
- '``ansible-galaxy collection list`` - fail when none of the configured collection
paths exist.'
- '``ternary`` filter - evaluate values lazily (https://github.com/ansible/ansible/issues/85743)'
- ansible-doc --list/--list_files/--metadata-dump - fixed relative imports in
nested filter/test plugin files (https://github.com/ansible/ansible/issues/85753).
- ansible-galaxy - Use the provided import task url, instead of parsing to get
the task id and reconstructing the URL
- ansible-galaxy no longer shows the internal protomatter collection when listing.
- ansible-test - Always exclude the ``tests/output/`` directory from a collection's
code coverage. (https://github.com/ansible/ansible/issues/84244)
- ansible-test - Fix a traceback that can occur when using delegation before
the ansible-test temp directory is created.
- ansible-test - Limit package install retries during managed remote instance
bootstrapping.
- ansible-test - Use a consistent coverage config for all collection testing.
- apt - mark dependencies installed as part of deb file installation as auto
(https://github.com/ansible/ansible/issues/78123).
- argspec validation - The ``str`` argspec type treats ``None`` values as empty
string for better consistency with pre-2.19 templating conversions.
- cache plugins - close temp cache file before moving it to fix error on WSL.
(https://github.com/ansible/ansible/pull/85816)
- callback plugins - fix displaying the rendered ``ansible_host`` variable with
``delegate_to`` (https://github.com/ansible/ansible/issues/84922).
- callback plugins - improve consistency accessing the Task object's resolved_action
attribute.
- conditionals - When displaying a broken conditional error or deprecation warning,
the origin of the non-boolean result is included (if available), and the raw
result is omitted.
- display - Fixed reference to undefined `_DeferredWarningContext` when issuing
early warnings during startup. (https://github.com/ansible/ansible/issues/85886)
- dnf - Check if installroot is directory or not (https://github.com/ansible/ansible/issues/85680).
- failed_when - When using ``failed_when`` to suppress an error, the ``exception``
key in the result is renamed to ``failed_when_suppressed_exception``. This
prevents the error from being displayed by callbacks after being suppressed.
(https://github.com/ansible/ansible/issues/85505)
- import_tasks - fix templating parent include arguments.
- include_role - allow host specific values in all ``*_from`` arguments (https://github.com/ansible/ansible/issues/66497)
- pip - Fix pip module output so that it returns changed when the only operation
is initializing a venv.
- plugins config, get_option_and_origin now correctly displays the value and
origin of the option.
- run_command - Fixed premature selector unregistration on empty read from stdout/stderr
that caused truncated output or hangs in rare situations.
- script inventory plugin will now show correct 'incorrect' type when doing
implicit conversions on groups.
- ssh connection - fix documented variables for the ``host`` option. Connection
options can be configured with delegated variables in general.
- template lookup - Skip finalization on the internal templating operation to
allow markers to be returned and handled by, e.g. the ``default`` filter.
Previously, finalization tripped markers, causing an exception to end processing
of the current template pipeline. (https://github.com/ansible/ansible/issues/85674)
- templating - Avoid tripping markers within Jinja generated code. (https://github.com/ansible/ansible/issues/85674)
- templating - Ensure filter plugin result processing occurs under the correct
call context. (https://github.com/ansible/ansible/issues/85585)
- templating - Fix slicing of tuples in templating (https://github.com/ansible/ansible/issues/85606).
- templating - Multi-node template results coerce embedded ``None`` nodes to
empty string (instead of rendering literal ``None`` to the output).
- templating - Undefined marker values sourced from the Jinja ``getattr->getitem``
fallback are now accessed correctly, raising AnsibleUndefinedVariable for
user plugins that do not understand markers. Previously, these values were
erroneously returned to user plugin code that had not opted in to marker acceptance.
- tqm - use display.error_as_warning instead of display.warning_as_error.
- tqm - use display.error_as_warning instead of self.warning.
- uri - fix form-multipart file not being found when task is retried (https://github.com/ansible/ansible/issues/85009)
- validate-modules sanity test - fix handling of missing doc fragments (https://github.com/ansible/ansible/pull/85638).
deprecated_features:
- Deprecated the shell plugin's ``wrap_for_exec`` function. This API is not
used in Ansible or any known collection and is being removed to simplify the
plugin API. Plugin authors should wrap their command to execute within an
explicit shell or other known executable.
- INJECT_FACTS_AS_VARS configuration currently defaults to ``True``; this is
now deprecated and it will switch to ``False`` in Ansible 2.24. You will only
get notified if you are accessing 'injected' facts (for example, ansible_os_distribution
vs ansible_facts['os_distribution']).
- hash_params function in roles/__init__ is being deprecated as it is not in
use.
- include_vars - Specifying 'ignore_files' as a string is deprecated.
- vars - the internal variable cache will be removed in 2.24. This cache, once
used internally, exposes variables in inconsistent states; the 'vars' and 'varnames'
lookups should be used instead.
known_issues:
- templating - Exceptions raised in a Jinja ``set`` or ``with`` block which
are not accessed by the template are ignored in the same manner as undefined
values.
- templating - Passing a container created in a Jinja ``set`` or ``with`` block
to a method results in a copy of that container. Mutations to that container
which are not returned by the method will be discarded.
major_changes:
- ansible - Add support for Python 3.14.
- ansible - Drop support for Python 3.11 on the controller.
- ansible - Drop support for Python 3.8 on targets.
minor_changes:
- 'Add tech preview play argument spec validation, which can be enabled by setting
the play keyword ``validate_argspec`` to ``True`` or the name of an argument
spec. When ``validate_argspec`` is set to ``True``, a play ``name`` is required
and used as the argument spec name. When enabled, the argument spec is loaded
from a file matching the pattern <playbook_name>.meta.yml. At minimum, this
file should contain ``{"argument_specs": {"name": {"options": {}}}}``, where
"name" is the name of the play or configured argument spec.'
- Added Univention Corporate Server as a part of Debian OS distribution family
(https://github.com/ansible/ansible/issues/85490).
- AnsibleModule - Add temporary internal monkeypatch-able hook to alter module
result serialization by splitting serialization from ``_return_formatted``
into ``_record_module_result``.
- Python type hints applied to ``to_text`` and ``to_bytes`` functions for better
type hint interactions with code utilizing these functions.
- ansible now warns if you use reserved tags that were only meant for selection
and not for use in play.
- ansible-doc - Return a more verbose error message when the ``description``
field is missing.
- ansible-doc - show ``notes``, ``seealso``, and top-level ``version_added``
for role entrypoints (https://github.com/ansible/ansible/pull/81796).
- ansible-doc adds support for RETURN documentation to support doc fragment
plugins
- ansible-test - Implement new authentication methods for accessing the Ansible
Core CI service.
- ansible-test - Improve formatting of generated coverage config file.
- ansible-test - Removed support for automatic provisioning of obsolete instances
for network-integration tests.
- ansible-test - Replace FreeBSD 14.2 with 14.3.
- ansible-test - Replace RHEL 9.5 with 9.6.
- ansible-test - Update Ubuntu containers.
- ansible-test - Update base/default containers to include Python 3.14.0.
- ansible-test - Update pinned sanity test requirements.
- ansible-test - Update test containers.
- ansible-test - Upgrade Alpine 3.21 to 3.22.
- ansible-test - Upgrade Fedora 41 to Fedora 42.
- ansible-test - Upgrade to ``coverage`` version 7.10.7 for Python 3.9 and later.
- ansible-test - Use OS packages to satisfy controller requirements on FreeBSD
13.5 during managed instance bootstrapping.
- apt_repository - use correct debug method to print debug message.
- blockinfile - add new module option ``encoding`` to support files in encodings
other than UTF-8 (https://github.com/ansible/ansible/pull/85291).
- deb822_repository - Add automatic installation of the ``python3-debian`` package
if it is missing by adding the parameter ``install_python_debian``
- default callback plugin - add option to configure indentation for JSON and
YAML output (https://github.com/ansible/ansible/pull/85497).
- encrypt - check datatype of salt_size in password_hash filter.
- fetch_file - add ca_path and cookies parameter arguments (https://github.com/ansible/ansible/issues/85172).
- include_vars - Raise an error if 'extensions' is not specified as a list.
- include_vars - Raise an error if 'ignore_files' is not specified as a list.
- lineinfile - add new module option ``encoding`` to support files in encodings
other than UTF-8 (https://github.com/ansible/ansible/pull/84999).
- regex - Document the match_type fullmatch.
- regex - Ensure that match_type is one of match, fullmatch, or search (https://github.com/ansible/ansible/pull/85629).
- replace - read/write files in text-mode as unicode chars instead of as bytes
and switch regex matching to unicode chars instead of bytes. (https://github.com/ansible/ansible/pull/85785).
- service_facts - handle KeyError exceptions with a warning.
- service_facts - warn the user about missing service details instead of ignoring them.
- setup - added new subkey ``lvs`` within each entry of ``ansible_facts['vgs']``
to provide complete logical volume data scoped by volume group. The top-level
``lvs`` fact, by comparison, deduplicates logical volume names across volume
groups and may be incomplete. (https://github.com/ansible/ansible/issues/85632)
- six - bump six version from 1.16.0 to 1.17.0 (https://github.com/ansible/ansible/issues/85408).
- stat module - add SELinux context as a return value, and add a new option
to trigger this return, which is False by default. (https://github.com/ansible/ansible/issues/85217).
- tags now warn when using reserved keywords.
- wrapt - bump version from 1.15.0 to 1.17.2 (https://github.com/ansible/ansible/issues/85407).
release_summary: '| Release Date: 2025-09-23
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
'
removed_features:
- Removed the option to set the ``DEFAULT_TRANSPORT`` configuration to ``smart``
that selects the default transport as either ``ssh`` or ``paramiko`` based
on the underlying platform configuration.
- '``vault``/``unvault`` filters - remove the deprecated ``vaultid`` parameter.'
- ansible-doc - role entrypoint attributes are no longer shown
- ansible-galaxy - removed the v2 Galaxy server API. Galaxy servers hosting
collections must support v3.
- dnf/dnf5 - remove deprecated ``install_repoquery`` option.
- encrypt - remove deprecated passlib_or_crypt API.
- paramiko - Removed the ``PARAMIKO_HOST_KEY_AUTO_ADD`` and ``PARAMIKO_LOOK_FOR_KEYS``
configuration keys, which were previously deprecated.
- py3compat - remove deprecated ``py3compat.environ`` call.
- vars plugins - removed the deprecated ``get_host_vars`` or ``get_group_vars``
fallback for vars plugins that do not inherit from ``BaseVarsPlugin`` and
define a ``get_vars`` method.
- yum_repository - remove deprecated ``keepcache`` option.
codename: Good Times Bad Times
fragments:
- 2.20.0b1_summary.yaml
- 66497-include_role-_from-dedup.yml
- 81796-ansible-doc-roles.yml
- 85010-uri-multipart-file-on-retry.yml
- 85217-stat-add-selinux-context.yml
- 85487-add-dependency-installation-to-deb822_repository.yml
- 85497-default-callback-indent.yml
- 85524-resolve-task-resolved_action-early.yml
- 85556-fix-pip-changed.yml
- 85596-hide-proto.yml
- 85599-fix-templating-import_tasks-parent-include.yml
- 85632-setup-logical-volume-name-uniqueness.yml
- 85638-ansible-test-validate-modules-doc-fragments.yml
- 85682-rescue-flush_handlers.yml
- 85743-lazy-ternary.yml
- 85816-wsl-cache-files.yml
- ansible-doc-description-verbosity.yml
- ansible-test-auth-update.yml
- ansible-test-bootstrap-retry.yml
- ansible-test-containers.yml
- ansible-test-coverage-config.yml
- ansible-test-coverage-upgrade.yml
- ansible-test-freebsd-14.3.yml
- ansible-test-freebsd-bootstrap.yml
- ansible-test-ios.yml
- ansible-test-missing-dir-fix.yml
- ansible-test-remotes.yml
- ansible-test-rhel-9.6.yml
- ansible-test-sanity-requirements.yml
- apt_deb_install.yml
- apt_repo_debug.yml
- blockinfile-new-module-option-encoding.yml
- concat_coerce_none_to_empty.yml
- deprecate_inject.yml
- display_internals.yml
- dnf-remove-install_repoquery.yml
- dnf_installroot_dir.yml
- elide_broken_conditional_result.yml
- encrypt.yml
- failed-when-exception.yml
- fetch_file.yml
- fix-displaying-delegate_to-ansible_host.yml
- fix-listing-nested-filter-and-test-plugins.yml
- fix_script_error.yml
- galaxy-use-import-task-url.yml
- getattr_marker_access.yml
- hide_proto.yml
- import_tasks-dont-readd-tags.yml
- include_vars.yml
- known_issues_jinja_error.yml
- lineinfile-new-module-option-encoding.yml
- module_direct_exec.yml
- openrc.yml
- paramiko-global-config-removal.yml
- password_hash_encrypt.yml
- play-argument-spec-validation.yml
- plugins_fix_origin.yml
- powershell-quoting.yml
- python-support.yml
- regex_test.yml
- remove-role-entrypoint-attrs.yml
- remove-v2-galaxy-api.yml
- remove_hash_params.yml
- remove_py3compat.yml
- replace-update-string-comparison-method-to-unicode.yml
- return_fragments.yml
- run_command_output_selector.yml
- shell-wrap_for_exec_deprecation.yml
- six_1.7.0.yml
- smart-transport-removal.yml
- tag_u_it.yml
- template-tuple-fix.yml
- template_lookup_skip_finalize.yml
- templating-filter-generators.yml
- to-text-to-bytes.yml
- tqm.yml
- ucs.yml
- vars-remove-get_hostgroup_vars.yml
- vars_begone.yml
- vault-vaultid-removal.yml
- warn_reserved_tags.yml
- win_async-junk-output.yml
- wrapt_1.17.2.yml
- yum_repository-remove-keepcache.yml
release_date: '2025-09-23'
2.20.0b2:
changes:
bugfixes:
- Fix issue where play tags prevented executing notified handlers (https://github.com/ansible/ansible/issues/85475)
- Fix issues with keywords being incorrectly validated on ``import_tasks`` (https://github.com/ansible/ansible/issues/85855,
https://github.com/ansible/ansible/issues/85856)
- Fix traceback when trying to import non-existing file via nested ``import_tasks``
(https://github.com/ansible/ansible/issues/69882)
- ansible-doc - prevent crash when scanning collections in paths that have more
than one ``ansible_collections`` in it (https://github.com/ansible/ansible/issues/84909,
https://github.com/ansible/ansible/pull/85361).
- fetch - also return ``file`` in the result when changed is ``True`` (https://github.com/ansible/ansible/pull/85729).
minor_changes:
- DataLoader - Update ``DataLoader.get_basedir`` to be an abspath
- known_hosts - return rc and stderr when ssh-keygen command fails for further
debugging (https://github.com/ansible/ansible/issues/85850).
release_summary: '| Release Date: 2025-10-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
'
removed_features:
- ansible-galaxy - remove support for resolvelib >= 0.5.3, < 0.8.0.
codename: Good Times Bad Times
fragments:
- 2.20.0b2_summary.yaml
- 85361-collection-name-from-path-none.yml
- 85475-fix-flush_handlers-play-tags.yml
- data-loader-basedir-abspath.yml
- drop-resolvelib-lt-0_8_0.yml
- fix-fetch-return-file.yml
- import_tasks-fixes.yml
- known_hosts.yml
release_date: '2025-10-06'
2.20.0rc1:
changes:
bugfixes:
- SIGINT/SIGTERM Handling - Make SIGINT/SIGTERM handling more robust by splitting
concerns between forks and the parent.
minor_changes:
- ansible-test - Default to Python 3.14 in the ``base`` and ``default`` test
containers.
- ansible-test - Filter out pylint messages for invalid filenames and display
a notice when doing so.
- ansible-test - Update astroid imports in custom pylint checkers.
- ansible-test - Update pinned ``pip`` version to 25.2.
- ansible-test - Update pinned sanity test requirements, including upgrading
to pylint 4.0.0.
release_summary: '| Release Date: 2025-10-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
'
codename: Good Times Bad Times
fragments:
- 2.20.0rc1_summary.yaml
- ansible-test-sanity-requirements-again.yml
- fix-signal-propagation.yml
release_date: '2025-10-14'
2.20.0rc2:
changes:
bugfixes:
- psrp - ReadTimeout exceptions now mark host as unreachable instead of fatal
(https://github.com/ansible/ansible/issues/85966)
release_summary: '| Release Date: 2025-10-20
| `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__
'
codename: Good Times Bad Times
fragments:
- 2.20.0rc2_summary.yaml
- 85966-psrp-readtimeout.yml
release_date: '2025-10-20'

@@ -0,0 +1,3 @@
+release_summary: |
+  | Release Date: 2025-09-23
+  | `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__

@@ -0,0 +1,3 @@
+release_summary: |
+  | Release Date: 2025-10-06
+  | `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__

@@ -0,0 +1,3 @@
+release_summary: |
+  | Release Date: 2025-10-14
+  | `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__

@@ -0,0 +1,3 @@
+release_summary: |
+  | Release Date: 2025-10-20
+  | `Porting Guide <https://docs.ansible.com/ansible-core/2.20/porting_guides/porting_guide_core_2.20.html>`__

@@ -0,0 +1,3 @@
+bugfixes:
+  - "ansible-doc - prevent crash when scanning collections in paths that have more than one ``ansible_collections`` in it
+    (https://github.com/ansible/ansible/issues/84909, https://github.com/ansible/ansible/pull/85361)."

@@ -0,0 +1,2 @@
+bugfixes:
+  - Fix issue where play tags prevented executing notified handlers (https://github.com/ansible/ansible/issues/85475)

@@ -0,0 +1,2 @@
+bugfixes:
+  - psrp - ReadTimeout exceptions now mark host as unreachable instead of fatal (https://github.com/ansible/ansible/issues/85966)

@@ -1,2 +1,3 @@
 minor_changes:
   - ansible-test - Update test containers.
+  - ansible-test - Update base/default containers to include Python 3.14.0.

@@ -1,2 +1,2 @@
 minor_changes:
-  - ansible-test - Upgrade to ``coverage`` version 7.10.6 for Python 3.9 and later.
+  - ansible-test - Upgrade to ``coverage`` version 7.10.7 for Python 3.9 and later.

@@ -0,0 +1,6 @@
+minor_changes:
+  - ansible-test - Update pinned sanity test requirements, including upgrading to pylint 4.0.0.
+  - ansible-test - Filter out pylint messages for invalid filenames and display a notice when doing so.
+  - ansible-test - Update astroid imports in custom pylint checkers.
+  - ansible-test - Default to Python 3.14 in the ``base`` and ``default`` test containers.
+  - ansible-test - Update pinned ``pip`` version to 25.2.

@@ -0,0 +1,2 @@
+minor_changes:
+  - DataLoader - Update ``DataLoader.get_basedir`` to be an abspath

@@ -0,0 +1,2 @@
+removed_features:
+  - ansible-galaxy - remove support for resolvelib >= 0.5.3, < 0.8.0.

@@ -0,0 +1,2 @@
+bugfixes:
+  - fetch - also return ``file`` in the result when changed is ``True`` (https://github.com/ansible/ansible/pull/85729).

@@ -0,0 +1,3 @@
+bugfixes:
+  - SIGINT/SIGTERM Handling - Make SIGINT/SIGTERM handling more robust by splitting concerns
+    between forks and the parent.

@@ -0,0 +1,3 @@
+bugfixes:
+  - Fix traceback when trying to import non-existing file via nested ``import_tasks`` (https://github.com/ansible/ansible/issues/69882)
+  - Fix issues with keywords being incorrectly validated on ``import_tasks`` (https://github.com/ansible/ansible/issues/85855, https://github.com/ansible/ansible/issues/85856)

@@ -0,0 +1,3 @@
+---
+minor_changes:
+  - known_hosts - return rc and stderr when ssh-keygen command fails for further debugging (https://github.com/ansible/ansible/issues/85850).

@@ -236,7 +236,9 @@ class RoleMixin(object):
         b_colldirs = list_collection_dirs(coll_filter=collection_filter)
         for b_path in b_colldirs:
             path = to_text(b_path, errors='surrogate_or_strict')
-            collname = _get_collection_name_from_path(b_path)
+            if not (collname := _get_collection_name_from_path(b_path)):
+                display.debug(f'Skipping invalid path {b_path!r}')
+                continue

             roles_dir = os.path.join(path, 'roles')
             if os.path.exists(roles_dir):

@@ -17,8 +17,10 @@ def list_collections(coll_filter=None, search_paths=None, dedupe=True, artifacts_manager=None):
     collections = {}
     for candidate in list_collection_dirs(search_paths=search_paths, coll_filter=coll_filter, artifacts_manager=artifacts_manager, dedupe=dedupe):
-        collection = _get_collection_name_from_path(candidate)
-        collections[collection] = candidate
+        if collection := _get_collection_name_from_path(candidate):
+            collections[collection] = candidate
+        else:
+            display.debug(f'Skipping invalid collection in path: {candidate!r}')
     return collections
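Both hunks above replace a separate assign-then-test with an assignment expression (the walrus operator), so directories that do not resolve to a collection name are skipped in a single step. A minimal standalone sketch of the same pattern — `parse_name` and the sample paths are made up for illustration and stand in for `_get_collection_name_from_path`:

```python
from __future__ import annotations

import os


def parse_name(path: str) -> str | None:
    """Stand-in for _get_collection_name_from_path: falsey when unparseable."""
    parts = [p for p in path.split(os.sep) if p]
    if len(parts) < 2:
        return None  # not enough components to form namespace.name
    return '.'.join(parts[-2:])


def collect(paths: list[str]) -> dict[str, str]:
    collections = {}
    for candidate in paths:
        # assign and test in one expression; falsey (invalid) results are skipped
        if name := parse_name(candidate):
            collections[name] = candidate
    return collections


print(collect(['/root/ns/coll', 'orphan']))  # {'ns.coll': '/root/ns/coll'} on POSIX
```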

@@ -17,6 +17,7 @@
 from __future__ import annotations

+import errno
 import io
 import os
 import signal
@@ -103,11 +104,19 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defined]
         self._cliargs = cliargs

     def _term(self, signum, frame) -> None:
-        """
-        terminate the process group created by calling setsid when
-        a terminate signal is received by the fork
-        """
-        os.killpg(self.pid, signum)
+        """In child termination when notified by the parent"""
+        signal.signal(signum, signal.SIG_DFL)
+
+        try:
+            os.killpg(self.pid, signum)
+            os.kill(self.pid, signum)
+        except OSError as e:
+            if e.errno != errno.ESRCH:
+                signame = signal.strsignal(signum)
+                display.error(f'Unable to send {signame} to child[{self.pid}]: {e}')
+
+        # fallthrough, if we are still here, just die
+        os._exit(1)

     def start(self) -> None:
         """
@@ -121,11 +130,6 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defined]
         # FUTURE: this lock can be removed once a more generalized pre-fork thread pause is in place
         with display._lock:
             super(WorkerProcess, self).start()
-        # Since setsid is called later, if the worker is termed
-        # it won't term the new process group
-        # register a handler to propagate the signal
-        signal.signal(signal.SIGTERM, self._term)
-        signal.signal(signal.SIGINT, self._term)

     def _hard_exit(self, e: str) -> t.NoReturn:
         """
@@ -170,7 +174,6 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defined]
             # to give better errors, and to prevent fd 0 reuse
             sys.stdin.close()
         except Exception as e:
-            display.debug(f'Could not detach from stdio: {traceback.format_exc()}')
             display.error(f'Could not detach from stdio: {e}')
             os._exit(1)
@@ -187,6 +190,9 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defined]
         # Set the queue on Display so calls to Display.display are proxied over the queue
         display.set_queue(self._final_q)
         self._detach()
+        # propagate signals
+        signal.signal(signal.SIGINT, self._term)
+        signal.signal(signal.SIGTERM, self._term)
         try:
             with _task.TaskContext(self._task):
                 return self._run()
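The heart of the worker-side change above is tolerating `ESRCH` ("no such process") when forwarding a signal: during shutdown the target may already be gone, and that must not be reported as an error. A standalone sketch of that idiom (`forward_signal` is a made-up helper, not an Ansible API):

```python
# ESRCH-tolerant signal forwarding, as used in the new WorkerProcess._term().
import errno
import os
import signal
import subprocess


def forward_signal(pid: int, signum: int) -> bool:
    """Send signum to pid; treat an already-gone process (ESRCH) as a no-op."""
    try:
        os.kill(pid, signum)
    except OSError as e:
        if e.errno != errno.ESRCH:
            raise  # a real failure, e.g. EPERM
        return False
    return True


# A reaped child no longer exists, so forwarding becomes a quiet no-op.
child = subprocess.Popen(['sleep', '0'])
child.wait()
print(forward_signal(child.pid, signal.SIGTERM))  # False
print(forward_signal(os.getpid(), 0))             # True (signal 0 only checks existence)
```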

@@ -18,8 +18,10 @@
 from __future__ import annotations

 import dataclasses
+import errno
 import os
 import sys
+import signal
 import tempfile
 import threading
 import time
@@ -185,8 +187,48 @@ class TaskQueueManager:
         # plugins for inter-process locking.
         self._connection_lockfile = tempfile.TemporaryFile()

+        self._workers: list[WorkerProcess | None] = []
+
+        # signal handlers to propagate signals to workers
+        signal.signal(signal.SIGTERM, self._signal_handler)
+        signal.signal(signal.SIGINT, self._signal_handler)
+
     def _initialize_processes(self, num: int) -> None:
-        self._workers: list[WorkerProcess | None] = [None] * num
+        # mutable update to ensure the reference stays the same
+        self._workers[:] = [None] * num
+
+    def _signal_handler(self, signum, frame) -> None:
+        """
+        terminate all running process groups created as a result of calling
+        setsid from within a WorkerProcess.
+
+        Since the children become process leaders, signals will not
+        automatically propagate to them.
+        """
+        signal.signal(signum, signal.SIG_DFL)
+
+        for worker in self._workers:
+            if worker is None or not worker.is_alive():
+                continue
+
+            if worker.pid:
+                try:
+                    # notify workers
+                    os.kill(worker.pid, signum)
+                except OSError as e:
+                    if e.errno != errno.ESRCH:
+                        signame = signal.strsignal(signum)
+                        display.error(f'Unable to send {signame} to child[{worker.pid}]: {e}')
+
+        if signum == signal.SIGINT:
+            # Defer to CLI handling
+            raise KeyboardInterrupt()
+
+        pid = os.getpid()
+        try:
+            os.kill(pid, signum)
+        except OSError as e:
+            signame = signal.strsignal(signum)
+            display.error(f'Unable to send {signame} to {pid}: {e}')

     def load_callbacks(self):
         """

@@ -1839,10 +1839,13 @@ def _resolve_depenency_map(
         offline=offline,
     )
     try:
-        return collection_dep_resolver.resolve(
-            requested_requirements,
-            max_rounds=2000000,  # NOTE: same constant pip uses
-        ).mapping
+        return t.cast(
+            dict[str, Candidate],
+            collection_dep_resolver.resolve(
+                requested_requirements,
+                max_rounds=2000000,  # NOTE: same constant pip uses
+            ).mapping,
+        )
     except CollectionDependencyResolutionImpossible as dep_exc:
         conflict_causes = (
             '* {req.fqcn!s}:{req.ver!s} ({dep_origin!s})'.format(
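The hunk above wraps the resolver result in `t.cast` purely to tell the type checker the mapping's shape; `typing.cast` has no runtime effect. A quick sketch (the dict contents are illustrative):

```python
# typing.cast only informs the static type checker; at runtime it returns its
# second argument unchanged, so wrapping a value in cast() costs nothing.
import typing as t

mapping: object = {'community.general': '12.0.0'}  # shape is opaque to the checker

typed = t.cast(dict[str, str], mapping)
print(typed is mapping)  # True: same object, no copy and no runtime check
```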

@@ -5,6 +5,7 @@
 from __future__ import annotations

+import collections.abc as _c
 import typing as t

 if t.TYPE_CHECKING:
@@ -21,15 +22,15 @@ from ansible.galaxy.dependency_resolution.resolvers import CollectionDependencyResolver


 def build_collection_dependency_resolver(
-    galaxy_apis,  # type: t.Iterable[GalaxyAPI]
-    concrete_artifacts_manager,  # type: ConcreteArtifactsManager
-    preferred_candidates=None,  # type: t.Iterable[Candidate]
-    with_deps=True,  # type: bool
-    with_pre_releases=False,  # type: bool
-    upgrade=False,  # type: bool
-    include_signatures=True,  # type: bool
-    offline=False,  # type: bool
-):  # type: (...) -> CollectionDependencyResolver
+    galaxy_apis: _c.Iterable[GalaxyAPI],
+    concrete_artifacts_manager: ConcreteArtifactsManager,
+    preferred_candidates: _c.Iterable[Candidate] | None = None,
+    with_deps: bool = True,
+    with_pre_releases: bool = False,
+    upgrade: bool = False,
+    include_signatures: bool = True,
+    offline: bool = False,
+) -> CollectionDependencyResolver:
     """Return a collection dependency resolver.

     The returned instance will have a ``resolve()`` method for
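The conversion above turns comment-style hints (`# type: ...`) into real annotations. One practical difference: real annotations are introspectable at runtime. A sketch with a made-up stand-in function (`build_resolver` is not the actual Ansible function):

```python
# Comment-style hints are invisible at runtime; real annotations land in
# __annotations__ and can be recovered with typing.get_type_hints().
import collections.abc as _c
import typing as t


def build_resolver(
    apis: _c.Iterable[str],
    with_deps: bool = True,
) -> str:
    return ','.join(apis)


hints = t.get_type_hints(build_resolver)
print(hints['with_deps'] is bool, hints['return'] is str)  # True True
```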

@@ -6,12 +6,12 @@
 from __future__ import annotations

+import collections.abc as _c
 import os
 import pathlib
 import typing as t

 from collections import namedtuple
-from collections.abc import MutableSequence, MutableMapping
 from glob import iglob
 from urllib.parse import urlparse
 from yaml import safe_load
@@ -43,7 +43,12 @@ _SOURCE_METADATA_FILE = b'GALAXY.yml'
 display = Display()


-def get_validated_source_info(b_source_info_path, namespace, name, version):
+def get_validated_source_info(
+    b_source_info_path: bytes,
+    namespace: str,
+    name: str,
+    version: str,
+) -> dict[str, object] | None:
     source_info_path = to_text(b_source_info_path, errors='surrogate_or_strict')

     if not os.path.isfile(b_source_info_path):
@@ -58,7 +63,7 @@ def get_validated_source_info(b_source_info_path, namespace, name, version):
         )
         return None

-    if not isinstance(metadata, MutableMapping):
+    if not isinstance(metadata, dict):
         display.warning(f"Error getting collection source information at '{source_info_path}': expected a YAML dictionary")
         return None
@@ -72,7 +77,12 @@ def get_validated_source_info(b_source_info_path, namespace, name, version):
     return metadata


-def _validate_v1_source_info_schema(namespace, name, version, provided_arguments):
+def _validate_v1_source_info_schema(
+    namespace: str,
+    name: str,
+    version: str,
+    provided_arguments: dict[str, object],
+) -> list[str]:
     argument_spec_data = dict(
         format_version=dict(choices=["1.0.0"]),
         download_url=dict(),
@@ -102,24 +112,24 @@ def _validate_v1_source_info_schema(namespace, name, version, provided_arguments
     return validation_result.error_messages


-def _is_collection_src_dir(dir_path):
+def _is_collection_src_dir(dir_path: bytes | str) -> bool:
     b_dir_path = to_bytes(dir_path, errors='surrogate_or_strict')
     return os.path.isfile(os.path.join(b_dir_path, _GALAXY_YAML))


-def _is_installed_collection_dir(dir_path):
+def _is_installed_collection_dir(dir_path: bytes | str) -> bool:
     b_dir_path = to_bytes(dir_path, errors='surrogate_or_strict')
     return os.path.isfile(os.path.join(b_dir_path, _MANIFEST_JSON))


-def _is_collection_dir(dir_path):
+def _is_collection_dir(dir_path: bytes | str) -> bool:
     return (
         _is_installed_collection_dir(dir_path) or
         _is_collection_src_dir(dir_path)
     )


-def _find_collections_in_subdirs(dir_path):
+def _find_collections_in_subdirs(dir_path: str) -> _c.Iterator[bytes]:
     b_dir_path = to_bytes(dir_path, errors='surrogate_or_strict')

     subdir_glob_pattern = os.path.join(
@@ -135,23 +145,23 @@ def _find_collections_in_subdirs(dir_path):
         yield subdir


-def _is_collection_namespace_dir(tested_str):
+def _is_collection_namespace_dir(tested_str: str) -> bool:
     return any(_find_collections_in_subdirs(tested_str))


-def _is_file_path(tested_str):
+def _is_file_path(tested_str: str) -> bool:
     return os.path.isfile(to_bytes(tested_str, errors='surrogate_or_strict'))


-def _is_http_url(tested_str):
+def _is_http_url(tested_str: str) -> bool:
     return urlparse(tested_str).scheme.lower() in {'http', 'https'}


-def _is_git_url(tested_str):
+def _is_git_url(tested_str: str) -> bool:
     return tested_str.startswith(('git+', 'git@'))


-def _is_concrete_artifact_pointer(tested_str):
+def _is_concrete_artifact_pointer(tested_str: str) -> bool:
     return any(
         predicate(tested_str)
         for predicate in (
@@ -168,7 +178,7 @@ def _is_concrete_artifact_pointer(tested_str):
 class _ComputedReqKindsMixin:
     UNIQUE_ATTRS = ('fqcn', 'ver', 'src', 'type')

-    def __init__(self, *args, **kwargs):
+    def __init__(self, *args, **kwargs) -> None:
         if not self.may_have_offline_galaxy_info:
             self._source_info = None
         else:
@@ -181,18 +191,18 @@ class _ComputedReqKindsMixin:
             self.ver
         )

-    def __hash__(self):
+    def __hash__(self) -> int:
         return hash(tuple(getattr(self, attr) for attr in _ComputedReqKindsMixin.UNIQUE_ATTRS))

-    def __eq__(self, candidate):
+    def __eq__(self, candidate: _c.Hashable) -> bool:
         return hash(self) == hash(candidate)

     @classmethod
-    def from_dir_path_as_unknown(  # type: ignore[misc]
-        cls,  # type: t.Type[Collection]
-        dir_path,  # type: bytes
-        art_mgr,  # type: ConcreteArtifactsManager
-    ):  # type: (...) -> Collection
+    def from_dir_path_as_unknown(
+        cls,
+        dir_path: bytes,
+        art_mgr: ConcreteArtifactsManager,
+    ) -> t.Self:
         """Make collection from an unspecified dir type.

         This alternative constructor attempts to grab metadata from the
@@ -215,11 +225,11 @@ class _ComputedReqKindsMixin:
         return cls.from_dir_path_implicit(dir_path)

     @classmethod
-    def from_dir_path(  # type: ignore[misc]
-        cls,  # type: t.Type[Collection]
-        dir_path,  # type: bytes
-        art_mgr,  # type: ConcreteArtifactsManager
-    ):  # type: (...) -> Collection
+    def from_dir_path(
+        cls,
+        dir_path: bytes,
+        art_mgr: ConcreteArtifactsManager,
+    ) -> t.Self:
         """Make collection from an directory with metadata."""
         if dir_path.endswith(to_bytes(os.path.sep)):
             dir_path = dir_path.rstrip(to_bytes(os.path.sep))
@@ -262,10 +272,10 @@ class _ComputedReqKindsMixin:
         return cls(req_name, req_version, dir_path, 'dir', None)

     @classmethod
-    def from_dir_path_implicit(  # type: ignore[misc]
-        cls,  # type: t.Type[Collection]
-        dir_path,  # type: bytes
-    ):  # type: (...) -> Collection
+    def from_dir_path_implicit(
+        cls,
+        dir_path: bytes,
+    ) -> t.Self:
         """Construct a collection instance based on an arbitrary dir.

         This alternative constructor infers the FQCN based on the parent
@@ -278,11 +288,16 @@ class _ComputedReqKindsMixin:
         u_dir_path = to_text(dir_path, errors='surrogate_or_strict')
         path_list = u_dir_path.split(os.path.sep)
         req_name = '.'.join(path_list[-2:])
-        return cls(req_name, '*', dir_path, 'dir', None)  # type: ignore[call-arg]
+        return cls(req_name, '*', dir_path, 'dir', None)

     @classmethod
-    def from_string(cls, collection_input, artifacts_manager, supplemental_signatures):
-        req = {}
+    def from_string(
+        cls,
+        collection_input: str,
+        artifacts_manager: ConcreteArtifactsManager,
+        supplemental_signatures: list[str] | None,
+    ) -> t.Self:
+        req: dict[str, str | list[str] | None] = {}
         if _is_concrete_artifact_pointer(collection_input) or AnsibleCollectionRef.is_valid_collection_name(collection_input):
             # Arg is a file path or URL to a collection, or just a collection
             req['name'] = collection_input
@@ -307,7 +322,14 @@ class _ComputedReqKindsMixin:
         return cls.from_requirement_dict(req, artifacts_manager)

     @classmethod
-    def from_requirement_dict(cls, collection_req, art_mgr, validate_signature_options=True):
+    def from_requirement_dict(
+        cls,
+        # NOTE: The actual `collection_req` shape is supposed to be
+        # NOTE: `dict[str, str | list[str] | None]`
+        collection_req: dict[str, t.Any],
+        art_mgr: ConcreteArtifactsManager,
+        validate_signature_options: bool = True,
+    ) -> t.Self:
         req_name = collection_req.get('name', None)
         req_version = collection_req.get('version', '*')
         req_type = collection_req.get('type')
@@ -320,7 +342,7 @@ class _ComputedReqKindsMixin:
                 f"Signatures were provided to verify {req_name} but no keyring was configured."
             )

-        if not isinstance(req_signature_sources, MutableSequence):
+        if not isinstance(req_signature_sources, _c.MutableSequence):
             req_signature_sources = [req_signature_sources]
         req_signature_sources = frozenset(req_signature_sources)
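The `isinstance` check above (unchanged except for the new `_c` alias) is the common scalar-or-list normalization: a single signature source string gets wrapped in a list before deduplication, because `str` is a `Sequence` but not a `MutableSequence`. A standalone sketch of that pattern (`normalize_sources` and the URLs are illustrative, not Ansible APIs):

```python
# Normalize a scalar-or-list option the way the hunk above treats
# req_signature_sources: wrap a lone string, then deduplicate.
import collections.abc as _c


def normalize_sources(sources):
    # a str is a Sequence but not a MutableSequence, so a lone string is wrapped
    if not isinstance(sources, _c.MutableSequence):
        sources = [sources]
    return frozenset(sources)


print(normalize_sources('https://sig.example/key') ==
      normalize_sources(['https://sig.example/key']))  # True
```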
@@ -434,7 +456,11 @@ class _ComputedReqKindsMixin:
                 format(not_url=req_source.api_server),
             )

-        if req_type == 'dir' and req_source.endswith(os.path.sep):
+        if (
+            req_type == 'dir'
+            and isinstance(req_source, str)
+            and req_source.endswith(os.path.sep)
+        ):
             req_source = req_source.rstrip(os.path.sep)

         tmp_inst_req = cls(req_name, req_version, req_source, req_type, req_signature_sources)
@@ -451,16 +477,16 @@ class _ComputedReqKindsMixin:
             req_signature_sources,
         )

-    def __repr__(self):
+    def __repr__(self) -> str:
         return (
             '<{self!s} of type {coll_type!r} from {src!s}>'.
             format(self=self, coll_type=self.type, src=self.src or 'Galaxy')
         )

-    def __str__(self):
+    def __str__(self) -> str:
         return to_native(self.__unicode__())

-    def __unicode__(self):
+    def __unicode__(self) -> str:
         if self.fqcn is None:
             return (
                 f'{self.type} collection from a Git repo' if self.is_scm
@@ -473,7 +499,7 @@ class _ComputedReqKindsMixin:
         )

     @property
-    def may_have_offline_galaxy_info(self):
+    def may_have_offline_galaxy_info(self) -> bool:
         if self.fqcn is None:
             # Virtual collection
             return False
@@ -482,7 +508,7 @@ class _ComputedReqKindsMixin:
             return False

         return True

-    def construct_galaxy_info_path(self, b_collection_path):
+    def construct_galaxy_info_path(self, b_collection_path: bytes) -> bytes:
         if not self.may_have_offline_galaxy_info and not self.type == 'galaxy':
             raise TypeError('Only installed collections from a Galaxy server have offline Galaxy info')
@@ -502,21 +528,21 @@ class _ComputedReqKindsMixin:
         return self.fqcn.split('.')

     @property
-    def namespace(self):
+    def namespace(self) -> str:
         if self.is_virtual:
             raise TypeError(f'{self.type} collections do not have a namespace')

         return self._get_separate_ns_n_name()[0]

     @property
-    def name(self):
+    def name(self) -> str:
         if self.is_virtual:
             raise TypeError(f'{self.type} collections do not have a name')

         return self._get_separate_ns_n_name()[-1]

     @property
-    def canonical_package_id(self):
+    def canonical_package_id(self) -> str:
         if not self.is_virtual:
             return to_native(self.fqcn)
@ -526,46 +552,46 @@ class _ComputedReqKindsMixin:
) )
@property @property
def is_virtual(self): def is_virtual(self) -> bool:
return self.is_scm or self.is_subdirs return self.is_scm or self.is_subdirs
@property @property
def is_file(self): def is_file(self) -> bool:
return self.type == 'file' return self.type == 'file'
@property @property
def is_dir(self): def is_dir(self) -> bool:
return self.type == 'dir' return self.type == 'dir'
@property @property
def namespace_collection_paths(self): def namespace_collection_paths(self) -> list[str]:
return [ return [
to_native(path) to_native(path)
for path in _find_collections_in_subdirs(self.src) for path in _find_collections_in_subdirs(self.src)
] ]
@property @property
def is_subdirs(self): def is_subdirs(self) -> bool:
return self.type == 'subdirs' return self.type == 'subdirs'
@property @property
def is_url(self): def is_url(self) -> bool:
return self.type == 'url' return self.type == 'url'
@property @property
def is_scm(self): def is_scm(self) -> bool:
return self.type == 'git' return self.type == 'git'
@property @property
def is_concrete_artifact(self): def is_concrete_artifact(self) -> bool:
return self.type in {'git', 'url', 'file', 'dir', 'subdirs'} return self.type in {'git', 'url', 'file', 'dir', 'subdirs'}
@property @property
def is_online_index_pointer(self): def is_online_index_pointer(self) -> bool:
return not self.is_concrete_artifact return not self.is_concrete_artifact
@property @property
def is_pinned(self): def is_pinned(self) -> bool:
"""Indicate if the version set is considered pinned. """Indicate if the version set is considered pinned.
This essentially computes whether the version field of the current This essentially computes whether the version field of the current
@ -585,7 +611,7 @@ class _ComputedReqKindsMixin:
) )
@property @property
def source_info(self): def source_info(self) -> dict[str, object] | None:
return self._source_info return self._source_info
@ -601,11 +627,11 @@ class Requirement(
): ):
"""An abstract requirement request.""" """An abstract requirement request."""
def __new__(cls, *args, **kwargs): def __new__(cls, *args: object, **kwargs: object) -> t.Self:
self = RequirementNamedTuple.__new__(cls, *args, **kwargs) self = RequirementNamedTuple.__new__(cls, *args, **kwargs)
return self return self
def __init__(self, *args, **kwargs): def __init__(self, *args: object, **kwargs: object) -> None:
super(Requirement, self).__init__() super(Requirement, self).__init__()
@ -615,14 +641,14 @@ class Candidate(
): ):
"""A concrete collection candidate with its version resolved.""" """A concrete collection candidate with its version resolved."""
def __new__(cls, *args, **kwargs): def __new__(cls, *args: object, **kwargs: object) -> t.Self:
self = CandidateNamedTuple.__new__(cls, *args, **kwargs) self = CandidateNamedTuple.__new__(cls, *args, **kwargs)
return self return self
def __init__(self, *args, **kwargs): def __init__(self, *args: object, **kwargs: object) -> None:
super(Candidate, self).__init__() super(Candidate, self).__init__()
def with_signatures_repopulated(self): # type: (Candidate) -> Candidate def with_signatures_repopulated(self) -> Candidate:
"""Populate a new Candidate instance with Galaxy signatures. """Populate a new Candidate instance with Galaxy signatures.
:raises AnsibleAssertionError: If the supplied candidate is not sourced from a Galaxy-like index. :raises AnsibleAssertionError: If the supplied candidate is not sourced from a Galaxy-like index.
""" """

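The `Requirement` and `Candidate` classes in the hunk above subclass generated named tuples and now annotate `__new__` with `t.Self`. A minimal sketch of the same pattern, with illustrative names (`PointBase`/`Point` are not from the ansible codebase), looks like this:

```python
import typing as t


class PointBase(t.NamedTuple):
    """Generated named-tuple base, analogous to RequirementNamedTuple."""
    x: int
    y: int


class Point(PointBase):
    # Returning "t.Self" (typing.Self, Python 3.11+; quoted here so the
    # annotation is never evaluated at runtime) tells type checkers that
    # construction through the subclass yields the subclass, not the base.
    def __new__(cls, *args: object, **kwargs: object) -> "t.Self":
        return PointBase.__new__(cls, *args, **kwargs)


p = Point(1, 2)
```

The instance remains a plain tuple underneath, so equality and unpacking keep working as they do for the generated base class.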
@@ -5,6 +5,7 @@
 from __future__ import annotations

+import collections.abc as _c
 import functools
 import typing as t
@@ -15,6 +16,8 @@ if t.TYPE_CHECKING:
     from ansible.galaxy.collection.galaxy_api_proxy import MultiGalaxyAPIProxy
     from ansible.galaxy.api import GalaxyAPI
+    from resolvelib.structs import RequirementInformation

 from ansible.galaxy.collection.gpg import get_signature_from_source
 from ansible.galaxy.dependency_resolution.dataclasses import (
     Candidate,
@@ -37,24 +40,24 @@ except ImportError:
 # TODO: add python requirements to ansible-test's ansible-core distribution info and remove the hardcoded lowerbound/upperbound fallback
-RESOLVELIB_LOWERBOUND = SemanticVersion("0.5.3")
+RESOLVELIB_LOWERBOUND = SemanticVersion("0.8.0")
 RESOLVELIB_UPPERBOUND = SemanticVersion("2.0.0")
 RESOLVELIB_VERSION = SemanticVersion.from_loose_version(LooseVersion(resolvelib_version))

-class CollectionDependencyProviderBase(AbstractProvider):
+class CollectionDependencyProvider(AbstractProvider):
     """Delegate providing a requirement interface for the resolver."""

     def __init__(
-            self,  # type: CollectionDependencyProviderBase
-            apis,  # type: MultiGalaxyAPIProxy
-            concrete_artifacts_manager=None,  # type: ConcreteArtifactsManager
-            preferred_candidates=None,  # type: t.Iterable[Candidate]
-            with_deps=True,  # type: bool
-            with_pre_releases=False,  # type: bool
-            upgrade=False,  # type: bool
-            include_signatures=True,  # type: bool
-    ):  # type: (...) -> None
+            self,
+            apis: MultiGalaxyAPIProxy,
+            concrete_artifacts_manager: ConcreteArtifactsManager,
+            preferred_candidates: _c.Iterable[Candidate] | None = None,
+            with_deps: bool = True,
+            with_pre_releases: bool = False,
+            upgrade: bool = False,
+            include_signatures: bool = True,
+    ) -> None:
         r"""Initialize helper attributes.

         :param api: An instance of the multiple Galaxy APIs wrapper.
@@ -90,8 +93,10 @@ class CollectionDependencyProviderBase(AbstractProvider):
         self._upgrade = upgrade
         self._include_signatures = include_signatures

-    def identify(self, requirement_or_candidate):
-        # type: (t.Union[Candidate, Requirement]) -> str
+    def identify(
+            self,
+            requirement_or_candidate: Candidate | Requirement,
+    ) -> str:
         """Given requirement or candidate, return an identifier for it.

         This is used to identify a requirement or candidate, e.g.
@@ -102,8 +107,19 @@ class CollectionDependencyProviderBase(AbstractProvider):
         """
         return requirement_or_candidate.canonical_package_id

-    def get_preference(self, *args, **kwargs):
-        # type: (t.Any, t.Any) -> t.Union[float, int]
+    def get_preference(
+            self,
+            identifier: str,
+            resolutions: _c.Mapping[str, Candidate],
+            candidates: _c.Mapping[str, _c.Iterator[Candidate]],
+            information: _c.Mapping[
+                str,
+                _c.Iterator[RequirementInformation[Requirement, Candidate]],
+            ],
+            backtrack_causes: _c.Sequence[
+                RequirementInformation[Requirement, Candidate],
+            ],
+    ) -> float | int:
         """Return sort key function return value for given requirement.

         This result should be based on preference that is defined as
@@ -111,38 +127,6 @@ class CollectionDependencyProviderBase(AbstractProvider):
         The lower the return value is, the more preferred this
         group of arguments is.

-        resolvelib >=0.5.3, <0.7.0
-            :param resolution: Currently pinned candidate, or ``None``.
-            :param candidates: A list of possible candidates.
-            :param information: A list of requirement information.
-                Each ``information`` instance is a named tuple with two entries:
-                * ``requirement`` specifies a requirement contributing to
-                  the current candidate list
-                * ``parent`` specifies the candidate that provides
-                  (depended on) the requirement, or `None`
-                  to indicate a root requirement.
-        resolvelib >=0.7.0, < 0.8.0
-            :param identifier: The value returned by ``identify()``.
-            :param resolutions: Mapping of identifier, candidate pairs.
-            :param candidates: Possible candidates for the identifier.
-                Mapping of identifier, list of candidate pairs.
-            :param information: Requirement information of each package.
-                Mapping of identifier, list of named tuple pairs.
-                The named tuples have the entries ``requirement`` and ``parent``.
-        resolvelib >=0.8.0, <= 1.0.1
         :param identifier: The value returned by ``identify()``.
         :param resolutions: Mapping of identifier, candidate pairs.
@@ -178,10 +162,6 @@ class CollectionDependencyProviderBase(AbstractProvider):
         the value is, the more preferred this requirement is (i.e. the
         sorting function is called with ``reverse=False``).
         """
-        raise NotImplementedError
-
-    def _get_preference(self, candidates):
-        # type: (list[Candidate]) -> t.Union[float, int]
         if any(
             candidate in self._preferred_candidates
             for candidate in candidates
@@ -191,8 +171,12 @@ class CollectionDependencyProviderBase(AbstractProvider):
             return float('-inf')
         return len(candidates)

-    def find_matches(self, *args, **kwargs):
-        # type: (t.Any, t.Any) -> list[Candidate]
+    def find_matches(
+            self,
+            identifier: str,
+            requirements: _c.Mapping[str, _c.Iterator[Requirement]],
+            incompatibilities: _c.Mapping[str, _c.Iterator[Candidate]],
+    ) -> list[Candidate]:
         r"""Find all possible candidates satisfying given requirements.

         This tries to get candidates based on the requirements' types.
@@ -203,32 +187,13 @@ class CollectionDependencyProviderBase(AbstractProvider):
         For a "named" requirement, Galaxy-compatible APIs are consulted
         to find concrete candidates for this requirement. If there's a
         pre-installed candidate, it's prepended in front of others.

-        resolvelib >=0.5.3, <0.6.0
-            :param requirements: A collection of requirements which all of \
-                                 the returned candidates must match. \
-                                 All requirements are guaranteed to have \
-                                 the same identifier. \
-                                 The collection is never empty.
-        resolvelib >=0.6.0
-            :param identifier: The value returned by ``identify()``.
-            :param requirements: The requirements all returned candidates must satisfy.
-                Mapping of identifier, iterator of requirement pairs.
-            :param incompatibilities: Incompatible versions that must be excluded
-                from the returned list.
-        :returns: An iterable that orders candidates by preference, \
-                  e.g. the most preferred candidate comes first.
         """
-        raise NotImplementedError
+        return [
+            match for match in self._find_matches(list(requirements[identifier]))
+            if not any(match.ver == incompat.ver for incompat in incompatibilities[identifier])
+        ]

-    def _find_matches(self, requirements):
-        # type: (list[Requirement]) -> list[Candidate]
+    def _find_matches(self, requirements: list[Requirement]) -> list[Candidate]:
         # FIXME: The first requirement may be a Git repo followed by
         # FIXME: its cloned tmp dir. Using only the first one creates
         # FIXME: loops that prevent any further dependency exploration.
@@ -249,7 +214,10 @@ class CollectionDependencyProviderBase(AbstractProvider):
             all(self.is_satisfied_by(requirement, candidate) for requirement in requirements)
         }
         try:
-            coll_versions = [] if preinstalled_candidates else self._api_proxy.get_collection_versions(first_req)  # type: t.Iterable[t.Tuple[str, GalaxyAPI]]
+            coll_versions: _c.Iterable[tuple[str, GalaxyAPI]] = (
+                [] if preinstalled_candidates
+                else self._api_proxy.get_collection_versions(first_req)
+            )
         except TypeError as exc:
             if first_req.is_concrete_artifact:
                 # Non hashable versions will cause a TypeError
@@ -292,7 +260,7 @@ class CollectionDependencyProviderBase(AbstractProvider):
         latest_matches = []
         signatures = []
-        extra_signature_sources = []  # type: list[str]
+        extra_signature_sources: list[str] = []

         discarding_pre_releases_acceptable = any(
             not is_pre_release(candidate_version)
@@ -397,8 +365,11 @@ class CollectionDependencyProviderBase(AbstractProvider):
         return list(preinstalled_candidates) + latest_matches

-    def is_satisfied_by(self, requirement, candidate):
-        # type: (Requirement, Candidate) -> bool
+    def is_satisfied_by(
+            self,
+            requirement: Requirement,
+            candidate: Candidate,
+    ) -> bool:
         r"""Whether the given requirement is satisfiable by a candidate.

         :param requirement: A requirement that produced the `candidate`.
@@ -424,8 +395,7 @@ class CollectionDependencyProviderBase(AbstractProvider):
             requirements=requirement.ver,
         )

-    def get_dependencies(self, candidate):
-        # type: (Candidate) -> list[Candidate]
+    def get_dependencies(self, candidate: Candidate) -> list[Requirement]:
         r"""Get direct dependencies of a candidate.

         :returns: A collection of requirements that `candidate` \
@@ -456,52 +426,3 @@ class CollectionDependencyProviderBase(AbstractProvider):
             self._make_req_from_dict({'name': dep_name, 'version': dep_req})
             for dep_name, dep_req in req_map.items()
         ]
-
-
-# Classes to handle resolvelib API changes between minor versions for 0.X
-class CollectionDependencyProvider050(CollectionDependencyProviderBase):
-    def find_matches(self, requirements):  # type: ignore[override]
-        # type: (list[Requirement]) -> list[Candidate]
-        return self._find_matches(requirements)
-
-    def get_preference(self, resolution, candidates, information):  # type: ignore[override]
-        # type: (t.Optional[Candidate], list[Candidate], list[t.NamedTuple]) -> t.Union[float, int]
-        return self._get_preference(candidates)
-
-
-class CollectionDependencyProvider060(CollectionDependencyProviderBase):
-    def find_matches(self, identifier, requirements, incompatibilities):  # type: ignore[override]
-        # type: (str, t.Mapping[str, t.Iterator[Requirement]], t.Mapping[str, t.Iterator[Requirement]]) -> list[Candidate]
-        return [
-            match for match in self._find_matches(list(requirements[identifier]))
-            if not any(match.ver == incompat.ver for incompat in incompatibilities[identifier])
-        ]
-
-    def get_preference(self, resolution, candidates, information):  # type: ignore[override]
-        # type: (t.Optional[Candidate], list[Candidate], list[t.NamedTuple]) -> t.Union[float, int]
-        return self._get_preference(candidates)
-
-
-class CollectionDependencyProvider070(CollectionDependencyProvider060):
-    def get_preference(self, identifier, resolutions, candidates, information):  # type: ignore[override]
-        # type: (str, t.Mapping[str, Candidate], t.Mapping[str, t.Iterator[Candidate]], t.Iterator[t.NamedTuple]) -> t.Union[float, int]
-        return self._get_preference(list(candidates[identifier]))
-
-
-class CollectionDependencyProvider080(CollectionDependencyProvider060):
-    def get_preference(self, identifier, resolutions, candidates, information, backtrack_causes):  # type: ignore[override]
-        # type: (str, t.Mapping[str, Candidate], t.Mapping[str, t.Iterator[Candidate]], t.Iterator[t.NamedTuple], t.Sequence) -> t.Union[float, int]
-        return self._get_preference(list(candidates[identifier]))
-
-
-def _get_provider():  # type () -> CollectionDependencyProviderBase
-    if RESOLVELIB_VERSION >= SemanticVersion("0.8.0"):
-        return CollectionDependencyProvider080
-    if RESOLVELIB_VERSION >= SemanticVersion("0.7.0"):
-        return CollectionDependencyProvider070
-    if RESOLVELIB_VERSION >= SemanticVersion("0.6.0"):
-        return CollectionDependencyProvider060
-    return CollectionDependencyProvider050
-
-
-CollectionDependencyProvider = _get_provider()

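With the version-dispatch subclasses gone, the unified `find_matches()` filters candidates against the resolver-supplied incompatibility list inline. A standalone sketch of that filtering step, using toy stand-ins rather than the real ansible classes:

```python
from typing import NamedTuple


class Cand(NamedTuple):
    """Toy stand-in for a resolved collection candidate."""
    fqcn: str
    ver: str


def filter_incompatible(matches: list[Cand], incompatible: list[Cand]) -> list[Cand]:
    # Keep a match only when no incompatible candidate pins the same version,
    # mirroring the list comprehension in the new find_matches().
    return [
        m for m in matches
        if not any(m.ver == bad.ver for bad in incompatible)
    ]


matches = [Cand('ns.coll', '1.0.0'), Cand('ns.coll', '1.1.0'), Cand('ns.coll', '2.0.0')]
incompatible = [Cand('ns.coll', '1.1.0')]
survivors = filter_incompatible(matches, incompatible)
```

In the real provider, `matches` comes from `_find_matches()` (Galaxy lookups plus pre-installed candidates) and `incompatible` from resolvelib's `incompatibilities[identifier]` mapping.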
@@ -11,8 +11,7 @@ from ansible.module_utils.compat.version import LooseVersion
 from ansible.utils.version import SemanticVersion

-def is_pre_release(version):
-    # type: (str) -> bool
+def is_pre_release(version: str) -> bool:
     """Figure out if a given version is a pre-release."""
     try:
         return SemanticVersion(version).is_prerelease
@@ -20,8 +19,7 @@ def is_pre_release(version):
         return False

-def meets_requirements(version, requirements):
-    # type: (str, str) -> bool
+def meets_requirements(version: str, requirements: str) -> bool:
     """Verify if a given version satisfies all the requirements.

     Supported version identifiers are:

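The `is_pre_release()` helper above delegates to ansible's `SemanticVersion` class. A self-contained approximation of the same check, using a simplified SemVer regex instead of that class (so it accepts `2.20.0-rc2` but not PEP 440 forms like `2.20.0rc2`), could look like:

```python
import re

# Simplified SemVer shape: MAJOR.MINOR.PATCH, optional "-prerelease",
# optional "+build" metadata. A present pre-release group marks the
# version as a pre-release.
_SEMVER = re.compile(
    r'^\d+\.\d+\.\d+'
    r'(?:-(?P<pre>[0-9A-Za-z.-]+))?'
    r'(?:\+[0-9A-Za-z.-]+)?$'
)


def is_pre_release(version: str) -> bool:
    """Return True when the version string carries a SemVer pre-release tag."""
    match = _SEMVER.match(version)
    return bool(match and match.group('pre'))
```

Like the original, it degrades gracefully: anything that fails to parse is simply treated as not a pre-release.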
@@ -225,7 +225,13 @@ def sanity_check(module, host, key, sshkeygen):
     rc, stdout, stderr = module.run_command(sshkeygen_command)
     if stdout == '':  # host not found
-        module.fail_json(msg="Host parameter does not match hashed host field in supplied key")
+        results = {
+            "msg": "Host parameter does not match hashed host field in supplied key",
+            "rc": rc,
+        }
+        if stderr:
+            results["stderr"] = stderr
+        module.fail_json(**results)

 def search_for_host_key(module, host, key, path, sshkeygen):

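The known_hosts change above enriches the failure payload with `rc` and, only when present, `stderr`. The same conditional-key pattern in isolation (`build_failure` is a hypothetical helper; the real module passes the dict straight to `module.fail_json(**results)`):

```python
def build_failure(msg: str, rc: int, stderr: str = '') -> dict:
    """Assemble a failure payload, attaching stderr only when it is non-empty."""
    results = {"msg": msg, "rc": rc}
    if stderr:
        # Omitting empty stderr keeps the module output free of noise keys.
        results["stderr"] = stderr
    return results
```

Consumers can then rely on `stderr` being present only when ssh-keygen actually produced diagnostics.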
@@ -54,7 +54,7 @@ class DataLoader:
     def __init__(self) -> None:
-        self._basedir: str = '.'
+        self._basedir: str = os.path.abspath('.')

         # NOTE: not effective with forks as the main copy does not get updated.
         # avoids rereading files
@@ -227,7 +227,7 @@ class DataLoader:
     def set_basedir(self, basedir: str) -> None:
         """ sets the base directory, used to find files when a relative path is given """
-        self._basedir = basedir
+        self._basedir = os.path.abspath(basedir)

     def path_dwim(self, given: str) -> str:
         """

@@ -17,7 +17,6 @@
 from __future__ import annotations

-import ansible.constants as C
 from ansible.errors import AnsibleParserError
 from ansible.module_utils.common.sentinel import Sentinel
 from ansible.playbook.attribute import NonInheritableFieldAttribute
@@ -316,8 +315,7 @@ class Block(Base, Conditional, CollectionSearch, Taggable, Notifiable, Delegatab
                 filtered_block = evaluate_block(task)
                 if filtered_block.has_tasks():
                     tmp_list.append(filtered_block)
-            elif ((task.action in C._ACTION_META and task.implicit) or
-                    task.evaluate_tags(self._play.only_tags, self._play.skip_tags, all_vars=all_vars)):
+            elif task.evaluate_tags(self._play.only_tags, self._play.skip_tags, all_vars=all_vars):
                 tmp_list.append(task)
         return tmp_list

@@ -165,17 +165,29 @@ def load_list_of_tasks(ds, play, block=None, role=None, task_include=None, use_h
                 subdir = 'tasks'
                 if use_handlers:
                     subdir = 'handlers'
+                try:
+                    include_target = templar.template(task.args['_raw_params'])
+                except AnsibleUndefinedVariable as ex:
+                    raise AnsibleParserError(
+                        message=f"Error when evaluating variable in import path {task.args['_raw_params']!r}.",
+                        help_text="When using static imports, ensure that any variables used in their names are defined in vars/vars_files\n"
+                                  "or extra-vars passed in from the command line. Static imports cannot use variables from facts or inventory\n"
+                                  "sources like group or host vars.",
+                        obj=task_ds,
+                    ) from ex
+                # FIXME this appears to be (almost?) duplicate code as in IncludedFile for include_tasks
                 while parent_include is not None:
                     if not isinstance(parent_include, TaskInclude):
                         parent_include = parent_include._parent
                         continue
-                    parent_include.post_validate(templar=templar)
-                    parent_include_dir = os.path.dirname(parent_include.args.get('_raw_params'))
+                    if isinstance(parent_include, IncludeRole):
+                        parent_include_dir = parent_include._role_path
+                    else:
+                        parent_include_dir = os.path.dirname(templar.template(parent_include.args.get('_raw_params')))
                     if cumulative_path is None:
                         cumulative_path = parent_include_dir
                     elif not os.path.isabs(cumulative_path):
                         cumulative_path = os.path.join(parent_include_dir, cumulative_path)
-                    include_target = templar.template(task.args['_raw_params'])
                     if task._role:
                         new_basedir = os.path.join(task._role._role_path, subdir, cumulative_path)
                         include_file = loader.path_dwim_relative(new_basedir, subdir, include_target)
@@ -189,16 +201,6 @@ def load_list_of_tasks(ds, play, block=None, role=None, task_include=None, use_h
                     parent_include = parent_include._parent
                 if not found:
-                    try:
-                        include_target = templar.template(task.args['_raw_params'])
-                    except AnsibleUndefinedVariable as ex:
-                        raise AnsibleParserError(
-                            message=f"Error when evaluating variable in import path {task.args['_raw_params']!r}.",
-                            help_text="When using static imports, ensure that any variables used in their names are defined in vars/vars_files\n"
-                                      "or extra-vars passed in from the command line. Static imports cannot use variables from facts or inventory\n"
-                                      "sources like group or host vars.",
-                            obj=task_ds,
-                        ) from ex
                     if task._role:
                         include_file = loader.path_dwim_relative(task._role._role_path, subdir, include_target)
                     else:

@@ -314,19 +314,9 @@ class Play(Base, Taggable, CollectionSearch):
             t.args['_raw_params'] = 'flush_handlers'
             t.implicit = True
             t.set_loader(self._loader)
+            t.tags = ['always']
-            if self.tags:
-                # Avoid calling flush_handlers in case the whole play is skipped on tags,
-                # this could be performance improvement since calling flush_handlers on
-                # large inventories could be expensive even if no hosts are notified
-                # since we call flush_handlers per host.
-                # Block.filter_tagged_tasks ignores evaluating tags on implicit meta
-                # tasks so we need to explicitly call Task.evaluate_tags here.
-                t.tags = self.tags
-                if t.evaluate_tags(self.only_tags, self.skip_tags, all_vars=self.vars):
-                    flush_block.block = [t]
-            else:
-                flush_block.block = [t]
+            flush_block.block = [t]

             # NOTE keep flush_handlers tasks even if a section has no regular tasks,
             # there may be notified handlers from the previous section

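The Play change above drops the special-case tag plumbing and simply tags the implicit `flush_handlers` meta task `always`, relying on ordinary tag evaluation to keep it. A deliberately simplified model of that evaluation (not the real `Taggable.evaluate_tags`, which also handles `all`, `tagged`, and `untagged`) shows why `always` survives a `--tags` run but still honors `--skip-tags always`:

```python
def evaluate_tags(task_tags: list[str], only_tags: list[str], skip_tags: list[str]) -> bool:
    """Simplified tag check: skip wins first, then 'always' or an --tags match runs."""
    if skip_tags and set(task_tags) & set(skip_tags):
        return False  # --skip-tags beats everything, including 'always'
    if not only_tags:
        return True  # no --tags restriction: everything runs
    return 'always' in task_tags or bool(set(task_tags) & set(only_tags))
```

Under this model, a task tagged `['always']` runs even when `--tags deploy` selects an unrelated tag, which is exactly the behavior the implicit flush task needs.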
@@ -192,7 +192,7 @@ class ActionModule(ActionBase):
                     msg="checksum mismatch", file=source, dest=dest, remote_md5sum=None,
                     checksum=new_checksum, remote_checksum=remote_checksum))
             else:
-                result.update({'changed': True, 'md5sum': new_md5, 'dest': dest,
+                result.update({'changed': True, 'md5sum': new_md5, 'file': source, 'dest': dest,
                                'remote_md5sum': None, 'checksum': new_checksum,
                                'remote_checksum': remote_checksum})
         else:

@@ -331,7 +331,7 @@ try:
     from pypsrp.host import PSHost, PSHostUserInterface
     from pypsrp.powershell import PowerShell, RunspacePool
     from pypsrp.wsman import WSMan
-    from requests.exceptions import ConnectionError, ConnectTimeout
+    from requests.exceptions import ConnectionError, ConnectTimeout, ReadTimeout
 except ImportError as err:
     HAS_PYPSRP = False
     PYPSRP_IMP_ERR = err
@@ -479,11 +479,16 @@ class Connection(ConnectionBase):
             pwsh_in_data = in_data

         display.vvv(u"PSRP: EXEC %s" % script, host=self._psrp_host)
-        rc, stdout, stderr = self._exec_psrp_script(
-            script=script,
-            input_data=pwsh_in_data.splitlines() if pwsh_in_data else None,
-            arguments=script_args,
-        )
+        try:
+            rc, stdout, stderr = self._exec_psrp_script(
+                script=script,
+                input_data=pwsh_in_data.splitlines() if pwsh_in_data else None,
+                arguments=script_args,
+            )
+        except ReadTimeout as e:
+            raise AnsibleConnectionFailure(
+                "HTTP read timeout during PSRP script execution"
+            ) from e
         return rc, stdout, stderr

     def put_file(self, in_path: str, out_path: str) -> None:
def put_file(self, in_path: str, out_path: str) -> None: def put_file(self, in_path: str, out_path: str) -> None:

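The psrp fix is an error-translation pattern: a transport-level `ReadTimeout` is re-raised as a connection failure so the strategy marks the host unreachable instead of crashing the task. The pattern in isolation, with local stand-in exception classes rather than the real `requests`/ansible ones:

```python
class ReadTimeout(Exception):
    """Stand-in for requests.exceptions.ReadTimeout."""


class ConnectionFailure(Exception):
    """Stand-in for AnsibleConnectionFailure."""


def exec_script(transport):
    """Run a transport callable, translating read timeouts into connection failures."""
    try:
        return transport()
    except ReadTimeout as e:
        # Chain with `from e` so the original timeout stays visible in tracebacks.
        raise ConnectionFailure(
            "HTTP read timeout during PSRP script execution"
        ) from e
```

Chaining with `from e` preserves the underlying timeout as `__cause__`, which keeps `-vvv` debugging output useful while callers only need to catch the connection-level error.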
@@ -5,7 +5,7 @@ DOCUMENTATION:
   short_description: Pythonic false
   description:
     - This check is a more Python version of what is 'false'.
-    - It is the opposite of 'truthy'.
+    - It is the opposite of P(ansible.builtin.truthy#test).
   options:
     _input:
       description: An expression that can be expressed in a boolean context.

@@ -20,5 +20,5 @@ EXAMPLES: |
   thisisfalse: '{{ "" is truthy }}'
 RETURN:
   _value:
-    description: Returns V(True) if the condition is not "Python truthy", V(False) otherwise.
+    description: Returns V(True) if the condition is "Python truthy", V(False) otherwise.
     type: boolean

@@ -17,6 +17,6 @@
 from __future__ import annotations

-__version__ = '2.20.0.dev0'
+__version__ = '2.20.0rc2'
 __author__ = 'Ansible, Inc.'
 __codename__ = "Good Times Bad Times"

@@ -17,7 +17,6 @@
 from __future__ import annotations

-import os
 import sys
 import typing as t
@@ -447,7 +446,7 @@ class VariableManager:
         """
         variables = {}
-        variables['playbook_dir'] = os.path.abspath(self._loader.get_basedir())
+        variables['playbook_dir'] = self._loader.get_basedir()
         variables['ansible_playbook_python'] = sys.executable
         variables['ansible_config_file'] = C.CONFIG_FILE

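The DataLoader and VariableManager hunks are two halves of one change: the loader now resolves its basedir with `os.path.abspath()` at assignment time, so `VariableManager` can use `get_basedir()` directly. A short sketch of why resolving eagerly matters — a stored relative `'.'` silently changes meaning if the process later changes directory:

```python
import os
import shutil
import tempfile

rel = '.'
start = os.getcwd()

# Resolved once, up front, as in the new DataLoader.__init__.
frozen = os.path.abspath(rel)

# A later chdir (anything from a plugin to a test harness) moves the
# meaning of a still-relative basedir.
tmp = tempfile.mkdtemp()
os.chdir(tmp)
moved = os.path.abspath(rel)  # now resolves to tmp, not the start dir

os.chdir(start)  # restore before cleaning up the scratch directory
shutil.rmtree(tmp)
```

Freezing the absolute path at `__init__`/`set_basedir` time makes `playbook_dir` stable regardless of any subsequent working-directory changes.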
@@ -1,5 +1,5 @@
 [build-system]
-requires = ["setuptools >= 66.1.0, <= 80.3.1", "wheel == 0.45.1"]  # lower bound to support controller Python versions, upper bound for latest version tested at release
+requires = ["setuptools >= 66.1.0, <= 80.9.0", "wheel == 0.45.1"]  # lower bound to support controller Python versions, upper bound for latest version tested at release
 build-backend = "setuptools.build_meta"

 [project]
[project] [project]

@@ -12,4 +12,4 @@ packaging
 # NOTE: Ref: https://github.com/sarugaku/resolvelib/issues/69
 # NOTE: When updating the upper bound, also update the latest version used
 # NOTE: in the ansible-galaxy-collection test suite.
-resolvelib >= 0.5.3, < 2.0.0  # dependency resolver used by ansible-galaxy
+resolvelib >= 0.8.0, < 2.0.0  # dependency resolver used by ansible-galaxy

@@ -121,7 +121,7 @@ done
 echo "testing role text output"
 # we use sed to strip the role path from the first line
-current_role_out="$(ansible-doc -t role -r ./roles test_role1 | sed '1 s/\(^> ROLE: \*test_role1\*\).*(.*)$/\1/')"
+current_role_out="$(ansible-doc -t role -r ./roles test_role1 | sed '1 s/\(^> ROLE: \*test_role1\*\).*(.*)$/\1/' | python fix-urls.py)"
 expected_role_out="$(sed '1 s/\(^> ROLE: \*test_role1\*\).*(.*)$/\1/' fakerole.output)"
 test "$current_role_out" == "$expected_role_out"
@@ -211,6 +211,13 @@ ANSIBLE_LIBRARY='./nolibrary' ansible-doc --metadata-dump --no-fail-on-errors --
 output=$(ANSIBLE_LIBRARY='./nolibrary' ansible-doc --metadata-dump --playbook-dir broken-docs testns.testcol 2>&1 | grep -c 'ERROR' || true)
 test "${output}" -eq 1

+# ensure --metadata-dump does not crash if the ansible_collections is nested (https://github.com/ansible/ansible/issues/84909)
+testdir="$(pwd)"
+pbdir="collections/ansible_collections/testns/testcol/playbooks"
+cd "$pbdir"
+ANSIBLE_COLLECTIONS_PATH="$testdir/$pbdir/collections" ansible-doc -vvv --metadata-dump --no-fail-on-errors
+cd "$testdir"
+
 echo "test doc list on broken role metadata"
 # ensure that role doc does not fail when --no-fail-on-errors is supplied
 ANSIBLE_LIBRARY='./nolibrary' ansible-doc --no-fail-on-errors --playbook-dir broken-docs testns.testcol.testrole -t role 1>/dev/null 2>&1

@@ -5,7 +5,5 @@ test_repo_path: "{{ galaxy_dir }}/development/ansible_test"
 test_error_repo_path: "{{ galaxy_dir }}/development/error_test"

 supported_resolvelib_versions:
-  - "0.5.3"  # Oldest supported
-  - "0.6.0"
-  - "0.7.0"
-  - "0.8.0"
+  - "0.8.0"  # Oldest supported
+  - "< 2.0.0"

@@ -5,18 +5,14 @@ gpg_homedir: "{{ galaxy_dir }}/gpg"
 offline_server: https://test-hub.demolab.local/api/galaxy/content/api/

 # Test oldest and most recently supported, and versions with notable changes.
+# The last breaking change for a feature ansible-galaxy uses was in 0.8.0.
+# It would be redundant to test every minor version since 0.8.0, so we just test against the latest minor release.
 # NOTE: If ansible-galaxy incorporates new resolvelib features, this matrix should be updated to verify the features work on all supported versions.
 supported_resolvelib_versions:
-  - "0.5.3"  # test CollectionDependencyProvider050
-  - "0.6.0"  # test CollectionDependencyProvider060
-  - "0.7.0"  # test CollectionDependencyProvider070
-  - "<2.0.0"  # test CollectionDependencyProvider080
+  - "0.8.0"
+  - "< 2.0.0"

 unsupported_resolvelib_versions:
   - "0.2.0"  # Fails on import
-  - "0.5.1"
+  - "0.5.3"

 pulp_repositories:
   - primary

@@ -14,4 +14,4 @@ import string  # pylint: disable=unused-import

 # 'Call' object has no attribute 'value'
 result = {None: None}[{}.get('something')]
-foo = {}.keys()
+foo = {}.keys()  # should trigger disallowed-name, but doesn't in pylint 4.0.0, probably due to https://github.com/pylint-dev/pylint/issues/10652

@ -1,7 +1,6 @@
plugins/modules/bad.py import plugins/modules/bad.py import
plugins/modules/bad.py pylint:ansible-bad-module-import plugins/modules/bad.py pylint:ansible-bad-module-import
plugins/lookup/bad.py import plugins/lookup/bad.py import
plugins/plugin_utils/check_pylint.py pylint:disallowed-name
tests/integration/targets/hello/files/bad.py pylint:ansible-bad-function tests/integration/targets/hello/files/bad.py pylint:ansible-bad-function
tests/integration/targets/hello/files/bad.py pylint:ansible-bad-import tests/integration/targets/hello/files/bad.py pylint:ansible-bad-import
tests/integration/targets/hello/files/bad.py pylint:ansible-bad-import-from tests/integration/targets/hello/files/bad.py pylint:ansible-bad-import-from

@@ -2,4 +2,5 @@ needs/ssh
 shippable/posix/group3
 needs/target/connection
 needs/target/setup_test_user
+needs/target/test_utils
 setup/always/setup_passlib_controller # required for setup_test_user

@@ -17,7 +17,7 @@ if command -v sshpass > /dev/null; then
     # ansible with timeout. If we time out, our custom prompt was successfully
     # searched for. It's a weird way of doing things, but it does ensure
     # that the flag gets passed to sshpass.
-    timeout 5 ansible -m ping \
+    ../test_utils/scripts/timeout.py 5 -- ansible -m ping \
         -e ansible_connection=ssh \
         -e ansible_ssh_password_mechanism=sshpass \
         -e ansible_sshpass_prompt=notThis: \

@@ -30,8 +30,10 @@
     that:
       - fetched is changed
      - fetched.checksum == "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3"
+      - fetched.file == remote_tmp_dir ~ "/orig"
       - fetched_again is not changed
       - fetched_again.checksum == "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3"
+      - fetched_again.file == remote_tmp_dir ~ "/orig"
       - fetched.remote_checksum == "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3"
       - lookup("file", output_dir + "/fetched/" + inventory_hostname + remote_tmp_dir + "/orig") == "test"
       - fetch_check_mode is skipped

@@ -396,6 +396,8 @@
     src: "testserver.py"
     dest: "{{ remote_tmp_dir }}/testserver.py"

+# NOTE: This http test server will live for only the timeout specified in "async", so all uses
+# of it must be grouped relatively close together.
 - name: start SimpleHTTPServer for issues 27617
   shell: cd {{ files_dir }} && {{ ansible_python.executable }} {{ remote_tmp_dir}}/testserver.py {{ http_port }}
   async: 90
@@ -578,6 +580,19 @@
     - "stat_result_sha256_with_file_scheme_71420.stat.exists == true"
     - "stat_result_sha256_checksum_only.stat.exists == true"

+- name: Test for incomplete data read (issue 85164)
+  get_url:
+    url: 'http://localhost:{{ http_port }}/incompleteRead'
+    dest: '{{ remote_tmp_dir }}/85164.txt'
+  ignore_errors: true
+  register: result
+
+- name: Assert we have an incomplete read failure
+  assert:
+    that:
+      - result is failed
+      - '"Incomplete read" in result.msg'
+
 #https://github.com/ansible/ansible/issues/16191
 - name: Test url split with no filename
   get_url:
@@ -761,16 +776,3 @@
 - assert:
     that:
       - get_dir_filename.dest == remote_tmp_dir ~ "/filename.json"
-
-- name: Test for incomplete data read (issue 85164)
-  get_url:
-    url: 'http://localhost:{{ http_port }}/incompleteRead'
-    dest: '{{ remote_tmp_dir }}/85164.txt'
-  ignore_errors: true
-  register: result
-
-- name: Assert we have an incomplete read failure
-  assert:
-    that:
-      - result is failed
-      - '"Incomplete read" in result.msg'

@@ -229,6 +229,13 @@ ansible-playbook handler_notify_earlier_handler.yml "$@" 2>&1 | tee out.txt
 ANSIBLE_DEBUG=1 ansible-playbook tagged_play.yml --skip-tags the_whole_play "$@" 2>&1 | tee out.txt
 [ "$(grep out.txt -ce 'META: triggered running handlers')" = "0" ]
+[ "$(grep out.txt -ce 'No handler notifications for')" = "0" ]
 [ "$(grep out.txt -ce 'handler_ran')" = "0" ]
+[ "$(grep out.txt -ce 'handler1_ran')" = "0" ]
 ansible-playbook rescue_flush_handlers.yml "$@"
+
+ANSIBLE_DEBUG=1 ansible-playbook tagged_play.yml --tags task_tag "$@" 2>&1 | tee out.txt
+[ "$(grep out.txt -ce 'META: triggered running handlers')" = "1" ]
+[ "$(grep out.txt -ce 'handler_ran')" = "0" ]
+[ "$(grep out.txt -ce 'handler1_ran')" = "1" ]

@@ -2,9 +2,19 @@
   gather_facts: false
   tags: the_whole_play
   tasks:
-    - command: echo
+    - debug:
+      changed_when: true
       notify: h
+    - debug:
+      changed_when: true
+      notify: h1
+      tags: task_tag
   handlers:
     - name: h
       debug:
         msg: handler_ran
+    - name: h1
+      debug:
+        msg: handler1_ran

@@ -155,3 +155,9 @@ ansible-playbook test_null_include_filename.yml 2>&1 | tee test_null_include_fil
 test "$(grep -c 'No file specified for ansible.builtin.include_tasks' test_null_include_filename.out)" = 1
 test "$(grep -c '.*/include_import/null_filename/tasks.yml:4:3.*' test_null_include_filename.out)" = 1
 test "$(grep -c '\- name: invalid include_task definition' test_null_include_filename.out)" = 1
+
+# https://github.com/ansible/ansible/issues/69882
+set +e
+ansible-playbook test_nested_non_existent_tasks.yml 2>&1 | tee test_nested_non_existent_tasks.out
+set -e
+test "$(grep -c 'Could not find or access' test_nested_non_existent_tasks.out)" = 3

@@ -0,0 +1,5 @@
+- hosts: localhost
+  gather_facts: false
+  tasks:
+    - include_role:
+        name: nested_tasks

@@ -10,4 +10,9 @@
     that:
       - nested_adjacent_count|int == 2

+- set_fact:
+    not_available_at_parsing: root
+
 - import_tasks: "{{ role_path }}/tests/main.yml"
+  become: true
+  become_user: "{{ not_available_at_parsing }}"

@@ -0,0 +1,6 @@
+- command: whoami
+  register: r
+
+- assert:
+    that:
+      - r.stdout == not_available_at_parsing

@@ -507,3 +507,4 @@
     that:
       - result is failed
       - result.msg == 'Host parameter does not match hashed host field in supplied key'
+      - result.rc is defined

@@ -0,0 +1,3 @@
+shippable/posix/group4
+context/controller
+needs/target/test_utils

@@ -0,0 +1,14 @@
+localhost0
+localhost1
+localhost2
+localhost3
+localhost4
+localhost5
+localhost6
+localhost7
+localhost8
+localhost9
+
+[all:vars]
+ansible_connection=local
+ansible_python_interpreter={{ansible_playbook_python}}

@@ -0,0 +1,21 @@
+#!/usr/bin/env bash
+
+set -x
+
+../test_utils/scripts/timeout.py -s SIGINT 3 -- \
+    ansible all -i inventory -m debug -a 'msg={{lookup("pipe", "sleep 33")}}' -f 10
+if [[ "$?" != "124" ]]; then
+    echo "Process was not terminated due to timeout"
+    exit 1
+fi
+
+# a short sleep to let processes die
+sleep 2
+
+sleeps="$(pgrep -alf 'sleep\ 33')"
+rc="$?"
+
+if [[ "$rc" == "0" ]]; then
+    echo "Found lingering processes:"
+    echo "$sleeps"
+    exit 1
+fi

@@ -29,15 +29,17 @@
   vars:
     pid: '{{ auto.stdout|regex_findall("ssh-agent\[(\d+)\]")|first }}'

-- command: ssh-agent -D -s -a '{{ output_dir }}/agent.sock'
-  async: 30
-  poll: 0
-- command: ansible-playbook -i {{ ansible_inventory_sources|first|quote }} -vvv {{ role_path }}/auto.yml
-  environment:
-    ANSIBLE_CALLBACK_RESULT_FORMAT: yaml
-    ANSIBLE_SSH_AGENT: '{{ output_dir }}/agent.sock'
-  register: existing
+- shell: ssh-agent -D -s -a '{{ output_dir }}/agent.sock' &
+  register: ssh_agent_result
+- block:
+    - command: ansible-playbook -i {{ ansible_inventory_sources|first|quote }} -vvv {{ role_path }}/auto.yml
+      environment:
+        ANSIBLE_CALLBACK_RESULT_FORMAT: yaml
+        ANSIBLE_SSH_AGENT: '{{ output_dir }}/agent.sock'
+      register: existing
+  always:
+    - command: "kill {{ ssh_agent_result.stdout | regex_search('Agent pid ([0-9]+)', '\\1') | first }}"
 - assert:
     that:

@@ -2,21 +2,32 @@
 from __future__ import annotations

 import argparse
+import signal
 import subprocess
 import sys


+def signal_type(v: str) -> signal.Signals:
+    if v.isdecimal():
+        return signal.Signals(int(v))
+
+    if not v.startswith('SIG'):
+        v = f'SIG{v}'
+
+    return getattr(signal.Signals, v)
+
+
 parser = argparse.ArgumentParser()
 parser.add_argument('duration', type=int)
+parser.add_argument('--signal', '-s', default=signal.SIGTERM, type=signal_type)
 parser.add_argument('command', nargs='+')

 args = parser.parse_args()

+p: subprocess.Popen | None = None
+
 try:
-    p = subprocess.run(
-        ' '.join(args.command),
-        shell=True,
-        timeout=args.duration,
-        check=False,
-    )
+    p = subprocess.Popen(args.command)
+    p.wait(timeout=args.duration)
     sys.exit(p.returncode)
 except subprocess.TimeoutExpired:
+    if p and p.poll() is None:
+        p.send_signal(args.signal)
+        p.wait()
+
     sys.exit(124)

@@ -1,7 +1,7 @@
-base image=quay.io/ansible/base-test-container:v2.20-0 python=3.13,3.9,3.10,3.11,3.12,3.14
-default image=quay.io/ansible/default-test-container:v2.20-0 python=3.13,3.9,3.10,3.11,3.12,3.14 context=collection
-default image=quay.io/ansible/ansible-core-test-container:v2.20-0 python=3.13,3.9,3.10,3.11,3.12,3.14 context=ansible-core
-alpine322 image=quay.io/ansible/alpine-test-container:3.22-v2.20-0 python=3.12 cgroup=none audit=none
-fedora42 image=quay.io/ansible/fedora-test-container:42-v2.20-0 python=3.13 cgroup=v2-only
-ubuntu2204 image=quay.io/ansible/ubuntu-test-container:22.04-v2.20-0 python=3.10
-ubuntu2404 image=quay.io/ansible/ubuntu-test-container:24.04-v2.20-0 python=3.12
+base image=quay.io/ansible/base-test-container:v2.20-2 python=3.14,3.9,3.10,3.11,3.12,3.13
+default image=quay.io/ansible/default-test-container:v2.20-3 python=3.14,3.9,3.10,3.11,3.12,3.13 context=collection
+default image=quay.io/ansible/ansible-core-test-container:v2.20-3 python=3.14,3.9,3.10,3.11,3.12,3.13 context=ansible-core
+alpine322 image=quay.io/ansible/alpine-test-container:3.22-v2.20-1 python=3.12 cgroup=none audit=none
+fedora42 image=quay.io/ansible/fedora-test-container:42-v2.20-1 python=3.13 cgroup=v2-only
+ubuntu2204 image=quay.io/ansible/ubuntu-test-container:22.04-v2.20-1 python=3.10
+ubuntu2404 image=quay.io/ansible/ubuntu-test-container:24.04-v2.20-1 python=3.12

@@ -1,2 +1,2 @@
 # The test-constraints sanity test verifies this file, but changes must be made manually to keep it in up-to-date.
-coverage == 7.10.6 ; python_version >= '3.9' and python_version <= '3.14'
+coverage == 7.10.7 ; python_version >= '3.9' and python_version <= '3.14'

@@ -12,4 +12,4 @@ packaging
 # NOTE: Ref: https://github.com/sarugaku/resolvelib/issues/69
 # NOTE: When updating the upper bound, also update the latest version used
 # NOTE: in the ansible-galaxy-collection test suite.
-resolvelib >= 0.5.3, < 2.0.0  # dependency resolver used by ansible-galaxy
+resolvelib >= 0.8.0, < 2.0.0  # dependency resolver used by ansible-galaxy

@@ -1,5 +1,5 @@
 # edit "sanity.ansible-doc.in" and generate with: hacking/update-sanity-requirements.py --test ansible-doc
 Jinja2==3.1.6
-MarkupSafe==3.0.2
+MarkupSafe==3.0.3
 packaging==25.0
-PyYAML==6.0.2
+PyYAML==6.0.3

@@ -2,7 +2,7 @@
 antsibull-changelog==0.29.0
 docutils==0.18.1
 packaging==25.0
-PyYAML==6.0.2
+PyYAML==6.0.3
 rstcheck==5.0.0
 semantic-version==2.10.0
 types-docutils==0.18.3

@@ -1,4 +1,4 @@
 # edit "sanity.import.plugin.in" and generate with: hacking/update-sanity-requirements.py --test import.plugin
 Jinja2==3.1.6
-MarkupSafe==3.0.2
-PyYAML==6.0.2
+MarkupSafe==3.0.3
+PyYAML==6.0.3

@@ -1,2 +1,2 @@
 # edit "sanity.import.in" and generate with: hacking/update-sanity-requirements.py --test import
-PyYAML==6.0.2
+PyYAML==6.0.3

@@ -1,2 +1,2 @@
 # edit "sanity.integration-aliases.in" and generate with: hacking/update-sanity-requirements.py --test integration-aliases
-PyYAML==6.0.2
+PyYAML==6.0.3

@@ -1,9 +1,9 @@
 # edit "sanity.pylint.in" and generate with: hacking/update-sanity-requirements.py --test pylint
-astroid==3.3.11
+astroid==4.0.1
 dill==0.4.0
-isort==6.0.1
+isort==7.0.0
 mccabe==0.7.0
-platformdirs==4.4.0
+platformdirs==4.5.0
-pylint==3.3.8
-PyYAML==6.0.2
+pylint==4.0.0
+PyYAML==6.0.3
 tomlkit==0.13.3

@@ -1,3 +1,3 @@
 # edit "sanity.runtime-metadata.in" and generate with: hacking/update-sanity-requirements.py --test runtime-metadata
-PyYAML==6.0.2
+PyYAML==6.0.3
 voluptuous==0.15.2

@@ -1,6 +1,6 @@
 # edit "sanity.validate-modules.in" and generate with: hacking/update-sanity-requirements.py --test validate-modules
 antsibull-docs-parser==1.0.0
 Jinja2==3.1.6
-MarkupSafe==3.0.2
-PyYAML==6.0.2
+MarkupSafe==3.0.3
+PyYAML==6.0.3
 voluptuous==0.15.2

@@ -1,4 +1,4 @@
 # edit "sanity.yamllint.in" and generate with: hacking/update-sanity-requirements.py --test yamllint
 pathspec==0.12.1
-PyYAML==6.0.2
+PyYAML==6.0.3
 yamllint==1.37.1

@@ -301,4 +301,15 @@ class PylintTest(SanitySingleVersion):
         else:
             messages = []

+        expected_paths = set(paths)
+        unexpected_messages = [message for message in messages if message["path"] not in expected_paths]
+        messages = [message for message in messages if message["path"] in expected_paths]
+
+        for unexpected_message in unexpected_messages:
+            display.info(f"Unexpected message: {json.dumps(unexpected_message)}", verbosity=4)
+
+        if unexpected_messages:
+            display.notice(f"Discarded {len(unexpected_messages)} unexpected messages. Use -vvvv to display.")
+
         return messages

@@ -69,7 +69,7 @@ class CoverageVersion:

 COVERAGE_VERSIONS = (
     # IMPORTANT: Keep this in sync with the ansible-test.txt requirements file.
-    CoverageVersion('7.10.6', 7, (3, 9), (3, 14)),
+    CoverageVersion('7.10.7', 7, (3, 9), (3, 14)),
 )
 """
 This tuple specifies the coverage version to use for Python version ranges.

@@ -433,7 +433,7 @@ def get_venv_packages(python: PythonConfig) -> dict[str, str]:
     # See: https://github.com/ansible/base-test-container/blob/main/files/installer.py
     default_packages = dict(
-        pip='24.2',
+        pip='25.2',
     )

     override_packages: dict[str, dict[str, str]] = {

@@ -11,9 +11,11 @@ import functools
 import pathlib
 import re

-import astroid
-import astroid.context
+import astroid.bases
+import astroid.exceptions
+import astroid.nodes
 import astroid.typing
+import astroid.util

 import pylint.lint
 import pylint.checkers
@@ -42,7 +44,7 @@ class DeprecationCallArgs:
     def all_args_dynamic(self) -> bool:
         """True if all args are dynamic or None, otherwise False."""
-        return all(arg is None or isinstance(arg, astroid.NodeNG) for arg in dataclasses.asdict(self).values())
+        return all(arg is None or isinstance(arg, astroid.nodes.NodeNG) for arg in dataclasses.asdict(self).values())


 class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
@@ -177,7 +179,7 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
     def __init__(self, *args, **kwargs) -> None:
         super().__init__(*args, **kwargs)

-        self.module_cache: dict[str, astroid.Module] = {}
+        self.module_cache: dict[str, astroid.nodes.Module] = {}

     @functools.cached_property
     def collection_name(self) -> str | None:
@@ -226,7 +228,7 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         return None

     @pylint.checkers.utils.only_required_for_messages(*(msgs.keys()))
-    def visit_call(self, node: astroid.Call) -> None:
+    def visit_call(self, node: astroid.nodes.Call) -> None:
         """Visit a call node."""
         if inferred := self.infer(node.func):
             name = self.get_fully_qualified_name(inferred)
@@ -234,50 +236,50 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
             if args := self.DEPRECATION_FUNCTIONS.get(name):
                 self.check_call(node, name, args)

-    def infer(self, node: astroid.NodeNG) -> astroid.NodeNG | None:
+    def infer(self, node: astroid.nodes.NodeNG) -> astroid.nodes.NodeNG | None:
         """Return the inferred node from the given node, or `None` if it cannot be unambiguously inferred."""
         names: list[str] = []
-        target: astroid.NodeNG | None = node
+        target: astroid.nodes.NodeNG | None = node
         inferred: astroid.typing.InferenceResult | None = None

         while target:
             if inferred := astroid.util.safe_infer(target):
                 break

-            if isinstance(target, astroid.Call):
+            if isinstance(target, astroid.nodes.Call):
                 inferred = self.infer(target.func)
                 break

-            if isinstance(target, astroid.FunctionDef):
+            if isinstance(target, astroid.nodes.FunctionDef):
                 inferred = target
                 break

-            if isinstance(target, astroid.Name):
+            if isinstance(target, astroid.nodes.Name):
                 target = self.infer_name(target)
-            elif isinstance(target, astroid.AssignName) and isinstance(target.parent, astroid.Assign):
+            elif isinstance(target, astroid.nodes.AssignName) and isinstance(target.parent, astroid.nodes.Assign):
                 target = target.parent.value
-            elif isinstance(target, astroid.Attribute):
+            elif isinstance(target, astroid.nodes.Attribute):
                 names.append(target.attrname)
                 target = target.expr
             else:
                 break

         for name in reversed(names):
-            if isinstance(inferred, astroid.Instance):
+            if isinstance(inferred, astroid.bases.Instance):
                 try:
                     attr = next(iter(inferred.getattr(name)), None)
-                except astroid.AttributeInferenceError:
+                except astroid.exceptions.AttributeInferenceError:
                     break

-                if isinstance(attr, astroid.AssignAttr):
+                if isinstance(attr, astroid.nodes.AssignAttr):
                     inferred = self.get_ansible_module(attr)
                     continue

-                if isinstance(attr, astroid.FunctionDef):
+                if isinstance(attr, astroid.nodes.FunctionDef):
                     inferred = attr
                     continue

-            if not isinstance(inferred, (astroid.Module, astroid.ClassDef)):
+            if not isinstance(inferred, (astroid.nodes.Module, astroid.nodes.ClassDef)):
                 inferred = None
                 break
@@ -288,15 +290,15 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         else:
             inferred = self.infer(inferred)

-        if isinstance(inferred, astroid.FunctionDef) and isinstance(inferred.parent, astroid.ClassDef):
-            inferred = astroid.BoundMethod(inferred, inferred.parent)
+        if isinstance(inferred, astroid.nodes.FunctionDef) and isinstance(inferred.parent, astroid.nodes.ClassDef):
+            inferred = astroid.bases.BoundMethod(inferred, inferred.parent)

         return inferred

-    def infer_name(self, node: astroid.Name) -> astroid.NodeNG | None:
+    def infer_name(self, node: astroid.nodes.Name) -> astroid.nodes.NodeNG | None:
         """Infer the node referenced by the given name, or `None` if it cannot be unambiguously inferred."""
         scope = node.scope()
-        inferred: astroid.NodeNG | None = None
+        inferred: astroid.nodes.NodeNG | None = None
         name = node.name

         while scope:
@@ -306,12 +308,12 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
                 scope = scope.parent.scope() if scope.parent else None
                 continue

-            if isinstance(assignment, astroid.AssignName) and isinstance(assignment.parent, astroid.Assign):
+            if isinstance(assignment, astroid.nodes.AssignName) and isinstance(assignment.parent, astroid.nodes.Assign):
                 inferred = assignment.parent.value
             elif (
-                isinstance(scope, astroid.FunctionDef)
-                and isinstance(assignment, astroid.AssignName)
-                and isinstance(assignment.parent, astroid.Arguments)
+                isinstance(scope, astroid.nodes.FunctionDef)
+                and isinstance(assignment, astroid.nodes.AssignName)
+                and isinstance(assignment.parent, astroid.nodes.Arguments)
                 and assignment.parent.annotations
             ):
                 idx, _node = assignment.parent.find_argname(name)
@@ -322,12 +324,12 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
                 except IndexError:
                     pass
                 else:
-                    if isinstance(annotation, astroid.Name):
+                    if isinstance(annotation, astroid.nodes.Name):
                         name = annotation.name
                         continue
-            elif isinstance(assignment, astroid.ClassDef):
+            elif isinstance(assignment, astroid.nodes.ClassDef):
                 inferred = assignment
-            elif isinstance(assignment, astroid.ImportFrom):
+            elif isinstance(assignment, astroid.nodes.ImportFrom):
                 if module := self.get_module(assignment):
                     name = assignment.real_name(name)
                     scope = module.scope()
@@ -337,7 +339,7 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         return inferred

-    def get_module(self, node: astroid.ImportFrom) -> astroid.Module | None:
+    def get_module(self, node: astroid.nodes.ImportFrom) -> astroid.nodes.Module | None:
         """Import the requested module if possible and cache the result."""
         module_name = pylint.checkers.utils.get_import_name(node, node.modname)
@@ -357,21 +359,21 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         return module

     @staticmethod
-    def get_fully_qualified_name(node: astroid.NodeNG) -> str | None:
+    def get_fully_qualified_name(node: astroid.nodes.NodeNG) -> str | None:
         """Return the fully qualified name of the given inferred node."""
         parent = node.parent

         parts: tuple[str, ...] | None

-        if isinstance(node, astroid.FunctionDef) and isinstance(parent, astroid.Module):
+        if isinstance(node, astroid.nodes.FunctionDef) and isinstance(parent, astroid.nodes.Module):
             parts = (parent.name, node.name)
-        elif isinstance(node, astroid.BoundMethod) and isinstance(parent, astroid.ClassDef) and isinstance(parent.parent, astroid.Module):
+        elif isinstance(node, astroid.bases.BoundMethod) and isinstance(parent, astroid.nodes.ClassDef) and isinstance(parent.parent, astroid.nodes.Module):
             parts = (parent.parent.name, parent.name, node.name)
         else:
             parts = None

         return '.'.join(parts) if parts else None

-    def check_call(self, node: astroid.Call, name: str, args: tuple[str, ...]) -> None:
+    def check_call(self, node: astroid.nodes.Call, name: str, args: tuple[str, ...]) -> None:
         """Check the given deprecation call node for valid arguments."""
         call_args = self.get_deprecation_call_args(node, args)
@@ -400,7 +402,7 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         self.check_version(node, name, call_args)

     @staticmethod
-    def get_deprecation_call_args(node: astroid.Call, args: tuple[str, ...]) -> DeprecationCallArgs:
+    def get_deprecation_call_args(node: astroid.nodes.Call, args: tuple[str, ...]) -> DeprecationCallArgs:
         """Get the deprecation call arguments from the given node."""
         fields: dict[str, object] = {}
@@ -413,12 +415,12 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
             fields[keyword.arg] = keyword.value

         for key, value in fields.items():
-            if isinstance(value, astroid.Const):
+            if isinstance(value, astroid.nodes.Const):
                 fields[key] = value.value

         return DeprecationCallArgs(**fields)

-    def check_collection_name(self, node: astroid.Call, name: str, args: DeprecationCallArgs) -> None:
+    def check_collection_name(self, node: astroid.nodes.Call, name: str, args: DeprecationCallArgs) -> None:
         """Check the collection name provided to the given call node."""
         deprecator_requirement = self.is_deprecator_required()
@@ -459,14 +461,14 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         if args.collection_name and args.collection_name != expected_collection_name:
             self.add_message('wrong-collection-deprecated', node=node, args=(args.collection_name, name))

-    def check_version(self, node: astroid.Call, name: str, args: DeprecationCallArgs) -> None:
+    def check_version(self, node: astroid.nodes.Call, name: str, args: DeprecationCallArgs) -> None:
         """Check the version provided to the given call node."""
         if self.collection_name:
             self.check_collection_version(node, name, args)
         else:
             self.check_core_version(node, name, args)

-    def check_core_version(self, node: astroid.Call, name: str, args: DeprecationCallArgs) -> None:
+    def check_core_version(self, node: astroid.nodes.Call, name: str, args: DeprecationCallArgs) -> None:
         """Check the core version provided to the given call node."""
         try:
             if not isinstance(args.version, str) or not args.version:
@@ -480,7 +482,7 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
         if self.ANSIBLE_VERSION >= strict_version:
             self.add_message('ansible-deprecated-version', node=node, args=(args.version, name))

-    def check_collection_version(self, node: astroid.Call, name: str, args: DeprecationCallArgs) -> None:
+    def check_collection_version(self, node: astroid.nodes.Call, name: str, args: DeprecationCallArgs) -> None:
         """Check the collection version provided to the given call node."""
         try:
             if not isinstance(args.version, str) or not args.version:
@ -497,7 +499,7 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
if semantic_version.major != 0 and (semantic_version.minor != 0 or semantic_version.patch != 0): if semantic_version.major != 0 and (semantic_version.minor != 0 or semantic_version.patch != 0):
self.add_message('removal-version-must-be-major', node=node, args=(args.version,)) self.add_message('removal-version-must-be-major', node=node, args=(args.version,))
def check_date(self, node: astroid.Call, name: str, args: DeprecationCallArgs) -> None: def check_date(self, node: astroid.nodes.Call, name: str, args: DeprecationCallArgs) -> None:
"""Check the date provided to the given call node.""" """Check the date provided to the given call node."""
try: try:
date_parsed = self.parse_isodate(args.date) date_parsed = self.parse_isodate(args.date)
@ -515,18 +517,19 @@ class AnsibleDeprecatedChecker(pylint.checkers.BaseChecker):
raise TypeError(type(value)) raise TypeError(type(value))
def get_ansible_module(self, node: astroid.AssignAttr) -> astroid.Instance | None: def get_ansible_module(self, node: astroid.nodes.AssignAttr) -> astroid.bases.Instance | None:
"""Infer an AnsibleModule instance node from the given assignment.""" """Infer an AnsibleModule instance node from the given assignment."""
if isinstance(node.parent, astroid.Assign) and isinstance(node.parent.type_annotation, astroid.Name): if isinstance(node.parent, astroid.nodes.Assign) and isinstance(node.parent.type_annotation, astroid.nodes.Name):
inferred = self.infer_name(node.parent.type_annotation) inferred = self.infer_name(node.parent.type_annotation)
elif isinstance(node.parent, astroid.Assign) and isinstance(node.parent.parent, astroid.FunctionDef) and isinstance(node.parent.value, astroid.Name): elif (isinstance(node.parent, astroid.nodes.Assign) and isinstance(node.parent.parent, astroid.nodes.FunctionDef) and
isinstance(node.parent.value, astroid.nodes.Name)):
inferred = self.infer_name(node.parent.value) inferred = self.infer_name(node.parent.value)
elif isinstance(node.parent, astroid.AnnAssign) and isinstance(node.parent.annotation, astroid.Name): elif isinstance(node.parent, astroid.nodes.AnnAssign) and isinstance(node.parent.annotation, astroid.nodes.Name):
inferred = self.infer_name(node.parent.annotation) inferred = self.infer_name(node.parent.annotation)
else: else:
inferred = None inferred = None
if isinstance(inferred, astroid.ClassDef) and inferred.name == 'AnsibleModule': if isinstance(inferred, astroid.nodes.ClassDef) and inferred.name == 'AnsibleModule':
return inferred.instantiate_class() return inferred.instantiate_class()
return None return None

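The hunks above replace bare `astroid.Call` annotations with the explicit `astroid.nodes.Call` form; the check they annotate flags deprecations whose removal version has already been reached. That logic can be sketched with the standard-library `ast` module instead of astroid (a minimal sketch only: `find_stale_deprecations`, the bare `deprecate(...)` call shape, and the assumed `ANSIBLE_VERSION` tuple are illustrative, not the plugin's real API):

```python
import ast

# assumed current core version, for illustration only
ANSIBLE_VERSION = (2, 20)

def find_stale_deprecations(source: str) -> list[str]:
    """Report deprecate() calls whose removal version has already been reached."""
    messages = []
    for node in ast.walk(ast.parse(source)):
        # only inspect calls of the form deprecate(..., version='X.Y')
        if not (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == 'deprecate'):
            continue
        for kw in node.keywords:
            if (kw.arg == 'version'
                    and isinstance(kw.value, ast.Constant)
                    and isinstance(kw.value.value, str)):
                version = tuple(int(part) for part in kw.value.value.split('.')[:2])
                if ANSIBLE_VERSION >= version:
                    messages.append(f'line {node.lineno}: version {kw.value.value} has been reached')
    return messages

print(find_stale_deprecations("deprecate('old behavior', version='2.19')"))
```

The real checker walks astroid's inferred tree rather than raw `ast` nodes, which is what lets it resolve `deprecate` through imports and aliases.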
@@ -5,7 +5,9 @@
 # -*- coding: utf-8 -*-
 from __future__ import annotations
-import astroid
+import astroid.bases
+import astroid.exceptions
+import astroid.nodes
 try:
     from pylint.checkers.utils import check_messages
@@ -39,22 +41,22 @@ class AnsibleStringFormatChecker(BaseChecker):
     def visit_call(self, node):
         """Visit a call node."""
         func = utils.safe_infer(node.func)
-        if (isinstance(func, astroid.BoundMethod)
-                and isinstance(func.bound, astroid.Instance)
+        if (isinstance(func, astroid.bases.BoundMethod)
+                and isinstance(func.bound, astroid.bases.Instance)
                 and func.bound.name in ('str', 'unicode', 'bytes')):
             if func.name == 'format':
                 self._check_new_format(node, func)
     def _check_new_format(self, node, func):
         """ Check the new string formatting """
-        if (isinstance(node.func, astroid.Attribute)
-                and not isinstance(node.func.expr, astroid.Const)):
+        if (isinstance(node.func, astroid.nodes.Attribute)
+                and not isinstance(node.func.expr, astroid.nodes.Const)):
             return
         try:
             strnode = next(func.bound.infer())
-        except astroid.InferenceError:
+        except astroid.exceptions.InferenceError:
             return
-        if not isinstance(strnode, astroid.Const):
+        if not isinstance(strnode, astroid.nodes.Const):
             return
         if isinstance(strnode.value, bytes):

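The string-format checker above only examines `.format()` calls whose receiver infers to a string constant. The shape of that filter can be approximated with the standard-library `ast` module (a hedged sketch: `literal_format_calls` is an illustrative name, and unlike the real checker this version matches only syntactic literals, with no type inference):

```python
import ast

def literal_format_calls(source: str) -> list[int]:
    """Line numbers where .format() is called directly on a string literal."""
    found = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == 'format'
                and isinstance(node.func.value, ast.Constant)
                and isinstance(node.func.value.value, str)):
            found.append(node.lineno)
    return found

# the second call is on a name, not a literal, so only line 1 is reported
print(literal_format_calls("a = '{0}'.format(1)\nb = template.format(2)"))
```

Inference (astroid's `safe_infer` / `func.bound.infer()`) is what lets the real checker also catch a `.format()` call on a variable that is provably a string constant.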
@@ -2,10 +2,12 @@
 from __future__ import annotations
+import functools
 import os
 import typing as t
-import astroid
+import astroid.exceptions
+import astroid.nodes
 from pylint.checkers import BaseChecker
@@ -108,10 +110,6 @@ class AnsibleUnwantedChecker(BaseChecker):
                 'Iterator',
             )
         ),
-        'ansible.module_utils.six': UnwantedEntry(
-            'the Python standard library equivalent'
-        ),
     }
     unwanted_functions = {
@@ -136,21 +134,34 @@ class AnsibleUnwantedChecker(BaseChecker):
                        modules_only=True),
     }
-    def visit_import(self, node):  # type: (astroid.node_classes.Import) -> None
+    def __init__(self, *args, **kwargs) -> None:
+        super().__init__(*args, **kwargs)
+        # ansible.module_utils.six is deprecated and collections can still use it until it is removed
+        if self.is_ansible_core:
+            self.unwanted_imports['ansible.module_utils.six'] = UnwantedEntry(
+                'the Python standard library equivalent'
+            )
+    @functools.cached_property
+    def is_ansible_core(self) -> bool:
+        """True if ansible-core is being tested."""
+        return not self.linter.config.collection_name
+    def visit_import(self, node: astroid.nodes.Import) -> None:
         """Visit an import node."""
         for name in node.names:
             self._check_import(node, name[0])
-    def visit_importfrom(self, node):  # type: (astroid.node_classes.ImportFrom) -> None
+    def visit_importfrom(self, node: astroid.nodes.ImportFrom) -> None:
         """Visit an import from node."""
         self._check_importfrom(node, node.modname, node.names)
-    def visit_attribute(self, node):  # type: (astroid.node_classes.Attribute) -> None
+    def visit_attribute(self, node: astroid.nodes.Attribute) -> None:
         """Visit an attribute node."""
         last_child = node.last_child()
         # this is faster than using type inference and will catch the most common cases
-        if not isinstance(last_child, astroid.node_classes.Name):
+        if not isinstance(last_child, astroid.nodes.Name):
             return
         module = last_child.name
@@ -161,13 +172,13 @@ class AnsibleUnwantedChecker(BaseChecker):
             if entry.applies_to(self.linter.current_file, node.attrname):
                 self.add_message(self.BAD_IMPORT_FROM, args=(node.attrname, entry.alternative, module), node=node)
-    def visit_call(self, node):  # type: (astroid.node_classes.Call) -> None
+    def visit_call(self, node: astroid.nodes.Call) -> None:
         """Visit a call node."""
         try:
             for i in node.func.inferred():
                 func = None
-                if isinstance(i, astroid.scoped_nodes.FunctionDef) and isinstance(i.parent, astroid.scoped_nodes.Module):
+                if isinstance(i, astroid.nodes.FunctionDef) and isinstance(i.parent, astroid.nodes.Module):
                     func = '%s.%s' % (i.parent.name, i.name)
                 if not func:
@@ -180,7 +191,7 @@ class AnsibleUnwantedChecker(BaseChecker):
         except astroid.exceptions.InferenceError:
             pass
-    def _check_import(self, node, modname):  # type: (astroid.node_classes.Import, str) -> None
+    def _check_import(self, node: astroid.nodes.Import, modname: str) -> None:
         """Check the imports on the specified import node."""
         self._check_module_import(node, modname)
@@ -192,7 +203,7 @@ class AnsibleUnwantedChecker(BaseChecker):
             if entry.applies_to(self.linter.current_file):
                 self.add_message(self.BAD_IMPORT, args=(entry.alternative, modname), node=node)
-    def _check_importfrom(self, node, modname, names):  # type: (astroid.node_classes.ImportFrom, str, t.List[str]) -> None
+    def _check_importfrom(self, node: astroid.nodes.ImportFrom, modname: str, names: list[tuple[str, str | None]]) -> None:
         """Check the imports on the specified import from node."""
         self._check_module_import(node, modname)
@@ -205,7 +216,7 @@ class AnsibleUnwantedChecker(BaseChecker):
             if entry.applies_to(self.linter.current_file, name[0]):
                 self.add_message(self.BAD_IMPORT_FROM, args=(name[0], entry.alternative, modname), node=node)
-    def _check_module_import(self, node, modname):  # type: (t.Union[astroid.node_classes.Import, astroid.node_classes.ImportFrom], str) -> None
+    def _check_module_import(self, node: astroid.nodes.Import | astroid.nodes.ImportFrom, modname: str) -> None:
         """Check the module import on the given import or import from node."""
         if not is_module_path(self.linter.current_file):
             return

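The new `is_ansible_core` property above uses `functools.cached_property` so the collection-name lookup runs once per checker instance and later reads hit the cached value. A self-contained sketch of the same pattern (`FakeConfig` and `Checker` here are illustrative stand-ins, not pylint's real classes):

```python
import functools

class FakeConfig:
    """Stand-in for the linter config; collection_name is falsy for ansible-core."""
    def __init__(self, collection_name=None):
        self.collection_name = collection_name

class Checker:
    def __init__(self, config):
        self.config = config
        self.calls = 0  # counts how often the property body actually runs

    @functools.cached_property
    def is_ansible_core(self) -> bool:
        # the body runs once; later reads come from the instance __dict__
        self.calls += 1
        return not self.config.collection_name

core = Checker(FakeConfig())
assert core.is_ansible_core and core.is_ansible_core  # second read is cached
assert core.calls == 1
assert Checker(FakeConfig('ns.col')).is_ansible_core is False
```

Caching matters here because the property is consulted from `__init__` and potentially on every visited node, while the underlying config never changes during a run.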
@@ -1,7 +1,8 @@
 # edit "black.requirements.in" and generate with: hacking/update-sanity-requirements.py --test black
-black==25.1.0
-click==8.2.1
+black==25.9.0
+click==8.3.0
 mypy_extensions==1.1.0
 packaging==25.0
 pathspec==0.12.1
-platformdirs==4.4.0
+platformdirs==4.5.0
+pytokens==0.1.10

@@ -1,4 +1,4 @@
 # edit "deprecated-config.requirements.in" and generate with: hacking/update-sanity-requirements.py --test deprecated-config
 Jinja2==3.1.6
-MarkupSafe==3.0.2
-PyYAML==6.0.2
+MarkupSafe==3.0.3
+PyYAML==6.0.3

@@ -4,6 +4,7 @@ jinja2 # type stubs not published separately
 packaging # type stubs not published separately
 pytest # type stubs not published separately
 pytest-mock # type stubs not published separately
+resolvelib # type stubs not published separately
 tomli # type stubs not published separately, required for toml inventory plugin
 types-backports
 types-paramiko

@@ -1,23 +1,24 @@
 # edit "mypy.requirements.in" and generate with: hacking/update-sanity-requirements.py --test mypy
 cffi==2.0.0
-cryptography==45.0.7
+cryptography==46.0.2
 iniconfig==2.1.0
 Jinja2==3.1.6
-MarkupSafe==3.0.2
-mypy==1.17.1
+MarkupSafe==3.0.3
+mypy==1.18.2
 mypy_extensions==1.1.0
 packaging==25.0
 pathspec==0.12.1
 pluggy==1.6.0
-pycparser==2.22
+pycparser==2.23
 Pygments==2.19.2
 pytest==8.4.2
-pytest-mock==3.15.0
-tomli==2.2.1
+pytest-mock==3.15.1
+resolvelib==1.2.1
+tomli==2.3.0
 types-backports==0.1.3
 types-paramiko==4.0.0.20250822
-types-PyYAML==6.0.12.20250822
-types-requests==2.32.4.20250809
+types-PyYAML==6.0.12.20250915
+types-requests==2.32.4.20250913
 types-setuptools==80.9.0.20250822
 types-toml==0.10.8.20240310
 typing_extensions==4.15.0

@@ -23,6 +23,9 @@ ignore_missing_imports = True
 [mypy-ansible_test.*]
 ignore_missing_imports = True
+[mypy-ansible.galaxy.dependency_resolution.*]
+strict_optional = True
 [mypy-ansible.module_utils.six.moves.*]
 ignore_missing_imports = True
@@ -104,9 +107,6 @@ ignore_missing_imports = True
 [mypy-distro.*]
 ignore_missing_imports = True
-[mypy-resolvelib.*]
-ignore_missing_imports = True
 [mypy-urlparse.*]
 ignore_missing_imports = True

@@ -36,10 +36,19 @@ ignore_missing_imports = True
 [mypy-astroid]
 ignore_missing_imports = True
+[mypy-astroid.bases]
+ignore_missing_imports = True
+[mypy-astroid.exceptions]
+ignore_missing_imports = True
+[mypy-astroid.nodes]
+ignore_missing_imports = True
 [mypy-astroid.typing]
 ignore_missing_imports = True
-[mypy-astroid.context]
+[mypy-astroid.util]
 ignore_missing_imports = True
 [mypy-pylint]

@@ -247,7 +247,7 @@ def check_build(complete_file_list: list[str], use_upper_setuptools_version: boo
     errors.extend(check_files('sdist', expected_sdist_files, actual_sdist_files))
     errors.extend(check_files('wheel', expected_wheel_files, actual_wheel_files))
-    errors = [f'{msg} ({setuptools_version})' for msg in errors]
+    errors = [f'{msg} (setuptools=={setuptools_version})' for msg in errors]
     return errors

@@ -1,10 +1,10 @@
 # edit "pymarkdown.requirements.in" and generate with: hacking/update-sanity-requirements.py --test pymarkdown
 application_properties==0.9.0
 Columnar==1.4.1
-pyjson5==1.6.9
+pyjson5==2.0.0
 pymarkdownlnt==0.9.32
-PyYAML==6.0.2
-tomli==2.2.1
+PyYAML==6.0.3
+tomli==2.3.0
 toolz==1.0.0
 typing_extensions==4.15.0
-wcwidth==0.2.13
+wcwidth==0.2.14

@@ -65,7 +65,6 @@ test/integration/targets/ansible-doc/library/facts_one shebang
 test/integration/targets/ansible-test-sanity/ansible_collections/ns/col/tests/integration/targets/hello/files/bad.py pylint:ansible-bad-function # ignore, required for testing
 test/integration/targets/ansible-test-sanity/ansible_collections/ns/col/tests/integration/targets/hello/files/bad.py pylint:ansible-bad-import-from # ignore, required for testing
 test/integration/targets/ansible-test-sanity/ansible_collections/ns/col/tests/integration/targets/hello/files/bad.py pylint:ansible-bad-import # ignore, required for testing
-test/integration/targets/ansible-test-sanity/ansible_collections/ns/col/plugins/plugin_utils/check_pylint.py pylint:disallowed-name # ignore, required for testing
 test/integration/targets/ansible-test-integration/ansible_collections/ns/col/plugins/modules/hello.py pylint:relative-beyond-top-level
 test/integration/targets/ansible-test-units/ansible_collections/ns/col/plugins/modules/hello.py pylint:relative-beyond-top-level
 test/integration/targets/ansible-test-units/ansible_collections/ns/col/tests/unit/plugins/modules/test_hello.py pylint:relative-beyond-top-level
@@ -115,6 +114,11 @@ test/integration/targets/win_script/files/test_script_with_args.ps1 pslint:PSAvo
 test/integration/targets/win_script/files/test_script_with_splatting.ps1 pslint:PSAvoidUsingWriteHost # Keep
 test/integration/targets/ssh_agent/fake_agents/ssh-agent-bad-shebang shebang # required for test
 test/lib/ansible_test/_data/requirements/sanity.pslint.ps1 pslint:PSCustomUseLiteralPath # Uses wildcards on purpose
+test/lib/ansible_test/_internal/compat/packaging.py pylint:invalid-name # pylint bug: https://github.com/pylint-dev/pylint/issues/10652
+test/lib/ansible_test/_internal/compat/yaml.py pylint:invalid-name # pylint bug: https://github.com/pylint-dev/pylint/issues/10652
+test/lib/ansible_test/_internal/init.py pylint:invalid-name # pylint bug: https://github.com/pylint-dev/pylint/issues/10652
+test/lib/ansible_test/_internal/util.py pylint:invalid-name # pylint bug: https://github.com/pylint-dev/pylint/issues/10652
+test/lib/ansible_test/_util/target/setup/requirements.py pylint:invalid-name # pylint bug: https://github.com/pylint-dev/pylint/issues/10652
 test/support/windows-integration/collections/ansible_collections/ansible/windows/plugins/module_utils/WebRequest.psm1 pslint!skip
 test/support/windows-integration/collections/ansible_collections/ansible/windows/plugins/modules/win_uri.ps1 pslint!skip
 test/support/windows-integration/plugins/modules/async_status.ps1 pslint!skip

@@ -17,6 +17,7 @@
 from __future__ import annotations
+import collections
 import os
 import pathlib
 import tempfile
@@ -109,9 +110,9 @@ class TestDataLoader(unittest.TestCase):
         self.assertIn('/tmp/roles/testrole/tasks/included2.yml', called_args)
         self.assertIn('/tmp/roles/testrole/tasks/tasks/included2.yml', called_args)
-        # relative directories below are taken in account too:
-        self.assertIn('tasks/included2.yml', called_args)
-        self.assertIn('included2.yml', called_args)
+        c = collections.Counter(called_args)
+        assert c['/tmp/roles/testrole/tasks/included2.yml'] == 1
+        assert c['/tmp/roles/testrole/tasks/tasks/included2.yml'] == 2
     def test_path_dwim_root(self):
         self.assertEqual(self._loader.path_dwim('/'), '/')
@@ -167,7 +168,7 @@ class TestPathDwimRelativeStackDataLoader(unittest.TestCase):
         self.assertRaisesRegex(AnsibleFileNotFound, 'on the Ansible Controller', self._loader.path_dwim_relative_stack, None, None, None)
     def test_empty_strings(self):
-        self.assertEqual(self._loader.path_dwim_relative_stack('', '', ''), './')
+        self.assertEqual(self._loader.path_dwim_relative_stack('', '', ''), os.path.abspath('./') + '/')
     def test_empty_lists(self):
         self.assertEqual(self._loader.path_dwim_relative_stack([], '', '~/'), os.path.expanduser('~'))

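The test change above swaps `assertIn` membership checks for `collections.Counter` lookups, asserting exactly how many times each path was searched. A minimal standalone illustration (the `called_args` list here is hypothetical sample data mimicking the test's search log, not the loader's real output):

```python
import collections

# hypothetical search-path log, shaped like called_args in the test
called_args = [
    '/tmp/roles/testrole/tasks/included2.yml',
    '/tmp/roles/testrole/tasks/tasks/included2.yml',
    '/tmp/roles/testrole/tasks/tasks/included2.yml',
]

counts = collections.Counter(called_args)
# exact-count assertions catch duplicate lookups that a plain
# membership check (assertIn) would silently accept
assert counts['/tmp/roles/testrole/tasks/included2.yml'] == 1
assert counts['/tmp/roles/testrole/tasks/tasks/included2.yml'] == 2
assert counts['not/searched.yml'] == 0  # missing keys count as zero, no KeyError
```

Counter's zero-default lookup also means the test can assert a path was *never* searched without a try/except or an `in` guard.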
@@ -1,4 +1,4 @@
-bcrypt ; python_version >= '3.12' # controller only
+bcrypt < 5 ; python_version >= '3.12' # controller only, bcrypt 5+ not compatible with passlib
 passlib ; python_version >= '3.12' # controller only
 pexpect ; python_version >= '3.12' # controller only
 pywinrm ; python_version >= '3.12' # controller only
