Compare commits


43 Commits

Author SHA1 Message Date
Matt Clay 731b4d0242
New release v2.19.0b4 (#85133) 7 months ago
Matt Davis d6a8582da7 fix ensure_type to support vaulted values (#85129)
* restored parity with 2.18

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit 9a426fe303)
7 months ago
Matt Davis 204cdcee67 ensure that all config return values are Origin-tagged (#85127)
Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit fc8a227647)
7 months ago
Matt Davis df214f93a7 apply trust to declarative plugin config (#85126)
* trust strings in loaded doc fragments
* added tests
* added hard_fail_context test mechanism

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit 9efba4f972)
7 months ago
Lorenzo Tanganelli 27aca0abd3 facts: CloudStack KVM Hypervisor to Linux virtual fact (#85117)
Co-authored-by: Abhijeet Kasurde <akasurde@redhat.com>
(cherry picked from commit 8a4fb78988)
7 months ago
Matt Clay 38ef2b8c25 ansible-test - Relax some deprecation checks (#85122)
(cherry picked from commit 7b69cf3266)
7 months ago
Matt Clay 23f935eb0d template module - render `None` as empty string (#85121)
* template module - render `None` as empty string

* Update changelogs/fragments/template-none.yml

Co-authored-by: Matt Davis <6775756+nitzmahone@users.noreply.github.com>

---------

Co-authored-by: Matt Davis <6775756+nitzmahone@users.noreply.github.com>
(cherry picked from commit 4fe9606530)
7 months ago
Matt Davis 9fff6d433d Misc config type coercion fixes (#85119)
* remove dead config comment noise

* update `list` typed config defaults to be lists

* fix tag preservation/propagation in config
* numerous other ensure_type bugfixes
* 100% unit test coverage of ensure_type
* emit warnings on template_default failures
* fix unhandled exception in convert_bool on unhashable inputs

Co-authored-by: Matt Clay <matt@mystile.com>

---------

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit d33bedc48f)
7 months ago
j-dr3 e8d914e992 sysvinit: make examples consistent (#85108)
(cherry picked from commit dcc5dac184)
7 months ago
omahs 121871af86 Fix typos (#85107)
(cherry picked from commit 471c5229a7)
7 months ago
Martin Krizek f70dbc15e5 Passing warnings to exit/fail_json is deprecated. (#85109)
(cherry picked from commit 8b9ddf5544)
7 months ago
Matt Davis 80af44d822 add fuzzy matching to package_data sanity (#85103)
* add fuzzy matching to package_data sanity

* relaxes exact directory matches for license files to allow setuptools > 72 to pass

* sanity

(cherry picked from commit 7e00053a30)
7 months ago
Martin Krizek da59710961 dnf5: skip pkgs that don't satisfy bugfix/security when specified (#85111)
(cherry picked from commit 107842fd7d)
7 months ago
Matt Davis c83b70a04c
Update Ansible release version to v2.19.0b3.post0. (#85102) 7 months ago
Matt Davis c742fdc66c
New release v2.19.0b3 (#85101) 7 months ago
pollenJP(@'ω'@) 7a932a93b0 get_url: missing closing brace in docs (#85096)
(cherry picked from commit 1c29910087)
7 months ago
Matt Davis 8c8717a8e4 Switch to stackwalk caller ID (#85095)
* See changelog fragment for most changes.
* Defer early config warnings until display is functioning, eliminating related fallback display logic.
* Added more type annotations and docstrings.
* ansible-test - pylint sanity for deprecations improved.
* Refactored inline legacy resolutions in PluginLoader.

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit ff6998f2b9)
7 months ago
Jordan Borean 6054b29cb7
Add win_script become tests (#85079)
(cherry picked from commit e4cac2ac33)
7 months ago
Brian Coca 131175a5a6
ensure predictable permissions on module artifacts (#84948)
and test it!

(cherry picked from commit 9f894b81c2)
7 months ago
Martin Krizek 0aab250fbc
dnf5: avoid generating excessive history entries (#85065)
Fixes #85046

(cherry picked from commit cff49a62ec)
7 months ago
Martin Krizek dcec78b0f9
async_status: fix example to use finished test (#85066)
Fixes #85048

(cherry picked from commit dbf131c07d)
7 months ago
Brian Coca ea22e5d0dd
show internal but not hidden config options, while still hiding test options (#84997)
(cherry picked from commit aab732cb82)
7 months ago
Brian Coca 867d9d3096
These actions do not support until (#84847)
(cherry picked from commit 8ab342f8cc)
7 months ago
Matt Clay e0e286c009
[stable-2.19] ansible-test - Use `-t` for container stop timeout (#85019) (#85055)
(cherry picked from commit 0aa8afbaf4)
7 months ago
Matt Clay 1c1a271b88
Update Ansible release version to v2.19.0b2.post0. (#85041) 7 months ago
Matt Clay 4e861fa9c8
New release v2.19.0b2 (#85040)
* New release v2.19.0b2

* Revert setuptools version bump
7 months ago
Matt Davis f898f9fec6 Implement TaskResult backward compatibility for callbacks (#85039)
* Implement TaskResult backward compatibility for callbacks
* general API cleanup
* misc deprecations

Co-authored-by: Matt Clay <matt@mystile.com>

* fix v2_on_any deprecation exclusion for base

---------

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit 03181ac87b)
7 months ago
Matt Davis 4714194672 restore parsing/utils/jsonify.py (#85032)
(cherry picked from commit 2033993d89)
7 months ago
Abhijeet Kasurde ffbf121182
comment: raise an exception when an invalid option is provided (#84984)
Co-authored-by: Matt Clay <matt@mystile.com>
Signed-off-by: Abhijeet Kasurde <Akasurde@redhat.com>
(cherry picked from commit 1daa8412d5)
8 months ago
Brian Coca 89a4900b61
normalize error handler choices (#84998)
use existing to avoid deprecation cycle
normalize test too

(cherry picked from commit 2cbb721f6f)
8 months ago
Matt Clay 17d4fdd883
Increase galaxy test publish timeout (#85016)
(cherry picked from commit e6dc17cda4)
8 months ago
Lee Garrett 7fc916361e
Fix test_range_templating on 32-bit architectures (#85007)
* Fix test_range_templating on 32-bit architectures

32-bit architectures like i386, armel, armhf will fail with the error

ansible._internal._templating._errors.AnsibleTemplatePluginRuntimeError: The
filter plugin 'ansible.builtin.random' failed: Python int too large to convert
to C ssize_t

So just pick sys.maxsize (2**31 - 1) so it works on 32-bit machines.

---------

Co-authored-by: Lee Garrett <lgarrett@rocketjump.eu>
Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit 5f6aef95ac)
8 months ago
Matt Davis 82ea3addce
Miscellaneous fixes (#85012)
* Add help_text to play_hosts deprecation

* clean up TaskResult type handling

(cherry picked from commit 1b6b910439)
8 months ago
Matt Clay 98009c811b
Disable retries on ansible-galaxy-collection (#85013)
(cherry picked from commit f7d03474a5)
8 months ago
Sloane Hertel de7c454684
Remove unused local function _get_plugin_vars from vars manager (#85008)
(cherry picked from commit 93e6f012cb)
8 months ago
Matt Clay 80d5f05642
Miscellaneous DT fixes (#84991)
* Use `_UNSET` instead of allowing `ellipsis`

* Fix deprecation warning pre-check

* Deprecation warnings from modules can now be disabled.
* Deprecation warnings from modules get the "can be disabled" notice.

* Include help text in pre-display fatal errors

* Simplify lookup warning/debug messaging

* Fix return type of `timedout` test plugin

* Use `object` for `_UNSET`

* Remove obsolete `convert_data` tests

* Remove unnecessary template from test

* Improve legacy YAML objects backward compat

* Fix templar backward compat for None overrides

(cherry picked from commit 6cc97447aa)
8 months ago
Matt Clay ec0d8f3278
Disable parallel publish in galaxy test (#85000)
(cherry picked from commit e094d48b1b)
8 months ago
Abhijeet Kasurde c21a817c47
filter_core integration test - remove Python 2.6 related dead code (#84985)
Signed-off-by: Abhijeet Kasurde <Akasurde@redhat.com>
(cherry picked from commit 500a4aba08)
8 months ago
Martin Krizek 85cb2baf1f
get_bin_path('ssh-agent'): required is deprecated (#84995)
(cherry picked from commit 4868effc71)
8 months ago
Felix Fontein 2fcfad54b0
ansible-doc: fix indent and line wrapping for first line of (sub-)option and (sub-)return value descriptions (#84690)
* Fix initial indent for descriptions of suboptions.
* Fix line width for initial line of option descriptions.

(cherry picked from commit 352d8ec33a)
8 months ago
Matt Clay 6f95a618af
Convert DT issue template to pre-release template (#84982)
(cherry picked from commit 9ddfe9db39)
8 months ago
Matt Martz 19d9253ec9
Update Ansible release version to v2.19.0b1.post0. (#84988) 8 months ago
Matt Martz 8d775ddced
New release v2.19.0b1 (#84979) 8 months ago

@@ -1,9 +1,8 @@
-name: Fallible 2.19 Data Tagging Preview Bug Report
+name: Pre-Release Bug Report
-description: File a bug report against the Fallible 2.19 Data Tagging Preview
+description: File a bug report against a pre-release version
 labels:
-  - fallible_dt
   - bug
-  - data_tagging
+  - pre_release
 assignees:
   - nitzmahone
   - mattclay
@@ -12,15 +11,14 @@ body:
     attributes:
       value: |
         ## Bug Report
-  - type: dropdown
+  - type: textarea
     attributes:
-      label: Fallible Version
+      label: Ansible Version
-      description: The fallible release that reproduces the issue described.
+      description: Paste the full output from `ansible --version` below.
-      options:
-        - 2025.4.1
-        - 2025.3.11
-        - 2025.3.3
-        - 2025.1.30
+      render: console
+      placeholder: $ ansible --version
+    validations:
+      required: true
   - type: textarea
     attributes:
       label: Summary
@@ -37,8 +35,6 @@ body:
         bin/ansible
         ### Issue Type
         Bug Report
-        ### Ansible Version
-        2.19.0.dev0
         ### Configuration
         ### OS / Environment
         -->

@@ -0,0 +1,416 @@
==================================================================
ansible-core 2.19 "What Is and What Should Never Be" Release Notes
==================================================================
.. contents:: Topics
v2.19.0b4
=========
Release Summary
---------------
| Release Date: 2025-05-12
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Minor Changes
-------------
- facts - add "CloudStack KVM Hypervisor" for Linux VM in virtual facts (https://github.com/ansible/ansible/issues/85089).
- modules - use ``AnsibleModule.warn`` instead of passing ``warnings`` to ``exit_json`` or ``fail_json`` which is deprecated.
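The recommended pattern from the ``AnsibleModule.warn`` change can be sketched as follows. ``StubModule`` is a hypothetical stand-in for ``AnsibleModule`` (normally imported from ``ansible.module_utils.basic``); only the warning flow is modeled here.

```python
# Sketch only: StubModule mimics the warning-related behavior described
# above, it is NOT the real AnsibleModule.
class StubModule:
    def __init__(self):
        self._warnings = []

    def warn(self, warning):
        # Preferred API: accumulate warnings on the module itself.
        self._warnings.append(warning)

    def exit_json(self, **result):
        # Passing warnings=... here is what is deprecated; the module's
        # accumulated warnings are attached to the result automatically.
        result["warnings"] = self._warnings
        return result


module = StubModule()
module.warn("option 'foo' is ignored")          # instead of warnings=[...]
result = module.exit_json(changed=False)
```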
Bugfixes
--------
- ansible-test - Updated the ``pylint`` sanity test to skip some deprecation validation checks when all arguments are dynamic.
- config - Preserve or apply Origin tag to values returned by config.
- config - Prevented fatal errors when ``MODULE_IGNORE_EXTS`` configuration was set.
- config - Templating failures on config defaults now issue a warning. Previously, failures silently returned an unrendered and untrusted template to the caller.
- config - ``ensure_type`` correctly propagates trust and other tags on returned values.
- config - ``ensure_type`` now converts mappings to ``dict`` when requested, instead of returning the mapping.
- config - ``ensure_type`` now converts sequences to ``list`` when requested, instead of returning the sequence.
- config - ``ensure_type`` now correctly errors when ``pathlist`` or ``pathspec`` types encounter non-string list items.
- config - ``ensure_type`` now reports an error when ``bytes`` are provided for any known ``value_type``. Previously, the behavior was undefined, but often resulted in an unhandled exception or incorrect return type.
- config - ``ensure_type`` with expected type ``int`` now properly converts ``True`` and ``False`` values to ``int``. Previously, these values were silently returned unmodified.
- convert_bool.boolean API conversion function - Unhashable values passed to ``boolean`` behave like other non-boolean convertible values, returning False or raising ``TypeError`` depending on the value of ``strict``. Previously, unhashable values always raised ``ValueError`` due to an invalid set membership check.
- dnf5 - when ``bugfix`` and/or ``security`` is specified, skip packages that do not have any such updates, even with newer versions of libdnf5 where this behavior changed and missing updates would otherwise be considered a failure
- plugin loader - Apply template trust to strings loaded from plugin configuration definitions and doc fragments.
- template action - Template files where the entire file's output renders as ``None`` are no longer emitted as the string "None", but instead render to an empty file as in previous releases.
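Several of the ``ensure_type`` and ``boolean`` fixes above concern coercion edge cases. A minimal sketch of the documented semantics follows; the function bodies are illustrative, not the actual ansible-core implementation.

```python
# Illustrative sketch of the documented coercion semantics; NOT the real
# ansible-core code.
TRUTHY = frozenset({"y", "yes", "on", "1", "true", "t", 1, 1.0, True})


def boolean(value, strict=True):
    # Mirrors the documented convert_bool.boolean contract after the fix:
    # unhashable values behave like any other non-convertible value.
    if isinstance(value, bool):
        return value
    normalized = value.lower() if isinstance(value, str) else value
    try:
        if normalized in TRUTHY:
            return True
    except TypeError:
        pass  # unhashable input (e.g. a list) falls through instead of crashing
    if strict:
        raise TypeError(f"cannot convert {value!r} to a boolean")
    return False


def ensure_int(value):
    # Per the b4 fix, True/False now convert to 1/0 rather than being
    # silently returned unmodified.
    return int(value)
```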
v2.19.0b3
=========
Release Summary
---------------
| Release Date: 2025-05-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Minor Changes
-------------
- ansible-config will now show internal configuration entries (those with a ``_`` prefix) but not test entries. This allows for debugging while still denoting these configurations as internal use only.
- ansible-test - Improved ``pylint`` checks for Ansible-specific deprecation functions.
- ansible-test - Use the ``-t`` option to set the stop timeout when stopping a container. This avoids use of the ``--time`` option which was deprecated in Docker v28.0.
- collection metadata - The collection loader now parses scalar values from ``meta/runtime.yml`` as strings. This avoids issues caused by unquoted values such as versions or dates being parsed as types other than strings.
- deprecation warnings - Deprecation warning APIs automatically capture the identity of the deprecating plugin. The ``collection_name`` argument is only required to correctly attribute deprecations that occur in module_utils or other non-plugin code.
- deprecation warnings - Improved deprecation messages to more clearly indicate the affected content, including plugin name when available.
- deprecations - Collection name strings not of the form ``ns.coll`` passed to deprecation API functions will result in an error.
- deprecations - Removed support for specifying deprecation dates as a ``datetime.date``, which was included in an earlier 2.19 pre-release.
- deprecations - Some argument names to ``deprecate_value`` were renamed for consistency with existing APIs. An earlier 2.19 pre-release included a ``removal_`` prefix on the ``date`` and ``version`` arguments.
- modules - The ``AnsibleModule.deprecate`` function no longer sends deprecation messages to the target host's logging system.
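The ``ns.coll`` validation noted above can be illustrated with a small sketch; the regex and function name here are assumptions for illustration, not the actual ansible-core implementation.

```python
import re

# Hypothetical mirror of the documented rule: collection names passed to
# deprecation API functions must be of the form "ns.coll".
_COLLECTION_NAME = re.compile(r"^[a-z_][a-z0-9_]*\.[a-z_][a-z0-9_]*$")


def check_collection_name(name):
    """Raise on collection names not of the form 'ns.coll'; None is allowed."""
    if name is not None and not _COLLECTION_NAME.match(name):
        raise ValueError(f"{name!r} is not of the form 'ns.coll'")
    return name
```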
Deprecated Features
-------------------
- Passing a ``warnings`` or ``deprecations`` key to ``exit_json`` or ``fail_json`` is deprecated. Use ``AnsibleModule.warn`` or ``AnsibleModule.deprecate`` instead.
- plugins - Accessing plugins with ``_``-prefixed filenames without the ``_`` prefix is deprecated.
Bugfixes
--------
- Ansible now ensures predictable permissions on remote artifacts; previously it only ensured they were executable and relied on system masks for the rest.
- dnf5 - avoid generating excessive transaction entries in the dnf5 history (https://github.com/ansible/ansible/issues/85046)
v2.19.0b2
=========
Release Summary
---------------
| Release Date: 2025-04-24
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Minor Changes
-------------
- comment filter - Improve the error message shown when an invalid ``style`` argument is provided.
Bugfixes
--------
- Remove use of `required` parameter in `get_bin_path` which has been deprecated.
- ansible-doc - fix indentation for first line of descriptions of suboptions and sub-return values (https://github.com/ansible/ansible/pull/84690).
- ansible-doc - fix line wrapping for first line of description of options and return values (https://github.com/ansible/ansible/pull/84690).
v2.19.0b1
=========
Release Summary
---------------
| Release Date: 2025-04-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Major Changes
-------------
- Jinja plugins - Jinja builtin filter and test plugins are now accessible via their fully-qualified names ``ansible.builtin.{name}``.
- Task Execution / Forks - Forks no longer inherit stdio from the parent ``ansible-playbook`` process. ``stdout``, ``stderr``, and ``stdin`` within a worker are detached from the terminal and non-functional. Controller-side plugin code that needs to access stdio from a fork must use ``Display``.
- ansible-test - Packages beneath ``module_utils`` can now contain ``__init__.py`` files.
- variables - The type system underlying Ansible's variable storage has been significantly overhauled and formalized. Attempts to store unsupported Python object types in variables will now result in an error.
- variables - To support new Ansible features, many variable objects are now represented by subclasses of their respective native Python types. In most cases, they behave indistinguishably from their original types, but some Python libraries do not handle builtin object subclasses properly. Custom plugins that interact with such libraries may require changes to convert and pass the native types.
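The native-subclass point above can be demonstrated with a plain Python sketch; ``TaggedStr`` is an illustrative class, not the real ansible-core type.

```python
# Sketch: 2.19 variables may be subclasses of native types. TaggedStr is an
# illustrative stand-in for Ansible's tagged string values.
class TaggedStr(str):
    """Behaves like str for nearly all purposes."""


value = TaggedStr("web01")

# For most code the subclass is indistinguishable from str:
assert isinstance(value, str)
assert value.upper() == "WEB01"

# But libraries doing exact-type dispatch (type(x) is str) can misbehave:
assert type(value) is not str

# Converting to the native type before handing off avoids the problem:
native = str(value)
assert type(native) is str
```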
Minor Changes
-------------
- Added a -vvvvv log message indicating when a host fails to produce output within the timeout period.
- AnsibleModule.uri - Add option ``multipart_encoding`` for ``form-multipart`` files in body to change default base64 encoding for files
- INVENTORY_IGNORE_EXTS config - removed ``ini`` from the default list; inventory scripts using a corresponding .ini configuration are rare now, while inventory.ini files are more common. Those that need to ignore .ini files for inventory scripts can still add the extension to the configuration.
- Jinja plugins - Plugins can declare support for undefined values.
- Jinja2 version 3.1.0 or later is now required on the controller.
- Move ``follow_redirects`` parameter to module_utils so external modules can reuse it.
- PlayIterator - do not return tasks from already executed roles so specific strategy plugins do not have to do the filtering of such tasks themselves
- SSH Escalation-related -vvv log messages now include the associated host information.
- Windows - Add support for Windows Server 2025 to Ansible and as an ``ansible-test`` remote target - https://github.com/ansible/ansible/issues/84229
- Windows - refactor the async implementation to better handle errors during bootstrapping and avoid WMI when possible.
- ``ansible-galaxy collection install`` - the collection dependency resolver now prints the conflicts it hits when dependency resolution takes too long due to heavy backtracking, and displays suggestions on how to help it compute the result more quickly.
- ansible, ansible-console, ansible-pull - add --flush-cache option (https://github.com/ansible/ansible/issues/83749).
- ansible-galaxy - Add support for Keycloak service accounts
- ansible-galaxy - support ``resolvelib >= 0.5.3, < 2.0.0`` (https://github.com/ansible/ansible/issues/84217).
- ansible-test - Added a macOS 15.3 remote VM, replacing 14.3.
- ansible-test - Automatically retry HTTP GET/PUT/DELETE requests on exceptions.
- ansible-test - Default to Python 3.13 in the ``base`` and ``default`` containers.
- ansible-test - Disable the ``deprecated-`` prefixed ``pylint`` rules as their results vary by Python version.
- ansible-test - Disable the ``pep8`` sanity test rules ``E701`` and ``E704`` to improve compatibility with ``black``.
- ansible-test - Improve container runtime probe error handling. When unexpected probe output is encountered, an error with more useful debugging information is provided.
- ansible-test - Replace container Alpine 3.20 with 3.21.
- ansible-test - Replace container Fedora 40 with 41.
- ansible-test - Replace remote Alpine 3.20 with 3.21.
- ansible-test - Replace remote Fedora 40 with 41.
- ansible-test - Replace remote FreeBSD 13.3 with 13.5.
- ansible-test - Replace remote FreeBSD 14.1 with 14.2.
- ansible-test - Replace remote RHEL 9.4 with 9.5.
- ansible-test - Show a more user-friendly error message when a ``runme.sh`` script is not executable.
- ansible-test - The ``yamllint`` sanity test now enforces string values for the ``!vault`` tag.
- ansible-test - Update ``nios-test-container`` to version 7.0.0.
- ansible-test - Update ``pylint`` sanity test to use version 3.3.1.
- ansible-test - Update distro containers to remove unnecessary packages (apache2, subversion, ruby).
- ansible-test - Update sanity test requirements to latest available versions.
- ansible-test - Update the HTTP test container.
- ansible-test - Update the PyPI test container.
- ansible-test - Update the ``base`` and ``default`` containers.
- ansible-test - Update the utility container.
- ansible-test - Use Python's ``urllib`` instead of ``curl`` for HTTP requests.
- ansible-test - When detection of the current container network fails, a warning is now issued and execution continues. This simplifies usage in cases where the current container cannot be inspected, such as when running in GitHub Codespaces.
- ansible-test acme test container - bump `version to 2.3.0 <https://github.com/ansible/acme-test-container/releases/tag/2.3.0>`__ to include newer versions of Pebble, dependencies, and runtimes. This adds support for ACME profiles, ``dns-account-01`` support, and some smaller improvements (https://github.com/ansible/ansible/pull/84547).
- apt_key module - add notes to docs and errors to point at the CLI tool deprecation by Debian and alternatives
- apt_repository module - add notes to errors to point at the CLI tool deprecation by Debian and alternatives
- become plugins get a new property ``pipelining`` to show support, or lack thereof, for the feature.
- callback plugins - add has_option() to CallbackBase to match other functions overloaded from AnsiblePlugin
- callback plugins - fix get_options() for CallbackBase
- copy - fix sanity test failures (https://github.com/ansible/ansible/pull/83643).
- copy - parameter ``local_follow`` was incorrectly documented as having default value ``True`` (https://github.com/ansible/ansible/pull/83643).
- cron - Provide additional error information while writing cron file (https://github.com/ansible/ansible/issues/83223).
- csvfile - let the config system do the typecasting (https://github.com/ansible/ansible/pull/82263).
- display - Deduplication of warning and error messages considers the full content of the message (including source and traceback contexts, if enabled). This may result in fewer messages being omitted.
- distribution - Added openSUSE MicroOS to Suse OS family (#84685).
- dnf5, apt - add ``auto_install_module_deps`` option (https://github.com/ansible/ansible/issues/84206)
- docs - include the collection name in the deprecation message indicating from which collection a module is being deprecated (https://github.com/ansible/ansible/issues/84116).
- env lookup - The error message generated for a missing environment variable when ``default`` is an undefined value (e.g. ``undef('something')``) will contain the hint from that undefined value, except when the undefined value is the default of ``undef()`` with no arguments. Previously, any existing undefined hint would be ignored.
- file - enable file module to disable diff_mode (https://github.com/ansible/ansible/issues/80817).
- file - make code more readable and simple.
- filter - add support for URL-safe encoding and decoding in b64encode and b64decode (https://github.com/ansible/ansible/issues/84147).
- find - add a checksum_algorithm parameter to specify which type of checksum the module will return
- from_json filter - The filter accepts a ``profile`` argument, which defaults to ``tagless``.
- handlers - Templated handler names with syntax errors, or that resolve to ``omit`` are now skipped like handlers with undefined variables in their name.
- improved error message for yaml parsing errors in plugin documentation
- local connection plugin - A new ``become_strip_preamble`` config option (default True) was added; disable to preserve diagnostic ``become`` output in task results.
- local connection plugin - A new ``become_success_timeout`` operation-wide timeout config (default 10s) was added for ``become``.
- local connection plugin - When a ``become`` plugin's ``prompt`` value is a non-string after the ``check_password_prompt`` callback has completed, no prompt stripping will occur on stderr.
- lookup_template - add an option to trim blocks while templating (https://github.com/ansible/ansible/issues/75962).
- module - set ipv4 and ipv6 rules simultaneously in iptables module (https://github.com/ansible/ansible/issues/84404).
- module_utils - Add ``NoReturn`` type annotations to functions which never return.
- modules - PowerShell modules can now receive ``datetime.date``, ``datetime.time`` and ``datetime.datetime`` values as ISO 8601 strings.
- modules - PowerShell modules can now receive strings sourced from inline vault-encrypted strings.
- modules - Unhandled exceptions during Python module execution are now returned as structured data from the target. This allows the new traceback handling to be applied to exceptions raised on targets.
- pipelining logic has mostly moved to connection plugins so they can decide/override settings.
- plugin error handling - When raising exceptions in an exception handler, be sure to use ``raise ... from`` as appropriate. This supersedes the use of the ``AnsibleError`` arg ``orig_exc`` to represent the cause. Specifying ``orig_exc`` as the cause is still permitted. Failure to use ``raise ... from`` when ``orig_exc`` is set will result in a warning. Additionally, if the two cause exceptions do not match, a warning will be issued.
- removed hardcoding of the su plugin, as it now works with pipelining.
- runtime-metadata sanity test - improve validation of ``action_groups`` (https://github.com/ansible/ansible/pull/83965).
- service_facts module - added FreeBSD support.
- ssh connection plugin - Support ``SSH_ASKPASS`` mechanism to provide passwords, making it the default, but still offering an explicit choice to use ``sshpass`` (https://github.com/ansible/ansible/pull/83936)
- ssh connection plugin now overrides pipelining when a tty is requested.
- ssh-agent - ``ansible``, ``ansible-playbook`` and ``ansible-console`` are capable of spawning or reusing an ssh-agent, allowing plugins to interact with the ssh-agent. Additionally a pure python ssh-agent client has been added, enabling easy interaction with the agent. The ssh connection plugin contains new functionality via ``ansible_ssh_private_key`` and ``ansible_ssh_private_key_passphrase``, for loading an SSH private key into the agent from a variable.
- templating - Access to an undefined variable from inside a lookup, filter, or test (which raises MarkerError) no longer ends processing of the current template. The triggering undefined value is returned as the result of the offending plugin invocation, and the template continues to execute.
- templating - Embedding ``range()`` values in containers such as lists will result in an error on use. Previously the value would be converted to a string representing the range parameters, such as ``range(0, 3)``.
- templating - Handling of omitted values is now a first-class feature of the template engine, and is usable in all Ansible Jinja template contexts. Any template that resolves to ``omit`` is automatically removed from its parent container during templating.
- templating - Template evaluation is lazier than in previous versions. Template expressions which resolve only portions of a data structure no longer result in the entire structure being templated.
- templating - Templating errors now provide more information about both the location and context of the error, especially for deeply-nested and/or indirected templating scenarios.
- templating - Unified ``omit`` behavior now requires that plugins calling ``Templar.template()`` handle cases where the entire template result is omitted, by catching the ``AnsibleValueOmittedError`` that is raised. Previously, this condition caused a randomly-generated string marker to appear in the template result.
- templating - Variables of type ``set`` and ``tuple`` are now converted to ``list`` when exiting the final pass of templating.
- to_json / to_nice_json filters - The filters accept a ``profile`` argument, which defaults to ``tagless``.
- troubleshooting - Tracebacks can be collected and displayed for most errors, warnings, and deprecation warnings (including those generated by modules). Tracebacks are no longer enabled with ``-vvv``; the behavior is directly configurable via the ``DISPLAY_TRACEBACK`` config option. Module tracebacks passed to ``fail_json`` via the ``exception`` kwarg will not be included in the task result unless error tracebacks are configured.
- undef jinja function - The ``undef`` jinja function now raises an error if a non-string hint is given. Attempting to use an undefined hint also results in an error, ensuring incorrect use of the function can be distinguished from the function's normal behavior.
- validate-modules sanity test - make sure that ``module`` and ``plugin`` ``seealso`` entries use FQCNs (https://github.com/ansible/ansible/pull/84325).
- vault - improved vault filter documentation by adding missing example content for dump_template_data.j2, refining examples for clarity, and ensuring variable consistency (https://github.com/ansible/ansible/issues/83583).
- warnings - All warnings (including deprecation warnings) issued during a task's execution are now accessible via the ``warnings`` and ``deprecations`` keys on the task result.
- when the ``dict`` lookup is given a non-dict argument, show the value of the argument and its type in the error message.
- windows - add a hard minimum PowerShell version requirement of 5.1. Ansible dropped support for older versions of PowerShell in the 2.16 release, but this requirement is now enforced at runtime.
- windows - refactor windows exec runner to improve efficiency and add better error reporting on failures.
- winrm - Remove need for pexpect on macOS hosts when using ``kinit`` to retrieve the Kerberos TGT. By default the code will now only use the builtin ``subprocess`` library which should handle issues with select and a high fd count and also simplify the code.
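The URL-safe base64 support mentioned above for ``b64encode``/``b64decode`` corresponds to Python's stdlib ``urlsafe`` variants. The sketch below shows only the underlying encoding difference, not the filter's actual parameter name.

```python
import base64

# Bytes chosen so the standard alphabet's '+' and '/' characters appear.
data = b"\xfb\xff\xfe"

standard = base64.b64encode(data).decode()           # alphabet uses '+' and '/'
urlsafe = base64.urlsafe_b64encode(data).decode()    # swaps in '-' and '_'

assert standard == "+//+"
assert urlsafe == "-__-"
assert base64.urlsafe_b64decode(urlsafe) == data     # round-trips cleanly
```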
Breaking Changes / Porting Guide
--------------------------------
- Support for the ``toml`` library has been removed from TOML inventory parsing and dumping. Use ``tomli`` for parsing on Python 3.10. Python 3.11 and later have built-in support for parsing. Use ``tomli-w`` to support outputting inventory in TOML format.
- assert - The ``quiet`` argument must be a commonly-accepted boolean value. Previously, unrecognized values were silently treated as False.
- callback plugins - The structure of the ``exception``, ``warnings`` and ``deprecations`` values visible to callbacks has changed. Callbacks that inspect or serialize these values may require special handling.
- conditionals - Conditional expressions that result in non-boolean values are now an error by default. Such results often indicate unintentional use of templates where they are not supported, resulting in a conditional that is always true. When this option is enabled, conditional expressions which are a literal ``None`` or empty string will evaluate as true, for backwards compatibility. The error can be temporarily changed to a deprecation warning by enabling the ``ALLOW_BROKEN_CONDITIONALS`` config option.
- first_found lookup - When specifying ``files`` or ``paths`` as a templated list containing undefined values, the undefined list elements will be discarded with a warning. Previously, the entire list would be discarded without any warning.
- internals - The ``AnsibleLoader`` and ``AnsibleDumper`` classes for working with YAML are now factory functions and cannot be extended.
- internals - The ``ansible.utils.native_jinja`` Python module has been removed.
- inventory - Invalid variable names provided by inventories result in an inventory parse failure. This behavior is now consistent with other variable name usages throughout Ansible.
- lookup plugins - Lookup plugins called as ``with_(lookup)`` will no longer have the ``_subdir`` attribute set.
- lookup plugins - ``terms`` will always be passed to ``run`` as the first positional arg, where previously it was sometimes passed as a keyword arg when using ``with_`` syntax.
- loops - Omit placeholders no longer leak between loop item templating and task templating. Previously, ``omit`` placeholders could remain embedded in loop items after templating and be used as an ``omit`` for task templating. Now, values resolving to ``omit`` are dropped immediately when loop items are templated. To turn missing values into an ``omit`` for task templating, use ``| default(omit)``. This solution is backwards compatible with previous versions of ansible-core.
- modules - Ansible modules using ``sys.excepthook`` must use a standard ``try/except`` instead.
- plugins - Any plugin that sources or creates templates must properly tag them as trusted.
- plugins - Custom Jinja plugins that accept undefined top-level arguments must opt in to receiving them.
- plugins - Custom Jinja plugins that use ``environment.getitem`` to retrieve undefined values will now trigger a ``MarkerError`` exception. This exception must be handled to allow the plugin to return a ``Marker``, or the plugin must opt-in to accepting ``Marker`` values.
- public API - The ``ansible.vars.fact_cache.FactCache`` wrapper has been removed.
- serialization of ``omit`` sentinel - Serialization of variables containing ``omit`` sentinels (e.g., by the ``to_json`` and ``to_yaml`` filters or ``ansible-inventory``) will fail if the variable has not completed templating. Previously, serialization succeeded with placeholder strings emitted in the serialized output.
- set_fact - The string values "yes", "no", "true" and "false" were previously converted (ignoring case) to boolean values when not using Jinja2 native mode. Since Jinja2 native mode is always used, this conversion no longer occurs. When boolean values are required, native boolean syntax should be used where variables are defined, such as in YAML. When native boolean syntax is not an option, the ``bool`` filter can be used to parse string values into booleans.
- template lookup - The ``convert_data`` option is deprecated and no longer has any effect. Use the ``from_json`` filter on the lookup result instead.
- templating - Access to ``_`` prefixed attributes and methods, and methods with known side effects, is no longer permitted. In cases where a matching mapping key is present, the associated value will be returned instead of an error. This increases template environment isolation and ensures more consistent behavior between the ``.`` and ``[]`` operators.
- templating - Conditionals and lookups which use embedded inline templates in Jinja string constants now display a warning. These templates should be converted to their expression equivalent.
- templating - Many Jinja plugins (filters, lookups, tests) and methods previously silently ignored undefined inputs, which often masked subtle errors. Passing an undefined argument to a Jinja plugin or method that does not declare undefined support now results in an undefined value.
- templating - Templates are always rendered in Jinja2 native mode. As a result, non-string values are no longer automatically converted to strings.
- templating - Templates resulting in ``None`` are no longer automatically converted to an empty string.
- templating - Templates with embedded inline templates that were not contained within a Jinja string constant now result in an error, as support for multi-pass templating was removed for security reasons. In most cases, such templates can be easily rewritten to avoid the use of embedded inline templates.
- templating - The ``allow_unsafe_lookups`` option no longer has any effect. Lookup plugins are responsible for tagging strings containing templates to allow evaluation as a template.
- templating - The result of the ``range()`` global function cannot be returned from a template; it should always be passed to a filter (e.g., ``random``). Previously, range objects returned from an intermediate template were always converted to a list, which is inconsistent with inline consumption of range objects.
- templating - ``#jinja2:`` overrides in templates with invalid override names or types are now templating errors.
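
As an illustrative sketch of the ``set_fact`` change above (variable names are hypothetical): quoted boolean-like strings are no longer coerced under Jinja2 native mode, so use native YAML booleans or an explicit ``bool`` filter:

```yaml
# Before 2.19, "yes" was coerced to boolean True; it now stays the string "yes".
- set_fact:
    enabled: "yes"

# Preferred: native YAML boolean syntax, or an explicit bool filter.
- set_fact:
    enabled: true
    enabled_from_string: "{{ user_supplied_flag | bool }}"  # user_supplied_flag is hypothetical
```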
Deprecated Features
-------------------
- CLI - The ``--inventory-file`` option alias is deprecated. Use the ``-i`` or ``--inventory`` option instead.
- Strategy plugins - Use of strategy plugins not provided in ``ansible.builtin`` is deprecated and carries no backwards compatibility guarantees going forward. A future release will remove the ability to use external strategy plugins. No alternative for third-party strategy plugins is currently planned.
- ``ansible.module_utils.compat.datetime`` - The datetime compatibility shims are now deprecated and are scheduled to be removed in ``ansible-core`` v2.21. This includes ``UTC``, ``utcfromtimestamp()`` and ``utcnow()`` importable from said module (https://github.com/ansible/ansible/pull/81874).
- bool filter - Support for coercing unrecognized input values (including None) has been deprecated. Consult the filter documentation for acceptable values, or consider use of the ``truthy`` and ``falsy`` tests.
- cache plugins - The ``ansible.plugins.cache.base`` Python module is deprecated. Use ``ansible.plugins.cache`` instead.
- callback plugins - The ``v2_on_any`` callback method is deprecated. Use specific callback methods instead.
- callback plugins - The v1 callback API (callback methods not prefixed with ``v2_``) is deprecated. Use ``v2_``-prefixed methods instead.
- conditionals - Conditionals using Jinja templating delimiters (e.g., ``{{``, ``{%``) should be rewritten as expressions without delimiters, unless the entire conditional value is a single template that resolves to a trusted string expression. This is useful for dynamic indirection of conditional expressions, but is limited to trusted literal string expressions.
- config - The ``ACTION_WARNINGS`` config has no effect. It previously disabled command warnings, which have since been removed.
- config - The ``DEFAULT_JINJA2_NATIVE`` option has no effect. Jinja2 native mode is now the default and only option.
- config - The ``DEFAULT_NULL_REPRESENTATION`` option has no effect. Null values are no longer automatically converted to another value during templating of single variable references.
- display - The ``Display.get_deprecation_message`` method has been deprecated. Call ``Display.deprecated`` to display a deprecation message, or call it with ``removed=True`` to raise an ``AnsibleError``.
- file loading - Loading text files with ``DataLoader`` containing data that cannot be decoded under the expected encoding is deprecated. In most cases the encoding must be UTF-8, although some plugins allow choosing a different encoding. Previously, invalid data was silently wrapped in Unicode surrogate escape sequences, often resulting in later errors or other data corruption.
- first_found lookup - Splitting of file paths on ``,;:`` is deprecated. Pass a list of paths instead. The ``split`` method on strings can be used to split variables into a list as needed.
- interpreter discovery - The ``auto_legacy`` and ``auto_legacy_silent`` options for ``INTERPRETER_PYTHON`` are deprecated. Use ``auto`` or ``auto_silent`` options instead, as they have the same effect.
- oneline callback - The ``oneline`` callback and its associated ad-hoc CLI args (``-o``, ``--one-line``) are deprecated.
- paramiko - The paramiko connection plugin has been deprecated with planned removal in 2.21.
- playbook variables - The ``play_hosts`` variable has been deprecated; use ``ansible_play_batch`` instead.
- plugin error handling - The ``AnsibleError`` constructor arg ``suppress_extended_error`` is deprecated. Using ``suppress_extended_error=True`` has the same effect as ``show_content=False``.
- plugins - The ``listify_lookup_plugin_terms`` function is obsolete and in most cases no longer needed.
- template lookup - The ``jinja2_native`` option is no longer used in the Ansible Core code base. Jinja2 native mode is now the default and only option.
- templating - Support for enabling Jinja2 extensions (not plugins) has been deprecated.
- templating - The ``ansible_managed`` variable available in certain templating scenarios, such as the ``template`` action and ``template`` lookup, has been deprecated. Define and use a custom variable instead of relying on ``ansible_managed``.
- templating - The ``disable_lookups`` option has no effect, since plugins must be updated to apply trust before any templating can be performed.
- to_yaml/to_nice_yaml filters - Implicit YAML dumping of vaulted value ciphertext is deprecated. Set ``dump_vault_tags`` to explicitly specify the desired behavior.
- tree callback - The ``tree`` callback and its associated ad-hoc CLI args (``-t``, ``--tree``) are deprecated.
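
A minimal sketch of the ``first_found`` deprecation above (file and variable names are hypothetical); pass a list instead of relying on implicit splitting:

```yaml
# Deprecated: a single string implicitly split on ",;:".
- debug:
    msg: "{{ lookup('first_found', 'one.yml,two.yml') }}"

# Preferred: pass a list, splitting string variables explicitly where needed.
- debug:
    msg: "{{ lookup('first_found', ['one.yml', 'two.yml']) }}"
- debug:
    msg: "{{ lookup('first_found', path_csv.split(',')) }}"  # path_csv is hypothetical
```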
Removed Features (previously deprecated)
----------------------------------------
- Remove deprecated plural form of collection path (https://github.com/ansible/ansible/pull/84156).
- Removed deprecated ``STRING_CONVERSION_ACTION`` (https://github.com/ansible/ansible/issues/84220).
- encrypt - passing unsupported passlib hashtype now raises AnsibleFilterError.
- manager - remove deprecated include_delegate_to parameter from get_vars API.
- modules - Modules returning non-UTF8 strings now result in an error. The ``MODULE_STRICT_UTF8_RESPONSE`` setting can be used to disable this check.
- removed deprecated pycompat24 and compat.importlib.
- selector - remove deprecated compat.selector related files (https://github.com/ansible/ansible/pull/84155).
- windows - removed common module functions ``ConvertFrom-AnsibleJson``, ``Format-AnsibleException`` from Windows modules as they are not used and add unneeded complexity to the code.
Security Fixes
--------------
- include_vars action - Ensure that result masking is correctly requested when vault-encrypted files are read. (CVE-2024-8775)
- task result processing - Ensure that action-sourced result masking (``_ansible_no_log=True``) is preserved. (CVE-2024-8775)
- templating - Ansible's template engine no longer processes Jinja templates in strings unless they are marked as coming from a trusted source. Untrusted strings containing Jinja template markers are ignored with a warning. Examples of trusted sources include playbooks, vars files, and many inventory sources. Examples of untrusted sources include module results and facts. Plugins which have not been updated to preserve trust while manipulating strings may inadvertently cause them to lose their trusted status.
- templating - Changes to conditional expression handling removed numerous instances of insecure multi-pass templating (which could result in execution of untrusted template expressions).
- user action won't allow ssh-keygen, chown and chmod to run on existing ssh public key file, avoiding traversal on existing symlinks (CVE-2024-9902).
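
As a hedged illustration of the trust model described above (variable names are hypothetical): strings from playbooks and vars files are trusted, while template markers arriving via module results or facts are left unevaluated with a warning:

```yaml
- hosts: all
  vars:
    trusted: "{{ inventory_hostname }}"   # defined in a playbook: trusted, rendered normally
  tasks:
    - debug:
        msg: "{{ trusted }}"
    # If a registered result or fact value itself contains "{{ ... }}" markers,
    # referencing it does not evaluate the embedded template; the markers stay literal.
    - debug:
        msg: "{{ some_untrusted_fact }}"  # some_untrusted_fact is hypothetical
```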
Bugfixes
--------
- Ansible will now also warn when reserved keywords are set via a module (set_fact, include_vars, etc).
- Ansible.Basic - Fix ``required_if`` check when the option value to check is unset or set to null.
- Correctly return ``False`` when using the ``filter`` and ``test`` Jinja tests on plugin names which are not filters or tests, respectively. (resolves issue https://github.com/ansible/ansible/issues/82084)
- Do not run implicit ``flush_handlers`` meta tasks when the whole play is excluded from the run due to tags specified.
- Errors now preserve stacked error messages even when YAML is involved.
- Fix a ``display.debug`` statement with the wrong param in the ``_get_diff_data()`` method
- Fix disabling SSL verification when installing collections and roles from git repositories. If ``--ignore-certs`` isn't provided, the value for the ``GALAXY_IGNORE_CERTS`` configuration option will be used (https://github.com/ansible/ansible/issues/83326).
- Fix ipv6 pattern bug in lib/ansible/parsing/utils/addresses.py (https://github.com/ansible/ansible/issues/84237)
- Fix returning 'unreachable' for the overall task result. This prevents false positives when a looped task has unignored unreachable items (https://github.com/ansible/ansible/issues/84019).
- Implicit ``meta: flush_handlers`` tasks now have a parent block to prevent potential tracebacks when calling methods like ``get_play()`` on them internally.
- Improve performance on large inventories by reducing the number of implicit meta tasks.
- Jinja plugins - Errors raised will always be derived from ``AnsibleTemplatePluginError``.
- Optimize the way tasks from within ``include_tasks``/``include_role`` are inserted into the play.
- Timing out while waiting on ``become`` is now treated as an unreachable error (https://github.com/ansible/ansible/issues/84468)
- Use consistent multiprocessing context for action write locks
- Use the requested error message in the ansible.module_utils.facts.timeout timeout function instead of hardcoding one.
- Windows - add support for running on system where WDAC is in audit mode with ``Dynamic Code Security`` enabled.
- YAML parsing - The `!unsafe` tag no longer coerces non-string scalars to strings.
- ``ansible-galaxy`` - the collection dependency resolver now treats version specifiers starting with ``!=`` as unpinned.
- ``package``/``dnf`` action plugins - provide the reason behind the failure to gather the ``ansible_pkg_mgr`` fact to identify the package backend
- action plugins - Action plugins that raise unhandled exceptions no longer terminate playbook loops. Previously, exceptions raised by an action plugin caused abnormal loop termination and loss of loop iteration results.
- ansible-config - format galaxy server configs while dumping in JSON format (https://github.com/ansible/ansible/issues/84840).
- ansible-doc - If none of the files in ``files`` exists, ``path`` will be undefined and a direct reference will raise an ``UnboundLocalError`` (https://github.com/ansible/ansible/pull/84464).
- ansible-galaxy - Small adjustments to URL building for ``download_url`` and relative redirects.
- ansible-pull change detection will now work independently of callback or result format settings.
- ansible-test - Enable the ``sys.unraisablehook`` work-around for the ``pylint`` sanity test on Python 3.11. Previously the work-around was only enabled for Python 3.12 and later. However, the same issue has been discovered on Python 3.11.
- ansible-test - Ensure CA certificates are installed on managed FreeBSD instances.
- ansible-test - Fix support for PowerShell module_util imports with the ``-Optional`` flag.
- ansible-test - Fix support for detecting PowerShell modules importing module utils with the newer ``#AnsibleRequires`` format.
- ansible-test - Fix traceback that occurs after an interactive command fails.
- ansible-test - Fix up coverage reporting to properly translate the temporary path of integration test modules to the expected static test module path.
- ansible-test - Fixed traceback when handling certain YAML errors in the ``yamllint`` sanity test.
- ansible-test - Managed macOS instances now use the ``sudo_chdir`` option for the ``sudo`` become plugin to avoid permission errors when dropping privileges.
- ansible-vault will now correctly handle ``--prompt``; previously it would issue an error about stdin if no second argument was passed
- ansible_uptime_seconds - added ``ansible_uptime_seconds`` fact support for AIX (https://github.com/ansible/ansible/pull/84321).
- apt_key module - prevent tests from running when apt-key was removed
- base.yml - deprecated libvirt_lxc_noseclabel config.
- build - Pin ``wheel`` in ``pyproject.toml`` to ensure compatibility with supported ``setuptools`` versions.
- config - various fixes to config lookup plugin (https://github.com/ansible/ansible/pull/84398).
- copy - refactor copy module for simplicity.
- copy action now prevents the user from setting internal options.
- debconf - set empty password values (https://github.com/ansible/ansible/issues/83214).
- debug - hide loop vars in debug var display (https://github.com/ansible/ansible/issues/65856).
- default callback - Error context is now shown for failing tasks that use the ``debug`` action.
- display - The ``Display.deprecated`` method once again properly handles the ``removed=True`` argument (https://github.com/ansible/ansible/issues/82358).
- distro - add support for Linux Mint Debian Edition (LMDE) (https://github.com/ansible/ansible/issues/84934).
- distro - detect Debian as os_family for LMDE 6 (https://github.com/ansible/ansible/issues/84934).
- dnf5 - Handle forwarded exceptions from dnf5-5.2.13 where a generic ``RuntimeError`` was previously raised
- dnf5 - fix ``is_installed`` check for packages that are not installed but listed as provided by an installed package (https://github.com/ansible/ansible/issues/84578)
- dnf5 - fix installing a package using ``state=latest`` when a binary of the same name as the package is already installed (https://github.com/ansible/ansible/issues/84259)
- dnf5 - fix traceback when ``enable_plugins``/``disable_plugins`` is used on ``python3-libdnf5`` versions that do not support this functionality
- dnf5 - libdnf5 - use ``conf.pkg_gpgcheck`` instead of deprecated ``conf.gpgcheck`` which is used only as a fallback
- dnf5 - matching on a binary can be achieved only by specifying a full path (https://github.com/ansible/ansible/issues/84334)
- facts - gather pagesize and calculate respective values depending upon architecture (https://github.com/ansible/ansible/issues/84773).
- facts - skip if distribution file path is directory, instead of raising error (https://github.com/ansible/ansible/issues/84006).
- find - skip the ENOENT error code while recursively enumerating files. The find module is now tolerant of race conditions that remove files or directories from the target it is currently inspecting (https://github.com/ansible/ansible/issues/84873).
- first_found lookup - Corrected return value documentation to reflect None (not empty string) for no files found.
- gather_facts action now defaults to ``ansible.legacy.setup`` if ``smart`` was set, no network OS was found and no other alias for ``setup`` was present.
- gather_facts action will now issue errors and warnings as appropriate if a network OS is detected but no facts modules are defined for it.
- gather_facts action will now add ``setup`` when ``smart`` appears with other modules in the ``FACTS_MODULES`` setting (#84750).
- get_url - add support for BSD-style checksum digest file (https://github.com/ansible/ansible/issues/84476).
- get_url - fix honoring ``filename`` from the ``content-disposition`` header even when the type is ``inline`` (https://github.com/ansible/ansible/issues/83690)
- host_group_vars - fixed defining the 'key' variable if the get_vars method is called with cache=False (https://github.com/ansible/ansible/issues/84384)
- include_vars - fix including previously undefined hash variables with hash_behaviour merge (https://github.com/ansible/ansible/issues/84295).
- iptables - Allows the wait parameter to be used with iptables chain creation (https://github.com/ansible/ansible/issues/84490)
- linear strategy - fix executing ``end_role`` meta tasks for each host, instead of handling these as implicit run_once tasks (https://github.com/ansible/ansible/issues/84660).
- local connection plugin - Become timeout errors now include all received data. Previously, the most recently-received data was discarded.
- local connection plugin - Ensure ``become`` success validation always occurs, even when an active plugin does not set ``prompt``.
- local connection plugin - Fixed cases where the internal ``BECOME-SUCCESS`` message appeared in task output.
- local connection plugin - Fixed hang or spurious failure when data arrived concurrently on stdout and stderr during a successful ``become`` operation validation.
- local connection plugin - Fixed hang when a become plugin expects a prompt but a password was not provided.
- local connection plugin - Fixed hang when an active become plugin incorrectly signals lack of prompt.
- local connection plugin - Fixed hang when an internal become read timeout expired before the password prompt was written.
- local connection plugin - Fixed hang when only one of stdout or stderr was closed by the ``become_exe`` subprocess.
- local connection plugin - Fixed long timeout/hang for ``become`` plugins that repeat their prompt on failure (e.g., ``sudo``, some ``su`` implementations).
- local connection plugin - Fixed silent ignore of ``become`` failures and loss of task output when data arrived concurrently on stdout and stderr during ``become`` operation validation.
- local connection plugin - Fixed task output header truncation when post-become data arrived before ``become`` operation validation had completed.
- lookup plugins - The ``terms`` arg to the ``run`` method is now always a list. Previously, there were cases where a non-list could be received.
- module arg templating - When using a templated raw task arg and a templated ``args`` keyword, args are now merged. Previously use of templated raw task args silently ignored all values from the templated ``args`` keyword.
- module defaults - Module defaults are no longer templated unless they are used by a task that does not override them. Previously, all module defaults for all modules were templated for every task.
- module respawn - limit to supported Python versions
- omitting task args - Use of omit for task args now properly falls back to args of lower precedence, such as module defaults. Previously an omitted value would obliterate values of lower precedence.
- package_facts module when using ``auto`` will return the first package manager found that provides output, instead of just the first one found, as that one may be foreign to the system and report no packages.
- psrp - Improve stderr parsing when running raw commands that emit error records or stderr lines.
- regex_search filter - Corrected return value documentation to reflect None (not empty string) for no match.
- respawn - use copy of env variables to update existing PYTHONPATH value (https://github.com/ansible/ansible/issues/84954).
- runas become - Fix up become logic to still get the SYSTEM token with the most privileges when running as SYSTEM.
- sequence lookup - sequence query/lookups without positional arguments now return a valid list if their kwargs comprise a valid sequence expression (https://github.com/ansible/ansible/issues/82921).
- service_facts - skip lines which do not contain service names in openrc output (https://github.com/ansible/ansible/issues/84512).
- ssh - Improve the logic for parsing CLIXML data in stderr when working with Windows hosts. This fixes issues when the raw stderr contains invalid UTF-8 byte sequences and improves handling of embedded CLIXML sequences.
- ssh - Raise exception when sshpass returns error code (https://github.com/ansible/ansible/issues/58133).
- ssh - connection options were incorrectly templated during ``reset_connection`` tasks (https://github.com/ansible/ansible/pull/84238).
- stability - Fixed silent process failure on unhandled IOError/OSError under ``linear`` strategy.
- su become plugin - Ensure generated regex from ``prompt_l10n`` config values is properly escaped.
- su become plugin - Ensure that password prompts are correctly detected in the presence of leading output. Previously, this case resulted in a timeout or hang.
- su become plugin - Ensure that trailing colon is expected on all ``prompt_l10n`` config values.
- sudo become plugin - The ``sudo_chdir`` config option allows the current directory to be set to the specified value before executing sudo to avoid permission errors when dropping privileges.
- sunos - remove hard coding of virtinfo command in facts gathering code (https://github.com/ansible/ansible/pull/84357).
- to_yaml/to_nice_yaml filters - Eliminated possibility of keyword arg collisions with internally-set defaults.
- unarchive - Clamp timestamps from beyond y2038 to representable values when unpacking zip files on platforms that use 32-bit time_t (e.g., Debian i386).
- uri - Form location correctly when the server returns a relative redirect (https://github.com/ansible/ansible/issues/84540)
- uri - Handle HTTP exceptions raised while reading the content (https://github.com/ansible/ansible/issues/83794).
- uri - mark ``url`` as required (https://github.com/ansible/ansible/pull/83642).
- user - Create Buildroot subclass as alias to Busybox (https://github.com/ansible/ansible/issues/83665).
- user - Set timeout for passphrase interaction.
- user - Update prompt for SSH key passphrase (https://github.com/ansible/ansible/issues/84484).
- user - Use higher precedence HOME_MODE as UMASK for path provided (https://github.com/ansible/ansible/pull/84482).
- user action will now require ``force`` to overwrite the public part of an ssh key when generating ssh keys, as was already the case for the private part.
- user module now avoids changing ownership of files symlinked in the provided home directory skeleton
- vars lookup - The ``default`` substitution only applies when trying to look up a variable which is not defined. If the variable is defined, but templates to an undefined value, the ``default`` substitution will not apply. Use the ``default`` filter to coerce those values instead.
- wait_for_connection - a warning was displayed if any hosts used a local connection (https://github.com/ansible/ansible/issues/84419)
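
A brief sketch of the ``vars`` lookup entry above (variable names are hypothetical):

```yaml
# The 'default' option applies only when the named variable is not defined at all.
- debug:
    msg: "{{ lookup('vars', 'possibly_missing_var', default='fallback') }}"

# If the variable is defined but templates to an undefined value, coerce the
# result with the default filter instead.
- debug:
    msg: "{{ lookup('vars', 'defined_var') | default('fallback') }}"
```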
Known Issues
------------
- templating - Any string value starting with ``#jinja2:`` which is templated will always be interpreted as Jinja2 configuration overrides. To include this literal value at the start of a string, a space or other character must precede it.
- variables - Tagged values cannot be used for dictionary keys in many circumstances.
- variables - The values ``None``, ``True`` and ``False`` cannot be tagged because they are singletons. Attempts to apply tags to these values will be silently ignored.
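
A minimal sketch of the ``#jinja2:`` known issue above (keys and strings are hypothetical):

```yaml
# A templated string starting with "#jinja2:" is parsed as configuration overrides.
literal_bad: "#jinja2: this is treated as an override header"

# Prefixing any character (here, a leading space) preserves the literal value.
literal_ok: " #jinja2: this renders as plain text"
```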

@ -1,2 +1,913 @@
ancestor: 2.18.0 ancestor: 2.18.0
releases: {} releases:
2.19.0b1:
changes:
breaking_changes:
- Support for the ``toml`` library has been removed from TOML inventory parsing
and dumping. Use ``tomli`` for parsing on Python 3.10. Python 3.11 and later
have built-in support for parsing. Use ``tomli-w`` to support outputting inventory
in TOML format.
- assert - The ``quiet`` argument must be a commonly-accepted boolean value.
Previously, unrecognized values were silently treated as False.
- callback plugins - The structure of the ``exception``, ``warnings`` and ``deprecations``
values visible to callbacks has changed. Callbacks that inspect or serialize
these values may require special handling.
- conditionals - Conditional expressions that result in non-boolean values are
now an error by default. Such results often indicate unintentional use of
templates where they are not supported, resulting in a conditional that is
always true. When this option is enabled, conditional expressions which are
a literal ``None`` or empty string will evaluate as true, for backwards compatibility.
The error can be temporarily changed to a deprecation warning by enabling
the ``ALLOW_BROKEN_CONDITIONALS`` config option.
- first_found lookup - When specifying ``files`` or ``paths`` as a templated
list containing undefined values, the undefined list elements will be discarded
with a warning. Previously, the entire list would be discarded without any
warning.
- internals - The ``AnsibleLoader`` and ``AnsibleDumper`` classes for working
with YAML are now factory functions and cannot be extended.
- internals - The ``ansible.utils.native_jinja`` Python module has been removed.
- inventory - Invalid variable names provided by inventories result in an inventory
parse failure. This behavior is now consistent with other variable name usages
throughout Ansible.
- lookup plugins - Lookup plugins called as `with_(lookup)` will no longer have
the `_subdir` attribute set.
- lookup plugins - ``terms`` will always be passed to ``run`` as the first positional
arg, where previously it was sometimes passed as a keyword arg when using
``with_`` syntax.
- loops - Omit placeholders no longer leak between loop item templating and
task templating. Previously, ``omit`` placeholders could remain embedded in
loop items after templating and be used as an ``omit`` for task templating.
Now, values resolving to ``omit`` are dropped immediately when loop items
are templated. To turn missing values into an ``omit`` for task templating,
use ``| default(omit)``. This solution is backwards compatible with previous
versions of ansible-core.
- modules - Ansible modules using ``sys.excepthook`` must use a standard ``try/except``
instead.
- plugins - Any plugin that sources or creates templates must properly tag them
as trusted.
- plugins - Custom Jinja plugins that accept undefined top-level arguments must
opt in to receiving them.
- plugins - Custom Jinja plugins that use ``environment.getitem`` to retrieve
undefined values will now trigger a ``MarkerError`` exception. This exception
must be handled to allow the plugin to return a ``Marker``, or the plugin
must opt-in to accepting ``Marker`` values.
- public API - The ``ansible.vars.fact_cache.FactCache`` wrapper has been removed.
- serialization of ``omit`` sentinel - Serialization of variables containing
``omit`` sentinels (e.g., by the ``to_json`` and ``to_yaml`` filters or ``ansible-inventory``)
will fail if the variable has not completed templating. Previously, serialization
succeeded with placeholder strings emitted in the serialized output.
- set_fact - The string values "yes", "no", "true" and "false" were previously
converted (ignoring case) to boolean values when not using Jinja2 native mode.
Since Jinja2 native mode is always used, this conversion no longer occurs.
When boolean values are required, native boolean syntax should be used where
variables are defined, such as in YAML. When native boolean syntax is not
an option, the ``bool`` filter can be used to parse string values into booleans.
- template lookup - The ``convert_data`` option is deprecated and no longer
has any effect. Use the ``from_json`` filter on the lookup result instead.
- templating - Access to ``_`` prefixed attributes and methods, and methods
with known side effects, is no longer permitted. In cases where a matching
mapping key is present, the associated value will be returned instead of an
error. This increases template environment isolation and ensures more consistent
behavior between the ``.`` and ``[]`` operators.
- templating - Conditionals and lookups which use embedded inline templates
in Jinja string constants now display a warning. These templates should be
converted to their expression equivalent.
- templating - Many Jinja plugins (filters, lookups, tests) and methods previously
silently ignored undefined inputs, which often masked subtle errors. Passing
an undefined argument to a Jinja plugin or method that does not declare undefined
support now results in an undefined value.
- templating - Templates are always rendered in Jinja2 native mode. As a result,
non-string values are no longer automatically converted to strings.
- templating - Templates resulting in ``None`` are no longer automatically converted
to an empty string.
- templating - Templates with embedded inline templates that were not contained
within a Jinja string constant now result in an error, as support for multi-pass
templating was removed for security reasons. In most cases, such templates
can be easily rewritten to avoid the use of embedded inline templates.
- templating - The ``allow_unsafe_lookups`` option no longer has any effect.
Lookup plugins are responsible for tagging strings containing templates to
allow evaluation as a template.
- templating - The result of the ``range()`` global function cannot be returned
from a template- it should always be passed to a filter (e.g., ``random``).
Previously, range objects returned from an intermediate template were always
converted to a list, which is inconsistent with inline consumption of range
objects.
- templating - ``#jinja2:`` overrides in templates with invalid override names
or types are now templating errors.
bugfixes:
- Ansible will now also warn when reserved keywords are set via a module (set_fact,
include_vars, etc).
- Ansible.Basic - Fix ``required_if`` check when the option value to check is
unset or set to null.
- Correctly return ``False`` when using the ``filter`` and ``test`` Jinja tests
on plugin names which are not filters or tests, respectively. (resolves issue
https://github.com/ansible/ansible/issues/82084)
- Do not run implicit ``flush_handlers`` meta tasks when the whole play is excluded
from the run due to tags specified.
- Errors now preserve stacked error messages even when YAML is involved.
- Fix a ``display.debug`` statement that passed the wrong parameter in the ``_get_diff_data()``
method.
- Fix disabling SSL verification when installing collections and roles from
git repositories. If ``--ignore-certs`` isn't provided, the value for the
``GALAXY_IGNORE_CERTS`` configuration option will be used (https://github.com/ansible/ansible/issues/83326).
- Fix ipv6 pattern bug in lib/ansible/parsing/utils/addresses.py (https://github.com/ansible/ansible/issues/84237)
- Fix returning 'unreachable' for the overall task result. This prevents false
positives when a looped task has unignored unreachable items (https://github.com/ansible/ansible/issues/84019).
- 'Implicit ``meta: flush_handlers`` tasks now have a parent block to prevent
potential tracebacks when calling methods like ``get_play()`` on them internally.'
- Improve performance on large inventories by reducing the number of implicit
meta tasks.
- Jinja plugins - Errors raised will always be derived from ``AnsibleTemplatePluginError``.
- Optimize the way tasks from within ``include_tasks``/``include_role`` are
inserted into the play.
- Timing out while waiting on ``become`` is now treated as an unreachable error
(https://github.com/ansible/ansible/issues/84468)
- Use consistent multiprocessing context for action write locks
- Use the requested error message in the ansible.module_utils.facts.timeout
timeout function instead of hardcoding one.
- Windows - add support for running on systems where WDAC is in audit mode with
``Dynamic Code Security`` enabled.
- YAML parsing - The `!unsafe` tag no longer coerces non-string scalars to strings.
- "``ansible-galaxy`` \u2014 the collection dependency resolver now treats version
specifiers starting with ``!=`` as unpinned."
- '``package``/``dnf`` action plugins - provide the reason behind the failure
to gather the ``ansible_pkg_mgr`` fact to identify the package backend'
- action plugins - Action plugins that raise unhandled exceptions no longer
terminate playbook loops. Previously, exceptions raised by an action plugin
caused abnormal loop termination and loss of loop iteration results.
- ansible-config - format galaxy server configs while dumping in JSON format
(https://github.com/ansible/ansible/issues/84840).
- ansible-doc - If none of the candidate files exists, ``path`` would be undefined
and referencing it raised an ``UnboundLocalError`` (https://github.com/ansible/ansible/pull/84464).
- ansible-galaxy - Small adjustments to URL building for ``download_url`` and
relative redirects.
- ansible-pull change detection will now work independently of callback or result
format settings.
- ansible-test - Enable the ``sys.unraisablehook`` work-around for the ``pylint``
sanity test on Python 3.11. Previously the work-around was only enabled for
Python 3.12 and later. However, the same issue has been discovered on Python
3.11.
- ansible-test - Ensure CA certificates are installed on managed FreeBSD instances.
- ansible-test - Fix support for PowerShell module_util imports with the ``-Optional``
flag.
- ansible-test - Fix support for detecting PowerShell modules importing module
utils with the newer ``#AnsibleRequires`` format.
- ansible-test - Fix traceback that occurs after an interactive command fails.
- ansible-test - Fix up coverage reporting to properly translate the temporary
path of integration test modules to the expected static test module path.
- ansible-test - Fixed traceback when handling certain YAML errors in the ``yamllint``
sanity test.
- ansible-test - Managed macOS instances now use the ``sudo_chdir`` option for
the ``sudo`` become plugin to avoid permission errors when dropping privileges.
- ansible-vault will now correctly handle ``--prompt``; previously it would issue
an error about stdin if no second argument was passed.
- ansible_uptime_seconds - added ``ansible_uptime_seconds`` fact support for AIX
(https://github.com/ansible/ansible/pull/84321).
- apt_key module - prevent tests from running when apt-key was removed
- base.yml - deprecated libvirt_lxc_noseclabel config.
- build - Pin ``wheel`` in ``pyproject.toml`` to ensure compatibility with supported
``setuptools`` versions.
- config - various fixes to config lookup plugin (https://github.com/ansible/ansible/pull/84398).
- copy - refactor copy module for simplicity.
- copy action now prevents users from setting internal options.
- debconf - set empty password values (https://github.com/ansible/ansible/issues/83214).
- debug - hide loop vars in debug var display (https://github.com/ansible/ansible/issues/65856).
- default callback - Error context is now shown for failing tasks that use the
``debug`` action.
- display - The ``Display.deprecated`` method once again properly handles the
``removed=True`` argument (https://github.com/ansible/ansible/issues/82358).
- distro - add support for Linux Mint Debian Edition (LMDE) (https://github.com/ansible/ansible/issues/84934).
- distro - detect Debian as os_family for LMDE 6 (https://github.com/ansible/ansible/issues/84934).
- dnf5 - Handle forwarded exceptions from dnf5-5.2.13 where a generic ``RuntimeError``
was previously raised
- dnf5 - fix ``is_installed`` check for packages that are not installed but
listed as provided by an installed package (https://github.com/ansible/ansible/issues/84578)
- dnf5 - fix installing a package using ``state=latest`` when a binary of the
same name as the package is already installed (https://github.com/ansible/ansible/issues/84259)
- dnf5 - fix traceback when ``enable_plugins``/``disable_plugins`` is used on
``python3-libdnf5`` versions that do not support this functionality
- dnf5 - libdnf5 - use ``conf.pkg_gpgcheck`` instead of deprecated ``conf.gpgcheck``
which is used only as a fallback
- dnf5 - matching on a binary can be achieved only by specifying a full path
(https://github.com/ansible/ansible/issues/84334)
- facts - gather pagesize and calculate respective values depending upon architecture
(https://github.com/ansible/ansible/issues/84773).
- facts - skip if distribution file path is directory, instead of raising error
(https://github.com/ansible/ansible/issues/84006).
- find - skip the ENOENT error code while recursively enumerating files. The find
module is now tolerant of race conditions that remove files or directories from
the target it is currently inspecting (https://github.com/ansible/ansible/issues/84873).
- first_found lookup - Corrected return value documentation to reflect None
(not empty string) for no files found.
- gather_facts action now defaults to ``ansible.legacy.setup`` if ``smart`` was
set, no network OS was found, and no other alias for ``setup`` was present.
- gather_facts action will now issue errors and warnings as appropriate if a network
OS is detected but no facts modules are defined for it.
- gather_facts action will now add ``setup`` when 'smart' appears with other modules
in the FACTS_MODULES setting (#84750).
- get_url - add support for BSD-style checksum digest file (https://github.com/ansible/ansible/issues/84476).
- get_url - fix honoring ``filename`` from the ``content-disposition`` header
even when the type is ``inline`` (https://github.com/ansible/ansible/issues/83690)
- host_group_vars - fixed defining the 'key' variable if the get_vars method
is called with cache=False (https://github.com/ansible/ansible/issues/84384)
- include_vars - fix including previously undefined hash variables with hash_behaviour
merge (https://github.com/ansible/ansible/issues/84295).
- iptables - Allows the wait parameter to be used with iptables chain creation
(https://github.com/ansible/ansible/issues/84490)
- linear strategy - fix executing ``end_role`` meta tasks for each host, instead
of handling these as implicit run_once tasks (https://github.com/ansible/ansible/issues/84660).
- local connection plugin - Become timeout errors now include all received data.
Previously, the most recently-received data was discarded.
- local connection plugin - Ensure ``become`` success validation always occurs,
even when an active plugin does not set ``prompt``.
- local connection plugin - Fixed cases where the internal ``BECOME-SUCCESS``
message appeared in task output.
- local connection plugin - Fixed hang or spurious failure when data arrived
concurrently on stdout and stderr during a successful ``become`` operation
validation.
- local connection plugin - Fixed hang when a become plugin expects a prompt
but a password was not provided.
- local connection plugin - Fixed hang when an active become plugin incorrectly
signals lack of prompt.
- local connection plugin - Fixed hang when an internal become read timeout
expired before the password prompt was written.
- local connection plugin - Fixed hang when only one of stdout or stderr was
closed by the ``become_exe`` subprocess.
- local connection plugin - Fixed long timeout/hang for ``become`` plugins that
repeat their prompt on failure (e.g., ``sudo``, some ``su`` implementations).
- local connection plugin - Fixed silent ignore of ``become`` failures and loss
of task output when data arrived concurrently on stdout and stderr during
``become`` operation validation.
- local connection plugin - Fixed task output header truncation when post-become
data arrived before ``become`` operation validation had completed.
- lookup plugins - The ``terms`` arg to the ``run`` method is now always a list.
Previously, there were cases where a non-list could be received.
- module arg templating - When using a templated raw task arg and a templated
``args`` keyword, args are now merged. Previously use of templated raw task
args silently ignored all values from the templated ``args`` keyword.
- module defaults - Module defaults are no longer templated unless they are
used by a task that does not override them. Previously, all module defaults
for all modules were templated for every task.
- module respawn - limit to supported Python versions
- omitting task args - Use of omit for task args now properly falls back to
args of lower precedence, such as module defaults. Previously an omitted value
would obliterate values of lower precedence.
- package_facts module when using 'auto' will return the first package manager
found that produces output, instead of simply the first one found, since that
one may be foreign to the system and report no packages.
- psrp - Improve stderr parsing when running raw commands that emit error records
or stderr lines.
- regex_search filter - Corrected return value documentation to reflect None
(not empty string) for no match.
- respawn - use copy of env variables to update existing PYTHONPATH value (https://github.com/ansible/ansible/issues/84954).
- runas become - Fix up become logic to still get the SYSTEM token with the
most privileges when running as SYSTEM.
- sequence lookup - sequence query/lookups without positional arguments now
return a valid list if their kwargs comprise a valid sequence expression (https://github.com/ansible/ansible/issues/82921).
- service_facts - skip lines which do not contain service names in openrc output
(https://github.com/ansible/ansible/issues/84512).
- ssh - Improve the logic for parsing CLIXML data in stderr when working with
Windows hosts. This fixes issues when the raw stderr contains invalid UTF-8
byte sequences and improves handling of embedded CLIXML sequences.
- ssh - Raise exception when sshpass returns error code (https://github.com/ansible/ansible/issues/58133).
- ssh - connection options were incorrectly templated during ``reset_connection``
tasks (https://github.com/ansible/ansible/pull/84238).
- stability - Fixed silent process failure on unhandled IOError/OSError under
``linear`` strategy.
- su become plugin - Ensure generated regex from ``prompt_l10n`` config values
is properly escaped.
- su become plugin - Ensure that password prompts are correctly detected in
the presence of leading output. Previously, this case resulted in a timeout
or hang.
- su become plugin - Ensure that trailing colon is expected on all ``prompt_l10n``
config values.
- sudo become plugin - The `sudo_chdir` config option allows the current directory
to be set to the specified value before executing sudo to avoid permission
errors when dropping privileges.
- sunos - remove hard-coding of the virtinfo command in facts gathering code (https://github.com/ansible/ansible/pull/84357).
- to_yaml/to_nice_yaml filters - Eliminated possibility of keyword arg collisions
with internally-set defaults.
- unarchive - Clamp timestamps from beyond y2038 to representable values when
unpacking zip files on platforms that use 32-bit time_t (e.g. Debian i386).
- uri - Form location correctly when the server returns a relative redirect
(https://github.com/ansible/ansible/issues/84540)
- uri - Handle HTTP exceptions raised while reading the content (https://github.com/ansible/ansible/issues/83794).
- uri - mark ``url`` as required (https://github.com/ansible/ansible/pull/83642).
- user - Create Buildroot subclass as alias to Busybox (https://github.com/ansible/ansible/issues/83665).
- user - Set timeout for passphrase interaction.
- user - Update prompt for SSH key passphrase (https://github.com/ansible/ansible/issues/84484).
- user - Use higher precedence HOME_MODE as UMASK for path provided (https://github.com/ansible/ansible/pull/84482).
- user action will now require O(force) to overwrite the public part of an ssh
key when generating ssh keys, as was already the case for the private part.
- user module now avoids changing ownership of files symlinked in provided home
dir skeleton
- vars lookup - The ``default`` substitution only applies when trying to look
up a variable which is not defined. If the variable is defined, but templates
to an undefined value, the ``default`` substitution will not apply. Use the
``default`` filter to coerce those values instead.
- wait_for_connection - a warning was displayed if any hosts used a local connection
(https://github.com/ansible/ansible/issues/84419)
deprecated_features:
- CLI - The ``--inventory-file`` option alias is deprecated. Use the ``-i``
or ``--inventory`` option instead.
- Strategy plugins - Use of strategy plugins not provided in ``ansible.builtin``
is deprecated and does not carry any backwards compatibility guarantees going
forward. A future release will remove the ability to use external strategy
plugins. No alternative for third party strategy plugins is currently planned.
- '``ansible.module_utils.compat.datetime`` - The datetime compatibility shims
are now deprecated. They are scheduled to be removed in ``ansible-core`` v2.21.
This includes ``UTC``, ``utcfromtimestamp()`` and ``utcnow`` importable from
said module (https://github.com/ansible/ansible/pull/81874).'
- bool filter - Support for coercing unrecognized input values (including None)
has been deprecated. Consult the filter documentation for acceptable values,
or consider use of the ``truthy`` and ``falsy`` tests.
- cache plugins - The `ansible.plugins.cache.base` Python module is deprecated.
Use `ansible.plugins.cache` instead.
- callback plugins - The `v2_on_any` callback method is deprecated. Use specific
callback methods instead.
- callback plugins - The v1 callback API (callback methods not prefixed with
`v2_`) is deprecated. Use `v2_` prefixed methods instead.
- conditionals - Conditionals using Jinja templating delimiters (e.g., ``{{``,
``{%``) should be rewritten as expressions without delimiters, unless the
entire conditional value is a single template that resolves to a trusted string
expression. This is useful for dynamic indirection of conditional expressions,
but is limited to trusted literal string expressions.
- config - The ``ACTION_WARNINGS`` config has no effect. It previously disabled
command warnings, which have since been removed.
- config - The ``DEFAULT_JINJA2_NATIVE`` option has no effect. Jinja2 native
mode is now the default and only option.
- config - The ``DEFAULT_NULL_REPRESENTATION`` option has no effect. Null values
are no longer automatically converted to another value during templating of
single variable references.
- display - The ``Display.get_deprecation_message`` method has been deprecated.
Call ``Display.deprecated`` to display a deprecation message, or call it with
``removed=True`` to raise an ``AnsibleError``.
- file loading - Loading text files with ``DataLoader`` containing data that
cannot be decoded under the expected encoding is deprecated. In most cases
the encoding must be UTF-8, although some plugins allow choosing a different
encoding. Previously, invalid data was silently wrapped in Unicode surrogate
escape sequences, often resulting in later errors or other data corruption.
- first_found lookup - Splitting of file paths on ``,;:`` is deprecated. Pass
a list of paths instead. The ``split`` method on strings can be used to split
variables into a list as needed.
- interpreter discovery - The ``auto_legacy`` and ``auto_legacy_silent`` options
for ``INTERPRETER_PYTHON`` are deprecated. Use ``auto`` or ``auto_silent``
options instead, as they have the same effect.
- oneline callback - The ``oneline`` callback and its associated ad-hoc CLI
args (``-o``, ``--one-line``) are deprecated.
- paramiko - The paramiko connection plugin has been deprecated with planned
removal in 2.21.
- playbook variables - The ``play_hosts`` variable has been deprecated, use
``ansible_play_batch`` instead.
- plugin error handling - The ``AnsibleError`` constructor arg ``suppress_extended_error``
is deprecated. Using ``suppress_extended_error=True`` has the same effect
as ``show_content=False``.
- plugins - The ``listify_lookup_plugin_terms`` function is obsolete and in
most cases no longer needed.
- template lookup - The jinja2_native option is no longer used in the Ansible
Core code base. Jinja2 native mode is now the default and only option.
- templating - Support for enabling Jinja2 extensions (not plugins) has been
deprecated.
- templating - The ``ansible_managed`` variable available for certain templating
scenarios, such as the ``template`` action and ``template`` lookup has been
deprecated. Define and use a custom variable instead of relying on ``ansible_managed``.
- templating - The ``disable_lookups`` option has no effect, since plugins must
be updated to apply trust before any templating can be performed.
- to_yaml/to_nice_yaml filters - Implicit YAML dumping of vaulted value ciphertext
is deprecated. Set `dump_vault_tags` to explicitly specify the desired behavior.
- tree callback - The ``tree`` callback and its associated ad-hoc CLI args (``-t``,
``--tree``) are deprecated.
known_issues:
- templating - Any string value starting with ``#jinja2:`` which is templated
will always be interpreted as Jinja2 configuration overrides. To include this
literal value at the start of a string, a space or other character must precede
it.
- variables - Tagged values cannot be used for dictionary keys in many circumstances.
- variables - The values ``None``, ``True`` and ``False`` cannot be tagged because
they are singletons. Attempts to apply tags to these values will be silently
ignored.
major_changes:
- Jinja plugins - Jinja builtin filter and test plugins are now accessible via
their fully-qualified names ``ansible.builtin.{name}``.
- Task Execution / Forks - Forks no longer inherit stdio from the parent ``ansible-playbook``
process. ``stdout``, ``stderr``, and ``stdin`` within a worker are detached
from the terminal, and non-functional. Controller-side plugin code that needs
to access stdio from a fork must use ``Display``.
- ansible-test - Packages beneath ``module_utils`` can now contain ``__init__.py``
files.
- variables - The type system underlying Ansible's variable storage has been
significantly overhauled and formalized. Attempts to store unsupported Python
object types in variables will now result in an error.
- variables - To support new Ansible features, many variable objects are now
represented by subclasses of their respective native Python types. In most
cases, they behave indistinguishably from their original types, but some Python
libraries do not handle builtin object subclasses properly. Custom plugins
that interact with such libraries may require changes to convert and pass
the native types.
minor_changes:
- Added a -vvvvv log message indicating when a host fails to produce output
within the timeout period.
- AnsibleModule.uri - Add option ``multipart_encoding`` for ``form-multipart``
files in body to change default base64 encoding for files
- INVENTORY_IGNORE_EXTS config - removed ``ini`` from the default list; inventory
scripts using a corresponding ``.ini`` configuration are rare now, while ``inventory.ini``
files are more common. Those that need to ignore ini files for inventory scripts
can still add the extension via configuration.
- Jinja plugins - Plugins can declare support for undefined values.
- Jinja2 version 3.1.0 or later is now required on the controller.
- Move ``follow_redirects`` parameter to module_utils so external modules can
reuse it.
- PlayIterator - do not return tasks from already executed roles so specific
strategy plugins do not have to do the filtering of such tasks themselves
- SSH Escalation-related -vvv log messages now include the associated host information.
- Windows - Add support for Windows Server 2025 to Ansible and as an ``ansible-test``
remote target - https://github.com/ansible/ansible/issues/84229
- Windows - refactor the async implementation to better handle errors during
bootstrapping and avoid WMI when possible.
- "``ansible-galaxy collection install`` \u2014 the collection dependency resolver
now prints out conflicts it hits during dependency resolution when it's taking
too long and it ends up backtracking a lot. It also displays suggestions on
how to help it compute the result more quickly."
- 'ansible, ansible-console, ansible-pull - add --flush-cache option (https://github.com/ansible/ansible/issues/83749).
'
- ansible-galaxy - Add support for Keycloak service accounts
- ansible-galaxy - support ``resolvelib >= 0.5.3, < 2.0.0`` (https://github.com/ansible/ansible/issues/84217).
- ansible-test - Added a macOS 15.3 remote VM, replacing 14.3.
- ansible-test - Automatically retry HTTP GET/PUT/DELETE requests on exceptions.
- ansible-test - Default to Python 3.13 in the ``base`` and ``default`` containers.
- ansible-test - Disable the ``deprecated-`` prefixed ``pylint`` rules as their
results vary by Python version.
- ansible-test - Disable the ``pep8`` sanity test rules ``E701`` and ``E704``
to improve compatibility with ``black``.
- ansible-test - Improve container runtime probe error handling. When unexpected
probe output is encountered, an error with more useful debugging information
is provided.
- ansible-test - Replace container Alpine 3.20 with 3.21.
- ansible-test - Replace container Fedora 40 with 41.
- ansible-test - Replace remote Alpine 3.20 with 3.21.
- ansible-test - Replace remote Fedora 40 with 41.
- ansible-test - Replace remote FreeBSD 13.3 with 13.5.
- ansible-test - Replace remote FreeBSD 14.1 with 14.2.
- ansible-test - Replace remote RHEL 9.4 with 9.5.
- ansible-test - Show a more user-friendly error message when a ``runme.sh``
script is not executable.
- ansible-test - The ``yamllint`` sanity test now enforces string values for
the ``!vault`` tag.
- ansible-test - Update ``nios-test-container`` to version 7.0.0.
- ansible-test - Update ``pylint`` sanity test to use version 3.3.1.
- ansible-test - Update distro containers to remove unnecessary packages (apache2,
subversion, ruby).
- ansible-test - Update sanity test requirements to latest available versions.
- ansible-test - Update the HTTP test container.
- ansible-test - Update the PyPI test container.
- ansible-test - Update the ``base`` and ``default`` containers.
- ansible-test - Update the utility container.
- ansible-test - Use Python's ``urllib`` instead of ``curl`` for HTTP requests.
- ansible-test - When detection of the current container network fails, a warning
is now issued and execution continues. This simplifies usage in cases where
the current container cannot be inspected, such as when running in GitHub
Codespaces.
- ansible-test acme test container - bump `version to 2.3.0 <https://github.com/ansible/acme-test-container/releases/tag/2.3.0>`__
to include newer versions of Pebble, dependencies, and runtimes. This adds
support for ACME profiles, ``dns-account-01`` support, and some smaller improvements
(https://github.com/ansible/ansible/pull/84547).
- apt_key module - add notes to docs and errors to point at the CLI tool deprecation
by Debian and alternatives
- apt_repository module - add notes to errors to point at the CLI tool deprecation
by Debian and alternatives
- become plugins get a new property ``pipelining`` to indicate whether the plugin
supports the feature.
- callback plugins - add has_option() to CallbackBase to match other functions
overloaded from AnsiblePlugin
- callback plugins - fix get_options() for CallbackBase
- copy - fix sanity test failures (https://github.com/ansible/ansible/pull/83643).
- copy - parameter ``local_follow`` was incorrectly documented as having default
value ``True`` (https://github.com/ansible/ansible/pull/83643).
- cron - Provide additional error information while writing cron file (https://github.com/ansible/ansible/issues/83223).
- csvfile - let the config system do the typecasting (https://github.com/ansible/ansible/pull/82263).
- display - Deduplication of warning and error messages considers the full content
of the message (including source and traceback contexts, if enabled). This
may result in fewer messages being omitted.
- distribution - Added openSUSE MicroOS to Suse OS family (#84685).
- dnf5, apt - add ``auto_install_module_deps`` option (https://github.com/ansible/ansible/issues/84206)
- docs - add collection name in message from which the module is being deprecated
(https://github.com/ansible/ansible/issues/84116).
- env lookup - The error message generated for a missing environment variable
when ``default`` is an undefined value (e.g. ``undef('something')``) will
contain the hint from that undefined value, except when the undefined value
is the default of ``undef()`` with no arguments. Previously, any existing
undefined hint would be ignored.
- file - enable file module to disable diff_mode (https://github.com/ansible/ansible/issues/80817).
- file - make code more readable and simpler.
- filter - add support for URL-safe encoding and decoding in b64encode and b64decode
(https://github.com/ansible/ansible/issues/84147).
- find - add a checksum_algorithm parameter to specify which type of checksum
the module will return
- from_json filter - The filter accepts a ``profile`` argument, which defaults
to ``tagless``.
- handlers - Templated handler names with syntax errors, or that resolve to
``omit`` are now skipped like handlers with undefined variables in their name.
- improved error message for yaml parsing errors in plugin documentation
- local connection plugin - A new ``become_strip_preamble`` config option (default
True) was added; disable to preserve diagnostic ``become`` output in task
results.
- local connection plugin - A new ``become_success_timeout`` operation-wide
timeout config (default 10s) was added for ``become``.
- local connection plugin - When a ``become`` plugin's ``prompt`` value is a
non-string after the ``check_password_prompt`` callback has completed, no
prompt stripping will occur on stderr.
- lookup_template - add an option to trim blocks while templating (https://github.com/ansible/ansible/issues/75962).
- module - set ipv4 and ipv6 rules simultaneously in iptables module (https://github.com/ansible/ansible/issues/84404).
- module_utils - Add ``NoReturn`` type annotations to functions which never
return.
- modules - PowerShell modules can now receive ``datetime.date``, ``datetime.time``
and ``datetime.datetime`` values as ISO 8601 strings.
- modules - PowerShell modules can now receive strings sourced from inline vault-encrypted
strings.
- modules - Unhandled exceptions during Python module execution are now returned
as structured data from the target. This allows the new traceback handling
to be applied to exceptions raised on targets.
- pipelining logic has mostly moved to connection plugins so they can decide/override
settings.
- plugin error handling - When raising exceptions in an exception handler, be
sure to use ``raise ... from`` as appropriate. This supersedes the use of
the ``AnsibleError`` arg ``orig_exc`` to represent the cause. Specifying ``orig_exc``
as the cause is still permitted. Failure to use ``raise ... from`` when ``orig_exc``
is set will result in a warning. Additionally, if the two cause exceptions
do not match, a warning will be issued.
- removed hardcoding of the su plugin as it now works with pipelining.
- runtime-metadata sanity test - improve validation of ``action_groups`` (https://github.com/ansible/ansible/pull/83965).
- service_facts module - added FreeBSD support.
- ssh connection plugin - Support ``SSH_ASKPASS`` mechanism to provide passwords,
making it the default, but still offering an explicit choice to use ``sshpass``
(https://github.com/ansible/ansible/pull/83936)
- ssh connection plugin now overrides pipelining when a tty is requested.
- ssh-agent - ``ansible``, ``ansible-playbook`` and ``ansible-console`` are
capable of spawning or reusing an ssh-agent, allowing plugins to interact
with the ssh-agent. Additionally a pure python ssh-agent client has been added,
enabling easy interaction with the agent. The ssh connection plugin contains
new functionality via ``ansible_ssh_private_key`` and ``ansible_ssh_private_key_passphrase``,
for loading an SSH private key into the agent from a variable.
- templating - Access to an undefined variable from inside a lookup, filter,
or test (which raises MarkerError) no longer ends processing of the current
template. The triggering undefined value is returned as the result of the
offending plugin invocation, and the template continues to execute.
- templating - Embedding ``range()`` values in containers such as lists will
result in an error on use. Previously the value would be converted to a string
representing the range parameters, such as ``range(0, 3)``.
- templating - Handling of omitted values is now a first-class feature of the
template engine, and is usable in all Ansible Jinja template contexts. Any
template that resolves to ``omit`` is automatically removed from its parent
container during templating.
- templating - Template evaluation is lazier than in previous versions. Template
expressions which resolve only portions of a data structure no longer result
in the entire structure being templated.
- templating - Templating errors now provide more information about both the
location and context of the error, especially for deeply-nested and/or indirected
templating scenarios.
- templating - Unified ``omit`` behavior now requires that plugins calling ``Templar.template()``
handle cases where the entire template result is omitted, by catching the
``AnsibleValueOmittedError`` that is raised. Previously, this condition caused
a randomly-generated string marker to appear in the template result.
- templating - Variables of type ``set`` and ``tuple`` are now converted to
``list`` when exiting the final pass of templating.
- to_json / to_nice_json filters - The filters accept a ``profile`` argument,
which defaults to ``tagless``.
- troubleshooting - Tracebacks can be collected and displayed for most errors,
warnings, and deprecation warnings (including those generated by modules).
Tracebacks are no longer enabled with ``-vvv``; the behavior is directly configurable
via the ``DISPLAY_TRACEBACK`` config option. Module tracebacks passed to ``fail_json``
via the ``exception`` kwarg will not be included in the task result unless
error tracebacks are configured.
- undef jinja function - The ``undef`` jinja function now raises an error if
a non-string hint is given. Attempting to use an undefined hint also results
in an error, ensuring incorrect use of the function can be distinguished from
the function's normal behavior.
- validate-modules sanity test - make sure that ``module`` and ``plugin`` ``seealso``
entries use FQCNs (https://github.com/ansible/ansible/pull/84325).
- vault - improved vault filter documentation by adding missing example content
for dump_template_data.j2, refining examples for clarity, and ensuring variable
consistency (https://github.com/ansible/ansible/issues/83583).
- warnings - All warnings (including deprecation warnings) issued during a task's
execution are now accessible via the ``warnings`` and ``deprecations`` keys
on the task result.
- when the ``dict`` lookup is given a non-dict argument, show the value of the
argument and its type in the error message.
- windows - add hard minimum limit for PowerShell to 5.1. Ansible dropped support
for older versions of PowerShell in the 2.16 release but this requirement is
now enforced at runtime.
- windows - refactor windows exec runner to improve efficiency and add better
error reporting on failures.
- winrm - Remove need for pexpect on macOS hosts when using ``kinit`` to retrieve
the Kerberos TGT. By default the code will now only use the builtin ``subprocess``
library which should handle issues with select and a high fd count and also
simplify the code.
release_summary: '| Release Date: 2025-04-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
removed_features:
- Remove deprecated plural form of collection path (https://github.com/ansible/ansible/pull/84156).
- Removed deprecated STRING_CONVERSION_ACTION (https://github.com/ansible/ansible/issues/84220).
- encrypt - passing unsupported passlib hashtype now raises AnsibleFilterError.
- manager - remove deprecated include_delegate_to parameter from get_vars API.
- modules - Modules returning non-UTF8 strings now result in an error. The ``MODULE_STRICT_UTF8_RESPONSE``
setting can be used to disable this check.
- removed deprecated pycompat24 and compat.importlib.
- selector - remove deprecated compat.selector related files (https://github.com/ansible/ansible/pull/84155).
- windows - removed common module functions ``ConvertFrom-AnsibleJson``, ``Format-AnsibleException``
from Windows modules as they are not used and add unneeded complexity to the
code.
security_fixes:
- include_vars action - Ensure that result masking is correctly requested when
vault-encrypted files are read. (CVE-2024-8775)
- task result processing - Ensure that action-sourced result masking (``_ansible_no_log=True``)
is preserved. (CVE-2024-8775)
- templating - Ansible's template engine no longer processes Jinja templates
in strings unless they are marked as coming from a trusted source. Untrusted
strings containing Jinja template markers are ignored with a warning. Examples
of trusted sources include playbooks, vars files, and many inventory sources.
Examples of untrusted sources include module results and facts. Plugins which
have not been updated to preserve trust while manipulating strings may inadvertently
cause them to lose their trusted status.
- templating - Changes to conditional expression handling removed numerous instances
of insecure multi-pass templating (which could result in execution of untrusted
template expressions).
- user action won't allow ssh-keygen, chown, and chmod to run on an existing
ssh public key file, avoiding traversal via existing symlinks (CVE-2024-9902).
codename: What Is and What Should Never Be
fragments:
- 2.19.0b1_summary.yaml
- 81709-ansible-galaxy-slow-resolution-hints.yml
- 81812-ansible-galaxy-negative-spec-is-pinned.yml
- 81874-deprecate-datetime-compat.yml
- 83642-fix-sanity-ignore-for-uri.yml
- 83643-fix-sanity-ignore-for-copy.yml
- 83690-get_url-content-disposition-filename.yml
- 83700-enable-file-disable-diff.yml
- 83757-deprecate-paramiko.yml
- 83936-ssh-askpass.yml
- 83965-action-groups-schema.yml
- 84008-additional-logging.yml
- 84019-ignore_unreachable-loop.yml
- 84149-add-flush-cache-for-adhoc-commands.yml
- 84206-dnf5-apt-auto-install-module-deps.yml
- 84213-ansible-galaxy-url-building.yml
- 84229-windows-server-2025.yml
- 84238-fix-reset_connection-ssh_executable-templated.yml
- 84259-dnf5-latest-fix.yml
- 84321-added-ansible_uptime_seconds_aix.yml
- 84325-validate-modules-seealso-fqcn.yml
- 84334-dnf5-consolidate-settings.yml
- 84384-fix-undefined-key-host-group-vars.yml
- 84419-fix-wait_for_connection-warning.yml
- 84468-timeout_become_unreachable.yml
- 84473-dict-lookup-type-error-message.yml
- 84490-allow-iptables-chain-creation-with-wait.yml
- 84496-CallbackBase-get_options.yml
- 84540-uri-relative-redirect.yml
- 84547-acme-test-container.yml
- 84578-dnf5-is_installed-provides.yml
- 84660-fix-meta-end_role-linear-strategy.yml
- 84685-add-opensuse-microos.yml
- 84705-error-message-malformed-plugin-documentation.yml
- 84725-deprecate-strategy-plugins.yml
- Ansible.Basic-required_if-null.yml
- ansible-galaxy-keycloak-service-accounts.yml
- ansible-test-added-macos-15.3.yml
- ansible-test-containers.yml
- ansible-test-coverage-test-files.yml
- ansible-test-curl.yml
- ansible-test-fix-command-traceback.yml
- ansible-test-freebsd-nss.yml
- ansible-test-network-detection.yml
- ansible-test-nios-container.yml
- ansible-test-no-exec-script.yml
- ansible-test-probe-error-handling.yml
- ansible-test-pylint-fix.yml
- ansible-test-remotes.yml
- ansible-test-update.yml
- apt_key_bye.yml
- become-runas-system-deux.yml
- buildroot.yml
- compat_removal.yml
- config.yml
- config_dump.yml
- copy_validate_input.yml
- cron_err.yml
- csvfile-col.yml
- cve-2024-8775.yml
- darwin_pagesize.yml
- debconf_empty_password.yml
- deprecated.yml
- distro_LMDE_6.yml
- dnf5-exception-forwarding.yml
- dnf5-plugins-compat.yml
- dnf5-remove-usage-deprecated-option.yml
- feature-uri-add-option-multipart-encoding.yml
- file_simplify.yml
- find-checksum.yml
- find_enoent.yml
- fix-ansible-galaxy-ignore-certs.yml
- fix-cli-doc-path_undefined.yaml
- fix-display-bug-in-action-plugin.yml
- fix-include_vars-merge-hash.yml
- fix-ipv6-pattern.yml
- fix-is-filter-is-test.yml
- fix-lookup-sequence-keyword-args-only.yml
- fix-module-utils-facts-timeout.yml
- fix_errors.yml
- follow_redirects_url.yml
- gather_facts_netos_fixes.yml
- gather_facts_smart_fix.yml
- get_url_bsd_style_digest.yml
- hide-loop-vars-debug-vars.yml
- implicit_flush_handlers_parents.yml
- include_delegate_to.yml
- interpreter-discovery-auto-legacy.yml
- jinja-version.yml
- libvirt_lxc.yml
- local-become-fixes.yml
- lookup_config.yml
- macos-correct-lock.yml
- no-inherit-stdio.yml
- no-return.yml
- openrc-status.yml
- os_family.yml
- package-dnf-action-plugins-facts-fail-msg.yml
- package_facts_fix.yml
- passlib.yml
- pin-wheel.yml
- pipelining_refactor.yml
- playiterator-add_tasks-optimize.yml
- ps-import-sanity.yml
- pull_changed_fix.yml
- remove_ini_ignored_dir.yml
- reserved_module_chekc.yml
- respawn-min-python.yml
- respawn_os_env.yml
- selector_removal.yml
- service_facts_fbsd.yml
- set_ipv4_and_ipv6_simultaneously.yml
- simplify-copy-module.yml
- skip-handlers-tagged-play.yml
- skip-implicit-flush_handlers-no-notify.yml
- skip-role-task-iterator.yml
- ssh-agent.yml
- ssh-clixml.yml
- ssh_raise_exception.yml
- string_conversion.yml
- sunos_virtinfo.yml
- templates_types_datatagging.yml
- toml-library-support-dropped.yml
- trim_blocks.yml
- unarchive_timestamp_t32.yaml
- update-resolvelib-lt-2_0_0.yml
- uri_httpexception.yml
- url_safe_b64_encode_decode.yml
- user_action_fix.yml
- user_module.yml
- user_passphrase.yml
- user_ssh_fix.yml
- v2.19.0-initial-commit.yaml
- vault_cli_fix.yml
- vault_docs_fix.yaml
- win-async-refactor.yml
- win-wdac-audit.yml
- windows-exec.yml
- winrm-kinit-pexpect.yml
release_date: '2025-04-14'
2.19.0b2:
changes:
bugfixes:
- Remove use of `required` parameter in `get_bin_path` which has been deprecated.
- ansible-doc - fix indentation for first line of descriptions of suboptions
and sub-return values (https://github.com/ansible/ansible/pull/84690).
- ansible-doc - fix line wrapping for first line of description of options and
return values (https://github.com/ansible/ansible/pull/84690).
minor_changes:
- comment filter - Improve the error message shown when an invalid ``style``
argument is provided.
release_summary: '| Release Date: 2025-04-24
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
codename: What Is and What Should Never Be
fragments:
- 2.19.0b2_summary.yaml
- 84690-ansible-doc-indent-wrapping.yml
- comment_fail.yml
- get_bin_path-remove-use-of-deprecated-param.yml
release_date: '2025-04-23'
2.19.0b3:
changes:
bugfixes:
- Ansible will now ensure predictable permissions on remote artifacts; previously
it only ensured they were executable and relied on system masks for the rest.
- dnf5 - avoid generating excessive transaction entries in the dnf5 history
(https://github.com/ansible/ansible/issues/85046)
deprecated_features:
- Passing a ``warnings`` or ``deprecations`` key to ``exit_json`` or ``fail_json``
is deprecated. Use ``AnsibleModule.warn`` or ``AnsibleModule.deprecate`` instead.
- plugins - Accessing plugins with ``_``-prefixed filenames without the ``_``
prefix is deprecated.
minor_changes:
- ansible-config will now show internal, but not test, configuration entries.
This aids debugging while still denoting the configuration entries as internal
use only (``_`` prefix).
- ansible-test - Improved ``pylint`` checks for Ansible-specific deprecation
functions.
- ansible-test - Use the ``-t`` option to set the stop timeout when stopping
a container. This avoids use of the ``--time`` option which was deprecated
in Docker v28.0.
- collection metadata - The collection loader now parses scalar values from
``meta/runtime.yml`` as strings. This avoids issues caused by unquoted values
such as versions or dates being parsed as types other than strings.
- deprecation warnings - Deprecation warning APIs automatically capture the
identity of the deprecating plugin. The ``collection_name`` argument is only
required to correctly attribute deprecations that occur in module_utils or
other non-plugin code.
- deprecation warnings - Improved deprecation messages to more clearly indicate
the affected content, including plugin name when available.
- deprecations - Collection name strings not of the form ``ns.coll`` passed
to deprecation API functions will result in an error.
- deprecations - Removed support for specifying deprecation dates as a ``datetime.date``,
which was included in an earlier 2.19 pre-release.
- deprecations - Some argument names to ``deprecate_value`` were renamed for
consistency with existing APIs. An earlier 2.19 pre-release included a ``removal_``
prefix on the ``date`` and ``version`` arguments.
- modules - The ``AnsibleModule.deprecate`` function no longer sends deprecation
messages to the target host's logging system.
release_summary: '| Release Date: 2025-05-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
codename: What Is and What Should Never Be
fragments:
- 2.19.0b3_summary.yaml
- 85046-dnf5-history-entries.yml
- ansible-test-container-stop.yml
- config_priv.yml
- deprecator.yml
- ensure_remote_perms.yml
release_date: '2025-05-06'
2.19.0b4:
changes:
bugfixes:
- ansible-test - Updated the ``pylint`` sanity test to skip some deprecation
validation checks when all arguments are dynamic.
- config - Preserve or apply Origin tag to values returned by config.
- config - Prevented fatal errors when ``MODULE_IGNORE_EXTS`` configuration
was set.
- config - Templating failures on config defaults now issue a warning. Previously,
failures silently returned an unrendered and untrusted template to the caller.
- config - ``ensure_type`` correctly propagates trust and other tags on returned
values.
- config - ``ensure_type`` now converts mappings to ``dict`` when requested,
instead of returning the mapping.
- config - ``ensure_type`` now converts sequences to ``list`` when requested,
instead of returning the sequence.
- config - ``ensure_type`` now correctly errors when ``pathlist`` or ``pathspec``
types encounter non-string list items.
- config - ``ensure_type`` now reports an error when ``bytes`` are provided
for any known ``value_type``. Previously, the behavior was undefined, but
often resulted in an unhandled exception or incorrect return type.
- config - ``ensure_type`` with expected type ``int`` now properly converts
``True`` and ``False`` values to ``int``. Previously, these values were silently
returned unmodified.
- convert_bool.boolean API conversion function - Unhashable values passed to
``boolean`` behave like other non-boolean convertible values, returning False
or raising ``TypeError`` depending on the value of ``strict``. Previously,
unhashable values always raised ``ValueError`` due to an invalid set membership
check.
- dnf5 - when ``bugfix`` and/or ``security`` is specified, skip packages that
do not have any such updates, even with new versions of libdnf5 where this
functionality changed and would otherwise be considered a failure
- plugin loader - Apply template trust to strings loaded from plugin configuration
definitions and doc fragments.
- template action - Template files where the entire file's output renders as
``None`` are no longer emitted as the string "None", but instead render to
an empty file as in previous releases.
minor_changes:
- facts - add "CloudStack KVM Hypervisor" for Linux VM in virtual facts (https://github.com/ansible/ansible/issues/85089).
- modules - use ``AnsibleModule.warn`` instead of passing ``warnings`` to ``exit_json``
or ``fail_json`` which is deprecated.
release_summary: '| Release Date: 2025-05-12
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
codename: What Is and What Should Never Be
fragments:
- 2.19.0b4_summary.yaml
- 85117-add-cloudstack-kvm-for-linux-facts.yml
- ansible-test-pylint-deprecated-fix.yml
- dnf5-advisory-type.yml
- ensure_type.yml
- plugin-loader-trust-docs.yml
- preserve_config_origin.yml
- remove-warnings-retval.yml
- template-none.yml
release_date: '2025-05-12'

@ -0,0 +1,3 @@
release_summary: |
  | Release Date: 2025-04-14
  | `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -0,0 +1,3 @@
release_summary: |
  | Release Date: 2025-04-24
  | `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -0,0 +1,3 @@
release_summary: |
  | Release Date: 2025-05-06
  | `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -0,0 +1,3 @@
release_summary: |
  | Release Date: 2025-05-12
  | `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -1,2 +1,2 @@
 bugfixes:
-  - iptables - Allows the wait paramater to be used with iptables chain creation (https://github.com/ansible/ansible/issues/84490)
+  - iptables - Allows the wait parameter to be used with iptables chain creation (https://github.com/ansible/ansible/issues/84490)

@ -0,0 +1,3 @@
bugfixes:
- "ansible-doc - fix indentation for first line of descriptions of suboptions and sub-return values (https://github.com/ansible/ansible/pull/84690)."
- "ansible-doc - fix line wrapping for first line of description of options and return values (https://github.com/ansible/ansible/pull/84690)."

@ -0,0 +1,2 @@
bugfixes:
- dnf5 - avoid generating excessive transaction entries in the dnf5 history (https://github.com/ansible/ansible/issues/85046)

@ -0,0 +1,2 @@
minor_changes:
- facts - add "CloudStack KVM Hypervisor" for Linux VM in virtual facts (https://github.com/ansible/ansible/issues/85089).

@ -0,0 +1,3 @@
minor_changes:
- ansible-test - Use the ``-t`` option to set the stop timeout when stopping a container.
This avoids use of the ``--time`` option which was deprecated in Docker v28.0.

@ -0,0 +1,2 @@
bugfixes:
- ansible-test - Updated the ``pylint`` sanity test to skip some deprecation validation checks when all arguments are dynamic.

@ -0,0 +1,3 @@
---
minor_changes:
- comment filter - Improve the error message shown when an invalid ``style`` argument is provided.

@ -0,0 +1,2 @@
minor_changes:
- ansible-config will now show internal, but not test, configuration entries. This aids debugging while still denoting the configuration entries as internal use only (_ prefix).

@ -0,0 +1,17 @@
minor_changes:
- modules - The ``AnsibleModule.deprecate`` function no longer sends deprecation messages to the target host's logging system.
- ansible-test - Improved ``pylint`` checks for Ansible-specific deprecation functions.
- deprecations - Removed support for specifying deprecation dates as a ``datetime.date``, which was included in an earlier 2.19 pre-release.
- deprecations - Some argument names to ``deprecate_value`` were renamed for consistency with existing APIs.
An earlier 2.19 pre-release included a ``removal_`` prefix on the ``date`` and ``version`` arguments.
- deprecations - Collection name strings not of the form ``ns.coll`` passed to deprecation API functions will result in an error.
- collection metadata - The collection loader now parses scalar values from ``meta/runtime.yml`` as strings.
This avoids issues caused by unquoted values such as versions or dates being parsed as types other than strings.
- deprecation warnings - Deprecation warning APIs automatically capture the identity of the deprecating plugin.
The ``collection_name`` argument is only required to correctly attribute deprecations that occur in module_utils or other non-plugin code.
- deprecation warnings - Improved deprecation messages to more clearly indicate the affected content, including plugin name when available.
deprecated_features:
- plugins - Accessing plugins with ``_``-prefixed filenames without the ``_`` prefix is deprecated.
- Passing a ``warnings`` or ``deprecations`` key to ``exit_json`` or ``fail_json`` is deprecated.
Use ``AnsibleModule.warn`` or ``AnsibleModule.deprecate`` instead.

@ -0,0 +1,2 @@
bugfixes:
- "dnf5 - when ``bugfix`` and/or ``security`` is specified, skip packages that do not have any such updates, even with new versions of libdnf5 where this functionality changed and would otherwise be considered a failure"

@ -0,0 +1,2 @@
bugfixes:
- Ansible will now ensure predictable permissions on remote artifacts; previously it only ensured they were executable and relied on system masks for the rest.

@ -0,0 +1,15 @@
bugfixes:
- config - ``ensure_type`` correctly propagates trust and other tags on returned values.
- config - Prevented fatal errors when ``MODULE_IGNORE_EXTS`` configuration was set.
- config - ``ensure_type`` with expected type ``int`` now properly converts ``True`` and ``False`` values to ``int``.
Previously, these values were silently returned unmodified.
- config - ``ensure_type`` now reports an error when ``bytes`` are provided for any known ``value_type``.
Previously, the behavior was undefined, but often resulted in an unhandled exception or incorrect return type.
- config - ``ensure_type`` now converts sequences to ``list`` when requested, instead of returning the sequence.
- config - ``ensure_type`` now converts mappings to ``dict`` when requested, instead of returning the mapping.
- config - ``ensure_type`` now correctly errors when ``pathlist`` or ``pathspec`` types encounter non-string list items.
- config - Templating failures on config defaults now issue a warning.
Previously, failures silently returned an unrendered and untrusted template to the caller.
- convert_bool.boolean API conversion function - Unhashable values passed to ``boolean`` behave like other non-boolean convertible values,
returning False or raising ``TypeError`` depending on the value of ``strict``.
Previously, unhashable values always raised ``ValueError`` due to an invalid set membership check.

@ -0,0 +1,2 @@
bugfixes:
- "Remove use of `required` parameter in `get_bin_path` which has been deprecated."

@ -0,0 +1,2 @@
bugfixes:
- plugin loader - Apply template trust to strings loaded from plugin configuration definitions and doc fragments.

@ -0,0 +1,2 @@
bugfixes:
- config - Preserve or apply Origin tag to values returned by config.

@ -0,0 +1,2 @@
minor_changes:
- "modules - use ``AnsibleModule.warn`` instead of passing ``warnings`` to ``exit_json`` or ``fail_json`` which is deprecated."

@ -0,0 +1,2 @@
bugfixes:
- template action - Template files where the entire file's output renders as ``None`` are no longer emitted as the string "None", but instead render to an empty file as in previous releases.

@ -47,10 +47,6 @@ minor_changes:
   - to_json / to_nice_json filters - The filters accept a ``profile`` argument, which defaults to ``tagless``.
   - undef jinja function - The ``undef`` jinja function now raises an error if a non-string hint is given.
     Attempting to use an undefined hint also results in an error, ensuring incorrect use of the function can be distinguished from the function's normal behavior.
-  - display - The ``collection_name`` arg to ``Display.deprecated`` no longer has any effect.
-    Information about the calling plugin is automatically captured by the display infrastructure, included in the displayed messages, and made available to callbacks.
-  - modules - The ``collection_name`` arg to Python module-side ``deprecate`` methods no longer has any effect.
-    Information about the calling module is automatically captured by the warning infrastructure and included in the module result.
 breaking_changes:
   - loops - Omit placeholders no longer leak between loop item templating and task templating.
@ -173,6 +169,9 @@ deprecated_features:
   - file loading - Loading text files with ``DataLoader`` containing data that cannot be decoded under the expected encoding is deprecated.
     In most cases the encoding must be UTF-8, although some plugins allow choosing a different encoding.
     Previously, invalid data was silently wrapped in Unicode surrogate escape sequences, often resulting in later errors or other data corruption.
+  - callback plugins - The v1 callback API (callback methods not prefixed with `v2_`) is deprecated.
+    Use `v2_` prefixed methods instead.
+  - callback plugins - The `v2_on_any` callback method is deprecated. Use specific callback methods instead.
 removed_features:
   - modules - Modules returning non-UTF8 strings now result in an error.

@ -40,7 +40,6 @@ import shutil
 from pathlib import Path
-from ansible.module_utils.common.messages import PluginInfo
 from ansible.release import __version__
 import ansible.utils.vars as utils_vars
 from ansible.parsing.dataloader import DataLoader
@ -172,15 +171,8 @@ def boilerplate_module(modfile, args, interpreters, check, destfile):
     modname = os.path.basename(modfile)
     modname = os.path.splitext(modname)[0]
-    plugin = PluginInfo(
-        requested_name=modname,
-        resolved_name=modname,
-        type='module',
-    )
     built_module = module_common.modify_module(
         module_name=modname,
-        plugin=plugin,
         module_path=modfile,
         module_args=complex_args,
         templar=Templar(loader=loader),
@ -225,10 +217,11 @@ def ansiballz_setup(modfile, modname, interpreters):
     # All the directories in an AnsiBallZ that modules can live
     core_dirs = glob.glob(os.path.join(debug_dir, 'ansible/modules'))
+    non_core_dirs = glob.glob(os.path.join(debug_dir, 'ansible/legacy'))
     collection_dirs = glob.glob(os.path.join(debug_dir, 'ansible_collections/*/*/plugins/modules'))
     # There's only one module in an AnsiBallZ payload so look for the first module and then exit
-    for module_dir in core_dirs + collection_dirs:
+    for module_dir in core_dirs + collection_dirs + non_core_dirs:
         for dirname, directories, filenames in os.walk(module_dir):
             for filename in filenames:
                 if filename == modname + '.py':

@ -18,7 +18,7 @@ def get_controller_serialize_map() -> dict[type, t.Callable]:
     return {
         _lazy_containers._AnsibleLazyTemplateDict: _profiles._JSONSerializationProfile.discard_tags,
         _lazy_containers._AnsibleLazyTemplateList: _profiles._JSONSerializationProfile.discard_tags,
-        EncryptedString: str,  # preserves tags since this is an intance of EncryptedString; if tags should be discarded from str, another entry will handle it
+        EncryptedString: str,  # preserves tags since this is an instance of EncryptedString; if tags should be discarded from str, another entry will handle it
     }
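The hunk above corrects a comment typo inside a type-to-serializer dispatch map. The underlying pattern (mapping concrete types to conversion callables consulted during JSON serialization) can be sketched with stand-in names; `LazyContainer`, `serialize_map`, and `MapEncoder` are illustrative, not the real internal API:

```python
import json
from typing import Any, Callable


class LazyContainer:
    """Hypothetical stand-in for the internal lazy template containers above."""

    def __init__(self, items: list[Any]) -> None:
        self._items = items

    def unwrap(self) -> list[Any]:
        return self._items


# The pattern from the hunk: map each concrete type to the callable that
# converts instances of it into something the JSON encoder understands.
serialize_map: dict[type, Callable[[Any], Any]] = {
    LazyContainer: LazyContainer.unwrap,
}


class MapEncoder(json.JSONEncoder):
    def default(self, o: Any) -> Any:
        if (handler := serialize_map.get(type(o))) is not None:
            return handler(o)
        return super().default(o)


print(json.dumps({'files': LazyContainer(['a', 'b'])}, cls=MapEncoder))
# → {"files": ["a", "b"]}
```

Keying the map on exact `type(o)` (rather than `isinstance`) is what lets an entry like `EncryptedString: str` coexist with a different rule for plain `str`.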

@ -42,7 +42,6 @@ def _ansiballz_main(
     module_fqn: str,
     params: str,
     profile: str,
-    plugin_info_dict: dict[str, object],
     date_time: datetime.datetime,
     coverage_config: str | None,
     coverage_output: str | None,
@ -142,7 +141,6 @@ def _ansiballz_main(
         run_module(
             json_params=json_params,
             profile=profile,
-            plugin_info_dict=plugin_info_dict,
             module_fqn=module_fqn,
             modlib_path=modlib_path,
             coverage_config=coverage_config,
@ -230,13 +228,12 @@ def _ansiballz_main(
         run_module(
             json_params=json_params,
             profile=profile,
-            plugin_info_dict=plugin_info_dict,
             module_fqn=module_fqn,
             modlib_path=modlib_path,
         )
     else:
-        print('WARNING: Unknown debug command. Doing nothing.')
+        print(f'FATAL: Unknown debug command {command!r}. Doing nothing.')
 #
 # See comments in the debug() method for information on debugging

@ -0,0 +1,47 @@
from __future__ import annotations as _annotations

import collections.abc as _c
import typing as _t

_T_co = _t.TypeVar('_T_co', covariant=True)


class SequenceProxy(_c.Sequence[_T_co]):
    """A read-only sequence proxy."""

    # DTFIX-RELEASE: needs unit test coverage

    __slots__ = ('__value',)

    def __init__(self, value: _c.Sequence[_T_co]) -> None:
        self.__value = value

    @_t.overload
    def __getitem__(self, index: int) -> _T_co: ...

    @_t.overload
    def __getitem__(self, index: slice) -> _c.Sequence[_T_co]: ...

    def __getitem__(self, index: int | slice) -> _T_co | _c.Sequence[_T_co]:
        if isinstance(index, slice):
            return self.__class__(self.__value[index])

        return self.__value[index]

    def __len__(self) -> int:
        return len(self.__value)

    def __contains__(self, item: object) -> bool:
        return item in self.__value

    def __iter__(self) -> _t.Iterator[_T_co]:
        yield from self.__value

    def __reversed__(self) -> _c.Iterator[_T_co]:
        return reversed(self.__value)

    def index(self, *args) -> int:
        return self.__value.index(*args)

    def count(self, value: object) -> int:
        return self.__value.count(value)
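The new `SequenceProxy` class carries a DTFIX note that it still needs unit test coverage. A minimal behavioral sketch (using a trimmed inline copy of the class, since the hunk does not show its module path) could look like this:

```python
import collections.abc as _c
import typing as _t

_T_co = _t.TypeVar('_T_co', covariant=True)


class SequenceProxy(_c.Sequence[_T_co]):
    """Trimmed inline copy of the read-only proxy added above."""

    __slots__ = ('__value',)

    def __init__(self, value: _c.Sequence[_T_co]) -> None:
        self.__value = value

    def __getitem__(self, index):
        if isinstance(index, slice):
            return self.__class__(self.__value[index])
        return self.__value[index]

    def __len__(self) -> int:
        return len(self.__value)


proxy = SequenceProxy([1, 2, 3])

assert list(proxy) == [1, 2, 3]  # iteration is a Sequence mixin method
assert proxy[0] == 1 and len(proxy) == 3
assert isinstance(proxy[1:], SequenceProxy)  # slicing returns another proxy
assert 2 in proxy and proxy.index(3) == 2  # membership/index via mixins

try:
    proxy[0] = 99  # no __setitem__, so writes are rejected
except TypeError:
    pass
else:
    raise AssertionError('proxy should be read-only')
```

Because the class subclasses `collections.abc.Sequence`, the mixin methods (`__contains__`, `__iter__`, `index`, `count`) come for free once `__getitem__` and `__len__` exist; the full class above overrides them only to delegate directly to the wrapped value.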

@ -16,8 +16,8 @@ class ErrorAction(enum.Enum):
     """Action to take when an error is encountered."""

     IGNORE = enum.auto()
-    WARN = enum.auto()
-    FAIL = enum.auto()
+    WARNING = enum.auto()
+    ERROR = enum.auto()

     @classmethod
     def from_config(cls, setting: str, variables: dict[str, t.Any] | None = None) -> t.Self:
@ -75,9 +75,9 @@ class ErrorHandler:
             yield
         except args as ex:
             match self.action:
-                case ErrorAction.WARN:
+                case ErrorAction.WARNING:
                     display.error_as_warning(msg=None, exception=ex)
-                case ErrorAction.FAIL:
+                case ErrorAction.ERROR:
                     raise
                 case _:  # ErrorAction.IGNORE
                     pass

@ -4,6 +4,7 @@
 from __future__ import annotations

+import enum
 import json
 import typing as t
@ -19,7 +20,9 @@ from ansible.module_utils._internal._datatag import (
 from ansible.module_utils._internal._json._profiles import _tagless
 from ansible.parsing.vault import EncryptedString
 from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
+from ansible._internal._templating import _transform
 from ansible.module_utils import _internal
+from ansible.module_utils._internal import _datatag

 _T = t.TypeVar('_T')
 _sentinel = object()
@ -52,6 +55,19 @@ class StateTrackingMixIn(HasCurrent):
         return self._stack[1:] + [self._current]


+class EncryptedStringBehavior(enum.Enum):
+    """How `AnsibleVariableVisitor` will handle instances of `EncryptedString`."""
+
+    PRESERVE = enum.auto()
+    """Preserves the unmodified `EncryptedString` instance."""
+    DECRYPT = enum.auto()
+    """Replaces the value with its decrypted plaintext."""
+    REDACT = enum.auto()
+    """Replaces the value with a placeholder string."""
+    FAIL = enum.auto()
+    """Raises an `AnsibleVariableTypeError` error."""
+
+
 class AnsibleVariableVisitor:
     """Utility visitor base class to recursively apply various behaviors and checks to variable object graphs."""
@ -63,7 +79,9 @@ class AnsibleVariableVisitor:
convert_mapping_to_dict: bool = False, convert_mapping_to_dict: bool = False,
convert_sequence_to_list: bool = False, convert_sequence_to_list: bool = False,
convert_custom_scalars: bool = False, convert_custom_scalars: bool = False,
allow_encrypted_string: bool = False, convert_to_native_values: bool = False,
apply_transforms: bool = False,
encrypted_string_behavior: EncryptedStringBehavior = EncryptedStringBehavior.DECRYPT,
): ):
super().__init__() # supports StateTrackingMixIn super().__init__() # supports StateTrackingMixIn
@ -72,7 +90,16 @@ class AnsibleVariableVisitor:
self.convert_mapping_to_dict = convert_mapping_to_dict self.convert_mapping_to_dict = convert_mapping_to_dict
self.convert_sequence_to_list = convert_sequence_to_list self.convert_sequence_to_list = convert_sequence_to_list
self.convert_custom_scalars = convert_custom_scalars self.convert_custom_scalars = convert_custom_scalars
self.allow_encrypted_string = allow_encrypted_string self.convert_to_native_values = convert_to_native_values
self.apply_transforms = apply_transforms
self.encrypted_string_behavior = encrypted_string_behavior
if apply_transforms:
from ansible._internal._templating import _engine
self._template_engine = _engine.TemplateEngine()
else:
self._template_engine = None
self._current: t.Any = None # supports StateTrackingMixIn self._current: t.Any = None # supports StateTrackingMixIn
@@ -113,9 +140,19 @@
         value_type = type(value)

+        if self.apply_transforms and value_type in _transform._type_transform_mapping:
+            value = self._template_engine.transform(value)
+            value_type = type(value)
+
+        # DTFIX-RELEASE: need to handle native copy for keys too
+        if self.convert_to_native_values and isinstance(value, _datatag.AnsibleTaggedObject):
+            value = value._native_copy()
+            value_type = type(value)
+
         result: _T

         # DTFIX-RELEASE: the visitor is ignoring dict/mapping keys except for debugging and schema-aware checking, it should be doing type checks on keys
+        #                keep in mind the allowed types for keys is a more restrictive set than for values (str and tagged str only, not EncryptedString)
         # DTFIX-RELEASE: some type lists being consulted (the ones from datatag) are probably too permissive, and perhaps should not be dynamic

         if (result := self._early_visit(value, value_type)) is not _sentinel:
@@ -127,8 +164,14 @@
         elif value_type in _ANSIBLE_ALLOWED_NON_SCALAR_COLLECTION_VAR_TYPES:
             with self:  # supports StateTrackingMixIn
                 result = AnsibleTagHelper.tag_copy(value, (self._visit(k, v) for k, v in enumerate(t.cast(t.Iterable, value))), value_type=value_type)
-        elif self.allow_encrypted_string and isinstance(value, EncryptedString):
-            return value  # type: ignore[return-value]  # DTFIX-RELEASE: this should probably only be allowed for values in dict, not keys (set, dict)
+        elif self.encrypted_string_behavior != EncryptedStringBehavior.FAIL and isinstance(value, EncryptedString):
+            match self.encrypted_string_behavior:
+                case EncryptedStringBehavior.REDACT:
+                    result = "<redacted>"  # type: ignore[assignment]
+                case EncryptedStringBehavior.PRESERVE:
+                    result = value  # type: ignore[assignment]
+                case EncryptedStringBehavior.DECRYPT:
+                    result = str(value)  # type: ignore[assignment]
         elif self.convert_mapping_to_dict and _internal.is_intermediate_mapping(value):
             with self:  # supports StateTrackingMixIn
                 result = {k: self._visit(k, v) for k, v in value.items()}  # type: ignore[assignment]

@@ -8,13 +8,12 @@ from __future__ import annotations as _annotations
 import datetime as _datetime
 import typing as _t

+from ansible._internal import _json
 from ansible._internal._datatag import _tags
 from ansible.module_utils._internal import _datatag
 from ansible.module_utils._internal._json import _profiles
 from ansible.parsing import vault as _vault

-from ... import _json

 class _Untrusted:
     """

@@ -48,7 +47,7 @@ class _LegacyVariableVisitor(_json.AnsibleVariableVisitor):
             convert_mapping_to_dict=convert_mapping_to_dict,
             convert_sequence_to_list=convert_sequence_to_list,
             convert_custom_scalars=convert_custom_scalars,
-            allow_encrypted_string=True,
+            encrypted_string_behavior=_json.EncryptedStringBehavior.PRESERVE,
         )
         self.invert_trust = invert_trust

@@ -12,7 +12,6 @@ from ansible.utils.display import Display
 from ._access import NotifiableAccessContextBase
 from ._utils import TemplateContext

 display = Display()

@@ -57,10 +56,10 @@ class DeprecatedAccessAuditContext(NotifiableAccessContextBase):
                 display._deprecated_with_plugin_info(
                     msg=msg,
                     help_text=item.deprecated.help_text,
-                    version=item.deprecated.removal_version,
-                    date=item.deprecated.removal_date,
+                    version=item.deprecated.version,
+                    date=item.deprecated.date,
                     obj=item.template,
-                    plugin=item.deprecated.plugin,
+                    deprecator=item.deprecated.deprecator,
                 )

         return result

@@ -566,7 +566,12 @@ class TemplateEngine:
             )

             if _TemplateConfig.allow_broken_conditionals:
-                _display.deprecated(msg=msg, obj=conditional, help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT, version='2.23')
+                _display.deprecated(
+                    msg=msg,
+                    obj=conditional,
+                    help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT,
+                    version='2.23',
+                )

                 return bool_result

@@ -985,12 +985,12 @@ def _maybe_finalize_scalar(o: t.Any) -> t.Any:
     match _TemplateConfig.unknown_type_conversion_handler.action:
         # we don't want to show the object value, and it can't be Origin-tagged; send the current template value for best effort
-        case ErrorAction.WARN:
+        case ErrorAction.WARNING:
             display.warning(
                 msg=f'Type {native_type_name(o)!r} is unsupported in variable storage, converting to {native_type_name(target_type)!r}.',
                 obj=TemplateContext.current(optional=True).template_value,
             )
-        case ErrorAction.FAIL:
+        case ErrorAction.ERROR:
             raise AnsibleVariableTypeError.from_value(obj=TemplateContext.current(optional=True).template_value)

     return target_type(o)

@@ -1006,12 +1006,12 @@ def _finalize_fallback_collection(
 ) -> t.Collection[t.Any]:
     match _TemplateConfig.unknown_type_conversion_handler.action:
         # we don't want to show the object value, and it can't be Origin-tagged; send the current template value for best effort
-        case ErrorAction.WARN:
+        case ErrorAction.WARNING:
             display.warning(
                 msg=f'Type {native_type_name(o)!r} is unsupported in variable storage, converting to {native_type_name(target_type)!r}.',
                 obj=TemplateContext.current(optional=True).template_value,
             )
-        case ErrorAction.FAIL:
+        case ErrorAction.ERROR:
             raise AnsibleVariableTypeError.from_value(obj=TemplateContext.current(optional=True).template_value)

     return _finalize_collection(o, mode, finalizer, target_type)

@@ -8,12 +8,7 @@ import datetime
 import functools
 import typing as t

-from ansible.errors import (
-    AnsibleTemplatePluginError,
-)
 from ansible.module_utils._internal._ambient_context import AmbientContextBase
-from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
 from ansible.module_utils.common.collections import is_sequence
 from ansible.module_utils._internal._datatag import AnsibleTagHelper
 from ansible._internal._datatag._tags import TrustedAsTemplate

@@ -115,7 +110,7 @@ class JinjaPluginIntercept(c.MutableMapping):
             return first_marker

         try:
-            with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers), PluginExecContext(executing_plugin=instance):
+            with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers):
                 return instance.j2_function(*lazify_container_args(args), **lazify_container_kwargs(kwargs))
         except MarkerError as ex:
             return ex.source

@@ -216,10 +211,7 @@ def _invoke_lookup(*, plugin_name: str, lookup_terms: list, lookup_kwargs: dict[
     wantlist = lookup_kwargs.pop('wantlist', False)
     errors = lookup_kwargs.pop('errors', 'strict')

-    with (
-        JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers),
-        PluginExecContext(executing_plugin=instance),
-    ):
+    with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers):
         try:
             if _TemplateConfig.allow_embedded_templates:
                 # for backwards compat, only trust constant templates in lookup terms

@@ -263,15 +255,13 @@ def _invoke_lookup(*, plugin_name: str, lookup_terms: list, lookup_kwargs: dict[
             return ex.source
         except Exception as ex:
             # DTFIX-RELEASE: convert this to the new error/warn/ignore context manager
-            if isinstance(ex, AnsibleTemplatePluginError):
-                msg = f'Lookup failed but the error is being ignored: {ex}'
-            else:
-                msg = f'An unhandled exception occurred while running the lookup plugin {plugin_name!r}. Error was a {type(ex)}, original message: {ex}'
             if errors == 'warn':
-                _display.warning(msg)
+                _display.error_as_warning(
+                    msg=f'An error occurred while running the lookup plugin {plugin_name!r}.',
+                    exception=ex,
+                )
             elif errors == 'ignore':
-                _display.display(msg, log_only=True)
+                _display.display(f'An error of type {type(ex)} occurred while running the lookup plugin {plugin_name!r}: {ex}', log_only=True)
             else:
                 raise AnsibleTemplatePluginRuntimeError('lookup', plugin_name) from ex

@@ -0,0 +1,26 @@
+"""
+Testing utilities for use in integration tests, not unit tests or non-test code.
+
+Provides better error behavior than Python's `assert` statement.
+"""
+
+from __future__ import annotations
+
+import contextlib
+import typing as t
+
+
+class _Checker:
+    @staticmethod
+    def check(value: object, msg: str | None = 'Value is not truthy.') -> None:
+        """Raise an `AssertionError` if the given `value` is not truthy."""
+        if not value:
+            raise AssertionError(msg)
+
+
+@contextlib.contextmanager
+def hard_fail_context(msg: str) -> t.Generator[_Checker]:
+    """Enter a context which converts all exceptions to `BaseException` and provides a `Checker` instance for making assertions."""
+    try:
+        yield _Checker()
+    except BaseException as ex:
+        raise BaseException(f"Hard failure: {msg}") from ex

@@ -10,7 +10,6 @@ import os
 import signal
 import sys

 # We overload the ``ansible`` adhoc command to provide the functionality for
 # ``SSH_ASKPASS``. This code is here, and not in ``adhoc.py`` to bypass
 # unnecessary code. The program provided to ``SSH_ASKPASS`` can only be invoked

@@ -89,18 +88,25 @@ from ansible import _internal  # do not remove or defer; ensures controller-spec
 _internal.setup()

+from ansible.errors import AnsibleError, ExitCode
+
 try:
     from ansible import constants as C
     from ansible.utils.display import Display

     display = Display()
 except Exception as ex:
-    print(f'ERROR: {ex}\n\n{"".join(traceback.format_exception(ex))}', file=sys.stderr)
+    if isinstance(ex, AnsibleError):
+        ex_msg = ' '.join((ex.message, ex._help_text)).strip()
+    else:
+        ex_msg = str(ex)
+
+    print(f'ERROR: {ex_msg}\n\n{"".join(traceback.format_exception(ex))}', file=sys.stderr)
     sys.exit(5)

 from ansible import context
+from ansible.utils import display as _display
 from ansible.cli.arguments import option_helpers as opt_help
-from ansible.errors import AnsibleError, ExitCode
 from ansible.inventory.manager import InventoryManager
 from ansible.module_utils.six import string_types
 from ansible.module_utils.common.text.converters import to_bytes, to_text

@@ -116,6 +122,7 @@ from ansible.utils.collection_loader import AnsibleCollectionConfig
 from ansible.utils.collection_loader._collection_finder import _get_collection_name_from_path
 from ansible.utils.path import unfrackpath
 from ansible.vars.manager import VariableManager
+from ansible.module_utils._internal import _deprecator

 try:
     import argcomplete

@@ -139,7 +146,7 @@ def _launch_ssh_agent() -> None:
             return
         case 'auto':
             try:
-                ssh_agent_bin = get_bin_path('ssh-agent', required=True)
+                ssh_agent_bin = get_bin_path('ssh-agent')
             except ValueError as e:
                 raise AnsibleError('SSH_AGENT set to auto, but cannot find ssh-agent binary') from e

             ssh_agent_dir = os.path.join(C.DEFAULT_LOCAL_TMP, 'ssh_agent')

@@ -251,7 +258,7 @@ class CLI(ABC):
         else:
             display.v(u"No config file found; using defaults")

-        C.handle_config_noise(display)
+        _display._report_config_warnings(_deprecator.ANSIBLE_CORE_DEPRECATOR)

     @staticmethod
     def split_vault_id(vault_id):

@@ -56,7 +56,10 @@ class DeprecatedArgument:
         from ansible.utils.display import Display

-        Display().deprecated(f'The {option!r} argument is deprecated.', version=self.version)
+        Display().deprecated(  # pylint: disable=ansible-invalid-deprecated-version
+            msg=f'The {option!r} argument is deprecated.',
+            version=self.version,
+        )

 class ArgumentParser(argparse.ArgumentParser):

@@ -532,13 +535,17 @@ def _tagged_type_factory(name: str, func: t.Callable[[str], object], /) -> t.Cal
     def tag_value(value: str) -> object:
         result = func(value)

-        if result is value:
+        if result is value or func is str:
             # Values which are not mutated are automatically trusted for templating.
             # The `is` reference equality is critically important, as other types may only alter the tags, so object equality is
             # not sufficient to prevent them being tagged as trusted when they should not.
+            # Explicitly include all usages using the `str` type factory since it strips tags.
             result = TrustedAsTemplate().tag(result)

-        return Origin(description=f'<CLI option {name!r}>').tag(result)
+        if not (origin := Origin.get_tag(value)):
+            origin = Origin(description=f'<CLI option {name!r}>')
+
+        return origin.tag(result)

     tag_value._name = name  # simplify debugging by attaching the argument name to the function
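The new `or func is str` clause works around a subtlety of Python itself: calling `str()` on a `str` subclass returns a brand-new plain `str`, so the identity check can never succeed for that factory. A toy demonstration — `TaggedStr` is a hypothetical stand-in, the real datatag string types are richer:

```python
class TaggedStr(str):
    """Toy stand-in for a tagged string subclass used only for illustration."""


s = TaggedStr("42")

# str() on a str subclass returns a new plain `str` instance, dropping the
# subclass (and with it, any tags attached to it). The `result is value`
# identity check alone therefore never fires for the `str` factory, which is
# why the patch adds the explicit `or func is str` clause before tagging the
# result as trusted.
r = str(s)
```

The values compare equal, but `r` is a different object of exact type `str`, illustrating why equality would be insufficient and identity too strict here.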

@@ -1172,12 +1172,16 @@ class DocCLI(CLI, RoleMixin):
         return 'version %s' % (version_added, )

     @staticmethod
-    def warp_fill(text, limit, initial_indent='', subsequent_indent='', **kwargs):
+    def warp_fill(text, limit, initial_indent='', subsequent_indent='', initial_extra=0, **kwargs):
         result = []
         for paragraph in text.split('\n\n'):
-            result.append(textwrap.fill(paragraph, limit, initial_indent=initial_indent, subsequent_indent=subsequent_indent,
-                                        break_on_hyphens=False, break_long_words=False, drop_whitespace=True, **kwargs))
+            wrapped = textwrap.fill(paragraph, limit, initial_indent=initial_indent + ' ' * initial_extra, subsequent_indent=subsequent_indent,
+                                    break_on_hyphens=False, break_long_words=False, drop_whitespace=True, **kwargs)
+            if initial_extra and wrapped.startswith(' ' * initial_extra):
+                wrapped = wrapped[initial_extra:]
+            result.append(wrapped)
            initial_indent = subsequent_indent
+            initial_extra = 0
         return '\n'.join(result)
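The `initial_extra` trick pads the first line's indent before wrapping and strips the padding afterwards, so the first line wraps as if it started further to the right (e.g. after a `key: ` prefix that will be prepended later). A standalone sketch of the technique with plain `textwrap` — `fill_hanging` is a hypothetical name, not the real `warp_fill`:

```python
import textwrap


def fill_hanging(text, width, initial_indent, subsequent_indent, initial_extra=0):
    # Wrap with the first line's indent widened by `initial_extra` spaces,
    # then strip that padding so a caller-supplied prefix can take its place
    # without pushing the first line past `width`.
    wrapped = textwrap.fill(
        text, width,
        initial_indent=initial_indent + " " * initial_extra,
        subsequent_indent=subsequent_indent,
        break_on_hyphens=False, break_long_words=False, drop_whitespace=True,
    )
    if initial_extra and wrapped.startswith(" " * initial_extra):
        wrapped = wrapped[initial_extra:]
    return wrapped
```

With `initial_extra=4`, the first line breaks as if it were indented four columns, even though the returned text starts at column zero.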
     @staticmethod

@@ -1209,20 +1213,23 @@ class DocCLI(CLI, RoleMixin):
                 text.append('')

             # TODO: push this to top of for and sort by size, create indent on largest key?
-            inline_indent = base_indent + ' ' * max((len(opt_indent) - len(o)) - len(base_indent), 2)
-            sub_indent = inline_indent + ' ' * (len(o) + 3)
+            inline_indent = ' ' * max((len(opt_indent) - len(o)) - len(base_indent), 2)
+            extra_indent = base_indent + ' ' * (len(o) + 3)
+            sub_indent = inline_indent + extra_indent
             if is_sequence(opt['description']):
                 for entry_idx, entry in enumerate(opt['description'], 1):
                     if not isinstance(entry, string_types):
                         raise AnsibleError("Expected string in description of %s at index %s, got %s" % (o, entry_idx, type(entry)))
                     if entry_idx == 1:
-                        text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(entry), limit, initial_indent=inline_indent, subsequent_indent=sub_indent))
+                        text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(entry), limit,
+                                                           initial_indent=inline_indent, subsequent_indent=sub_indent, initial_extra=len(extra_indent)))
                     else:
                         text.append(DocCLI.warp_fill(DocCLI.tty_ify(entry), limit, initial_indent=sub_indent, subsequent_indent=sub_indent))
             else:
                 if not isinstance(opt['description'], string_types):
                     raise AnsibleError("Expected string in description of %s, got %s" % (o, type(opt['description'])))
-                text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(opt['description']), limit, initial_indent=inline_indent, subsequent_indent=sub_indent))
+                text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(opt['description']), limit,
+                                                   initial_indent=inline_indent, subsequent_indent=sub_indent, initial_extra=len(extra_indent)))

             del opt['description']

             suboptions = []
@@ -1328,7 +1335,6 @@ class DocCLI(CLI, RoleMixin):
                 'This was unintentionally allowed when plugin attributes were added, '
                 'but the feature does not map well to role argument specs.',
                 version='2.20',
-                collection_name='ansible.builtin',
             )

         text.append("")
         text.append(_format("ATTRIBUTES:", 'bold'))

@@ -9,6 +9,18 @@ _ANSIBLE_CONNECTION_PATH:
     - For internal use only.
   type: path
   version_added: "2.18"
+_CALLBACK_DISPATCH_ERROR_BEHAVIOR:
+  name: Callback dispatch error behavior
+  default: warning
+  description:
+    - Action to take when a callback dispatch results in an error.
+  type: choices
+  choices: &basic_error
+    error: issue a 'fatal' error and stop the play
+    warning: issue a warning but continue
+    ignore: just continue silently
+  env: [ { name: _ANSIBLE_CALLBACK_DISPATCH_ERROR_BEHAVIOR } ]
+  version_added: '2.19'
 ALLOW_BROKEN_CONDITIONALS:
   # This config option will be deprecated once it no longer has any effect (2.23).
   name: Allow broken conditionals
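The `&basic_error` anchor above defines the error/warning/ignore choices mapping once, near the top of the file; later config entries reuse it verbatim with the `*basic_error` alias. A minimal illustration of the mechanism, using hypothetical option names rather than the real config entries:

```yaml
# YAML anchor/alias reuse: the anchor names the mapping where it is first
# defined; each alias resolves to that same mapping when the file is loaded.
OPTION_A:
  type: choices
  choices: &basic_error          # anchor: defines the reusable mapping
    error: issue a 'fatal' error and stop the play
    warning: issue a warning but continue
    ignore: just continue silently
  default: warning
OPTION_B:
  type: choices
  choices: *basic_error          # alias: same mapping as OPTION_A
  default: ignore
```

This is also why the patch moves the `_CALLBACK_DISPATCH_ERROR_BEHAVIOR` entry earlier in the file: an anchor must be defined before any alias that references it.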
@@ -224,18 +236,6 @@ CACHE_PLUGIN_TIMEOUT:
     - {key: fact_caching_timeout, section: defaults}
   type: integer
   yaml: {key: facts.cache.timeout}
-_CALLBACK_DISPATCH_ERROR_BEHAVIOR:
-  name: Callback dispatch error behavior
-  default: warn
-  description:
-    - Action to take when a callback dispatch results in an error.
-  type: choices
-  choices: &choices_ignore_warn_fail
-    - ignore
-    - warn
-    - fail
-  env: [ { name: _ANSIBLE_CALLBACK_DISPATCH_ERROR_BEHAVIOR } ]
-  version_added: '2.19'
 COLLECTIONS_SCAN_SYS_PATH:
   name: Scan PYTHONPATH for installed collections
   description: A boolean to enable or disable scanning the sys.path for installed collections.
@@ -268,10 +268,7 @@ COLLECTIONS_ON_ANSIBLE_VERSION_MISMATCH:
     - When a collection is loaded that does not support the running Ansible version (with the collection metadata key `requires_ansible`).
   env: [{name: ANSIBLE_COLLECTIONS_ON_ANSIBLE_VERSION_MISMATCH}]
   ini: [{key: collections_on_ansible_version_mismatch, section: defaults}]
-  choices: &basic_error
-    error: issue a 'fatal' error and stop the play
-    warning: issue a warning but continue
-    ignore: just continue silently
+  choices: *basic_error
   default: warning
 COLOR_CHANGED:
   name: Color for 'changed' task status
@@ -760,7 +757,7 @@ DEFAULT_HASH_BEHAVIOUR:
   - {key: hash_behaviour, section: defaults}
 DEFAULT_HOST_LIST:
   name: Inventory Source
-  default: /etc/ansible/hosts
+  default: [/etc/ansible/hosts]
   description: Comma-separated list of Ansible inventory sources
   env:
     - name: ANSIBLE_INVENTORY
@@ -1057,7 +1054,7 @@ DEFAULT_ROLES_PATH:
   yaml: {key: defaults.roles_path}
 DEFAULT_SELINUX_SPECIAL_FS:
   name: Problematic file systems
-  default: fuse, nfs, vboxsf, ramfs, 9p, vfat
+  default: [fuse, nfs, vboxsf, ramfs, 9p, vfat]
   description:
     - "Some filesystems do not support safe operations and/or return inconsistent errors,
       this setting makes Ansible 'tolerate' those in the list without causing fatal errors."
@@ -1202,15 +1199,6 @@ DEFAULT_VARS_PLUGIN_PATH:
   ini:
     - {key: vars_plugins, section: defaults}
   type: pathspec
-# TODO: unused?
-#DEFAULT_VAR_COMPRESSION_LEVEL:
-#    default: 0
-#    description: 'TODO: write it'
-#    env: [{name: ANSIBLE_VAR_COMPRESSION_LEVEL}]
-#    ini:
-#      - {key: var_compression_level, section: defaults}
-#    type: integer
-#    yaml: {key: defaults.var_compression_level}
 DEFAULT_VAULT_ID_MATCH:
   name: Force vault id match
   default: False
@@ -1336,7 +1324,7 @@ DISPLAY_SKIPPED_HOSTS:
   type: boolean
 DISPLAY_TRACEBACK:
   name: Control traceback display
-  default: never
+  default: [never]
   description: When to include tracebacks in extended error messages
   env:
     - name: ANSIBLE_DISPLAY_TRACEBACK
@@ -1483,15 +1471,6 @@ GALAXY_COLLECTIONS_PATH_WARNING:
   ini:
     - {key: collections_path_warning, section: galaxy}
   version_added: "2.16"
-# TODO: unused?
-#GALAXY_SCMS:
-#    name: Galaxy SCMS
-#    default: git, hg
-#    description: Available galaxy source control management systems.
-#    env: [{name: ANSIBLE_GALAXY_SCMS}]
-#    ini:
-#      - {key: scms, section: galaxy}
-#    type: list
 GALAXY_SERVER:
   default: https://galaxy.ansible.com
   description: "URL to prepend when roles don't specify the full URI, assume they are referencing this server as the source."
@@ -1734,7 +1713,7 @@ INVENTORY_EXPORT:
   type: bool
 INVENTORY_IGNORE_EXTS:
   name: Inventory ignore extensions
-  default: "{{(REJECT_EXTS + ('.orig', '.cfg', '.retry'))}}"
+  default: "{{ REJECT_EXTS + ['.orig', '.cfg', '.retry'] }}"
   description: List of extensions to ignore when using a directory as an inventory source.
   env: [{name: ANSIBLE_INVENTORY_IGNORE}]
   ini:
@@ -1791,7 +1770,7 @@ INJECT_FACTS_AS_VARS:
   version_added: "2.5"
 MODULE_IGNORE_EXTS:
   name: Module ignore extensions
-  default: "{{(REJECT_EXTS + ('.yaml', '.yml', '.ini'))}}"
+  default: "{{ REJECT_EXTS + ['.yaml', '.yml', '.ini'] }}"
   description:
     - List of extensions to ignore when looking for modules to load.
     - This is for rejecting script and binary module fallback extensions.
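These template defaults switch from tuple literals to list literals because `REJECT_EXTS` itself is now a list, and Python refuses to concatenate a list with a tuple. A quick demonstration — the `REJECT_EXTS` value shown here is a stand-in for illustration, not the real config value:

```python
# Stand-in value; the real REJECT_EXTS lives in Ansible's config constants.
REJECT_EXTS = ['.pyc', '.pyo']

# With REJECT_EXTS as a list, the template expression must concatenate with a
# list literal; keeping the old tuple literal would raise TypeError.
module_ignore = REJECT_EXTS + ['.yaml', '.yml', '.ini']

try:
    REJECT_EXTS + ('.yaml', '.yml', '.ini')  # list + tuple is not allowed
except TypeError as ex:
    mixed_concat_error = ex
```

The cleaned-up spacing inside `{{ … }}` is cosmetic; the literal type change is what keeps the expression evaluating.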
@@ -2058,13 +2037,13 @@ TASK_TIMEOUT:
   version_added: '2.10'
 _TEMPLAR_UNKNOWN_TYPE_CONVERSION:
   name: Templar unknown type conversion behavior
-  default: warn
+  default: warning
   description:
     - Action to take when an unknown type is converted for variable storage during template finalization.
     - This setting has no effect on the inability to store unsupported variable types as the result of templating.
     - Experimental diagnostic feature, subject to change.
   type: choices
-  choices: *choices_ignore_warn_fail
+  choices: *basic_error
   env: [{name: _ANSIBLE_TEMPLAR_UNKNOWN_TYPE_CONVERSION}]
   version_added: '2.19'
 _TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED:

@@ -2074,7 +2053,7 @@ _TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED:
     - Action to take when an unknown type is encountered inside a template pipeline.
     - Experimental diagnostic feature, subject to change.
   type: choices
-  choices: *choices_ignore_warn_fail
+  choices: *basic_error
   env: [{name: _ANSIBLE_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED}]
   version_added: '2.19'
 _TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR:

@@ -2086,7 +2065,7 @@ _TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR:
     - This setting has no effect on expressions.
     - Experimental diagnostic feature, subject to change.
   type: choices
-  choices: *choices_ignore_warn_fail
+  choices: *basic_error
   env: [{name: _ANSIBLE_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR}]
   version_added: '2.19'
 WORKER_SHUTDOWN_POLL_COUNT:

@@ -16,11 +16,12 @@ import typing as t
 from collections.abc import Mapping, Sequence

 from jinja2.nativetypes import NativeEnvironment

+from ansible._internal._datatag import _tags
 from ansible.errors import AnsibleOptionsError, AnsibleError, AnsibleUndefinedConfigEntry, AnsibleRequiredOptionError
+from ansible.module_utils._internal._datatag import AnsibleTagHelper
 from ansible.module_utils.common.sentinel import Sentinel
 from ansible.module_utils.common.text.converters import to_text, to_bytes, to_native
 from ansible.module_utils.common.yaml import yaml_load
-from ansible.module_utils.six import string_types
 from ansible.module_utils.parsing.convert_bool import boolean
 from ansible.parsing.quoting import unquote
 from ansible.utils.path import cleanup_tmp_file, makedirs_safe, unfrackpath

@@ -50,6 +51,14 @@ GALAXY_SERVER_ADDITIONAL = {
 }

+@t.runtime_checkable
+class _EncryptedStringProtocol(t.Protocol):
+    """Protocol representing an `EncryptedString`, since it cannot be imported here."""
+
+    # DTFIX-FUTURE: collapse this with the one in collection loader, once we can
+
+    def _decrypt(self) -> str: ...
 def _get_config_label(plugin_type: str, plugin_name: str, config: str) -> str:
     """Return a label for the given config."""
     entry = f'{config!r}'
@@ -65,133 +74,157 @@ def _get_config_label(plugin_type: str, plugin_name: str, config: str) -> str:
     return entry
-# FIXME: see if we can unify in module_utils with similar function used by argspec
-def ensure_type(value, value_type, origin=None, origin_ftype=None):
-    """ return a configuration variable with casting
-    :arg value: The value to ensure correct typing of
-    :kwarg value_type: The type of the value.  This can be any of the following strings:
-        :boolean: sets the value to a True or False value
-        :bool: Same as 'boolean'
-        :integer: Sets the value to an integer or raises a ValueType error
-        :int: Same as 'integer'
-        :float: Sets the value to a float or raises a ValueType error
-        :list: Treats the value as a comma separated list.  Split the value
-            and return it as a python list.
-        :none: Sets the value to None
-        :path: Expands any environment variables and tilde's in the value.
-        :tmppath: Create a unique temporary directory inside of the directory
-            specified by value and return its path.
-        :temppath: Same as 'tmppath'
-        :tmp: Same as 'tmppath'
-        :pathlist: Treat the value as a typical PATH string.  (On POSIX, this
-            means comma separated strings.)  Split the value and then expand
-            each part for environment variables and tildes.
-        :pathspec: Treat the value as a PATH string. Expands any environment variables
-            tildes's in the value.
-        :str: Sets the value to string types.
-        :string: Same as 'str'
-    """
-    errmsg = ''
-    basedir = None
-    if origin and os.path.isabs(origin) and os.path.exists(to_bytes(origin)):
-        basedir = origin
-
-    if value_type:
-        value_type = value_type.lower()
-
-    if value is not None:
-        if value_type in ('boolean', 'bool'):
-            value = boolean(value, strict=False)
-
-        elif value_type in ('integer', 'int'):
-            if not isinstance(value, int):
-                try:
-                    if (decimal_value := decimal.Decimal(value)) == (int_part := int(decimal_value)):
-                        value = int_part
-                    else:
-                        errmsg = 'int'
-                except decimal.DecimalException:
-                    errmsg = 'int'
-
-        elif value_type == 'float':
-            if not isinstance(value, float):
-                value = float(value)
-
-        elif value_type == 'list':
-            if isinstance(value, string_types):
-                value = [unquote(x.strip()) for x in value.split(',')]
-            elif not isinstance(value, Sequence):
-                errmsg = 'list'
-
-        elif value_type == 'none':
-            if value == "None":
-                value = None
-
-            if value is not None:
-                errmsg = 'None'
-
-        elif value_type == 'path':
-            if isinstance(value, string_types):
-                value = resolve_path(value, basedir=basedir)
-            else:
-                errmsg = 'path'
-
-        elif value_type in ('tmp', 'temppath', 'tmppath'):
-            if isinstance(value, string_types):
-                value = resolve_path(value, basedir=basedir)
-                if not os.path.exists(value):
-                    makedirs_safe(value, 0o700)
-                prefix = 'ansible-local-%s' % os.getpid()
-                value = tempfile.mkdtemp(prefix=prefix, dir=value)
-                atexit.register(cleanup_tmp_file, value, warn=True)
-            else:
-                errmsg = 'temppath'
-
-        elif value_type == 'pathspec':
-            if isinstance(value, string_types):
-                value = value.split(os.pathsep)
-
-            if isinstance(value, Sequence):
-                value = [resolve_path(x, basedir=basedir) for x in value]
-            else:
-                errmsg = 'pathspec'
-
-        elif value_type == 'pathlist':
-            if isinstance(value, string_types):
-                value = [x.strip() for x in value.split(',')]
-
-            if isinstance(value, Sequence):
-                value = [resolve_path(x, basedir=basedir) for x in value]
-            else:
-                errmsg = 'pathlist'
-
-        elif value_type in ('dict', 'dictionary'):
-            if not isinstance(value, Mapping):
-                errmsg = 'dictionary'
-
-        elif value_type in ('str', 'string'):
-            if isinstance(value, (string_types, bool, int, float, complex)):
-                value = to_text(value, errors='surrogate_or_strict')
-                if origin_ftype and origin_ftype == 'ini':
-                    value = unquote(value)
-            else:
-                errmsg = 'string'
-
-        # defaults to string type
-        elif isinstance(value, (string_types)):
-            value = to_text(value, errors='surrogate_or_strict')
-            if origin_ftype and origin_ftype == 'ini':
-                value = unquote(value)
-
-        if errmsg:
-            raise ValueError(f'Invalid type provided for {errmsg!r}: {value!r}')
-
-    return to_text(value, errors='surrogate_or_strict', nonstring='passthru')
+def ensure_type(value: object, value_type: str | None, origin: str | None = None, origin_ftype: str | None = None) -> t.Any:
+    """
+    Converts `value` to the requested `value_type`; raises `ValueError` for failed conversions.
+
+    Values for `value_type` are:
+
+    * boolean/bool: Return a `bool` by applying non-strict `bool` filter rules:
+      'y', 'yes', 'on', '1', 'true', 't', 1, 1.0, True return True, any other value is False.
+    * integer/int: Return an `int`. Accepts any `str` parseable by `int` or numeric value with a zero mantissa (including `bool`).
+    * float: Return a `float`. Accepts any `str` parseable by `float` or numeric value (including `bool`).
+    * list: Return a `list`. Accepts `list` or `Sequence`. Also accepts, `str`, splitting on ',' while stripping whitespace and unquoting items.
+    * none: Return `None`. Accepts only the string "None".
+    * path: Return a resolved path. Accepts `str`.
+    * temppath/tmppath/tmp: Return a unique temporary directory inside the resolved path specified by the value.
+    * pathspec: Return a `list` of resolved paths. Accepts a `list` or `Sequence`. Also accepts `str`, splitting on ':'.
+    * pathlist: Return a `list` of resolved paths. Accepts a `list` or `Sequence`. Also accepts `str`, splitting on `,` while stripping whitespace from paths.
+    * dictionary/dict: Return a `dict`. Accepts `dict` or `Mapping`.
+    * string/str: Return a `str`. Accepts `bool`, `int`, `float`, `complex` or `str`.
+
+    Path resolution ensures paths are `str` with expansion of '{{CWD}}', environment variables and '~'.
+    Non-absolute paths are expanded relative to the basedir from `origin`, if specified.
+
+    No conversion is performed if `value_type` is unknown or `value` is `None`.
+    When `origin_ftype` is "ini", a `str` result will be unquoted.
+    """
+    if value is None:
+        return None
+
+    original_value = value
+    copy_tags = value_type not in ('temppath', 'tmppath', 'tmp')
+
+    value = _ensure_type(value, value_type, origin)
+
+    if copy_tags and value is not original_value:
+        if isinstance(value, list):
+            value = [AnsibleTagHelper.tag_copy(original_value, item) for item in value]
+
+        value = AnsibleTagHelper.tag_copy(original_value, value)
+
+    if isinstance(value, str) and origin_ftype and origin_ftype == 'ini':
+        value = unquote(value)
+
+    return value
+
+
+def _ensure_type(value: object, value_type: str | None, origin: str | None = None) -> t.Any:
+    """Internal implementation for `ensure_type`, call that function instead."""
+    original_value = value
+    basedir = origin if origin and os.path.isabs(origin) and os.path.exists(to_bytes(origin)) else None
+
+    if value_type:
+        value_type = value_type.lower()
+
+    match value_type:
+        case 'boolean' | 'bool':
+            return boolean(value, strict=False)
+
+        case 'integer' | 'int':
+            if isinstance(value, int):  # handle both int and bool (which is an int)
+                return int(value)
+
+            if isinstance(value, (float, str)):
+                try:
+                    # use Decimal for all other source type conversions; non-zero mantissa is a failure
+                    if (decimal_value := decimal.Decimal(value)) == (int_part := int(decimal_value)):
+                        return int_part
+                except (decimal.DecimalException, ValueError):
+                    pass
+
+        case 'float':
+            if isinstance(value, float):
+                return value
+
+            if isinstance(value, (int, str)):
+                try:
+                    return float(value)
+                except ValueError:
+                    pass
+
+        case 'list':
+            if isinstance(value, list):
+                return value
+
+            if isinstance(value, str):
+                return [unquote(x.strip()) for x in value.split(',')]
+
+            if isinstance(value, Sequence) and not isinstance(value, bytes):
+                return list(value)
+
+        case 'none':
+            if value == "None":
+                return None
+
+        case 'path':
+            if isinstance(value, str):
+                return resolve_path(value, basedir=basedir)
+
+        case 'temppath' | 'tmppath' | 'tmp':
+            if isinstance(value, str):
+                value = resolve_path(value, basedir=basedir)
+
+                if not os.path.exists(value):
+                    makedirs_safe(value, 0o700)
+
+                prefix = 'ansible-local-%s' % os.getpid()
+                value = tempfile.mkdtemp(prefix=prefix, dir=value)
+                atexit.register(cleanup_tmp_file, value, warn=True)
+
+                return value
+
+        case 'pathspec':
+            if isinstance(value, str):
+                value = value.split(os.pathsep)
+
+            if isinstance(value, Sequence) and not isinstance(value, bytes) and all(isinstance(x, str) for x in value):
+                return [resolve_path(x, basedir=basedir) for x in value]
+
+        case 'pathlist':
+            if isinstance(value, str):
+                value = [x.strip() for x in value.split(',')]

+            if isinstance(value, Sequence) and not isinstance(value, bytes) and all(isinstance(x, str) for x in value):
+                return [resolve_path(x, basedir=basedir) for x in value]
+
+        case 'dictionary' | 'dict':
+            if isinstance(value, dict):
+                return value
+
+            if isinstance(value, Mapping):
+                return dict(value)
+
+        case 'string' | 'str':
+            if isinstance(value, str):
+                return value
+
+            if isinstance(value, (bool, int, float, complex)):
+                return str(value)
+
+            if isinstance(value, _EncryptedStringProtocol):
+                return value._decrypt()
+
+        case _:
+            # FIXME: define and document a pass-through value_type (None, 'raw', 'object', '', ...) and then deprecate acceptance of unknown types
+            return value  # return non-str values of unknown value_type as-is
+
+    raise ValueError(f'Invalid value provided for {value_type!r}: {original_value!r}')
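The integer branch above accepts floats and strings only when the value has a zero mantissa, using `decimal.Decimal` so that exact integral values convert while fractional ones fail. A standalone sketch of that rule (this is an illustrative helper, not the Ansible function itself):

```python
import decimal


def coerce_int(value):
    """Return an int when the value is exactly integral; raise ValueError otherwise."""
    if isinstance(value, int):  # covers bool too, since bool subclasses int
        return int(value)

    if isinstance(value, (float, str)):
        try:
            # Decimal('1.5') != 1, so a non-zero mantissa falls through to the error
            if (dec := decimal.Decimal(value)) == (int_part := int(dec)):
                return int_part
        except (decimal.DecimalException, ValueError):
            pass

    raise ValueError(f'cannot coerce {value!r} to int')


assert coerce_int('42') == 42
assert coerce_int(3.0) == 3
assert coerce_int(True) == 1
```

Catching both `decimal.DecimalException` and `ValueError` matters: unparseable strings raise the former from the `Decimal` constructor, while special values such as `Decimal('NaN')` raise the latter from `int()`.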
 # FIXME: see if this can live in utils/path
-def resolve_path(path, basedir=None):
+def resolve_path(path: str, basedir: str | None = None) -> str:
     """ resolve relative or 'variable' paths """
     if '{{CWD}}' in path:  # allow users to force CWD using 'magic' {{CWD}}
         path = path.replace('{{CWD}}', os.getcwd())
@ -304,11 +337,13 @@ def _add_base_defs_deprecations(base_defs):
process(entry) process(entry)
class ConfigManager(object): class ConfigManager:
DEPRECATED = [] # type: list[tuple[str, dict[str, str]]] DEPRECATED = [] # type: list[tuple[str, dict[str, str]]]
WARNINGS = set() # type: set[str] WARNINGS = set() # type: set[str]
_errors: list[tuple[str, Exception]]
def __init__(self, conf_file=None, defs_file=None): def __init__(self, conf_file=None, defs_file=None):
self._base_defs = {} self._base_defs = {}
@ -329,6 +364,9 @@ class ConfigManager(object):
# initialize parser and read config # initialize parser and read config
self._parse_config_file() self._parse_config_file()
self._errors = []
"""Deferred errors that will be turned into warnings."""
# ensure we always have config def entry # ensure we always have config def entry
self._base_defs['CONFIG_FILE'] = {'default': None, 'type': 'path'} self._base_defs['CONFIG_FILE'] = {'default': None, 'type': 'path'}
@ -368,16 +406,16 @@ class ConfigManager(object):
@@ -368,16 +406,16 @@ class ConfigManager(object):
             defs = dict((k, server_config_def(server_key, k, req, value_type)) for k, req, value_type in GALAXY_SERVER_DEF)
             self.initialize_plugin_configuration_definitions('galaxy_server', server_key, defs)

-    def template_default(self, value, variables):
-        if isinstance(value, string_types) and (value.startswith('{{') and value.endswith('}}')) and variables is not None:
+    def template_default(self, value, variables, key_name: str = '<unknown>'):
+        if isinstance(value, str) and (value.startswith('{{') and value.endswith('}}')) and variables is not None:
             # template default values if possible
             # NOTE: cannot use is_template due to circular dep
             try:
                 # FIXME: This really should be using an immutable sandboxed native environment, not just native environment
-                t = NativeEnvironment().from_string(value)
-                value = t.render(variables)
-            except Exception:
-                pass  # not templatable
+                template = NativeEnvironment().from_string(value)
+                value = template.render(variables)
+            except Exception as ex:
+                self._errors.append((f'Failed to template default for config {key_name}.', ex))
         return value
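The `template_default` change above stops silently swallowing template failures and instead records them on `self._errors` for later reporting as warnings. The deferral pattern itself is simple; a generic sketch (all names here are illustrative, not Ansible's API):

```python
class DeferredErrorList:
    """Collect (message, exception) pairs now; surface them as warnings later."""

    def __init__(self) -> None:
        self._errors: list[tuple[str, Exception]] = []

    def record(self, message: str, ex: Exception) -> None:
        self._errors.append((message, ex))

    def drain(self) -> list[str]:
        """Format and clear all deferred errors."""
        formatted = [f'{msg} ({type(ex).__name__}: {ex})' for msg, ex in self._errors]
        self._errors.clear()
        return formatted


errors = DeferredErrorList()

try:
    int('not-a-number')  # stand-in for a failing template render
except ValueError as ex:
    errors.record('Failed to template default for config DEMO.', ex)

warnings = errors.drain()
assert len(warnings) == 1 and warnings[0].startswith('Failed to template default')
```

Deferring is useful here because config defaults are resolved very early, before the display machinery is reliably available to emit warnings.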
     def _read_config_yaml_file(self, yml_file):
@@ -480,9 +518,9 @@ class ConfigManager(object):
             else:
                 ret = self._plugins.get(plugin_type, {}).get(name, {})

-        if ignore_private:
+        if ignore_private:  # ignore 'test' config entries, they should not change runtime behaviors
             for cdef in list(ret.keys()):
-                if cdef.startswith('_'):
+                if cdef.startswith('_Z_'):
                     del ret[cdef]

         return ret
@@ -631,7 +669,7 @@ class ConfigManager(object):
                         raise AnsibleRequiredOptionError(f"Required config {_get_config_label(plugin_type, plugin_name, config)} not provided.")
                     else:
                         origin = 'default'
-                        value = self.template_default(defs[config].get('default'), variables)
+                        value = self.template_default(defs[config].get('default'), variables, key_name=_get_config_label(plugin_type, plugin_name, config))
                 try:
                     # ensure correct type, can raise exceptions on mismatched types
@@ -658,7 +696,7 @@ class ConfigManager(object):
                 if isinstance(defs[config]['choices'], Mapping):
                     valid = ', '.join([to_text(k) for k in defs[config]['choices'].keys()])
-                elif isinstance(defs[config]['choices'], string_types):
+                elif isinstance(defs[config]['choices'], str):
                     valid = defs[config]['choices']
                 elif isinstance(defs[config]['choices'], Sequence):
                     valid = ', '.join([to_text(c) for c in defs[config]['choices']])
@@ -674,6 +712,9 @@ class ConfigManager(object):
         else:
             raise AnsibleUndefinedConfigEntry(f'No config definition exists for {_get_config_label(plugin_type, plugin_name, config)}.')

+        if not _tags.Origin.is_tagged_on(value):
+            value = _tags.Origin(description=f'<Config {origin}>').tag(value)
+
         return value, origin

     def initialize_plugin_configuration_definitions(self, plugin_type, name, defs):

@@ -10,9 +10,7 @@ from string import ascii_letters, digits

 from ansible.config.manager import ConfigManager
 from ansible.module_utils.common.text.converters import to_text
-from ansible.module_utils.common.collections import Sequence
 from ansible.module_utils.parsing.convert_bool import BOOLEANS_TRUE
-from ansible.release import __version__
 from ansible.utils.fqcn import add_internal_fqcns

 # initialize config manager/config data to read/store global settings
@@ -20,68 +18,11 @@ from ansible.utils.fqcn import add_internal_fqcns
 config = ConfigManager()


-def _warning(msg):
-    """ display is not guaranteed here, nor it being the full class, but try anyways, fallback to sys.stderr.write """
-    try:
-        from ansible.utils.display import Display
-        Display().warning(msg)
-    except Exception:
-        import sys
-        sys.stderr.write(' [WARNING] %s\n' % (msg))
-
-
-def _deprecated(msg, version):
-    """ display is not guaranteed here, nor it being the full class, but try anyways, fallback to sys.stderr.write """
-    try:
-        from ansible.utils.display import Display
-        Display().deprecated(msg, version=version)
-    except Exception:
-        import sys
-        sys.stderr.write(' [DEPRECATED] %s, to be removed in %s\n' % (msg, version))
-
-
-def handle_config_noise(display=None):
-
-    if display is not None:
-        w = display.warning
-        d = display.deprecated
-    else:
-        w = _warning
-        d = _deprecated
-
-    while config.WARNINGS:
-        warn = config.WARNINGS.pop()
-        w(warn)
-
-    while config.DEPRECATED:
-        # tuple with name and options
-        dep = config.DEPRECATED.pop(0)
-        msg = config.get_deprecated_msg_from_config(dep[1])
-        # use tabs only for ansible-doc?
-        msg = msg.replace("\t", "")
-        d(f"{dep[0]} option. {msg}", version=dep[1]['version'])
-
-
 def set_constant(name, value, export=vars()):
     """ sets constants and returns resolved options dict """
     export[name] = value


-class _DeprecatedSequenceConstant(Sequence):
-    def __init__(self, value, msg, version):
-        self._value = value
-        self._msg = msg
-        self._version = version
-
-    def __len__(self):
-        _deprecated(self._msg, self._version)
-        return len(self._value)
-
-    def __getitem__(self, y):
-        _deprecated(self._msg, self._version)
-        return self._value[y]
-
-
 # CONSTANTS ### yes, actual ones

 # The following are hard-coded action names
@@ -119,7 +60,7 @@ COLOR_CODES = {
     'magenta': u'0;35', 'bright magenta': u'1;35',
     'normal': u'0',
 }
-REJECT_EXTS = ('.pyc', '.pyo', '.swp', '.bak', '~', '.rpm', '.md', '.txt', '.rst')
+REJECT_EXTS = ['.pyc', '.pyo', '.swp', '.bak', '~', '.rpm', '.md', '.txt', '.rst']  # this is concatenated with other config settings as lists; cannot be tuple
 BOOL_TRUE = BOOLEANS_TRUE
 COLLECTION_PTYPE_COMPAT = {'module': 'modules'}
@@ -245,6 +186,3 @@ MAGIC_VARIABLE_MAPPING = dict(

 # POPULATE SETTINGS FROM CONFIG ###
 for setting in config.get_configuration_definitions():
     set_constant(setting, config.get_config_value(setting, variables=vars()))
-
-# emit any warnings or deprecations
-handle_config_noise()
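The `REJECT_EXTS` change from tuple to list matters because, per the inline comment, it is concatenated with other config settings that arrive as lists, and in Python `tuple + list` raises `TypeError` while `list + list` works:

```python
reject_exts_tuple = ('.pyc', '.pyo')
reject_exts_list = ['.pyc', '.pyo']
extra_from_config = ['.bak']  # list-typed config values arrive as lists

# tuple + list is a TypeError; list + list concatenates
try:
    combined = reject_exts_tuple + extra_from_config
except TypeError:
    combined = None
assert combined is None

assert reject_exts_list + extra_from_config == ['.pyc', '.pyo', '.bak']
```

Previously this worked only where callers converted types explicitly; making the constant a list matches the `list` type now enforced by the config coercion fixes in this same release.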

@@ -18,6 +18,9 @@ from ..module_utils.datatag import native_type_name
 from ansible._internal._datatag import _tags
 from .._internal._errors import _utils

+if t.TYPE_CHECKING:
+    from ansible.plugins import loader as _t_loader
+

 class ExitCode(enum.IntEnum):
     SUCCESS = 0  # used by TQM, must be bit-flag safe
@@ -374,8 +377,9 @@ class _AnsibleActionDone(AnsibleAction):
 class AnsiblePluginError(AnsibleError):
     """Base class for Ansible plugin-related errors that do not need AnsibleError contextual data."""

-    def __init__(self, message=None, plugin_load_context=None):
-        super(AnsiblePluginError, self).__init__(message)
+    def __init__(self, message: str | None = None, plugin_load_context: _t_loader.PluginLoadContext | None = None, help_text: str | None = None) -> None:
+        super(AnsiblePluginError, self).__init__(message, help_text=help_text)
         self.plugin_load_context = plugin_load_context

@@ -39,7 +39,6 @@ from io import BytesIO
 from ansible._internal import _locking
 from ansible._internal._datatag import _utils
 from ansible.module_utils._internal import _dataclass_validation
-from ansible.module_utils.common.messages import PluginInfo
 from ansible.module_utils.common.yaml import yaml_load
 from ansible._internal._datatag._tags import Origin
 from ansible.module_utils.common.json import Direction, get_module_encoder
@@ -56,6 +55,7 @@ from ansible.template import Templar
 from ansible.utils.collection_loader._collection_finder import _get_collection_metadata, _nested_dict_get
 from ansible.module_utils._internal import _json, _ansiballz
 from ansible.module_utils import basic as _basic
+from ansible.module_utils.common import messages as _messages

 if t.TYPE_CHECKING:
     from ansible import template as _template
@@ -434,7 +434,13 @@ class ModuleUtilLocatorBase:
             else:
                 msg += '.'

-            display.deprecated(msg, removal_version, removed, removal_date, self._collection_name)
+            display.deprecated(  # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
+                msg=msg,
+                version=removal_version,
+                removed=removed,
+                date=removal_date,
+                deprecator=_messages.PluginInfo._from_collection_name(self._collection_name),
+            )
         if 'redirect' in routing_entry:
             self.redirected = True
             source_pkg = '.'.join(name_parts)
@@ -944,7 +950,6 @@ class _CachedModule:
 def _find_module_utils(
     *,
     module_name: str,
-    plugin: PluginInfo,
     b_module_data: bytes,
     module_path: str,
     module_args: dict[object, object],
@@ -1020,7 +1025,9 @@ def _find_module_utils(
         # People should start writing collections instead of modules in roles so we
         # may never fix this
         display.debug('ANSIBALLZ: Could not determine module FQN')
-        remote_module_fqn = 'ansible.modules.%s' % module_name
+        # FIXME: add integration test to validate that builtins and legacy modules with the same name are tracked separately by the caching mechanism
+        # FIXME: surrogate FQN should be unique per source path- role-packaged modules with name collisions can still be aliased
+        remote_module_fqn = 'ansible.legacy.%s' % module_name

     if module_substyle == 'python':
         date_time = datetime.datetime.now(datetime.timezone.utc)
@@ -1126,7 +1133,6 @@ def _find_module_utils(
             module_fqn=remote_module_fqn,
             params=encoded_params,
             profile=module_metadata.serialization_profile,
-            plugin_info_dict=dataclasses.asdict(plugin),
             date_time=date_time,
             coverage_config=coverage_config,
             coverage_output=coverage_output,
@@ -1236,7 +1242,6 @@ def _extract_interpreter(b_module_data):
 def modify_module(
     *,
     module_name: str,
-    plugin: PluginInfo,
     module_path,
     module_args,
     templar,
@@ -1277,7 +1282,6 @@ def modify_module(
     module_bits = _find_module_utils(
         module_name=module_name,
-        plugin=plugin,
         b_module_data=b_module_data,
         module_path=module_path,
         module_args=module_args,
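The fallback FQN change above (`ansible.modules.%s` to `ansible.legacy.%s`) matters because, as the new FIXME notes, cached module builds are tracked by FQN: a role-packaged module that shared the `ansible.modules.` prefix with a builtin of the same name could be served the wrong cached payload. A toy sketch of that collision (the cache shape here is illustrative, not AnsiballZ internals):

```python
# toy cache keyed by fully-qualified module name, mimicking FQN-keyed build caching
cache: dict[str, str] = {}


def get_cached_module(fqn: str, build: str) -> str:
    """Return the cached payload for fqn, building it on first use."""
    if fqn not in cache:
        cache[fqn] = build
    return cache[fqn]


# with a shared prefix, two different sources of 'ping' collide on one key
builtin = get_cached_module('ansible.modules.ping', 'builtin payload')
role_local = get_cached_module('ansible.modules.ping', 'role-local payload')
assert role_local == 'builtin payload'  # wrong payload served from the cache

# distinct prefixes keep the entries separate
assert get_cached_module('ansible.legacy.ping', 'role-local payload') == 'role-local payload'
```

As the second FIXME concedes, the prefix change is only a partial fix: two role-packaged modules with the same name still share a key, which is why a per-source-path surrogate FQN is flagged as future work.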

@@ -32,6 +32,7 @@ from ansible._internal import _task
 from ansible.errors import AnsibleConnectionFailure, AnsibleError
 from ansible.executor.task_executor import TaskExecutor
 from ansible.executor.task_queue_manager import FinalQueue, STDIN_FILENO, STDOUT_FILENO, STDERR_FILENO
+from ansible.executor.task_result import _RawTaskResult
 from ansible.inventory.host import Host
 from ansible.module_utils.common.collections import is_sequence
 from ansible.module_utils.common.text.converters import to_text
@@ -226,7 +227,7 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defined]
             init_plugin_loader(cli_collections_path)

             try:
-                # execute the task and build a TaskResult from the result
+                # execute the task and build a _RawTaskResult from the result
                 display.debug("running TaskExecutor() for %s/%s" % (self._host, self._task))
                 executor_result = TaskExecutor(
                     self._host,
@@ -256,48 +257,52 @@ class WorkerProcess(multiprocessing_context.Process):  # type: ignore[name-defined]
             # put the result on the result queue
             display.debug("sending task result for task %s" % self._task._uuid)
             try:
-                self._final_q.send_task_result(
-                    self._host.name,
-                    self._task._uuid,
-                    executor_result,
+                self._final_q.send_task_result(_RawTaskResult(
+                    host=self._host,
+                    task=self._task,
+                    return_data=executor_result,
                     task_fields=self._task.dump_attrs(),
-                )
+                ))
             except Exception as ex:
                 try:
                     raise AnsibleError("Task result omitted due to queue send failure.") from ex
                 except Exception as ex_wrapper:
-                    self._final_q.send_task_result(
-                        self._host.name,
-                        self._task._uuid,
-                        ActionBase.result_dict_from_exception(ex_wrapper),  # Overriding the task result, to represent the failure
-                        {},  # The failure pickling may have been caused by the task attrs, omit for safety
-                    )
+                    self._final_q.send_task_result(_RawTaskResult(
+                        host=self._host,
+                        task=self._task,
+                        return_data=ActionBase.result_dict_from_exception(ex_wrapper),  # Overriding the task result, to represent the failure
+                        task_fields={},  # The failure pickling may have been caused by the task attrs, omit for safety
+                    ))

             display.debug("done sending task result for task %s" % self._task._uuid)

-        except AnsibleConnectionFailure:
+        except AnsibleConnectionFailure as ex:
+            return_data = ActionBase.result_dict_from_exception(ex)
+            return_data.pop('failed')
+            return_data.update(unreachable=True)
+
             self._host.vars = dict()
             self._host.groups = []
-            self._final_q.send_task_result(
-                self._host.name,
-                self._task._uuid,
-                dict(unreachable=True),
+            self._final_q.send_task_result(_RawTaskResult(
+                host=self._host,
+                task=self._task,
+                return_data=return_data,
                 task_fields=self._task.dump_attrs(),
-            )
+            ))

-        except Exception as e:
-            if not isinstance(e, (IOError, EOFError, KeyboardInterrupt, SystemExit)) or isinstance(e, TemplateNotFound):
+        except Exception as ex:
+            if not isinstance(ex, (IOError, EOFError, KeyboardInterrupt, SystemExit)) or isinstance(ex, TemplateNotFound):
                 try:
                     self._host.vars = dict()
                     self._host.groups = []
-                    self._final_q.send_task_result(
-                        self._host.name,
-                        self._task._uuid,
-                        dict(failed=True, exception=to_text(traceback.format_exc()), stdout=''),
+                    self._final_q.send_task_result(_RawTaskResult(
+                        host=self._host,
+                        task=self._task,
+                        return_data=ActionBase.result_dict_from_exception(ex),
                         task_fields=self._task.dump_attrs(),
-                    )
+                    ))
                 except Exception:
-                    display.debug(u"WORKER EXCEPTION: %s" % to_text(e))
+                    display.debug(u"WORKER EXCEPTION: %s" % to_text(ex))
                     display.debug(u"WORKER TRACEBACK: %s" % to_text(traceback.format_exc()))
         finally:
             self._clean_up()
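The connection-failure branch above now starts from the full exception result dict and rewrites it as unreachable, instead of sending a bare `dict(unreachable=True)`; that preserves the error message and details for callbacks. The dict surgery is just two operations (the result shape below is a simplified stand-in for `result_dict_from_exception` output):

```python
def to_unreachable(result: dict) -> dict:
    """Convert a failure-shaped result dict into an unreachable-shaped one, in place."""
    result.pop('failed')             # no longer reported as a task failure...
    result.update(unreachable=True)  # ...but as a host that could not be reached
    return result


# simplified stand-in for ActionBase.result_dict_from_exception() output
return_data = {'failed': True, 'msg': 'Connection timed out'}

converted = to_unreachable(return_data)
assert converted == {'msg': 'Connection timed out', 'unreachable': True}
```

The distinction matters downstream: `failed` results count against `max_fail_percentage` and trigger rescue blocks, while `unreachable` hosts are removed from the play and can be retried via `meta: clear_host_errors`.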

@@ -20,10 +20,9 @@ from ansible.errors import (
     AnsibleError, AnsibleParserError, AnsibleUndefinedVariable, AnsibleConnectionFailure, AnsibleActionFail, AnsibleActionSkip, AnsibleTaskError,
     AnsibleValueOmittedError,
 )
-from ansible.executor.task_result import TaskResult
+from ansible.executor.task_result import _RawTaskResult
 from ansible._internal._datatag import _utils
-from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
-from ansible.module_utils.common.messages import Detail, WarningSummary, DeprecationSummary
+from ansible.module_utils.common.messages import Detail, WarningSummary, DeprecationSummary, PluginInfo
 from ansible.module_utils.datatag import native_type_name
 from ansible._internal._datatag._tags import TrustedAsTemplate
 from ansible.module_utils.parsing.convert_bool import boolean
@@ -44,6 +43,9 @@ from ansible.vars.clean import namespace_facts, clean_facts
 from ansible.vars.manager import _deprecate_top_level_fact
 from ansible._internal._errors import _captured

+if t.TYPE_CHECKING:
+    from ansible.executor.task_queue_manager import FinalQueue
+

 display = Display()

@@ -79,7 +81,7 @@ class TaskExecutor:
     class.
     """

-    def __init__(self, host, task: Task, job_vars, play_context, loader, shared_loader_obj, final_q, variable_manager):
+    def __init__(self, host, task: Task, job_vars, play_context, loader, shared_loader_obj, final_q: FinalQueue, variable_manager):
         self._host = host
         self._task = task
         self._job_vars = job_vars
@@ -361,10 +363,10 @@ class TaskExecutor:
             if self._connection and not isinstance(self._connection, string_types):
                 task_fields['connection'] = getattr(self._connection, 'ansible_name')

-            tr = TaskResult(
-                self._host.name,
-                self._task._uuid,
-                res,
+            tr = _RawTaskResult(
+                host=self._host,
+                task=self._task,
+                return_data=res,
                 task_fields=task_fields,
             )
@@ -637,8 +639,8 @@ class TaskExecutor:
             if self._task.timeout:
                 old_sig = signal.signal(signal.SIGALRM, task_timeout)
                 signal.alarm(self._task.timeout)
-            with PluginExecContext(self._handler):
-                result = self._handler.run(task_vars=vars_copy)
+
+            result = self._handler.run(task_vars=vars_copy)

             # DTFIX-RELEASE: nuke this, it hides a lot of error detail- remove the active exception propagation hack from AnsibleActionFail at the same time
         except (AnsibleActionFail, AnsibleActionSkip) as e:
@@ -666,17 +668,23 @@ class TaskExecutor:
                     if result.get('failed'):
                         self._final_q.send_callback(
                             'v2_runner_on_async_failed',
-                            TaskResult(self._host.name,
-                                       self._task._uuid,
-                                       result,
-                                       task_fields=self._task.dump_attrs()))
+                            _RawTaskResult(
+                                host=self._host,
+                                task=self._task,
+                                return_data=result,
+                                task_fields=self._task.dump_attrs(),
+                            ),
+                        )
                     else:
                         self._final_q.send_callback(
                             'v2_runner_on_async_ok',
-                            TaskResult(self._host.name,
-                                       self._task._uuid,
-                                       result,
-                                       task_fields=self._task.dump_attrs()))
+                            _RawTaskResult(
+                                host=self._host,
+                                task=self._task,
+                                return_data=result,
+                                task_fields=self._task.dump_attrs(),
+                            ),
+                        )

         if 'ansible_facts' in result and self._task.action not in C._ACTION_DEBUG:
             if self._task.action in C._ACTION_WITH_CLEAN_FACTS:
@@ -756,12 +764,12 @@ class TaskExecutor:
                     display.debug('Retrying task, attempt %d of %d' % (attempt, retries))
                     self._final_q.send_callback(
'v2_runner_retry', 'v2_runner_retry',
TaskResult( _RawTaskResult(
self._host.name, host=self._host,
self._task._uuid, task=self._task,
result, return_data=result,
task_fields=self._task.dump_attrs() task_fields=self._task.dump_attrs()
) ),
) )
time.sleep(delay) time.sleep(delay)
self._handler = self._get_action_handler(templar=templar) self._handler = self._get_action_handler(templar=templar)
@ -835,13 +843,12 @@ class TaskExecutor:
if not isinstance(deprecation, DeprecationSummary): if not isinstance(deprecation, DeprecationSummary):
# translate non-DeprecationMessageDetail message dicts # translate non-DeprecationMessageDetail message dicts
try: try:
if deprecation.pop('collection_name', ...) is not ...: if (collection_name := deprecation.pop('collection_name', ...)) is not ...:
# deprecated: description='enable the deprecation message for collection_name' core_version='2.23' # deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
# CAUTION: This deprecation cannot be enabled until the replacement (deprecator) has been documented, and the schema finalized.
# self.deprecated('The `collection_name` key in the `deprecations` dictionary is deprecated.', version='2.27') # self.deprecated('The `collection_name` key in the `deprecations` dictionary is deprecated.', version='2.27')
pass deprecation.update(deprecator=PluginInfo._from_collection_name(collection_name))
# DTFIX-RELEASE: when plugin isn't set, do it at the boundary where we receive the module/action results
# that may even allow us to never set it in modules/actions directly and to populate it at the boundary
deprecation = DeprecationSummary( deprecation = DeprecationSummary(
details=( details=(
Detail(msg=deprecation.pop('msg')), Detail(msg=deprecation.pop('msg')),
@ -926,10 +933,10 @@ class TaskExecutor:
time_left -= self._task.poll time_left -= self._task.poll
self._final_q.send_callback( self._final_q.send_callback(
'v2_runner_on_async_poll', 'v2_runner_on_async_poll',
TaskResult( _RawTaskResult(
self._host.name, host=self._host,
async_task._uuid, task=async_task,
async_result, return_data=async_result,
task_fields=async_task.dump_attrs(), task_fields=async_task.dump_attrs(),
), ),
) )

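The task timeout handling visible in the hunk above (`signal.signal(signal.SIGALRM, task_timeout)` followed by `signal.alarm(...)`) is the classic POSIX alarm pattern. A minimal standalone sketch of that pattern, with hypothetical names (`TaskTimeoutError`, `run_with_timeout` are illustrative, not Ansible's API):

```python
import signal


class TaskTimeoutError(BaseException):
    """Raised when the alarm fires; hypothetical stand-in for the executor's timeout error."""


def run_with_timeout(func, timeout: int):
    """Run func(), raising TaskTimeoutError if it runs longer than `timeout` seconds (POSIX only)."""
    def on_alarm(signum, frame):
        raise TaskTimeoutError()

    old_sig = signal.signal(signal.SIGALRM, on_alarm)  # install handler, remember the previous one
    signal.alarm(timeout)  # schedule SIGALRM

    try:
        return func()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_sig)  # restore the previous handler


print(run_with_timeout(lambda: sum(range(10)), timeout=5))
```

Note that only the main thread can receive SIGALRM, which is why this approach works in the forked worker process but would not compose with threaded execution.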
@@ -17,6 +17,7 @@
 from __future__ import annotations

+import dataclasses
 import os
 import sys
 import tempfile
@@ -31,7 +32,7 @@
 from ansible.errors import AnsibleError, ExitCode, AnsibleCallbackError
 from ansible._internal._errors._handler import ErrorHandler
 from ansible.executor.play_iterator import PlayIterator
 from ansible.executor.stats import AggregateStats
-from ansible.executor.task_result import TaskResult
+from ansible.executor.task_result import _RawTaskResult, _WireTaskResult
 from ansible.inventory.data import InventoryData
 from ansible.module_utils.six import string_types
 from ansible.module_utils.common.text.converters import to_native
@@ -47,7 +48,8 @@
 from ansible.utils.display import Display
 from ansible.utils.lock import lock_decorator
 from ansible.utils.multiprocessing import context as multiprocessing_context

-from dataclasses import dataclass
+if t.TYPE_CHECKING:
+    from ansible.executor.process.worker import WorkerProcess

 __all__ = ['TaskQueueManager']
@@ -57,12 +59,13 @@
 STDERR_FILENO = 2

 display = Display()

+_T = t.TypeVar('_T')

-class CallbackSend:
-    def __init__(self, method_name, *args, **kwargs):
-        self.method_name = method_name
-        self.args = args
-        self.kwargs = kwargs
+@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
+class CallbackSend:
+    method_name: str
+    wire_task_result: _WireTaskResult

 class DisplaySend:
@@ -72,7 +75,7 @@ class DisplaySend:
         self.kwargs = kwargs

-@dataclass
+@dataclasses.dataclass
 class PromptSend:
     worker_id: int
     prompt: str
@@ -87,19 +90,11 @@ class FinalQueue(multiprocessing.queues.SimpleQueue):
         kwargs['ctx'] = multiprocessing_context
         super().__init__(*args, **kwargs)

-    def send_callback(self, method_name, *args, **kwargs):
-        self.put(
-            CallbackSend(method_name, *args, **kwargs),
-        )
+    def send_callback(self, method_name: str, task_result: _RawTaskResult) -> None:
+        self.put(CallbackSend(method_name=method_name, wire_task_result=task_result.as_wire_task_result()))

-    def send_task_result(self, *args, **kwargs):
-        if isinstance(args[0], TaskResult):
-            tr = args[0]
-        else:
-            tr = TaskResult(*args, **kwargs)
-        self.put(
-            tr,
-        )
+    def send_task_result(self, task_result: _RawTaskResult) -> None:
+        self.put(task_result.as_wire_task_result())

     def send_display(self, method, *args, **kwargs):
         self.put(
@@ -194,11 +189,8 @@ class TaskQueueManager:
         # plugins for inter-process locking.
         self._connection_lockfile = tempfile.TemporaryFile()

-    def _initialize_processes(self, num):
-        self._workers = []
-
-        for i in range(num):
-            self._workers.append(None)
+    def _initialize_processes(self, num: int) -> None:
+        self._workers: list[WorkerProcess | None] = [None] * num

     def load_callbacks(self):
         """
@@ -438,54 +430,72 @@ class TaskQueueManager:
                 defunct = True
         return defunct

+    @staticmethod
+    def _first_arg_of_type(value_type: t.Type[_T], args: t.Sequence) -> _T | None:
+        return next((arg for arg in args if isinstance(arg, value_type)), None)
+
     @lock_decorator(attr='_callback_lock')
     def send_callback(self, method_name, *args, **kwargs):
         # We always send events to stdout callback first, rest should follow config order
         for callback_plugin in [self._stdout_callback] + self._callback_plugins:
             # a plugin that set self.disabled to True will not be called
             # see osx_say.py example for such a plugin
-            if getattr(callback_plugin, 'disabled', False):
+            if callback_plugin.disabled:
                 continue

             # a plugin can opt in to implicit tasks (such as meta). It does this
             # by declaring self.wants_implicit_tasks = True.
-            wants_implicit_tasks = getattr(callback_plugin, 'wants_implicit_tasks', False)
+            if not callback_plugin.wants_implicit_tasks and (task_arg := self._first_arg_of_type(Task, args)) and task_arg.implicit:
+                continue

             # try to find v2 method, fallback to v1 method, ignore callback if no method found
             methods = []
             for possible in [method_name, 'v2_on_any']:
-                gotit = getattr(callback_plugin, possible, None)
-                if gotit is None:
-                    gotit = getattr(callback_plugin, possible.removeprefix('v2_'), None)
-                if gotit is not None:
-                    methods.append(gotit)
-
-            # send clean copies
-            new_args = []
-
-            # If we end up being given an implicit task, we'll set this flag in
-            # the loop below. If the plugin doesn't care about those, then we
-            # check and continue to the next iteration of the outer loop.
-            is_implicit_task = False
-
-            for arg in args:
-                # FIXME: add play/task cleaners
-                if isinstance(arg, TaskResult):
-                    new_args.append(arg.clean_copy())
-                # elif isinstance(arg, Play):
-                # elif isinstance(arg, Task):
-                else:
-                    new_args.append(arg)
-
-                if isinstance(arg, Task) and arg.implicit:
-                    is_implicit_task = True
-
-            if is_implicit_task and not wants_implicit_tasks:
-                continue
+                method = getattr(callback_plugin, possible, None)
+
+                if method is None:
+                    method = getattr(callback_plugin, possible.removeprefix('v2_'), None)
+
+                    if method is not None:
+                        display.deprecated(
+                            msg='The v1 callback API is deprecated.',
+                            version='2.23',
+                            help_text='Use `v2_` prefixed callback methods instead.',
+                        )
+
+                if method is not None and not getattr(method, '_base_impl', False):  # don't bother dispatching to the base impls
+                    if possible == 'v2_on_any':
+                        display.deprecated(
+                            msg='The `v2_on_any` callback method is deprecated.',
+                            version='2.23',
+                            help_text='Use event-specific callback methods instead.',
+                        )
+
+                    methods.append(method)

             for method in methods:
+                # send clean copies
+                new_args = []
+
+                for arg in args:
+                    # FIXME: add play/task cleaners
+                    if isinstance(arg, _RawTaskResult):
+                        copied_tr = arg.as_callback_task_result()
+                        new_args.append(copied_tr)
+                        # this state hack requires that no callback ever accepts > 1 TaskResult object
+                        callback_plugin._current_task_result = copied_tr
+                    else:
+                        new_args.append(arg)
+
                 with self._callback_dispatch_error_handler.handle(AnsibleCallbackError):
                     try:
                         method(*new_args, **kwargs)
+                    except AssertionError:
+                        # Using an `assert` in integration tests is useful.
+                        # Production code should never use `assert` or raise `AssertionError`.
+                        raise
                     except Exception as ex:
                         raise AnsibleCallbackError(f"Callback dispatch {method_name!r} failed for plugin {callback_plugin._load_name!r}.") from ex
+
+            callback_plugin._current_task_result = None

@@ -4,15 +4,24 @@
 from __future__ import annotations

+import collections.abc as _c
+import dataclasses
+import functools
 import typing as t

-from ansible import constants as C
-from ansible.parsing.dataloader import DataLoader
+from ansible import constants
+from ansible.utils import vars as _vars
 from ansible.vars.clean import module_response_deepcopy, strip_internal_keys
+from ansible.module_utils.common import messages as _messages
+from ansible._internal import _collection_proxy
+
+if t.TYPE_CHECKING:
+    from ansible.inventory.host import Host
+    from ansible.playbook.task import Task

 _IGNORE = ('failed', 'skipped')
-_PRESERVE = ('attempts', 'changed', 'retries', '_ansible_no_log')
-_SUB_PRESERVE = {'_ansible_delegated_vars': ('ansible_host', 'ansible_port', 'ansible_user', 'ansible_connection')}
+_PRESERVE = {'attempts', 'changed', 'retries', '_ansible_no_log'}
+_SUB_PRESERVE = {'_ansible_delegated_vars': {'ansible_host', 'ansible_port', 'ansible_user', 'ansible_connection'}}

 # stuff callbacks need
 CLEAN_EXCEPTIONS = (
@@ -23,61 +32,120 @@ CLEAN_EXCEPTIONS = (
 )

-class TaskResult:
+@t.final
+@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
+class _WireTaskResult:
+    """A thin version of `_RawTaskResult` which can be sent over the worker queue."""
+
+    host_name: str
+    task_uuid: str
+    return_data: _c.MutableMapping[str, object]
+    task_fields: _c.Mapping[str, object]
+
+
+class _BaseTaskResult:
     """
     This class is responsible for interpreting the resulting data
     from an executed task, and provides helper methods for determining
     the result of a given task.
     """

-    def __init__(self, host, task, return_data, task_fields=None):
-        self._host = host
-        self._task = task
-
-        if isinstance(return_data, dict):
-            self._result = return_data.copy()
-        else:
-            self._result = DataLoader().load(return_data)
-
-        if task_fields is None:
-            self._task_fields = dict()
-        else:
-            self._task_fields = task_fields
+    def __init__(self, host: Host, task: Task, return_data: _c.MutableMapping[str, t.Any], task_fields: _c.Mapping[str, t.Any]) -> None:
+        self.__host = host
+        self.__task = task
+        self._return_data = return_data  # FIXME: this should be immutable, but strategy result processing mutates it in some corner cases
+        self.__task_fields = task_fields
+
+    @property
+    def host(self) -> Host:
+        """The host associated with this result."""
+        return self.__host
+
+    @property
+    def _host(self) -> Host:
+        """Use the `host` property when supporting only ansible-core 2.19 or later."""
+        # deprecated: description='Deprecate `_host` in favor of `host`' core_version='2.23'
+        return self.__host
+
+    @property
+    def task(self) -> Task:
+        """The task associated with this result."""
+        return self.__task
+
+    @property
+    def _task(self) -> Task:
+        """Use the `task` property when supporting only ansible-core 2.19 or later."""
+        # deprecated: description='Deprecate `_task` in favor of `task`' core_version='2.23'
+        return self.__task
+
+    @property
+    def task_fields(self) -> _c.Mapping[str, t.Any]:
+        """The task fields associated with this result."""
+        return self.__task_fields
+
+    @property
+    def _task_fields(self) -> _c.Mapping[str, t.Any]:
+        """Use the `task_fields` property when supporting only ansible-core 2.19 or later."""
+        # deprecated: description='Deprecate `_task_fields` in favor of `task_fields`' core_version='2.23'
+        return self.__task_fields
+
+    @property
+    def exception(self) -> _messages.ErrorSummary | None:
+        """The error from this task result, if any."""
+        return self._return_data.get('exception')
+
+    @property
+    def warnings(self) -> _c.Sequence[_messages.WarningSummary]:
+        """The warnings for this task, if any."""
+        return _collection_proxy.SequenceProxy(self._return_data.get('warnings') or [])
+
+    @property
+    def deprecations(self) -> _c.Sequence[_messages.DeprecationSummary]:
+        """The deprecation warnings for this task, if any."""
+        return _collection_proxy.SequenceProxy(self._return_data.get('deprecations') or [])
+
+    @property
+    def _loop_results(self) -> list[_c.MutableMapping[str, t.Any]]:
+        """Return a list of loop results. If no loop results are present, an empty list is returned."""
+        results = self._return_data.get('results')
+
+        if not isinstance(results, list):
+            return []
+
+        return results

     @property
-    def task_name(self):
-        return self._task_fields.get('name', None) or self._task.get_name()
+    def task_name(self) -> str:
+        return str(self.task_fields.get('name', '')) or self.task.get_name()

-    def is_changed(self):
+    def is_changed(self) -> bool:
         return self._check_key('changed')

-    def is_skipped(self):
-        # loop results
-        if 'results' in self._result:
-            results = self._result['results']
+    def is_skipped(self) -> bool:
+        if self._loop_results:
             # Loop tasks are only considered skipped if all items were skipped.
             # some squashed results (eg, dnf) are not dicts and can't be skipped individually
-            if results and all(isinstance(res, dict) and res.get('skipped', False) for res in results):
+            if all(isinstance(loop_res, dict) and loop_res.get('skipped', False) for loop_res in self._loop_results):
                 return True

         # regular tasks and squashed non-dict results
-        return self._result.get('skipped', False)
+        return bool(self._return_data.get('skipped', False))

-    def is_failed(self):
-        if 'failed_when_result' in self._result or \
-                'results' in self._result and True in [True for x in self._result['results'] if 'failed_when_result' in x]:
+    def is_failed(self) -> bool:
+        if 'failed_when_result' in self._return_data or any(isinstance(loop_res, dict) and 'failed_when_result' in loop_res for loop_res in self._loop_results):
             return self._check_key('failed_when_result')
-        else:
-            return self._check_key('failed')

-    def is_unreachable(self):
+        return self._check_key('failed')
+
+    def is_unreachable(self) -> bool:
         return self._check_key('unreachable')

-    def needs_debugger(self, globally_enabled=False):
-        _debugger = self._task_fields.get('debugger')
-        _ignore_errors = C.TASK_DEBUGGER_IGNORE_ERRORS and self._task_fields.get('ignore_errors')
+    def needs_debugger(self, globally_enabled: bool = False) -> bool:
+        _debugger = self.task_fields.get('debugger')
+        _ignore_errors = constants.TASK_DEBUGGER_IGNORE_ERRORS and self.task_fields.get('ignore_errors')

         ret = False
         if globally_enabled and ((self.is_failed() and not _ignore_errors) or self.is_unreachable()):
             ret = True
@@ -94,68 +162,96 @@ class TaskResult:
         return ret

-    def _check_key(self, key):
-        """get a specific key from the result or its items"""
-
-        if isinstance(self._result, dict) and key in self._result:
-            return self._result.get(key, False)
-        else:
-            flag = False
-            for res in self._result.get('results', []):
-                if isinstance(res, dict):
-                    flag |= res.get(key, False)
-            return flag
-
-    def clean_copy(self):
-        """ returns 'clean' taskresult object """
-
-        # FIXME: clean task_fields, _task and _host copies
-        result = TaskResult(self._host, self._task, {}, self._task_fields)
+    def _check_key(self, key: str) -> bool:
+        """Fetch a specific named boolean value from the result; if missing, a logical OR of the value from nested loop results; False for non-loop results."""
+        if (value := self._return_data.get(key, ...)) is not ...:
+            return bool(value)
+
+        return any(isinstance(result, dict) and result.get(key) for result in self._loop_results)
+
+
+@t.final
+class _RawTaskResult(_BaseTaskResult):
+    def as_wire_task_result(self) -> _WireTaskResult:
+        """Return a `_WireTaskResult` from this instance."""
+        return _WireTaskResult(
+            host_name=self.host.name,
+            task_uuid=self.task._uuid,
+            return_data=self._return_data,
+            task_fields=self.task_fields,
+        )
+
+    def as_callback_task_result(self) -> CallbackTaskResult:
+        """Return a `CallbackTaskResult` from this instance."""
+        ignore: tuple[str, ...]

         # statuses are already reflected on the event type
-        if result._task and result._task.action in C._ACTION_DEBUG:
+        if self.task and self.task.action in constants._ACTION_DEBUG:
             # debug is verbose by default to display vars, no need to add invocation
             ignore = _IGNORE + ('invocation',)
         else:
             ignore = _IGNORE

-        subset = {}
+        subset: dict[str, dict[str, object]] = {}

         # preserve subset for later
-        for sub in _SUB_PRESERVE:
-            if sub in self._result:
-                subset[sub] = {}
-                for key in _SUB_PRESERVE[sub]:
-                    if key in self._result[sub]:
-                        subset[sub][key] = self._result[sub][key]
+        for sub, sub_keys in _SUB_PRESERVE.items():
+            sub_data = self._return_data.get(sub)
+
+            if isinstance(sub_data, dict):
+                subset[sub] = {key: value for key, value in sub_data.items() if key in sub_keys}

         # DTFIX-FUTURE: is checking no_log here redundant now that we use _ansible_no_log everywhere?
-        if isinstance(self._task.no_log, bool) and self._task.no_log or self._result.get('_ansible_no_log'):
-            censored_result = censor_result(self._result)
+        if isinstance(self.task.no_log, bool) and self.task.no_log or self._return_data.get('_ansible_no_log'):
+            censored_result = censor_result(self._return_data)

-            if results := self._result.get('results'):
+            if self._loop_results:
                 # maintain shape for loop results so callback behavior recognizes a loop was performed
-                censored_result.update(results=[censor_result(item) if item.get('_ansible_no_log') else item for item in results])
+                censored_result.update(results=[
+                    censor_result(loop_res) if isinstance(loop_res, dict) and loop_res.get('_ansible_no_log') else loop_res for loop_res in self._loop_results
+                ])

-            result._result = censored_result
-        elif self._result:
-            result._result = module_response_deepcopy(self._result)
-
-            # actually remove
-            for remove_key in ignore:
-                if remove_key in result._result:
-                    del result._result[remove_key]
+            return_data = censored_result
+        elif self._return_data:
+            return_data = {k: v for k, v in module_response_deepcopy(self._return_data).items() if k not in ignore}

             # remove almost ALL internal keys, keep ones relevant to callback
-            strip_internal_keys(result._result, exceptions=CLEAN_EXCEPTIONS)
+            strip_internal_keys(return_data, exceptions=CLEAN_EXCEPTIONS)
+        else:
+            return_data = {}

         # keep subset
-        result._result.update(subset)
+        return_data.update(subset)
+
+        return CallbackTaskResult(self.host, self.task, return_data, self.task_fields)
+
+
+@t.final
+class CallbackTaskResult(_BaseTaskResult):
+    """Public contract of TaskResult"""
+    # DTFIX-RELEASE: find a better home for this since it's public API
+
+    @property
+    def _result(self) -> _c.MutableMapping[str, t.Any]:
+        """Use the `result` property when supporting only ansible-core 2.19 or later."""
+        # deprecated: description='Deprecate `_result` in favor of `result`' core_version='2.23'
+        return self.result
+
+    @functools.cached_property
+    def result(self) -> _c.MutableMapping[str, t.Any]:
+        """
+        Returns a cached copy of the task result dictionary for consumption by callbacks.
+        Internal custom types are transformed to native Python types to facilitate access and serialization.
+        """
+        return t.cast(_c.MutableMapping[str, t.Any], _vars.transform_to_native_types(self._return_data))

-        return result
+
+TaskResult = CallbackTaskResult
+"""Compatibility name for the pre-2.19 callback-shaped TaskResult passed to callbacks."""

-def censor_result(result: dict[str, t.Any]) -> dict[str, t.Any]:
+def censor_result(result: _c.Mapping[str, t.Any]) -> dict[str, t.Any]:
     censored_result = {key: value for key in _PRESERVE if (value := result.get(key, ...)) is not ...}
     censored_result.update(censored="the output has been hidden due to the fact that 'no_log: true' was specified for this result")

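The reworked `_check_key` above replaces the old `flag |=` accumulation with a walrus-guarded top-level lookup followed by `any()` over the loop items. Its behavior can be restated as a standalone function over a plain result dict (a sketch of the same logic outside the class, not the class itself):

```python
def check_key(result: dict, key: str) -> bool:
    """True if `key` is truthy at the top level, else if any dict loop item has it truthy."""
    # a direct hit on the top-level result wins, even if falsey
    if (value := result.get(key, ...)) is not ...:
        return bool(value)

    # otherwise, OR together the per-item loop results (non-dict items are ignored)
    loop_results = result.get('results')

    if not isinstance(loop_results, list):
        return False

    return any(isinstance(item, dict) and item.get(key) for item in loop_results)


print(check_key({'changed': True}, 'changed'))
print(check_key({'results': [{'changed': False}, {'changed': True}]}, 'changed'))
```

Using `...` (Ellipsis) as the missing-value sentinel lets an explicit `False` at the top level short-circuit the loop scan, which the old implementation also did via its `key in self._result` check.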
@@ -138,7 +138,7 @@ def g_connect(versions):
                     'The v2 Ansible Galaxy API is deprecated and no longer supported. '
                     'Ensure that you have configured the ansible-galaxy CLI to utilize an '
                     'updated and supported version of Ansible Galaxy.',
-                    version='2.20'
+                    version='2.20',
                 )

             return method(self, *args, **kwargs)

@@ -201,9 +201,9 @@ class CollectionSignatureError(Exception):
 # FUTURE: expose actual verify result details for a collection on this object, maybe reimplement as dataclass on py3.8+
 class CollectionVerifyResult:
-    def __init__(self, collection_name):  # type: (str) -> None
-        self.collection_name = collection_name  # type: str
-        self.success = True  # type: bool
+    def __init__(self, collection_name: str) -> None:
+        self.collection_name = collection_name
+        self.success = True

 def verify_local_collection(local_collection, remote_collection, artifacts_manager):

@@ -30,6 +30,7 @@ from random import shuffle

 from ansible import constants as C
 from ansible._internal import _json, _wrapt
+from ansible._internal._json import EncryptedStringBehavior
 from ansible.errors import AnsibleError, AnsibleOptionsError
 from ansible.inventory.data import InventoryData
 from ansible.module_utils.six import string_types
@@ -312,6 +313,7 @@ class InventoryManager(object):
                     ex.obj = origin
                     failures.append({'src': source, 'plugin': plugin_name, 'exc': ex})
                 except Exception as ex:
+                    # DTFIX-RELEASE: fix this error handling to correctly deal with messaging
                     try:
                         # omit line number to prevent contextual display of script or possibly sensitive info
                         raise AnsibleError(str(ex), obj=origin) from ex
@@ -787,7 +789,7 @@ class _InventoryDataWrapper(_wrapt.ObjectProxy):
         return _json.AnsibleVariableVisitor(
             trusted_as_template=self._target_plugin.trusted_by_default,
             origin=self._default_origin,
-            allow_encrypted_string=True,
+            encrypted_string_behavior=EncryptedStringBehavior.PRESERVE,
         )

     def set_variable(self, entity: str, varname: str, value: t.Any) -> None:

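The change above swaps a boolean flag (`allow_encrypted_string=True`) for an enum member (`encrypted_string_behavior=EncryptedStringBehavior.PRESERVE`), which leaves room for more than two behaviors and reads unambiguously at call sites. A sketch of that design choice (only `PRESERVE` appears in the diff; the other member and the `process` function are illustrative, not Ansible's API):

```python
import enum


class EncryptedStringBehavior(enum.Enum):
    """Hypothetical mirror of the enum replacing the old boolean flag."""
    PRESERVE = enum.auto()  # keep the encrypted value as-is (the member used in the diff)
    REDACT = enum.auto()    # illustrative alternative behavior


def process(value: str, is_encrypted: bool, behavior: EncryptedStringBehavior) -> str:
    """Apply the selected behavior to an (optionally encrypted) value."""
    if is_encrypted and behavior is EncryptedStringBehavior.REDACT:
        return '<redacted>'
    return value


print(process('secret', True, EncryptedStringBehavior.PRESERVE))
```

Unlike a bare `True`, a misspelled or removed enum member fails loudly at import or call time, and adding a third behavior later does not change existing call sites.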
@@ -6,7 +6,6 @@
 from __future__ import annotations

 import atexit
-import dataclasses
 import importlib.util
 import json
 import os
@@ -15,17 +14,14 @@ import sys
 import typing as t

 from . import _errors
-from ._plugin_exec_context import PluginExecContext, HasPluginInfo
 from .. import basic
 from ..common.json import get_module_encoder, Direction
-from ..common.messages import PluginInfo

 def run_module(
     *,
     json_params: bytes,
     profile: str,
-    plugin_info_dict: dict[str, object],
     module_fqn: str,
     modlib_path: str,
     init_globals: dict[str, t.Any] | None = None,
@@ -38,7 +34,6 @@ def run_module(
         _run_module(
             json_params=json_params,
             profile=profile,
-            plugin_info_dict=plugin_info_dict,
             module_fqn=module_fqn,
             modlib_path=modlib_path,
             init_globals=init_globals,
@@ -80,7 +75,6 @@ def _run_module(
     *,
     json_params: bytes,
     profile: str,
-    plugin_info_dict: dict[str, object],
     module_fqn: str,
     modlib_path: str,
     init_globals: dict[str, t.Any] | None = None,
@@ -92,12 +86,11 @@ def _run_module(
     init_globals = init_globals or {}
     init_globals.update(_module_fqn=module_fqn, _modlib_path=modlib_path)

-    with PluginExecContext(_ModulePluginWrapper(PluginInfo._from_dict(plugin_info_dict))):
-        # Run the module. By importing it as '__main__', it executes as a script.
-        runpy.run_module(mod_name=module_fqn, init_globals=init_globals, run_name='__main__', alter_sys=True)
+    # Run the module. By importing it as '__main__', it executes as a script.
+    runpy.run_module(mod_name=module_fqn, init_globals=init_globals, run_name='__main__', alter_sys=True)

     # An Ansible module must print its own results and exit. If execution reaches this point, that did not happen.
     raise RuntimeError('New-style module did not handle its own exit.')

 def _handle_exception(exception: BaseException, profile: str) -> t.NoReturn:
@@ -112,22 +105,3 @@ def _handle_exception(exception: BaseException, profile: str) -> t.NoReturn:
     print(json.dumps(result, cls=encoder))  # pylint: disable=ansible-bad-function
     sys.exit(1)  # pylint: disable=ansible-bad-function
-
-
-@dataclasses.dataclass(frozen=True)
-class _ModulePluginWrapper(HasPluginInfo):
-    """Modules aren't plugin instances; this adapter implements the `HasPluginInfo` protocol to allow `PluginExecContext` infra to work with modules."""
-
-    plugin: PluginInfo
-
-    @property
-    def _load_name(self) -> str:
-        return self.plugin.requested_name
-
-    @property
-    def ansible_name(self) -> str:
-        return self.plugin.resolved_name
-
-    @property
-    def plugin_type(self) -> str:
-        return self.plugin.type

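The surviving core of `_run_module` above is `runpy.run_module(..., run_name='__main__', alter_sys=True)`: the module is imported but executed as though it were a script, so its `if __name__ == '__main__':` guard fires. The same `run_name` mechanism can be demonstrated with `runpy.run_path` on a throwaway file:

```python
import os
import runpy
import tempfile

# Write a tiny "module" whose __main__ guard only fires when executed as a script.
with tempfile.NamedTemporaryFile('w', suffix='.py', delete=False) as f:
    f.write("result = 21 * 2\nif __name__ == '__main__':\n    ran_as_main = True\n")
    path = f.name

# run_name='__main__' makes __name__ inside the executed code be '__main__';
# the return value is the resulting module globals.
globals_after = runpy.run_path(path, run_name='__main__')
os.unlink(path)

print(globals_after['result'])
print(globals_after['ran_as_main'])
```

`run_module` works the same way but resolves a dotted module name on `sys.path` (with `alter_sys=True` also patching `sys.argv[0]` and `sys.modules['__main__']` for the duration), which is why an Ansible module payload behaves exactly as if it had been invoked directly.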
@ -1,64 +0,0 @@
"""Patch broken ClassVar support in dataclasses when ClassVar is accessed via a module other than `typing`."""
# deprecated: description='verify ClassVar support in dataclasses has been fixed in Python before removing this patching code', python_version='3.12'
from __future__ import annotations
import dataclasses
import sys
import typing as t
# trigger the bug by exposing typing.ClassVar via a module reference that is not `typing`
_ts = sys.modules[__name__]
ClassVar = t.ClassVar
def patch_dataclasses_is_type() -> None:
if not _is_patch_needed():
return # pragma: nocover
try:
real_is_type = dataclasses._is_type # type: ignore[attr-defined]
except AttributeError: # pragma: nocover
raise RuntimeError("unable to patch broken dataclasses ClassVar support") from None
# patch dataclasses._is_type - impl from https://github.com/python/cpython/blob/4c6d4f5cb33e48519922d635894eef356faddba2/Lib/dataclasses.py#L709-L765
def _is_type(annotation, cls, a_module, a_type, is_type_predicate):
match = dataclasses._MODULE_IDENTIFIER_RE.match(annotation) # type: ignore[attr-defined]
if match:
ns = None
module_name = match.group(1)
if not module_name:
# No module name, assume the class's module did
# "from dataclasses import InitVar".
ns = sys.modules.get(cls.__module__).__dict__
else:
# Look up module_name in the class's module.
module = sys.modules.get(cls.__module__)
if module and module.__dict__.get(module_name): # this is the patched line; removed `is a_module`
ns = sys.modules.get(a_type.__module__).__dict__
if ns and is_type_predicate(ns.get(match.group(2)), a_module):
return True
return False
_is_type._orig_impl = real_is_type # type: ignore[attr-defined] # stash this away to allow unit tests to undo the patch
dataclasses._is_type = _is_type # type: ignore[attr-defined]
try:
if _is_patch_needed():
raise RuntimeError("patching had no effect") # pragma: nocover
except Exception as ex: # pragma: nocover
dataclasses._is_type = real_is_type # type: ignore[attr-defined]
raise RuntimeError("dataclasses ClassVar support is still broken after patching") from ex
def _is_patch_needed() -> bool:
@dataclasses.dataclass
class CheckClassVar:
# this is the broken case requiring patching: ClassVar dot-referenced from a module that is not `typing` is treated as an instance field
# DTFIX-RELEASE: add link to CPython bug report to-be-filed (or update associated deprecation comments if we don't)
a_classvar: _ts.ClassVar[int] # type: ignore[name-defined]
a_field: int
return len(dataclasses.fields(CheckClassVar)) != 1
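The patch above restores the standard behavior: a `ClassVar` annotation must be excluded from a dataclass's fields, even when `ClassVar` is dot-referenced through a module alias under string annotations. A minimal stdlib-only sketch of the expected (fixed) behavior:

```python
import dataclasses
import typing as t


@dataclasses.dataclass
class Demo:
    a_classvar: t.ClassVar[int] = 0  # class-level variable; dataclasses must not treat it as a field
    a_field: int = 1                 # a real instance field


# only `a_field` is an init field; the ClassVar is excluded
assert [f.name for f in dataclasses.fields(Demo)] == ['a_field']
assert Demo(5).a_field == 5
```

The bug being patched surfaces only when the annotation is a string (e.g. under `from __future__ import annotations`) and `ClassVar` is reached via a module other than `typing`, which is why the patched file deliberately exposes `typing.ClassVar` through a self-module reference.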

@@ -1,7 +1,6 @@
from __future__ import annotations
import dataclasses
-import datetime
import typing as t
from ansible.module_utils.common import messages as _messages
@@ -12,27 +11,6 @@ from ansible.module_utils._internal import _datatag
class Deprecated(_datatag.AnsibleDatatagBase):
msg: str
help_text: t.Optional[str] = None
-removal_date: t.Optional[datetime.date] = None
+date: t.Optional[str] = None
-removal_version: t.Optional[str] = None
+version: t.Optional[str] = None
-plugin: t.Optional[_messages.PluginInfo] = None
+deprecator: t.Optional[_messages.PluginInfo] = None
-@classmethod
-def _from_dict(cls, d: t.Dict[str, t.Any]) -> Deprecated:
-source = d
-removal_date = source.get('removal_date')
-if removal_date is not None:
-source = source.copy()
-source['removal_date'] = datetime.date.fromisoformat(removal_date)
-return cls(**source)
-def _as_dict(self) -> t.Dict[str, t.Any]:
-# deprecated: description='no-args super() with slotted dataclass requires 3.14+' python_version='3.13'
-# see: https://github.com/python/cpython/pull/124455
-value = super(Deprecated, self)._as_dict()
-if self.removal_date is not None:
-value['removal_date'] = self.removal_date.isoformat()
-return value

@@ -0,0 +1,134 @@
from __future__ import annotations
import inspect
import re
import pathlib
import sys
import typing as t
from ansible.module_utils.common.messages import PluginInfo
_ansible_module_base_path: t.Final = pathlib.Path(sys.modules['ansible'].__file__).parent
"""Runtime-detected base path of the `ansible` Python package to distinguish between Ansible-owned and external code."""
ANSIBLE_CORE_DEPRECATOR: t.Final = PluginInfo._from_collection_name('ansible.builtin')
"""Singleton `PluginInfo` instance for ansible-core callers where the plugin can/should not be identified in messages."""
INDETERMINATE_DEPRECATOR: t.Final = PluginInfo(resolved_name='indeterminate', type='indeterminate')
"""Singleton `PluginInfo` instance for indeterminate deprecator."""
_DEPRECATOR_PLUGIN_TYPES = frozenset(
{
'action',
'become',
'cache',
'callback',
'cliconf',
'connection',
# doc_fragments - no code execution
# filter - basename inadequate to identify plugin
'httpapi',
'inventory',
'lookup',
'module', # only for collections
'netconf',
'shell',
'strategy',
'terminal',
# test - basename inadequate to identify plugin
'vars',
}
)
"""Plugin types which are valid for identifying a deprecator for deprecation purposes."""
_AMBIGUOUS_DEPRECATOR_PLUGIN_TYPES = frozenset(
{
'filter',
'test',
}
)
"""Plugin types for which basename cannot be used to identify the plugin name."""
def get_best_deprecator(*, deprecator: PluginInfo | None = None, collection_name: str | None = None) -> PluginInfo:
"""Return the best-available `PluginInfo` for the caller of this method."""
_skip_stackwalk = True
if deprecator and collection_name:
raise ValueError('Specify only one of `deprecator` or `collection_name`.')
return deprecator or PluginInfo._from_collection_name(collection_name) or get_caller_plugin_info() or INDETERMINATE_DEPRECATOR
def get_caller_plugin_info() -> PluginInfo | None:
"""Try to get `PluginInfo` for the caller of this method, ignoring marked infrastructure stack frames."""
_skip_stackwalk = True
if frame_info := next((frame_info for frame_info in inspect.stack() if '_skip_stackwalk' not in frame_info.frame.f_locals), None):
return _path_as_core_plugininfo(frame_info.filename) or _path_as_collection_plugininfo(frame_info.filename)
return None # pragma: nocover
def _path_as_core_plugininfo(path: str) -> PluginInfo | None:
"""Return a `PluginInfo` instance if the provided `path` refers to a core plugin."""
try:
relpath = str(pathlib.Path(path).relative_to(_ansible_module_base_path))
except ValueError:
return None # not ansible-core
namespace = 'ansible.builtin'
if match := re.match(r'plugins/(?P<plugin_type>\w+)/(?P<plugin_name>\w+)', relpath):
plugin_name = match.group("plugin_name")
plugin_type = match.group("plugin_type")
if plugin_type not in _DEPRECATOR_PLUGIN_TYPES:
# The plugin type isn't a known deprecator type, so we have to assume the caller is intermediate code.
# We have no way of knowing if the intermediate code is deprecating its own feature, or acting on behalf of another plugin.
# Callers in this case need to identify the deprecating plugin name, otherwise only ansible-core will be reported.
# Reporting ansible-core is never wrong, it just may be missing an additional detail (plugin name) in the "on behalf of" case.
return ANSIBLE_CORE_DEPRECATOR
elif match := re.match(r'modules/(?P<module_name>\w+)', relpath):
# AnsiballZ Python package for core modules
plugin_name = match.group("module_name")
plugin_type = "module"
elif match := re.match(r'legacy/(?P<module_name>\w+)', relpath):
# AnsiballZ Python package for non-core library/role modules
namespace = 'ansible.legacy'
plugin_name = match.group("module_name")
plugin_type = "module"
else:
return ANSIBLE_CORE_DEPRECATOR # non-plugin core path, safe to use ansible-core for the same reason as the non-deprecator plugin type case above
name = f'{namespace}.{plugin_name}'
return PluginInfo(resolved_name=name, type=plugin_type)
def _path_as_collection_plugininfo(path: str) -> PluginInfo | None:
"""Return a `PluginInfo` instance if the provided `path` refers to a collection plugin."""
if not (match := re.search(r'/ansible_collections/(?P<ns>\w+)/(?P<coll>\w+)/plugins/(?P<plugin_type>\w+)/(?P<plugin_name>\w+)', path)):
return None
plugin_type = match.group('plugin_type')
if plugin_type in _AMBIGUOUS_DEPRECATOR_PLUGIN_TYPES:
# We're able to detect the namespace, collection and plugin type -- but we have no way to identify the plugin name currently.
# To keep things simple we'll fall back to just identifying the namespace and collection.
# In the future we could improve the detection and/or make it easier for a caller to identify the plugin name.
return PluginInfo._from_collection_name('.'.join((match.group('ns'), match.group('coll'))))
if plugin_type == 'modules':
plugin_type = 'module'
if plugin_type not in _DEPRECATOR_PLUGIN_TYPES:
# The plugin type isn't a known deprecator type, so we have to assume the caller is intermediate code.
# We have no way of knowing if the intermediate code is deprecating its own feature, or acting on behalf of another plugin.
# Callers in this case need to identify the deprecator to avoid ambiguity, since it could be the same collection or another collection.
return INDETERMINATE_DEPRECATOR
name = '.'.join((match.group('ns'), match.group('coll'), match.group('plugin_name')))
return PluginInfo(resolved_name=name, type=plugin_type)
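The `_skip_stackwalk = True` locals seen throughout these changes are what drives `get_caller_plugin_info`: any frame that defines that local is treated as infrastructure and skipped when walking the stack. A minimal stdlib-only sketch of the same technique (the function names here are illustrative, not from ansible-core):

```python
from __future__ import annotations

import inspect


def find_caller_function() -> str | None:
    """Return the function name of the nearest frame not marked as infrastructure."""
    _skip_stackwalk = True  # mark this frame so it is skipped too
    for frame_info in inspect.stack():
        if '_skip_stackwalk' not in frame_info.frame.f_locals:
            return frame_info.function
    return None


def infra_wrapper() -> str | None:
    _skip_stackwalk = True  # infrastructure frame: invisible to the walk
    return find_caller_function()


def user_code() -> str | None:
    return infra_wrapper()  # reported as the caller, despite the wrapper in between
```

This lets helpers like `AnsibleModule.deprecate` forward through any number of marked wrappers while still attributing the deprecation to the first unmarked (user) frame.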

@@ -1,49 +0,0 @@
from __future__ import annotations
import typing as t
from ._ambient_context import AmbientContextBase
from ..common.messages import PluginInfo
class HasPluginInfo(t.Protocol):
"""Protocol to type-annotate and expose PluginLoader-set values."""
@property
def _load_name(self) -> str:
"""The requested name used to load the plugin."""
@property
def ansible_name(self) -> str:
"""Fully resolved plugin name."""
@property
def plugin_type(self) -> str:
"""Plugin type name."""
class PluginExecContext(AmbientContextBase):
"""Execution context that wraps all plugin invocations to allow infrastructure introspection of the currently-executing plugin instance."""
def __init__(self, executing_plugin: HasPluginInfo) -> None:
self._executing_plugin = executing_plugin
@property
def executing_plugin(self) -> HasPluginInfo:
return self._executing_plugin
@property
def plugin_info(self) -> PluginInfo:
return PluginInfo(
requested_name=self._executing_plugin._load_name,
resolved_name=self._executing_plugin.ansible_name,
type=self._executing_plugin.plugin_type,
)
@classmethod
def get_current_plugin_info(cls) -> PluginInfo | None:
"""Utility method to extract a PluginInfo for the currently executing plugin (or None if no plugin is executing)."""
if ctx := cls.current(optional=True):
return ctx.plugin_info
return None

@@ -0,0 +1,25 @@
from __future__ import annotations
import typing as t
from ..common import messages as _messages
class HasPluginInfo(t.Protocol):
"""Protocol to type-annotate and expose PluginLoader-set values."""
@property
def ansible_name(self) -> str | None:
"""Fully resolved plugin name."""
@property
def plugin_type(self) -> str:
"""Plugin type name."""
def get_plugin_info(value: HasPluginInfo) -> _messages.PluginInfo:
"""Utility method that returns a `PluginInfo` from an object implementing the `HasPluginInfo` protocol."""
return _messages.PluginInfo(
resolved_name=value.ansible_name,
type=value.plugin_type,
)
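The replacement `HasPluginInfo` above is a structural (`typing.Protocol`) type: any object exposing the right attributes satisfies it, with no inheritance required. A small sketch of that duck-typing in action, using a hypothetical stand-in class:

```python
import typing as t
from dataclasses import dataclass


class HasPluginInfo(t.Protocol):
    """Structural type: anything exposing these attributes qualifies."""

    @property
    def ansible_name(self) -> t.Optional[str]: ...

    @property
    def plugin_type(self) -> str: ...


@dataclass
class FakePlugin:  # hypothetical example type; not part of ansible-core
    ansible_name: str
    plugin_type: str


def describe(value: HasPluginInfo) -> str:
    # mirrors how get_plugin_info reads only the protocol attributes
    return f'{value.plugin_type}:{value.ansible_name}'


assert describe(FakePlugin('ns.coll.ping', 'module')) == 'module:ns.coll.ping'
```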

@@ -0,0 +1,14 @@
from __future__ import annotations
import keyword
def validate_collection_name(collection_name: object, name: str = 'collection_name') -> None:
"""Validate a collection name."""
if not isinstance(collection_name, str):
raise TypeError(f"{name} must be {str} instead of {type(collection_name)}")
parts = collection_name.split('.')
if len(parts) != 2 or not all(part.isidentifier() and not keyword.iskeyword(part) for part in parts):
raise ValueError(f"{name} must consist of two non-keyword identifiers separated by '.'")
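The validator added above is small enough to exercise directly; reproducing it verbatim with a few calls shows the accepted shape -- exactly two dot-separated, non-keyword Python identifiers:

```python
import keyword


def validate_collection_name(collection_name: object, name: str = 'collection_name') -> None:
    """Validate a collection name (same logic as the new `_validation` module above)."""
    if not isinstance(collection_name, str):
        raise TypeError(f"{name} must be {str} instead of {type(collection_name)}")

    parts = collection_name.split('.')

    if len(parts) != 2 or not all(part.isidentifier() and not keyword.iskeyword(part) for part in parts):
        raise ValueError(f"{name} must consist of two non-keyword identifiers separated by '.'")


validate_collection_name('community.general')  # OK: namespace.collection

for bad in ('community', 'a.b.c', 'import.x', '1abc.def'):
    try:
        validate_collection_name(bad)
    except ValueError:
        pass  # each fails the two-non-keyword-identifier rule
```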

@@ -53,9 +53,7 @@ try:
except ImportError:
HAS_SYSLOG = False
-# deprecated: description='types.EllipsisType is available in Python 3.10+' python_version='3.9'
-if t.TYPE_CHECKING:
-from builtins import ellipsis
+_UNSET = t.cast(t.Any, object())
try:
from systemd import journal, daemon as systemd_daemon
@@ -77,7 +75,7 @@ except ImportError:
# Python2 & 3 way to get NoneType
NoneType = type(None)
-from ._internal import _traceback, _errors, _debugging
+from ._internal import _traceback, _errors, _debugging, _deprecator
from .common.text.converters import (
to_native,
@@ -341,7 +339,7 @@ def _load_params():
except Exception as ex:
raise Exception("Failed to decode JSON module parameters.") from ex
-if (ansible_module_args := params.get('ANSIBLE_MODULE_ARGS', ...)) is ...:
+if (ansible_module_args := params.get('ANSIBLE_MODULE_ARGS', _UNSET)) is _UNSET:
raise Exception("ANSIBLE_MODULE_ARGS not provided.")
global _PARSED_MODULE_ARGS
@@ -511,16 +509,31 @@ class AnsibleModule(object):
warn(warning)
self.log('[WARNING] %s' % warning)
-def deprecate(self, msg, version=None, date=None, collection_name=None):
-if version is not None and date is not None:
-raise AssertionError("implementation error -- version and date must not both be set")
-deprecate(msg, version=version, date=date)
-# For compatibility, we accept that neither version nor date is set,
-# and treat that the same as if version would not have been set
-if date is not None:
-self.log('[DEPRECATION WARNING] %s %s' % (msg, date))
-else:
-self.log('[DEPRECATION WARNING] %s %s' % (msg, version))
+def deprecate(
+self,
+msg: str,
+version: str | None = None,
+date: str | None = None,
+collection_name: str | None = None,
+*,
+deprecator: _messages.PluginInfo | None = None,
+help_text: str | None = None,
+) -> None:
+"""
+Record a deprecation warning to be returned with the module result.
+Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
+Specify `version` or `date`, but not both.
+If `date` is a string, it must be in the form `YYYY-MM-DD`.
+"""
+_skip_stackwalk = True
+deprecate(  # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
+msg=msg,
+version=version,
+date=date,
+deprecator=_deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name),
+help_text=help_text,
+)
def load_file_common_arguments(self, params, path=None):
"""
@@ -1406,6 +1419,7 @@ class AnsibleModule(object):
self.cleanup(path)
def _return_formatted(self, kwargs):
+_skip_stackwalk = True
self.add_path_info(kwargs)
@@ -1413,6 +1427,13 @@
kwargs['invocation'] = {'module_args': self.params}
if 'warnings' in kwargs:
+self.deprecate(  # pylint: disable=ansible-deprecated-unnecessary-collection-name
+msg='Passing `warnings` to `exit_json` or `fail_json` is deprecated.',
+version='2.23',
+help_text='Use `AnsibleModule.warn` instead.',
+deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
+)
if isinstance(kwargs['warnings'], list):
for w in kwargs['warnings']:
self.warn(w)
@@ -1424,17 +1445,38 @@
kwargs['warnings'] = warnings
if 'deprecations' in kwargs:
+self.deprecate(  # pylint: disable=ansible-deprecated-unnecessary-collection-name
+msg='Passing `deprecations` to `exit_json` or `fail_json` is deprecated.',
+version='2.23',
+help_text='Use `AnsibleModule.deprecate` instead.',
+deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
+)
if isinstance(kwargs['deprecations'], list):
for d in kwargs['deprecations']:
-if isinstance(d, SEQUENCETYPE) and len(d) == 2:
+if isinstance(d, (KeysView, Sequence)) and len(d) == 2:
-self.deprecate(d[0], version=d[1])
+self.deprecate(  # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-invalid-deprecated-version
+msg=d[0],
+version=d[1],
+deprecator=_deprecator.get_best_deprecator(),
+)
elif isinstance(d, Mapping):
-self.deprecate(d['msg'], version=d.get('version'), date=d.get('date'),
-collection_name=d.get('collection_name'))
+self.deprecate(  # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
+msg=d['msg'],
+version=d.get('version'),
+date=d.get('date'),
+deprecator=_deprecator.get_best_deprecator(collection_name=d.get('collection_name')),
+)
else:
-self.deprecate(d)  # pylint: disable=ansible-deprecated-no-version
+self.deprecate(  # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-deprecated-no-version
+msg=d,
+deprecator=_deprecator.get_best_deprecator(),
+)
else:
-self.deprecate(kwargs['deprecations'])  # pylint: disable=ansible-deprecated-no-version
+self.deprecate(  # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-deprecated-no-version
+msg=kwargs['deprecations'],
+deprecator=_deprecator.get_best_deprecator(),
+)
deprecations = get_deprecations()
if deprecations:
@@ -1454,12 +1496,13 @@
def exit_json(self, **kwargs) -> t.NoReturn:
""" return from the module, without error """
+_skip_stackwalk = True
self.do_cleanup_files()
self._return_formatted(kwargs)
sys.exit(0)
-def fail_json(self, msg: str, *, exception: BaseException | str | ellipsis | None = ..., **kwargs) -> t.NoReturn:
+def fail_json(self, msg: str, *, exception: BaseException | str | None = _UNSET, **kwargs) -> t.NoReturn:
"""
Return from the module with an error message and optional exception/traceback detail.
A traceback will only be included in the result if error traceback capturing has been enabled.
@@ -1475,6 +1518,8 @@ class AnsibleModule(object):
When `exception` is not specified, a formatted traceback will be retrieved from the current exception.
If no exception is pending, the current call stack will be used instead.
"""
+_skip_stackwalk = True
msg = str(msg)  # coerce to str instead of raising an error due to an invalid type
kwargs.update(
@@ -1498,7 +1543,7 @@
if isinstance(exception, str):
formatted_traceback = exception
-elif exception is ... and (current_exception := t.cast(t.Optional[BaseException], sys.exc_info()[1])):
+elif exception is _UNSET and (current_exception := t.cast(t.Optional[BaseException], sys.exc_info()[1])):
formatted_traceback = _traceback.maybe_extract_traceback(current_exception, _traceback.TracebackEvent.ERROR)
else:
formatted_traceback = _traceback.maybe_capture_traceback(_traceback.TracebackEvent.ERROR)

@@ -22,6 +22,7 @@ from ansible.module_utils.common.parameters import (
from ansible.module_utils.common.text.converters import to_native
from ansible.module_utils.common.warnings import deprecate, warn
+from ansible.module_utils.common import messages as _messages
from ansible.module_utils.common.validation import (
check_mutually_exclusive,
@@ -300,9 +301,13 @@ class ModuleArgumentSpecValidator(ArgumentSpecValidator):
result = super(ModuleArgumentSpecValidator, self).validate(parameters)
for d in result._deprecations:
-deprecate(d['msg'],
-version=d.get('version'), date=d.get('date'),
-collection_name=d.get('collection_name'))
+# DTFIX-FUTURE: pass an actual deprecator instead of one derived from collection_name
+deprecate(  # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
+msg=d['msg'],
+version=d.get('version'),
+date=d.get('date'),
+deprecator=_messages.PluginInfo._from_collection_name(d.get('collection_name')),
+)
for w in result._warnings:
warn('Both option {option} and its alias {alias} are set.'.format(option=w['option'], alias=w['alias']))

@@ -13,7 +13,7 @@ import dataclasses as _dataclasses
# deprecated: description='typing.Self exists in Python 3.11+' python_version='3.10'
from ..compat import typing as _t
-from ansible.module_utils._internal import _datatag
+from ansible.module_utils._internal import _datatag, _validation
if _sys.version_info >= (3, 10):
# Using slots for reduced memory usage and improved performance.
@ -27,13 +27,27 @@ else:
class PluginInfo(_datatag.AnsibleSerializableDataclass): class PluginInfo(_datatag.AnsibleSerializableDataclass):
"""Information about a loaded plugin.""" """Information about a loaded plugin."""
requested_name: str
"""The plugin name as requested, before resolving, which may be partially or fully qualified."""
resolved_name: str resolved_name: str
"""The resolved canonical plugin name; always fully-qualified for collection plugins.""" """The resolved canonical plugin name; always fully-qualified for collection plugins."""
type: str type: str
"""The plugin type.""" """The plugin type."""
_COLLECTION_ONLY_TYPE: _t.ClassVar[str] = 'collection'
"""This is not a real plugin type. It's a placeholder for use by a `PluginInfo` instance which references a collection without a plugin."""
@classmethod
def _from_collection_name(cls, collection_name: str | None) -> _t.Self | None:
"""Returns an instance with the special `collection` type to refer to a non-plugin or ambiguous caller within a collection."""
if not collection_name:
return None
_validation.validate_collection_name(collection_name)
return cls(
resolved_name=collection_name,
type=cls._COLLECTION_ONLY_TYPE,
)
@_dataclasses.dataclass(**_dataclass_kwargs) @_dataclasses.dataclass(**_dataclass_kwargs)
class Detail(_datatag.AnsibleSerializableDataclass): class Detail(_datatag.AnsibleSerializableDataclass):
@@ -75,34 +89,37 @@ class WarningSummary(SummaryBase):
class DeprecationSummary(WarningSummary):
"""Deprecation summary with details (possibly derived from an exception __cause__ chain) and an optional traceback."""
-version: _t.Optional[str] = None
-date: _t.Optional[str] = None
-plugin: _t.Optional[PluginInfo] = None
-@property
-def collection_name(self) -> _t.Optional[str]:
-if not self.plugin:
-return None
-parts = self.plugin.resolved_name.split('.')
-if len(parts) < 2:
-return None
-collection_name = '.'.join(parts[:2])
-# deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
-# from ansible.module_utils.datatag import deprecate_value
-# collection_name = deprecate_value(collection_name, 'The `collection_name` property is deprecated.', removal_version='2.27')
-return collection_name
+deprecator: _t.Optional[PluginInfo] = None
+"""
+The identifier for the content which is being deprecated.
+"""
+date: _t.Optional[str] = None
+"""
+The date after which a new release of `deprecator` will remove the feature described by `msg`.
+Ignored if `deprecator` is not provided.
+"""
+version: _t.Optional[str] = None
+"""
+The version of `deprecator` which will remove the feature described by `msg`.
+Ignored if `deprecator` is not provided.
+Ignored if `date` is provided.
+"""
def _as_simple_dict(self) -> _t.Dict[str, _t.Any]:
"""Returns a dictionary representation of the deprecation object in the format exposed to playbooks."""
+from ansible.module_utils._internal._deprecator import INDETERMINATE_DEPRECATOR  # circular import from messages
+if self.deprecator and self.deprecator != INDETERMINATE_DEPRECATOR:
+collection_name = '.'.join(self.deprecator.resolved_name.split('.')[:2])
+else:
+collection_name = None
result = self._as_dict()
result.update(
msg=self._format(),
-collection_name=self.collection_name,
+collection_name=collection_name,
)
return result
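`_as_simple_dict` above derives the legacy `collection_name` field by truncating the deprecator's resolved name to its first two segments. The rule itself is one line; the helper name below is illustrative, not part of ansible-core:

```python
def collection_name_from(resolved_name: str) -> str:
    """Derive 'ns.coll' from a resolved plugin name such as 'ns.coll.plugin'."""
    # 'ns.coll.plugin' and 'ns.coll' both map to 'ns.coll'
    return '.'.join(resolved_name.split('.')[:2])


assert collection_name_from('community.general.ufw') == 'community.general'
assert collection_name_from('ansible.builtin') == 'ansible.builtin'
```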

@@ -29,7 +29,6 @@ def get_bin_path(arg, opt_dirs=None, required=None):
deprecate(
msg="The `required` parameter in `get_bin_path` API is deprecated.",
version="2.21",
-collection_name="ansible.builtin",
)
paths = []

@@ -3,14 +3,12 @@
from __future__ import annotations
-import dataclasses
import os
import pathlib
import subprocess
import sys
import typing as t
-from ansible.module_utils._internal import _plugin_exec_context
from ansible.module_utils.common.text.converters import to_bytes
_ANSIBLE_PARENT_PATH = pathlib.Path(__file__).parents[3]
@@ -99,7 +97,6 @@ if __name__ == '__main__':
json_params = {json_params!r}
profile = {profile!r}
-plugin_info_dict = {plugin_info_dict!r}
module_fqn = {module_fqn!r}
modlib_path = {modlib_path!r}
@@ -110,19 +107,15 @@ if __name__ == '__main__':
_ansiballz.run_module(
json_params=json_params,
profile=profile,
-plugin_info_dict=plugin_info_dict,
module_fqn=module_fqn,
modlib_path=modlib_path,
init_globals=dict(_respawned=True),
)
"""
-plugin_info = _plugin_exec_context.PluginExecContext.get_current_plugin_info()
respawn_code = respawn_code_template.format(
json_params=basic._ANSIBLE_ARGS,
profile=basic._ANSIBLE_PROFILE,
-plugin_info_dict=dataclasses.asdict(plugin_info),
module_fqn=module_fqn,
modlib_path=modlib_path,
)

@@ -4,15 +4,12 @@
from __future__ import annotations as _annotations
-import datetime as _datetime
import typing as _t
-from ansible.module_utils._internal import _traceback, _plugin_exec_context
+from ansible.module_utils._internal import _traceback, _deprecator
from ansible.module_utils.common import messages as _messages
from ansible.module_utils import _internal
-_UNSET = _t.cast(_t.Any, ...)
def warn(warning: str) -> None:
"""Record a warning to be returned with the module result."""
@@ -28,22 +25,23 @@ def warn(warning: str) -> None:
def deprecate(
msg: str,
version: str | None = None,
-date: str | _datetime.date | None = None,
+date: str | None = None,
-collection_name: str | None = _UNSET,
+collection_name: str | None = None,
*,
+deprecator: _messages.PluginInfo | None = None,
help_text: str | None = None,
obj: object | None = None,
) -> None:
"""
-Record a deprecation warning to be returned with the module result.
+Record a deprecation warning.
The `obj` argument is only useful in a controller context; it is ignored for target-side callers.
+Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
+Specify `version` or `date`, but not both.
+If `date` is a string, it must be in the form `YYYY-MM-DD`.
"""
-if isinstance(date, _datetime.date):
-date = str(date)
-# deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
-# if collection_name is not _UNSET:
-#     deprecate('The `collection_name` argument to `deprecate` is deprecated.', version='2.27')
+_skip_stackwalk = True
+deprecator = _deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name)
if _internal.is_controller:
_display = _internal.import_controller_module('ansible.utils.display').Display()
@@ -53,6 +51,8 @@ def deprecate(
date=date,
help_text=help_text,
obj=obj,
+# skip passing collection_name; get_best_deprecator already accounted for it when present
+deprecator=deprecator,
)
return
@@ -64,7 +64,7 @@ def deprecate(
formatted_traceback=_traceback.maybe_capture_traceback(_traceback.TracebackEvent.DEPRECATED),
version=version,
date=date,
-plugin=_plugin_exec_context.PluginExecContext.get_current_plugin_info(),
+deprecator=deprecator,
)] = None

@@ -1,11 +1,11 @@
"""Public API for data tagging."""
from __future__ import annotations as _annotations
-import datetime as _datetime
import typing as _t
-from ._internal import _plugin_exec_context, _datatag
+from ._internal import _datatag, _deprecator
from ._internal._datatag import _tags
+from .common import messages as _messages
_T = _t.TypeVar('_T')
@@ -14,28 +14,28 @@ def deprecate_value(
value: _T,
msg: str,
*,
+version: str | None = None,
+date: str | None = None,
+collection_name: str | None = None,
+deprecator: _messages.PluginInfo | None = None,
help_text: str | None = None,
-removal_date: str | _datetime.date | None = None,
-removal_version: str | None = None,
) -> _T:
"""
Return `value` tagged with the given deprecation details.
The types `None` and `bool` cannot be deprecated and are returned unmodified.
Raises a `TypeError` if `value` is not a supported type.
-If `removal_date` is a string, it must be in the form `YYYY-MM-DD`.
-This function is only supported in contexts where an Ansible plugin/module is executing.
+Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
+Specify `version` or `date`, but not both.
+If `date` is provided, it should be in the form `YYYY-MM-DD`.
"""
-if isinstance(removal_date, str):
-# The `fromisoformat` method accepts other ISO 8601 formats than `YYYY-MM-DD` starting with Python 3.11.
-# That should be considered undocumented behavior of `deprecate_value` rather than an intentional feature.
-removal_date = _datetime.date.fromisoformat(removal_date)
+_skip_stackwalk = True
deprecated = _tags.Deprecated(
msg=msg,
help_text=help_text,
-removal_date=removal_date,
+date=date,
-removal_version=removal_version,
+version=version,
-plugin=_plugin_exec_context.PluginExecContext.get_current_plugin_info(),
+deprecator=_deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name),
)
return deprecated.tag(value)

@@ -151,7 +151,7 @@ class LinuxVirtual(Virtual):
         sys_vendor = get_file_content('/sys/devices/virtual/dmi/id/sys_vendor')
         product_family = get_file_content('/sys/devices/virtual/dmi/id/product_family')
-        if product_name in ('KVM', 'KVM Server', 'Bochs', 'AHV'):
+        if product_name in ('KVM', 'KVM Server', 'Bochs', 'AHV', 'CloudStack KVM Hypervisor'):
             guest_tech.add('kvm')
             if not found_virt:
                 virtual_facts['virtualization_type'] = 'kvm'

@@ -3,6 +3,8 @@
 from __future__ import annotations
+import collections.abc as c
 from ansible.module_utils.six import binary_type, text_type
 from ansible.module_utils.common.text.converters import to_text
@@ -17,9 +19,13 @@ def boolean(value, strict=True):
         return value
     normalized_value = value
     if isinstance(value, (text_type, binary_type)):
         normalized_value = to_text(value, errors='surrogate_or_strict').lower().strip()
+    if not isinstance(value, c.Hashable):
+        normalized_value = None  # prevent unhashable types from bombing, but keep the rest of the existing fallback/error behavior
     if normalized_value in BOOLEANS_TRUE:
         return True
     elif normalized_value in BOOLEANS_FALSE or not strict:
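The guard added above exists because the `in` membership tests hash the operand, so an unhashable input such as a list or dict previously raised an unhandled `TypeError`. A self-contained sketch of the fixed logic — the inlined constants approximate the real `BOOLEANS_TRUE`/`BOOLEANS_FALSE`, and bytes decoding is simplified:

```python
import collections.abc as c

BOOLEANS_TRUE = frozenset(('y', 'yes', 'on', '1', 'true', 't', 1, 1.0, True))
BOOLEANS_FALSE = frozenset(('n', 'no', 'off', '0', 'false', 'f', 0, 0.0, False))


def boolean(value, strict=True):
    """Coerce common truthy/falsy spellings to bool, tolerating unhashable inputs."""
    if isinstance(value, bool):
        return value
    normalized_value = value
    if isinstance(value, (str, bytes)):
        text = value.decode(errors='surrogateescape') if isinstance(value, bytes) else value
        normalized_value = text.lower().strip()
    if not isinstance(value, c.Hashable):
        # unhashable types (lists, dicts) would raise TypeError on the set membership tests below
        normalized_value = None
    if normalized_value in BOOLEANS_TRUE:
        return True
    if normalized_value in BOOLEANS_FALSE or not strict:
        return False
    raise TypeError(f"The value {value!r} is not a valid boolean.")
```

With the guard, an unhashable input falls through to the normal fallback: `False` when `strict=False`, a clean `TypeError` otherwise.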

@@ -181,7 +181,7 @@ def assemble_from_fragments(src_path, delimiter=None, compiled_regexp=None, igno
     return temp_path
-def cleanup(path, result=None):
+def cleanup(module, path, result=None):
     # cleanup just in case
     if os.path.exists(path):
         try:
@@ -189,7 +189,7 @@ def cleanup(path, result=None):
         except (IOError, OSError) as e:
             # don't error on possible race conditions, but keep warning
             if result is not None:
-                result['warnings'] = ['Unable to remove temp file (%s): %s' % (path, to_native(e))]
+                module.warn('Unable to remove temp file (%s): %s' % (path, to_native(e)))
 def main():
@@ -261,7 +261,7 @@ def main():
             (rc, out, err) = module.run_command(validate % path)
             result['validation'] = dict(rc=rc, stdout=out, stderr=err)
             if rc != 0:
-                cleanup(path)
+                cleanup(module, path)
                 module.fail_json(msg="failed to validate: rc:%s error:%s" % (rc, err))
         if backup and dest_hash is not None:
             result['backup_file'] = module.backup_local(dest)
@@ -269,7 +269,7 @@ def main():
         module.atomic_move(path, dest, unsafe_writes=module.params['unsafe_writes'])
         changed = True
-    cleanup(path, result)
+    cleanup(module, path, result)
     # handle file permissions
     file_args = module.load_file_common_arguments(module.params)

@@ -68,7 +68,7 @@ EXAMPLES = r"""
   ansible.builtin.async_status:
     jid: '{{ dnf_sleeper.ansible_job_id }}'
   register: job_result
-  until: job_result.finished
+  until: job_result is finished
  retries: 100
  delay: 10

@@ -618,7 +618,6 @@ def main():
     changed = False
     res_args = dict()
-    warnings = list()
     if cron_file:
@@ -627,8 +626,8 @@ def main():
         cron_file_basename = os.path.basename(cron_file)
         if not re.search(r'^[A-Z0-9_-]+$', cron_file_basename, re.I):
-            warnings.append('Filename portion of cron_file ("%s") should consist' % cron_file_basename +
-                            ' solely of upper- and lower-case letters, digits, underscores, and hyphens')
+            module.warn('Filename portion of cron_file ("%s") should consist' % cron_file_basename +
+                        ' solely of upper- and lower-case letters, digits, underscores, and hyphens')
     # Ensure all files generated are only writable by the owning user.  Primarily relevant for the cron_file option.
     os.umask(int('022', 8))
@@ -693,7 +692,7 @@ def main():
     if do_install:
         for char in ['\r', '\n']:
             if char in job.strip('\r\n'):
-                warnings.append('Job should not contain line breaks')
+                module.warn('Job should not contain line breaks')
                 break
         job = crontab.get_cron_job(minute, hour, day, month, weekday, job, special_time, disabled)
@@ -734,7 +733,6 @@ def main():
         res_args = dict(
             jobs=crontab.get_jobnames(),
             envs=crontab.get_envnames(),
-            warnings=warnings,
             changed=changed
         )

@@ -722,6 +722,7 @@ class Dnf5Module(YumDnf):
             if self.security:
                 types.append("security")
             advisory_query.filter_type(types)
+            conf.skip_unavailable = True  # ignore packages that are of a different type, for backwards compat
             settings.set_advisory_filter(advisory_query)
         goal = libdnf5.base.Goal(base)
@@ -797,7 +798,7 @@ class Dnf5Module(YumDnf):
         if self.module.check_mode:
             if results:
                 msg = "Check mode: No changes made, but would have if not in check mode"
-        else:
+        elif changed:
             transaction.download()
             if not self.download_only:
                 transaction.set_description("ansible dnf5 module")

@@ -87,7 +87,7 @@ options:
      - 'If a checksum is passed to this parameter, the digest of the
        destination file will be calculated after it is downloaded to ensure
        its integrity and verify that the transfer completed successfully.
-       Format: <algorithm>:<checksum|url>, for example C(checksum="sha256:D98291AC[...]B6DC7B97",
+       Format: <algorithm>:<checksum|url>, for example C(checksum="sha256:D98291AC[...]B6DC7B97"),
        C(checksum="sha256:http://example.com/path/sha256sum.txt").'
      - If you worry about portability, only the sha1 algorithm is available
        on all platforms and python versions.
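The corrected documentation describes a `<algorithm>:<checksum|url>` spec. Splitting at the first colon keeps a URL value (which itself contains colons) intact; the `parse_checksum` helper below is an illustrative sketch, not the module's actual code:

```python
import hashlib


def parse_checksum(spec: str) -> tuple[str, str]:
    """Split an '<algorithm>:<checksum|url>' spec into (algorithm, checksum_or_url)."""
    # partition at the FIRST colon so 'sha256:http://...' keeps the URL whole
    algorithm, _, rest = spec.partition(':')
    if not rest:
        raise ValueError(f'invalid checksum format: {spec!r}')
    if algorithm not in hashlib.algorithms_available:
        raise ValueError(f'unsupported checksum algorithm: {algorithm!r}')
    return algorithm, rest
```

When the second part is a URL, the caller would fetch it and look up the digest for the destination filename, as the doc text describes.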

@@ -317,11 +317,6 @@ remote_url_changed:
     returned: success
     type: bool
     sample: True
-warnings:
-    description: List of warnings if requested features were not available due to a too old git version.
-    returned: error
-    type: str
-    sample: git version is too old to fully support the depth argument. Falling back to full checkouts.
 git_dir_now:
     description: Contains the new path of .git directory if it is changed.
     returned: success
@@ -1240,7 +1235,7 @@ def main():
     archive_prefix = module.params['archive_prefix']
     separate_git_dir = module.params['separate_git_dir']
-    result = dict(changed=False, warnings=list())
+    result = dict(changed=False)
     if module.params['accept_hostkey']:
         if ssh_opts is not None:

@@ -814,10 +814,8 @@ def main():
         elif requirements:
             cmd.extend(['-r', requirements])
         else:
-            module.exit_json(
-                changed=False,
-                warnings=["No valid name or requirements file found."],
-            )
+            module.warn("No valid name or requirements file found.")
+            module.exit_json(changed=False)
         if module.check_mode:
             if extra_args or requirements or state == 'latest' or not name:

@@ -88,9 +88,9 @@ EXAMPLES = """
 - name: Sleep for 5 seconds between stop and start command of badly behaving service
   ansible.builtin.sysvinit:
     name: apache2
     state: restarted
     sleep: 5
 - name: Make sure apache2 is started on runlevels 3 and 5
   ansible.builtin.sysvinit:

@@ -0,0 +1,40 @@
+# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
+#
+# This file is part of Ansible
+#
+# Ansible is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# Ansible is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with Ansible.  If not, see <http://www.gnu.org/licenses/>.
+
+from __future__ import annotations
+
+import json
+
+from ansible.utils.display import Display
+
+Display().deprecated(f'{__name__!r} is deprecated.', version='2.23', help_text='Call `json.dumps` directly instead.')
+
+
+def jsonify(result, format=False):
+    """Format JSON output."""
+    if result is None:
+        return "{}"
+
+    indent = None
+    if format:
+        indent = 4
+
+    try:
+        return json.dumps(result, sort_keys=True, indent=indent, ensure_ascii=False)
+    except UnicodeDecodeError:
+        return json.dumps(result, sort_keys=True, indent=indent)
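The restored shim above deprecates itself in favor of calling `json.dumps` directly. Per the shim's own guidance, the equivalent standalone replacement is:

```python
import json


def jsonify(result, format=False):
    """Format JSON output, matching the deprecated helper's behavior for plain data."""
    if result is None:
        return "{}"
    # sort_keys gives stable output; ensure_ascii=False keeps non-ASCII text readable
    return json.dumps(result, sort_keys=True, indent=4 if format else None, ensure_ascii=False)
```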

@@ -8,25 +8,36 @@ from ansible.module_utils._internal import _datatag
 from ansible.module_utils.common.text import converters as _converters
 from ansible.parsing import vault as _vault
+_UNSET = _t.cast(_t.Any, object())
 class _AnsibleMapping(dict):
     """Backwards compatibility type."""
-    def __new__(cls, value):
-        return _datatag.AnsibleTagHelper.tag_copy(value, dict(value))
+    def __new__(cls, value=_UNSET, /, **kwargs):
+        if value is _UNSET:
+            return dict(**kwargs)
+        return _datatag.AnsibleTagHelper.tag_copy(value, dict(value, **kwargs))
 class _AnsibleUnicode(str):
     """Backwards compatibility type."""
-    def __new__(cls, value):
-        return _datatag.AnsibleTagHelper.tag_copy(value, str(value))
+    def __new__(cls, object=_UNSET, **kwargs):
+        if object is _UNSET:
+            return str(**kwargs)
+        return _datatag.AnsibleTagHelper.tag_copy(object, str(object, **kwargs))
 class _AnsibleSequence(list):
     """Backwards compatibility type."""
-    def __new__(cls, value):
+    def __new__(cls, value=_UNSET, /):
+        if value is _UNSET:
+            return list()
         return _datatag.AnsibleTagHelper.tag_copy(value, list(value))
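The `_UNSET` sentinels above let these compat types accept the same zero-argument and keyword construction forms as the builtins they subclass, instead of failing when called without a positional value. A minimal sketch of the pattern with a hypothetical `CompatMapping` (the tag-copying step is omitted here):

```python
_UNSET = object()  # sentinel distinguishing "no argument" from any real value


class CompatMapping(dict):
    """dict subclass whose __new__ tolerates the no-argument and keyword forms dict supports."""

    def __new__(cls, value=_UNSET, /, **kwargs):
        if value is _UNSET:
            # nothing to copy metadata from; a plain dict suffices
            return dict(**kwargs)
        # copy the source plus any keyword overrides (real code would also copy tags)
        return dict(value, **kwargs)
```

Note that, like the original, `__new__` deliberately returns a plain builtin rather than an instance of the subclass.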

@@ -21,34 +21,42 @@ import os
 from ansible import constants as C
 from ansible.errors import AnsibleError
+from ansible.executor.task_result import _RawTaskResult
+from ansible.inventory.host import Host
 from ansible.module_utils.common.text.converters import to_text
+from ansible.parsing.dataloader import DataLoader
 from ansible.playbook.handler import Handler
 from ansible.playbook.task_include import TaskInclude
 from ansible.playbook.role_include import IncludeRole
 from ansible._internal._templating._engine import TemplateEngine
 from ansible.utils.display import Display
+from ansible.vars.manager import VariableManager
 display = Display()
 class IncludedFile:
-    def __init__(self, filename, args, vars, task, is_role=False):
+    def __init__(self, filename, args, vars, task, is_role: bool = False) -> None:
         self._filename = filename
         self._args = args
         self._vars = vars
         self._task = task
-        self._hosts = []
+        self._hosts: list[Host] = []
         self._is_role = is_role
-        self._results = []
+        self._results: list[_RawTaskResult] = []
-    def add_host(self, host):
+    def add_host(self, host: Host) -> None:
         if host not in self._hosts:
             self._hosts.append(host)
             return
         raise ValueError()
     def __eq__(self, other):
+        if not isinstance(other, IncludedFile):
+            return False
         return (other._filename == self._filename and
                 other._args == self._args and
                 other._vars == self._vars and
@@ -59,23 +67,28 @@ class IncludedFile:
         return "%s (args=%s vars=%s): %s" % (self._filename, self._args, self._vars, self._hosts)
     @staticmethod
-    def process_include_results(results, iterator, loader, variable_manager):
-        included_files = []
-        task_vars_cache = {}
+    def process_include_results(
+        results: list[_RawTaskResult],
+        iterator,
+        loader: DataLoader,
+        variable_manager: VariableManager,
+    ) -> list[IncludedFile]:
+        included_files: list[IncludedFile] = []
+        task_vars_cache: dict[tuple, dict] = {}
         for res in results:
-            original_host = res._host
-            original_task = res._task
+            original_host = res.host
+            original_task = res.task
             if original_task.action in C._ACTION_ALL_INCLUDES:
                 if original_task.loop:
-                    if 'results' not in res._result:
+                    if 'results' not in res._return_data:
                         continue
-                    include_results = res._result['results']
+                    include_results = res._loop_results
                 else:
-                    include_results = [res._result]
+                    include_results = [res._return_data]
                 for include_result in include_results:
                     # if the task result was skipped or failed, continue

@@ -227,8 +227,6 @@ class Task(Base, Conditional, Taggable, CollectionSearch, Notifiable, Delegatabl
                     raise AnsibleError("you must specify a value when using %s" % k, obj=ds)
                 new_ds['loop_with'] = loop_name
                 new_ds['loop'] = v
-                # display.deprecated("with_ type loops are being phased out, use the 'loop' keyword instead",
-                #                    version="2.10", collection_name='ansible.builtin')
     def preprocess_data(self, ds):
         """

@@ -20,24 +20,26 @@
 from __future__ import annotations
 import abc
+import functools
 import types
 import typing as t
 from ansible import constants as C
 from ansible.errors import AnsibleError
 from ansible.utils.display import Display
-from ansible.module_utils._internal import _plugin_exec_context
+from ansible.utils import display as _display
+from ansible.module_utils._internal import _plugin_info
 display = Display()
 if t.TYPE_CHECKING:
-    from .loader import PluginPathContext
+    from . import loader as _t_loader
 # Global so that all instances of a PluginLoader will share the caches
 MODULE_CACHE = {}  # type: dict[str, dict[str, types.ModuleType]]
-PATH_CACHE = {}  # type: dict[str, list[PluginPathContext] | None]
-PLUGIN_PATH_CACHE = {}  # type: dict[str, dict[str, dict[str, PluginPathContext]]]
+PATH_CACHE = {}  # type: dict[str, list[_t_loader.PluginPathContext] | None]
+PLUGIN_PATH_CACHE = {}  # type: dict[str, dict[str, dict[str, _t_loader.PluginPathContext]]]
 def get_plugin_class(obj):
@@ -50,10 +52,10 @@ def get_plugin_class(obj):
 class _ConfigurablePlugin(t.Protocol):
     """Protocol to provide type-safe access to config for plugin-related mixins."""
-    def get_option(self, option: str, hostvars: dict[str, object] | None = None) -> object: ...
+    def get_option(self, option: str, hostvars: dict[str, object] | None = None) -> t.Any: ...
-class _AnsiblePluginInfoMixin(_plugin_exec_context.HasPluginInfo):
+class _AnsiblePluginInfoMixin(_plugin_info.HasPluginInfo):
     """Mixin to provide type annotations and default values for existing PluginLoader-set load-time attrs."""
     _original_path: str | None = None
     _load_name: str | None = None
@@ -102,6 +104,14 @@ class AnsiblePlugin(_AnsiblePluginInfoMixin, _ConfigurablePlugin, metaclass=abc.
                 raise KeyError(str(e))
         return option_value, origin
+    @functools.cached_property
+    def __plugin_info(self):
+        """
+        Internal cached property to retrieve `PluginInfo` for this plugin instance.
+        Only for use by the `AnsiblePlugin` base class.
+        """
+        return _plugin_info.get_plugin_info(self)
     def get_option(self, option, hostvars=None):
         if option not in self._options:
@@ -117,7 +127,7 @@ class AnsiblePlugin(_AnsiblePluginInfoMixin, _ConfigurablePlugin, metaclass=abc.
     def set_option(self, option, value):
         self._options[option] = C.config.get_config_value(option, plugin_type=self.plugin_type, plugin_name=self._load_name, direct={option: value})
-        C.handle_config_noise(display)
+        _display._report_config_warnings(self.__plugin_info)
     def set_options(self, task_keys=None, var_options=None, direct=None):
         """
@@ -134,7 +144,7 @@ class AnsiblePlugin(_AnsiblePluginInfoMixin, _ConfigurablePlugin, metaclass=abc.
         if self.allow_extras and var_options and '_extras' in var_options:
             # these are largely unvalidated passthroughs, either plugin or underlying API will validate
             self._options['_extras'] = var_options['_extras']
-        C.handle_config_noise(display)
+        _display._report_config_warnings(self.__plugin_info)
     def has_option(self, option):
         if not self._options:

@@ -318,13 +318,6 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
         final_environment: dict[str, t.Any] = {}
         self._compute_environment_string(final_environment)
-        # `modify_module` adapts PluginInfo to allow target-side use of `PluginExecContext` since modules aren't plugins
-        plugin = PluginInfo(
-            requested_name=module_name,
-            resolved_name=result.resolved_fqcn,
-            type='module',
-        )
         # modify_module will exit early if interpreter discovery is required; re-run after if necessary
         for _dummy in (1, 2):
             try:
@@ -338,7 +331,6 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
                     async_timeout=self._task.async_val,
                     environment=final_environment,
                     remote_is_local=bool(getattr(self._connection, '_remote_is_local', False)),
-                    plugin=plugin,
                     become_plugin=self._connection.become,
                 )
@@ -649,12 +641,12 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
        # done.  Make the files +x if we're asked to, and return.
        if not self._is_become_unprivileged():
            if execute:
-               # Can't depend on the file being transferred with execute permissions.
+               # Can't depend on the file being transferred with required permissions.
                # Only need user perms because no become was used here
-               res = self._remote_chmod(remote_paths, 'u+x')
+               res = self._remote_chmod(remote_paths, 'u+rwx')
                if res['rc'] != 0:
                    raise AnsibleError(
-                       'Failed to set execute bit on remote files '
+                       'Failed to set permissions on remote files '
                        '(rc: {0}, err: {1})'.format(
                            res['rc'],
                            to_native(res['stderr'])))
@@ -695,10 +687,10 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
            return remote_paths
        # Step 3b: Set execute if we need to. We do this before anything else
-       # because some of the methods below might work but not let us set +x
-       # as part of them.
+       # because some of the methods below might work but not let us set
+       # permissions as part of them.
        if execute:
-           res = self._remote_chmod(remote_paths, 'u+x')
+           res = self._remote_chmod(remote_paths, 'u+rwx')
            if res['rc'] != 0:
                raise AnsibleError(
                    'Failed to set file mode or acl on remote temporary files '
@@ -1010,7 +1002,7 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
        # tells the module to ignore options that are not in its argspec.
        module_args['_ansible_ignore_unknown_opts'] = ignore_unknown_opts
-       # allow user to insert string to add context to remote loggging
+       # allow user to insert string to add context to remote logging
        module_args['_ansible_target_log_info'] = C.config.get_config_value('TARGET_LOG_INFO', variables=task_vars)
        module_args['_ansible_tracebacks_for'] = _traceback.traceback_for()

@@ -28,10 +28,8 @@ class ActionModule(ActionBase):
             # TODO: remove in favor of controller side argspec detecting valid arguments
             # network facts modules must support gather_subset
-            try:
-                name = self._connection.ansible_name.removeprefix('ansible.netcommon.')
-            except AttributeError:
-                name = self._connection._load_name.split('.')[-1]
+            name = self._connection.ansible_name.removeprefix('ansible.netcommon.')
             if name not in ('network_cli', 'httpapi', 'netconf'):
                 subset = mod_args.pop('gather_subset', None)
                 if subset not in ('all', ['all'], None):
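The simplified branch above relies on `str.removeprefix` (Python 3.9+), which returns the string unchanged when the prefix is absent, so the old `try`/`except AttributeError` fallback is no longer needed. A standalone illustration of the call:

```python
def connection_short_name(ansible_name: str) -> str:
    """Strip the netcommon collection prefix, if present; other names pass through unchanged."""
    return ansible_name.removeprefix('ansible.netcommon.')
```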

@@ -132,6 +132,9 @@ class ActionModule(ActionBase):
             data_templar = self._templar.copy_with_new_env(searchpath=searchpath, available_variables=temp_vars)
             resultant = data_templar.template(template_data, escape_backslashes=False, overrides=overrides)
+            if resultant is None:
+                resultant = ''
             new_task = self._task.copy()
             # mode is either the mode from task.args or the mode of the source file if the task.args
             # mode == 'preserve'

@ -24,15 +24,14 @@ import re
import sys import sys
import textwrap import textwrap
import typing as t import typing as t
import collections.abc as _c
from typing import TYPE_CHECKING from typing import TYPE_CHECKING
from collections.abc import MutableMapping
from copy import deepcopy from copy import deepcopy
from ansible import constants as C from ansible import constants as C
from ansible.module_utils._internal import _datatag from ansible.module_utils._internal import _datatag
from ansible.module_utils.common.messages import ErrorSummary
from ansible._internal._yaml import _dumper from ansible._internal._yaml import _dumper
from ansible.plugins import AnsiblePlugin from ansible.plugins import AnsiblePlugin
from ansible.utils.color import stringc from ansible.utils.color import stringc
@ -44,7 +43,7 @@ from ansible._internal._templating import _engine
import yaml import yaml
if TYPE_CHECKING: if TYPE_CHECKING:
from ansible.executor.task_result import TaskResult from ansible.executor.task_result import CallbackTaskResult
global_display = Display() global_display = Display()
@ -59,6 +58,19 @@ _YAML_BREAK_CHARS = '\n\x85\u2028\u2029' # NL, NEL, LS, PS
_SPACE_BREAK_RE = re.compile(fr' +([{_YAML_BREAK_CHARS}])') _SPACE_BREAK_RE = re.compile(fr' +([{_YAML_BREAK_CHARS}])')
_T_callable = t.TypeVar("_T_callable", bound=t.Callable)
def _callback_base_impl(wrapped: _T_callable) -> _T_callable:
"""
Decorator for the no-op methods on the `CallbackBase` base class.
Used to avoid unnecessary dispatch overhead to no-op base callback methods.
"""
wrapped._base_impl = True
return wrapped
class _AnsibleCallbackDumper(_dumper.AnsibleDumper): class _AnsibleCallbackDumper(_dumper.AnsibleDumper):
def __init__(self, *args, lossy: bool = False, **kwargs): def __init__(self, *args, lossy: bool = False, **kwargs):
super().__init__(*args, **kwargs) super().__init__(*args, **kwargs)
@ -87,6 +99,8 @@ class _AnsibleCallbackDumper(_dumper.AnsibleDumper):
def _register_representers(cls) -> None: def _register_representers(cls) -> None:
super()._register_representers() super()._register_representers()
# exact type checks occur first against representers, then subclasses against multi-representers
cls.add_representer(str, cls._pretty_represent_str)
cls.add_multi_representer(str, cls._pretty_represent_str) cls.add_multi_representer(str, cls._pretty_represent_str)
@ -140,12 +154,17 @@ class CallbackBase(AnsiblePlugin):
custom actions. custom actions.
""" """
def __init__(self, display=None, options=None): def __init__(self, display: Display | None = None, options: dict[str, t.Any] | None = None) -> None:
super().__init__()
if display: if display:
self._display = display self._display = display
else: else:
self._display = global_display self._display = global_display
# FUTURE: fix double-loading of non-collection stdout callback plugins that don't set CALLBACK_NEEDS_ENABLED
# FUTURE: this code is jacked for 2.x- it should just use the type names and always assume 2.0+ for normal cases
if self._display.verbosity >= 4: if self._display.verbosity >= 4:
name = getattr(self, 'CALLBACK_NAME', 'unnamed') name = getattr(self, 'CALLBACK_NAME', 'unnamed')
ctype = getattr(self, 'CALLBACK_TYPE', 'old') ctype = getattr(self, 'CALLBACK_TYPE', 'old')
@ -155,7 +174,8 @@ class CallbackBase(AnsiblePlugin):
self.disabled = False self.disabled = False
self.wants_implicit_tasks = False self.wants_implicit_tasks = False
self._plugin_options = {} self._plugin_options: dict[str, t.Any] = {}
if options is not None: if options is not None:
self.set_options(options) self.set_options(options)
@ -164,6 +184,8 @@ class CallbackBase(AnsiblePlugin):
'ansible_loop_var', 'ansible_index_var', 'ansible_loop', 'ansible_loop_var', 'ansible_index_var', 'ansible_loop',
) )
self._current_task_result: CallbackTaskResult | None = None
     # helper for callbacks, so they don't all have to include deepcopy
     _copy_result = deepcopy

@@ -185,25 +207,30 @@ class CallbackBase(AnsiblePlugin):
         self._plugin_options = C.config.get_plugin_options(self.plugin_type, self._load_name, keys=task_keys, variables=var_options, direct=direct)
 
     @staticmethod
-    def host_label(result):
-        """Return label for the hostname (& delegated hostname) of a task
-        result.
-        """
-        label = "%s" % result._host.get_name()
-        if result._task.delegate_to and result._task.delegate_to != result._host.get_name():
+    def host_label(result: CallbackTaskResult) -> str:
+        """Return label for the hostname (& delegated hostname) of a task result."""
+        label = result.host.get_name()
+        if result.task.delegate_to and result.task.delegate_to != result.host.get_name():
             # show delegated host
-            label += " -> %s" % result._task.delegate_to
+            label += " -> %s" % result.task.delegate_to
             # in case we have 'extra resolution'
-            ahost = result._result.get('_ansible_delegated_vars', {}).get('ansible_host', result._task.delegate_to)
-            if result._task.delegate_to != ahost:
+            ahost = result.result.get('_ansible_delegated_vars', {}).get('ansible_host', result.task.delegate_to)
+            if result.task.delegate_to != ahost:
                 label += "(%s)" % ahost
         return label
 
-    def _run_is_verbose(self, result, verbosity=0):
-        return ((self._display.verbosity > verbosity or result._result.get('_ansible_verbose_always', False) is True)
-                and result._result.get('_ansible_verbose_override', False) is False)
+    def _run_is_verbose(self, result: CallbackTaskResult, verbosity: int = 0) -> bool:
+        return ((self._display.verbosity > verbosity or result.result.get('_ansible_verbose_always', False) is True)
+                and result.result.get('_ansible_verbose_override', False) is False)
 
-    def _dump_results(self, result, indent=None, sort_keys=True, keep_invocation=False, serialize=True):
+    def _dump_results(
+        self,
+        result: _c.Mapping[str, t.Any],
+        indent: int | None = None,
+        sort_keys: bool = True,
+        keep_invocation: bool = False,
+        serialize: bool = True,
+    ) -> str:
         try:
             result_format = self.get_option('result_format')
         except KeyError:
@@ -253,10 +280,12 @@ class CallbackBase(AnsiblePlugin):
             # that want to further modify the result, or use custom serialization
             return abridged_result
 
+        # DTFIX-RELEASE: Switch to stock json/yaml serializers here? We should always have a transformed plain-types result.
         if result_format == 'json':
             return json.dumps(abridged_result, cls=_fallback_to_str.Encoder, indent=indent, ensure_ascii=False, sort_keys=sort_keys)
-        elif result_format == 'yaml':
+
+        if result_format == 'yaml':
             # None is a sentinel in this case that indicates default behavior
             # default behavior for yaml is to prettify results
             lossy = pretty_results in (None, True)
@@ -281,22 +310,28 @@ class CallbackBase(AnsiblePlugin):
                 ' ' * (indent or 4)
             )
 
-    def _handle_warnings(self, res: dict[str, t.Any]) -> None:
-        """Display warnings and deprecation warnings sourced by task execution."""
-        for warning in res.pop('warnings', []):
-            # DTFIX-RELEASE: what to do about propagating wrap_text from the original display.warning call?
-            self._display._warning(warning, wrap_text=False)
-
-        for warning in res.pop('deprecations', []):
-            self._display._deprecated(warning)
+        # DTFIX-RELEASE: add test to exercise this case
+        raise ValueError(f'Unsupported result_format {result_format!r}.')
 
-    def _handle_exception(self, result: dict[str, t.Any], use_stderr: bool = False) -> None:
-        error_summary: ErrorSummary | None
-
-        if error_summary := result.pop('exception', None):
-            self._display._error(error_summary, wrap_text=False, stderr=use_stderr)
+    def _handle_warnings(self, res: _c.MutableMapping[str, t.Any]) -> None:
+        """Display warnings and deprecation warnings sourced by task execution."""
+        if res.pop('warnings', None) and self._current_task_result and (warnings := self._current_task_result.warnings):
+            # display warnings from the current task result if `warnings` was not removed from `result` (or made falsey)
+            for warning in warnings:
+                # DTFIX-RELEASE: what to do about propagating wrap_text from the original display.warning call?
+                self._display._warning(warning, wrap_text=False)
+
+        if res.pop('deprecations', None) and self._current_task_result and (deprecations := self._current_task_result.deprecations):
+            # display deprecations from the current task result if `deprecations` was not removed from `result` (or made falsey)
+            for deprecation in deprecations:
+                self._display._deprecated(deprecation)
+
+    def _handle_exception(self, result: _c.MutableMapping[str, t.Any], use_stderr: bool = False) -> None:
+        if result.pop('exception', None) and self._current_task_result and (exception := self._current_task_result.exception):
+            # display exception from the current task result if `exception` was not removed from `result` (or made falsey)
+            self._display._error(exception, wrap_text=False, stderr=use_stderr)
 
-    def _handle_warnings_and_exception(self, result: TaskResult) -> None:
+    def _handle_warnings_and_exception(self, result: CallbackTaskResult) -> None:
         """Standardized handling of warnings/deprecations and exceptions from a task/item result."""
         # DTFIX-RELEASE: make/doc/porting-guide a public version of this method?
         try:
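The reworked `_handle_warnings` above now uses the legacy dict key only as a sentinel: popping `warnings` tells it whether a downstream callback suppressed the key, while the data actually displayed comes from the structured current-task-result object. A minimal sketch of that "pop the legacy key, read the side channel" pattern, with an illustrative class rather than Ansible's real API:

```python
# Sketch of the pattern in _handle_warnings()/_handle_exception() above:
# the raw result dict gates display, the structured side channel supplies content.
class Dispatcher:
    def __init__(self, current_warnings: list[str]):
        self.current_warnings = current_warnings  # structured side channel (stand-in)
        self.displayed: list[str] = []

    def handle_warnings(self, res: dict) -> None:
        # pop() returns a truthy legacy value only if no callback removed/falsified it
        if res.pop('warnings', None) and (warnings := self.current_warnings):
            for warning in warnings:
                self.displayed.append(warning)


d = Dispatcher(current_warnings=['structured warning'])
d.handle_warnings({'warnings': ['legacy text']})
print(d.displayed)  # ['structured warning'] -- structured data wins over the dict entry

d2 = Dispatcher(current_warnings=['structured warning'])
d2.handle_warnings({})  # key removed upstream -> nothing displayed
print(d2.displayed)  # []
```

This preserves the long-standing callback contract (deleting `warnings` from the result suppresses display) while the displayed objects keep their rich types.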
@@ -304,8 +339,8 @@ class CallbackBase(AnsiblePlugin):
         except KeyError:
             use_stderr = False
 
-        self._handle_warnings(result._result)
-        self._handle_exception(result._result, use_stderr=use_stderr)
+        self._handle_warnings(result.result)
+        self._handle_exception(result.result, use_stderr=use_stderr)
 
     def _serialize_diff(self, diff):
         try:
@@ -322,7 +357,8 @@ class CallbackBase(AnsiblePlugin):
         if result_format == 'json':
             return json.dumps(diff, sort_keys=True, indent=4, separators=(u',', u': ')) + u'\n'
-        elif result_format == 'yaml':
+
+        if result_format == 'yaml':
             # None is a sentinel in this case that indicates default behavior
             # default behavior for yaml is to prettify results
             lossy = pretty_results in (None, True)
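Both serializers in the diff move from an `if/elif` chain (which silently returned `None` for an unknown format) to sequential `if` blocks with an explicit trailing `raise`. A sketch of that dispatch shape; `serialize` and its YAML branch are illustrative placeholders, not Ansible's real implementation:

```python
# Dispatch shape the diff adopts: independent `if` returns plus an explicit
# `raise` for unsupported formats, so bad config fails loudly instead of
# producing None.
import json


def serialize(data: dict, result_format: str) -> str:
    if result_format == 'json':
        return json.dumps(data, sort_keys=True, indent=4)

    if result_format == 'yaml':
        # placeholder; the real code uses Ansible's YAML dumper with prettify options
        return '\n'.join(f'{k}: {v}' for k, v in sorted(data.items()))

    raise ValueError(f'Unsupported result_format {result_format!r}.')


print(serialize({'changed': True}, 'json'))
```

Each supported branch returns, so control only reaches the `raise` when every format check failed.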
@@ -338,6 +374,9 @@ class CallbackBase(AnsiblePlugin):
                 ' '
             )
 
+        # DTFIX-RELEASE: add test to exercise this case
+        raise ValueError(f'Unsupported result_format {result_format!r}.')
+
     def _get_diff(self, difflist):
         if not isinstance(difflist, list):
@@ -356,7 +395,7 @@ class CallbackBase(AnsiblePlugin):
             if 'before' in diff and 'after' in diff:
                 # format complex structures into 'files'
                 for x in ['before', 'after']:
-                    if isinstance(diff[x], MutableMapping):
+                    if isinstance(diff[x], _c.Mapping):
                         diff[x] = self._serialize_diff(diff[x])
                     elif diff[x] is None:
                         diff[x] = ''
@@ -398,7 +437,7 @@ class CallbackBase(AnsiblePlugin):
                 ret.append(diff['prepared'])
         return u''.join(ret)
 
-    def _get_item_label(self, result):
+    def _get_item_label(self, result: _c.Mapping[str, t.Any]) -> t.Any:
         """ retrieves the value to be displayed as a label for an item entry from a result object"""
         if result.get('_ansible_no_log', False):
             item = "(censored due to no_log)"
@@ -406,9 +445,9 @@ class CallbackBase(AnsiblePlugin):
             item = result.get('_ansible_item_label', result.get('item'))
         return item
 
-    def _process_items(self, result):
+    def _process_items(self, result: CallbackTaskResult) -> None:
         # just remove them as now they get handled by individual callbacks
-        del result._result['results']
+        del result.result['results']
 
     def _clean_results(self, result, task_name):
         """ removes data from results for display """
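`_get_item_label` above applies a simple precedence: `no_log` censoring first, then an explicit `_ansible_item_label`, then the raw `item` value. A self-contained sketch of that lookup:

```python
# Sketch of the label precedence in _get_item_label() above.
def get_item_label(result: dict):
    if result.get('_ansible_no_log', False):
        # censoring always wins so sensitive loop items never reach the display
        return "(censored due to no_log)"
    return result.get('_ansible_item_label', result.get('item'))


print(get_item_label({'item': 'pkg-a'}))                                    # pkg-a
print(get_item_label({'item': 'pkg-a', '_ansible_item_label': 'package A'}))  # package A
print(get_item_label({'item': 'secret', '_ansible_no_log': True}))          # (censored due to no_log)
```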
@@ -434,74 +473,97 @@ class CallbackBase(AnsiblePlugin):
     def set_play_context(self, play_context):
         pass
 
+    @_callback_base_impl
     def on_any(self, *args, **kwargs):
         pass
 
+    @_callback_base_impl
     def runner_on_failed(self, host, res, ignore_errors=False):
         pass
 
+    @_callback_base_impl
     def runner_on_ok(self, host, res):
         pass
 
+    @_callback_base_impl
     def runner_on_skipped(self, host, item=None):
         pass
 
+    @_callback_base_impl
     def runner_on_unreachable(self, host, res):
         pass
 
+    @_callback_base_impl
     def runner_on_no_hosts(self):
         pass
 
+    @_callback_base_impl
     def runner_on_async_poll(self, host, res, jid, clock):
         pass
 
+    @_callback_base_impl
     def runner_on_async_ok(self, host, res, jid):
         pass
 
+    @_callback_base_impl
     def runner_on_async_failed(self, host, res, jid):
         pass
 
+    @_callback_base_impl
     def playbook_on_start(self):
         pass
 
+    @_callback_base_impl
     def playbook_on_notify(self, host, handler):
         pass
 
+    @_callback_base_impl
     def playbook_on_no_hosts_matched(self):
         pass
 
+    @_callback_base_impl
     def playbook_on_no_hosts_remaining(self):
         pass
 
+    @_callback_base_impl
     def playbook_on_task_start(self, name, is_conditional):
         pass
 
+    @_callback_base_impl
     def playbook_on_vars_prompt(self, varname, private=True, prompt=None, encrypt=None, confirm=False, salt_size=None, salt=None, default=None, unsafe=None):
         pass
 
+    @_callback_base_impl
     def playbook_on_setup(self):
         pass
 
+    @_callback_base_impl
     def playbook_on_import_for_host(self, host, imported_file):
         pass
 
+    @_callback_base_impl
     def playbook_on_not_import_for_host(self, host, missing_file):
         pass
 
+    @_callback_base_impl
     def playbook_on_play_start(self, name):
         pass
 
+    @_callback_base_impl
     def playbook_on_stats(self, stats):
         pass
 
+    @_callback_base_impl
     def on_file_diff(self, host, diff):
         pass
 
     # V2 METHODS, by default they call v1 counterparts if possible
 
+    @_callback_base_impl
     def v2_on_any(self, *args, **kwargs):
         self.on_any(args, kwargs)
 
-    def v2_runner_on_failed(self, result: TaskResult, ignore_errors: bool = False) -> None:
+    @_callback_base_impl
+    def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
         """Process results of a failed task.
 
         Note: The value of 'ignore_errors' tells Ansible whether to
@@ -512,7 +574,7 @@ class CallbackBase(AnsiblePlugin):
         issues (for example, missing packages), or syntax errors.
 
         :param result: The parameters of the task and its results.
-        :type result: TaskResult
+        :type result: CallbackTaskResult
         :param ignore_errors: Whether Ansible should continue \
             running tasks on the host where the task failed.
         :type ignore_errors: bool
@@ -520,147 +582,172 @@ class CallbackBase(AnsiblePlugin):
         :return: None
         :rtype: None
         """
-        host = result._host.get_name()
-        self.runner_on_failed(host, result._result, ignore_errors)
+        host = result.host.get_name()
+        self.runner_on_failed(host, result.result, ignore_errors)
 
-    def v2_runner_on_ok(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
         """Process results of a successful task.
 
         :param result: The parameters of the task and its results.
-        :type result: TaskResult
+        :type result: CallbackTaskResult
 
         :return: None
         :rtype: None
         """
-        host = result._host.get_name()
-        self.runner_on_ok(host, result._result)
+        host = result.host.get_name()
+        self.runner_on_ok(host, result.result)
 
-    def v2_runner_on_skipped(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
         """Process results of a skipped task.
 
         :param result: The parameters of the task and its results.
-        :type result: TaskResult
+        :type result: CallbackTaskResult
 
         :return: None
         :rtype: None
         """
         if C.DISPLAY_SKIPPED_HOSTS:
-            host = result._host.get_name()
-            self.runner_on_skipped(host, self._get_item_label(getattr(result._result, 'results', {})))
+            host = result.host.get_name()
+            self.runner_on_skipped(host, self._get_item_label(getattr(result.result, 'results', {})))
 
-    def v2_runner_on_unreachable(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
         """Process results of a task if a target node is unreachable.
 
         :param result: The parameters of the task and its results.
-        :type result: TaskResult
+        :type result: CallbackTaskResult
 
         :return: None
         :rtype: None
         """
-        host = result._host.get_name()
-        self.runner_on_unreachable(host, result._result)
+        host = result.host.get_name()
+        self.runner_on_unreachable(host, result.result)
 
-    def v2_runner_on_async_poll(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_on_async_poll(self, result: CallbackTaskResult) -> None:
         """Get details about an unfinished task running in async mode.
 
         Note: The value of the `poll` keyword in the task determines
        the interval at which polling occurs and this method is run.
 
         :param result: The parameters of the task and its status.
-        :type result: TaskResult
+        :type result: CallbackTaskResult
 
         :rtype: None
         :rtype: None
         """
-        host = result._host.get_name()
-        jid = result._result.get('ansible_job_id')
+        host = result.host.get_name()
+        jid = result.result.get('ansible_job_id')
         # FIXME, get real clock
         clock = 0
-        self.runner_on_async_poll(host, result._result, jid, clock)
+        self.runner_on_async_poll(host, result.result, jid, clock)
 
-    def v2_runner_on_async_ok(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_on_async_ok(self, result: CallbackTaskResult) -> None:
         """Process results of a successful task that ran in async mode.
 
         :param result: The parameters of the task and its results.
-        :type result: TaskResult
+        :type result: CallbackTaskResult
 
         :return: None
         :rtype: None
         """
-        host = result._host.get_name()
-        jid = result._result.get('ansible_job_id')
-        self.runner_on_async_ok(host, result._result, jid)
+        host = result.host.get_name()
+        jid = result.result.get('ansible_job_id')
+        self.runner_on_async_ok(host, result.result, jid)
 
-    def v2_runner_on_async_failed(self, result):
-        host = result._host.get_name()
+    @_callback_base_impl
+    def v2_runner_on_async_failed(self, result: CallbackTaskResult) -> None:
+        host = result.host.get_name()
 
         # Attempt to get the async job ID. If the job does not finish before the
         # async timeout value, the ID may be within the unparsed 'async_result' dict.
-        jid = result._result.get('ansible_job_id')
-        if not jid and 'async_result' in result._result:
-            jid = result._result['async_result'].get('ansible_job_id')
-        self.runner_on_async_failed(host, result._result, jid)
+        jid = result.result.get('ansible_job_id')
+        if not jid and 'async_result' in result.result:
+            jid = result.result['async_result'].get('ansible_job_id')
+        self.runner_on_async_failed(host, result.result, jid)
 
+    @_callback_base_impl
     def v2_playbook_on_start(self, playbook):
         self.playbook_on_start()
 
+    @_callback_base_impl
     def v2_playbook_on_notify(self, handler, host):
         self.playbook_on_notify(host, handler)
 
+    @_callback_base_impl
     def v2_playbook_on_no_hosts_matched(self):
         self.playbook_on_no_hosts_matched()
 
+    @_callback_base_impl
     def v2_playbook_on_no_hosts_remaining(self):
         self.playbook_on_no_hosts_remaining()
 
+    @_callback_base_impl
     def v2_playbook_on_task_start(self, task, is_conditional):
         self.playbook_on_task_start(task.name, is_conditional)
 
     # FIXME: not called
+    @_callback_base_impl
     def v2_playbook_on_cleanup_task_start(self, task):
         pass  # no v1 correspondence
 
+    @_callback_base_impl
     def v2_playbook_on_handler_task_start(self, task):
         pass  # no v1 correspondence
 
+    @_callback_base_impl
     def v2_playbook_on_vars_prompt(self, varname, private=True, prompt=None, encrypt=None, confirm=False, salt_size=None, salt=None, default=None, unsafe=None):
         self.playbook_on_vars_prompt(varname, private, prompt, encrypt, confirm, salt_size, salt, default, unsafe)
 
     # FIXME: not called
-    def v2_playbook_on_import_for_host(self, result, imported_file):
-        host = result._host.get_name()
+    @_callback_base_impl
+    def v2_playbook_on_import_for_host(self, result: CallbackTaskResult, imported_file) -> None:
+        host = result.host.get_name()
         self.playbook_on_import_for_host(host, imported_file)
 
     # FIXME: not called
-    def v2_playbook_on_not_import_for_host(self, result, missing_file):
-        host = result._host.get_name()
+    @_callback_base_impl
+    def v2_playbook_on_not_import_for_host(self, result: CallbackTaskResult, missing_file) -> None:
+        host = result.host.get_name()
         self.playbook_on_not_import_for_host(host, missing_file)
 
+    @_callback_base_impl
     def v2_playbook_on_play_start(self, play):
         self.playbook_on_play_start(play.name)
 
+    @_callback_base_impl
     def v2_playbook_on_stats(self, stats):
         self.playbook_on_stats(stats)
 
-    def v2_on_file_diff(self, result):
-        if 'diff' in result._result:
-            host = result._host.get_name()
-            self.on_file_diff(host, result._result['diff'])
+    @_callback_base_impl
+    def v2_on_file_diff(self, result: CallbackTaskResult) -> None:
+        if 'diff' in result.result:
+            host = result.host.get_name()
+            self.on_file_diff(host, result.result['diff'])
 
+    @_callback_base_impl
     def v2_playbook_on_include(self, included_file):
         pass  # no v1 correspondence
 
-    def v2_runner_item_on_ok(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_item_on_ok(self, result: CallbackTaskResult) -> None:
         pass
 
-    def v2_runner_item_on_failed(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_item_on_failed(self, result: CallbackTaskResult) -> None:
         pass
 
-    def v2_runner_item_on_skipped(self, result: TaskResult) -> None:
+    @_callback_base_impl
+    def v2_runner_item_on_skipped(self, result: CallbackTaskResult) -> None:
         pass
 
-    def v2_runner_retry(self, result):
+    @_callback_base_impl
+    def v2_runner_retry(self, result: CallbackTaskResult) -> None:
         pass
 
+    @_callback_base_impl
     def v2_runner_on_start(self, host, task):
         """Event used when host begins execution of a task
@@ -21,7 +21,7 @@ DOCUMENTATION = """
 from ansible import constants as C
 from ansible import context
-from ansible.executor.task_result import TaskResult
+from ansible.executor.task_result import CallbackTaskResult
 from ansible.playbook.task_include import TaskInclude
 from ansible.plugins.callback import CallbackBase
 from ansible.utils.color import colorize, hostcolor
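The import swap above tracks the new public callback API: callbacks receive a result object exposing `host`, `task`, and `result` accessors instead of reaching into private `_host`/`_task`/`_result` fields. A hypothetical facade with that shape (illustrative only, not Ansible's actual `CallbackTaskResult`):

```python
# Sketch of a read-only facade over private fields, mirroring the
# _host/_task/_result -> host/task/result rename seen throughout the diff.
class TaskResultFacade:
    def __init__(self, host, task, result: dict):
        self._host = host
        self._task = task
        self._result = result

    @property
    def host(self):
        return self._host

    @property
    def task(self):
        return self._task

    @property
    def result(self) -> dict:
        return self._result


r = TaskResultFacade('web1', 'ping', {'changed': False})
print(r.host, r.task, r.result)
```

Exposing properties rather than the raw attributes lets the implementation change what backs each accessor without breaking third-party callbacks again.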
@@ -47,39 +47,39 @@ class CallbackModule(CallbackBase):
         self._task_type_cache = {}
         super(CallbackModule, self).__init__()
 
-    def v2_runner_on_failed(self, result: TaskResult, ignore_errors: bool = False) -> None:
+    def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
         host_label = self.host_label(result)
 
-        if self._last_task_banner != result._task._uuid:
-            self._print_task_banner(result._task)
+        if self._last_task_banner != result.task._uuid:
+            self._print_task_banner(result.task)
 
         self._handle_warnings_and_exception(result)
 
         # FIXME: this method should not exist, delegate "suggested keys to display" to the plugin or something... As-is, the placement of this
         # call obliterates `results`, which causes a task summary to be printed on loop failures, which we don't do anywhere else.
-        self._clean_results(result._result, result._task.action)
+        self._clean_results(result.result, result.task.action)
 
-        if result._task.loop and 'results' in result._result:
+        if result.task.loop and 'results' in result.result:
             self._process_items(result)
         else:
             if self._display.verbosity < 2 and self.get_option('show_task_path_on_failure'):
-                self._print_task_path(result._task)
-            msg = "fatal: [%s]: FAILED! => %s" % (host_label, self._dump_results(result._result))
+                self._print_task_path(result.task)
+            msg = "fatal: [%s]: FAILED! => %s" % (host_label, self._dump_results(result.result))
             self._display.display(msg, color=C.COLOR_ERROR, stderr=self.get_option('display_failed_stderr'))
 
         if ignore_errors:
             self._display.display("...ignoring", color=C.COLOR_SKIP)
 
-    def v2_runner_on_ok(self, result: TaskResult) -> None:
+    def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
         host_label = self.host_label(result)
 
-        if isinstance(result._task, TaskInclude):
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+        if isinstance(result.task, TaskInclude):
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
             return
-        elif result._result.get('changed', False):
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+        elif result.result.get('changed', False):
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
             msg = "changed: [%s]" % (host_label,)
             color = C.COLOR_CHANGED
@@ -87,52 +87,52 @@ class CallbackModule(CallbackBase):
             if not self.get_option('display_ok_hosts'):
                 return
 
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
 
             msg = "ok: [%s]" % (host_label,)
             color = C.COLOR_OK
 
         self._handle_warnings_and_exception(result)
 
-        if result._task.loop and 'results' in result._result:
+        if result.task.loop and 'results' in result.result:
             self._process_items(result)
         else:
-            self._clean_results(result._result, result._task.action)
+            self._clean_results(result.result, result.task.action)
 
             if self._run_is_verbose(result):
-                msg += " => %s" % (self._dump_results(result._result),)
+                msg += " => %s" % (self._dump_results(result.result),)
             self._display.display(msg, color=color)
 
-    def v2_runner_on_skipped(self, result: TaskResult) -> None:
+    def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
         if self.get_option('display_skipped_hosts'):
-            self._clean_results(result._result, result._task.action)
+            self._clean_results(result.result, result.task.action)
 
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
 
             self._handle_warnings_and_exception(result)
 
-            if result._task.loop is not None and 'results' in result._result:
+            if result.task.loop is not None and 'results' in result.result:
                 self._process_items(result)
 
-            msg = "skipping: [%s]" % result._host.get_name()
+            msg = "skipping: [%s]" % result.host.get_name()
             if self._run_is_verbose(result):
-                msg += " => %s" % self._dump_results(result._result)
+                msg += " => %s" % self._dump_results(result.result)
             self._display.display(msg, color=C.COLOR_SKIP)
 
-    def v2_runner_on_unreachable(self, result: TaskResult) -> None:
-        if self._last_task_banner != result._task._uuid:
-            self._print_task_banner(result._task)
+    def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
+        if self._last_task_banner != result.task._uuid:
+            self._print_task_banner(result.task)
 
         self._handle_warnings_and_exception(result)
 
         host_label = self.host_label(result)
-        msg = "fatal: [%s]: UNREACHABLE! => %s" % (host_label, self._dump_results(result._result))
+        msg = "fatal: [%s]: UNREACHABLE! => %s" % (host_label, self._dump_results(result.result))
         self._display.display(msg, color=C.COLOR_UNREACHABLE, stderr=self.get_option('display_failed_stderr'))
 
-        if result._task.ignore_unreachable:
+        if result.task.ignore_unreachable:
             self._display.display("...ignoring", color=C.COLOR_SKIP)
 
     def v2_playbook_on_no_hosts_matched(self):
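The handlers above repeatedly gate result dumping on `_run_is_verbose`, whose logic appears earlier in the diff: dump when display verbosity exceeds a threshold or the result forces it, unless the result explicitly overrides. A standalone sketch of that predicate (the free function here is illustrative; the real code is a method reading `self._display.verbosity`):

```python
# Sketch of the verbosity gate used by the default callback's handlers.
def run_is_verbose(display_verbosity: int, result: dict, verbosity: int = 0) -> bool:
    return ((display_verbosity > verbosity or result.get('_ansible_verbose_always', False) is True)
            and result.get('_ansible_verbose_override', False) is False)


print(run_is_verbose(0, {}))                                   # False: below threshold
print(run_is_verbose(1, {}))                                   # True: -v exceeds default threshold
print(run_is_verbose(0, {'_ansible_verbose_always': True}))    # True: result forces display
print(run_is_verbose(3, {'_ansible_verbose_override': True}))  # False: override wins
```

The override check is ANDed after the force/threshold check, so `_ansible_verbose_override` suppresses the dump even at high verbosity.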
@@ -222,29 +222,29 @@ class CallbackModule(CallbackBase):
         self._display.banner(msg)
 
-    def v2_on_file_diff(self, result):
-        if result._task.loop and 'results' in result._result:
-            for res in result._result['results']:
+    def v2_on_file_diff(self, result: CallbackTaskResult) -> None:
+        if result.task.loop and 'results' in result.result:
+            for res in result.result['results']:
                 if 'diff' in res and res['diff'] and res.get('changed', False):
                     diff = self._get_diff(res['diff'])
                     if diff:
-                        if self._last_task_banner != result._task._uuid:
-                            self._print_task_banner(result._task)
+                        if self._last_task_banner != result.task._uuid:
+                            self._print_task_banner(result.task)
                         self._display.display(diff)
-        elif 'diff' in result._result and result._result['diff'] and result._result.get('changed', False):
-            diff = self._get_diff(result._result['diff'])
+        elif 'diff' in result.result and result.result['diff'] and result.result.get('changed', False):
+            diff = self._get_diff(result.result['diff'])
             if diff:
-                if self._last_task_banner != result._task._uuid:
-                    self._print_task_banner(result._task)
+                if self._last_task_banner != result.task._uuid:
+                    self._print_task_banner(result.task)
                 self._display.display(diff)
 
-    def v2_runner_item_on_ok(self, result: TaskResult) -> None:
+    def v2_runner_item_on_ok(self, result: CallbackTaskResult) -> None:
         host_label = self.host_label(result)
-        if isinstance(result._task, TaskInclude):
+        if isinstance(result.task, TaskInclude):
             return
-        elif result._result.get('changed', False):
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+        elif result.result.get('changed', False):
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
             msg = 'changed'
             color = C.COLOR_CHANGED
@@ -252,47 +252,47 @@ class CallbackModule(CallbackBase):
             if not self.get_option('display_ok_hosts'):
                 return
 
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
 
             msg = 'ok'
             color = C.COLOR_OK
 
         self._handle_warnings_and_exception(result)
 
-        msg = "%s: [%s] => (item=%s)" % (msg, host_label, self._get_item_label(result._result))
-        self._clean_results(result._result, result._task.action)
+        msg = "%s: [%s] => (item=%s)" % (msg, host_label, self._get_item_label(result.result))
+        self._clean_results(result.result, result.task.action)
         if self._run_is_verbose(result):
-            msg += " => %s" % self._dump_results(result._result)
+            msg += " => %s" % self._dump_results(result.result)
         self._display.display(msg, color=color)
 
-    def v2_runner_item_on_failed(self, result: TaskResult) -> None:
-        if self._last_task_banner != result._task._uuid:
-            self._print_task_banner(result._task)
+    def v2_runner_item_on_failed(self, result: CallbackTaskResult) -> None:
+        if self._last_task_banner != result.task._uuid:
+            self._print_task_banner(result.task)
 
         self._handle_warnings_and_exception(result)
 
         host_label = self.host_label(result)
         msg = "failed: [%s]" % (host_label,)
-        self._clean_results(result._result, result._task.action)
+        self._clean_results(result.result, result.task.action)
         self._display.display(
-            msg + " (item=%s) => %s" % (self._get_item_label(result._result), self._dump_results(result._result)),
+            msg + " (item=%s) => %s" % (self._get_item_label(result.result), self._dump_results(result.result)),
             color=C.COLOR_ERROR,
             stderr=self.get_option('display_failed_stderr')
         )
 
-    def v2_runner_item_on_skipped(self, result: TaskResult) -> None:
+    def v2_runner_item_on_skipped(self, result: CallbackTaskResult) -> None:
         if self.get_option('display_skipped_hosts'):
-            if self._last_task_banner != result._task._uuid:
-                self._print_task_banner(result._task)
+            if self._last_task_banner != result.task._uuid:
+                self._print_task_banner(result.task)
 
             self._handle_warnings_and_exception(result)
 
-            self._clean_results(result._result, result._task.action)
-            msg = "skipping: [%s] => (item=%s) " % (result._host.get_name(), self._get_item_label(result._result))
+            self._clean_results(result.result, result.task.action)
+            msg = "skipping: [%s] => (item=%s) " % (result.host.get_name(), self._get_item_label(result.result))
             if self._run_is_verbose(result):
-                msg += " => %s" % self._dump_results(result._result)
+                msg += " => %s" % self._dump_results(result.result)
             self._display.display(msg, color=C.COLOR_SKIP)
 
     def v2_playbook_on_include(self, included_file):
@ -377,37 +377,37 @@ class CallbackModule(CallbackBase):
if context.CLIARGS['check'] and self.get_option('check_mode_markers'): if context.CLIARGS['check'] and self.get_option('check_mode_markers'):
self._display.banner("DRY RUN") self._display.banner("DRY RUN")
def v2_runner_retry(self, result): def v2_runner_retry(self, result: CallbackTaskResult) -> None:
task_name = result.task_name or result._task task_name = result.task_name or result.task
host_label = self.host_label(result) host_label = self.host_label(result)
msg = "FAILED - RETRYING: [%s]: %s (%d retries left)." % (host_label, task_name, result._result['retries'] - result._result['attempts']) msg = "FAILED - RETRYING: [%s]: %s (%d retries left)." % (host_label, task_name, result.result['retries'] - result.result['attempts'])
if self._run_is_verbose(result, verbosity=2): if self._run_is_verbose(result, verbosity=2):
msg += "Result was: %s" % self._dump_results(result._result) msg += "Result was: %s" % self._dump_results(result.result)
self._display.display(msg, color=C.COLOR_DEBUG) self._display.display(msg, color=C.COLOR_DEBUG)
def v2_runner_on_async_poll(self, result): def v2_runner_on_async_poll(self, result: CallbackTaskResult) -> None:
host = result._host.get_name() host = result.host.get_name()
jid = result._result.get('ansible_job_id') jid = result.result.get('ansible_job_id')
started = result._result.get('started') started = result.result.get('started')
finished = result._result.get('finished') finished = result.result.get('finished')
self._display.display( self._display.display(
'ASYNC POLL on %s: jid=%s started=%s finished=%s' % (host, jid, started, finished), 'ASYNC POLL on %s: jid=%s started=%s finished=%s' % (host, jid, started, finished),
color=C.COLOR_DEBUG color=C.COLOR_DEBUG
) )
def v2_runner_on_async_ok(self, result): def v2_runner_on_async_ok(self, result: CallbackTaskResult) -> None:
host = result._host.get_name() host = result.host.get_name()
jid = result._result.get('ansible_job_id') jid = result.result.get('ansible_job_id')
self._display.display("ASYNC OK on %s: jid=%s" % (host, jid), color=C.COLOR_DEBUG) self._display.display("ASYNC OK on %s: jid=%s" % (host, jid), color=C.COLOR_DEBUG)
def v2_runner_on_async_failed(self, result): def v2_runner_on_async_failed(self, result: CallbackTaskResult) -> None:
host = result._host.get_name() host = result.host.get_name()
# Attempt to get the async job ID. If the job does not finish before the # Attempt to get the async job ID. If the job does not finish before the
# async timeout value, the ID may be within the unparsed 'async_result' dict. # async timeout value, the ID may be within the unparsed 'async_result' dict.
jid = result._result.get('ansible_job_id') jid = result.result.get('ansible_job_id')
if not jid and 'async_result' in result._result: if not jid and 'async_result' in result.result:
jid = result._result['async_result'].get('ansible_job_id') jid = result.result['async_result'].get('ansible_job_id')
self._display.display("ASYNC FAILED on %s: jid=%s" % (host, jid), color=C.COLOR_DEBUG) self._display.display("ASYNC FAILED on %s: jid=%s" % (host, jid), color=C.COLOR_DEBUG)
def v2_playbook_on_notify(self, handler, host): def v2_playbook_on_notify(self, handler, host):

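The hunks above migrate the callbacks from the private `result._task` / `result._host` / `result._result` attributes to the public `result.task` / `result.host` / `result.result` accessors on `CallbackTaskResult`. A rename like this is commonly bridged with forwarding properties so old call sites keep working during the transition; the sketch below illustrates that pattern with a hypothetical stand-in class, not ansible-core's actual implementation:

```python
import warnings


class TaskResultShim:
    """Sketch: public attributes with deprecated underscore-prefixed aliases."""

    def __init__(self, task, host, result):
        self.task = task
        self.host = host
        self.result = result

    @property
    def _task(self):
        warnings.warn("use .task instead of ._task", DeprecationWarning, stacklevel=2)
        return self.task

    @property
    def _host(self):
        warnings.warn("use .host instead of ._host", DeprecationWarning, stacklevel=2)
        return self.host

    @property
    def _result(self):
        warnings.warn("use .result instead of ._result", DeprecationWarning, stacklevel=2)
        return self.result
```

Old-style callbacks reading `result._result` would keep functioning while emitting a `DeprecationWarning` pointing at the new name.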
@@ -86,12 +86,14 @@ import decimal
 import os
 import time
 import re
+import typing as t

 from ansible import constants
-from ansible.module_utils.common.messages import ErrorSummary
 from ansible.module_utils.common.text.converters import to_bytes, to_text
 from ansible.playbook.task import Task
 from ansible.plugins.callback import CallbackBase
+from ansible.executor.task_result import CallbackTaskResult
+from ansible.playbook.included_file import IncludedFile
 from ansible.utils._junit_xml import (
     TestCase,
     TestError,

@@ -184,23 +186,23 @@ class CallbackModule(CallbackBase):
         self._task_data[uuid] = TaskData(uuid, name, path, play, action)

-    def _finish_task(self, status, result):
+    def _finish_task(self, status: str, result: IncludedFile | CallbackTaskResult) -> None:
         """ record the results of a task for a single host """

-        task_uuid = result._task._uuid
-
-        if hasattr(result, '_host'):
-            host_uuid = result._host._uuid
-            host_name = result._host.name
+        if isinstance(result, CallbackTaskResult):
+            task_uuid = result.task._uuid
+            host_uuid = result.host._uuid
+            host_name = result.host.name
+
+            if self._fail_on_change == 'true' and status == 'ok' and result.result.get('changed', False):
+                status = 'failed'
         else:
+            task_uuid = result._task._uuid
             host_uuid = 'include'
             host_name = 'include'

         task_data = self._task_data[task_uuid]

-        if self._fail_on_change == 'true' and status == 'ok' and result._result.get('changed', False):
-            status = 'failed'
-
         # ignore failure if expected and toggle result if asked for
         if status == 'failed' and 'EXPECTED FAILURE' in task_data.name:
             status = 'ok'

@@ -233,7 +235,8 @@ class CallbackModule(CallbackBase):
         if host_data.status == 'included':
             return TestCase(name=name, classname=junit_classname, time=duration, system_out=str(host_data.result))

-        res = host_data.result._result
+        task_result = t.cast(CallbackTaskResult, host_data.result)
+        res = task_result.result
         rc = res.get('rc', 0)
         dump = self._dump_results(res, indent=0)
         dump = self._cleanse_string(dump)

@@ -243,10 +246,8 @@ class CallbackModule(CallbackBase):
         test_case = TestCase(name=name, classname=junit_classname, time=duration)

-        error_summary: ErrorSummary
-
         if host_data.status == 'failed':
-            if error_summary := res.get('exception'):
+            if error_summary := task_result.exception:
                 message = error_summary._format()
                 output = error_summary.formatted_traceback
                 test_case.errors.append(TestError(message=message, output=output))

@@ -309,19 +310,19 @@ class CallbackModule(CallbackBase):
     def v2_playbook_on_handler_task_start(self, task: Task) -> None:
         self._start_task(task)

-    def v2_runner_on_failed(self, result, ignore_errors=False):
+    def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors=False) -> None:
         if ignore_errors and self._fail_on_ignore != 'true':
             self._finish_task('ok', result)
         else:
             self._finish_task('failed', result)

-    def v2_runner_on_ok(self, result):
+    def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
         self._finish_task('ok', result)

-    def v2_runner_on_skipped(self, result):
+    def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
         self._finish_task('skipped', result)

-    def v2_playbook_on_include(self, included_file):
+    def v2_playbook_on_include(self, included_file: IncludedFile) -> None:
         self._finish_task('included', included_file)

     def v2_playbook_on_stats(self, stats):

@@ -347,7 +348,7 @@ class TaskData:
         if host.uuid in self.host_data:
             if host.status == 'included':
                 # concatenate task include output from multiple items
-                host.result = '%s\n%s' % (self.host_data[host.uuid].result, host.result)
+                host.result = f'{self.host_data[host.uuid].result}\n{host.result}'
             else:
                 raise Exception('%s: %s: %s: duplicate host callback: %s' % (self.path, self.play, self.name, host.name))

@@ -359,7 +360,7 @@ class HostData:
     Data about an individual host.
     """

-    def __init__(self, uuid, name, status, result):
+    def __init__(self, uuid: str, name: str, status: str, result: IncludedFile | CallbackTaskResult | str) -> None:
         self.uuid = uuid
         self.name = name
         self.status = status

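The `_finish_task` rewrite above replaces `hasattr(result, '_host')` duck-typing with an `isinstance` check against the declared `IncludedFile | CallbackTaskResult` union, which lets type checkers narrow each branch. A minimal self-contained sketch of that dispatch style, using hypothetical stand-in classes rather than ansible's:

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class TaskOutcome:
    """Stand-in for a per-host task result (hypothetical, not ansible's class)."""
    task_uuid: str
    host_name: str
    changed: bool


@dataclass
class IncludeMarker:
    """Stand-in for an included-file event, which carries no host."""
    task_uuid: str


def finish_task(status: str, result: TaskOutcome | IncludeMarker) -> tuple[str, str]:
    # isinstance narrowing makes each branch's attribute access type-safe,
    # unlike hasattr() duck-typing, which hides typos from static analysis.
    if isinstance(result, TaskOutcome):
        host_name = result.host_name
        if status == 'ok' and result.changed:
            status = 'changed'
    else:
        host_name = 'include'
    return status, host_name
```

With the union spelled out in the signature, a checker such as mypy can flag an access to `result.host_name` outside the narrowed branch.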
@@ -15,7 +15,7 @@ DOCUMENTATION = """
       - result_format_callback
 """

-from ansible.executor.task_result import TaskResult
+from ansible.executor.task_result import CallbackTaskResult
 from ansible.plugins.callback import CallbackBase
 from ansible import constants as C

@@ -41,41 +41,41 @@ class CallbackModule(CallbackBase):
         return buf + "\n"

-    def v2_runner_on_failed(self, result: TaskResult, ignore_errors: bool = False) -> None:
+    def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
         self._handle_warnings_and_exception(result)

-        if result._task.action in C.MODULE_NO_JSON and 'module_stderr' not in result._result:
-            self._display.display(self._command_generic_msg(result._host.get_name(), result._result, "FAILED"), color=C.COLOR_ERROR)
+        if result.task.action in C.MODULE_NO_JSON and 'module_stderr' not in result.result:
+            self._display.display(self._command_generic_msg(result.host.get_name(), result.result, "FAILED"), color=C.COLOR_ERROR)
         else:
-            self._display.display("%s | FAILED! => %s" % (result._host.get_name(), self._dump_results(result._result, indent=4)), color=C.COLOR_ERROR)
+            self._display.display("%s | FAILED! => %s" % (result.host.get_name(), self._dump_results(result.result, indent=4)), color=C.COLOR_ERROR)

-    def v2_runner_on_ok(self, result: TaskResult) -> None:
+    def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
         self._handle_warnings_and_exception(result)

-        self._clean_results(result._result, result._task.action)
+        self._clean_results(result.result, result.task.action)

-        if result._result.get('changed', False):
+        if result.result.get('changed', False):
             color = C.COLOR_CHANGED
             state = 'CHANGED'
         else:
             color = C.COLOR_OK
             state = 'SUCCESS'

-        if result._task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result._result:
-            self._display.display(self._command_generic_msg(result._host.get_name(), result._result, state), color=color)
+        if result.task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result.result:
+            self._display.display(self._command_generic_msg(result.host.get_name(), result.result, state), color=color)
         else:
-            self._display.display("%s | %s => %s" % (result._host.get_name(), state, self._dump_results(result._result, indent=4)), color=color)
+            self._display.display("%s | %s => %s" % (result.host.get_name(), state, self._dump_results(result.result, indent=4)), color=color)

-    def v2_runner_on_skipped(self, result: TaskResult) -> None:
+    def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
         self._handle_warnings_and_exception(result)

-        self._display.display("%s | SKIPPED" % (result._host.get_name()), color=C.COLOR_SKIP)
+        self._display.display("%s | SKIPPED" % (result.host.get_name()), color=C.COLOR_SKIP)

-    def v2_runner_on_unreachable(self, result: TaskResult) -> None:
+    def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
         self._handle_warnings_and_exception(result)

-        self._display.display("%s | UNREACHABLE! => %s" % (result._host.get_name(), self._dump_results(result._result, indent=4)), color=C.COLOR_UNREACHABLE)
+        self._display.display("%s | UNREACHABLE! => %s" % (result.host.get_name(), self._dump_results(result.result, indent=4)), color=C.COLOR_UNREACHABLE)

     def v2_on_file_diff(self, result):
-        if 'diff' in result._result and result._result['diff']:
-            self._display.display(self._get_diff(result._result['diff']))
+        if 'diff' in result.result and result.result['diff']:
+            self._display.display(self._get_diff(result.result['diff']))

@@ -16,6 +16,8 @@ DOCUMENTATION = """
 from ansible import constants as C
 from ansible.plugins.callback import CallbackBase
 from ansible.template import Templar
+from ansible.executor.task_result import CallbackTaskResult
+from ansible.module_utils._internal import _deprecator


 class CallbackModule(CallbackBase):

@@ -31,7 +33,12 @@ class CallbackModule(CallbackBase):
     def __init__(self, *args, **kwargs):
         super().__init__(*args, **kwargs)
-        self._display.deprecated('The oneline callback plugin is deprecated.', version='2.23')
+        self._display.deprecated(  # pylint: disable=ansible-deprecated-unnecessary-collection-name
+            msg='The oneline callback plugin is deprecated.',
+            version='2.23',
+            deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,  # entire plugin being removed; this improves the messaging
+        )

     def _command_generic_msg(self, hostname, result, caption):
         stdout = result.get('stdout', '').replace('\n', '\\n').replace('\r', '\\r')

@@ -41,9 +48,9 @@ class CallbackModule(CallbackBase):
         else:
             return "%s | %s | rc=%s | (stdout) %s" % (hostname, caption, result.get('rc', -1), stdout)

-    def v2_runner_on_failed(self, result, ignore_errors=False):
-        if 'exception' in result._result:
-            error_text = Templar().template(result._result['exception'])  # transform to a string
+    def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
+        if 'exception' in result.result:
+            error_text = Templar().template(result.result['exception'])  # transform to a string
             if self._display.verbosity < 3:
                 # extract just the actual error message from the exception text
                 error = error_text.strip().split('\n')[-1]

@@ -51,31 +58,31 @@ class CallbackModule(CallbackBase):
             else:
                 msg = "An exception occurred during task execution. The full traceback is:\n" + error_text.replace('\n', '')

-        if result._task.action in C.MODULE_NO_JSON and 'module_stderr' not in result._result:
-            self._display.display(self._command_generic_msg(result._host.get_name(), result._result, 'FAILED'), color=C.COLOR_ERROR)
+        if result.task.action in C.MODULE_NO_JSON and 'module_stderr' not in result.result:
+            self._display.display(self._command_generic_msg(result.host.get_name(), result.result, 'FAILED'), color=C.COLOR_ERROR)
         else:
             self._display.display(msg, color=C.COLOR_ERROR)

-        self._display.display("%s | FAILED! => %s" % (result._host.get_name(), self._dump_results(result._result, indent=0).replace('\n', '')),
+        self._display.display("%s | FAILED! => %s" % (result.host.get_name(), self._dump_results(result.result, indent=0).replace('\n', '')),
                               color=C.COLOR_ERROR)

-    def v2_runner_on_ok(self, result):
-        if result._result.get('changed', False):
+    def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
+        if result.result.get('changed', False):
             color = C.COLOR_CHANGED
             state = 'CHANGED'
         else:
             color = C.COLOR_OK
             state = 'SUCCESS'

-        if result._task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result._result:
-            self._display.display(self._command_generic_msg(result._host.get_name(), result._result, state), color=color)
+        if result.task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result.result:
+            self._display.display(self._command_generic_msg(result.host.get_name(), result.result, state), color=color)
         else:
-            self._display.display("%s | %s => %s" % (result._host.get_name(), state, self._dump_results(result._result, indent=0).replace('\n', '')),
+            self._display.display("%s | %s => %s" % (result.host.get_name(), state, self._dump_results(result.result, indent=0).replace('\n', '')),
                                   color=color)

-    def v2_runner_on_unreachable(self, result):
-        self._display.display("%s | UNREACHABLE!: %s" % (result._host.get_name(), result._result.get('msg', '')), color=C.COLOR_UNREACHABLE)
+    def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
+        self._display.display("%s | UNREACHABLE!: %s" % (result.host.get_name(), result.result.get('msg', '')), color=C.COLOR_UNREACHABLE)

-    def v2_runner_on_skipped(self, result):
-        self._display.display("%s | SKIPPED" % (result._host.get_name()), color=C.COLOR_SKIP)
+    def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
+        self._display.display("%s | SKIPPED" % (result.host.get_name()), color=C.COLOR_SKIP)

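The deprecation-call changes in these hunks move from a positional message string to keyword arguments (`msg=`, `version=`, `deprecator=`), which keeps the call readable as parameters are added. The sketch below is a hypothetical stand-in for `Display.deprecated`, showing only the call shape, not ansible-core's actual behavior:

```python
def deprecated(*, msg, version, deprecator=None):
    """Illustrative stand-in: format a deprecation notice from keyword-only args.

    Keyword-only parameters (note the bare *) force call sites to spell out
    msg=/version=/deprecator=, mirroring the style adopted in the diff above.
    """
    text = f"{msg} This feature will be removed in version {version}."
    if deprecator is not None:
        # Attribute the deprecation to whoever is removing the feature.
        text += f" [{deprecator}]"
    return text
```

A call site then reads like the diff: `deprecated(msg='The oneline callback plugin is deprecated.', version='2.23', deprecator='ansible-core')`.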
@@ -30,9 +30,11 @@ DOCUMENTATION = """
 import os

 from ansible.constants import TREE_DIR
+from ansible.executor.task_result import CallbackTaskResult
 from ansible.module_utils.common.text.converters import to_bytes, to_text
 from ansible.plugins.callback import CallbackBase
 from ansible.utils.path import makedirs_safe, unfrackpath
+from ansible.module_utils._internal import _deprecator


 class CallbackModule(CallbackBase):

@@ -47,7 +49,12 @@ class CallbackModule(CallbackBase):
     def __init__(self, *args, **kwargs):
         super().__init__(*args, **kwargs)
-        self._display.deprecated('The tree callback plugin is deprecated.', version='2.23')
+        self._display.deprecated(  # pylint: disable=ansible-deprecated-unnecessary-collection-name
+            msg='The tree callback plugin is deprecated.',
+            version='2.23',
+            deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,  # entire plugin being removed; this improves the messaging
+        )

     def set_options(self, task_keys=None, var_options=None, direct=None):
         """ override to set self.tree """

@@ -76,14 +83,14 @@ class CallbackModule(CallbackBase):
         except (OSError, IOError) as e:
             self._display.warning(u"Unable to write to %s's file: %s" % (hostname, to_text(e)))

-    def result_to_tree(self, result):
-        self.write_tree_file(result._host.get_name(), self._dump_results(result._result))
+    def result_to_tree(self, result: CallbackTaskResult) -> None:
+        self.write_tree_file(result.host.get_name(), self._dump_results(result.result))

-    def v2_runner_on_ok(self, result):
+    def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
         self.result_to_tree(result)

-    def v2_runner_on_failed(self, result, ignore_errors=False):
+    def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
         self.result_to_tree(result)

-    def v2_runner_on_unreachable(self, result):
+    def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
         self.result_to_tree(result)

@@ -252,7 +252,7 @@ class Connection(ConnectionBase):
     def _become_success_timeout(self) -> int:
         """Timeout value for become success in seconds."""
         if (timeout := self.get_option('become_success_timeout')) < 1:
-            timeout = C.config.get_configuration_definitions('connection', 'local')['become_success_timeout']['default']
+            timeout = C.config.get_config_default('become_success_timeout', plugin_type='connection', plugin_name='local')

         return timeout

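The local-connection hunk swaps manual digging through the configuration-definitions dict for a dedicated default-lookup accessor. The sketch below shows why such an accessor is preferable, using a hypothetical definitions table and function names modeled on (but not taken verbatim from) the diff:

```python
# Hypothetical mirror of the nested structure the old code indexed into by hand.
CONFIG_DEFINITIONS = {
    ('connection', 'local'): {
        'become_success_timeout': {'default': 10, 'type': 'int'},
    },
}


def get_config_default(name, *, plugin_type, plugin_name):
    """Sketch of a dedicated accessor replacing ['name']['default'] dict digging."""
    return CONFIG_DEFINITIONS[(plugin_type, plugin_name)][name]['default']


def become_success_timeout(configured):
    # Values below 1 second are invalid; fall back to the declared default,
    # as the new code does via get_config_default().
    if configured < 1:
        return get_config_default('become_success_timeout', plugin_type='connection', plugin_name='local')
    return configured
```

Centralizing the lookup means a later change to the definitions schema touches one function instead of every call site.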
@@ -248,11 +248,13 @@ from ansible.errors import (
     AnsibleError,
     AnsibleFileNotFound,
 )
+from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
 from ansible.module_utils.compat.paramiko import _PARAMIKO_IMPORT_ERR as PARAMIKO_IMPORT_ERR, _paramiko as paramiko
 from ansible.plugins.connection import ConnectionBase
 from ansible.utils.display import Display
 from ansible.utils.path import makedirs_safe
-from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
+from ansible.module_utils._internal import _deprecator

 display = Display()

@@ -327,7 +329,12 @@ class Connection(ConnectionBase):
     _log_channel: str | None = None

     def __init__(self, *args, **kwargs):
-        display.deprecated('The paramiko connection plugin is deprecated.', version='2.21')
+        display.deprecated(  # pylint: disable=ansible-deprecated-unnecessary-collection-name
+            msg='The paramiko connection plugin is deprecated.',
+            version='2.21',
+            deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,  # entire plugin being removed; this improves the messaging
+        )
         super().__init__(*args, **kwargs)

     def _cache_key(self) -> str:

@@ -29,7 +29,7 @@ attributes:
         platforms: all
     until:
         description: Denotes if this action obeys until/retry/poll keywords
-        support: full
+        support: none
     tags:
         description: Allows for the 'tags' keyword to control the selection of this action for execution
         support: full

@@ -26,7 +26,7 @@ from jinja2.filters import do_map, do_select, do_selectattr, do_reject, do_rejec
 from jinja2.environment import Environment

 from ansible._internal._templating import _lazy_containers
-from ansible.errors import AnsibleFilterError, AnsibleTypeError
+from ansible.errors import AnsibleFilterError, AnsibleTypeError, AnsibleTemplatePluginError
 from ansible.module_utils.datatag import native_type_name
 from ansible.module_utils.common.json import get_encoder, get_decoder
 from ansible.module_utils.six import string_types, integer_types, text_type

@@ -115,7 +115,10 @@ def to_bool(value: object) -> bool:
         result = value_to_check == 1  # backwards compatibility with the old code which checked: value in ('yes', 'on', '1', 'true', 1)

         # NB: update the doc string to reflect reality once this fallback is removed
-        display.deprecated(f'The `bool` filter coerced invalid value {value!r} ({native_type_name(value)}) to {result!r}.', version='2.23')
+        display.deprecated(
+            msg=f'The `bool` filter coerced invalid value {value!r} ({native_type_name(value)}) to {result!r}.',
+            version='2.23',
+        )

     return result

@@ -405,6 +408,13 @@ def comment(text, style='plain', **kw):
         }
     }

+    if style not in comment_styles:
+        raise AnsibleTemplatePluginError(
+            message=f"Invalid style {style!r}.",
+            help_text=f"Available styles: {', '.join(comment_styles)}",
+            obj=style,
+        )
+
     # Pointer to the right comment type
     style_params = comment_styles[style]

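The `comment` filter hunk above validates the requested style against the known styles up front and raises an error carrying help text that lists the valid choices, instead of letting a later dict lookup fail with a bare `KeyError`. A self-contained sketch of that validation pattern, with hypothetical names standing in for ansible's classes:

```python
# Trimmed, hypothetical version of a style table like the filter's comment_styles.
COMMENT_STYLES = {
    'plain': {'decoration': '# '},
    'c': {'decoration': '* ', 'beginning': '/*', 'end': '*/'},
}


class TemplatePluginError(ValueError):
    """Stand-in for AnsibleTemplatePluginError: a message plus actionable help text."""

    def __init__(self, message, help_text):
        super().__init__(f"{message} {help_text}")
        self.help_text = help_text


def get_style_params(style):
    # Reject unknown styles early with the list of valid choices, rather than
    # letting COMMENT_STYLES[style] raise an uninformative KeyError.
    if style not in COMMENT_STYLES:
        raise TemplatePluginError(
            message=f"Invalid style {style!r}.",
            help_text=f"Available styles: {', '.join(COMMENT_STYLES)}",
        )
    return COMMENT_STYLES[style]
```

The help text turns a stack trace into a self-correcting error message for playbook authors.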
@@ -28,7 +28,7 @@ from collections.abc import Mapping
 from ansible import template as _template
 from ansible.errors import AnsibleError, AnsibleParserError, AnsibleValueOmittedError
 from ansible.inventory.group import to_safe_group_name as original_safe
-from ansible.module_utils._internal import _plugin_exec_context
+from ansible.module_utils._internal import _plugin_info
 from ansible.parsing.utils.addresses import parse_address
 from ansible.parsing.dataloader import DataLoader
 from ansible.plugins import AnsiblePlugin, _ConfigurablePlugin

@@ -314,7 +314,7 @@ class BaseFileInventoryPlugin(_BaseInventoryPlugin):
         super(BaseFileInventoryPlugin, self).__init__()


-class Cacheable(_plugin_exec_context.HasPluginInfo, _ConfigurablePlugin):
+class Cacheable(_plugin_info.HasPluginInfo, _ConfigurablePlugin):
     """Mixin for inventory plugins which support caching."""

     _cache: CachePluginAdjudicator

@ -29,7 +29,7 @@ from ansible.module_utils.common.text.converters import to_bytes, to_text, to_na
from ansible.module_utils.six import string_types from ansible.module_utils.six import string_types
from ansible.parsing.yaml.loader import AnsibleLoader from ansible.parsing.yaml.loader import AnsibleLoader
from ansible._internal._yaml._loader import AnsibleInstrumentedLoader from ansible._internal._yaml._loader import AnsibleInstrumentedLoader
from ansible.plugins import get_plugin_class, MODULE_CACHE, PATH_CACHE, PLUGIN_PATH_CACHE from ansible.plugins import get_plugin_class, MODULE_CACHE, PATH_CACHE, PLUGIN_PATH_CACHE, AnsibleJinja2Plugin
from ansible.utils.collection_loader import AnsibleCollectionConfig, AnsibleCollectionRef from ansible.utils.collection_loader import AnsibleCollectionConfig, AnsibleCollectionRef
from ansible.utils.collection_loader._collection_finder import _AnsibleCollectionFinder, _get_collection_metadata from ansible.utils.collection_loader._collection_finder import _AnsibleCollectionFinder, _get_collection_metadata
from ansible.utils.display import Display from ansible.utils.display import Display
@ -135,29 +135,44 @@ class PluginPathContext(object):
class PluginLoadContext(object): class PluginLoadContext(object):
def __init__(self): def __init__(self, plugin_type: str, legacy_package_name: str) -> None:
self.original_name = None self.original_name: str | None = None
self.redirect_list = [] self.redirect_list: list[str] = []
self.error_list = [] self.raw_error_list: list[Exception] = []
self.import_error_list = [] """All exception instances encountered during the plugin load."""
self.load_attempts = [] self.error_list: list[str] = []
self.pending_redirect = None """Stringified exceptions, excluding import errors."""
self.exit_reason = None self.import_error_list: list[Exception] = []
self.plugin_resolved_path = None """All ImportError exception instances encountered during the plugin load."""
self.plugin_resolved_name = None self.load_attempts: list[str] = []
self.plugin_resolved_collection = None # empty string for resolved plugins from user-supplied paths self.pending_redirect: str | None = None
self.deprecated = False self.exit_reason: str | None = None
self.removal_date = None self.plugin_resolved_path: str | None = None
self.removal_version = None self.plugin_resolved_name: str | None = None
self.deprecation_warnings = [] """For collection plugins, the resolved Python module FQ __name__; for non-collections, the short name."""
self.resolved = False self.plugin_resolved_collection: str | None = None # empty string for resolved plugins from user-supplied paths
self._resolved_fqcn = None """For collection plugins, the resolved collection {ns}.{col}; empty string for non-collection plugins."""
self.action_plugin = None self.deprecated: bool = False
self.removal_date: str | None = None
self.removal_version: str | None = None
self.deprecation_warnings: list[str] = []
self.resolved: bool = False
self._resolved_fqcn: str | None = None
self.action_plugin: str | None = None
self._plugin_type: str = plugin_type
"""The type of the plugin."""
self._legacy_package_name = legacy_package_name
"""The legacy sys.modules package name from the plugin loader instance; stored to prevent potentially incorrect manual computation."""
self._python_module_name: str | None = None
"""
The fully qualified Python module name for the plugin (accessible via `sys.modules`).
For non-collection non-core plugins, this may include a non-existent synthetic package element with a hash of the file path to avoid collisions.
"""
     @property
-    def resolved_fqcn(self):
+    def resolved_fqcn(self) -> str | None:
         if not self.resolved:
-            return
+            return None

         if not self._resolved_fqcn:
             final_plugin = self.redirect_list[-1]
@@ -169,7 +184,7 @@ class PluginLoadContext(object):
         return self._resolved_fqcn

-    def record_deprecation(self, name, deprecation, collection_name):
+    def record_deprecation(self, name: str, deprecation: dict[str, t.Any] | None, collection_name: str) -> t.Self:
         if not deprecation:
             return self
@@ -183,7 +198,12 @@ class PluginLoadContext(object):
            removal_version = None

        warning_text = '{0} has been deprecated.{1}{2}'.format(name, ' ' if warning_text else '', warning_text)
-        display.deprecated(warning_text, date=removal_date, version=removal_version, collection_name=collection_name)
+        display.deprecated(  # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
+            msg=warning_text,
+            date=removal_date,
+            version=removal_version,
+            deprecator=PluginInfo._from_collection_name(collection_name),
+        )

        self.deprecated = True
        if removal_date:
@@ -193,28 +213,79 @@
        self.deprecation_warnings.append(warning_text)
        return self
-    def resolve(self, resolved_name, resolved_path, resolved_collection, exit_reason, action_plugin):
+    def resolve(self, resolved_name: str, resolved_path: str, resolved_collection: str, exit_reason: str, action_plugin: str) -> t.Self:
+        """Record a resolved collection plugin."""
         self.pending_redirect = None
         self.plugin_resolved_name = resolved_name
         self.plugin_resolved_path = resolved_path
         self.plugin_resolved_collection = resolved_collection
         self.exit_reason = exit_reason
+        self._python_module_name = resolved_name
         self.resolved = True
         self.action_plugin = action_plugin
         return self

+    def resolve_legacy(self, name: str, pull_cache: dict[str, PluginPathContext]) -> t.Self:
+        """Record a resolved legacy plugin."""
+        plugin_path_context = pull_cache[name]
+        self.plugin_resolved_name = name
+        self.plugin_resolved_path = plugin_path_context.path
+        self.plugin_resolved_collection = 'ansible.builtin' if plugin_path_context.internal else ''
+        self._resolved_fqcn = 'ansible.builtin.' + name if plugin_path_context.internal else name
+        self._python_module_name = self._make_legacy_python_module_name()
+        self.resolved = True
+        return self
+
+    def resolve_legacy_jinja_plugin(self, name: str, known_plugin: AnsibleJinja2Plugin) -> t.Self:
+        """Record a resolved legacy Jinja plugin."""
+        internal = known_plugin.ansible_name.startswith('ansible.builtin.')
+        self.plugin_resolved_name = name
+        self.plugin_resolved_path = known_plugin._original_path
+        self.plugin_resolved_collection = 'ansible.builtin' if internal else ''
+        self._resolved_fqcn = known_plugin.ansible_name
+        self._python_module_name = self._make_legacy_python_module_name()
+        self.resolved = True
+        return self
+
-    def redirect(self, redirect_name):
+    def redirect(self, redirect_name: str) -> t.Self:
         self.pending_redirect = redirect_name
         self.exit_reason = 'pending redirect resolution from {0} to {1}'.format(self.original_name, redirect_name)
         self.resolved = False
         return self

-    def nope(self, exit_reason):
+    def nope(self, exit_reason: str) -> t.Self:
         self.pending_redirect = None
         self.exit_reason = exit_reason
         self.resolved = False
         return self

+    def _make_legacy_python_module_name(self) -> str:
+        """
+        Generate a fully-qualified Python module name for a legacy/builtin plugin.
+        The same package namespace is shared for builtin and legacy plugins.
+        Explicit requests for builtins via `ansible.builtin` are handled elsewhere with an aliased collection package resolved by the collection loader.
+        Only unqualified and `ansible.legacy`-qualified requests land here; whichever plugin is visible at the time will end up in sys.modules.
+        Filter and test plugin host modules receive special name suffixes to avoid collisions unrelated to the actual plugin name.
+        """
+        name = os.path.splitext(self.plugin_resolved_path)[0]
+        basename = os.path.basename(name)
+
+        if self._plugin_type in ('filter', 'test'):
+            # Unlike other plugin types, filter and test plugin names are independent of the file where they are defined.
+            # As a result, the Python module name must be derived from the full path of the plugin.
+            # This prevents accidental shadowing of unrelated plugins of the same type.
+            basename += f'_{abs(hash(self.plugin_resolved_path))}'
+
+        return f'{self._legacy_package_name}.{basename}'
 class PluginLoader:
     """
@@ -224,7 +295,15 @@ class PluginLoader:
     paths, and the python path.  The first match is used.
     """

-    def __init__(self, class_name, package, config, subdir, aliases=None, required_base_class=None):
+    def __init__(
+        self,
+        class_name: str,
+        package: str,
+        config: str | list[str],
+        subdir: str,
+        aliases: dict[str, str] | None = None,
+        required_base_class: str | None = None,
+    ) -> None:
         aliases = {} if aliases is None else aliases

         self.class_name = class_name
@@ -250,15 +329,15 @@
             PLUGIN_PATH_CACHE[class_name] = defaultdict(dict)

         # hold dirs added at runtime outside of config
-        self._extra_dirs = []
+        self._extra_dirs: list[str] = []

         # caches
         self._module_cache = MODULE_CACHE[class_name]
         self._paths = PATH_CACHE[class_name]
         self._plugin_path_cache = PLUGIN_PATH_CACHE[class_name]
-        self._plugin_instance_cache = {} if self.subdir == 'vars_plugins' else None
+        self._plugin_instance_cache: dict[str, tuple[object, PluginLoadContext]] | None = {} if self.subdir == 'vars_plugins' else None

-        self._searched_paths = set()
+        self._searched_paths: set[str] = set()

     @property
     def type(self):
@@ -426,7 +505,8 @@
                 # if type name != 'module_doc_fragment':
                 if type_name in C.CONFIGURABLE_PLUGINS and not C.config.has_configuration_definition(type_name, name):
-                    documentation_source = getattr(module, 'DOCUMENTATION', '')
+                    # trust-tagged source propagates to loaded values; expressions and templates in config require trust
+                    documentation_source = _tags.TrustedAsTemplate().tag(getattr(module, 'DOCUMENTATION', ''))
                     try:
                         dstring = yaml.load(_tags.Origin(path=path).tag(documentation_source), Loader=AnsibleLoader)
                     except ParserError as e:
@@ -488,7 +568,13 @@
         entry = collection_meta.get('plugin_routing', {}).get(plugin_type, {}).get(subdir_qualified_resource, None)
         return entry
-    def _find_fq_plugin(self, fq_name, extension, plugin_load_context, ignore_deprecated=False):
+    def _find_fq_plugin(
+        self,
+        fq_name: str,
+        extension: str | None,
+        plugin_load_context: PluginLoadContext,
+        ignore_deprecated: bool = False,
+    ) -> PluginLoadContext:
         """Search builtin paths to find a plugin. No external paths are searched,
         meaning plugins inside roles inside collections will be ignored.
         """
@@ -525,17 +611,13 @@
                     version=removal_version,
                     date=removal_date,
                     removed=True,
-                    plugin=PluginInfo(
-                        requested_name=acr.collection,
-                        resolved_name=acr.collection,
-                        type='collection',
-                    ),
+                    deprecator=PluginInfo._from_collection_name(acr.collection),
                 )
-                plugin_load_context.removal_date = removal_date
-                plugin_load_context.removal_version = removal_version
+                plugin_load_context.date = removal_date
+                plugin_load_context.version = removal_version
                 plugin_load_context.resolved = True
                 plugin_load_context.exit_reason = removed_msg
-                raise AnsiblePluginRemovedError(removed_msg, plugin_load_context=plugin_load_context)
+                raise AnsiblePluginRemovedError(message=removed_msg, plugin_load_context=plugin_load_context)

             redirect = routing_metadata.get('redirect', None)
@@ -592,7 +674,7 @@
             # look for any matching extension in the package location (sans filter)
             found_files = [f
                            for f in glob.iglob(os.path.join(pkg_path, n_resource) + '.*')
-                           if os.path.isfile(f) and not f.endswith(C.MODULE_IGNORE_EXTS)]
+                           if os.path.isfile(f) and not any(f.endswith(ext) for ext in C.MODULE_IGNORE_EXTS)]

             if not found_files:
                 return plugin_load_context.nope('failed fuzzy extension match for {0} in {1}'.format(full_name, acr.collection))
@@ -623,7 +705,7 @@
         collection_list: list[str] | None = None,
     ) -> PluginLoadContext:
         """ Find a plugin named name, returning contextual info about the load, recursively resolving redirection """
-        plugin_load_context = PluginLoadContext()
+        plugin_load_context = PluginLoadContext(self.type, self.package)
         plugin_load_context.original_name = name
         while True:
             result = self._resolve_plugin_step(name, mod_type, ignore_deprecated, check_aliases, collection_list, plugin_load_context=plugin_load_context)
@@ -636,11 +718,8 @@
             else:
                 break

-        # TODO: smuggle these to the controller when we're in a worker, reduce noise from normal things like missing plugin packages during collection search
-        if plugin_load_context.error_list:
-            display.warning("errors were encountered during the plugin load for {0}:\n{1}".format(name, plugin_load_context.error_list))
-
-        # TODO: display/return import_error_list? Only useful for forensics...
+        for ex in plugin_load_context.raw_error_list:
+            display.error_as_warning(f"Error loading plugin {name!r}.", ex)

         # FIXME: store structured deprecation data in PluginLoadContext and use display.deprecate
         # if plugin_load_context.deprecated and C.config.get_config_value('DEPRECATION_WARNINGS'):
@@ -650,9 +729,15 @@

         return plugin_load_context

-    # FIXME: name bikeshed
-    def _resolve_plugin_step(self, name, mod_type='', ignore_deprecated=False,
-                             check_aliases=False, collection_list=None, plugin_load_context=PluginLoadContext()):
+    def _resolve_plugin_step(
+        self,
+        name: str,
+        mod_type: str = '',
+        ignore_deprecated: bool = False,
+        check_aliases: bool = False,
+        collection_list: list[str] | None = None,
+        plugin_load_context: PluginLoadContext | None = None,
+    ) -> PluginLoadContext:
         if not plugin_load_context:
             raise ValueError('A PluginLoadContext is required')
@@ -707,11 +792,14 @@
         except (AnsiblePluginRemovedError, AnsiblePluginCircularRedirect, AnsibleCollectionUnsupportedVersionError):
             # these are generally fatal, let them fly
             raise
-        except ImportError as ie:
-            plugin_load_context.import_error_list.append(ie)
         except Exception as ex:
-            # FIXME: keep actual errors, not just assembled messages
-            plugin_load_context.error_list.append(to_native(ex))
+            plugin_load_context.raw_error_list.append(ex)
+
+            # DTFIX-RELEASE: can we deprecate/remove these stringified versions?
+            if isinstance(ex, ImportError):
+                plugin_load_context.import_error_list.append(ex)
+            else:
+                plugin_load_context.error_list.append(str(ex))

         if plugin_load_context.error_list:
             display.debug(msg='plugin lookup for {0} failed; errors: {1}'.format(name, '; '.join(plugin_load_context.error_list)))
@@ -737,13 +825,7 @@
         # requested mod_type
         pull_cache = self._plugin_path_cache[suffix]
         try:
-            path_with_context = pull_cache[name]
-            plugin_load_context.plugin_resolved_path = path_with_context.path
-            plugin_load_context.plugin_resolved_name = name
-            plugin_load_context.plugin_resolved_collection = 'ansible.builtin' if path_with_context.internal else ''
-            plugin_load_context._resolved_fqcn = ('ansible.builtin.' + name if path_with_context.internal else name)
-            plugin_load_context.resolved = True
-            return plugin_load_context
+            return plugin_load_context.resolve_legacy(name=name, pull_cache=pull_cache)
         except KeyError:
             # Cache miss.  Now let's find the plugin
             pass
@@ -796,13 +878,7 @@
                 self._searched_paths.add(path)

             try:
-                path_with_context = pull_cache[name]
-                plugin_load_context.plugin_resolved_path = path_with_context.path
-                plugin_load_context.plugin_resolved_name = name
-                plugin_load_context.plugin_resolved_collection = 'ansible.builtin' if path_with_context.internal else ''
-                plugin_load_context._resolved_fqcn = 'ansible.builtin.' + name if path_with_context.internal else name
-                plugin_load_context.resolved = True
-                return plugin_load_context
+                return plugin_load_context.resolve_legacy(name=name, pull_cache=pull_cache)
             except KeyError:
                 # Didn't find the plugin in this directory.  Load modules from the next one
                 pass
@@ -810,18 +886,18 @@
         # if nothing is found, try finding alias/deprecated
         if not name.startswith('_'):
             alias_name = '_' + name
-            # We've already cached all the paths at this point
-            if alias_name in pull_cache:
-                path_with_context = pull_cache[alias_name]
-                if not ignore_deprecated and not os.path.islink(path_with_context.path):
-                    # FIXME: this is not always the case, some are just aliases
-                    display.deprecated('%s is kept for backwards compatibility but usage is discouraged. '  # pylint: disable=ansible-deprecated-no-version
-                                       'The module documentation details page may explain more about this rationale.' % name.lstrip('_'))
-                plugin_load_context.plugin_resolved_path = path_with_context.path
-                plugin_load_context.plugin_resolved_name = alias_name
-                plugin_load_context.plugin_resolved_collection = 'ansible.builtin' if path_with_context.internal else ''
-                plugin_load_context._resolved_fqcn = 'ansible.builtin.' + alias_name if path_with_context.internal else alias_name
-                plugin_load_context.resolved = True
+            try:
+                plugin_load_context.resolve_legacy(name=alias_name, pull_cache=pull_cache)
+            except KeyError:
+                pass
+            else:
+                display.deprecated(
+                    msg=f'Plugin {name!r} automatically redirected to {alias_name!r}.',
+                    help_text=f'Use {alias_name!r} instead of {name!r} to refer to the plugin.',
+                    version='2.23',
+                )

                 return plugin_load_context

         # last ditch, if it's something that can be redirected, look for a builtin redirect before giving up
@@ -831,7 +907,7 @@
         return plugin_load_context.nope('{0} is not eligible for last-chance resolution'.format(name))

-    def has_plugin(self, name, collection_list=None):
+    def has_plugin(self, name: str, collection_list: list[str] | None = None) -> bool:
         """ Checks if a plugin named name exists """

         try:
@@ -842,41 +918,37 @@
             # log and continue, likely an innocuous type/package loading failure in collections import
             display.debug('has_plugin error: {0}'.format(to_text(ex)))
+            return False

     __contains__ = has_plugin

-    def _load_module_source(self, name, path):
-        # avoid collisions across plugins
-        if name.startswith('ansible_collections.'):
-            full_name = name
-        else:
-            full_name = '.'.join([self.package, name])
-
-        if full_name in sys.modules:
+    def _load_module_source(self, *, python_module_name: str, path: str) -> types.ModuleType:
+        if python_module_name in sys.modules:
             # Avoids double loading, See https://github.com/ansible/ansible/issues/13110
-            return sys.modules[full_name]
+            return sys.modules[python_module_name]

         with warnings.catch_warnings():
             # FIXME: this still has issues if the module was previously imported but not "cached",
             # we should bypass this entire codepath for things that are directly importable
             warnings.simplefilter("ignore", RuntimeWarning)
-            spec = importlib.util.spec_from_file_location(to_native(full_name), to_native(path))
+            spec = importlib.util.spec_from_file_location(to_native(python_module_name), to_native(path))
             module = importlib.util.module_from_spec(spec)

             # mimic import machinery; make the module-being-loaded available in sys.modules during import
             # and remove if there's a failure...
-            sys.modules[full_name] = module
+            sys.modules[python_module_name] = module

             try:
                 spec.loader.exec_module(module)
             except Exception:
-                del sys.modules[full_name]
+                del sys.modules[python_module_name]
                 raise

         return module
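The register-before-exec pattern in `_load_module_source` above can be sketched on its own. This is a hypothetical standalone helper using the same stdlib machinery, not the ansible implementation:

```python
import importlib.util
import sys
import types

def load_module_from_path(module_name: str, path: str) -> types.ModuleType:
    # reuse an already-imported module to avoid executing the file twice
    if module_name in sys.modules:
        return sys.modules[module_name]

    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)

    # mimic the import machinery: publish the module in sys.modules during
    # exec so self-referential imports resolve, and roll back on failure
    sys.modules[module_name] = module
    try:
        spec.loader.exec_module(module)
    except Exception:
        del sys.modules[module_name]
        raise
    return module
```

Calling it twice with the same name returns the cached module object, which is exactly the double-load guard the diff preserves.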
     def _update_object(
         self,
+        *,
         obj: _AnsiblePluginInfoMixin,
         name: str,
         path: str,
@@ -907,9 +979,9 @@
         is_core_plugin = ctx.plugin_load_context.plugin_resolved_collection == 'ansible.builtin'
         if self.class_name == 'StrategyModule' and not is_core_plugin:
             display.deprecated(  # pylint: disable=ansible-deprecated-no-version
-                'Use of strategy plugins not included in ansible.builtin are deprecated and do not carry '
+                msg='Use of strategy plugins not included in ansible.builtin are deprecated and do not carry '
                 'any backwards compatibility guarantees. No alternative for third party strategy plugins '
-                'is currently planned.'
+                'is currently planned.',
             )

         return ctx.object
@@ -936,8 +1008,6 @@
             return get_with_context_result(None, plugin_load_context)

         fq_name = plugin_load_context.resolved_fqcn
-        if '.' not in fq_name and plugin_load_context.plugin_resolved_collection:
-            fq_name = '.'.join((plugin_load_context.plugin_resolved_collection, fq_name))
         resolved_type_name = plugin_load_context.plugin_resolved_name
         path = plugin_load_context.plugin_resolved_path
         if (cached_result := (self._plugin_instance_cache or {}).get(fq_name)) and cached_result[1].resolved:
@@ -947,7 +1017,7 @@
         redirected_names = plugin_load_context.redirect_list or []

         if path not in self._module_cache:
-            self._module_cache[path] = self._load_module_source(resolved_type_name, path)
+            self._module_cache[path] = self._load_module_source(python_module_name=plugin_load_context._python_module_name, path=path)
             found_in_cache = False

         self._load_config_defs(resolved_type_name, self._module_cache[path], path)
@@ -974,7 +1044,7 @@
                 # A plugin may need to use its _load_name in __init__ (for example, to set
                 # or get options from config), so update the object before using the constructor
                 instance = object.__new__(obj)
-                self._update_object(instance, resolved_type_name, path, redirected_names, fq_name)
+                self._update_object(obj=instance, name=resolved_type_name, path=path, redirected_names=redirected_names, resolved=fq_name)
                 obj.__init__(instance, *args, **kwargs)  # pylint: disable=unnecessary-dunder-call
                 obj = instance
             except TypeError as e:
@@ -984,12 +1054,12 @@
                     return get_with_context_result(None, plugin_load_context)
                 raise

-        self._update_object(obj, resolved_type_name, path, redirected_names, fq_name)
+        self._update_object(obj=obj, name=resolved_type_name, path=path, redirected_names=redirected_names, resolved=fq_name)

         if self._plugin_instance_cache is not None and getattr(obj, 'is_stateless', False):
             self._plugin_instance_cache[fq_name] = (obj, plugin_load_context)
         elif self._plugin_instance_cache is not None:
             # The cache doubles as the load order, so record the FQCN even if the plugin hasn't set is_stateless = True
-            self._plugin_instance_cache[fq_name] = (None, PluginLoadContext())
+            self._plugin_instance_cache[fq_name] = (None, PluginLoadContext(self.type, self.package))

         return get_with_context_result(obj, plugin_load_context)

     def _display_plugin_load(self, class_name, name, searched_paths, path, found_in_cache=None, class_only=None):
@@ -1064,10 +1134,15 @@
             basename = os.path.basename(name)
             is_j2 = isinstance(self, Jinja2Loader)

+            if path in legacy_excluding_builtin:
+                fqcn = basename
+            else:
+                fqcn = f"ansible.builtin.{basename}"
+
             if is_j2:
                 ref_name = path
             else:
-                ref_name = basename
+                ref_name = fqcn

             if not is_j2 and basename in _PLUGIN_FILTERS[self.package]:
                 # j2 plugins get processed in own class, here they would just be container files
@@ -1090,26 +1165,18 @@
                 yield path
                 continue

-            if path in legacy_excluding_builtin:
-                fqcn = basename
-            else:
-                fqcn = f"ansible.builtin.{basename}"
-
             if (cached_result := (self._plugin_instance_cache or {}).get(fqcn)) and cached_result[1].resolved:
                 # Here just in case, but we don't call all() multiple times for vars plugins, so this should not be used.
                 yield cached_result[0]
                 continue

             if path not in self._module_cache:
-                if self.type in ('filter', 'test'):
-                    # filter and test plugin files can contain multiple plugins
-                    # they must have a unique python module name to prevent them from shadowing each other
-                    full_name = '{0}_{1}'.format(abs(hash(path)), basename)
-                else:
-                    full_name = basename
+                path_context = PluginPathContext(path, path not in legacy_excluding_builtin)
+                load_context = PluginLoadContext(self.type, self.package)
+                load_context.resolve_legacy(basename, {basename: path_context})

                 try:
-                    module = self._load_module_source(full_name, path)
+                    module = self._load_module_source(python_module_name=load_context._python_module_name, path=path)
                 except Exception as e:
                     display.warning("Skipping plugin (%s), cannot load: %s" % (path, to_text(e)))
                     continue
@@ -1147,7 +1214,7 @@
             except TypeError as e:
                 display.warning("Skipping plugin (%s) as it seems to be incomplete: %s" % (path, to_text(e)))

-            self._update_object(obj, basename, path, resolved=fqcn)
+            self._update_object(obj=obj, name=basename, path=path, resolved=fqcn)

             if self._plugin_instance_cache is not None:
                 needs_enabled = False
@@ -1239,7 +1306,7 @@ class Jinja2Loader(PluginLoader):
             try:
                 # use 'parent' loader class to find files, but cannot return this as it can contain multiple plugins per file
                 if plugin_path not in self._module_cache:
-                    self._module_cache[plugin_path] = self._load_module_source(full_name, plugin_path)
+                    self._module_cache[plugin_path] = self._load_module_source(python_module_name=full_name, path=plugin_path)
                 module = self._module_cache[plugin_path]
                 obj = getattr(module, self.class_name)
             except Exception as e:
@@ -1262,7 +1329,7 @@
                 plugin = self._plugin_wrapper_type(func)
                 if plugin in plugins:
                     continue
-                self._update_object(plugin, full, plugin_path, resolved=fq_name)
+                self._update_object(obj=plugin, name=full, path=plugin_path, resolved=fq_name)
                 plugins.append(plugin)

         return plugins
@@ -1276,7 +1343,7 @@
         requested_name = name

-        context = PluginLoadContext()
+        context = PluginLoadContext(self.type, self.package)

         # avoid collection path for legacy
         name = name.removeprefix('ansible.legacy.')
@@ -1288,11 +1355,8 @@
             if isinstance(known_plugin, _DeferredPluginLoadFailure):
                 raise known_plugin.ex

-            context.resolved = True
-            context.plugin_resolved_name = name
-            context.plugin_resolved_path = known_plugin._original_path
-            context.plugin_resolved_collection = 'ansible.builtin' if known_plugin.ansible_name.startswith('ansible.builtin.') else ''
-            context._resolved_fqcn = known_plugin.ansible_name
+            context.resolve_legacy_jinja_plugin(name, known_plugin)

             return get_with_context_result(known_plugin, context)

         plugin = None
@@ -1328,7 +1392,12 @@
                             warning_text = f'{self.type.title()} "{key}" has been deprecated.{" " if warning_text else ""}{warning_text}'
-                            display.deprecated(warning_text, version=removal_version, date=removal_date, collection_name=acr.collection)
+                            display.deprecated(  # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
+                                msg=warning_text,
+                                version=removal_version,
+                                date=removal_date,
+                                deprecator=PluginInfo._from_collection_name(acr.collection),
+                            )

                         # check removal
                         tombstone_entry = routing_entry.get('tombstone')
@@ -1343,11 +1412,7 @@
                                 version=removal_version,
                                 date=removal_date,
                                 removed=True,
-                                plugin=PluginInfo(
-                                    requested_name=acr.collection,
-                                    resolved_name=acr.collection,
-                                    type='collection',
-                                ),
+                                deprecator=PluginInfo._from_collection_name(acr.collection),
                             )
                             raise AnsiblePluginRemovedError(exc_msg)
@@ -1400,7 +1465,7 @@
                 plugin = self._plugin_wrapper_type(func)
                 if plugin:
                     context = plugin_impl.plugin_load_context
-                    self._update_object(plugin, requested_name, plugin_impl.object._original_path, resolved=fq_name)
+                    self._update_object(obj=plugin, name=requested_name, path=plugin_impl.object._original_path, resolved=fq_name)
                     # context will have filename, which for tests/filters might not be correct
                     context._resolved_fqcn = plugin.ansible_name

                     # FIXME: once we start caching these results, we'll be missing functions that would have loaded later

@@ -230,8 +230,8 @@ class LookupModule(LookupBase):
             display.vvvv("url lookup connecting to %s" % term)
             if self.get_option('follow_redirects') in ('yes', 'no'):
                 display.deprecated(
-                    "Using 'yes' or 'no' for 'follow_redirects' parameter is deprecated.",
-                    version='2.22'
+                    msg="Using 'yes' or 'no' for 'follow_redirects' parameter is deprecated.",
+                    version='2.22',
                 )
             try:
                 response = open_url(

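The deprecation in the url lookup hunk above steers users from legacy `'yes'`/`'no'` strings toward real booleans. The normalization it implies can be sketched as follows; this helper is hypothetical, not code from this changeset:

```python
# Hypothetical sketch: map legacy 'yes'/'no' option values onto booleans,
# reporting whether the deprecated spelling was used.
def normalize_follow_redirects(value):
    legacy = {'yes': True, 'no': False}
    if isinstance(value, str) and value.lower() in legacy:
        # legacy spelling: still accepted, but flagged as deprecated
        return legacy[value.lower()], True
    # booleans and mode strings such as 'all'/'none'/'safe' pass through untouched
    return value, False

assert normalize_follow_redirects('yes') == (True, True)
assert normalize_follow_redirects('all') == ('all', False)
```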
@ -26,6 +26,7 @@ import sys
import threading import threading
import time import time
import typing as t import typing as t
import collections.abc as _c
from collections import deque from collections import deque
@@ -35,13 +36,13 @@ from ansible import context
 from ansible.errors import AnsibleError, AnsibleFileNotFound, AnsibleParserError, AnsibleTemplateError
 from ansible.executor.play_iterator import IteratingStates, PlayIterator
 from ansible.executor.process.worker import WorkerProcess
-from ansible.executor.task_result import TaskResult
+from ansible.executor.task_result import _RawTaskResult, _WireTaskResult
 from ansible.executor.task_queue_manager import CallbackSend, DisplaySend, PromptSend, TaskQueueManager
-from ansible.module_utils.six import string_types
 from ansible.module_utils.common.text.converters import to_text
 from ansible.module_utils.connection import Connection, ConnectionError
 from ansible.playbook.handler import Handler
 from ansible.playbook.helpers import load_list_of_blocks
+from ansible.playbook.included_file import IncludedFile
 from ansible.playbook.task import Task
 from ansible.playbook.task_include import TaskInclude
 from ansible.plugins import loader as plugin_loader
@@ -89,7 +90,9 @@ def _get_item_vars(result, task):
     return item_vars


-def results_thread_main(strategy):
+def results_thread_main(strategy: StrategyBase) -> None:
+    value: object
+
     while True:
         try:
             result = strategy._final_q.get()
@@ -99,13 +102,10 @@ def results_thread_main(strategy):
             dmethod = getattr(display, result.method)
             dmethod(*result.args, **result.kwargs)
         elif isinstance(result, CallbackSend):
-            for arg in result.args:
-                if isinstance(arg, TaskResult):
-                    strategy.normalize_task_result(arg)
-                    break
-            strategy._tqm.send_callback(result.method_name, *result.args, **result.kwargs)
-        elif isinstance(result, TaskResult):
-            strategy.normalize_task_result(result)
+            task_result = strategy._convert_wire_task_result_to_raw(result.wire_task_result)
+            strategy._tqm.send_callback(result.method_name, task_result)
+        elif isinstance(result, _WireTaskResult):
+            result = strategy._convert_wire_task_result_to_raw(result)
             with strategy._results_lock:
                 strategy._results.append(result)
         elif isinstance(result, PromptSend):
@@ -137,7 +137,7 @@ def results_thread_main(strategy):
 def debug_closure(func):
     """Closure to wrap ``StrategyBase._process_pending_results`` and invoke the task debugger"""
     @functools.wraps(func)
-    def inner(self, iterator, one_pass=False, max_passes=None):
+    def inner(self, iterator: PlayIterator, one_pass: bool = False, max_passes: int | None = None) -> list[_RawTaskResult]:
         status_to_stats_map = (
             ('is_failed', 'failures'),
             ('is_unreachable', 'dark'),
@@ -148,12 +148,12 @@ def debug_closure(func):
         # We don't know the host yet, copy the previous states, for lookup after we process new results
         prev_host_states = iterator.host_states.copy()

-        results = func(self, iterator, one_pass=one_pass, max_passes=max_passes)
-        _processed_results = []
+        results: list[_RawTaskResult] = func(self, iterator, one_pass=one_pass, max_passes=max_passes)
+        _processed_results: list[_RawTaskResult] = []

         for result in results:
-            task = result._task
-            host = result._host
+            task = result.task
+            host = result.host
             _queued_task_args = self._queued_task_cache.pop((host.name, task._uuid), None)
             task_vars = _queued_task_args['task_vars']
             play_context = _queued_task_args['play_context']
@@ -239,7 +239,7 @@ class StrategyBase:
         # outstanding tasks still in queue
         self._blocked_hosts: dict[str, bool] = dict()

-        self._results: deque[TaskResult] = deque()
+        self._results: deque[_RawTaskResult] = deque()
         self._results_lock = threading.Condition(threading.Lock())

         # create the result processing thread for reading results in the background
@@ -249,7 +249,7 @@ class StrategyBase:
         # holds the list of active (persistent) connections to be shutdown at
         # play completion
-        self._active_connections: dict[str, str] = dict()
+        self._active_connections: dict[Host, str] = dict()

         # Caches for get_host calls, to avoid calling excessively
         # These values should be set at the top of the ``run`` method of each
@@ -447,39 +447,33 @@ class StrategyBase:
         for target_host in host_list:
             _set_host_facts(target_host, always_facts)

-    def normalize_task_result(self, task_result):
-        """Normalize a TaskResult to reference actual Host and Task objects
-        when only given the ``Host.name``, or the ``Task._uuid``
-
-        Only the ``Host.name`` and ``Task._uuid`` are commonly sent back from
-        the ``TaskExecutor`` or ``WorkerProcess`` due to performance concerns
-
-        Mutates the original object
-        """
-        if isinstance(task_result._host, string_types):
-            # If the value is a string, it is ``Host.name``
-            task_result._host = self._inventory.get_host(to_text(task_result._host))
-
-        if isinstance(task_result._task, string_types):
-            # If the value is a string, it is ``Task._uuid``
-            queue_cache_entry = (task_result._host.name, task_result._task)
-            try:
-                found_task = self._queued_task_cache[queue_cache_entry]['task']
-            except KeyError:
-                # This should only happen due to an implicit task created by the
-                # TaskExecutor, restrict this behavior to the explicit use case
-                # of an implicit async_status task
-                if task_result._task_fields.get('action') != 'async_status':
-                    raise
-                original_task = Task()
-            else:
-                original_task = found_task.copy(exclude_parent=True, exclude_tasks=True)
-                original_task._parent = found_task._parent
-            original_task.from_attrs(task_result._task_fields)
-            task_result._task = original_task
-
-        return task_result
+    def _convert_wire_task_result_to_raw(self, wire_task_result: _WireTaskResult) -> _RawTaskResult:
+        """Return a `_RawTaskResult` created from a `_WireTaskResult`."""
+        host = self._inventory.get_host(wire_task_result.host_name)
+        queue_cache_entry = (host.name, wire_task_result.task_uuid)
+
+        try:
+            found_task = self._queued_task_cache[queue_cache_entry]['task']
+        except KeyError:
+            # This should only happen due to an implicit task created by the
+            # TaskExecutor, restrict this behavior to the explicit use case
+            # of an implicit async_status task
+            if wire_task_result.task_fields.get('action') != 'async_status':
+                raise
+
+            task = Task()
+        else:
+            task = found_task.copy(exclude_parent=True, exclude_tasks=True)
+            task._parent = found_task._parent
+
+        task.from_attrs(wire_task_result.task_fields)
+
+        return _RawTaskResult(
+            host=host,
+            task=task,
+            return_data=wire_task_result.return_data,
+            task_fields=wire_task_result.task_fields,
+        )

     def search_handlers_by_notification(self, notification: str, iterator: PlayIterator) -> t.Generator[Handler, None, None]:
         handlers = [h for b in reversed(iterator._play.handlers) for h in b.block]
@@ -537,7 +531,7 @@ class StrategyBase:
             yield handler

     @debug_closure
-    def _process_pending_results(self, iterator: PlayIterator, one_pass: bool = False, max_passes: int | None = None) -> list[TaskResult]:
+    def _process_pending_results(self, iterator: PlayIterator, one_pass: bool = False, max_passes: int | None = None) -> list[_RawTaskResult]:
         """
         Reads results off the final queue and takes appropriate action
         based on the result (executing callbacks, updating state, etc.).
@@ -553,8 +547,8 @@ class StrategyBase:
             finally:
                 self._results_lock.release()

-            original_host = task_result._host
-            original_task: Task = task_result._task
+            original_host = task_result.host
+            original_task: Task = task_result.task

             # all host status messages contain 2 entries: (msg, task_result)
             role_ran = False
@@ -588,7 +582,7 @@ class StrategyBase:
                             original_host.name,
                             dict(
                                 ansible_failed_task=original_task.serialize(),
-                                ansible_failed_result=task_result._result,
+                                ansible_failed_result=task_result._return_data,
                             ),
                         )
                     else:
@@ -596,7 +590,7 @@ class StrategyBase:
                 else:
                     self._tqm._stats.increment('ok', original_host.name)
                     self._tqm._stats.increment('ignored', original_host.name)
-                    if 'changed' in task_result._result and task_result._result['changed']:
+                    if task_result.is_changed():
                         self._tqm._stats.increment('changed', original_host.name)
                     self._tqm.send_callback('v2_runner_on_failed', task_result, ignore_errors=ignore_errors)
             elif task_result.is_unreachable():
@@ -618,9 +612,9 @@ class StrategyBase:
             if original_task.loop:
                 # this task had a loop, and has more than one result, so
                 # loop over all of them instead of a single result
-                result_items = task_result._result.get('results', [])
+                result_items = task_result._loop_results
             else:
-                result_items = [task_result._result]
+                result_items = [task_result._return_data]

             for result_item in result_items:
                 if '_ansible_notify' in result_item and task_result.is_changed():
@@ -665,7 +659,7 @@ class StrategyBase:
                 if 'add_host' in result_item or 'add_group' in result_item:
                     item_vars = _get_item_vars(result_item, original_task)
-                    found_task_vars = self._queued_task_cache.get((original_host.name, task_result._task._uuid))['task_vars']
+                    found_task_vars = self._queued_task_cache.get((original_host.name, task_result.task._uuid))['task_vars']
                     if item_vars:
                         all_task_vars = combine_vars(found_task_vars, item_vars)
                     else:
@@ -680,17 +674,17 @@ class StrategyBase:
                         original_task._resolve_conditional(original_task.failed_when, all_task_vars))

                     if original_task.loop or original_task.loop_with:
-                        new_item_result = TaskResult(
-                            task_result._host,
-                            task_result._task,
+                        new_item_result = _RawTaskResult(
+                            task_result.host,
+                            task_result.task,
                             result_item,
-                            task_result._task_fields,
+                            task_result.task_fields,
                         )
                         self._tqm.send_callback('v2_runner_item_on_ok', new_item_result)
                         if result_item.get('changed', False):
-                            task_result._result['changed'] = True
+                            task_result._return_data['changed'] = True
                         if result_item.get('failed', False):
-                            task_result._result['failed'] = True
+                            task_result._return_data['failed'] = True

                 if 'ansible_facts' in result_item and original_task.action not in C._ACTION_DEBUG:
                     # if delegated fact and we are delegating facts, we need to change target host for them
@@ -738,13 +732,13 @@ class StrategyBase:
                         else:
                             self._tqm._stats.set_custom_stats(k, data[k], myhost)

-            if 'diff' in task_result._result:
+            if 'diff' in task_result._return_data:
                 if self._diff or getattr(original_task, 'diff', False):
                     self._tqm.send_callback('v2_on_file_diff', task_result)

             if not isinstance(original_task, TaskInclude):
                 self._tqm._stats.increment('ok', original_host.name)
-                if 'changed' in task_result._result and task_result._result['changed']:
+                if task_result.is_changed():
                     self._tqm._stats.increment('changed', original_host.name)

             # finally, send the ok for this task
@@ -754,7 +748,7 @@ class StrategyBase:
             if original_task.register:
                 host_list = self.get_task_hosts(iterator, original_host, original_task)

-                clean_copy = strip_internal_keys(module_response_deepcopy(task_result._result))
+                clean_copy = strip_internal_keys(module_response_deepcopy(task_result._return_data))
                 if 'invocation' in clean_copy:
                     del clean_copy['invocation']
@@ -805,7 +799,7 @@ class StrategyBase:
         return ret_results

-    def _copy_included_file(self, included_file):
+    def _copy_included_file(self, included_file: IncludedFile) -> IncludedFile:
         """
         A proven safe and performant way to create a copy of an included file
         """
@@ -818,7 +812,7 @@ class StrategyBase:
         return ti_copy

-    def _load_included_file(self, included_file, iterator, is_handler=False, handle_stats_and_callbacks=True):
+    def _load_included_file(self, included_file: IncludedFile, iterator, is_handler=False, handle_stats_and_callbacks=True):
         """
         Loads an included YAML file of tasks, applying the optional set of variables.
@@ -828,12 +822,12 @@ class StrategyBase:
         """
         if handle_stats_and_callbacks:
             display.deprecated(
-                "Reporting play recap stats and running callbacks functionality for "
+                msg="Reporting play recap stats and running callbacks functionality for "
                 "``include_tasks`` in ``StrategyBase._load_included_file`` is deprecated. "
                 "See ``https://github.com/ansible/ansible/pull/79260`` for guidance on how to "
                 "move the reporting into specific strategy plugins to account for "
                 "``include_role`` tasks as well.",
-                version="2.21"
+                version="2.21",
             )
         display.debug("loading included file: %s" % included_file._filename)
         try:
@@ -865,11 +859,11 @@ class StrategyBase:
             else:
                 reason = to_text(e)

             if handle_stats_and_callbacks:
-                for r in included_file._results:
-                    r._result['failed'] = True
+                for tr in included_file._results:
+                    tr._return_data['failed'] = True

                 for host in included_file._hosts:
-                    tr = TaskResult(host=host, task=included_file._task, return_data=dict(failed=True, reason=reason))
+                    tr = _RawTaskResult(host=host, task=included_file._task, return_data=dict(failed=True, reason=reason), task_fields={})
                     self._tqm._stats.increment('failures', host.name)
                     self._tqm.send_callback('v2_runner_on_failed', tr)
             raise AnsibleError(reason) from e
@@ -905,7 +899,7 @@ class StrategyBase:
     def _cond_not_supported_warn(self, task_name):
         display.warning("%s task does not support when conditional" % task_name)

-    def _execute_meta(self, task: Task, play_context, iterator, target_host):
+    def _execute_meta(self, task: Task, play_context, iterator, target_host: Host):
         task.resolved_action = 'ansible.builtin.meta'  # _post_validate_args is never called for meta actions, so resolved_action hasn't been set

         # meta tasks store their args in the _raw_params field of args,
@@ -1083,7 +1077,7 @@ class StrategyBase:
         else:
             display.vv(f"META: {header}")

-        res = TaskResult(target_host, task, result)
+        res = _RawTaskResult(target_host, task, result, {})
         if skipped:
             self._tqm.send_callback('v2_runner_on_skipped', res)
         return [res]
@@ -1103,14 +1097,14 @@ class StrategyBase:
                 hosts_left.append(self._inventory.get_host(host))
         return hosts_left

-    def update_active_connections(self, results):
+    def update_active_connections(self, results: _c.Iterable[_RawTaskResult]) -> None:
         """ updates the current active persistent connections """
         for r in results:
-            if 'args' in r._task_fields:
-                socket_path = r._task_fields['args'].get('_ansible_socket')
+            if 'args' in r.task_fields:
+                socket_path = r.task_fields['args'].get('_ansible_socket')
                 if socket_path:
-                    if r._host not in self._active_connections:
-                        self._active_connections[r._host] = socket_path
+                    if r.host not in self._active_connections:
+                        self._active_connections[r.host] = socket_path


 class NextAction(object):
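The strategy changes above replace the mutating `normalize_task_result` with a pure conversion from a compact wire-format result (host name, task uuid, fields) to a hydrated raw result. A minimal sketch of that lookup-and-hydrate pattern, using simplified stand-in types (`WireResult`, `RawResult`, and plain-dict tasks are assumptions for illustration, not Ansible's real classes):

```python
from dataclasses import dataclass
from typing import Any


@dataclass(frozen=True)
class WireResult:
    """Compact form sent between processes: names and uuids, not live objects."""
    host_name: str
    task_uuid: str
    task_fields: dict[str, Any]
    return_data: dict[str, Any]


@dataclass
class RawResult:
    """Hydrated form used on the controller side."""
    host: str
    task: dict[str, Any]
    task_fields: dict[str, Any]
    return_data: dict[str, Any]


def convert_wire_to_raw(wire: WireResult,
                        queued_task_cache: dict[tuple[str, str], dict]) -> RawResult:
    # Look the task up by (host, uuid); fall back to a fresh task only for
    # implicit async_status results, mirroring the KeyError handling in the diff.
    try:
        task = dict(queued_task_cache[(wire.host_name, wire.task_uuid)]['task'])
    except KeyError:
        if wire.task_fields.get('action') != 'async_status':
            raise
        task = {}
    task.update(wire.task_fields)
    return RawResult(host=wire.host_name, task=task,
                     task_fields=wire.task_fields, return_data=wire.return_data)
```

Because the conversion builds a new object instead of mutating the incoming result, the same wire payload can be replayed safely (e.g. once for callbacks and once for the results queue).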

@@ -252,11 +252,11 @@ class StrategyModule(StrategyBase):
                         # FIXME: send the error to the callback; don't directly write to display here
                         display.error(ex)
                         for r in included_file._results:
-                            r._result['failed'] = True
-                            r._result['reason'] = str(ex)
-                            self._tqm._stats.increment('failures', r._host.name)
+                            r._return_data['failed'] = True
+                            r._return_data['reason'] = str(ex)
+                            self._tqm._stats.increment('failures', r.host.name)
                             self._tqm.send_callback('v2_runner_on_failed', r)
-                            failed_includes_hosts.add(r._host)
+                            failed_includes_hosts.add(r.host)
                         continue
                     else:
                         # since we skip incrementing the stats when the task result is

@@ -40,6 +40,8 @@ from ansible.utils.display import Display
     from ansible.inventory.host import Host
     from ansible.playbook.task import Task
     from ansible.executor.play_iterator import PlayIterator
+    from ansible.playbook.play_context import PlayContext
+    from ansible.executor import task_result as _task_result

 display = Display()
@@ -92,7 +94,7 @@ class StrategyModule(StrategyBase):
         return host_tasks

-    def run(self, iterator, play_context):
+    def run(self, iterator, play_context: PlayContext):  # type: ignore[override]
         """
         The linear strategy is simple - get the next task and queue
         it for all hosts, then wait for the queue to drain before
@@ -100,7 +102,7 @@ class StrategyModule(StrategyBase):
         """

         # iterate over each task, while there is one left to run
-        result = self._tqm.RUN_OK
+        result = int(self._tqm.RUN_OK)
         work_to_do = True

         self._set_hosts_cache(iterator._play)
@@ -125,7 +127,7 @@ class StrategyModule(StrategyBase):
             # flag set if task is set to any_errors_fatal
             any_errors_fatal = False

-            results = []
+            results: list[_task_result._RawTaskResult] = []
             for (host, task) in host_tasks:
                 if self._tqm._terminated:
                     break
@@ -285,11 +287,11 @@ class StrategyModule(StrategyBase):
                         # FIXME: send the error to the callback; don't directly write to display here
                         display.error(ex)
                         for r in included_file._results:
-                            r._result['failed'] = True
-                            r._result['reason'] = str(ex)
-                            self._tqm._stats.increment('failures', r._host.name)
+                            r._return_data['failed'] = True
+                            r._return_data['reason'] = str(ex)
+                            self._tqm._stats.increment('failures', r.host.name)
                             self._tqm.send_callback('v2_runner_on_failed', r)
-                            failed_includes_hosts.add(r._host)
+                            failed_includes_hosts.add(r.host)
                     else:
                         # since we skip incrementing the stats when the task result is
                         # first processed, we do so now for each host in the list
@@ -320,9 +322,9 @@ class StrategyModule(StrategyBase):
                 unreachable_hosts = []
                 for res in results:
                     if res.is_failed():
-                        failed_hosts.append(res._host.name)
+                        failed_hosts.append(res.host.name)
                     elif res.is_unreachable():
-                        unreachable_hosts.append(res._host.name)
+                        unreachable_hosts.append(res.host.name)

                 if any_errors_fatal and (failed_hosts or unreachable_hosts):
                     for host in hosts_left:
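The last hunk feeds the `any_errors_fatal` check by partitioning a batch of results into failed and unreachable host-name lists. A sketch of that bookkeeping, assuming simplified dict-based results (Ansible's real results expose `is_failed()`/`is_unreachable()` methods instead):

```python
# Hypothetical stand-in for the linear strategy's result partitioning:
# each result dict carries a host name plus optional failed/unreachable flags.

def partition_result_hosts(results):
    """Return (failed_hosts, unreachable_hosts) host-name lists."""
    failed_hosts = []
    unreachable_hosts = []
    for res in results:
        if res.get('failed'):
            failed_hosts.append(res['host'])
        elif res.get('unreachable'):
            unreachable_hosts.append(res['host'])
    return failed_hosts, unreachable_hosts
```

If either list is non-empty and the task is `any_errors_fatal`, the strategy then marks every remaining host as failed, which is what aborts the play for all hosts at once.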
