Compare commits

...

29 Commits

Author SHA1 Message Date
Matt Davis c742fdc66c
New release v2.19.0b3 (#85101) 7 months ago
pollenJP(@'ω'@) 7a932a93b0 get_url: missing closing brace in docs (#85096)
(cherry picked from commit 1c29910087)
7 months ago
Matt Davis 8c8717a8e4 Switch to stackwalk caller ID (#85095)
* See changelog fragment for most changes.
* Defer early config warnings until display is functioning, eliminating related fallback display logic.
* Added more type annotations and docstrings.
* ansible-test - pylint sanity for deprecations improved.
* Refactored inline legacy resolutions in PluginLoader.

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit ff6998f2b9)
7 months ago
Jordan Borean 6054b29cb7
Add win_script become tests (#85079)
(cherry picked from commit e4cac2ac33)
7 months ago
Brian Coca 131175a5a6
ensure predictable permissions on module artifacts (#84948)
and test it!

(cherry picked from commit 9f894b81c2)
7 months ago
Martin Krizek 0aab250fbc
dnf5: avoid generating excessive history entries (#85065)
Fixes #85046

(cherry picked from commit cff49a62ec)
7 months ago
Martin Krizek dcec78b0f9
async_status: fix example to use finished test (#85066)
Fixes #85048

(cherry picked from commit dbf131c07d)
7 months ago
Brian Coca ea22e5d0dd
show internal but not hidden config options, while still hiding test options (#84997)
(cherry picked from commit aab732cb82)
7 months ago
Brian Coca 867d9d3096
These actions do not support until (#84847)
(cherry picked from commit 8ab342f8cc)
7 months ago
Matt Clay e0e286c009
[stable-2.19] ansible-test - Use `-t` for container stop timeout (#85019) (#85055)
(cherry picked from commit 0aa8afbaf4)
7 months ago
Matt Clay 1c1a271b88
Update Ansible release version to v2.19.0b2.post0. (#85041) 7 months ago
Matt Clay 4e861fa9c8
New release v2.19.0b2 (#85040)
* New release v2.19.0b2

* Revert setuptools version bump
7 months ago
Matt Davis f898f9fec6 Implement TaskResult backward compatibility for callbacks (#85039)
* Implement TaskResult backward compatibility for callbacks
* general API cleanup
* misc deprecations

Co-authored-by: Matt Clay <matt@mystile.com>

* fix v2_on_any deprecation exclusion for base

---------

Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit 03181ac87b)
7 months ago
Matt Davis 4714194672 restore parsing/utils/jsonify.py (#85032)
(cherry picked from commit 2033993d89)
7 months ago
Abhijeet Kasurde ffbf121182
comment: raise an exception when an invalid option is provided (#84984)
Co-authored-by: Matt Clay <matt@mystile.com>
Signed-off-by: Abhijeet Kasurde <Akasurde@redhat.com>
(cherry picked from commit 1daa8412d5)
8 months ago
Brian Coca 89a4900b61
normalize error handler choices (#84998)
use existing to avoid deprecation cycle
normalize test too

(cherry picked from commit 2cbb721f6f)
8 months ago
Matt Clay 17d4fdd883
Increase galaxy test publish timeout (#85016)
(cherry picked from commit e6dc17cda4)
8 months ago
Lee Garrett 7fc916361e
Fix test_range_templating on 32-bit architectures (#85007)
* Fix test_range_templating on 32-bit architectures

32-bit architectures like i386, armel, armhf will fail with the error

ansible._internal._templating._errors.AnsibleTemplatePluginRuntimeError: The
filter plugin 'ansible.builtin.random' failed: Python int too large to convert
to C ssize_t

So just pick sys.maxsize (2**31 - 1) so it works on 32 bit machines.

---------

Co-authored-by: Lee Garrett <lgarrett@rocketjump.eu>
Co-authored-by: Matt Clay <matt@mystile.com>
(cherry picked from commit 5f6aef95ac)
8 months ago
Matt Davis 82ea3addce
Miscellaneous fixes (#85012)
* Add help_text to play_hosts deprecation

* clean up TaskResult type handling

(cherry picked from commit 1b6b910439)
8 months ago
Matt Clay 98009c811b
Disable retries on ansible-galaxy-collection (#85013)
(cherry picked from commit f7d03474a5)
8 months ago
Sloane Hertel de7c454684
Remove unused local function _get_plugin_vars from vars manager (#85008)
(cherry picked from commit 93e6f012cb)
8 months ago
Matt Clay 80d5f05642
Miscellaneous DT fixes (#84991)
* Use `_UNSET` instead of allowing `ellipsis`

* Fix deprecation warning pre-check

* Deprecation warnings from modules can now be disabled.
* Deprecation warnings from modules get the "can be disabled" notice.

* Include help text in pre-display fatal errors

* Simplify lookup warning/debug messaging

* Fix return type of `timedout` test plugin

* Use `object` for `_UNSET`

* Remove obsolete `convert_data` tests

* Remove unnecessary template from test

* Improve legacy YAML objects backward compat

* Fix templar backward compat for None overrides

(cherry picked from commit 6cc97447aa)
8 months ago
Matt Clay ec0d8f3278
Disable parallel publish in galaxy test (#85000)
(cherry picked from commit e094d48b1b)
8 months ago
Abhijeet Kasurde c21a817c47
filter_core integration test - remove Python 2.6 related dead code (#84985)
Signed-off-by: Abhijeet Kasurde <Akasurde@redhat.com>
(cherry picked from commit 500a4aba08)
8 months ago
Martin Krizek 85cb2baf1f
get_bin_path('ssh-agent'): required is deprecated (#84995)
(cherry picked from commit 4868effc71)
8 months ago
Felix Fontein 2fcfad54b0
ansible-doc: fix indent and line wrapping for first line of (sub-)option and (sub-)return value descriptions (#84690)
* Fix initial indent for descriptions of suboptions.
* Fix line width for initial line of option descriptions.

(cherry picked from commit 352d8ec33a)
8 months ago
Matt Clay 6f95a618af
Convert DT issue template to pre-release template (#84982)
(cherry picked from commit 9ddfe9db39)
8 months ago
Matt Martz 19d9253ec9
Update Ansible release version to v2.19.0b1.post0. (#84988) 8 months ago
Matt Martz 8d775ddced
New release v2.19.0b1 (#84979) 8 months ago

@@ -1,9 +1,8 @@
name: Fallible 2.19 Data Tagging Preview Bug Report
description: File a bug report against the Fallible 2.19 Data Tagging Preview
name: Pre-Release Bug Report
description: File a bug report against a pre-release version
labels:
- fallible_dt
- bug
- data_tagging
- pre_release
assignees:
- nitzmahone
- mattclay
@@ -12,15 +11,14 @@ body:
attributes:
value: |
## Bug Report
- type: dropdown
- type: textarea
attributes:
label: Fallible Version
description: The fallible release that reproduces the issue described.
options:
- 2025.4.1
- 2025.3.11
- 2025.3.3
- 2025.1.30
label: Ansible Version
description: Paste the full output from `ansible --version` below.
render: console
placeholder: $ ansible --version
validations:
required: true
- type: textarea
attributes:
label: Summary
@@ -37,8 +35,6 @@ body:
bin/ansible
### Issue Type
Bug Report
### Ansible Version
2.19.0.dev0
### Configuration
### OS / Environment
-->

@@ -0,0 +1,383 @@
==================================================================
ansible-core 2.19 "What Is and What Should Never Be" Release Notes
==================================================================
.. contents:: Topics
v2.19.0b3
=========
Release Summary
---------------
| Release Date: 2025-05-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Minor Changes
-------------
- ansible-config will now show internal, but not test, configuration entries. This allows for debugging while still denoting the configuration entries as internal use only (``_`` prefix).
- ansible-test - Improved ``pylint`` checks for Ansible-specific deprecation functions.
- ansible-test - Use the ``-t`` option to set the stop timeout when stopping a container. This avoids use of the ``--time`` option which was deprecated in Docker v28.0.
- collection metadata - The collection loader now parses scalar values from ``meta/runtime.yml`` as strings. This avoids issues caused by unquoted values such as versions or dates being parsed as types other than strings (see the sketch after this list).
- deprecation warnings - Deprecation warning APIs automatically capture the identity of the deprecating plugin. The ``collection_name`` argument is only required to correctly attribute deprecations that occur in module_utils or other non-plugin code.
- deprecation warnings - Improved deprecation messages to more clearly indicate the affected content, including plugin name when available.
- deprecations - Collection name strings not of the form ``ns.coll`` passed to deprecation API functions will result in an error.
- deprecations - Removed support for specifying deprecation dates as a ``datetime.date``, which was included in an earlier 2.19 pre-release.
- deprecations - Some arguments to ``deprecate_value`` were renamed for consistency with existing APIs. An earlier 2.19 pre-release included a ``removal_`` prefix on the ``date`` and ``version`` arguments.
- modules - The ``AnsibleModule.deprecate`` function no longer sends deprecation messages to the target host's logging system.
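As an illustration of the collection metadata entry above, here is a minimal, hypothetical ``meta/runtime.yml`` fragment (the module names are made up); with the new loader behavior, unquoted scalars such as dates are read as strings rather than native YAML types:

.. code-block:: yaml

    # Hypothetical meta/runtime.yml fragment for a collection.
    requires_ansible: ">=2.15.0"
    plugin_routing:
      modules:
        old_module:
          deprecation:
            removal_date: 2026-06-01          # now loaded as the string "2026-06-01", not a date object
            warning_text: Use new_module instead.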
Deprecated Features
-------------------
- Passing a ``warnings`` or ``deprecations`` key to ``exit_json`` or ``fail_json`` is deprecated. Use ``AnsibleModule.warn`` or ``AnsibleModule.deprecate`` instead.
- plugins - Accessing plugins with ``_``-prefixed filenames without the ``_`` prefix is deprecated.
Bugfixes
--------
- Ansible will now ensure predictable permissions on remote artifacts; until now it only ensured they were executable and relied on system masks for the rest.
- dnf5 - avoid generating excessive transaction entries in the dnf5 history (https://github.com/ansible/ansible/issues/85046)
v2.19.0b2
=========
Release Summary
---------------
| Release Date: 2025-04-24
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Minor Changes
-------------
- comment filter - Improve the error message shown when an invalid ``style`` argument is provided.
Bugfixes
--------
- Remove use of `required` parameter in `get_bin_path` which has been deprecated.
- ansible-doc - fix indentation for first line of descriptions of suboptions and sub-return values (https://github.com/ansible/ansible/pull/84690).
- ansible-doc - fix line wrapping for first line of description of options and return values (https://github.com/ansible/ansible/pull/84690).
v2.19.0b1
=========
Release Summary
---------------
| Release Date: 2025-04-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
Major Changes
-------------
- Jinja plugins - Jinja builtin filter and test plugins are now accessible via their fully-qualified names ``ansible.builtin.{name}``.
- Task Execution / Forks - Forks no longer inherit stdio from the parent ``ansible-playbook`` process. ``stdout``, ``stderr``, and ``stdin`` within a worker are detached from the terminal, and non-functional. Any controller-side plugin code that needs to access stdio from a fork must use ``Display``.
- ansible-test - Packages beneath ``module_utils`` can now contain ``__init__.py`` files.
- variables - The type system underlying Ansible's variable storage has been significantly overhauled and formalized. Attempts to store unsupported Python object types in variables will now result in an error.
- variables - To support new Ansible features, many variable objects are now represented by subclasses of their respective native Python types. In most cases, they behave indistinguishably from their original types, but some Python libraries do not handle builtin object subclasses properly. Custom plugins that interact with such libraries may require changes to convert and pass the native types.
Minor Changes
-------------
- Added a -vvvvv log message indicating when a host fails to produce output within the timeout period.
- AnsibleModule.uri - Add option ``multipart_encoding`` for ``form-multipart`` files in body to change default base64 encoding for files
- INVENTORY_IGNORE_EXTS config - removed ``ini`` from the default list; inventory scripts using a corresponding ``.ini`` configuration are rare now and ``inventory.ini`` files are more common. Those that need to ignore ``.ini`` files for inventory scripts can still add the extension to the configuration.
- Jinja plugins - Plugins can declare support for undefined values.
- Jinja2 version 3.1.0 or later is now required on the controller.
- Move ``follow_redirects`` parameter to module_utils so external modules can reuse it.
- PlayIterator - do not return tasks from already executed roles so specific strategy plugins do not have to do the filtering of such tasks themselves
- SSH Escalation-related -vvv log messages now include the associated host information.
- Windows - Add support for Windows Server 2025 to Ansible and as an ``ansible-test`` remote target - https://github.com/ansible/ansible/issues/84229
- Windows - refactor the async implementation to better handle errors during bootstrapping and avoid WMI when possible.
- ``ansible-galaxy collection install`` — the collection dependency resolver now prints out conflicts it hits during dependency resolution when it's taking too long and it ends up backtracking a lot. It also displays suggestions on how to help it compute the result more quickly.
- ansible, ansible-console, ansible-pull - add --flush-cache option (https://github.com/ansible/ansible/issues/83749).
- ansible-galaxy - Add support for Keycloak service accounts
- ansible-galaxy - support ``resolvelib >= 0.5.3, < 2.0.0`` (https://github.com/ansible/ansible/issues/84217).
- ansible-test - Added a macOS 15.3 remote VM, replacing 14.3.
- ansible-test - Automatically retry HTTP GET/PUT/DELETE requests on exceptions.
- ansible-test - Default to Python 3.13 in the ``base`` and ``default`` containers.
- ansible-test - Disable the ``deprecated-`` prefixed ``pylint`` rules as their results vary by Python version.
- ansible-test - Disable the ``pep8`` sanity test rules ``E701`` and ``E704`` to improve compatibility with ``black``.
- ansible-test - Improve container runtime probe error handling. When unexpected probe output is encountered, an error with more useful debugging information is provided.
- ansible-test - Replace container Alpine 3.20 with 3.21.
- ansible-test - Replace container Fedora 40 with 41.
- ansible-test - Replace remote Alpine 3.20 with 3.21.
- ansible-test - Replace remote Fedora 40 with 41.
- ansible-test - Replace remote FreeBSD 13.3 with 13.5.
- ansible-test - Replace remote FreeBSD 14.1 with 14.2.
- ansible-test - Replace remote RHEL 9.4 with 9.5.
- ansible-test - Show a more user-friendly error message when a ``runme.sh`` script is not executable.
- ansible-test - The ``yamllint`` sanity test now enforces string values for the ``!vault`` tag.
- ansible-test - Update ``nios-test-container`` to version 7.0.0.
- ansible-test - Update ``pylint`` sanity test to use version 3.3.1.
- ansible-test - Update distro containers to remove unnecessary packages (apache2, subversion, ruby).
- ansible-test - Update sanity test requirements to latest available versions.
- ansible-test - Update the HTTP test container.
- ansible-test - Update the PyPI test container.
- ansible-test - Update the ``base`` and ``default`` containers.
- ansible-test - Update the utility container.
- ansible-test - Use Python's ``urllib`` instead of ``curl`` for HTTP requests.
- ansible-test - When detection of the current container network fails, a warning is now issued and execution continues. This simplifies usage in cases where the current container cannot be inspected, such as when running in GitHub Codespaces.
- ansible-test acme test container - bump `version to 2.3.0 <https://github.com/ansible/acme-test-container/releases/tag/2.3.0>`__ to include newer versions of Pebble, dependencies, and runtimes. This adds support for ACME profiles, ``dns-account-01`` support, and some smaller improvements (https://github.com/ansible/ansible/pull/84547).
- apt_key module - add notes to docs and errors to point at the CLI tool deprecation by Debian and alternatives
- apt_repository module - add notes to errors to point at the CLI tool deprecation by Debian and alternatives
- become plugins get a new property 'pipelining' to show support, or lack thereof, for the feature.
- callback plugins - add has_option() to CallbackBase to match other functions overloaded from AnsiblePlugin
- callback plugins - fix get_options() for CallbackBase
- copy - fix sanity test failures (https://github.com/ansible/ansible/pull/83643).
- copy - parameter ``local_follow`` was incorrectly documented as having default value ``True`` (https://github.com/ansible/ansible/pull/83643).
- cron - Provide additional error information while writing cron file (https://github.com/ansible/ansible/issues/83223).
- csvfile - let the config system do the typecasting (https://github.com/ansible/ansible/pull/82263).
- display - Deduplication of warning and error messages considers the full content of the message (including source and traceback contexts, if enabled). This may result in fewer messages being omitted.
- distribution - Added openSUSE MicroOS to Suse OS family (#84685).
- dnf5, apt - add ``auto_install_module_deps`` option (https://github.com/ansible/ansible/issues/84206)
- docs - add collection name in message from which the module is being deprecated (https://github.com/ansible/ansible/issues/84116).
- env lookup - The error message generated for a missing environment variable when ``default`` is an undefined value (e.g. ``undef('something')``) will contain the hint from that undefined value, except when the undefined value is the default of ``undef()`` with no arguments. Previously, any existing undefined hint would be ignored. (See the sketch after this list.)
- file - enable file module to disable diff_mode (https://github.com/ansible/ansible/issues/80817).
- file - make code more readable and simple.
- filter - add support for URL-safe encoding and decoding in b64encode and b64decode (https://github.com/ansible/ansible/issues/84147).
- find - add a checksum_algorithm parameter to specify which type of checksum the module will return
- from_json filter - The filter accepts a ``profile`` argument, which defaults to ``tagless``.
- handlers - Templated handler names with syntax errors, or that resolve to ``omit`` are now skipped like handlers with undefined variables in their name.
- improved error message for yaml parsing errors in plugin documentation
- local connection plugin - A new ``become_strip_preamble`` config option (default True) was added; disable to preserve diagnostic ``become`` output in task results.
- local connection plugin - A new ``become_success_timeout`` operation-wide timeout config (default 10s) was added for ``become``.
- local connection plugin - When a ``become`` plugin's ``prompt`` value is a non-string after the ``check_password_prompt`` callback has completed, no prompt stripping will occur on stderr.
- lookup_template - add an option to trim blocks while templating (https://github.com/ansible/ansible/issues/75962).
- module - set ipv4 and ipv6 rules simultaneously in iptables module (https://github.com/ansible/ansible/issues/84404).
- module_utils - Add ``NoReturn`` type annotations to functions which never return.
- modules - PowerShell modules can now receive ``datetime.date``, ``datetime.time`` and ``datetime.datetime`` values as ISO 8601 strings.
- modules - PowerShell modules can now receive strings sourced from inline vault-encrypted strings.
- modules - Unhandled exceptions during Python module execution are now returned as structured data from the target. This allows the new traceback handling to be applied to exceptions raised on targets.
- pipelining logic has mostly moved to connection plugins so they can decide/override settings.
- plugin error handling - When raising exceptions in an exception handler, be sure to use ``raise ... from`` as appropriate. This supersedes the use of the ``AnsibleError`` arg ``orig_exc`` to represent the cause. Specifying ``orig_exc`` as the cause is still permitted. Failure to use ``raise ... from`` when ``orig_exc`` is set will result in a warning. Additionally, if the two cause exceptions do not match, a warning will be issued.
- removed hardcoding of su plugin as it now works with pipelining.
- runtime-metadata sanity test - improve validation of ``action_groups`` (https://github.com/ansible/ansible/pull/83965).
- service_facts module gained FreeBSD support.
- ssh connection plugin - Support ``SSH_ASKPASS`` mechanism to provide passwords, making it the default, but still offering an explicit choice to use ``sshpass`` (https://github.com/ansible/ansible/pull/83936)
- ssh connection plugin now overrides pipelining when a tty is requested.
- ssh-agent - ``ansible``, ``ansible-playbook`` and ``ansible-console`` are capable of spawning or reusing an ssh-agent, allowing plugins to interact with the ssh-agent. Additionally a pure python ssh-agent client has been added, enabling easy interaction with the agent. The ssh connection plugin contains new functionality via ``ansible_ssh_private_key`` and ``ansible_ssh_private_key_passphrase``, for loading an SSH private key into the agent from a variable.
- templating - Access to an undefined variable from inside a lookup, filter, or test (which raises MarkerError) no longer ends processing of the current template. The triggering undefined value is returned as the result of the offending plugin invocation, and the template continues to execute.
- templating - Embedding ``range()`` values in containers such as lists will result in an error on use. Previously the value would be converted to a string representing the range parameters, such as ``range(0, 3)``.
- templating - Handling of omitted values is now a first-class feature of the template engine, and is usable in all Ansible Jinja template contexts. Any template that resolves to ``omit`` is automatically removed from its parent container during templating.
- templating - Template evaluation is lazier than in previous versions. Template expressions which resolve only portions of a data structure no longer result in the entire structure being templated.
- templating - Templating errors now provide more information about both the location and context of the error, especially for deeply-nested and/or indirected templating scenarios.
- templating - Unified ``omit`` behavior now requires that plugins calling ``Templar.template()`` handle cases where the entire template result is omitted, by catching the ``AnsibleValueOmittedError`` that is raised. Previously, this condition caused a randomly-generated string marker to appear in the template result.
- templating - Variables of type ``set`` and ``tuple`` are now converted to ``list`` when exiting the final pass of templating.
- to_json / to_nice_json filters - The filters accept a ``profile`` argument, which defaults to ``tagless``.
- troubleshooting - Tracebacks can be collected and displayed for most errors, warnings, and deprecation warnings (including those generated by modules). Tracebacks are no longer enabled with ``-vvv``; the behavior is directly configurable via the ``DISPLAY_TRACEBACK`` config option. Module tracebacks passed to ``fail_json`` via the ``exception`` kwarg will not be included in the task result unless error tracebacks are configured.
- undef jinja function - The ``undef`` jinja function now raises an error if a non-string hint is given. Attempting to use an undefined hint also results in an error, ensuring incorrect use of the function can be distinguished from the function's normal behavior.
- validate-modules sanity test - make sure that ``module`` and ``plugin`` ``seealso`` entries use FQCNs (https://github.com/ansible/ansible/pull/84325).
- vault - improved vault filter documentation by adding missing example content for dump_template_data.j2, refining examples for clarity, and ensuring variable consistency (https://github.com/ansible/ansible/issues/83583).
- warnings - All warnings (including deprecation warnings) issued during a task's execution are now accessible via the ``warnings`` and ``deprecations`` keys on the task result.
- when the ``dict`` lookup is given a non-dict argument, show the value of the argument and its type in the error message.
- windows - enforce a hard minimum of PowerShell 5.1. Ansible dropped support for older versions of PowerShell in the 2.16 release, but this requirement is now enforced at runtime.
- windows - refactor windows exec runner to improve efficiency and add better error reporting on failures.
- winrm - Remove need for pexpect on macOS hosts when using ``kinit`` to retrieve the Kerberos TGT. By default the code will now only use the builtin ``subprocess`` library which should handle issues with select and a high fd count and also simplify the code.
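To illustrate the ``env`` lookup entry above, here is a minimal sketch (the variable name and hint text are hypothetical) of passing an undefined value with a hint as the ``default``, so a missing environment variable fails with that hint:

.. code-block:: yaml

    - name: Read a required environment variable, failing with a custom hint when it is unset
      ansible.builtin.debug:
        msg: "{{ lookup('ansible.builtin.env', 'MY_TOKEN', default=undef('MY_TOKEN must be exported before running this play')) }}"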
Breaking Changes / Porting Guide
--------------------------------
- Support for the ``toml`` library has been removed from TOML inventory parsing and dumping. Use ``tomli`` for parsing on Python 3.10. Python 3.11 and later have built-in support for parsing. Use ``tomli-w`` to support outputting inventory in TOML format.
- assert - The ``quiet`` argument must be a commonly-accepted boolean value. Previously, unrecognized values were silently treated as False.
- callback plugins - The structure of the ``exception``, ``warnings`` and ``deprecations`` values visible to callbacks has changed. Callbacks that inspect or serialize these values may require special handling.
- conditionals - Conditional expressions that result in non-boolean values are now an error by default. Such results often indicate unintentional use of templates where they are not supported, resulting in a conditional that is always true. When this option is enabled, conditional expressions which are a literal ``None`` or empty string will evaluate as true, for backwards compatibility. The error can be temporarily changed to a deprecation warning by enabling the ``ALLOW_BROKEN_CONDITIONALS`` config option.
- first_found lookup - When specifying ``files`` or ``paths`` as a templated list containing undefined values, the undefined list elements will be discarded with a warning. Previously, the entire list would be discarded without any warning.
- internals - The ``AnsibleLoader`` and ``AnsibleDumper`` classes for working with YAML are now factory functions and cannot be extended.
- internals - The ``ansible.utils.native_jinja`` Python module has been removed.
- inventory - Invalid variable names provided by inventories result in an inventory parse failure. This behavior is now consistent with other variable name usages throughout Ansible.
- lookup plugins - Lookup plugins called as `with_(lookup)` will no longer have the `_subdir` attribute set.
- lookup plugins - ``terms`` will always be passed to ``run`` as the first positional arg, where previously it was sometimes passed as a keyword arg when using ``with_`` syntax.
- loops - Omit placeholders no longer leak between loop item templating and task templating. Previously, ``omit`` placeholders could remain embedded in loop items after templating and be used as an ``omit`` for task templating. Now, values resolving to ``omit`` are dropped immediately when loop items are templated. To turn missing values into an ``omit`` for task templating, use ``| default(omit)``. This solution is backwards compatible with previous versions of ansible-core. (See the sketch after this list.)
- modules - Ansible modules using ``sys.excepthook`` must use a standard ``try/except`` instead.
- plugins - Any plugin that sources or creates templates must properly tag them as trusted.
- plugins - Custom Jinja plugins that accept undefined top-level arguments must opt in to receiving them.
- plugins - Custom Jinja plugins that use ``environment.getitem`` to retrieve undefined values will now trigger a ``MarkerError`` exception. This exception must be handled to allow the plugin to return a ``Marker``, or the plugin must opt-in to accepting ``Marker`` values.
- public API - The ``ansible.vars.fact_cache.FactCache`` wrapper has been removed.
- serialization of ``omit`` sentinel - Serialization of variables containing ``omit`` sentinels (e.g., by the ``to_json`` and ``to_yaml`` filters or ``ansible-inventory``) will fail if the variable has not completed templating. Previously, serialization succeeded with placeholder strings emitted in the serialized output.
- set_fact - The string values "yes", "no", "true" and "false" were previously converted (ignoring case) to boolean values when not using Jinja2 native mode. Since Jinja2 native mode is always used, this conversion no longer occurs. When boolean values are required, native boolean syntax should be used where variables are defined, such as in YAML. When native boolean syntax is not an option, the ``bool`` filter can be used to parse string values into booleans. (See the sketch after this list.)
- template lookup - The ``convert_data`` option is deprecated and no longer has any effect. Use the ``from_json`` filter on the lookup result instead.
- templating - Access to ``_`` prefixed attributes and methods, and methods with known side effects, is no longer permitted. In cases where a matching mapping key is present, the associated value will be returned instead of an error. This increases template environment isolation and ensures more consistent behavior between the ``.`` and ``[]`` operators.
- templating - Conditionals and lookups which use embedded inline templates in Jinja string constants now display a warning. These templates should be converted to their expression equivalent.
- templating - Many Jinja plugins (filters, lookups, tests) and methods previously silently ignored undefined inputs, which often masked subtle errors. Passing an undefined argument to a Jinja plugin or method that does not declare undefined support now results in an undefined value.
- templating - Templates are always rendered in Jinja2 native mode. As a result, non-string values are no longer automatically converted to strings.
- templating - Templates resulting in ``None`` are no longer automatically converted to an empty string.
- templating - Templates with embedded inline templates that were not contained within a Jinja string constant now result in an error, as support for multi-pass templating was removed for security reasons. In most cases, such templates can be easily rewritten to avoid the use of embedded inline templates.
- templating - The ``allow_unsafe_lookups`` option no longer has any effect. Lookup plugins are responsible for tagging strings containing templates to allow evaluation as a template.
- templating - The result of the ``range()`` global function cannot be returned from a template; it should always be passed to a filter (e.g., ``random``). Previously, range objects returned from an intermediate template were always converted to a list, which is inconsistent with inline consumption of range objects.
- templating - ``#jinja2:`` overrides in templates with invalid override names or types are now templating errors.
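Two of the porting notes above lend themselves to a short example. The sketch below (the task, user, and variable names are hypothetical) shows ``| default(omit)`` turning a missing per-item value into an omitted module argument, and the ``bool`` filter explicitly coercing a string flag now that implicit string-to-boolean conversion no longer happens:

.. code-block:: yaml

    - name: Create users, omitting the optional comment when an item does not define one
      ansible.builtin.user:
        name: "{{ item.name }}"
        comment: "{{ item.comment | default(omit) }}"  # missing values fall back to the module default
      loop:
        - { name: alice, comment: "App account" }
        - { name: bob }  # no comment key, so the argument is omitted

    - name: Coerce a string flag explicitly instead of relying on implicit conversion
      ansible.builtin.set_fact:
        feature_enabled: "{{ raw_flag | bool }}"  # raw_flag may be the string "yes"
      vars:
        raw_flag: "yes"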
Deprecated Features
-------------------
- CLI - The ``--inventory-file`` option alias is deprecated. Use the ``-i`` or ``--inventory`` option instead.
- Strategy Plugins - Use of strategy plugins not provided in ``ansible.builtin`` is deprecated and does not carry any backwards compatibility guarantees going forward. A future release will remove the ability to use external strategy plugins. No alternative for third party strategy plugins is currently planned.
- ``ansible.module_utils.compat.datetime`` - The datetime compatibility shims are now deprecated. They are scheduled to be removed in ``ansible-core`` v2.21. This includes ``UTC``, ``utcfromtimestamp()`` and ``utcnow`` importable from said module (https://github.com/ansible/ansible/pull/81874).
- bool filter - Support for coercing unrecognized input values (including None) has been deprecated. Consult the filter documentation for acceptable values, or consider use of the ``truthy`` and ``falsy`` tests.
- cache plugins - The `ansible.plugins.cache.base` Python module is deprecated. Use `ansible.plugins.cache` instead.
- callback plugins - The `v2_on_any` callback method is deprecated. Use specific callback methods instead.
- callback plugins - The v1 callback API (callback methods not prefixed with `v2_`) is deprecated. Use `v2_` prefixed methods instead.
- conditionals - Conditionals using Jinja templating delimiters (e.g., ``{{``, ``{%``) should be rewritten as expressions without delimiters, unless the entire conditional value is a single template that resolves to a trusted string expression. This is useful for dynamic indirection of conditional expressions, but is limited to trusted literal string expressions. (See the sketch after this list.)
- config - The ``ACTION_WARNINGS`` config has no effect. It previously disabled command warnings, which have since been removed.
- config - The ``DEFAULT_JINJA2_NATIVE`` option has no effect. Jinja2 native mode is now the default and only option.
- config - The ``DEFAULT_NULL_REPRESENTATION`` option has no effect. Null values are no longer automatically converted to another value during templating of single variable references.
- display - The ``Display.get_deprecation_message`` method has been deprecated. Call ``Display.deprecated`` to display a deprecation message, or call it with ``removed=True`` to raise an ``AnsibleError``.
- file loading - Loading text files with ``DataLoader`` containing data that cannot be decoded under the expected encoding is deprecated. In most cases the encoding must be UTF-8, although some plugins allow choosing a different encoding. Previously, invalid data was silently wrapped in Unicode surrogate escape sequences, often resulting in later errors or other data corruption.
- first_found lookup - Splitting of file paths on ``,;:`` is deprecated. Pass a list of paths instead. The ``split`` method on strings can be used to split variables into a list as needed. (See the sketch after this list.)
- interpreter discovery - The ``auto_legacy`` and ``auto_legacy_silent`` options for ``INTERPRETER_PYTHON`` are deprecated. Use ``auto`` or ``auto_silent`` options instead, as they have the same effect.
- oneline callback - The ``oneline`` callback and its associated ad-hoc CLI args (``-o``, ``--one-line``) are deprecated.
- paramiko - The paramiko connection plugin has been deprecated with planned removal in 2.21.
- playbook variables - The ``play_hosts`` variable has been deprecated, use ``ansible_play_batch`` instead.
- plugin error handling - The ``AnsibleError`` constructor arg ``suppress_extended_error`` is deprecated. Using ``suppress_extended_error=True`` has the same effect as ``show_content=False``.
- plugins - The ``listify_lookup_plugin_terms`` function is obsolete and in most cases no longer needed.
- template lookup - The jinja2_native option is no longer used in the Ansible Core code base. Jinja2 native mode is now the default and only option.
- templating - Support for enabling Jinja2 extensions (not plugins) has been deprecated.
- templating - The ``ansible_managed`` variable available for certain templating scenarios, such as the ``template`` action and ``template`` lookup has been deprecated. Define and use a custom variable instead of relying on ``ansible_managed``.
- templating - The ``disable_lookups`` option has no effect, since plugins must be updated to apply trust before any templating can be performed.
- to_yaml/to_nice_yaml filters - Implicit YAML dumping of vaulted value ciphertext is deprecated. Set `dump_vault_tags` to explicitly specify the desired behavior.
- tree callback - The ``tree`` callback and its associated ad-hoc CLI args (``-t``, ``--tree``) are deprecated.
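For the two migration-oriented entries above, a minimal sketch (the file names and the variable are hypothetical) showing a delimiter-free conditional and a ``first_found`` lookup given an explicit list of candidate paths instead of a ``,;:``-separated string:

.. code-block:: yaml

    - name: Conditional written as a bare expression, without Jinja delimiters
      ansible.builtin.debug:
        msg: "threshold reached"
      when: sample_count | int > 10

    - name: Include the first matching vars file from an explicit list
      ansible.builtin.include_vars:
        file: "{{ lookup('ansible.builtin.first_found', ['vars/site.yml', 'vars/default.yml']) }}"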
Removed Features (previously deprecated)
----------------------------------------
- Remove deprecated plural form of collection path (https://github.com/ansible/ansible/pull/84156).
- Removed deprecated STRING_CONVERSION_ACTION (https://github.com/ansible/ansible/issues/84220).
- encrypt - passing unsupported passlib hashtype now raises AnsibleFilterError.
- manager - remove deprecated include_delegate_to parameter from get_vars API.
- modules - Modules returning non-UTF8 strings now result in an error. The ``MODULE_STRICT_UTF8_RESPONSE`` setting can be used to disable this check.
- removed deprecated pycompat24 and compat.importlib.
- selector - remove deprecated compat.selector related files (https://github.com/ansible/ansible/pull/84155).
- windows - removed common module functions ``ConvertFrom-AnsibleJson``, ``Format-AnsibleException`` from Windows modules as they are not used and add unneeded complexity to the code.
Security Fixes
--------------
- include_vars action - Ensure that result masking is correctly requested when vault-encrypted files are read. (CVE-2024-8775)
- task result processing - Ensure that action-sourced result masking (``_ansible_no_log=True``) is preserved. (CVE-2024-8775)
- templating - Ansible's template engine no longer processes Jinja templates in strings unless they are marked as coming from a trusted source. Untrusted strings containing Jinja template markers are ignored with a warning. Examples of trusted sources include playbooks, vars files, and many inventory sources. Examples of untrusted sources include module results and facts. Plugins which have not been updated to preserve trust while manipulating strings may inadvertently cause them to lose their trusted status.
- templating - Changes to conditional expression handling removed numerous instances of insecure multi-pass templating (which could result in execution of untrusted template expressions).
- user action won't allow ssh-keygen, chown and chmod to run on existing ssh public key file, avoiding traversal on existing symlinks (CVE-2024-9902).
Bugfixes
--------
- Ansible will now also warn when reserved keywords are set via a module (set_fact, include_vars, etc).
- Ansible.Basic - Fix ``required_if`` check when the option value to check is unset or set to null.
- Correctly return ``False`` when using the ``filter`` and ``test`` Jinja tests on plugin names which are not filters or tests, respectively. (resolves issue https://github.com/ansible/ansible/issues/82084)
- Do not run implicit ``flush_handlers`` meta tasks when the whole play is excluded from the run due to tags specified.
- Errors now preserve stacked error messages even when YAML is involved.
- Fix a display.debug statement with the wrong param in _get_diff_data() method
- Fix disabling SSL verification when installing collections and roles from git repositories. If ``--ignore-certs`` isn't provided, the value for the ``GALAXY_IGNORE_CERTS`` configuration option will be used (https://github.com/ansible/ansible/issues/83326).
- Fix ipv6 pattern bug in lib/ansible/parsing/utils/addresses.py (https://github.com/ansible/ansible/issues/84237)
- Fix returning 'unreachable' for the overall task result. This prevents false positives when a looped task has unignored unreachable items (https://github.com/ansible/ansible/issues/84019).
- Implicit ``meta: flush_handlers`` tasks now have a parent block to prevent potential tracebacks when calling methods like ``get_play()`` on them internally.
- Improve performance on large inventories by reducing the number of implicit meta tasks.
- Jinja plugins - Errors raised will always be derived from ``AnsibleTemplatePluginError``.
- Optimize the way tasks from within ``include_tasks``/``include_role`` are inserted into the play.
- Time out waiting on become is an unreachable error (https://github.com/ansible/ansible/issues/84468)
- Use consistent multiprocessing context for action write locks
- Use the requested error message in the ansible.module_utils.facts.timeout timeout function instead of hardcoding one.
- Windows - add support for running on system where WDAC is in audit mode with ``Dynamic Code Security`` enabled.
- YAML parsing - The `!unsafe` tag no longer coerces non-string scalars to strings.
- ``ansible-galaxy`` — the collection dependency resolver now treats version specifiers starting with ``!=`` as unpinned.
- ``package``/``dnf`` action plugins - provide the reason behind the failure to gather the ``ansible_pkg_mgr`` fact to identify the package backend
- action plugins - Action plugins that raise unhandled exceptions no longer terminate playbook loops. Previously, exceptions raised by an action plugin caused abnormal loop termination and loss of loop iteration results.
- ansible-config - format galaxy server configs while dumping in JSON format (https://github.com/ansible/ansible/issues/84840).
- ansible-doc - If none of the files in files exists, path will be undefined and a direct reference will throw an UnboundLocalError (https://github.com/ansible/ansible/pull/84464).
- ansible-galaxy - Small adjustments to URL building for ``download_url`` and relative redirects.
- ansible-pull change detection will now work independently of callback or result format settings.
- ansible-test - Enable the ``sys.unraisablehook`` work-around for the ``pylint`` sanity test on Python 3.11. Previously the work-around was only enabled for Python 3.12 and later. However, the same issue has been discovered on Python 3.11.
- ansible-test - Ensure CA certificates are installed on managed FreeBSD instances.
- ansible-test - Fix support for PowerShell module_util imports with the ``-Optional`` flag.
- ansible-test - Fix support for detecting PowerShell modules importing module utils with the newer ``#AnsibleRequires`` format.
- ansible-test - Fix traceback that occurs after an interactive command fails.
- ansible-test - Fix up coverage reporting to properly translate the temporary path of integration test modules to the expected static test module path.
- ansible-test - Fixed traceback when handling certain YAML errors in the ``yamllint`` sanity test.
- ansible-test - Managed macOS instances now use the ``sudo_chdir`` option for the ``sudo`` become plugin to avoid permission errors when dropping privileges.
- ansible-vault will now correctly handle `--prompt`; previously it would issue an error about stdin if no second argument was passed
- ansible_uptime_second - added ansible_uptime_seconds fact support for AIX (https://github.com/ansible/ansible/pull/84321).
- apt_key module - prevent tests from running when apt-key was removed
- base.yml - deprecated libvirt_lxc_noseclabel config.
- build - Pin ``wheel`` in ``pyproject.toml`` to ensure compatibility with supported ``setuptools`` versions.
- config - various fixes to config lookup plugin (https://github.com/ansible/ansible/pull/84398).
- copy - refactor copy module for simplicity.
- copy action now prevents user from setting internal options.
- debconf - set empty password values (https://github.com/ansible/ansible/issues/83214).
- debug - hide loop vars in debug var display (https://github.com/ansible/ansible/issues/65856).
- default callback - Error context is now shown for failing tasks that use the ``debug`` action.
- display - The ``Display.deprecated`` method once again properly handles the ``removed=True`` argument (https://github.com/ansible/ansible/issues/82358).
- distro - add support for Linux Mint Debian Edition (LMDE) (https://github.com/ansible/ansible/issues/84934).
- distro - detect Debian as os_family for LMDE 6 (https://github.com/ansible/ansible/issues/84934).
- dnf5 - Handle forwarded exceptions from dnf5-5.2.13 where a generic ``RuntimeError`` was previously raised
- dnf5 - fix ``is_installed`` check for packages that are not installed but listed as provided by an installed package (https://github.com/ansible/ansible/issues/84578)
- dnf5 - fix installing a package using ``state=latest`` when a binary of the same name as the package is already installed (https://github.com/ansible/ansible/issues/84259)
- dnf5 - fix traceback when ``enable_plugins``/``disable_plugins`` is used on ``python3-libdnf5`` versions that do not support this functionality
- dnf5 - libdnf5 - use ``conf.pkg_gpgcheck`` instead of deprecated ``conf.gpgcheck`` which is used only as a fallback
- dnf5 - matching on a binary can be achieved only by specifying a full path (https://github.com/ansible/ansible/issues/84334)
- facts - gather pagesize and calculate respective values depending upon architecture (https://github.com/ansible/ansible/issues/84773).
- facts - skip if distribution file path is directory, instead of raising error (https://github.com/ansible/ansible/issues/84006).
- find - skip ENOENT error code while recursively enumerating files. The find module will now be tolerant of race conditions that remove files or directories from the target it is currently inspecting (https://github.com/ansible/ansible/issues/84873).
- first_found lookup - Corrected return value documentation to reflect None (not empty string) for no files found.
- gather_facts action now defaults to `ansible.legacy.setup` if `smart` was set, no network OS was found and no other alias for `setup` was present.
- gather_facts action will now issue errors and warnings as appropriate if a network OS is detected but no facts modules are defined for it.
- gather_facts action will now add setup when 'smart' appears with other modules in the FACTS_MODULES setting (#84750).
- get_url - add support for BSD-style checksum digest file (https://github.com/ansible/ansible/issues/84476).
- get_url - fix honoring ``filename`` from the ``content-disposition`` header even when the type is ``inline`` (https://github.com/ansible/ansible/issues/83690)
- host_group_vars - fixed defining the 'key' variable if the get_vars method is called with cache=False (https://github.com/ansible/ansible/issues/84384)
- include_vars - fix including previously undefined hash variables with hash_behaviour merge (https://github.com/ansible/ansible/issues/84295).
- iptables - Allows the wait parameter to be used with iptables chain creation (https://github.com/ansible/ansible/issues/84490)
- linear strategy - fix executing ``end_role`` meta tasks for each host, instead of handling these as implicit run_once tasks (https://github.com/ansible/ansible/issues/84660).
- local connection plugin - Become timeout errors now include all received data. Previously, the most recently-received data was discarded.
- local connection plugin - Ensure ``become`` success validation always occurs, even when an active plugin does not set ``prompt``.
- local connection plugin - Fixed cases where the internal ``BECOME-SUCCESS`` message appeared in task output.
- local connection plugin - Fixed hang or spurious failure when data arrived concurrently on stdout and stderr during a successful ``become`` operation validation.
- local connection plugin - Fixed hang when a become plugin expects a prompt but a password was not provided.
- local connection plugin - Fixed hang when an active become plugin incorrectly signals lack of prompt.
- local connection plugin - Fixed hang when an internal become read timeout expired before the password prompt was written.
- local connection plugin - Fixed hang when only one of stdout or stderr was closed by the ``become_exe`` subprocess.
- local connection plugin - Fixed long timeout/hang for ``become`` plugins that repeat their prompt on failure (e.g., ``sudo``, some ``su`` implementations).
- local connection plugin - Fixed silent ignore of ``become`` failures and loss of task output when data arrived concurrently on stdout and stderr during ``become`` operation validation.
- local connection plugin - Fixed task output header truncation when post-become data arrived before ``become`` operation validation had completed.
- lookup plugins - The ``terms`` arg to the ``run`` method is now always a list. Previously, there were cases where a non-list could be received.
- module arg templating - When using a templated raw task arg and a templated ``args`` keyword, args are now merged. Previously use of templated raw task args silently ignored all values from the templated ``args`` keyword.
- module defaults - Module defaults are no longer templated unless they are used by a task that does not override them. Previously, all module defaults for all modules were templated for every task.
- module respawn - limit to supported Python versions
- omitting task args - Use of omit for task args now properly falls back to args of lower precedence, such as module defaults. Previously an omitted value would obliterate values of lower precedence.
- package_facts module when using 'auto' will return the first package manager found that provides output, instead of just the first one found, as that one can be foreign to the system and report no packages.
- psrp - Improve stderr parsing when running raw commands that emit error records or stderr lines.
- regex_search filter - Corrected return value documentation to reflect None (not empty string) for no match.
- respawn - use copy of env variables to update existing PYTHONPATH value (https://github.com/ansible/ansible/issues/84954).
- runas become - Fix up become logic to still get the SYSTEM token with the most privileges when running as SYSTEM.
- sequence lookup - sequence query/lookups without positional arguments now return a valid list if their kwargs comprise a valid sequence expression (https://github.com/ansible/ansible/issues/82921).
- service_facts - skip lines which do not contain service names in openrc output (https://github.com/ansible/ansible/issues/84512).
- ssh - Improve the logic for parsing CLIXML data in stderr when working with a Windows host. This fixes issues when the raw stderr contains invalid UTF-8 byte sequences and improves handling of embedded CLIXML sequences.
- ssh - Raise exception when sshpass returns error code (https://github.com/ansible/ansible/issues/58133).
- ssh - connection options were incorrectly templated during ``reset_connection`` tasks (https://github.com/ansible/ansible/pull/84238).
- stability - Fixed silent process failure on unhandled IOError/OSError under ``linear`` strategy.
- su become plugin - Ensure generated regex from ``prompt_l10n`` config values is properly escaped.
- su become plugin - Ensure that password prompts are correctly detected in the presence of leading output. Previously, this case resulted in a timeout or hang.
- su become plugin - Ensure that trailing colon is expected on all ``prompt_l10n`` config values.
- sudo become plugin - The `sudo_chdir` config option allows the current directory to be set to the specified value before executing sudo to avoid permission errors when dropping privileges.
- sunos - remove hard coding of virtinfo command in facts gathering code (https://github.com/ansible/ansible/pull/84357).
- to_yaml/to_nice_yaml filters - Eliminated possibility of keyword arg collisions with internally-set defaults.
- unarchive - Clamp timestamps from beyond y2038 to representable values when unpacking zip files on platforms that use 32-bit time_t (e.g. Debian i386).
- uri - Form location correctly when the server returns a relative redirect (https://github.com/ansible/ansible/issues/84540)
- uri - Handle HTTP exceptions raised while reading the content (https://github.com/ansible/ansible/issues/83794).
- uri - mark ``url`` as required (https://github.com/ansible/ansible/pull/83642).
- user - Create Buildroot subclass as alias to Busybox (https://github.com/ansible/ansible/issues/83665).
- user - Set timeout for passphrase interaction.
- user - Update prompt for SSH key passphrase (https://github.com/ansible/ansible/issues/84484).
- user - Use higher precedence HOME_MODE as UMASK for path provided (https://github.com/ansible/ansible/pull/84482).
- user action will now require O(force) to overwrite the public part of an ssh key when generating ssh keys, as was already the case for the private part.
- user module now avoids changing ownership of files symlinked in provided home dir skeleton
- vars lookup - The ``default`` substitution only applies when trying to look up a variable which is not defined. If the variable is defined, but templates to an undefined value, the ``default`` substitution will not apply. Use the ``default`` filter to coerce those values instead. (See the sketch after this list.)
- wait_for_connection - a warning was displayed if any hosts used a local connection (https://github.com/ansible/ansible/issues/84419)
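Relating to the ``vars`` lookup entry above, a minimal sketch (the variable names are hypothetical) of the two distinct fallbacks described there: the lookup's ``default`` option for a variable that is not defined at all, and the ``default`` filter for a variable that is defined but resolves to an undefined value:

.. code-block:: yaml

    - name: Fall back when the variable is not defined at all
      ansible.builtin.debug:
        msg: "{{ lookup('ansible.builtin.vars', 'maybe_missing_var', default='fallback value') }}"

    - name: Coerce a defined-but-undefined result with the default filter instead
      ansible.builtin.debug:
        msg: "{{ lookup('ansible.builtin.vars', 'indirect_var') | default('fallback value') }}"
      vars:
        indirect_var: "{{ some_undefined_var }}"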
Known Issues
------------
- templating - Any string value starting with ``#jinja2:`` which is templated will always be interpreted as Jinja2 configuration overrides. To include this literal value at the start of a string, a space or other character must precede it.
- variables - Tagged values cannot be used for dictionary keys in many circumstances.
- variables - The values ``None``, ``True`` and ``False`` cannot be tagged because they are singletons. Attempts to apply tags to these values will be silently ignored.

@@ -1,2 +1,855 @@
ancestor: 2.18.0
releases: {}
releases:
2.19.0b1:
changes:
breaking_changes:
- Support for the ``toml`` library has been removed from TOML inventory parsing
and dumping. Use ``tomli`` for parsing on Python 3.10. Python 3.11 and later
have built-in support for parsing. Use ``tomli-w`` to support outputting inventory
in TOML format.
- assert - The ``quiet`` argument must be a commonly-accepted boolean value.
Previously, unrecognized values were silently treated as False.
- callback plugins - The structure of the ``exception``, ``warnings`` and ``deprecations``
values visible to callbacks has changed. Callbacks that inspect or serialize
these values may require special handling.
- conditionals - Conditional expressions that result in non-boolean values are
now an error by default. Such results often indicate unintentional use of
templates where they are not supported, resulting in a conditional that is
always true. When this option is enabled, conditional expressions which are
a literal ``None`` or empty string will evaluate as true, for backwards compatibility.
The error can be temporarily changed to a deprecation warning by enabling
the ``ALLOW_BROKEN_CONDITIONALS`` config option.
- first_found lookup - When specifying ``files`` or ``paths`` as a templated
list containing undefined values, the undefined list elements will be discarded
with a warning. Previously, the entire list would be discarded without any
warning.
- internals - The ``AnsibleLoader`` and ``AnsibleDumper`` classes for working
with YAML are now factory functions and cannot be extended.
- internals - The ``ansible.utils.native_jinja`` Python module has been removed.
- inventory - Invalid variable names provided by inventories result in an inventory
parse failure. This behavior is now consistent with other variable name usages
throughout Ansible.
- lookup plugins - Lookup plugins called as `with_(lookup)` will no longer have
the `_subdir` attribute set.
- lookup plugins - ``terms`` will always be passed to ``run`` as the first positional
arg, where previously it was sometimes passed as a keyword arg when using
``with_`` syntax.
- loops - Omit placeholders no longer leak between loop item templating and
task templating. Previously, ``omit`` placeholders could remain embedded in
loop items after templating and be used as an ``omit`` for task templating.
Now, values resolving to ``omit`` are dropped immediately when loop items
are templated. To turn missing values into an ``omit`` for task templating,
use ``| default(omit)``. This solution is backwards compatible with previous
versions of ansible-core.
- modules - Ansible modules using ``sys.excepthook`` must use a standard ``try/except``
instead.
- plugins - Any plugin that sources or creates templates must properly tag them
as trusted.
- plugins - Custom Jinja plugins that accept undefined top-level arguments must
opt in to receiving them.
- plugins - Custom Jinja plugins that use ``environment.getitem`` to retrieve
undefined values will now trigger a ``MarkerError`` exception. This exception
must be handled to allow the plugin to return a ``Marker``, or the plugin
must opt-in to accepting ``Marker`` values.
- public API - The ``ansible.vars.fact_cache.FactCache`` wrapper has been removed.
- serialization of ``omit`` sentinel - Serialization of variables containing
``omit`` sentinels (e.g., by the ``to_json`` and ``to_yaml`` filters or ``ansible-inventory``)
will fail if the variable has not completed templating. Previously, serialization
succeeded with placeholder strings emitted in the serialized output.
- set_fact - The string values "yes", "no", "true" and "false" were previously
converted (ignoring case) to boolean values when not using Jinja2 native mode.
Since Jinja2 native mode is always used, this conversion no longer occurs.
When boolean values are required, native boolean syntax should be used where
variables are defined, such as in YAML. When native boolean syntax is not
an option, the ``bool`` filter can be used to parse string values into booleans.
- template lookup - The ``convert_data`` option is deprecated and no longer
has any effect. Use the ``from_json`` filter on the lookup result instead.
- templating - Access to ``_`` prefixed attributes and methods, and methods
with known side effects, is no longer permitted. In cases where a matching
mapping key is present, the associated value will be returned instead of an
error. This increases template environment isolation and ensures more consistent
behavior between the ``.`` and ``[]`` operators.
- templating - Conditionals and lookups which use embedded inline templates
in Jinja string constants now display a warning. These templates should be
converted to their expression equivalent.
- templating - Many Jinja plugins (filters, lookups, tests) and methods previously
silently ignored undefined inputs, which often masked subtle errors. Passing
an undefined argument to a Jinja plugin or method that does not declare undefined
support now results in an undefined value.
- templating - Templates are always rendered in Jinja2 native mode. As a result,
non-string values are no longer automatically converted to strings.
- templating - Templates resulting in ``None`` are no longer automatically converted
to an empty string.
- templating - Templates with embedded inline templates that were not contained
within a Jinja string constant now result in an error, as support for multi-pass
templating was removed for security reasons. In most cases, such templates
can be easily rewritten to avoid the use of embedded inline templates.
- templating - The ``allow_unsafe_lookups`` option no longer has any effect.
Lookup plugins are responsible for tagging strings containing templates to
allow evaluation as a template.
- templating - The result of the ``range()`` global function cannot be returned
from a template; it should always be passed to a filter (e.g., ``random``).
Previously, range objects returned from an intermediate template were always
converted to a list, which is inconsistent with inline consumption of range
objects.
- templating - ``#jinja2:`` overrides in templates with invalid override names
or types are now templating errors.
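A minimal sketch of the ``default(omit)`` pattern referenced in the loops entry above; the module, user names, and group are illustrative only::

    - name: Only pass a group when the item defines one
      ansible.builtin.user:
        name: "{{ item.name }}"
        group: "{{ item.group | default(omit) }}"  # missing values fall back to omit
      loop:
        - { name: alice, group: admins }
        - { name: bob }                            # no group key, so the argument is omitted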
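And for the set_fact entry above, a short sketch of explicit boolean handling now that string-to-boolean conversion no longer occurs; the variable names are illustrative::

    - name: Strings stay strings; convert explicitly when a boolean is needed
      ansible.builtin.set_fact:
        answer_string: "yes"               # remains the string "yes"
        answer_bool: "{{ 'yes' | bool }}"  # explicit conversion to a boolean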
bugfixes:
- Ansible will now also warn when reserved keywords are set via a module (set_fact,
include_vars, etc).
- Ansible.Basic - Fix ``required_if`` check when the option value to check is
unset or set to null.
- Correctly return ``False`` when using the ``filter`` and ``test`` Jinja tests
on plugin names which are not filters or tests, respectively. (resolves issue
https://github.com/ansible/ansible/issues/82084)
- Do not run implicit ``flush_handlers`` meta tasks when the whole play is excluded
from the run due to tags specified.
- Errors now preserve stacked error messages even when YAML is involved.
- Fix a ``display.debug`` statement that passed the wrong parameter in the ``_get_diff_data()`` method.
- Fix disabling SSL verification when installing collections and roles from
git repositories. If ``--ignore-certs`` isn't provided, the value for the
``GALAXY_IGNORE_CERTS`` configuration option will be used (https://github.com/ansible/ansible/issues/83326).
- Fix ipv6 pattern bug in lib/ansible/parsing/utils/addresses.py (https://github.com/ansible/ansible/issues/84237)
- Fix returning 'unreachable' for the overall task result. This prevents false
positives when a looped task has unignored unreachable items (https://github.com/ansible/ansible/issues/84019).
- 'Implicit ``meta: flush_handlers`` tasks now have a parent block to prevent
potential tracebacks when calling methods like ``get_play()`` on them internally.'
- Improve performance on large inventories by reducing the number of implicit
meta tasks.
- Jinja plugins - Errors raised will always be derived from ``AnsibleTemplatePluginError``.
- Optimize the way tasks from within ``include_tasks``/``include_role`` are
inserted into the play.
- Timing out while waiting on ``become`` is now treated as an unreachable error (https://github.com/ansible/ansible/issues/84468)
- Use consistent multiprocessing context for action write locks
- Use the requested error message in the ansible.module_utils.facts.timeout
timeout function instead of hardcoding one.
- Windows - add support for running on system where WDAC is in audit mode with
``Dynamic Code Security`` enabled.
- YAML parsing - The `!unsafe` tag no longer coerces non-string scalars to strings.
- "``ansible-galaxy`` \u2014 the collection dependency resolver now treats version
specifiers starting with ``!=`` as unpinned."
- '``package``/``dnf`` action plugins - provide the reason behind the failure
to gather the ``ansible_pkg_mgr`` fact to identify the package backend'
- action plugins - Action plugins that raise unhandled exceptions no longer
terminate playbook loops. Previously, exceptions raised by an action plugin
caused abnormal loop termination and loss of loop iteration results.
- ansible-config - format galaxy server configs while dumping in JSON format
(https://github.com/ansible/ansible/issues/84840).
- ansible-doc - If none of the files in ``files`` exists, ``path`` will be undefined
  and a direct reference will throw an ``UnboundLocalError`` (https://github.com/ansible/ansible/pull/84464).
- ansible-galaxy - Small adjustments to URL building for ``download_url`` and
relative redirects.
- ansible-pull change detection will now work independently of callback or result
format settings.
- ansible-test - Enable the ``sys.unraisablehook`` work-around for the ``pylint``
sanity test on Python 3.11. Previously the work-around was only enabled for
Python 3.12 and later. However, the same issue has been discovered on Python
3.11.
- ansible-test - Ensure CA certificates are installed on managed FreeBSD instances.
- ansible-test - Fix support for PowerShell module_util imports with the ``-Optional``
flag.
- ansible-test - Fix support for detecting PowerShell modules importing module
utils with the newer ``#AnsibleRequires`` format.
- ansible-test - Fix traceback that occurs after an interactive command fails.
- ansible-test - Fix up coverage reporting to properly translate the temporary
path of integration test modules to the expected static test module path.
- ansible-test - Fixed traceback when handling certain YAML errors in the ``yamllint``
sanity test.
- ansible-test - Managed macOS instances now use the ``sudo_chdir`` option for
the ``sudo`` become plugin to avoid permission errors when dropping privileges.
- ansible-vault will now correctly handle ``--prompt``; previously it would issue
  an error about stdin if no second argument was passed
- ansible_uptime_seconds - added ``ansible_uptime_seconds`` fact support for AIX
  (https://github.com/ansible/ansible/pull/84321).
- apt_key module - prevent tests from running when apt-key was removed
- base.yml - deprecated libvirt_lxc_noseclabel config.
- build - Pin ``wheel`` in ``pyproject.toml`` to ensure compatibility with supported
``setuptools`` versions.
- config - various fixes to config lookup plugin (https://github.com/ansible/ansible/pull/84398).
- copy - refactor copy module for simplicity.
- copy action now prevents user from setting internal options.
- debconf - set empty password values (https://github.com/ansible/ansible/issues/83214).
- debug - hide loop vars in debug var display (https://github.com/ansible/ansible/issues/65856).
- default callback - Error context is now shown for failing tasks that use the
``debug`` action.
- display - The ``Display.deprecated`` method once again properly handles the
``removed=True`` argument (https://github.com/ansible/ansible/issues/82358).
- distro - add support for Linux Mint Debian Edition (LMDE) (https://github.com/ansible/ansible/issues/84934).
- distro - detect Debian as os_family for LMDE 6 (https://github.com/ansible/ansible/issues/84934).
- dnf5 - Handle forwarded exceptions from dnf5-5.2.13 where a generic ``RuntimeError``
was previously raised
- dnf5 - fix ``is_installed`` check for packages that are not installed but
listed as provided by an installed package (https://github.com/ansible/ansible/issues/84578)
- dnf5 - fix installing a package using ``state=latest`` when a binary of the
same name as the package is already installed (https://github.com/ansible/ansible/issues/84259)
- dnf5 - fix traceback when ``enable_plugins``/``disable_plugins`` is used on
``python3-libdnf5`` versions that do not support this functionality
- dnf5 - libdnf5 - use ``conf.pkg_gpgcheck`` instead of deprecated ``conf.gpgcheck``
which is used only as a fallback
- dnf5 - matching on a binary can be achieved only by specifying a full path
(https://github.com/ansible/ansible/issues/84334)
- facts - gather pagesize and calculate respective values depending upon architecture
(https://github.com/ansible/ansible/issues/84773).
- facts - skip if distribution file path is directory, instead of raising error
(https://github.com/ansible/ansible/issues/84006).
- find - skip ENOENT error code while recursively enumerating files. find module
will now be tolerant of race conditions that remove files or directories from
the target it is currently inspecting. (https://github.com/ansible/ansible/issues/84873).
- first_found lookup - Corrected return value documentation to reflect None
(not empty string) for no files found.
- gather_facts action now defaults to `ansible.legacy.setup` if `smart` was
set, no network OS was found and no other alias for `setup` was present.
- gather_facts action will now issue errors and warnings as appropriate if
  a network OS is detected but no facts modules are defined for it.
- gather_facts action will now add ``setup`` when 'smart' appears with other modules
  in the FACTS_MODULES setting (#84750).
- get_url - add support for BSD-style checksum digest file (https://github.com/ansible/ansible/issues/84476).
- get_url - fix honoring ``filename`` from the ``content-disposition`` header
even when the type is ``inline`` (https://github.com/ansible/ansible/issues/83690)
- host_group_vars - fixed defining the 'key' variable if the get_vars method
is called with cache=False (https://github.com/ansible/ansible/issues/84384)
- include_vars - fix including previously undefined hash variables with hash_behaviour
merge (https://github.com/ansible/ansible/issues/84295).
- iptables - Allows the ``wait`` parameter to be used with iptables chain creation
(https://github.com/ansible/ansible/issues/84490)
- linear strategy - fix executing ``end_role`` meta tasks for each host, instead
of handling these as implicit run_once tasks (https://github.com/ansible/ansible/issues/84660).
- local connection plugin - Become timeout errors now include all received data.
Previously, the most recently-received data was discarded.
- local connection plugin - Ensure ``become`` success validation always occurs,
even when an active plugin does not set ``prompt``.
- local connection plugin - Fixed cases where the internal ``BECOME-SUCCESS``
message appeared in task output.
- local connection plugin - Fixed hang or spurious failure when data arrived
concurrently on stdout and stderr during a successful ``become`` operation
validation.
- local connection plugin - Fixed hang when a become plugin expects a prompt
but a password was not provided.
- local connection plugin - Fixed hang when an active become plugin incorrectly
signals lack of prompt.
- local connection plugin - Fixed hang when an internal become read timeout
expired before the password prompt was written.
- local connection plugin - Fixed hang when only one of stdout or stderr was
closed by the ``become_exe`` subprocess.
- local connection plugin - Fixed long timeout/hang for ``become`` plugins that
repeat their prompt on failure (e.g., ``sudo``, some ``su`` implementations).
- local connection plugin - Fixed silent ignore of ``become`` failures and loss
of task output when data arrived concurrently on stdout and stderr during
``become`` operation validation.
- local connection plugin - Fixed task output header truncation when post-become
data arrived before ``become`` operation validation had completed.
- lookup plugins - The ``terms`` arg to the ``run`` method is now always a list.
Previously, there were cases where a non-list could be received.
- module arg templating - When using a templated raw task arg and a templated
``args`` keyword, args are now merged. Previously use of templated raw task
args silently ignored all values from the templated ``args`` keyword.
- module defaults - Module defaults are no longer templated unless they are
used by a task that does not override them. Previously, all module defaults
for all modules were templated for every task.
- module respawn - limit to supported Python versions
- omitting task args - Use of omit for task args now properly falls back to
args of lower precedence, such as module defaults. Previously an omitted value
would obliterate values of lower precedence.
- package_facts module - when using 'auto', it will return the first package
  manager found that provides output, instead of just the first one found, as
  that one can be foreign to the system and not have any packages.
- psrp - Improve stderr parsing when running raw commands that emit error records
or stderr lines.
- regex_search filter - Corrected return value documentation to reflect None
(not empty string) for no match.
- respawn - use copy of env variables to update existing PYTHONPATH value (https://github.com/ansible/ansible/issues/84954).
- runas become - Fix up become logic to still get the SYSTEM token with the
most privileges when running as SYSTEM.
- sequence lookup - sequence query/lookups without positional arguments now
return a valid list if their kwargs comprise a valid sequence expression (https://github.com/ansible/ansible/issues/82921).
- service_facts - skip lines which do not contain service names in openrc
output (https://github.com/ansible/ansible/issues/84512).
- ssh - Improve the logic for parsing CLIXML data in stderr when working with
  Windows hosts. This fixes issues when the raw stderr contains invalid UTF-8
  byte sequences and improves handling of embedded CLIXML sequences.
- ssh - Raise exception when sshpass returns error code (https://github.com/ansible/ansible/issues/58133).
- ssh - connection options were incorrectly templated during ``reset_connection``
tasks (https://github.com/ansible/ansible/pull/84238).
- stability - Fixed silent process failure on unhandled IOError/OSError under
``linear`` strategy.
- su become plugin - Ensure generated regex from ``prompt_l10n`` config values
is properly escaped.
- su become plugin - Ensure that password prompts are correctly detected in
the presence of leading output. Previously, this case resulted in a timeout
or hang.
- su become plugin - Ensure that trailing colon is expected on all ``prompt_l10n``
config values.
- sudo become plugin - The `sudo_chdir` config option allows the current directory
to be set to the specified value before executing sudo to avoid permission
errors when dropping privileges.
- sunos - remove hard coding of virtinfo command in facts gathering code (https://github.com/ansible/ansible/pull/84357).
- to_yaml/to_nice_yaml filters - Eliminated possibility of keyword arg collisions
with internally-set defaults.
- unarchive - Clamp timestamps from beyond y2038 to representable values when
unpacking zip files on platforms that use 32-bit time_t (e.g. Debian i386).
- uri - Form location correctly when the server returns a relative redirect
(https://github.com/ansible/ansible/issues/84540)
- uri - Handle HTTP exceptions raised while reading the content (https://github.com/ansible/ansible/issues/83794).
- uri - mark ``url`` as required (https://github.com/ansible/ansible/pull/83642).
- user - Create Buildroot subclass as alias to Busybox (https://github.com/ansible/ansible/issues/83665).
- user - Set timeout for passphrase interaction.
- user - Update prompt for SSH key passphrase (https://github.com/ansible/ansible/issues/84484).
- user - Use higher precedence HOME_MODE as UMASK for path provided (https://github.com/ansible/ansible/pull/84482).
- user action will now require O(force) to overwrite the public part of an ssh
key when generating ssh keys, as was already the case for the private part.
- user module now avoids changing ownership of files symlinked in provided home
dir skeleton
- vars lookup - The ``default`` substitution only applies when trying to look
up a variable which is not defined. If the variable is defined, but templates
to an undefined value, the ``default`` substitution will not apply. Use the
``default`` filter to coerce those values instead (see the example after this list).
- wait_for_connection - a warning was displayed if any hosts used a local connection
(https://github.com/ansible/ansible/issues/84419)
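To illustrate the vars lookup entry above: the lookup's ``default`` only covers a variable that is not defined at all, while the ``default`` filter coerces a variable that is defined but templates to an undefined value; the variable names are illustrative::

    - name: Fall back only when the variable itself is undefined
      ansible.builtin.debug:
        msg: "{{ lookup('ansible.builtin.vars', 'maybe_missing', default='fallback') }}"

    - name: Coerce a defined variable whose template result is undefined
      ansible.builtin.debug:
        msg: "{{ defined_but_unresolvable | default('fallback') }}"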
deprecated_features:
- CLI - The ``--inventory-file`` option alias is deprecated. Use the ``-i``
or ``--inventory`` option instead.
- Strategy plugins - Use of strategy plugins not provided in ``ansible.builtin``
  is deprecated and carries no backwards compatibility guarantees going
  forward. A future release will remove the ability to use external strategy
  plugins. No alternative for third party strategy plugins is currently planned.
- '``ansible.module_utils.compat.datetime`` - The datetime compatibility shims
are now deprecated. They are scheduled to be removed in ``ansible-core`` v2.21.
This includes ``UTC``, ``utcfromtimestamp()`` and ``utcnow`` importable from
said module (https://github.com/ansible/ansible/pull/81874).'
- bool filter - Support for coercing unrecognized input values (including None)
has been deprecated. Consult the filter documentation for acceptable values,
or consider use of the ``truthy`` and ``falsy`` tests.
- cache plugins - The `ansible.plugins.cache.base` Python module is deprecated.
Use `ansible.plugins.cache` instead.
- callback plugins - The `v2_on_any` callback method is deprecated. Use specific
callback methods instead.
- callback plugins - The v1 callback API (callback methods not prefixed with
`v2_`) is deprecated. Use `v2_` prefixed methods instead.
- conditionals - Conditionals using Jinja templating delimiters (e.g., ``{{``,
``{%``) should be rewritten as expressions without delimiters, unless the
entire conditional value is a single template that resolves to a trusted string
expression. This is useful for dynamic indirection of conditional expressions,
but is limited to trusted literal string expressions (see the example after this list).
- config - The ``ACTION_WARNINGS`` config has no effect. It previously disabled
command warnings, which have since been removed.
- config - The ``DEFAULT_JINJA2_NATIVE`` option has no effect. Jinja2 native
mode is now the default and only option.
- config - The ``DEFAULT_NULL_REPRESENTATION`` option has no effect. Null values
are no longer automatically converted to another value during templating of
single variable references.
- display - The ``Display.get_deprecation_message`` method has been deprecated.
Call ``Display.deprecated`` to display a deprecation message, or call it with
``removed=True`` to raise an ``AnsibleError``.
- file loading - Loading text files with ``DataLoader`` containing data that
cannot be decoded under the expected encoding is deprecated. In most cases
the encoding must be UTF-8, although some plugins allow choosing a different
encoding. Previously, invalid data was silently wrapped in Unicode surrogate
escape sequences, often resulting in later errors or other data corruption.
- first_found lookup - Splitting of file paths on ``,;:`` is deprecated. Pass
a list of paths instead. The ``split`` method on strings can be used to split
variables into a list as needed (see the example after this list).
- interpreter discovery - The ``auto_legacy`` and ``auto_legacy_silent`` options
for ``INTERPRETER_PYTHON`` are deprecated. Use ``auto`` or ``auto_silent``
options instead, as they have the same effect.
- oneline callback - The ``oneline`` callback and its associated ad-hoc CLI
args (``-o``, ``--one-line``) are deprecated.
- paramiko - The paramiko connection plugin has been deprecated with planned
removal in 2.21.
- playbook variables - The ``play_hosts`` variable has been deprecated, use
``ansible_play_batch`` instead.
- plugin error handling - The ``AnsibleError`` constructor arg ``suppress_extended_error``
is deprecated. Using ``suppress_extended_error=True`` has the same effect
as ``show_content=False``.
- plugins - The ``listify_lookup_plugin_terms`` function is obsolete and in
most cases no longer needed.
- template lookup - The jinja2_native option is no longer used in the Ansible
Core code base. Jinja2 native mode is now the default and only option.
- templating - Support for enabling Jinja2 extensions (not plugins) has been
deprecated.
- templating - The ``ansible_managed`` variable available for certain templating
scenarios, such as the ``template`` action and ``template`` lookup has been
deprecated. Define and use a custom variable instead of relying on ``ansible_managed``.
- templating - The ``disable_lookups`` option has no effect, since plugins must
be updated to apply trust before any templating can be performed.
- to_yaml/to_nice_yaml filters - Implicit YAML dumping of vaulted value ciphertext
is deprecated. Set `dump_vault_tags` to explicitly specify the desired behavior.
- tree callback - The ``tree`` callback and its associated ad-hoc CLI args (``-t``,
``--tree``) are deprecated.
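A minimal before/after for the conditionals deprecation entry above; ``my_var`` and its expected value are illustrative::

    # deprecated: Jinja delimiters embedded in the conditional
    when: "{{ my_var == 'expected' }}"

    # preferred: a bare expression without delimiters
    when: my_var == 'expected'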
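And a sketch for the first_found entry above: pass ``files`` as a list rather than a delimiter-separated string, using ``split`` to build the list where a legacy variable still holds one; the file and variable names are illustrative::

    - name: Include the first matching vars file
      ansible.builtin.include_vars: "{{ lookup('ansible.builtin.first_found', candidate_files) }}"
      vars:
        legacy_paths: "special.yml,default.yml"            # legacy comma-separated value
        candidate_files: "{{ legacy_paths.split(',') }}"   # pass a real list instead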
known_issues:
- templating - Any string value starting with ``#jinja2:`` which is templated
will always be interpreted as Jinja2 configuration overrides. To include this
literal value at the start of a string, a space or other character must precede
it.
- variables - Tagged values cannot be used for dictionary keys in many circumstances.
- variables - The values ``None``, ``True`` and ``False`` cannot be tagged because
they are singletons. Attempts to apply tags to these values will be silently
ignored.
major_changes:
- Jinja plugins - Jinja builtin filter and test plugins are now accessible via
their fully-qualified names ``ansible.builtin.{name}``.
- Task Execution / Forks - Forks no longer inherit stdio from the parent ``ansible-playbook``
process. ``stdout``, ``stderr``, and ``stdin`` within a worker are detached
from the terminal and are non-functional. Controller-side plugins that need to
access stdio from a fork must use ``Display``.
- ansible-test - Packages beneath ``module_utils`` can now contain ``__init__.py``
files.
- variables - The type system underlying Ansible's variable storage has been
significantly overhauled and formalized. Attempts to store unsupported Python
object types in variables will now result in an error.
- variables - To support new Ansible features, many variable objects are now
represented by subclasses of their respective native Python types. In most
cases, they behave indistinguishably from their original types, but some Python
libraries do not handle builtin object subclasses properly. Custom plugins
that interact with such libraries may require changes to convert and pass
the native types.
minor_changes:
- Added a -vvvvv log message indicating when a host fails to produce output
within the timeout period.
- AnsibleModule.uri - Add option ``multipart_encoding`` for ``form-multipart``
files in body to change default base64 encoding for files
- INVENTORY_IGNORE_EXTS config - removed ``ini`` from the default list; inventory
  scripts using a corresponding ``.ini`` configuration are rare now and ``inventory.ini``
  files are more common. Those who need to ignore ``.ini`` files for inventory
  scripts can still add the extension to the configuration.
- Jinja plugins - Plugins can declare support for undefined values.
- Jinja2 version 3.1.0 or later is now required on the controller.
- Move ``follow_redirects`` parameter to module_utils so external modules can
reuse it.
- PlayIterator - do not return tasks from already executed roles so specific
strategy plugins do not have to do the filtering of such tasks themselves
- SSH Escalation-related -vvv log messages now include the associated host information.
- Windows - Add support for Windows Server 2025 to Ansible and as an ``ansible-test``
remote target - https://github.com/ansible/ansible/issues/84229
- Windows - refactor the async implementation to better handle errors during
bootstrapping and avoid WMI when possible.
- "``ansible-galaxy collection install`` \u2014 the collection dependency resolver
now prints out conflicts it hits during dependency resolution when it's taking
too long and it ends up backtracking a lot. It also displays suggestions on
how to help it compute the result more quickly."
- 'ansible, ansible-console, ansible-pull - add --flush-cache option (https://github.com/ansible/ansible/issues/83749).
'
- ansible-galaxy - Add support for Keycloak service accounts
- ansible-galaxy - support ``resolvelib >= 0.5.3, < 2.0.0`` (https://github.com/ansible/ansible/issues/84217).
- ansible-test - Added a macOS 15.3 remote VM, replacing 14.3.
- ansible-test - Automatically retry HTTP GET/PUT/DELETE requests on exceptions.
- ansible-test - Default to Python 3.13 in the ``base`` and ``default`` containers.
- ansible-test - Disable the ``deprecated-`` prefixed ``pylint`` rules as their
results vary by Python version.
- ansible-test - Disable the ``pep8`` sanity test rules ``E701`` and ``E704``
to improve compatibility with ``black``.
- ansible-test - Improve container runtime probe error handling. When unexpected
probe output is encountered, an error with more useful debugging information
is provided.
- ansible-test - Replace container Alpine 3.20 with 3.21.
- ansible-test - Replace container Fedora 40 with 41.
- ansible-test - Replace remote Alpine 3.20 with 3.21.
- ansible-test - Replace remote Fedora 40 with 41.
- ansible-test - Replace remote FreeBSD 13.3 with 13.5.
- ansible-test - Replace remote FreeBSD 14.1 with 14.2.
- ansible-test - Replace remote RHEL 9.4 with 9.5.
- ansible-test - Show a more user-friendly error message when a ``runme.sh``
script is not executable.
- ansible-test - The ``yamllint`` sanity test now enforces string values for
the ``!vault`` tag.
- ansible-test - Update ``nios-test-container`` to version 7.0.0.
- ansible-test - Update ``pylint`` sanity test to use version 3.3.1.
- ansible-test - Update distro containers to remove unnecessary packages (apache2,
subversion, ruby).
- ansible-test - Update sanity test requirements to latest available versions.
- ansible-test - Update the HTTP test container.
- ansible-test - Update the PyPI test container.
- ansible-test - Update the ``base`` and ``default`` containers.
- ansible-test - Update the utility container.
- ansible-test - Use Python's ``urllib`` instead of ``curl`` for HTTP requests.
- ansible-test - When detection of the current container network fails, a warning
is now issued and execution continues. This simplifies usage in cases where
the current container cannot be inspected, such as when running in GitHub
Codespaces.
- ansible-test acme test container - bump `version to 2.3.0 <https://github.com/ansible/acme-test-container/releases/tag/2.3.0>`__
to include newer versions of Pebble, dependencies, and runtimes. This adds
support for ACME profiles, ``dns-account-01`` support, and some smaller improvements
(https://github.com/ansible/ansible/pull/84547).
- apt_key module - add notes to docs and errors to point at the CLI tool deprecation
by Debian and alternatives
- apt_repository module - add notes to errors to point at the CLI tool deprecation
by Debian and alternatives
- become plugins get a new property 'pipelining' to show support, or lack thereof,
  for the feature.
- callback plugins - add has_option() to CallbackBase to match other functions
overloaded from AnsiblePlugin
- callback plugins - fix get_options() for CallbackBase
- copy - fix sanity test failures (https://github.com/ansible/ansible/pull/83643).
- copy - parameter ``local_follow`` was incorrectly documented as having default
value ``True`` (https://github.com/ansible/ansible/pull/83643).
- cron - Provide additional error information while writing cron file (https://github.com/ansible/ansible/issues/83223).
- csvfile - let the config system do the typecasting (https://github.com/ansible/ansible/pull/82263).
- display - Deduplication of warning and error messages considers the full content
of the message (including source and traceback contexts, if enabled). This
may result in fewer messages being omitted.
- distribution - Added openSUSE MicroOS to Suse OS family (#84685).
- dnf5, apt - add ``auto_install_module_deps`` option (https://github.com/ansible/ansible/issues/84206)
- docs - add collection name in message from which the module is being deprecated
(https://github.com/ansible/ansible/issues/84116).
- env lookup - The error message generated for a missing environment variable
when ``default`` is an undefined value (e.g. ``undef('something')``) will
contain the hint from that undefined value, except when the undefined value
is the default of ``undef()`` with no arguments. Previously, any existing
undefined hint would be ignored.
- file - enable file module to disable diff_mode (https://github.com/ansible/ansible/issues/80817).
- file - make code more readable and simple.
- filter - add support for URL-safe encoding and decoding in b64encode and b64decode
(https://github.com/ansible/ansible/issues/84147).
- find - add a checksum_algorithm parameter to specify which type of checksum
the module will return
- from_json filter - The filter accepts a ``profile`` argument, which defaults
to ``tagless``.
- handlers - Templated handler names with syntax errors, or that resolve to
``omit`` are now skipped like handlers with undefined variables in their name.
- improved error message for yaml parsing errors in plugin documentation
- local connection plugin - A new ``become_strip_preamble`` config option (default
True) was added; disable to preserve diagnostic ``become`` output in task
results.
- local connection plugin - A new ``become_success_timeout`` operation-wide
timeout config (default 10s) was added for ``become``.
- local connection plugin - When a ``become`` plugin's ``prompt`` value is a
non-string after the ``check_password_prompt`` callback has completed, no
prompt stripping will occur on stderr.
- lookup_template - add an option to trim blocks while templating (https://github.com/ansible/ansible/issues/75962).
- module - set ipv4 and ipv6 rules simultaneously in iptables module (https://github.com/ansible/ansible/issues/84404).
- module_utils - Add ``NoReturn`` type annotations to functions which never
return.
- modules - PowerShell modules can now receive ``datetime.date``, ``datetime.time``
and ``datetime.datetime`` values as ISO 8601 strings.
- modules - PowerShell modules can now receive strings sourced from inline vault-encrypted
strings.
- modules - Unhandled exceptions during Python module execution are now returned
as structured data from the target. This allows the new traceback handling
to be applied to exceptions raised on targets.
- pipelining logic has mostly moved to connection plugins so they can decide/override
settings.
- plugin error handling - When raising exceptions in an exception handler, be
sure to use ``raise ... from`` as appropriate. This supersedes the use of
the ``AnsibleError`` arg ``orig_exc`` to represent the cause. Specifying ``orig_exc``
as the cause is still permitted. Failure to use ``raise ... from`` when ``orig_exc``
is set will result in a warning. Additionally, if the two cause exceptions
do not match, a warning will be issued.
- removed hardcoding of su plugin as it now works with pipelining.
- runtime-metadata sanity test - improve validation of ``action_groups`` (https://github.com/ansible/ansible/pull/83965).
- service_facts module - added FreeBSD support.
- ssh connection plugin - Support ``SSH_ASKPASS`` mechanism to provide passwords,
making it the default, but still offering an explicit choice to use ``sshpass``
(https://github.com/ansible/ansible/pull/83936)
- ssh connection plugin now overrides pipelining when a tty is requested.
- ssh-agent - ``ansible``, ``ansible-playbook`` and ``ansible-console`` are
capable of spawning or reusing an ssh-agent, allowing plugins to interact
with the ssh-agent. Additionally a pure python ssh-agent client has been added,
enabling easy interaction with the agent. The ssh connection plugin contains
new functionality via ``ansible_ssh_private_key`` and ``ansible_ssh_private_key_passphrase``,
for loading an SSH private key into the agent from a variable (see the sketch
after this list).
- templating - Access to an undefined variable from inside a lookup, filter,
or test (which raises MarkerError) no longer ends processing of the current
template. The triggering undefined value is returned as the result of the
offending plugin invocation, and the template continues to execute.
- templating - Embedding ``range()`` values in containers such as lists will
result in an error on use. Previously the value would be converted to a string
representing the range parameters, such as ``range(0, 3)``.
- templating - Handling of omitted values is now a first-class feature of the
template engine, and is usable in all Ansible Jinja template contexts. Any
template that resolves to ``omit`` is automatically removed from its parent
container during templating.
- templating - Template evaluation is lazier than in previous versions. Template
expressions which resolve only portions of a data structure no longer result
in the entire structure being templated.
- templating - Templating errors now provide more information about both the
location and context of the error, especially for deeply-nested and/or indirected
templating scenarios.
- templating - Unified ``omit`` behavior now requires that plugins calling ``Templar.template()``
handle cases where the entire template result is omitted, by catching the
``AnsibleValueOmittedError`` that is raised. Previously, this condition caused
a randomly-generated string marker to appear in the template result.
- templating - Variables of type ``set`` and ``tuple`` are now converted to
``list`` when exiting the final pass of templating.
- to_json / to_nice_json filters - The filters accept a ``profile`` argument,
which defaults to ``tagless``.
- troubleshooting - Tracebacks can be collected and displayed for most errors,
warnings, and deprecation warnings (including those generated by modules).
Tracebacks are no longer enabled with ``-vvv``; the behavior is directly configurable
via the ``DISPLAY_TRACEBACK`` config option. Module tracebacks passed to ``fail_json``
via the ``exception`` kwarg will not be included in the task result unless
error tracebacks are configured.
- undef jinja function - The ``undef`` jinja function now raises an error if
a non-string hint is given. Attempting to use an undefined hint also results
in an error, ensuring incorrect use of the function can be distinguished from
the function's normal behavior (see the example after this list).
- validate-modules sanity test - make sure that ``module`` and ``plugin`` ``seealso``
entries use FQCNs (https://github.com/ansible/ansible/pull/84325).
- vault - improved vault filter documentation by adding missing example content
for dump_template_data.j2, refining examples for clarity, and ensuring variable
consistency (https://github.com/ansible/ansible/issues/83583).
- warnings - All warnings (including deprecation warnings) issued during a task's
execution are now accessible via the ``warnings`` and ``deprecations`` keys
on the task result.
- when the ``dict`` lookup is given a non-dict argument, show the value of the
argument and its type in the error message.
- windows - enforce a hard minimum of PowerShell 5.1. Ansible dropped support
  for older versions of PowerShell in the 2.16 release but this requirement is
  now enforced at runtime.
- windows - refactor windows exec runner to improve efficiency and add better
error reporting on failures.
- winrm - Remove need for pexpect on macOS hosts when using ``kinit`` to retrieve
the Kerberos TGT. By default the code will now only use the builtin ``subprocess``
library which should handle issues with select and a high fd count and also
simplify the code.
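A hedged sketch for the ssh-agent entry above; the two ``ansible_ssh_private_key*`` variable names come from the entry itself, while the ``vault_``-prefixed variables are assumptions standing in for values kept in a vault-encrypted vars file::

    # host_vars/web1.yml (illustrative)
    ansible_ssh_private_key: "{{ vault_ssh_private_key }}"                # key material loaded into the agent
    ansible_ssh_private_key_passphrase: "{{ vault_ssh_key_passphrase }}"  # passphrase for the key, if any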
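And for the undef entry above, a small illustration using a string hint, which is now the only accepted form; the variable name and message are arbitrary::

    # group_vars/all.yml
    api_token: "{{ undef('Set api_token in inventory or pass it with -e') }}"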
release_summary: '| Release Date: 2025-04-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
removed_features:
- Remove deprecated plural form of collection path (https://github.com/ansible/ansible/pull/84156).
- Removed deprecated STRING_CONVERSION_ACTION (https://github.com/ansible/ansible/issues/84220).
- encrypt - passing unsupported passlib hashtype now raises AnsibleFilterError.
- manager - remove deprecated include_delegate_to parameter from get_vars API.
- modules - Modules returning non-UTF8 strings now result in an error. The ``MODULE_STRICT_UTF8_RESPONSE``
setting can be used to disable this check.
- removed deprecated pycompat24 and compat.importlib.
- selector - remove deprecated compat.selector related files (https://github.com/ansible/ansible/pull/84155).
- windows - removed common module functions ``ConvertFrom-AnsibleJson``, ``Format-AnsibleException``
from Windows modules as they are not used and added unneeded complexity to the
code.
security_fixes:
- include_vars action - Ensure that result masking is correctly requested when
vault-encrypted files are read. (CVE-2024-8775)
- task result processing - Ensure that action-sourced result masking (``_ansible_no_log=True``)
is preserved. (CVE-2024-8775)
- templating - Ansible's template engine no longer processes Jinja templates
in strings unless they are marked as coming from a trusted source. Untrusted
strings containing Jinja template markers are ignored with a warning. Examples
of trusted sources include playbooks, vars files, and many inventory sources.
Examples of untrusted sources include module results and facts. Plugins which
have not been updated to preserve trust while manipulating strings may inadvertently
cause them to lose their trusted status.
- templating - Changes to conditional expression handling removed numerous instances
of insecure multi-pass templating (which could result in execution of untrusted
template expressions).
- user action won't allow ssh-keygen, chown and chmod to run on existing ssh
public key file, avoiding traversal on existing symlinks (CVE-2024-9902).
codename: What Is and What Should Never Be
fragments:
- 2.19.0b1_summary.yaml
- 81709-ansible-galaxy-slow-resolution-hints.yml
- 81812-ansible-galaxy-negative-spec-is-pinned.yml
- 81874-deprecate-datetime-compat.yml
- 83642-fix-sanity-ignore-for-uri.yml
- 83643-fix-sanity-ignore-for-copy.yml
- 83690-get_url-content-disposition-filename.yml
- 83700-enable-file-disable-diff.yml
- 83757-deprecate-paramiko.yml
- 83936-ssh-askpass.yml
- 83965-action-groups-schema.yml
- 84008-additional-logging.yml
- 84019-ignore_unreachable-loop.yml
- 84149-add-flush-cache-for-adhoc-commands.yml
- 84206-dnf5-apt-auto-install-module-deps.yml
- 84213-ansible-galaxy-url-building.yml
- 84229-windows-server-2025.yml
- 84238-fix-reset_connection-ssh_executable-templated.yml
- 84259-dnf5-latest-fix.yml
- 84321-added-ansible_uptime_seconds_aix.yml
- 84325-validate-modules-seealso-fqcn.yml
- 84334-dnf5-consolidate-settings.yml
- 84384-fix-undefined-key-host-group-vars.yml
- 84419-fix-wait_for_connection-warning.yml
- 84468-timeout_become_unreachable.yml
- 84473-dict-lookup-type-error-message.yml
- 84490-allow-iptables-chain-creation-with-wait.yml
- 84496-CallbackBase-get_options.yml
- 84540-uri-relative-redirect.yml
- 84547-acme-test-container.yml
- 84578-dnf5-is_installed-provides.yml
- 84660-fix-meta-end_role-linear-strategy.yml
- 84685-add-opensuse-microos.yml
- 84705-error-message-malformed-plugin-documentation.yml
- 84725-deprecate-strategy-plugins.yml
- Ansible.Basic-required_if-null.yml
- ansible-galaxy-keycloak-service-accounts.yml
- ansible-test-added-macos-15.3.yml
- ansible-test-containers.yml
- ansible-test-coverage-test-files.yml
- ansible-test-curl.yml
- ansible-test-fix-command-traceback.yml
- ansible-test-freebsd-nss.yml
- ansible-test-network-detection.yml
- ansible-test-nios-container.yml
- ansible-test-no-exec-script.yml
- ansible-test-probe-error-handling.yml
- ansible-test-pylint-fix.yml
- ansible-test-remotes.yml
- ansible-test-update.yml
- apt_key_bye.yml
- become-runas-system-deux.yml
- buildroot.yml
- compat_removal.yml
- config.yml
- config_dump.yml
- copy_validate_input.yml
- cron_err.yml
- csvfile-col.yml
- cve-2024-8775.yml
- darwin_pagesize.yml
- debconf_empty_password.yml
- deprecated.yml
- distro_LMDE_6.yml
- dnf5-exception-forwarding.yml
- dnf5-plugins-compat.yml
- dnf5-remove-usage-deprecated-option.yml
- feature-uri-add-option-multipart-encoding.yml
- file_simplify.yml
- find-checksum.yml
- find_enoent.yml
- fix-ansible-galaxy-ignore-certs.yml
- fix-cli-doc-path_undefined.yaml
- fix-display-bug-in-action-plugin.yml
- fix-include_vars-merge-hash.yml
- fix-ipv6-pattern.yml
- fix-is-filter-is-test.yml
- fix-lookup-sequence-keyword-args-only.yml
- fix-module-utils-facts-timeout.yml
- fix_errors.yml
- follow_redirects_url.yml
- gather_facts_netos_fixes.yml
- gather_facts_smart_fix.yml
- get_url_bsd_style_digest.yml
- hide-loop-vars-debug-vars.yml
- implicit_flush_handlers_parents.yml
- include_delegate_to.yml
- interpreter-discovery-auto-legacy.yml
- jinja-version.yml
- libvirt_lxc.yml
- local-become-fixes.yml
- lookup_config.yml
- macos-correct-lock.yml
- no-inherit-stdio.yml
- no-return.yml
- openrc-status.yml
- os_family.yml
- package-dnf-action-plugins-facts-fail-msg.yml
- package_facts_fix.yml
- passlib.yml
- pin-wheel.yml
- pipelining_refactor.yml
- playiterator-add_tasks-optimize.yml
- ps-import-sanity.yml
- pull_changed_fix.yml
- remove_ini_ignored_dir.yml
- reserved_module_chekc.yml
- respawn-min-python.yml
- respawn_os_env.yml
- selector_removal.yml
- service_facts_fbsd.yml
- set_ipv4_and_ipv6_simultaneously.yml
- simplify-copy-module.yml
- skip-handlers-tagged-play.yml
- skip-implicit-flush_handlers-no-notify.yml
- skip-role-task-iterator.yml
- ssh-agent.yml
- ssh-clixml.yml
- ssh_raise_exception.yml
- string_conversion.yml
- sunos_virtinfo.yml
- templates_types_datatagging.yml
- toml-library-support-dropped.yml
- trim_blocks.yml
- unarchive_timestamp_t32.yaml
- update-resolvelib-lt-2_0_0.yml
- uri_httpexception.yml
- url_safe_b64_encode_decode.yml
- user_action_fix.yml
- user_module.yml
- user_passphrase.yml
- user_ssh_fix.yml
- v2.19.0-initial-commit.yaml
- vault_cli_fix.yml
- vault_docs_fix.yaml
- win-async-refactor.yml
- win-wdac-audit.yml
- windows-exec.yml
- winrm-kinit-pexpect.yml
release_date: '2025-04-14'
2.19.0b2:
changes:
bugfixes:
- Remove use of `required` parameter in `get_bin_path` which has been deprecated.
- ansible-doc - fix indentation for first line of descriptions of suboptions
and sub-return values (https://github.com/ansible/ansible/pull/84690).
- ansible-doc - fix line wrapping for first line of description of options and
return values (https://github.com/ansible/ansible/pull/84690).
minor_changes:
- comment filter - Improve the error message shown when an invalid ``style``
argument is provided.
release_summary: '| Release Date: 2025-04-24
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
codename: What Is and What Should Never Be
fragments:
- 2.19.0b2_summary.yaml
- 84690-ansible-doc-indent-wrapping.yml
- comment_fail.yml
- get_bin_path-remove-use-of-deprecated-param.yml
release_date: '2025-04-23'
2.19.0b3:
changes:
bugfixes:
- Ansible will now ensure predictable permissions on remote artifacts; until
  now it only ensured they were executable and relied on system masks for the rest.
- dnf5 - avoid generating excessive transaction entries in the dnf5 history
(https://github.com/ansible/ansible/issues/85046)
deprecated_features:
- Passing a ``warnings`` or ``deprecations`` key to ``exit_json`` or ``fail_json``
is deprecated. Use ``AnsibleModule.warn`` or ``AnsibleModule.deprecate`` instead.
- plugins - Accessing plugins with ``_``-prefixed filenames without the ``_``
prefix is deprecated.
minor_changes:
- ansible-config will now show internal, but not test, configuration entries.
  This allows for debugging while still denoting the configurations as internal
  use only (``_`` prefix).
- ansible-test - Improved ``pylint`` checks for Ansible-specific deprecation
functions.
- ansible-test - Use the ``-t`` option to set the stop timeout when stopping
a container. This avoids use of the ``--time`` option which was deprecated
in Docker v28.0.
- collection metadata - The collection loader now parses scalar values from
``meta/runtime.yml`` as strings. This avoids issues caused by unquoted values
such as versions or dates being parsed as types other than strings.
- deprecation warnings - Deprecation warning APIs automatically capture the
identity of the deprecating plugin. The ``collection_name`` argument is only
required to correctly attribute deprecations that occur in module_utils or
other non-plugin code.
- deprecation warnings - Improved deprecation messages to more clearly indicate
the affected content, including plugin name when available.
- deprecations - Collection name strings not of the form ``ns.coll`` passed
to deprecation API functions will result in an error.
- deprecations - Removed support for specifying deprecation dates as a ``datetime.date``,
which was included in an earlier 2.19 pre-release.
- deprecations - Some argument names to ``deprecate_value`` were changed for consistency
  with existing APIs. An earlier 2.19 pre-release included a ``removal_`` prefix
  on the ``date`` and ``version`` arguments.
- modules - The ``AnsibleModule.deprecate`` function no longer sends deprecation
messages to the target host's logging system.
release_summary: '| Release Date: 2025-05-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__
'
codename: What Is and What Should Never Be
fragments:
- 2.19.0b3_summary.yaml
- 85046-dnf5-history-entries.yml
- ansible-test-container-stop.yml
- config_priv.yml
- deprecator.yml
- ensure_remote_perms.yml
release_date: '2025-05-06'

@ -0,0 +1,3 @@
release_summary: |
| Release Date: 2025-04-14
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -0,0 +1,3 @@
release_summary: |
| Release Date: 2025-04-24
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -0,0 +1,3 @@
release_summary: |
| Release Date: 2025-05-06
| `Porting Guide <https://docs.ansible.com/ansible-core/2.19/porting_guides/porting_guide_core_2.19.html>`__

@ -0,0 +1,3 @@
bugfixes:
- "ansible-doc - fix indentation for first line of descriptions of suboptions and sub-return values (https://github.com/ansible/ansible/pull/84690)."
- "ansible-doc - fix line wrapping for first line of description of options and return values (https://github.com/ansible/ansible/pull/84690)."

@ -0,0 +1,2 @@
bugfixes:
- dnf5 - avoid generating excessive transaction entries in the dnf5 history (https://github.com/ansible/ansible/issues/85046)

@ -0,0 +1,3 @@
minor_changes:
- ansible-test - Use the ``-t`` option to set the stop timeout when stopping a container.
This avoids use of the ``--time`` option which was deprecated in Docker v28.0.

@ -0,0 +1,3 @@
---
minor_changes:
- comment filter - Improve the error message shown when an invalid ``style`` argument is provided.

@ -0,0 +1,2 @@
minor_changes:
- ansible-config will now show internal, but not test, configuration entries. This allows for debugging while still denoting the configurations as internal use only (``_`` prefix).

@ -0,0 +1,17 @@
minor_changes:
- modules - The ``AnsibleModule.deprecate`` function no longer sends deprecation messages to the target host's logging system.
- ansible-test - Improved ``pylint`` checks for Ansible-specific deprecation functions.
- deprecations - Removed support for specifying deprecation dates as a ``datetime.date``, which was included in an earlier 2.19 pre-release.
- deprecations - Some argument names to ``deprecate_value`` were changed for consistency with existing APIs.
An earlier 2.19 pre-release included a ``removal_`` prefix on the ``date`` and ``version`` arguments.
- deprecations - Collection name strings not of the form ``ns.coll`` passed to deprecation API functions will result in an error.
- collection metadata - The collection loader now parses scalar values from ``meta/runtime.yml`` as strings.
This avoids issues caused by unquoted values such as versions or dates being parsed as types other than strings.
- deprecation warnings - Deprecation warning APIs automatically capture the identity of the deprecating plugin.
The ``collection_name`` argument is only required to correctly attribute deprecations that occur in module_utils or other non-plugin code.
- deprecation warnings - Improved deprecation messages to more clearly indicate the affected content, including plugin name when available.
deprecated_features:
- plugins - Accessing plugins with ``_``-prefixed filenames without the ``_`` prefix is deprecated.
- Passing a ``warnings`` or ``deprecations`` key to ``exit_json`` or ``fail_json`` is deprecated.
Use ``AnsibleModule.warn`` or ``AnsibleModule.deprecate`` instead.

@ -0,0 +1,2 @@
bugfixes:
- Ansible will now ensure predictable permissions on remote artifacts; until now it only ensured they were executable and relied on system masks for the rest.

@ -0,0 +1,2 @@
bugfixes:
- "Remove use of `required` parameter in `get_bin_path` which has been deprecated."

@ -47,10 +47,6 @@ minor_changes:
- to_json / to_nice_json filters - The filters accept a ``profile`` argument, which defaults to ``tagless``.
- undef jinja function - The ``undef`` jinja function now raises an error if a non-string hint is given.
Attempting to use an undefined hint also results in an error, ensuring incorrect use of the function can be distinguished from the function's normal behavior.
- display - The ``collection_name`` arg to ``Display.deprecated`` no longer has any effect.
Information about the calling plugin is automatically captured by the display infrastructure, included in the displayed messages, and made available to callbacks.
- modules - The ``collection_name`` arg to Python module-side ``deprecate`` methods no longer has any effect.
Information about the calling module is automatically captured by the warning infrastructure and included in the module result.
breaking_changes:
- loops - Omit placeholders no longer leak between loop item templating and task templating.
@ -173,6 +169,9 @@ deprecated_features:
- file loading - Loading text files with ``DataLoader`` containing data that cannot be decoded under the expected encoding is deprecated.
In most cases the encoding must be UTF-8, although some plugins allow choosing a different encoding.
Previously, invalid data was silently wrapped in Unicode surrogate escape sequences, often resulting in later errors or other data corruption.
- callback plugins - The v1 callback API (callback methods not prefixed with `v2_`) is deprecated.
Use `v2_` prefixed methods instead.
- callback plugins - The `v2_on_any` callback method is deprecated. Use specific callback methods instead.
removed_features:
- modules - Modules returning non-UTF8 strings now result in an error.

@ -40,7 +40,6 @@ import shutil
from pathlib import Path
from ansible.module_utils.common.messages import PluginInfo
from ansible.release import __version__
import ansible.utils.vars as utils_vars
from ansible.parsing.dataloader import DataLoader
@ -172,15 +171,8 @@ def boilerplate_module(modfile, args, interpreters, check, destfile):
modname = os.path.basename(modfile)
modname = os.path.splitext(modname)[0]
plugin = PluginInfo(
requested_name=modname,
resolved_name=modname,
type='module',
)
built_module = module_common.modify_module(
module_name=modname,
plugin=plugin,
module_path=modfile,
module_args=complex_args,
templar=Templar(loader=loader),
@ -225,10 +217,11 @@ def ansiballz_setup(modfile, modname, interpreters):
# All the directories in an AnsiBallZ that modules can live
core_dirs = glob.glob(os.path.join(debug_dir, 'ansible/modules'))
non_core_dirs = glob.glob(os.path.join(debug_dir, 'ansible/legacy'))
collection_dirs = glob.glob(os.path.join(debug_dir, 'ansible_collections/*/*/plugins/modules'))
# There's only one module in an AnsiBallZ payload so look for the first module and then exit
for module_dir in core_dirs + collection_dirs:
for module_dir in core_dirs + collection_dirs + non_core_dirs:
for dirname, directories, filenames in os.walk(module_dir):
for filename in filenames:
if filename == modname + '.py':

@ -42,7 +42,6 @@ def _ansiballz_main(
module_fqn: str,
params: str,
profile: str,
plugin_info_dict: dict[str, object],
date_time: datetime.datetime,
coverage_config: str | None,
coverage_output: str | None,
@ -142,7 +141,6 @@ def _ansiballz_main(
run_module(
json_params=json_params,
profile=profile,
plugin_info_dict=plugin_info_dict,
module_fqn=module_fqn,
modlib_path=modlib_path,
coverage_config=coverage_config,
@ -230,13 +228,12 @@ def _ansiballz_main(
run_module(
json_params=json_params,
profile=profile,
plugin_info_dict=plugin_info_dict,
module_fqn=module_fqn,
modlib_path=modlib_path,
)
else:
print('WARNING: Unknown debug command. Doing nothing.')
print(f'FATAL: Unknown debug command {command!r}. Doing nothing.')
#
# See comments in the debug() method for information on debugging

@ -0,0 +1,47 @@
from __future__ import annotations as _annotations
import collections.abc as _c
import typing as _t
_T_co = _t.TypeVar('_T_co', covariant=True)
class SequenceProxy(_c.Sequence[_T_co]):
"""A read-only sequence proxy."""
# DTFIX-RELEASE: needs unit test coverage
__slots__ = ('__value',)
def __init__(self, value: _c.Sequence[_T_co]) -> None:
self.__value = value
@_t.overload
def __getitem__(self, index: int) -> _T_co: ...
@_t.overload
def __getitem__(self, index: slice) -> _c.Sequence[_T_co]: ...
def __getitem__(self, index: int | slice) -> _T_co | _c.Sequence[_T_co]:
if isinstance(index, slice):
return self.__class__(self.__value[index])
return self.__value[index]
def __len__(self) -> int:
return len(self.__value)
def __contains__(self, item: object) -> bool:
return item in self.__value
def __iter__(self) -> _t.Iterator[_T_co]:
yield from self.__value
def __reversed__(self) -> _c.Iterator[_T_co]:
return reversed(self.__value)
def index(self, *args) -> int:
return self.__value.index(*args)
def count(self, value: object) -> int:
return self.__value.count(value)

@ -16,8 +16,8 @@ class ErrorAction(enum.Enum):
"""Action to take when an error is encountered."""
IGNORE = enum.auto()
WARN = enum.auto()
FAIL = enum.auto()
WARNING = enum.auto()
ERROR = enum.auto()
@classmethod
def from_config(cls, setting: str, variables: dict[str, t.Any] | None = None) -> t.Self:
@ -75,9 +75,9 @@ class ErrorHandler:
yield
except args as ex:
match self.action:
case ErrorAction.WARN:
case ErrorAction.WARNING:
display.error_as_warning(msg=None, exception=ex)
case ErrorAction.FAIL:
case ErrorAction.ERROR:
raise
case _: # ErrorAction.IGNORE
pass

@ -4,6 +4,7 @@
from __future__ import annotations
import enum
import json
import typing as t
@ -19,7 +20,9 @@ from ansible.module_utils._internal._datatag import (
from ansible.module_utils._internal._json._profiles import _tagless
from ansible.parsing.vault import EncryptedString
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
from ansible._internal._templating import _transform
from ansible.module_utils import _internal
from ansible.module_utils._internal import _datatag
_T = t.TypeVar('_T')
_sentinel = object()
@ -52,6 +55,19 @@ class StateTrackingMixIn(HasCurrent):
return self._stack[1:] + [self._current]
class EncryptedStringBehavior(enum.Enum):
"""How `AnsibleVariableVisitor` will handle instances of `EncryptedString`."""
PRESERVE = enum.auto()
"""Preserves the unmodified `EncryptedString` instance."""
DECRYPT = enum.auto()
"""Replaces the value with its decrypted plaintext."""
REDACT = enum.auto()
"""Replaces the value with a placeholder string."""
FAIL = enum.auto()
"""Raises an `AnsibleVariableTypeError` error."""
class AnsibleVariableVisitor:
"""Utility visitor base class to recursively apply various behaviors and checks to variable object graphs."""
@ -63,7 +79,9 @@ class AnsibleVariableVisitor:
convert_mapping_to_dict: bool = False,
convert_sequence_to_list: bool = False,
convert_custom_scalars: bool = False,
allow_encrypted_string: bool = False,
convert_to_native_values: bool = False,
apply_transforms: bool = False,
encrypted_string_behavior: EncryptedStringBehavior = EncryptedStringBehavior.DECRYPT,
):
super().__init__() # supports StateTrackingMixIn
@ -72,7 +90,16 @@ class AnsibleVariableVisitor:
self.convert_mapping_to_dict = convert_mapping_to_dict
self.convert_sequence_to_list = convert_sequence_to_list
self.convert_custom_scalars = convert_custom_scalars
self.allow_encrypted_string = allow_encrypted_string
self.convert_to_native_values = convert_to_native_values
self.apply_transforms = apply_transforms
self.encrypted_string_behavior = encrypted_string_behavior
if apply_transforms:
from ansible._internal._templating import _engine
self._template_engine = _engine.TemplateEngine()
else:
self._template_engine = None
self._current: t.Any = None # supports StateTrackingMixIn
@ -113,9 +140,19 @@ class AnsibleVariableVisitor:
value_type = type(value)
if self.apply_transforms and value_type in _transform._type_transform_mapping:
value = self._template_engine.transform(value)
value_type = type(value)
# DTFIX-RELEASE: need to handle native copy for keys too
if self.convert_to_native_values and isinstance(value, _datatag.AnsibleTaggedObject):
value = value._native_copy()
value_type = type(value)
result: _T
# DTFIX-RELEASE: the visitor is ignoring dict/mapping keys except for debugging and schema-aware checking, it should be doing type checks on keys
# keep in mind the allowed types for keys are a more restrictive set than for values (str and tagged str only, not EncryptedString)
# DTFIX-RELEASE: some type lists being consulted (the ones from datatag) are probably too permissive, and perhaps should not be dynamic
if (result := self._early_visit(value, value_type)) is not _sentinel:
@ -127,8 +164,14 @@ class AnsibleVariableVisitor:
elif value_type in _ANSIBLE_ALLOWED_NON_SCALAR_COLLECTION_VAR_TYPES:
with self: # supports StateTrackingMixIn
result = AnsibleTagHelper.tag_copy(value, (self._visit(k, v) for k, v in enumerate(t.cast(t.Iterable, value))), value_type=value_type)
elif self.allow_encrypted_string and isinstance(value, EncryptedString):
return value # type: ignore[return-value] # DTFIX-RELEASE: this should probably only be allowed for values in dict, not keys (set, dict)
elif self.encrypted_string_behavior != EncryptedStringBehavior.FAIL and isinstance(value, EncryptedString):
match self.encrypted_string_behavior:
case EncryptedStringBehavior.REDACT:
result = "<redacted>" # type: ignore[assignment]
case EncryptedStringBehavior.PRESERVE:
result = value # type: ignore[assignment]
case EncryptedStringBehavior.DECRYPT:
result = str(value) # type: ignore[assignment]
elif self.convert_mapping_to_dict and _internal.is_intermediate_mapping(value):
with self: # supports StateTrackingMixIn
result = {k: self._visit(k, v) for k, v in value.items()} # type: ignore[assignment]

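The visitor above replaces the old allow_encrypted_string flag with a four-way EncryptedStringBehavior dispatch. Here is an illustrative, self-contained sketch of that dispatch using a stand-in Secret type instead of ansible's EncryptedString; it is not the real visitor implementation.

# Stand-in types to illustrate the PRESERVE/DECRYPT/REDACT/FAIL handling above.
from __future__ import annotations

import enum


class Secret:
    def __init__(self, ciphertext: str, plaintext: str) -> None:
        self.ciphertext = ciphertext
        self._plaintext = plaintext

    def __str__(self) -> str:  # decryption stand-in
        return self._plaintext


class EncryptedStringBehavior(enum.Enum):
    PRESERVE = enum.auto()
    DECRYPT = enum.auto()
    REDACT = enum.auto()
    FAIL = enum.auto()


def visit_value(value: object, behavior: EncryptedStringBehavior) -> object:
    if isinstance(value, Secret):
        match behavior:
            case EncryptedStringBehavior.REDACT:
                return "<redacted>"
            case EncryptedStringBehavior.PRESERVE:
                return value
            case EncryptedStringBehavior.DECRYPT:
                return str(value)
            case EncryptedStringBehavior.FAIL:
                raise TypeError("encrypted values are not allowed here")
    return value


secret = Secret(ciphertext="$ANSIBLE_VAULT;...", plaintext="hunter2")
print(visit_value(secret, EncryptedStringBehavior.REDACT))   # <redacted>
print(visit_value(secret, EncryptedStringBehavior.DECRYPT))  # hunter2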
@ -8,13 +8,12 @@ from __future__ import annotations as _annotations
import datetime as _datetime
import typing as _t
from ansible._internal import _json
from ansible._internal._datatag import _tags
from ansible.module_utils._internal import _datatag
from ansible.module_utils._internal._json import _profiles
from ansible.parsing import vault as _vault
from ... import _json
class _Untrusted:
"""
@ -48,7 +47,7 @@ class _LegacyVariableVisitor(_json.AnsibleVariableVisitor):
convert_mapping_to_dict=convert_mapping_to_dict,
convert_sequence_to_list=convert_sequence_to_list,
convert_custom_scalars=convert_custom_scalars,
allow_encrypted_string=True,
encrypted_string_behavior=_json.EncryptedStringBehavior.PRESERVE,
)
self.invert_trust = invert_trust

@ -12,7 +12,6 @@ from ansible.utils.display import Display
from ._access import NotifiableAccessContextBase
from ._utils import TemplateContext
display = Display()
@ -57,10 +56,10 @@ class DeprecatedAccessAuditContext(NotifiableAccessContextBase):
display._deprecated_with_plugin_info(
msg=msg,
help_text=item.deprecated.help_text,
version=item.deprecated.removal_version,
date=item.deprecated.removal_date,
version=item.deprecated.version,
date=item.deprecated.date,
obj=item.template,
plugin=item.deprecated.plugin,
deprecator=item.deprecated.deprecator,
)
return result

@ -566,7 +566,12 @@ class TemplateEngine:
)
if _TemplateConfig.allow_broken_conditionals:
_display.deprecated(msg=msg, obj=conditional, help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT, version='2.23')
_display.deprecated(
msg=msg,
obj=conditional,
help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT,
version='2.23',
)
return bool_result

@ -985,12 +985,12 @@ def _maybe_finalize_scalar(o: t.Any) -> t.Any:
match _TemplateConfig.unknown_type_conversion_handler.action:
# we don't want to show the object value, and it can't be Origin-tagged; send the current template value for best effort
case ErrorAction.WARN:
case ErrorAction.WARNING:
display.warning(
msg=f'Type {native_type_name(o)!r} is unsupported in variable storage, converting to {native_type_name(target_type)!r}.',
obj=TemplateContext.current(optional=True).template_value,
)
case ErrorAction.FAIL:
case ErrorAction.ERROR:
raise AnsibleVariableTypeError.from_value(obj=TemplateContext.current(optional=True).template_value)
return target_type(o)
@ -1006,12 +1006,12 @@ def _finalize_fallback_collection(
) -> t.Collection[t.Any]:
match _TemplateConfig.unknown_type_conversion_handler.action:
# we don't want to show the object value, and it can't be Origin-tagged; send the current template value for best effort
case ErrorAction.WARN:
case ErrorAction.WARNING:
display.warning(
msg=f'Type {native_type_name(o)!r} is unsupported in variable storage, converting to {native_type_name(target_type)!r}.',
obj=TemplateContext.current(optional=True).template_value,
)
case ErrorAction.FAIL:
case ErrorAction.ERROR:
raise AnsibleVariableTypeError.from_value(obj=TemplateContext.current(optional=True).template_value)
return _finalize_collection(o, mode, finalizer, target_type)

@ -8,12 +8,7 @@ import datetime
import functools
import typing as t
from ansible.errors import (
AnsibleTemplatePluginError,
)
from ansible.module_utils._internal._ambient_context import AmbientContextBase
from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
from ansible.module_utils.common.collections import is_sequence
from ansible.module_utils._internal._datatag import AnsibleTagHelper
from ansible._internal._datatag._tags import TrustedAsTemplate
@ -115,7 +110,7 @@ class JinjaPluginIntercept(c.MutableMapping):
return first_marker
try:
with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers), PluginExecContext(executing_plugin=instance):
with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers):
return instance.j2_function(*lazify_container_args(args), **lazify_container_kwargs(kwargs))
except MarkerError as ex:
return ex.source
@ -216,10 +211,7 @@ def _invoke_lookup(*, plugin_name: str, lookup_terms: list, lookup_kwargs: dict[
wantlist = lookup_kwargs.pop('wantlist', False)
errors = lookup_kwargs.pop('errors', 'strict')
with (
JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers),
PluginExecContext(executing_plugin=instance),
):
with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers):
try:
if _TemplateConfig.allow_embedded_templates:
# for backwards compat, only trust constant templates in lookup terms
@ -263,15 +255,13 @@ def _invoke_lookup(*, plugin_name: str, lookup_terms: list, lookup_kwargs: dict[
return ex.source
except Exception as ex:
# DTFIX-RELEASE: convert this to the new error/warn/ignore context manager
if isinstance(ex, AnsibleTemplatePluginError):
msg = f'Lookup failed but the error is being ignored: {ex}'
else:
msg = f'An unhandled exception occurred while running the lookup plugin {plugin_name!r}. Error was a {type(ex)}, original message: {ex}'
if errors == 'warn':
_display.warning(msg)
_display.error_as_warning(
msg=f'An error occurred while running the lookup plugin {plugin_name!r}.',
exception=ex,
)
elif errors == 'ignore':
_display.display(msg, log_only=True)
_display.display(f'An error of type {type(ex)} occurred while running the lookup plugin {plugin_name!r}: {ex}', log_only=True)
else:
raise AnsibleTemplatePluginRuntimeError('lookup', plugin_name) from ex

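The lookup invocation above keeps the long-standing three-way errors= option: 'warn' downgrades the failure to a warning, 'ignore' only logs it, and anything else re-raises. A minimal sketch of that control flow follows; logging stands in for ansible's Display and the lookup is a plain callable, so none of these names are the real _invoke_lookup API.

# Sketch of the lookup 'errors=' handling ('warn', 'ignore', 'strict') shown above.
from __future__ import annotations

import logging

log = logging.getLogger(__name__)


def invoke_lookup(lookup, terms: list, errors: str = 'strict'):
    try:
        return lookup(terms)
    except Exception as ex:
        if errors == 'warn':
            log.warning('An error occurred while running the lookup plugin: %s', ex)
        elif errors == 'ignore':
            log.debug('An error of type %s occurred while running the lookup plugin: %s', type(ex), ex)
        else:  # 'strict'
            raise RuntimeError('lookup failed') from ex
        return None


def flaky_lookup(terms):
    raise FileNotFoundError(terms[0])


print(invoke_lookup(flaky_lookup, ['/missing/file'], errors='ignore'))  # None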
@ -10,7 +10,6 @@ import os
import signal
import sys
# We overload the ``ansible`` adhoc command to provide the functionality for
# ``SSH_ASKPASS``. This code is here, and not in ``adhoc.py`` to bypass
# unnecessary code. The program provided to ``SSH_ASKPASS`` can only be invoked
@ -89,18 +88,25 @@ from ansible import _internal # do not remove or defer; ensures controller-spec
_internal.setup()
from ansible.errors import AnsibleError, ExitCode
try:
from ansible import constants as C
from ansible.utils.display import Display
display = Display()
except Exception as ex:
print(f'ERROR: {ex}\n\n{"".join(traceback.format_exception(ex))}', file=sys.stderr)
if isinstance(ex, AnsibleError):
ex_msg = ' '.join((ex.message, ex._help_text)).strip()
else:
ex_msg = str(ex)
print(f'ERROR: {ex_msg}\n\n{"".join(traceback.format_exception(ex))}', file=sys.stderr)
sys.exit(5)
from ansible import context
from ansible.utils import display as _display
from ansible.cli.arguments import option_helpers as opt_help
from ansible.errors import AnsibleError, ExitCode
from ansible.inventory.manager import InventoryManager
from ansible.module_utils.six import string_types
from ansible.module_utils.common.text.converters import to_bytes, to_text
@ -116,6 +122,7 @@ from ansible.utils.collection_loader import AnsibleCollectionConfig
from ansible.utils.collection_loader._collection_finder import _get_collection_name_from_path
from ansible.utils.path import unfrackpath
from ansible.vars.manager import VariableManager
from ansible.module_utils._internal import _deprecator
try:
import argcomplete
@ -139,7 +146,7 @@ def _launch_ssh_agent() -> None:
return
case 'auto':
try:
ssh_agent_bin = get_bin_path('ssh-agent', required=True)
ssh_agent_bin = get_bin_path('ssh-agent')
except ValueError as e:
raise AnsibleError('SSH_AGENT set to auto, but cannot find ssh-agent binary') from e
ssh_agent_dir = os.path.join(C.DEFAULT_LOCAL_TMP, 'ssh_agent')
@ -251,7 +258,7 @@ class CLI(ABC):
else:
display.v(u"No config file found; using defaults")
C.handle_config_noise(display)
_display._report_config_warnings(_deprecator.ANSIBLE_CORE_DEPRECATOR)
@staticmethod
def split_vault_id(vault_id):

@ -56,7 +56,10 @@ class DeprecatedArgument:
from ansible.utils.display import Display
Display().deprecated(f'The {option!r} argument is deprecated.', version=self.version)
Display().deprecated( # pylint: disable=ansible-invalid-deprecated-version
msg=f'The {option!r} argument is deprecated.',
version=self.version,
)
class ArgumentParser(argparse.ArgumentParser):

@ -1172,12 +1172,16 @@ class DocCLI(CLI, RoleMixin):
return 'version %s' % (version_added, )
@staticmethod
def warp_fill(text, limit, initial_indent='', subsequent_indent='', **kwargs):
def warp_fill(text, limit, initial_indent='', subsequent_indent='', initial_extra=0, **kwargs):
result = []
for paragraph in text.split('\n\n'):
result.append(textwrap.fill(paragraph, limit, initial_indent=initial_indent, subsequent_indent=subsequent_indent,
break_on_hyphens=False, break_long_words=False, drop_whitespace=True, **kwargs))
wrapped = textwrap.fill(paragraph, limit, initial_indent=initial_indent + ' ' * initial_extra, subsequent_indent=subsequent_indent,
break_on_hyphens=False, break_long_words=False, drop_whitespace=True, **kwargs)
if initial_extra and wrapped.startswith(' ' * initial_extra):
wrapped = wrapped[initial_extra:]
result.append(wrapped)
initial_indent = subsequent_indent
initial_extra = 0
return '\n'.join(result)
@staticmethod
@ -1209,20 +1213,23 @@ class DocCLI(CLI, RoleMixin):
text.append('')
# TODO: push this to top of for and sort by size, create indent on largest key?
inline_indent = base_indent + ' ' * max((len(opt_indent) - len(o)) - len(base_indent), 2)
sub_indent = inline_indent + ' ' * (len(o) + 3)
inline_indent = ' ' * max((len(opt_indent) - len(o)) - len(base_indent), 2)
extra_indent = base_indent + ' ' * (len(o) + 3)
sub_indent = inline_indent + extra_indent
if is_sequence(opt['description']):
for entry_idx, entry in enumerate(opt['description'], 1):
if not isinstance(entry, string_types):
raise AnsibleError("Expected string in description of %s at index %s, got %s" % (o, entry_idx, type(entry)))
if entry_idx == 1:
text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(entry), limit, initial_indent=inline_indent, subsequent_indent=sub_indent))
text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(entry), limit,
initial_indent=inline_indent, subsequent_indent=sub_indent, initial_extra=len(extra_indent)))
else:
text.append(DocCLI.warp_fill(DocCLI.tty_ify(entry), limit, initial_indent=sub_indent, subsequent_indent=sub_indent))
else:
if not isinstance(opt['description'], string_types):
raise AnsibleError("Expected string in description of %s, got %s" % (o, type(opt['description'])))
text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(opt['description']), limit, initial_indent=inline_indent, subsequent_indent=sub_indent))
text.append(key + DocCLI.warp_fill(DocCLI.tty_ify(opt['description']), limit,
initial_indent=inline_indent, subsequent_indent=sub_indent, initial_extra=len(extra_indent)))
del opt['description']
suboptions = []
@ -1328,7 +1335,6 @@ class DocCLI(CLI, RoleMixin):
'This was unintentionally allowed when plugin attributes were added, '
'but the feature does not map well to role argument specs.',
version='2.20',
collection_name='ansible.builtin',
)
text.append("")
text.append(_format("ATTRIBUTES:", 'bold'))

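The warp_fill change above reserves space for a prefix on the first wrapped line (initial_extra), then strips that padding so the caller can prepend the real key without double-indenting. A small sketch of the same trick with textwrap.fill follows; the helper name and sample values are made up for illustration.

# Pad the first line's indent by the width of a prefix that will be prepended later,
# then strip the padding so the rendered line does not double-indent.
import textwrap


def wrap_with_prefix(prefix: str, text: str, limit: int, subsequent_indent: str) -> str:
    initial_extra = len(prefix)
    wrapped = textwrap.fill(
        text,
        limit,
        initial_indent=' ' * initial_extra,  # reserve room for the prefix
        subsequent_indent=subsequent_indent,
        break_on_hyphens=False,
        break_long_words=False,
    )
    if wrapped.startswith(' ' * initial_extra):
        wrapped = wrapped[initial_extra:]  # the real prefix replaces the padding
    return prefix + wrapped


print(wrap_with_prefix('- name:  ', 'A fairly long option description that needs wrapping.', 40, ' ' * 9))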
@ -9,6 +9,18 @@ _ANSIBLE_CONNECTION_PATH:
- For internal use only.
type: path
version_added: "2.18"
_CALLBACK_DISPATCH_ERROR_BEHAVIOR:
name: Callback dispatch error behavior
default: warning
description:
- Action to take when a callback dispatch results in an error.
type: choices
choices: &basic_error
error: issue a 'fatal' error and stop the play
warning: issue a warning but continue
ignore: just continue silently
env: [ { name: _ANSIBLE_CALLBACK_DISPATCH_ERROR_BEHAVIOR } ]
version_added: '2.19'
ALLOW_BROKEN_CONDITIONALS:
# This config option will be deprecated once it no longer has any effect (2.23).
name: Allow broken conditionals
@ -224,18 +236,6 @@ CACHE_PLUGIN_TIMEOUT:
- {key: fact_caching_timeout, section: defaults}
type: integer
yaml: {key: facts.cache.timeout}
_CALLBACK_DISPATCH_ERROR_BEHAVIOR:
name: Callback dispatch error behavior
default: warn
description:
- Action to take when a callback dispatch results in an error.
type: choices
choices: &choices_ignore_warn_fail
- ignore
- warn
- fail
env: [ { name: _ANSIBLE_CALLBACK_DISPATCH_ERROR_BEHAVIOR } ]
version_added: '2.19'
COLLECTIONS_SCAN_SYS_PATH:
name: Scan PYTHONPATH for installed collections
description: A boolean to enable or disable scanning the sys.path for installed collections.
@ -268,10 +268,7 @@ COLLECTIONS_ON_ANSIBLE_VERSION_MISMATCH:
- When a collection is loaded that does not support the running Ansible version (with the collection metadata key `requires_ansible`).
env: [{name: ANSIBLE_COLLECTIONS_ON_ANSIBLE_VERSION_MISMATCH}]
ini: [{key: collections_on_ansible_version_mismatch, section: defaults}]
choices: &basic_error
error: issue a 'fatal' error and stop the play
warning: issue a warning but continue
ignore: just continue silently
choices: *basic_error
default: warning
COLOR_CHANGED:
name: Color for 'changed' task status
@ -2058,13 +2055,13 @@ TASK_TIMEOUT:
version_added: '2.10'
_TEMPLAR_UNKNOWN_TYPE_CONVERSION:
name: Templar unknown type conversion behavior
default: warn
default: warning
description:
- Action to take when an unknown type is converted for variable storage during template finalization.
- This setting has no effect on the inability to store unsupported variable types as the result of templating.
- Experimental diagnostic feature, subject to change.
type: choices
choices: *choices_ignore_warn_fail
choices: *basic_error
env: [{name: _ANSIBLE_TEMPLAR_UNKNOWN_TYPE_CONVERSION}]
version_added: '2.19'
_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED:
@ -2074,7 +2071,7 @@ _TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED:
- Action to take when an unknown type is encountered inside a template pipeline.
- Experimental diagnostic feature, subject to change.
type: choices
choices: *choices_ignore_warn_fail
choices: *basic_error
env: [{name: _ANSIBLE_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED}]
version_added: '2.19'
_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR:
@ -2086,7 +2083,7 @@ _TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR:
- This setting has no effect on expressions.
- Experimental diagnostic feature, subject to change.
type: choices
choices: *choices_ignore_warn_fail
choices: *basic_error
env: [{name: _ANSIBLE_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR}]
version_added: '2.19'
WORKER_SHUTDOWN_POLL_COUNT:

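The configuration entries above share the error/warning/ignore choice set via a YAML anchor and can be overridden through an environment variable. The sketch below is a hypothetical helper, not ansible's ConfigManager, showing how such an env-backed setting could be resolved and validated against those choices.

# Hypothetical helper illustrating resolution of an env-backed error-behavior setting.
import os

_BASIC_ERROR_CHOICES = frozenset({'error', 'warning', 'ignore'})


def resolve_error_behavior(env_var: str, default: str = 'warning') -> str:
    value = os.environ.get(env_var, default)
    if value not in _BASIC_ERROR_CHOICES:
        raise ValueError(f'{env_var} must be one of {sorted(_BASIC_ERROR_CHOICES)}, got {value!r}')
    return value


print(resolve_error_behavior('_ANSIBLE_CALLBACK_DISPATCH_ERROR_BEHAVIOR'))  # 'warning' unless overridden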
@ -480,9 +480,9 @@ class ConfigManager(object):
else:
ret = self._plugins.get(plugin_type, {}).get(name, {})
if ignore_private:
if ignore_private: # ignore 'test' config entries; they should not change runtime behaviors
for cdef in list(ret.keys()):
if cdef.startswith('_'):
if cdef.startswith('_Z_'):
del ret[cdef]
return ret

@ -10,9 +10,7 @@ from string import ascii_letters, digits
from ansible.config.manager import ConfigManager
from ansible.module_utils.common.text.converters import to_text
from ansible.module_utils.common.collections import Sequence
from ansible.module_utils.parsing.convert_bool import BOOLEANS_TRUE
from ansible.release import __version__
from ansible.utils.fqcn import add_internal_fqcns
# initialize config manager/config data to read/store global settings
@ -20,68 +18,11 @@ from ansible.utils.fqcn import add_internal_fqcns
config = ConfigManager()
def _warning(msg):
""" display is not guaranteed here, nor it being the full class, but try anyways, fallback to sys.stderr.write """
try:
from ansible.utils.display import Display
Display().warning(msg)
except Exception:
import sys
sys.stderr.write(' [WARNING] %s\n' % (msg))
def _deprecated(msg, version):
""" display is not guaranteed here, nor it being the full class, but try anyways, fallback to sys.stderr.write """
try:
from ansible.utils.display import Display
Display().deprecated(msg, version=version)
except Exception:
import sys
sys.stderr.write(' [DEPRECATED] %s, to be removed in %s\n' % (msg, version))
def handle_config_noise(display=None):
if display is not None:
w = display.warning
d = display.deprecated
else:
w = _warning
d = _deprecated
while config.WARNINGS:
warn = config.WARNINGS.pop()
w(warn)
while config.DEPRECATED:
# tuple with name and options
dep = config.DEPRECATED.pop(0)
msg = config.get_deprecated_msg_from_config(dep[1])
# use tabs only for ansible-doc?
msg = msg.replace("\t", "")
d(f"{dep[0]} option. {msg}", version=dep[1]['version'])
def set_constant(name, value, export=vars()):
""" sets constants and returns resolved options dict """
export[name] = value
class _DeprecatedSequenceConstant(Sequence):
def __init__(self, value, msg, version):
self._value = value
self._msg = msg
self._version = version
def __len__(self):
_deprecated(self._msg, self._version)
return len(self._value)
def __getitem__(self, y):
_deprecated(self._msg, self._version)
return self._value[y]
# CONSTANTS ### yes, actual ones
# The following are hard-coded action names
@ -245,6 +186,3 @@ MAGIC_VARIABLE_MAPPING = dict(
# POPULATE SETTINGS FROM CONFIG ###
for setting in config.get_configuration_definitions():
set_constant(setting, config.get_config_value(setting, variables=vars()))
# emit any warnings or deprecations
handle_config_noise()

@ -18,6 +18,9 @@ from ..module_utils.datatag import native_type_name
from ansible._internal._datatag import _tags
from .._internal._errors import _utils
if t.TYPE_CHECKING:
from ansible.plugins import loader as _t_loader
class ExitCode(enum.IntEnum):
SUCCESS = 0 # used by TQM, must be bit-flag safe
@ -374,8 +377,9 @@ class _AnsibleActionDone(AnsibleAction):
class AnsiblePluginError(AnsibleError):
"""Base class for Ansible plugin-related errors that do not need AnsibleError contextual data."""
def __init__(self, message=None, plugin_load_context=None):
super(AnsiblePluginError, self).__init__(message)
def __init__(self, message: str | None = None, plugin_load_context: _t_loader.PluginLoadContext | None = None, help_text: str | None = None) -> None:
super(AnsiblePluginError, self).__init__(message, help_text=help_text)
self.plugin_load_context = plugin_load_context

@ -39,7 +39,6 @@ from io import BytesIO
from ansible._internal import _locking
from ansible._internal._datatag import _utils
from ansible.module_utils._internal import _dataclass_validation
from ansible.module_utils.common.messages import PluginInfo
from ansible.module_utils.common.yaml import yaml_load
from ansible._internal._datatag._tags import Origin
from ansible.module_utils.common.json import Direction, get_module_encoder
@ -56,6 +55,7 @@ from ansible.template import Templar
from ansible.utils.collection_loader._collection_finder import _get_collection_metadata, _nested_dict_get
from ansible.module_utils._internal import _json, _ansiballz
from ansible.module_utils import basic as _basic
from ansible.module_utils.common import messages as _messages
if t.TYPE_CHECKING:
from ansible import template as _template
@ -434,7 +434,13 @@ class ModuleUtilLocatorBase:
else:
msg += '.'
display.deprecated(msg, removal_version, removed, removal_date, self._collection_name)
display.deprecated( # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
msg=msg,
version=removal_version,
removed=removed,
date=removal_date,
deprecator=_messages.PluginInfo._from_collection_name(self._collection_name),
)
if 'redirect' in routing_entry:
self.redirected = True
source_pkg = '.'.join(name_parts)
@ -944,7 +950,6 @@ class _CachedModule:
def _find_module_utils(
*,
module_name: str,
plugin: PluginInfo,
b_module_data: bytes,
module_path: str,
module_args: dict[object, object],
@ -1020,7 +1025,9 @@ def _find_module_utils(
# People should start writing collections instead of modules in roles so we
# may never fix this
display.debug('ANSIBALLZ: Could not determine module FQN')
remote_module_fqn = 'ansible.modules.%s' % module_name
# FIXME: add integration test to validate that builtins and legacy modules with the same name are tracked separately by the caching mechanism
# FIXME: surrogate FQN should be unique per source path- role-packaged modules with name collisions can still be aliased
remote_module_fqn = 'ansible.legacy.%s' % module_name
if module_substyle == 'python':
date_time = datetime.datetime.now(datetime.timezone.utc)
@ -1126,7 +1133,6 @@ def _find_module_utils(
module_fqn=remote_module_fqn,
params=encoded_params,
profile=module_metadata.serialization_profile,
plugin_info_dict=dataclasses.asdict(plugin),
date_time=date_time,
coverage_config=coverage_config,
coverage_output=coverage_output,
@ -1236,7 +1242,6 @@ def _extract_interpreter(b_module_data):
def modify_module(
*,
module_name: str,
plugin: PluginInfo,
module_path,
module_args,
templar,
@ -1277,7 +1282,6 @@ def modify_module(
module_bits = _find_module_utils(
module_name=module_name,
plugin=plugin,
b_module_data=b_module_data,
module_path=module_path,
module_args=module_args,

@ -32,6 +32,7 @@ from ansible._internal import _task
from ansible.errors import AnsibleConnectionFailure, AnsibleError
from ansible.executor.task_executor import TaskExecutor
from ansible.executor.task_queue_manager import FinalQueue, STDIN_FILENO, STDOUT_FILENO, STDERR_FILENO
from ansible.executor.task_result import _RawTaskResult
from ansible.inventory.host import Host
from ansible.module_utils.common.collections import is_sequence
from ansible.module_utils.common.text.converters import to_text
@ -226,7 +227,7 @@ class WorkerProcess(multiprocessing_context.Process): # type: ignore[name-defin
init_plugin_loader(cli_collections_path)
try:
# execute the task and build a TaskResult from the result
# execute the task and build a _RawTaskResult from the result
display.debug("running TaskExecutor() for %s/%s" % (self._host, self._task))
executor_result = TaskExecutor(
self._host,
@ -256,48 +257,52 @@ class WorkerProcess(multiprocessing_context.Process): # type: ignore[name-defin
# put the result on the result queue
display.debug("sending task result for task %s" % self._task._uuid)
try:
self._final_q.send_task_result(
self._host.name,
self._task._uuid,
executor_result,
self._final_q.send_task_result(_RawTaskResult(
host=self._host,
task=self._task,
return_data=executor_result,
task_fields=self._task.dump_attrs(),
)
))
except Exception as ex:
try:
raise AnsibleError("Task result omitted due to queue send failure.") from ex
except Exception as ex_wrapper:
self._final_q.send_task_result(
self._host.name,
self._task._uuid,
ActionBase.result_dict_from_exception(ex_wrapper), # Overriding the task result, to represent the failure
{}, # The failure pickling may have been caused by the task attrs, omit for safety
)
self._final_q.send_task_result(_RawTaskResult(
host=self._host,
task=self._task,
return_data=ActionBase.result_dict_from_exception(ex_wrapper), # Overriding the task result, to represent the failure
task_fields={}, # The failure pickling may have been caused by the task attrs, omit for safety
))
display.debug("done sending task result for task %s" % self._task._uuid)
except AnsibleConnectionFailure:
except AnsibleConnectionFailure as ex:
return_data = ActionBase.result_dict_from_exception(ex)
return_data.pop('failed')
return_data.update(unreachable=True)
self._host.vars = dict()
self._host.groups = []
self._final_q.send_task_result(
self._host.name,
self._task._uuid,
dict(unreachable=True),
self._final_q.send_task_result(_RawTaskResult(
host=self._host,
task=self._task,
return_data=return_data,
task_fields=self._task.dump_attrs(),
)
))
except Exception as e:
if not isinstance(e, (IOError, EOFError, KeyboardInterrupt, SystemExit)) or isinstance(e, TemplateNotFound):
except Exception as ex:
if not isinstance(ex, (IOError, EOFError, KeyboardInterrupt, SystemExit)) or isinstance(ex, TemplateNotFound):
try:
self._host.vars = dict()
self._host.groups = []
self._final_q.send_task_result(
self._host.name,
self._task._uuid,
dict(failed=True, exception=to_text(traceback.format_exc()), stdout=''),
self._final_q.send_task_result(_RawTaskResult(
host=self._host,
task=self._task,
return_data=ActionBase.result_dict_from_exception(ex),
task_fields=self._task.dump_attrs(),
)
))
except Exception:
display.debug(u"WORKER EXCEPTION: %s" % to_text(e))
display.debug(u"WORKER EXCEPTION: %s" % to_text(ex))
display.debug(u"WORKER TRACEBACK: %s" % to_text(traceback.format_exc()))
finally:
self._clean_up()

@ -20,10 +20,9 @@ from ansible.errors import (
AnsibleError, AnsibleParserError, AnsibleUndefinedVariable, AnsibleConnectionFailure, AnsibleActionFail, AnsibleActionSkip, AnsibleTaskError,
AnsibleValueOmittedError,
)
from ansible.executor.task_result import TaskResult
from ansible.executor.task_result import _RawTaskResult
from ansible._internal._datatag import _utils
from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
from ansible.module_utils.common.messages import Detail, WarningSummary, DeprecationSummary
from ansible.module_utils.common.messages import Detail, WarningSummary, DeprecationSummary, PluginInfo
from ansible.module_utils.datatag import native_type_name
from ansible._internal._datatag._tags import TrustedAsTemplate
from ansible.module_utils.parsing.convert_bool import boolean
@ -44,6 +43,9 @@ from ansible.vars.clean import namespace_facts, clean_facts
from ansible.vars.manager import _deprecate_top_level_fact
from ansible._internal._errors import _captured
if t.TYPE_CHECKING:
from ansible.executor.task_queue_manager import FinalQueue
display = Display()
@ -79,7 +81,7 @@ class TaskExecutor:
class.
"""
def __init__(self, host, task: Task, job_vars, play_context, loader, shared_loader_obj, final_q, variable_manager):
def __init__(self, host, task: Task, job_vars, play_context, loader, shared_loader_obj, final_q: FinalQueue, variable_manager):
self._host = host
self._task = task
self._job_vars = job_vars
@ -361,10 +363,10 @@ class TaskExecutor:
if self._connection and not isinstance(self._connection, string_types):
task_fields['connection'] = getattr(self._connection, 'ansible_name')
tr = TaskResult(
self._host.name,
self._task._uuid,
res,
tr = _RawTaskResult(
host=self._host,
task=self._task,
return_data=res,
task_fields=task_fields,
)
@ -637,7 +639,7 @@ class TaskExecutor:
if self._task.timeout:
old_sig = signal.signal(signal.SIGALRM, task_timeout)
signal.alarm(self._task.timeout)
with PluginExecContext(self._handler):
result = self._handler.run(task_vars=vars_copy)
# DTFIX-RELEASE: nuke this, it hides a lot of error detail- remove the active exception propagation hack from AnsibleActionFail at the same time
@ -666,17 +668,23 @@ class TaskExecutor:
if result.get('failed'):
self._final_q.send_callback(
'v2_runner_on_async_failed',
TaskResult(self._host.name,
self._task._uuid,
result,
task_fields=self._task.dump_attrs()))
_RawTaskResult(
host=self._host,
task=self._task,
return_data=result,
task_fields=self._task.dump_attrs(),
),
)
else:
self._final_q.send_callback(
'v2_runner_on_async_ok',
TaskResult(self._host.name,
self._task._uuid,
result,
task_fields=self._task.dump_attrs()))
_RawTaskResult(
host=self._host,
task=self._task,
return_data=result,
task_fields=self._task.dump_attrs(),
),
)
if 'ansible_facts' in result and self._task.action not in C._ACTION_DEBUG:
if self._task.action in C._ACTION_WITH_CLEAN_FACTS:
@ -756,12 +764,12 @@ class TaskExecutor:
display.debug('Retrying task, attempt %d of %d' % (attempt, retries))
self._final_q.send_callback(
'v2_runner_retry',
TaskResult(
self._host.name,
self._task._uuid,
result,
_RawTaskResult(
host=self._host,
task=self._task,
return_data=result,
task_fields=self._task.dump_attrs()
)
),
)
time.sleep(delay)
self._handler = self._get_action_handler(templar=templar)
@ -835,13 +843,12 @@ class TaskExecutor:
if not isinstance(deprecation, DeprecationSummary):
# translate non-DeprecationMessageDetail message dicts
try:
if deprecation.pop('collection_name', ...) is not ...:
if (collection_name := deprecation.pop('collection_name', ...)) is not ...:
# deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
# CAUTION: This deprecation cannot be enabled until the replacement (deprecator) has been documented, and the schema finalized.
# self.deprecated('The `collection_name` key in the `deprecations` dictionary is deprecated.', version='2.27')
pass
deprecation.update(deprecator=PluginInfo._from_collection_name(collection_name))
# DTFIX-RELEASE: when plugin isn't set, do it at the boundary where we receive the module/action results
# that may even allow us to never set it in modules/actions directly and to populate it at the boundary
deprecation = DeprecationSummary(
details=(
Detail(msg=deprecation.pop('msg')),
@ -926,10 +933,10 @@ class TaskExecutor:
time_left -= self._task.poll
self._final_q.send_callback(
'v2_runner_on_async_poll',
TaskResult(
self._host.name,
async_task._uuid,
async_result,
_RawTaskResult(
host=self._host,
task=async_task,
return_data=async_result,
task_fields=async_task.dump_attrs(),
),
)

@ -17,6 +17,7 @@
from __future__ import annotations
import dataclasses
import os
import sys
import tempfile
@ -31,7 +32,7 @@ from ansible.errors import AnsibleError, ExitCode, AnsibleCallbackError
from ansible._internal._errors._handler import ErrorHandler
from ansible.executor.play_iterator import PlayIterator
from ansible.executor.stats import AggregateStats
from ansible.executor.task_result import TaskResult
from ansible.executor.task_result import _RawTaskResult, _WireTaskResult
from ansible.inventory.data import InventoryData
from ansible.module_utils.six import string_types
from ansible.module_utils.common.text.converters import to_native
@ -47,7 +48,8 @@ from ansible.utils.display import Display
from ansible.utils.lock import lock_decorator
from ansible.utils.multiprocessing import context as multiprocessing_context
from dataclasses import dataclass
if t.TYPE_CHECKING:
from ansible.executor.process.worker import WorkerProcess
__all__ = ['TaskQueueManager']
@ -57,12 +59,13 @@ STDERR_FILENO = 2
display = Display()
_T = t.TypeVar('_T')
@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
class CallbackSend:
def __init__(self, method_name, *args, **kwargs):
self.method_name = method_name
self.args = args
self.kwargs = kwargs
method_name: str
wire_task_result: _WireTaskResult
class DisplaySend:
@ -72,7 +75,7 @@ class DisplaySend:
self.kwargs = kwargs
@dataclass
@dataclasses.dataclass
class PromptSend:
worker_id: int
prompt: str
@ -87,19 +90,11 @@ class FinalQueue(multiprocessing.queues.SimpleQueue):
kwargs['ctx'] = multiprocessing_context
super().__init__(*args, **kwargs)
def send_callback(self, method_name, *args, **kwargs):
self.put(
CallbackSend(method_name, *args, **kwargs),
)
def send_callback(self, method_name: str, task_result: _RawTaskResult) -> None:
self.put(CallbackSend(method_name=method_name, wire_task_result=task_result.as_wire_task_result()))
def send_task_result(self, *args, **kwargs):
if isinstance(args[0], TaskResult):
tr = args[0]
else:
tr = TaskResult(*args, **kwargs)
self.put(
tr,
)
def send_task_result(self, task_result: _RawTaskResult) -> None:
self.put(task_result.as_wire_task_result())
def send_display(self, method, *args, **kwargs):
self.put(
@ -194,11 +189,8 @@ class TaskQueueManager:
# plugins for inter-process locking.
self._connection_lockfile = tempfile.TemporaryFile()
def _initialize_processes(self, num):
self._workers = []
for i in range(num):
self._workers.append(None)
def _initialize_processes(self, num: int) -> None:
self._workers: list[WorkerProcess | None] = [None] * num
def load_callbacks(self):
"""
@ -438,54 +430,72 @@ class TaskQueueManager:
defunct = True
return defunct
@staticmethod
def _first_arg_of_type(value_type: t.Type[_T], args: t.Sequence) -> _T | None:
return next((arg for arg in args if isinstance(arg, value_type)), None)
@lock_decorator(attr='_callback_lock')
def send_callback(self, method_name, *args, **kwargs):
# We always send events to stdout callback first, rest should follow config order
for callback_plugin in [self._stdout_callback] + self._callback_plugins:
# a plugin that set self.disabled to True will not be called
# see osx_say.py example for such a plugin
if getattr(callback_plugin, 'disabled', False):
if callback_plugin.disabled:
continue
# a plugin can opt in to implicit tasks (such as meta). It does this
# by declaring self.wants_implicit_tasks = True.
wants_implicit_tasks = getattr(callback_plugin, 'wants_implicit_tasks', False)
if not callback_plugin.wants_implicit_tasks and (task_arg := self._first_arg_of_type(Task, args)) and task_arg.implicit:
continue
# try to find v2 method, fallback to v1 method, ignore callback if no method found
methods = []
for possible in [method_name, 'v2_on_any']:
gotit = getattr(callback_plugin, possible, None)
if gotit is None:
gotit = getattr(callback_plugin, possible.removeprefix('v2_'), None)
if gotit is not None:
methods.append(gotit)
method = getattr(callback_plugin, possible, None)
if method is None:
method = getattr(callback_plugin, possible.removeprefix('v2_'), None)
if method is not None:
display.deprecated(
msg='The v1 callback API is deprecated.',
version='2.23',
help_text='Use `v2_` prefixed callback methods instead.',
)
if method is not None and not getattr(method, '_base_impl', False): # don't bother dispatching to the base impls
if possible == 'v2_on_any':
display.deprecated(
msg='The `v2_on_any` callback method is deprecated.',
version='2.23',
help_text='Use event-specific callback methods instead.',
)
methods.append(method)
for method in methods:
# send clean copies
new_args = []
# If we end up being given an implicit task, we'll set this flag in
# the loop below. If the plugin doesn't care about those, then we
# check and continue to the next iteration of the outer loop.
is_implicit_task = False
for arg in args:
# FIXME: add play/task cleaners
if isinstance(arg, TaskResult):
new_args.append(arg.clean_copy())
# elif isinstance(arg, Play):
# elif isinstance(arg, Task):
if isinstance(arg, _RawTaskResult):
copied_tr = arg.as_callback_task_result()
new_args.append(copied_tr)
# this state hack requires that no callback ever accepts > 1 TaskResult object
callback_plugin._current_task_result = copied_tr
else:
new_args.append(arg)
if isinstance(arg, Task) and arg.implicit:
is_implicit_task = True
if is_implicit_task and not wants_implicit_tasks:
continue
for method in methods:
with self._callback_dispatch_error_handler.handle(AnsibleCallbackError):
try:
method(*new_args, **kwargs)
except AssertionError:
# Using an `assert` in integration tests is useful.
# Production code should never use `assert` or raise `AssertionError`.
raise
except Exception as ex:
raise AnsibleCallbackError(f"Callback dispatch {method_name!r} failed for plugin {callback_plugin._load_name!r}.") from ex
callback_plugin._current_task_result = None

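The send_callback rewrite above resolves callback methods by preferring the v2_-prefixed name, falling back to the legacy un-prefixed name (with a deprecation warning), and containing per-plugin dispatch failures. The following sketch shows that resolution order with stand-in names; it is not the TaskQueueManager or callback plugin API.

# Prefer the v2_ method name, fall back to the legacy name, contain dispatch failures.
import logging

log = logging.getLogger(__name__)


class LegacyCallback:
    def runner_on_ok(self, result):  # legacy (pre-v2) method name
        print('ok:', result)


def dispatch(plugin, method_name: str, *args) -> None:
    method = getattr(plugin, method_name, None)
    if method is None:
        method = getattr(plugin, method_name.removeprefix('v2_'), None)
        if method is not None:
            log.warning('plugin %s uses the deprecated v1 callback API', type(plugin).__name__)
    if method is None:
        return  # plugin does not handle this event
    try:
        method(*args)
    except Exception:
        log.exception('callback dispatch %r failed', method_name)


dispatch(LegacyCallback(), 'v2_runner_on_ok', {'changed': False})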
@ -4,15 +4,24 @@
from __future__ import annotations
import collections.abc as _c
import dataclasses
import functools
import typing as t
from ansible import constants as C
from ansible.parsing.dataloader import DataLoader
from ansible import constants
from ansible.utils import vars as _vars
from ansible.vars.clean import module_response_deepcopy, strip_internal_keys
from ansible.module_utils.common import messages as _messages
from ansible._internal import _collection_proxy
if t.TYPE_CHECKING:
from ansible.inventory.host import Host
from ansible.playbook.task import Task
_IGNORE = ('failed', 'skipped')
_PRESERVE = ('attempts', 'changed', 'retries', '_ansible_no_log')
_SUB_PRESERVE = {'_ansible_delegated_vars': ('ansible_host', 'ansible_port', 'ansible_user', 'ansible_connection')}
_PRESERVE = {'attempts', 'changed', 'retries', '_ansible_no_log'}
_SUB_PRESERVE = {'_ansible_delegated_vars': {'ansible_host', 'ansible_port', 'ansible_user', 'ansible_connection'}}
# stuff callbacks need
CLEAN_EXCEPTIONS = (
@ -23,61 +32,120 @@ CLEAN_EXCEPTIONS = (
)
class TaskResult:
@t.final
@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
class _WireTaskResult:
"""A thin version of `_RawTaskResult` which can be sent over the worker queue."""
host_name: str
task_uuid: str
return_data: _c.MutableMapping[str, object]
task_fields: _c.Mapping[str, object]
class _BaseTaskResult:
"""
This class is responsible for interpreting the resulting data
from an executed task, and provides helper methods for determining
the result of a given task.
"""
def __init__(self, host, task, return_data, task_fields=None):
self._host = host
self._task = task
def __init__(self, host: Host, task: Task, return_data: _c.MutableMapping[str, t.Any], task_fields: _c.Mapping[str, t.Any]) -> None:
self.__host = host
self.__task = task
self._return_data = return_data # FIXME: this should be immutable, but strategy result processing mutates it in some corner cases
self.__task_fields = task_fields
if isinstance(return_data, dict):
self._result = return_data.copy()
else:
self._result = DataLoader().load(return_data)
@property
def host(self) -> Host:
"""The host associated with this result."""
return self.__host
if task_fields is None:
self._task_fields = dict()
else:
self._task_fields = task_fields
@property
def _host(self) -> Host:
"""Use the `host` property when supporting only ansible-core 2.19 or later."""
# deprecated: description='Deprecate `_host` in favor of `host`' core_version='2.23'
return self.__host
@property
def task(self) -> Task:
"""The task associated with this result."""
return self.__task
@property
def _task(self) -> Task:
"""Use the `task` property when supporting only ansible-core 2.19 or later."""
# deprecated: description='Deprecate `_task` in favor of `task`' core_version='2.23'
return self.__task
@property
def task_fields(self) -> _c.Mapping[str, t.Any]:
"""The task fields associated with this result."""
return self.__task_fields
@property
def _task_fields(self) -> _c.Mapping[str, t.Any]:
"""Use the `task_fields` property when supporting only ansible-core 2.19 or later."""
# deprecated: description='Deprecate `_task_fields` in favor of `task_fields`' core_version='2.23'
return self.__task_fields
@property
def exception(self) -> _messages.ErrorSummary | None:
"""The error from this task result, if any."""
return self._return_data.get('exception')
@property
def warnings(self) -> _c.Sequence[_messages.WarningSummary]:
"""The warnings for this task, if any."""
return _collection_proxy.SequenceProxy(self._return_data.get('warnings') or [])
@property
def deprecations(self) -> _c.Sequence[_messages.DeprecationSummary]:
"""The deprecation warnings for this task, if any."""
return _collection_proxy.SequenceProxy(self._return_data.get('deprecations') or [])
@property
def task_name(self):
return self._task_fields.get('name', None) or self._task.get_name()
@property
def _loop_results(self) -> list[_c.MutableMapping[str, t.Any]]:
"""Return a list of loop results. If no loop results are present, an empty list is returned."""
results = self._return_data.get('results')
def is_changed(self):
if not isinstance(results, list):
return []
return results
@property
def task_name(self) -> str:
return str(self.task_fields.get('name', '')) or self.task.get_name()
def is_changed(self) -> bool:
return self._check_key('changed')
def is_skipped(self):
# loop results
if 'results' in self._result:
results = self._result['results']
def is_skipped(self) -> bool:
if self._loop_results:
# Loop tasks are only considered skipped if all items were skipped.
# some squashed results (eg, dnf) are not dicts and can't be skipped individually
if results and all(isinstance(res, dict) and res.get('skipped', False) for res in results):
if all(isinstance(loop_res, dict) and loop_res.get('skipped', False) for loop_res in self._loop_results):
return True
# regular tasks and squashed non-dict results
return self._result.get('skipped', False)
return bool(self._return_data.get('skipped', False))
def is_failed(self):
if 'failed_when_result' in self._result or \
'results' in self._result and True in [True for x in self._result['results'] if 'failed_when_result' in x]:
def is_failed(self) -> bool:
if 'failed_when_result' in self._return_data or any(isinstance(loop_res, dict) and 'failed_when_result' in loop_res for loop_res in self._loop_results):
return self._check_key('failed_when_result')
else:
return self._check_key('failed')
def is_unreachable(self):
def is_unreachable(self) -> bool:
return self._check_key('unreachable')
def needs_debugger(self, globally_enabled=False):
_debugger = self._task_fields.get('debugger')
_ignore_errors = C.TASK_DEBUGGER_IGNORE_ERRORS and self._task_fields.get('ignore_errors')
def needs_debugger(self, globally_enabled: bool = False) -> bool:
_debugger = self.task_fields.get('debugger')
_ignore_errors = constants.TASK_DEBUGGER_IGNORE_ERRORS and self.task_fields.get('ignore_errors')
ret = False
if globally_enabled and ((self.is_failed() and not _ignore_errors) or self.is_unreachable()):
ret = True
@ -94,68 +162,96 @@ class TaskResult:
return ret
def _check_key(self, key):
"""get a specific key from the result or its items"""
def _check_key(self, key: str) -> bool:
"""Fetch a specific named boolean value from the result; if missing, a logical OR of the value from nested loop results; False for non-loop results."""
if (value := self._return_data.get(key, ...)) is not ...:
return bool(value)
if isinstance(self._result, dict) and key in self._result:
return self._result.get(key, False)
else:
flag = False
for res in self._result.get('results', []):
if isinstance(res, dict):
flag |= res.get(key, False)
return flag
return any(isinstance(result, dict) and result.get(key) for result in self._loop_results)
def clean_copy(self):
""" returns 'clean' taskresult object """
@t.final
class _RawTaskResult(_BaseTaskResult):
def as_wire_task_result(self) -> _WireTaskResult:
"""Return a `_WireTaskResult` from this instance."""
return _WireTaskResult(
host_name=self.host.name,
task_uuid=self.task._uuid,
return_data=self._return_data,
task_fields=self.task_fields,
)
# FIXME: clean task_fields, _task and _host copies
result = TaskResult(self._host, self._task, {}, self._task_fields)
def as_callback_task_result(self) -> CallbackTaskResult:
"""Return a `CallbackTaskResult` from this instance."""
ignore: tuple[str, ...]
# statuses are already reflected on the event type
if result._task and result._task.action in C._ACTION_DEBUG:
if self.task and self.task.action in constants._ACTION_DEBUG:
# debug is verbose by default to display vars, no need to add invocation
ignore = _IGNORE + ('invocation',)
else:
ignore = _IGNORE
subset = {}
subset: dict[str, dict[str, object]] = {}
# preserve subset for later
for sub in _SUB_PRESERVE:
if sub in self._result:
subset[sub] = {}
for key in _SUB_PRESERVE[sub]:
if key in self._result[sub]:
subset[sub][key] = self._result[sub][key]
for sub, sub_keys in _SUB_PRESERVE.items():
sub_data = self._return_data.get(sub)
if isinstance(sub_data, dict):
subset[sub] = {key: value for key, value in sub_data.items() if key in sub_keys}
# DTFIX-FUTURE: is checking no_log here redundant now that we use _ansible_no_log everywhere?
if isinstance(self._task.no_log, bool) and self._task.no_log or self._result.get('_ansible_no_log'):
censored_result = censor_result(self._result)
if isinstance(self.task.no_log, bool) and self.task.no_log or self._return_data.get('_ansible_no_log'):
censored_result = censor_result(self._return_data)
if results := self._result.get('results'):
if self._loop_results:
# maintain shape for loop results so callback behavior recognizes a loop was performed
censored_result.update(results=[censor_result(item) if item.get('_ansible_no_log') else item for item in results])
result._result = censored_result
elif self._result:
result._result = module_response_deepcopy(self._result)
censored_result.update(results=[
censor_result(loop_res) if isinstance(loop_res, dict) and loop_res.get('_ansible_no_log') else loop_res for loop_res in self._loop_results
])
# actually remove
for remove_key in ignore:
if remove_key in result._result:
del result._result[remove_key]
return_data = censored_result
elif self._return_data:
return_data = {k: v for k, v in module_response_deepcopy(self._return_data).items() if k not in ignore}
# remove almost ALL internal keys, keep ones relevant to callback
strip_internal_keys(result._result, exceptions=CLEAN_EXCEPTIONS)
strip_internal_keys(return_data, exceptions=CLEAN_EXCEPTIONS)
else:
return_data = {}
# keep subset
result._result.update(subset)
return_data.update(subset)
return CallbackTaskResult(self.host, self.task, return_data, self.task_fields)
@t.final
class CallbackTaskResult(_BaseTaskResult):
"""Public contract of TaskResult """
# DTFIX-RELEASE: find a better home for this since it's public API
@property
def _result(self) -> _c.MutableMapping[str, t.Any]:
"""Use the `result` property when supporting only ansible-core 2.19 or later."""
# deprecated: description='Deprecate `_result` in favor of `result`' core_version='2.23'
return self.result
@functools.cached_property
def result(self) -> _c.MutableMapping[str, t.Any]:
"""
Returns a cached copy of the task result dictionary for consumption by callbacks.
Internal custom types are transformed to native Python types to facilitate access and serialization.
"""
return t.cast(_c.MutableMapping[str, t.Any], _vars.transform_to_native_types(self._return_data))
return result
TaskResult = CallbackTaskResult
"""Compatibility name for the pre-2.19 callback-shaped TaskResult passed to callbacks."""
def censor_result(result: dict[str, t.Any]) -> dict[str, t.Any]:
def censor_result(result: _c.Mapping[str, t.Any]) -> dict[str, t.Any]:
censored_result = {key: value for key in _PRESERVE if (value := result.get(key, ...)) is not ...}
censored_result.update(censored="the output has been hidden due to the fact that 'no_log: true' was specified for this result")

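The refactor above splits task results into three shapes: the raw result produced in the worker, a thin picklable "wire" form sent over the queue, and a cleaned callback-facing copy. The dataclasses below are simplified stand-ins (not the ansible-core classes) sketching that flow under those assumptions.

# Raw result -> wire result for the queue -> cleaned dict for callbacks.
from __future__ import annotations

import dataclasses
import typing as t


@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
class WireResult:
    host_name: str
    task_uuid: str
    return_data: dict[str, t.Any]


@dataclasses.dataclass(kw_only=True)
class RawResult:
    host_name: str
    task_uuid: str
    return_data: dict[str, t.Any]

    def as_wire_result(self) -> WireResult:
        return WireResult(host_name=self.host_name, task_uuid=self.task_uuid, return_data=self.return_data)

    def as_callback_result(self) -> dict[str, t.Any]:
        # strip keys callbacks should not see; keep only "clean" data
        return {k: v for k, v in self.return_data.items() if not k.startswith('_ansible_')}


raw = RawResult(host_name='web01', task_uuid='abc-123', return_data={'changed': True, '_ansible_no_log': False})
print(raw.as_wire_result())
print(raw.as_callback_result())  # {'changed': True}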
@ -138,7 +138,7 @@ def g_connect(versions):
'The v2 Ansible Galaxy API is deprecated and no longer supported. '
'Ensure that you have configured the ansible-galaxy CLI to utilize an '
'updated and supported version of Ansible Galaxy.',
version='2.20'
version='2.20',
)
return method(self, *args, **kwargs)

@ -201,9 +201,9 @@ class CollectionSignatureError(Exception):
# FUTURE: expose actual verify result details for a collection on this object, maybe reimplement as dataclass on py3.8+
class CollectionVerifyResult:
def __init__(self, collection_name): # type: (str) -> None
self.collection_name = collection_name # type: str
self.success = True # type: bool
def __init__(self, collection_name: str) -> None:
self.collection_name = collection_name
self.success = True
def verify_local_collection(local_collection, remote_collection, artifacts_manager):

@ -30,6 +30,7 @@ from random import shuffle
from ansible import constants as C
from ansible._internal import _json, _wrapt
from ansible._internal._json import EncryptedStringBehavior
from ansible.errors import AnsibleError, AnsibleOptionsError
from ansible.inventory.data import InventoryData
from ansible.module_utils.six import string_types
@ -787,7 +788,7 @@ class _InventoryDataWrapper(_wrapt.ObjectProxy):
return _json.AnsibleVariableVisitor(
trusted_as_template=self._target_plugin.trusted_by_default,
origin=self._default_origin,
allow_encrypted_string=True,
encrypted_string_behavior=EncryptedStringBehavior.PRESERVE,
)
def set_variable(self, entity: str, varname: str, value: t.Any) -> None:

@ -6,7 +6,6 @@
from __future__ import annotations
import atexit
import dataclasses
import importlib.util
import json
import os
@ -15,17 +14,14 @@ import sys
import typing as t
from . import _errors
from ._plugin_exec_context import PluginExecContext, HasPluginInfo
from .. import basic
from ..common.json import get_module_encoder, Direction
from ..common.messages import PluginInfo
def run_module(
*,
json_params: bytes,
profile: str,
plugin_info_dict: dict[str, object],
module_fqn: str,
modlib_path: str,
init_globals: dict[str, t.Any] | None = None,
@ -38,7 +34,6 @@ def run_module(
_run_module(
json_params=json_params,
profile=profile,
plugin_info_dict=plugin_info_dict,
module_fqn=module_fqn,
modlib_path=modlib_path,
init_globals=init_globals,
@ -80,7 +75,6 @@ def _run_module(
*,
json_params: bytes,
profile: str,
plugin_info_dict: dict[str, object],
module_fqn: str,
modlib_path: str,
init_globals: dict[str, t.Any] | None = None,
@ -92,7 +86,6 @@ def _run_module(
init_globals = init_globals or {}
init_globals.update(_module_fqn=module_fqn, _modlib_path=modlib_path)
with PluginExecContext(_ModulePluginWrapper(PluginInfo._from_dict(plugin_info_dict))):
# Run the module. By importing it as '__main__', it executes as a script.
runpy.run_module(mod_name=module_fqn, init_globals=init_globals, run_name='__main__', alter_sys=True)
@ -112,22 +105,3 @@ def _handle_exception(exception: BaseException, profile: str) -> t.NoReturn:
print(json.dumps(result, cls=encoder)) # pylint: disable=ansible-bad-function
sys.exit(1) # pylint: disable=ansible-bad-function
@dataclasses.dataclass(frozen=True)
class _ModulePluginWrapper(HasPluginInfo):
"""Modules aren't plugin instances; this adapter implements the `HasPluginInfo` protocol to allow `PluginExecContext` infra to work with modules."""
plugin: PluginInfo
@property
def _load_name(self) -> str:
return self.plugin.requested_name
@property
def ansible_name(self) -> str:
return self.plugin.resolved_name
@property
def plugin_type(self) -> str:
return self.plugin.type

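The removed wrapper above is a small adapter that satisfies a typing.Protocol so unrelated objects can be treated uniformly. A generic sketch of that protocol/adapter pattern follows; the names are illustrative and are not the ansible internals.

# A Protocol declares read-only properties; a dataclass adapts another shape to it.
from __future__ import annotations

import dataclasses
import typing as t


class HasPluginInfo(t.Protocol):
    @property
    def ansible_name(self) -> str: ...

    @property
    def plugin_type(self) -> str: ...


@dataclasses.dataclass(frozen=True)
class ModuleInfo:
    resolved_name: str
    type: str


@dataclasses.dataclass(frozen=True)
class ModulePluginAdapter:
    plugin: ModuleInfo

    @property
    def ansible_name(self) -> str:
        return self.plugin.resolved_name

    @property
    def plugin_type(self) -> str:
        return self.plugin.type


def describe(plugin: HasPluginInfo) -> str:
    return f'{plugin.plugin_type}:{plugin.ansible_name}'


print(describe(ModulePluginAdapter(ModuleInfo(resolved_name='ansible.builtin.ping', type='module'))))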
@ -1,64 +0,0 @@
"""Patch broken ClassVar support in dataclasses when ClassVar is accessed via a module other than `typing`."""
# deprecated: description='verify ClassVar support in dataclasses has been fixed in Python before removing this patching code', python_version='3.12'
from __future__ import annotations
import dataclasses
import sys
import typing as t
# trigger the bug by exposing typing.ClassVar via a module reference that is not `typing`
_ts = sys.modules[__name__]
ClassVar = t.ClassVar
def patch_dataclasses_is_type() -> None:
if not _is_patch_needed():
return # pragma: nocover
try:
real_is_type = dataclasses._is_type # type: ignore[attr-defined]
except AttributeError: # pragma: nocover
raise RuntimeError("unable to patch broken dataclasses ClassVar support") from None
# patch dataclasses._is_type - impl from https://github.com/python/cpython/blob/4c6d4f5cb33e48519922d635894eef356faddba2/Lib/dataclasses.py#L709-L765
def _is_type(annotation, cls, a_module, a_type, is_type_predicate):
match = dataclasses._MODULE_IDENTIFIER_RE.match(annotation) # type: ignore[attr-defined]
if match:
ns = None
module_name = match.group(1)
if not module_name:
# No module name, assume the class's module did
# "from dataclasses import InitVar".
ns = sys.modules.get(cls.__module__).__dict__
else:
# Look up module_name in the class's module.
module = sys.modules.get(cls.__module__)
if module and module.__dict__.get(module_name): # this is the patched line; removed `is a_module`
ns = sys.modules.get(a_type.__module__).__dict__
if ns and is_type_predicate(ns.get(match.group(2)), a_module):
return True
return False
_is_type._orig_impl = real_is_type # type: ignore[attr-defined] # stash this away to allow unit tests to undo the patch
dataclasses._is_type = _is_type # type: ignore[attr-defined]
try:
if _is_patch_needed():
raise RuntimeError("patching had no effect") # pragma: nocover
except Exception as ex: # pragma: nocover
dataclasses._is_type = real_is_type # type: ignore[attr-defined]
raise RuntimeError("dataclasses ClassVar support is still broken after patching") from ex
def _is_patch_needed() -> bool:
@dataclasses.dataclass
class CheckClassVar:
# this is the broken case requiring patching: ClassVar dot-referenced from a module that is not `typing` is treated as an instance field
# DTFIX-RELEASE: add link to CPython bug report to-be-filed (or update associated deprecation comments if we don't)
a_classvar: _ts.ClassVar[int] # type: ignore[name-defined]
a_field: int
return len(dataclasses.fields(CheckClassVar)) != 1

@ -1,7 +1,6 @@
from __future__ import annotations
import dataclasses
import datetime
import typing as t
from ansible.module_utils.common import messages as _messages
@ -12,27 +11,6 @@ from ansible.module_utils._internal import _datatag
class Deprecated(_datatag.AnsibleDatatagBase):
msg: str
help_text: t.Optional[str] = None
removal_date: t.Optional[datetime.date] = None
removal_version: t.Optional[str] = None
plugin: t.Optional[_messages.PluginInfo] = None
@classmethod
def _from_dict(cls, d: t.Dict[str, t.Any]) -> Deprecated:
source = d
removal_date = source.get('removal_date')
if removal_date is not None:
source = source.copy()
source['removal_date'] = datetime.date.fromisoformat(removal_date)
return cls(**source)
def _as_dict(self) -> t.Dict[str, t.Any]:
# deprecated: description='no-args super() with slotted dataclass requires 3.14+' python_version='3.13'
# see: https://github.com/python/cpython/pull/124455
value = super(Deprecated, self)._as_dict()
if self.removal_date is not None:
value['removal_date'] = self.removal_date.isoformat()
return value
date: t.Optional[str] = None
version: t.Optional[str] = None
deprecator: t.Optional[_messages.PluginInfo] = None

@ -0,0 +1,134 @@
from __future__ import annotations
import inspect
import re
import pathlib
import sys
import typing as t
from ansible.module_utils.common.messages import PluginInfo
_ansible_module_base_path: t.Final = pathlib.Path(sys.modules['ansible'].__file__).parent
"""Runtime-detected base path of the `ansible` Python package to distinguish between Ansible-owned and external code."""
ANSIBLE_CORE_DEPRECATOR: t.Final = PluginInfo._from_collection_name('ansible.builtin')
"""Singleton `PluginInfo` instance for ansible-core callers where the plugin can/should not be identified in messages."""
INDETERMINATE_DEPRECATOR: t.Final = PluginInfo(resolved_name='indeterminate', type='indeterminate')
"""Singleton `PluginInfo` instance for indeterminate deprecator."""
_DEPRECATOR_PLUGIN_TYPES = frozenset(
{
'action',
'become',
'cache',
'callback',
'cliconf',
'connection',
# doc_fragments - no code execution
# filter - basename inadequate to identify plugin
'httpapi',
'inventory',
'lookup',
'module', # only for collections
'netconf',
'shell',
'strategy',
'terminal',
# test - basename inadequate to identify plugin
'vars',
}
)
"""Plugin types which are valid for identifying a deprecator for deprecation purposes."""
_AMBIGUOUS_DEPRECATOR_PLUGIN_TYPES = frozenset(
{
'filter',
'test',
}
)
"""Plugin types for which basename cannot be used to identify the plugin name."""
def get_best_deprecator(*, deprecator: PluginInfo | None = None, collection_name: str | None = None) -> PluginInfo:
"""Return the best-available `PluginInfo` for the caller of this method."""
_skip_stackwalk = True
if deprecator and collection_name:
raise ValueError('Specify only one of `deprecator` or `collection_name`.')
return deprecator or PluginInfo._from_collection_name(collection_name) or get_caller_plugin_info() or INDETERMINATE_DEPRECATOR
def get_caller_plugin_info() -> PluginInfo | None:
"""Try to get `PluginInfo` for the caller of this method, ignoring marked infrastructure stack frames."""
_skip_stackwalk = True
if frame_info := next((frame_info for frame_info in inspect.stack() if '_skip_stackwalk' not in frame_info.frame.f_locals), None):
return _path_as_core_plugininfo(frame_info.filename) or _path_as_collection_plugininfo(frame_info.filename)
return None # pragma: nocover
def _path_as_core_plugininfo(path: str) -> PluginInfo | None:
"""Return a `PluginInfo` instance if the provided `path` refers to a core plugin."""
try:
relpath = str(pathlib.Path(path).relative_to(_ansible_module_base_path))
except ValueError:
return None # not ansible-core
namespace = 'ansible.builtin'
if match := re.match(r'plugins/(?P<plugin_type>\w+)/(?P<plugin_name>\w+)', relpath):
plugin_name = match.group("plugin_name")
plugin_type = match.group("plugin_type")
if plugin_type not in _DEPRECATOR_PLUGIN_TYPES:
# The plugin type isn't a known deprecator type, so we have to assume the caller is intermediate code.
# We have no way of knowing if the intermediate code is deprecating its own feature, or acting on behalf of another plugin.
# Callers in this case need to identify the deprecating plugin name, otherwise only ansible-core will be reported.
# Reporting ansible-core is never wrong, it just may be missing an additional detail (plugin name) in the "on behalf of" case.
return ANSIBLE_CORE_DEPRECATOR
elif match := re.match(r'modules/(?P<module_name>\w+)', relpath):
# AnsiballZ Python package for core modules
plugin_name = match.group("module_name")
plugin_type = "module"
elif match := re.match(r'legacy/(?P<module_name>\w+)', relpath):
# AnsiballZ Python package for non-core library/role modules
namespace = 'ansible.legacy'
plugin_name = match.group("module_name")
plugin_type = "module"
else:
return ANSIBLE_CORE_DEPRECATOR # non-plugin core path, safe to use ansible-core for the same reason as the non-deprecator plugin type case above
name = f'{namespace}.{plugin_name}'
return PluginInfo(resolved_name=name, type=plugin_type)
def _path_as_collection_plugininfo(path: str) -> PluginInfo | None:
"""Return a `PluginInfo` instance if the provided `path` refers to a collection plugin."""
if not (match := re.search(r'/ansible_collections/(?P<ns>\w+)/(?P<coll>\w+)/plugins/(?P<plugin_type>\w+)/(?P<plugin_name>\w+)', path)):
return None
plugin_type = match.group('plugin_type')
if plugin_type in _AMBIGUOUS_DEPRECATOR_PLUGIN_TYPES:
# We're able to detect the namespace, collection and plugin type -- but we have no way to identify the plugin name currently.
# To keep things simple we'll fall back to just identifying the namespace and collection.
# In the future we could improve the detection and/or make it easier for a caller to identify the plugin name.
return PluginInfo._from_collection_name('.'.join((match.group('ns'), match.group('coll'))))
if plugin_type == 'modules':
plugin_type = 'module'
if plugin_type not in _DEPRECATOR_PLUGIN_TYPES:
# The plugin type isn't a known deprecator type, so we have to assume the caller is intermediate code.
# We have no way of knowing if the intermediate code is deprecating its own feature, or acting on behalf of another plugin.
# Callers in this case need to identify the deprecator to avoid ambiguity, since it could be the same collection or another collection.
return INDETERMINATE_DEPRECATOR
name = '.'.join((match.group('ns'), match.group('coll'), match.group('plugin_name')))
return PluginInfo(resolved_name=name, type=plugin_type)
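
A sketch of how callers are expected to use the new `_deprecator` helpers; the wrapper function below is hypothetical, only the imported names come from the module above. `get_best_deprecator()` prefers an explicit `deprecator`, then `collection_name`, then the caller detected by walking the stack, and finally falls back to `INDETERMINATE_DEPRECATOR`.

from __future__ import annotations

from ansible.module_utils._internal import _deprecator
from ansible.module_utils.common.messages import PluginInfo

def _infrastructure_helper(deprecator: PluginInfo | None = None, collection_name: str | None = None) -> PluginInfo:
    # Setting this local tells get_caller_plugin_info() to skip this frame during
    # the stack walk, so the *caller's* file identifies the deprecating plugin.
    _skip_stackwalk = True

    # Raises ValueError if both deprecator and collection_name are supplied.
    return _deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name)

# ansible-core code that cannot be attributed to a specific plugin is reported as ansible.builtin:
assert _deprecator.ANSIBLE_CORE_DEPRECATOR.resolved_name == 'ansible.builtin'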

@ -1,49 +0,0 @@
from __future__ import annotations
import typing as t
from ._ambient_context import AmbientContextBase
from ..common.messages import PluginInfo
class HasPluginInfo(t.Protocol):
"""Protocol to type-annotate and expose PluginLoader-set values."""
@property
def _load_name(self) -> str:
"""The requested name used to load the plugin."""
@property
def ansible_name(self) -> str:
"""Fully resolved plugin name."""
@property
def plugin_type(self) -> str:
"""Plugin type name."""
class PluginExecContext(AmbientContextBase):
"""Execution context that wraps all plugin invocations to allow infrastructure introspection of the currently-executing plugin instance."""
def __init__(self, executing_plugin: HasPluginInfo) -> None:
self._executing_plugin = executing_plugin
@property
def executing_plugin(self) -> HasPluginInfo:
return self._executing_plugin
@property
def plugin_info(self) -> PluginInfo:
return PluginInfo(
requested_name=self._executing_plugin._load_name,
resolved_name=self._executing_plugin.ansible_name,
type=self._executing_plugin.plugin_type,
)
@classmethod
def get_current_plugin_info(cls) -> PluginInfo | None:
"""Utility method to extract a PluginInfo for the currently executing plugin (or None if no plugin is executing)."""
if ctx := cls.current(optional=True):
return ctx.plugin_info
return None

@ -0,0 +1,25 @@
from __future__ import annotations
import typing as t
from ..common import messages as _messages
class HasPluginInfo(t.Protocol):
"""Protocol to type-annotate and expose PluginLoader-set values."""
@property
def ansible_name(self) -> str | None:
"""Fully resolved plugin name."""
@property
def plugin_type(self) -> str:
"""Plugin type name."""
def get_plugin_info(value: HasPluginInfo) -> _messages.PluginInfo:
"""Utility method that returns a `PluginInfo` from an object implementing the `HasPluginInfo` protocol."""
return _messages.PluginInfo(
resolved_name=value.ansible_name,
type=value.plugin_type,
)
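
Any object exposing `ansible_name` and `plugin_type` satisfies the `HasPluginInfo` protocol above; a minimal sketch using a hypothetical stand-in class.

from ansible.module_utils._internal import _plugin_info

class FakeLookup:
    ansible_name = 'ns.coll.my_lookup'  # fully resolved plugin name
    plugin_type = 'lookup'              # plugin type name

info = _plugin_info.get_plugin_info(FakeLookup())
print(info.resolved_name, info.type)  # -> ns.coll.my_lookup lookup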

@ -0,0 +1,14 @@
from __future__ import annotations
import keyword
def validate_collection_name(collection_name: object, name: str = 'collection_name') -> None:
"""Validate a collection name."""
if not isinstance(collection_name, str):
raise TypeError(f"{name} must be {str} instead of {type(collection_name)}")
parts = collection_name.split('.')
if len(parts) != 2 or not all(part.isidentifier() and not keyword.iskeyword(part) for part in parts):
raise ValueError(f"{name} must consist of two non-keyword identifiers separated by '.'")

@ -53,9 +53,7 @@ try:
except ImportError:
HAS_SYSLOG = False
# deprecated: description='types.EllipsisType is available in Python 3.10+' python_version='3.9'
if t.TYPE_CHECKING:
from builtins import ellipsis
_UNSET = t.cast(t.Any, object())
try:
from systemd import journal, daemon as systemd_daemon
@ -77,7 +75,7 @@ except ImportError:
# Python2 & 3 way to get NoneType
NoneType = type(None)
from ._internal import _traceback, _errors, _debugging
from ._internal import _traceback, _errors, _debugging, _deprecator
from .common.text.converters import (
to_native,
@ -341,7 +339,7 @@ def _load_params():
except Exception as ex:
raise Exception("Failed to decode JSON module parameters.") from ex
if (ansible_module_args := params.get('ANSIBLE_MODULE_ARGS', ...)) is ...:
if (ansible_module_args := params.get('ANSIBLE_MODULE_ARGS', _UNSET)) is _UNSET:
raise Exception("ANSIBLE_MODULE_ARGS not provided.")
global _PARSED_MODULE_ARGS
@ -511,16 +509,31 @@ class AnsibleModule(object):
warn(warning)
self.log('[WARNING] %s' % warning)
def deprecate(self, msg, version=None, date=None, collection_name=None):
if version is not None and date is not None:
raise AssertionError("implementation error -- version and date must not both be set")
deprecate(msg, version=version, date=date)
# For compatibility, we accept that neither version nor date is set,
# and treat that the same as if version would not have been set
if date is not None:
self.log('[DEPRECATION WARNING] %s %s' % (msg, date))
else:
self.log('[DEPRECATION WARNING] %s %s' % (msg, version))
def deprecate(
self,
msg: str,
version: str | None = None,
date: str | None = None,
collection_name: str | None = None,
*,
deprecator: _messages.PluginInfo | None = None,
help_text: str | None = None,
) -> None:
"""
Record a deprecation warning to be returned with the module result.
Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
Specify `version` or `date`, but not both.
If `date` is a string, it must be in the form `YYYY-MM-DD`.
"""
_skip_stackwalk = True
deprecate( # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
msg=msg,
version=version,
date=date,
deprecator=_deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name),
help_text=help_text,
)
def load_file_common_arguments(self, params, path=None):
"""
@ -1406,6 +1419,7 @@ class AnsibleModule(object):
self.cleanup(path)
def _return_formatted(self, kwargs):
_skip_stackwalk = True
self.add_path_info(kwargs)
@ -1413,6 +1427,13 @@ class AnsibleModule(object):
kwargs['invocation'] = {'module_args': self.params}
if 'warnings' in kwargs:
self.deprecate( # pylint: disable=ansible-deprecated-unnecessary-collection-name
msg='Passing `warnings` to `exit_json` or `fail_json` is deprecated.',
version='2.23',
help_text='Use `AnsibleModule.warn` instead.',
deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
)
if isinstance(kwargs['warnings'], list):
for w in kwargs['warnings']:
self.warn(w)
@ -1424,17 +1445,38 @@ class AnsibleModule(object):
kwargs['warnings'] = warnings
if 'deprecations' in kwargs:
self.deprecate( # pylint: disable=ansible-deprecated-unnecessary-collection-name
msg='Passing `deprecations` to `exit_json` or `fail_json` is deprecated.',
version='2.23',
help_text='Use `AnsibleModule.deprecate` instead.',
deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
)
if isinstance(kwargs['deprecations'], list):
for d in kwargs['deprecations']:
if isinstance(d, SEQUENCETYPE) and len(d) == 2:
self.deprecate(d[0], version=d[1])
if isinstance(d, (KeysView, Sequence)) and len(d) == 2:
self.deprecate( # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-invalid-deprecated-version
msg=d[0],
version=d[1],
deprecator=_deprecator.get_best_deprecator(),
)
elif isinstance(d, Mapping):
self.deprecate(d['msg'], version=d.get('version'), date=d.get('date'),
collection_name=d.get('collection_name'))
self.deprecate( # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
msg=d['msg'],
version=d.get('version'),
date=d.get('date'),
deprecator=_deprecator.get_best_deprecator(collection_name=d.get('collection_name')),
)
else:
self.deprecate(d) # pylint: disable=ansible-deprecated-no-version
self.deprecate( # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-deprecated-no-version
msg=d,
deprecator=_deprecator.get_best_deprecator(),
)
else:
self.deprecate(kwargs['deprecations']) # pylint: disable=ansible-deprecated-no-version
self.deprecate( # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-deprecated-no-version
msg=kwargs['deprecations'],
deprecator=_deprecator.get_best_deprecator(),
)
deprecations = get_deprecations()
if deprecations:
@ -1454,12 +1496,13 @@ class AnsibleModule(object):
def exit_json(self, **kwargs) -> t.NoReturn:
""" return from the module, without error """
_skip_stackwalk = True
self.do_cleanup_files()
self._return_formatted(kwargs)
sys.exit(0)
def fail_json(self, msg: str, *, exception: BaseException | str | ellipsis | None = ..., **kwargs) -> t.NoReturn:
def fail_json(self, msg: str, *, exception: BaseException | str | None = _UNSET, **kwargs) -> t.NoReturn:
"""
Return from the module with an error message and optional exception/traceback detail.
A traceback will only be included in the result if error traceback capturing has been enabled.
@ -1475,6 +1518,8 @@ class AnsibleModule(object):
When `exception` is not specified, a formatted traceback will be retrieved from the current exception.
If no exception is pending, the current call stack will be used instead.
"""
_skip_stackwalk = True
msg = str(msg) # coerce to str instead of raising an error due to an invalid type
kwargs.update(
@ -1498,7 +1543,7 @@ class AnsibleModule(object):
if isinstance(exception, str):
formatted_traceback = exception
elif exception is ... and (current_exception := t.cast(t.Optional[BaseException], sys.exc_info()[1])):
elif exception is _UNSET and (current_exception := t.cast(t.Optional[BaseException], sys.exc_info()[1])):
formatted_traceback = _traceback.maybe_extract_traceback(current_exception, _traceback.TracebackEvent.ERROR)
else:
formatted_traceback = _traceback.maybe_capture_traceback(_traceback.TracebackEvent.ERROR)
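
A hypothetical module snippet showing the reworked `AnsibleModule.deprecate()` call; only the method signature comes from the change above, the module and option names are invented. `version` and `date` are mutually exclusive, and most callers can omit both `collection_name` and `deprecator`, letting the deprecator be detected from the calling code.

from ansible.module_utils.basic import AnsibleModule

def main():
    module = AnsibleModule(argument_spec=dict(legacy_opt=dict(type='str')))

    if module.params['legacy_opt'] is not None:
        module.deprecate(
            msg='The `legacy_opt` option is deprecated.',
            version='2.0.0',
            help_text='Use `new_opt` instead.',
        )

    module.exit_json(changed=False)

if __name__ == '__main__':
    main()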

@ -22,6 +22,7 @@ from ansible.module_utils.common.parameters import (
from ansible.module_utils.common.text.converters import to_native
from ansible.module_utils.common.warnings import deprecate, warn
from ansible.module_utils.common import messages as _messages
from ansible.module_utils.common.validation import (
check_mutually_exclusive,
@ -300,9 +301,13 @@ class ModuleArgumentSpecValidator(ArgumentSpecValidator):
result = super(ModuleArgumentSpecValidator, self).validate(parameters)
for d in result._deprecations:
deprecate(d['msg'],
version=d.get('version'), date=d.get('date'),
collection_name=d.get('collection_name'))
# DTFIX-FUTURE: pass an actual deprecator instead of one derived from collection_name
deprecate( # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
msg=d['msg'],
version=d.get('version'),
date=d.get('date'),
deprecator=_messages.PluginInfo._from_collection_name(d.get('collection_name')),
)
for w in result._warnings:
warn('Both option {option} and its alias {alias} are set.'.format(option=w['option'], alias=w['alias']))

@ -13,7 +13,7 @@ import dataclasses as _dataclasses
# deprecated: description='typing.Self exists in Python 3.11+' python_version='3.10'
from ..compat import typing as _t
from ansible.module_utils._internal import _datatag
from ansible.module_utils._internal import _datatag, _validation
if _sys.version_info >= (3, 10):
# Using slots for reduced memory usage and improved performance.
@ -27,13 +27,27 @@ else:
class PluginInfo(_datatag.AnsibleSerializableDataclass):
"""Information about a loaded plugin."""
requested_name: str
"""The plugin name as requested, before resolving, which may be partially or fully qualified."""
resolved_name: str
"""The resolved canonical plugin name; always fully-qualified for collection plugins."""
type: str
"""The plugin type."""
_COLLECTION_ONLY_TYPE: _t.ClassVar[str] = 'collection'
"""This is not a real plugin type. It's a placeholder for use by a `PluginInfo` instance which references a collection without a plugin."""
@classmethod
def _from_collection_name(cls, collection_name: str | None) -> _t.Self | None:
"""Returns an instance with the special `collection` type to refer to a non-plugin or ambiguous caller within a collection."""
if not collection_name:
return None
_validation.validate_collection_name(collection_name)
return cls(
resolved_name=collection_name,
type=cls._COLLECTION_ONLY_TYPE,
)
@_dataclasses.dataclass(**_dataclass_kwargs)
class Detail(_datatag.AnsibleSerializableDataclass):
@ -75,34 +89,37 @@ class WarningSummary(SummaryBase):
class DeprecationSummary(WarningSummary):
"""Deprecation summary with details (possibly derived from an exception __cause__ chain) and an optional traceback."""
version: _t.Optional[str] = None
date: _t.Optional[str] = None
plugin: _t.Optional[PluginInfo] = None
@property
def collection_name(self) -> _t.Optional[str]:
if not self.plugin:
return None
parts = self.plugin.resolved_name.split('.')
if len(parts) < 2:
return None
collection_name = '.'.join(parts[:2])
deprecator: _t.Optional[PluginInfo] = None
"""
The identifier for the content which is being deprecated.
"""
# deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
# from ansible.module_utils.datatag import deprecate_value
# collection_name = deprecate_value(collection_name, 'The `collection_name` property is deprecated.', removal_version='2.27')
date: _t.Optional[str] = None
"""
The date after which a new release of `deprecator` will remove the feature described by `msg`.
Ignored if `deprecator` is not provided.
"""
return collection_name
version: _t.Optional[str] = None
"""
The version of `deprecator` which will remove the feature described by `msg`.
Ignored if `deprecator` is not provided.
Ignored if `date` is provided.
"""
def _as_simple_dict(self) -> _t.Dict[str, _t.Any]:
"""Returns a dictionary representation of the deprecation object in the format exposed to playbooks."""
from ansible.module_utils._internal._deprecator import INDETERMINATE_DEPRECATOR # circular import from messages
if self.deprecator and self.deprecator != INDETERMINATE_DEPRECATOR:
collection_name = '.'.join(self.deprecator.resolved_name.split('.')[:2])
else:
collection_name = None
result = self._as_dict()
result.update(
msg=self._format(),
collection_name=self.collection_name,
collection_name=collection_name,
)
return result
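
A brief sketch of the new `_from_collection_name()` helper and of how `_as_simple_dict()` derives the legacy `collection_name` key from the deprecator; the collection name is illustrative and the API is internal.

from ansible.module_utils.common import messages as _messages

info = _messages.PluginInfo._from_collection_name('community.general')
print(info.resolved_name, info.type)  # -> community.general collection

# _as_simple_dict() exposes a collection_name derived from the first two
# segments of the deprecator's resolved name (unless it is indeterminate):
collection_name = '.'.join(info.resolved_name.split('.')[:2])  # -> 'community.general'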

@ -29,7 +29,6 @@ def get_bin_path(arg, opt_dirs=None, required=None):
deprecate(
msg="The `required` parameter in `get_bin_path` API is deprecated.",
version="2.21",
collection_name="ansible.builtin",
)
paths = []

@ -3,14 +3,12 @@
from __future__ import annotations
import dataclasses
import os
import pathlib
import subprocess
import sys
import typing as t
from ansible.module_utils._internal import _plugin_exec_context
from ansible.module_utils.common.text.converters import to_bytes
_ANSIBLE_PARENT_PATH = pathlib.Path(__file__).parents[3]
@ -99,7 +97,6 @@ if __name__ == '__main__':
json_params = {json_params!r}
profile = {profile!r}
plugin_info_dict = {plugin_info_dict!r}
module_fqn = {module_fqn!r}
modlib_path = {modlib_path!r}
@ -110,19 +107,15 @@ if __name__ == '__main__':
_ansiballz.run_module(
json_params=json_params,
profile=profile,
plugin_info_dict=plugin_info_dict,
module_fqn=module_fqn,
modlib_path=modlib_path,
init_globals=dict(_respawned=True),
)
"""
plugin_info = _plugin_exec_context.PluginExecContext.get_current_plugin_info()
respawn_code = respawn_code_template.format(
json_params=basic._ANSIBLE_ARGS,
profile=basic._ANSIBLE_PROFILE,
plugin_info_dict=dataclasses.asdict(plugin_info),
module_fqn=module_fqn,
modlib_path=modlib_path,
)

@ -4,15 +4,12 @@
from __future__ import annotations as _annotations
import datetime as _datetime
import typing as _t
from ansible.module_utils._internal import _traceback, _plugin_exec_context
from ansible.module_utils._internal import _traceback, _deprecator
from ansible.module_utils.common import messages as _messages
from ansible.module_utils import _internal
_UNSET = _t.cast(_t.Any, ...)
def warn(warning: str) -> None:
"""Record a warning to be returned with the module result."""
@ -28,22 +25,23 @@ def warn(warning: str) -> None:
def deprecate(
msg: str,
version: str | None = None,
date: str | _datetime.date | None = None,
collection_name: str | None = _UNSET,
date: str | None = None,
collection_name: str | None = None,
*,
deprecator: _messages.PluginInfo | None = None,
help_text: str | None = None,
obj: object | None = None,
) -> None:
"""
Record a deprecation warning to be returned with the module result.
Record a deprecation warning.
The `obj` argument is only useful in a controller context; it is ignored for target-side callers.
Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
Specify `version` or `date`, but not both.
If `date` is a string, it must be in the form `YYYY-MM-DD`.
"""
if isinstance(date, _datetime.date):
date = str(date)
_skip_stackwalk = True
# deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
# if collection_name is not _UNSET:
# deprecate('The `collection_name` argument to `deprecate` is deprecated.', version='2.27')
deprecator = _deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name)
if _internal.is_controller:
_display = _internal.import_controller_module('ansible.utils.display').Display()
@ -53,6 +51,8 @@ def deprecate(
date=date,
help_text=help_text,
obj=obj,
# skip passing collection_name; get_best_deprecator already accounted for it when present
deprecator=deprecator,
)
return
@ -64,7 +64,7 @@ def deprecate(
formatted_traceback=_traceback.maybe_capture_traceback(_traceback.TracebackEvent.DEPRECATED),
version=version,
date=date,
plugin=_plugin_exec_context.PluginExecContext.get_current_plugin_info(),
deprecator=deprecator,
)] = None
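
A sketch of shared (non-plugin) module_utils code calling the reworked `deprecate()` helper directly; the helper being deprecated is hypothetical. When neither `deprecator` nor `collection_name` is passed, the deprecator is resolved from the calling file via `_deprecator.get_best_deprecator()`.

from ansible.module_utils.common.warnings import deprecate

def frobnicate():  # hypothetical legacy helper
    deprecate(
        msg='The `frobnicate()` helper is deprecated.',
        version='2.25',
        help_text='Call `new_frobnicate()` instead.',
    )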

@ -1,11 +1,11 @@
"""Public API for data tagging."""
from __future__ import annotations as _annotations
import datetime as _datetime
import typing as _t
from ._internal import _plugin_exec_context, _datatag
from ._internal import _datatag, _deprecator
from ._internal._datatag import _tags
from .common import messages as _messages
_T = _t.TypeVar('_T')
@ -14,28 +14,28 @@ def deprecate_value(
value: _T,
msg: str,
*,
version: str | None = None,
date: str | None = None,
collection_name: str | None = None,
deprecator: _messages.PluginInfo | None = None,
help_text: str | None = None,
removal_date: str | _datetime.date | None = None,
removal_version: str | None = None,
) -> _T:
"""
Return `value` tagged with the given deprecation details.
The types `None` and `bool` cannot be deprecated and are returned unmodified.
Raises a `TypeError` if `value` is not a supported type.
If `removal_date` is a string, it must be in the form `YYYY-MM-DD`.
This function is only supported in contexts where an Ansible plugin/module is executing.
Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
Specify `version` or `date`, but not both.
If `date` is provided, it should be in the form `YYYY-MM-DD`.
"""
if isinstance(removal_date, str):
# The `fromisoformat` method accepts other ISO 8601 formats than `YYYY-MM-DD` starting with Python 3.11.
# That should be considered undocumented behavior of `deprecate_value` rather than an intentional feature.
removal_date = _datetime.date.fromisoformat(removal_date)
_skip_stackwalk = True
deprecated = _tags.Deprecated(
msg=msg,
help_text=help_text,
removal_date=removal_date,
removal_version=removal_version,
plugin=_plugin_exec_context.PluginExecContext.get_current_plugin_info(),
date=date,
version=version,
deprecator=_deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name),
)
return deprecated.tag(value)
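
A sketch of the updated public `deprecate_value()` signature; the fact names and versions are illustrative. The returned value carries the `Deprecated` tag, so consuming it later (for example in templating) can surface the deprecation warning.

from ansible.module_utils.datatag import deprecate_value

facts = {
    'old_fact': deprecate_value(
        value='some value',
        msg='`old_fact` is deprecated.',
        version='2.0.0',
        help_text='Use `new_fact` instead.',
    ),
    'new_fact': 'some value',
}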

@ -68,7 +68,7 @@ EXAMPLES = r"""
ansible.builtin.async_status:
jid: '{{ dnf_sleeper.ansible_job_id }}'
register: job_result
until: job_result.finished
until: job_result is finished
retries: 100
delay: 10

@ -797,7 +797,7 @@ class Dnf5Module(YumDnf):
if self.module.check_mode:
if results:
msg = "Check mode: No changes made, but would have if not in check mode"
else:
elif changed:
transaction.download()
if not self.download_only:
transaction.set_description("ansible dnf5 module")

@ -87,7 +87,7 @@ options:
- 'If a checksum is passed to this parameter, the digest of the
destination file will be calculated after it is downloaded to ensure
its integrity and verify that the transfer completed successfully.
Format: <algorithm>:<checksum|url>, for example C(checksum="sha256:D98291AC[...]B6DC7B97",
Format: <algorithm>:<checksum|url>, for example C(checksum="sha256:D98291AC[...]B6DC7B97"),
C(checksum="sha256:http://example.com/path/sha256sum.txt").'
- If you worry about portability, only the sha1 algorithm is available
on all platforms and python versions.

@ -0,0 +1,40 @@
# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
from __future__ import annotations
import json
from ansible.utils.display import Display
Display().deprecated(f'{__name__!r} is deprecated.', version='2.23', help_text='Call `json.dumps` directly instead.')
def jsonify(result, format=False):
"""Format JSON output."""
if result is None:
return "{}"
indent = None
if format:
indent = 4
try:
return json.dumps(result, sort_keys=True, indent=indent, ensure_ascii=False)
except UnicodeDecodeError:
return json.dumps(result, sort_keys=True, indent=indent)
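
The restored shim simply wraps `json.dumps` and now emits a deprecation warning at import time; new code can call `json.dumps` directly with equivalent arguments.

import json

result = {'changed': False, 'msg': 'ok'}

# equivalent to jsonify(result, format=True), without importing the deprecated shim
print(json.dumps(result, sort_keys=True, indent=4, ensure_ascii=False))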

@ -8,25 +8,36 @@ from ansible.module_utils._internal import _datatag
from ansible.module_utils.common.text import converters as _converters
from ansible.parsing import vault as _vault
_UNSET = _t.cast(_t.Any, object())
class _AnsibleMapping(dict):
"""Backwards compatibility type."""
def __new__(cls, value):
return _datatag.AnsibleTagHelper.tag_copy(value, dict(value))
def __new__(cls, value=_UNSET, /, **kwargs):
if value is _UNSET:
return dict(**kwargs)
return _datatag.AnsibleTagHelper.tag_copy(value, dict(value, **kwargs))
class _AnsibleUnicode(str):
"""Backwards compatibility type."""
def __new__(cls, value):
return _datatag.AnsibleTagHelper.tag_copy(value, str(value))
def __new__(cls, object=_UNSET, **kwargs):
if object is _UNSET:
return str(**kwargs)
return _datatag.AnsibleTagHelper.tag_copy(object, str(object, **kwargs))
class _AnsibleSequence(list):
"""Backwards compatibility type."""
def __new__(cls, value):
def __new__(cls, value=_UNSET, /):
if value is _UNSET:
return list()
return _datatag.AnsibleTagHelper.tag_copy(value, list(value))
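
A generic illustration of the constructor pattern used above; these compat types are internal, so the class below is a stand-alone stand-in rather than the real implementation. The idea is to keep the tag-copying behavior when a source value is given, but fall back to the plain built-in constructor forms when called with no positional argument.

import typing as t

_UNSET = t.cast(t.Any, object())

class CompatDict(dict):
    """Stand-in mirroring the _AnsibleMapping pattern, minus data tagging."""

    def __new__(cls, value=_UNSET, /, **kwargs):
        if value is _UNSET:
            return dict(**kwargs)     # plain dict() / dict(a=1) still works
        return dict(value, **kwargs)  # the real type also copies data tags here

print(CompatDict())          # {}
print(CompatDict(a=1))       # {'a': 1}
print(CompatDict({'a': 1}))  # {'a': 1}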

@ -21,34 +21,42 @@ import os
from ansible import constants as C
from ansible.errors import AnsibleError
from ansible.executor.task_result import _RawTaskResult
from ansible.inventory.host import Host
from ansible.module_utils.common.text.converters import to_text
from ansible.parsing.dataloader import DataLoader
from ansible.playbook.handler import Handler
from ansible.playbook.task_include import TaskInclude
from ansible.playbook.role_include import IncludeRole
from ansible._internal._templating._engine import TemplateEngine
from ansible.utils.display import Display
from ansible.vars.manager import VariableManager
display = Display()
class IncludedFile:
def __init__(self, filename, args, vars, task, is_role=False):
def __init__(self, filename, args, vars, task, is_role: bool = False) -> None:
self._filename = filename
self._args = args
self._vars = vars
self._task = task
self._hosts = []
self._hosts: list[Host] = []
self._is_role = is_role
self._results = []
self._results: list[_RawTaskResult] = []
def add_host(self, host):
def add_host(self, host: Host) -> None:
if host not in self._hosts:
self._hosts.append(host)
return
raise ValueError()
def __eq__(self, other):
if not isinstance(other, IncludedFile):
return False
return (other._filename == self._filename and
other._args == self._args and
other._vars == self._vars and
@ -59,23 +67,28 @@ class IncludedFile:
return "%s (args=%s vars=%s): %s" % (self._filename, self._args, self._vars, self._hosts)
@staticmethod
def process_include_results(results, iterator, loader, variable_manager):
included_files = []
task_vars_cache = {}
def process_include_results(
results: list[_RawTaskResult],
iterator,
loader: DataLoader,
variable_manager: VariableManager,
) -> list[IncludedFile]:
included_files: list[IncludedFile] = []
task_vars_cache: dict[tuple, dict] = {}
for res in results:
original_host = res._host
original_task = res._task
original_host = res.host
original_task = res.task
if original_task.action in C._ACTION_ALL_INCLUDES:
if original_task.loop:
if 'results' not in res._result:
if 'results' not in res._return_data:
continue
include_results = res._result['results']
include_results = res._loop_results
else:
include_results = [res._result]
include_results = [res._return_data]
for include_result in include_results:
# if the task result was skipped or failed, continue

@ -227,8 +227,6 @@ class Task(Base, Conditional, Taggable, CollectionSearch, Notifiable, Delegatabl
raise AnsibleError("you must specify a value when using %s" % k, obj=ds)
new_ds['loop_with'] = loop_name
new_ds['loop'] = v
# display.deprecated("with_ type loops are being phased out, use the 'loop' keyword instead",
# version="2.10", collection_name='ansible.builtin')
def preprocess_data(self, ds):
"""

@ -20,24 +20,26 @@
from __future__ import annotations
import abc
import functools
import types
import typing as t
from ansible import constants as C
from ansible.errors import AnsibleError
from ansible.utils.display import Display
from ansible.utils import display as _display
from ansible.module_utils._internal import _plugin_exec_context
from ansible.module_utils._internal import _plugin_info
display = Display()
if t.TYPE_CHECKING:
from .loader import PluginPathContext
from . import loader as _t_loader
# Global so that all instances of a PluginLoader will share the caches
MODULE_CACHE = {} # type: dict[str, dict[str, types.ModuleType]]
PATH_CACHE = {} # type: dict[str, list[PluginPathContext] | None]
PLUGIN_PATH_CACHE = {} # type: dict[str, dict[str, dict[str, PluginPathContext]]]
PATH_CACHE = {} # type: dict[str, list[_t_loader.PluginPathContext] | None]
PLUGIN_PATH_CACHE = {} # type: dict[str, dict[str, dict[str, _t_loader.PluginPathContext]]]
def get_plugin_class(obj):
@ -50,10 +52,10 @@ def get_plugin_class(obj):
class _ConfigurablePlugin(t.Protocol):
"""Protocol to provide type-safe access to config for plugin-related mixins."""
def get_option(self, option: str, hostvars: dict[str, object] | None = None) -> object: ...
def get_option(self, option: str, hostvars: dict[str, object] | None = None) -> t.Any: ...
class _AnsiblePluginInfoMixin(_plugin_exec_context.HasPluginInfo):
class _AnsiblePluginInfoMixin(_plugin_info.HasPluginInfo):
"""Mixin to provide type annotations and default values for existing PluginLoader-set load-time attrs."""
_original_path: str | None = None
_load_name: str | None = None
@ -102,6 +104,14 @@ class AnsiblePlugin(_AnsiblePluginInfoMixin, _ConfigurablePlugin, metaclass=abc.
raise KeyError(str(e))
return option_value, origin
@functools.cached_property
def __plugin_info(self):
"""
Internal cached property to retrieve `PluginInfo` for this plugin instance.
Only for use by the `AnsiblePlugin` base class.
"""
return _plugin_info.get_plugin_info(self)
def get_option(self, option, hostvars=None):
if option not in self._options:
@ -117,7 +127,7 @@ class AnsiblePlugin(_AnsiblePluginInfoMixin, _ConfigurablePlugin, metaclass=abc.
def set_option(self, option, value):
self._options[option] = C.config.get_config_value(option, plugin_type=self.plugin_type, plugin_name=self._load_name, direct={option: value})
C.handle_config_noise(display)
_display._report_config_warnings(self.__plugin_info)
def set_options(self, task_keys=None, var_options=None, direct=None):
"""
@ -134,7 +144,7 @@ class AnsiblePlugin(_AnsiblePluginInfoMixin, _ConfigurablePlugin, metaclass=abc.
if self.allow_extras and var_options and '_extras' in var_options:
# these are largely unvalidated passthroughs, either plugin or underlying API will validate
self._options['_extras'] = var_options['_extras']
C.handle_config_noise(display)
_display._report_config_warnings(self.__plugin_info)
def has_option(self, option):
if not self._options:

@ -318,13 +318,6 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
final_environment: dict[str, t.Any] = {}
self._compute_environment_string(final_environment)
# `modify_module` adapts PluginInfo to allow target-side use of `PluginExecContext` since modules aren't plugins
plugin = PluginInfo(
requested_name=module_name,
resolved_name=result.resolved_fqcn,
type='module',
)
# modify_module will exit early if interpreter discovery is required; re-run after if necessary
for _dummy in (1, 2):
try:
@ -338,7 +331,6 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
async_timeout=self._task.async_val,
environment=final_environment,
remote_is_local=bool(getattr(self._connection, '_remote_is_local', False)),
plugin=plugin,
become_plugin=self._connection.become,
)
@ -649,12 +641,12 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
# done. Make the files +x if we're asked to, and return.
if not self._is_become_unprivileged():
if execute:
# Can't depend on the file being transferred with execute permissions.
# Can't depend on the file being transferred with required permissions.
# Only need user perms because no become was used here
res = self._remote_chmod(remote_paths, 'u+x')
res = self._remote_chmod(remote_paths, 'u+rwx')
if res['rc'] != 0:
raise AnsibleError(
'Failed to set execute bit on remote files '
'Failed to set permissions on remote files '
'(rc: {0}, err: {1})'.format(
res['rc'],
to_native(res['stderr'])))
@ -695,10 +687,10 @@ class ActionBase(ABC, _AnsiblePluginInfoMixin):
return remote_paths
# Step 3b: Set execute if we need to. We do this before anything else
# because some of the methods below might work but not let us set +x
# as part of them.
# because some of the methods below might work but not let us set
# permissions as part of them.
if execute:
res = self._remote_chmod(remote_paths, 'u+x')
res = self._remote_chmod(remote_paths, 'u+rwx')
if res['rc'] != 0:
raise AnsibleError(
'Failed to set file mode or acl on remote temporary files '

@ -28,10 +28,8 @@ class ActionModule(ActionBase):
# TODO: remove in favor of controller side argspec detecting valid arguments
# network facts modules must support gather_subset
try:
name = self._connection.ansible_name.removeprefix('ansible.netcommon.')
except AttributeError:
name = self._connection._load_name.split('.')[-1]
if name not in ('network_cli', 'httpapi', 'netconf'):
subset = mod_args.pop('gather_subset', None)
if subset not in ('all', ['all'], None):

@ -24,15 +24,14 @@ import re
import sys
import textwrap
import typing as t
import collections.abc as _c
from typing import TYPE_CHECKING
from collections.abc import MutableMapping
from copy import deepcopy
from ansible import constants as C
from ansible.module_utils._internal import _datatag
from ansible.module_utils.common.messages import ErrorSummary
from ansible._internal._yaml import _dumper
from ansible.plugins import AnsiblePlugin
from ansible.utils.color import stringc
@ -44,7 +43,7 @@ from ansible._internal._templating import _engine
import yaml
if TYPE_CHECKING:
from ansible.executor.task_result import TaskResult
from ansible.executor.task_result import CallbackTaskResult
global_display = Display()
@ -59,6 +58,19 @@ _YAML_BREAK_CHARS = '\n\x85\u2028\u2029' # NL, NEL, LS, PS
_SPACE_BREAK_RE = re.compile(fr' +([{_YAML_BREAK_CHARS}])')
_T_callable = t.TypeVar("_T_callable", bound=t.Callable)
def _callback_base_impl(wrapped: _T_callable) -> _T_callable:
"""
Decorator for the no-op methods on the `CallbackBase` base class.
Used to avoid unnecessary dispatch overhead to no-op base callback methods.
"""
wrapped._base_impl = True
return wrapped
class _AnsibleCallbackDumper(_dumper.AnsibleDumper):
def __init__(self, *args, lossy: bool = False, **kwargs):
super().__init__(*args, **kwargs)
@ -87,6 +99,8 @@ class _AnsibleCallbackDumper(_dumper.AnsibleDumper):
def _register_representers(cls) -> None:
super()._register_representers()
# exact type checks occur first against representers, then subclasses against multi-representers
cls.add_representer(str, cls._pretty_represent_str)
cls.add_multi_representer(str, cls._pretty_represent_str)
@ -140,12 +154,17 @@ class CallbackBase(AnsiblePlugin):
custom actions.
"""
def __init__(self, display=None, options=None):
def __init__(self, display: Display | None = None, options: dict[str, t.Any] | None = None) -> None:
super().__init__()
if display:
self._display = display
else:
self._display = global_display
# FUTURE: fix double-loading of non-collection stdout callback plugins that don't set CALLBACK_NEEDS_ENABLED
# FUTURE: this code is jacked for 2.x- it should just use the type names and always assume 2.0+ for normal cases
if self._display.verbosity >= 4:
name = getattr(self, 'CALLBACK_NAME', 'unnamed')
ctype = getattr(self, 'CALLBACK_TYPE', 'old')
@ -155,7 +174,8 @@ class CallbackBase(AnsiblePlugin):
self.disabled = False
self.wants_implicit_tasks = False
self._plugin_options = {}
self._plugin_options: dict[str, t.Any] = {}
if options is not None:
self.set_options(options)
@ -164,6 +184,8 @@ class CallbackBase(AnsiblePlugin):
'ansible_loop_var', 'ansible_index_var', 'ansible_loop',
)
self._current_task_result: CallbackTaskResult | None = None
# helper for callbacks, so they don't all have to include deepcopy
_copy_result = deepcopy
@ -185,25 +207,30 @@ class CallbackBase(AnsiblePlugin):
self._plugin_options = C.config.get_plugin_options(self.plugin_type, self._load_name, keys=task_keys, variables=var_options, direct=direct)
@staticmethod
def host_label(result):
"""Return label for the hostname (& delegated hostname) of a task
result.
"""
label = "%s" % result._host.get_name()
if result._task.delegate_to and result._task.delegate_to != result._host.get_name():
def host_label(result: CallbackTaskResult) -> str:
"""Return label for the hostname (& delegated hostname) of a task result."""
label = result.host.get_name()
if result.task.delegate_to and result.task.delegate_to != result.host.get_name():
# show delegated host
label += " -> %s" % result._task.delegate_to
label += " -> %s" % result.task.delegate_to
# in case we have 'extra resolution'
ahost = result._result.get('_ansible_delegated_vars', {}).get('ansible_host', result._task.delegate_to)
if result._task.delegate_to != ahost:
ahost = result.result.get('_ansible_delegated_vars', {}).get('ansible_host', result.task.delegate_to)
if result.task.delegate_to != ahost:
label += "(%s)" % ahost
return label
def _run_is_verbose(self, result, verbosity=0):
return ((self._display.verbosity > verbosity or result._result.get('_ansible_verbose_always', False) is True)
and result._result.get('_ansible_verbose_override', False) is False)
def _dump_results(self, result, indent=None, sort_keys=True, keep_invocation=False, serialize=True):
def _run_is_verbose(self, result: CallbackTaskResult, verbosity: int = 0) -> bool:
return ((self._display.verbosity > verbosity or result.result.get('_ansible_verbose_always', False) is True)
and result.result.get('_ansible_verbose_override', False) is False)
def _dump_results(
self,
result: _c.Mapping[str, t.Any],
indent: int | None = None,
sort_keys: bool = True,
keep_invocation: bool = False,
serialize: bool = True,
) -> str:
try:
result_format = self.get_option('result_format')
except KeyError:
@ -253,10 +280,12 @@ class CallbackBase(AnsiblePlugin):
# that want to further modify the result, or use custom serialization
return abridged_result
# DTFIX-RELEASE: Switch to stock json/yaml serializers here? We should always have a transformed plain-types result.
if result_format == 'json':
return json.dumps(abridged_result, cls=_fallback_to_str.Encoder, indent=indent, ensure_ascii=False, sort_keys=sort_keys)
elif result_format == 'yaml':
if result_format == 'yaml':
# None is a sentinel in this case that indicates default behavior
# default behavior for yaml is to prettify results
lossy = pretty_results in (None, True)
@ -281,22 +310,28 @@ class CallbackBase(AnsiblePlugin):
' ' * (indent or 4)
)
def _handle_warnings(self, res: dict[str, t.Any]) -> None:
# DTFIX-RELEASE: add test to exercise this case
raise ValueError(f'Unsupported result_format {result_format!r}.')
def _handle_warnings(self, res: _c.MutableMapping[str, t.Any]) -> None:
"""Display warnings and deprecation warnings sourced by task execution."""
for warning in res.pop('warnings', []):
if res.pop('warnings', None) and self._current_task_result and (warnings := self._current_task_result.warnings):
# display warnings from the current task result if `warnings` was not removed from `result` (or made falsey)
for warning in warnings:
# DTFIX-RELEASE: what to do about propagating wrap_text from the original display.warning call?
self._display._warning(warning, wrap_text=False)
for warning in res.pop('deprecations', []):
self._display._deprecated(warning)
def _handle_exception(self, result: dict[str, t.Any], use_stderr: bool = False) -> None:
error_summary: ErrorSummary | None
if res.pop('deprecations', None) and self._current_task_result and (deprecations := self._current_task_result.deprecations):
# display deprecations from the current task result if `deprecations` was not removed from `result` (or made falsey)
for deprecation in deprecations:
self._display._deprecated(deprecation)
if error_summary := result.pop('exception', None):
self._display._error(error_summary, wrap_text=False, stderr=use_stderr)
def _handle_exception(self, result: _c.MutableMapping[str, t.Any], use_stderr: bool = False) -> None:
if result.pop('exception', None) and self._current_task_result and (exception := self._current_task_result.exception):
# display exception from the current task result if `exception` was not removed from `result` (or made falsey)
self._display._error(exception, wrap_text=False, stderr=use_stderr)
def _handle_warnings_and_exception(self, result: TaskResult) -> None:
def _handle_warnings_and_exception(self, result: CallbackTaskResult) -> None:
"""Standardized handling of warnings/deprecations and exceptions from a task/item result."""
# DTFIX-RELEASE: make/doc/porting-guide a public version of this method?
try:
@ -304,8 +339,8 @@ class CallbackBase(AnsiblePlugin):
except KeyError:
use_stderr = False
self._handle_warnings(result._result)
self._handle_exception(result._result, use_stderr=use_stderr)
self._handle_warnings(result.result)
self._handle_exception(result.result, use_stderr=use_stderr)
def _serialize_diff(self, diff):
try:
@ -322,7 +357,8 @@ class CallbackBase(AnsiblePlugin):
if result_format == 'json':
return json.dumps(diff, sort_keys=True, indent=4, separators=(u',', u': ')) + u'\n'
elif result_format == 'yaml':
if result_format == 'yaml':
# None is a sentinel in this case that indicates default behavior
# default behavior for yaml is to prettify results
lossy = pretty_results in (None, True)
@ -338,6 +374,9 @@ class CallbackBase(AnsiblePlugin):
' '
)
# DTFIX-RELEASE: add test to exercise this case
raise ValueError(f'Unsupported result_format {result_format!r}.')
def _get_diff(self, difflist):
if not isinstance(difflist, list):
@ -356,7 +395,7 @@ class CallbackBase(AnsiblePlugin):
if 'before' in diff and 'after' in diff:
# format complex structures into 'files'
for x in ['before', 'after']:
if isinstance(diff[x], MutableMapping):
if isinstance(diff[x], _c.Mapping):
diff[x] = self._serialize_diff(diff[x])
elif diff[x] is None:
diff[x] = ''
@ -398,7 +437,7 @@ class CallbackBase(AnsiblePlugin):
ret.append(diff['prepared'])
return u''.join(ret)
def _get_item_label(self, result):
def _get_item_label(self, result: _c.Mapping[str, t.Any]) -> t.Any:
""" retrieves the value to be displayed as a label for an item entry from a result object"""
if result.get('_ansible_no_log', False):
item = "(censored due to no_log)"
@ -406,9 +445,9 @@ class CallbackBase(AnsiblePlugin):
item = result.get('_ansible_item_label', result.get('item'))
return item
def _process_items(self, result):
def _process_items(self, result: CallbackTaskResult) -> None:
# just remove them as now they get handled by individual callbacks
del result._result['results']
del result.result['results']
def _clean_results(self, result, task_name):
""" removes data from results for display """
@ -434,74 +473,97 @@ class CallbackBase(AnsiblePlugin):
def set_play_context(self, play_context):
pass
@_callback_base_impl
def on_any(self, *args, **kwargs):
pass
@_callback_base_impl
def runner_on_failed(self, host, res, ignore_errors=False):
pass
@_callback_base_impl
def runner_on_ok(self, host, res):
pass
@_callback_base_impl
def runner_on_skipped(self, host, item=None):
pass
@_callback_base_impl
def runner_on_unreachable(self, host, res):
pass
@_callback_base_impl
def runner_on_no_hosts(self):
pass
@_callback_base_impl
def runner_on_async_poll(self, host, res, jid, clock):
pass
@_callback_base_impl
def runner_on_async_ok(self, host, res, jid):
pass
@_callback_base_impl
def runner_on_async_failed(self, host, res, jid):
pass
@_callback_base_impl
def playbook_on_start(self):
pass
@_callback_base_impl
def playbook_on_notify(self, host, handler):
pass
@_callback_base_impl
def playbook_on_no_hosts_matched(self):
pass
@_callback_base_impl
def playbook_on_no_hosts_remaining(self):
pass
@_callback_base_impl
def playbook_on_task_start(self, name, is_conditional):
pass
@_callback_base_impl
def playbook_on_vars_prompt(self, varname, private=True, prompt=None, encrypt=None, confirm=False, salt_size=None, salt=None, default=None, unsafe=None):
pass
@_callback_base_impl
def playbook_on_setup(self):
pass
@_callback_base_impl
def playbook_on_import_for_host(self, host, imported_file):
pass
@_callback_base_impl
def playbook_on_not_import_for_host(self, host, missing_file):
pass
@_callback_base_impl
def playbook_on_play_start(self, name):
pass
@_callback_base_impl
def playbook_on_stats(self, stats):
pass
@_callback_base_impl
def on_file_diff(self, host, diff):
pass
# V2 METHODS, by default they call v1 counterparts if possible
@_callback_base_impl
def v2_on_any(self, *args, **kwargs):
self.on_any(args, kwargs)
def v2_runner_on_failed(self, result: TaskResult, ignore_errors: bool = False) -> None:
@_callback_base_impl
def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
"""Process results of a failed task.
Note: The value of 'ignore_errors' tells Ansible whether to
@ -512,7 +574,7 @@ class CallbackBase(AnsiblePlugin):
issues (for example, missing packages), or syntax errors.
:param result: The parameters of the task and its results.
:type result: TaskResult
:type result: CallbackTaskResult
:param ignore_errors: Whether Ansible should continue \
running tasks on the host where the task failed.
:type ignore_errors: bool
@ -520,147 +582,172 @@ class CallbackBase(AnsiblePlugin):
:return: None
:rtype: None
"""
host = result._host.get_name()
self.runner_on_failed(host, result._result, ignore_errors)
host = result.host.get_name()
self.runner_on_failed(host, result.result, ignore_errors)
def v2_runner_on_ok(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
"""Process results of a successful task.
:param result: The parameters of the task and its results.
:type result: TaskResult
:type result: CallbackTaskResult
:return: None
:rtype: None
"""
host = result._host.get_name()
self.runner_on_ok(host, result._result)
host = result.host.get_name()
self.runner_on_ok(host, result.result)
def v2_runner_on_skipped(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
"""Process results of a skipped task.
:param result: The parameters of the task and its results.
:type result: TaskResult
:type result: CallbackTaskResult
:return: None
:rtype: None
"""
if C.DISPLAY_SKIPPED_HOSTS:
host = result._host.get_name()
self.runner_on_skipped(host, self._get_item_label(getattr(result._result, 'results', {})))
host = result.host.get_name()
self.runner_on_skipped(host, self._get_item_label(getattr(result.result, 'results', {})))
def v2_runner_on_unreachable(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
"""Process results of a task if a target node is unreachable.
:param result: The parameters of the task and its results.
:type result: TaskResult
:type result: CallbackTaskResult
:return: None
:rtype: None
"""
host = result._host.get_name()
self.runner_on_unreachable(host, result._result)
host = result.host.get_name()
self.runner_on_unreachable(host, result.result)
def v2_runner_on_async_poll(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_on_async_poll(self, result: CallbackTaskResult) -> None:
"""Get details about an unfinished task running in async mode.
Note: The value of the `poll` keyword in the task determines
the interval at which polling occurs and this method is run.
:param result: The parameters of the task and its status.
:type result: TaskResult
:type result: CallbackTaskResult
:return: None
:rtype: None
"""
host = result._host.get_name()
jid = result._result.get('ansible_job_id')
host = result.host.get_name()
jid = result.result.get('ansible_job_id')
# FIXME, get real clock
clock = 0
self.runner_on_async_poll(host, result._result, jid, clock)
self.runner_on_async_poll(host, result.result, jid, clock)
def v2_runner_on_async_ok(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_on_async_ok(self, result: CallbackTaskResult) -> None:
"""Process results of a successful task that ran in async mode.
:param result: The parameters of the task and its results.
:type result: TaskResult
:type result: CallbackTaskResult
:return: None
:rtype: None
"""
host = result._host.get_name()
jid = result._result.get('ansible_job_id')
self.runner_on_async_ok(host, result._result, jid)
host = result.host.get_name()
jid = result.result.get('ansible_job_id')
self.runner_on_async_ok(host, result.result, jid)
def v2_runner_on_async_failed(self, result):
host = result._host.get_name()
@_callback_base_impl
def v2_runner_on_async_failed(self, result: CallbackTaskResult) -> None:
host = result.host.get_name()
# Attempt to get the async job ID. If the job does not finish before the
# async timeout value, the ID may be within the unparsed 'async_result' dict.
jid = result._result.get('ansible_job_id')
if not jid and 'async_result' in result._result:
jid = result._result['async_result'].get('ansible_job_id')
self.runner_on_async_failed(host, result._result, jid)
jid = result.result.get('ansible_job_id')
if not jid and 'async_result' in result.result:
jid = result.result['async_result'].get('ansible_job_id')
self.runner_on_async_failed(host, result.result, jid)
@_callback_base_impl
def v2_playbook_on_start(self, playbook):
self.playbook_on_start()
@_callback_base_impl
def v2_playbook_on_notify(self, handler, host):
self.playbook_on_notify(host, handler)
@_callback_base_impl
def v2_playbook_on_no_hosts_matched(self):
self.playbook_on_no_hosts_matched()
@_callback_base_impl
def v2_playbook_on_no_hosts_remaining(self):
self.playbook_on_no_hosts_remaining()
@_callback_base_impl
def v2_playbook_on_task_start(self, task, is_conditional):
self.playbook_on_task_start(task.name, is_conditional)
# FIXME: not called
@_callback_base_impl
def v2_playbook_on_cleanup_task_start(self, task):
pass # no v1 correspondence
@_callback_base_impl
def v2_playbook_on_handler_task_start(self, task):
pass # no v1 correspondence
@_callback_base_impl
def v2_playbook_on_vars_prompt(self, varname, private=True, prompt=None, encrypt=None, confirm=False, salt_size=None, salt=None, default=None, unsafe=None):
self.playbook_on_vars_prompt(varname, private, prompt, encrypt, confirm, salt_size, salt, default, unsafe)
# FIXME: not called
def v2_playbook_on_import_for_host(self, result, imported_file):
host = result._host.get_name()
@_callback_base_impl
def v2_playbook_on_import_for_host(self, result: CallbackTaskResult, imported_file) -> None:
host = result.host.get_name()
self.playbook_on_import_for_host(host, imported_file)
# FIXME: not called
def v2_playbook_on_not_import_for_host(self, result, missing_file):
host = result._host.get_name()
@_callback_base_impl
def v2_playbook_on_not_import_for_host(self, result: CallbackTaskResult, missing_file) -> None:
host = result.host.get_name()
self.playbook_on_not_import_for_host(host, missing_file)
@_callback_base_impl
def v2_playbook_on_play_start(self, play):
self.playbook_on_play_start(play.name)
@_callback_base_impl
def v2_playbook_on_stats(self, stats):
self.playbook_on_stats(stats)
def v2_on_file_diff(self, result):
if 'diff' in result._result:
host = result._host.get_name()
self.on_file_diff(host, result._result['diff'])
@_callback_base_impl
def v2_on_file_diff(self, result: CallbackTaskResult) -> None:
if 'diff' in result.result:
host = result.host.get_name()
self.on_file_diff(host, result.result['diff'])
@_callback_base_impl
def v2_playbook_on_include(self, included_file):
pass # no v1 correspondence
def v2_runner_item_on_ok(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_item_on_ok(self, result: CallbackTaskResult) -> None:
pass
def v2_runner_item_on_failed(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_item_on_failed(self, result: CallbackTaskResult) -> None:
pass
def v2_runner_item_on_skipped(self, result: TaskResult) -> None:
@_callback_base_impl
def v2_runner_item_on_skipped(self, result: CallbackTaskResult) -> None:
pass
def v2_runner_retry(self, result):
@_callback_base_impl
def v2_runner_retry(self, result: CallbackTaskResult) -> None:
pass
@_callback_base_impl
def v2_runner_on_start(self, host, task):
"""Event used when host begins execution of a task

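A sketch of how callback dispatch could use the `_base_impl` marker set by the decorator above to skip methods that are still the base-class no-ops; the dispatcher function itself is hypothetical.

from ansible.plugins.callback import CallbackBase

def should_dispatch(callback: CallbackBase, method_name: str) -> bool:
    method = getattr(callback, method_name, None)
    if method is None:
        return False
    # methods decorated with @_callback_base_impl carry `_base_impl = True`
    return not getattr(method, '_base_impl', False)
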
@ -21,7 +21,7 @@ DOCUMENTATION = """
from ansible import constants as C
from ansible import context
from ansible.executor.task_result import TaskResult
from ansible.executor.task_result import CallbackTaskResult
from ansible.playbook.task_include import TaskInclude
from ansible.plugins.callback import CallbackBase
from ansible.utils.color import colorize, hostcolor
@ -47,39 +47,39 @@ class CallbackModule(CallbackBase):
self._task_type_cache = {}
super(CallbackModule, self).__init__()
def v2_runner_on_failed(self, result: TaskResult, ignore_errors: bool = False) -> None:
def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
host_label = self.host_label(result)
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._handle_warnings_and_exception(result)
# FIXME: this method should not exist, delegate "suggested keys to display" to the plugin or something... As-is, the placement of this
# call obliterates `results`, which causes a task summary to be printed on loop failures, which we don't do anywhere else.
self._clean_results(result._result, result._task.action)
self._clean_results(result.result, result.task.action)
if result._task.loop and 'results' in result._result:
if result.task.loop and 'results' in result.result:
self._process_items(result)
else:
if self._display.verbosity < 2 and self.get_option('show_task_path_on_failure'):
self._print_task_path(result._task)
msg = "fatal: [%s]: FAILED! => %s" % (host_label, self._dump_results(result._result))
self._print_task_path(result.task)
msg = "fatal: [%s]: FAILED! => %s" % (host_label, self._dump_results(result.result))
self._display.display(msg, color=C.COLOR_ERROR, stderr=self.get_option('display_failed_stderr'))
if ignore_errors:
self._display.display("...ignoring", color=C.COLOR_SKIP)
def v2_runner_on_ok(self, result: TaskResult) -> None:
def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
host_label = self.host_label(result)
if isinstance(result._task, TaskInclude):
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if isinstance(result.task, TaskInclude):
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
return
elif result._result.get('changed', False):
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
elif result.result.get('changed', False):
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
msg = "changed: [%s]" % (host_label,)
color = C.COLOR_CHANGED
@ -87,52 +87,52 @@ class CallbackModule(CallbackBase):
if not self.get_option('display_ok_hosts'):
return
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
msg = "ok: [%s]" % (host_label,)
color = C.COLOR_OK
self._handle_warnings_and_exception(result)
if result._task.loop and 'results' in result._result:
if result.task.loop and 'results' in result.result:
self._process_items(result)
else:
self._clean_results(result._result, result._task.action)
self._clean_results(result.result, result.task.action)
if self._run_is_verbose(result):
msg += " => %s" % (self._dump_results(result._result),)
msg += " => %s" % (self._dump_results(result.result),)
self._display.display(msg, color=color)
def v2_runner_on_skipped(self, result: TaskResult) -> None:
def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
if self.get_option('display_skipped_hosts'):
self._clean_results(result._result, result._task.action)
self._clean_results(result.result, result.task.action)
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._handle_warnings_and_exception(result)
if result._task.loop is not None and 'results' in result._result:
if result.task.loop is not None and 'results' in result.result:
self._process_items(result)
msg = "skipping: [%s]" % result._host.get_name()
msg = "skipping: [%s]" % result.host.get_name()
if self._run_is_verbose(result):
msg += " => %s" % self._dump_results(result._result)
msg += " => %s" % self._dump_results(result.result)
self._display.display(msg, color=C.COLOR_SKIP)
def v2_runner_on_unreachable(self, result: TaskResult) -> None:
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._handle_warnings_and_exception(result)
host_label = self.host_label(result)
msg = "fatal: [%s]: UNREACHABLE! => %s" % (host_label, self._dump_results(result._result))
msg = "fatal: [%s]: UNREACHABLE! => %s" % (host_label, self._dump_results(result.result))
self._display.display(msg, color=C.COLOR_UNREACHABLE, stderr=self.get_option('display_failed_stderr'))
if result._task.ignore_unreachable:
if result.task.ignore_unreachable:
self._display.display("...ignoring", color=C.COLOR_SKIP)
def v2_playbook_on_no_hosts_matched(self):
@@ -222,29 +222,29 @@ class CallbackModule(CallbackBase):
self._display.banner(msg)
def v2_on_file_diff(self, result):
if result._task.loop and 'results' in result._result:
for res in result._result['results']:
def v2_on_file_diff(self, result: CallbackTaskResult) -> None:
if result.task.loop and 'results' in result.result:
for res in result.result['results']:
if 'diff' in res and res['diff'] and res.get('changed', False):
diff = self._get_diff(res['diff'])
if diff:
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._display.display(diff)
elif 'diff' in result._result and result._result['diff'] and result._result.get('changed', False):
diff = self._get_diff(result._result['diff'])
elif 'diff' in result.result and result.result['diff'] and result.result.get('changed', False):
diff = self._get_diff(result.result['diff'])
if diff:
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._display.display(diff)
def v2_runner_item_on_ok(self, result: TaskResult) -> None:
def v2_runner_item_on_ok(self, result: CallbackTaskResult) -> None:
host_label = self.host_label(result)
if isinstance(result._task, TaskInclude):
if isinstance(result.task, TaskInclude):
return
elif result._result.get('changed', False):
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
elif result.result.get('changed', False):
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
msg = 'changed'
color = C.COLOR_CHANGED
@@ -252,47 +252,47 @@ class CallbackModule(CallbackBase):
if not self.get_option('display_ok_hosts'):
return
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
msg = 'ok'
color = C.COLOR_OK
self._handle_warnings_and_exception(result)
msg = "%s: [%s] => (item=%s)" % (msg, host_label, self._get_item_label(result._result))
self._clean_results(result._result, result._task.action)
msg = "%s: [%s] => (item=%s)" % (msg, host_label, self._get_item_label(result.result))
self._clean_results(result.result, result.task.action)
if self._run_is_verbose(result):
msg += " => %s" % self._dump_results(result._result)
msg += " => %s" % self._dump_results(result.result)
self._display.display(msg, color=color)
def v2_runner_item_on_failed(self, result: TaskResult) -> None:
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
def v2_runner_item_on_failed(self, result: CallbackTaskResult) -> None:
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._handle_warnings_and_exception(result)
host_label = self.host_label(result)
msg = "failed: [%s]" % (host_label,)
self._clean_results(result._result, result._task.action)
self._clean_results(result.result, result.task.action)
self._display.display(
msg + " (item=%s) => %s" % (self._get_item_label(result._result), self._dump_results(result._result)),
msg + " (item=%s) => %s" % (self._get_item_label(result.result), self._dump_results(result.result)),
color=C.COLOR_ERROR,
stderr=self.get_option('display_failed_stderr')
)
def v2_runner_item_on_skipped(self, result: TaskResult) -> None:
def v2_runner_item_on_skipped(self, result: CallbackTaskResult) -> None:
if self.get_option('display_skipped_hosts'):
if self._last_task_banner != result._task._uuid:
self._print_task_banner(result._task)
if self._last_task_banner != result.task._uuid:
self._print_task_banner(result.task)
self._handle_warnings_and_exception(result)
self._clean_results(result._result, result._task.action)
msg = "skipping: [%s] => (item=%s) " % (result._host.get_name(), self._get_item_label(result._result))
self._clean_results(result.result, result.task.action)
msg = "skipping: [%s] => (item=%s) " % (result.host.get_name(), self._get_item_label(result.result))
if self._run_is_verbose(result):
msg += " => %s" % self._dump_results(result._result)
msg += " => %s" % self._dump_results(result.result)
self._display.display(msg, color=C.COLOR_SKIP)
def v2_playbook_on_include(self, included_file):
@@ -377,37 +377,37 @@ class CallbackModule(CallbackBase):
if context.CLIARGS['check'] and self.get_option('check_mode_markers'):
self._display.banner("DRY RUN")
def v2_runner_retry(self, result):
task_name = result.task_name or result._task
def v2_runner_retry(self, result: CallbackTaskResult) -> None:
task_name = result.task_name or result.task
host_label = self.host_label(result)
msg = "FAILED - RETRYING: [%s]: %s (%d retries left)." % (host_label, task_name, result._result['retries'] - result._result['attempts'])
msg = "FAILED - RETRYING: [%s]: %s (%d retries left)." % (host_label, task_name, result.result['retries'] - result.result['attempts'])
if self._run_is_verbose(result, verbosity=2):
msg += "Result was: %s" % self._dump_results(result._result)
msg += "Result was: %s" % self._dump_results(result.result)
self._display.display(msg, color=C.COLOR_DEBUG)
def v2_runner_on_async_poll(self, result):
host = result._host.get_name()
jid = result._result.get('ansible_job_id')
started = result._result.get('started')
finished = result._result.get('finished')
def v2_runner_on_async_poll(self, result: CallbackTaskResult) -> None:
host = result.host.get_name()
jid = result.result.get('ansible_job_id')
started = result.result.get('started')
finished = result.result.get('finished')
self._display.display(
'ASYNC POLL on %s: jid=%s started=%s finished=%s' % (host, jid, started, finished),
color=C.COLOR_DEBUG
)
def v2_runner_on_async_ok(self, result):
host = result._host.get_name()
jid = result._result.get('ansible_job_id')
def v2_runner_on_async_ok(self, result: CallbackTaskResult) -> None:
host = result.host.get_name()
jid = result.result.get('ansible_job_id')
self._display.display("ASYNC OK on %s: jid=%s" % (host, jid), color=C.COLOR_DEBUG)
def v2_runner_on_async_failed(self, result):
host = result._host.get_name()
def v2_runner_on_async_failed(self, result: CallbackTaskResult) -> None:
host = result.host.get_name()
# Attempt to get the async job ID. If the job does not finish before the
# async timeout value, the ID may be within the unparsed 'async_result' dict.
jid = result._result.get('ansible_job_id')
if not jid and 'async_result' in result._result:
jid = result._result['async_result'].get('ansible_job_id')
jid = result.result.get('ansible_job_id')
if not jid and 'async_result' in result.result:
jid = result.result['async_result'].get('ansible_job_id')
self._display.display("ASYNC FAILED on %s: jid=%s" % (host, jid), color=C.COLOR_DEBUG)
def v2_playbook_on_notify(self, handler, host):
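The hunks above migrate this stdout callback from the private accessors (result._host, result._task, result._result) to the public CallbackTaskResult properties (result.host, result.task, result.result). A minimal sketch of a hypothetical third-party callback written against the new surface follows; the class name, callback options, and message format are illustrative and not part of this change.

from ansible.plugins.callback import CallbackBase


class ExampleCallback(CallbackBase):
    # illustrative stdout callback using the 2.19 CallbackTaskResult properties
    CALLBACK_VERSION = 2.0
    CALLBACK_TYPE = 'aggregate'
    CALLBACK_NAME = 'example'

    def v2_runner_on_ok(self, result):  # result is a CallbackTaskResult
        host = result.host.get_name()                   # was: result._host.get_name()
        action = result.task.action                     # was: result._task.action
        changed = result.result.get('changed', False)   # was: result._result.get(...)
        self._display.display(f"{host} | {action} | changed={changed}")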

@@ -86,12 +86,14 @@ import decimal
import os
import time
import re
import typing as t
from ansible import constants
from ansible.module_utils.common.messages import ErrorSummary
from ansible.module_utils.common.text.converters import to_bytes, to_text
from ansible.playbook.task import Task
from ansible.plugins.callback import CallbackBase
from ansible.executor.task_result import CallbackTaskResult
from ansible.playbook.included_file import IncludedFile
from ansible.utils._junit_xml import (
TestCase,
TestError,
@@ -184,23 +186,23 @@ class CallbackModule(CallbackBase):
self._task_data[uuid] = TaskData(uuid, name, path, play, action)
def _finish_task(self, status, result):
def _finish_task(self, status: str, result: IncludedFile | CallbackTaskResult) -> None:
""" record the results of a task for a single host """
task_uuid = result._task._uuid
if isinstance(result, CallbackTaskResult):
task_uuid = result.task._uuid
host_uuid = result.host._uuid
host_name = result.host.name
if hasattr(result, '_host'):
host_uuid = result._host._uuid
host_name = result._host.name
if self._fail_on_change == 'true' and status == 'ok' and result.result.get('changed', False):
status = 'failed'
else:
task_uuid = result._task._uuid
host_uuid = 'include'
host_name = 'include'
task_data = self._task_data[task_uuid]
if self._fail_on_change == 'true' and status == 'ok' and result._result.get('changed', False):
status = 'failed'
# ignore failure if expected and toggle result if asked for
if status == 'failed' and 'EXPECTED FAILURE' in task_data.name:
status = 'ok'
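The branch above implements the JUnit callback's naming convention for expected failures. A small illustrative distillation (the helper and task names below are hypothetical):

def _apply_expected_failure(status: str, task_name: str) -> str:
    # mirrors the branch above: a failed task whose name advertises the failure
    # as expected is recorded as ok in the JUnit report
    if status == 'failed' and 'EXPECTED FAILURE' in task_name:
        return 'ok'
    return status


assert _apply_expected_failure('failed', 'EXPECTED FAILURE: reject bad input') == 'ok'
assert _apply_expected_failure('failed', 'install package') == 'failed'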
@@ -233,7 +235,8 @@ class CallbackModule(CallbackBase):
if host_data.status == 'included':
return TestCase(name=name, classname=junit_classname, time=duration, system_out=str(host_data.result))
res = host_data.result._result
task_result = t.cast(CallbackTaskResult, host_data.result)
res = task_result.result
rc = res.get('rc', 0)
dump = self._dump_results(res, indent=0)
dump = self._cleanse_string(dump)
@@ -243,10 +246,8 @@ class CallbackModule(CallbackBase):
test_case = TestCase(name=name, classname=junit_classname, time=duration)
error_summary: ErrorSummary
if host_data.status == 'failed':
if error_summary := res.get('exception'):
if error_summary := task_result.exception:
message = error_summary._format()
output = error_summary.formatted_traceback
test_case.errors.append(TestError(message=message, output=output))
@@ -309,19 +310,19 @@ class CallbackModule(CallbackBase):
def v2_playbook_on_handler_task_start(self, task: Task) -> None:
self._start_task(task)
def v2_runner_on_failed(self, result, ignore_errors=False):
def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors=False) -> None:
if ignore_errors and self._fail_on_ignore != 'true':
self._finish_task('ok', result)
else:
self._finish_task('failed', result)
def v2_runner_on_ok(self, result):
def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
self._finish_task('ok', result)
def v2_runner_on_skipped(self, result):
def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
self._finish_task('skipped', result)
def v2_playbook_on_include(self, included_file):
def v2_playbook_on_include(self, included_file: IncludedFile) -> None:
self._finish_task('included', included_file)
def v2_playbook_on_stats(self, stats):
@@ -347,7 +348,7 @@ class TaskData:
if host.uuid in self.host_data:
if host.status == 'included':
# concatenate task include output from multiple items
host.result = '%s\n%s' % (self.host_data[host.uuid].result, host.result)
host.result = f'{self.host_data[host.uuid].result}\n{host.result}'
else:
raise Exception('%s: %s: %s: duplicate host callback: %s' % (self.path, self.play, self.name, host.name))
@@ -359,7 +360,7 @@ class HostData:
Data about an individual host.
"""
def __init__(self, uuid, name, status, result):
def __init__(self, uuid: str, name: str, status: str, result: IncludedFile | CallbackTaskResult | str) -> None:
self.uuid = uuid
self.name = name
self.status = status

@@ -15,7 +15,7 @@ DOCUMENTATION = """
- result_format_callback
"""
from ansible.executor.task_result import TaskResult
from ansible.executor.task_result import CallbackTaskResult
from ansible.plugins.callback import CallbackBase
from ansible import constants as C
@@ -41,41 +41,41 @@ class CallbackModule(CallbackBase):
return buf + "\n"
def v2_runner_on_failed(self, result: TaskResult, ignore_errors: bool = False) -> None:
def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
self._handle_warnings_and_exception(result)
if result._task.action in C.MODULE_NO_JSON and 'module_stderr' not in result._result:
self._display.display(self._command_generic_msg(result._host.get_name(), result._result, "FAILED"), color=C.COLOR_ERROR)
if result.task.action in C.MODULE_NO_JSON and 'module_stderr' not in result.result:
self._display.display(self._command_generic_msg(result.host.get_name(), result.result, "FAILED"), color=C.COLOR_ERROR)
else:
self._display.display("%s | FAILED! => %s" % (result._host.get_name(), self._dump_results(result._result, indent=4)), color=C.COLOR_ERROR)
self._display.display("%s | FAILED! => %s" % (result.host.get_name(), self._dump_results(result.result, indent=4)), color=C.COLOR_ERROR)
def v2_runner_on_ok(self, result: TaskResult) -> None:
def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
self._handle_warnings_and_exception(result)
self._clean_results(result._result, result._task.action)
self._clean_results(result.result, result.task.action)
if result._result.get('changed', False):
if result.result.get('changed', False):
color = C.COLOR_CHANGED
state = 'CHANGED'
else:
color = C.COLOR_OK
state = 'SUCCESS'
if result._task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result._result:
self._display.display(self._command_generic_msg(result._host.get_name(), result._result, state), color=color)
if result.task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result.result:
self._display.display(self._command_generic_msg(result.host.get_name(), result.result, state), color=color)
else:
self._display.display("%s | %s => %s" % (result._host.get_name(), state, self._dump_results(result._result, indent=4)), color=color)
self._display.display("%s | %s => %s" % (result.host.get_name(), state, self._dump_results(result.result, indent=4)), color=color)
def v2_runner_on_skipped(self, result: TaskResult) -> None:
def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
self._handle_warnings_and_exception(result)
self._display.display("%s | SKIPPED" % (result._host.get_name()), color=C.COLOR_SKIP)
self._display.display("%s | SKIPPED" % (result.host.get_name()), color=C.COLOR_SKIP)
def v2_runner_on_unreachable(self, result: TaskResult) -> None:
def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
self._handle_warnings_and_exception(result)
self._display.display("%s | UNREACHABLE! => %s" % (result._host.get_name(), self._dump_results(result._result, indent=4)), color=C.COLOR_UNREACHABLE)
self._display.display("%s | UNREACHABLE! => %s" % (result.host.get_name(), self._dump_results(result.result, indent=4)), color=C.COLOR_UNREACHABLE)
def v2_on_file_diff(self, result):
if 'diff' in result._result and result._result['diff']:
self._display.display(self._get_diff(result._result['diff']))
if 'diff' in result.result and result.result['diff']:
self._display.display(self._get_diff(result.result['diff']))

@@ -16,6 +16,8 @@ DOCUMENTATION = """
from ansible import constants as C
from ansible.plugins.callback import CallbackBase
from ansible.template import Templar
from ansible.executor.task_result import CallbackTaskResult
from ansible.module_utils._internal import _deprecator
class CallbackModule(CallbackBase):
@@ -31,7 +33,12 @@ class CallbackModule(CallbackBase):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._display.deprecated('The oneline callback plugin is deprecated.', version='2.23')
self._display.deprecated( # pylint: disable=ansible-deprecated-unnecessary-collection-name
msg='The oneline callback plugin is deprecated.',
version='2.23',
deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR, # entire plugin being removed; this improves the messaging
)
def _command_generic_msg(self, hostname, result, caption):
stdout = result.get('stdout', '').replace('\n', '\\n').replace('\r', '\\r')
@@ -41,9 +48,9 @@ class CallbackModule(CallbackBase):
else:
return "%s | %s | rc=%s | (stdout) %s" % (hostname, caption, result.get('rc', -1), stdout)
def v2_runner_on_failed(self, result, ignore_errors=False):
if 'exception' in result._result:
error_text = Templar().template(result._result['exception']) # transform to a string
def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
if 'exception' in result.result:
error_text = Templar().template(result.result['exception']) # transform to a string
if self._display.verbosity < 3:
# extract just the actual error message from the exception text
error = error_text.strip().split('\n')[-1]
@@ -51,31 +58,31 @@ class CallbackModule(CallbackBase):
else:
msg = "An exception occurred during task execution. The full traceback is:\n" + error_text.replace('\n', '')
if result._task.action in C.MODULE_NO_JSON and 'module_stderr' not in result._result:
self._display.display(self._command_generic_msg(result._host.get_name(), result._result, 'FAILED'), color=C.COLOR_ERROR)
if result.task.action in C.MODULE_NO_JSON and 'module_stderr' not in result.result:
self._display.display(self._command_generic_msg(result.host.get_name(), result.result, 'FAILED'), color=C.COLOR_ERROR)
else:
self._display.display(msg, color=C.COLOR_ERROR)
self._display.display("%s | FAILED! => %s" % (result._host.get_name(), self._dump_results(result._result, indent=0).replace('\n', '')),
self._display.display("%s | FAILED! => %s" % (result.host.get_name(), self._dump_results(result.result, indent=0).replace('\n', '')),
color=C.COLOR_ERROR)
def v2_runner_on_ok(self, result):
def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
if result._result.get('changed', False):
if result.result.get('changed', False):
color = C.COLOR_CHANGED
state = 'CHANGED'
else:
color = C.COLOR_OK
state = 'SUCCESS'
if result._task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result._result:
self._display.display(self._command_generic_msg(result._host.get_name(), result._result, state), color=color)
if result.task.action in C.MODULE_NO_JSON and 'ansible_job_id' not in result.result:
self._display.display(self._command_generic_msg(result.host.get_name(), result.result, state), color=color)
else:
self._display.display("%s | %s => %s" % (result._host.get_name(), state, self._dump_results(result._result, indent=0).replace('\n', '')),
self._display.display("%s | %s => %s" % (result.host.get_name(), state, self._dump_results(result.result, indent=0).replace('\n', '')),
color=color)
def v2_runner_on_unreachable(self, result):
self._display.display("%s | UNREACHABLE!: %s" % (result._host.get_name(), result._result.get('msg', '')), color=C.COLOR_UNREACHABLE)
def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
self._display.display("%s | UNREACHABLE!: %s" % (result.host.get_name(), result.result.get('msg', '')), color=C.COLOR_UNREACHABLE)
def v2_runner_on_skipped(self, result):
self._display.display("%s | SKIPPED" % (result._host.get_name()), color=C.COLOR_SKIP)
def v2_runner_on_skipped(self, result: CallbackTaskResult) -> None:
self._display.display("%s | SKIPPED" % (result.host.get_name()), color=C.COLOR_SKIP)

@@ -30,9 +30,11 @@ DOCUMENTATION = """
import os
from ansible.constants import TREE_DIR
from ansible.executor.task_result import CallbackTaskResult
from ansible.module_utils.common.text.converters import to_bytes, to_text
from ansible.plugins.callback import CallbackBase
from ansible.utils.path import makedirs_safe, unfrackpath
from ansible.module_utils._internal import _deprecator
class CallbackModule(CallbackBase):
@@ -47,7 +49,12 @@ class CallbackModule(CallbackBase):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._display.deprecated('The tree callback plugin is deprecated.', version='2.23')
self._display.deprecated( # pylint: disable=ansible-deprecated-unnecessary-collection-name
msg='The tree callback plugin is deprecated.',
version='2.23',
deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR, # entire plugin being removed; this improves the messaging
)
def set_options(self, task_keys=None, var_options=None, direct=None):
""" override to set self.tree """
@@ -76,14 +83,14 @@ class CallbackModule(CallbackBase):
except (OSError, IOError) as e:
self._display.warning(u"Unable to write to %s's file: %s" % (hostname, to_text(e)))
def result_to_tree(self, result):
self.write_tree_file(result._host.get_name(), self._dump_results(result._result))
def result_to_tree(self, result: CallbackTaskResult) -> None:
self.write_tree_file(result.host.get_name(), self._dump_results(result.result))
def v2_runner_on_ok(self, result):
def v2_runner_on_ok(self, result: CallbackTaskResult) -> None:
self.result_to_tree(result)
def v2_runner_on_failed(self, result, ignore_errors=False):
def v2_runner_on_failed(self, result: CallbackTaskResult, ignore_errors: bool = False) -> None:
self.result_to_tree(result)
def v2_runner_on_unreachable(self, result):
def v2_runner_on_unreachable(self, result: CallbackTaskResult) -> None:
self.result_to_tree(result)

@@ -252,7 +252,7 @@ class Connection(ConnectionBase):
def _become_success_timeout(self) -> int:
"""Timeout value for become success in seconds."""
if (timeout := self.get_option('become_success_timeout')) < 1:
timeout = C.config.get_configuration_definitions('connection', 'local')['become_success_timeout']['default']
timeout = C.config.get_config_default('become_success_timeout', plugin_type='connection', plugin_name='local')
return timeout

@@ -248,11 +248,13 @@ from ansible.errors import (
AnsibleError,
AnsibleFileNotFound,
)
from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
from ansible.module_utils.compat.paramiko import _PARAMIKO_IMPORT_ERR as PARAMIKO_IMPORT_ERR, _paramiko as paramiko
from ansible.plugins.connection import ConnectionBase
from ansible.utils.display import Display
from ansible.utils.path import makedirs_safe
from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
from ansible.module_utils._internal import _deprecator
display = Display()
@@ -327,7 +329,12 @@ class Connection(ConnectionBase):
_log_channel: str | None = None
def __init__(self, *args, **kwargs):
display.deprecated('The paramiko connection plugin is deprecated.', version='2.21')
display.deprecated( # pylint: disable=ansible-deprecated-unnecessary-collection-name
msg='The paramiko connection plugin is deprecated.',
version='2.21',
deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR, # entire plugin being removed; this improves the messaging
)
super().__init__(*args, **kwargs)
def _cache_key(self) -> str:

@@ -29,7 +29,7 @@ attributes:
platforms: all
until:
description: Denotes if this action obeys until/retry/poll keywords
support: full
support: none
tags:
description: Allows for the 'tags' keyword to control the selection of this action for execution
support: full

@@ -26,7 +26,7 @@ from jinja2.filters import do_map, do_select, do_selectattr, do_reject, do_rejec
from jinja2.environment import Environment
from ansible._internal._templating import _lazy_containers
from ansible.errors import AnsibleFilterError, AnsibleTypeError
from ansible.errors import AnsibleFilterError, AnsibleTypeError, AnsibleTemplatePluginError
from ansible.module_utils.datatag import native_type_name
from ansible.module_utils.common.json import get_encoder, get_decoder
from ansible.module_utils.six import string_types, integer_types, text_type
@@ -115,7 +115,10 @@ def to_bool(value: object) -> bool:
result = value_to_check == 1 # backwards compatibility with the old code which checked: value in ('yes', 'on', '1', 'true', 1)
# NB: update the doc string to reflect reality once this fallback is removed
display.deprecated(f'The `bool` filter coerced invalid value {value!r} ({native_type_name(value)}) to {result!r}.', version='2.23')
display.deprecated(
msg=f'The `bool` filter coerced invalid value {value!r} ({native_type_name(value)}) to {result!r}.',
version='2.23',
)
return result
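For context, the deprecated coercion being warned about above reduces to an equality check against 1. A hedged sketch (the helper is illustrative; the real filter first normalizes recognized string and numeric values before reaching this fallback):

def _legacy_bool_fallback(value_to_check):
    # backwards-compatible coercion for values the filter does not recognize;
    # the real filter emits the deprecation warning shown above when this runs
    return value_to_check == 1


assert _legacy_bool_fallback(1) is True
assert _legacy_bool_fallback(2) is False         # coerced to False, with a warning
assert _legacy_bool_fallback('enabled') is False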
@@ -405,6 +408,13 @@ def comment(text, style='plain', **kw):
}
}
if style not in comment_styles:
raise AnsibleTemplatePluginError(
message=f"Invalid style {style!r}.",
help_text=f"Available styles: {', '.join(comment_styles)}",
obj=style,
)
# Pointer to the right comment type
style_params = comment_styles[style]
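A usage sketch of the new validation path; this assumes the filter function remains importable from its current module, and the style name is deliberately invalid:

from ansible.errors import AnsibleTemplatePluginError
from ansible.plugins.filter.core import comment

try:
    comment('managed by ansible', style='fancy')   # 'fancy' is not a known style
except AnsibleTemplatePluginError as exc:
    # message: "Invalid style 'fancy'."; the help text lists the available styles
    print(exc)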

@@ -28,7 +28,7 @@ from collections.abc import Mapping
from ansible import template as _template
from ansible.errors import AnsibleError, AnsibleParserError, AnsibleValueOmittedError
from ansible.inventory.group import to_safe_group_name as original_safe
from ansible.module_utils._internal import _plugin_exec_context
from ansible.module_utils._internal import _plugin_info
from ansible.parsing.utils.addresses import parse_address
from ansible.parsing.dataloader import DataLoader
from ansible.plugins import AnsiblePlugin, _ConfigurablePlugin
@@ -314,7 +314,7 @@ class BaseFileInventoryPlugin(_BaseInventoryPlugin):
super(BaseFileInventoryPlugin, self).__init__()
class Cacheable(_plugin_exec_context.HasPluginInfo, _ConfigurablePlugin):
class Cacheable(_plugin_info.HasPluginInfo, _ConfigurablePlugin):
"""Mixin for inventory plugins which support caching."""
_cache: CachePluginAdjudicator

@@ -29,7 +29,7 @@ from ansible.module_utils.common.text.converters import to_bytes, to_text, to_na
from ansible.module_utils.six import string_types
from ansible.parsing.yaml.loader import AnsibleLoader
from ansible._internal._yaml._loader import AnsibleInstrumentedLoader
from ansible.plugins import get_plugin_class, MODULE_CACHE, PATH_CACHE, PLUGIN_PATH_CACHE
from ansible.plugins import get_plugin_class, MODULE_CACHE, PATH_CACHE, PLUGIN_PATH_CACHE, AnsibleJinja2Plugin
from ansible.utils.collection_loader import AnsibleCollectionConfig, AnsibleCollectionRef
from ansible.utils.collection_loader._collection_finder import _AnsibleCollectionFinder, _get_collection_metadata
from ansible.utils.display import Display
@@ -135,29 +135,44 @@ class PluginPathContext(object):
class PluginLoadContext(object):
def __init__(self):
self.original_name = None
self.redirect_list = []
self.error_list = []
self.import_error_list = []
self.load_attempts = []
self.pending_redirect = None
self.exit_reason = None
self.plugin_resolved_path = None
self.plugin_resolved_name = None
self.plugin_resolved_collection = None # empty string for resolved plugins from user-supplied paths
self.deprecated = False
self.removal_date = None
self.removal_version = None
self.deprecation_warnings = []
self.resolved = False
self._resolved_fqcn = None
self.action_plugin = None
def __init__(self, plugin_type: str, legacy_package_name: str) -> None:
self.original_name: str | None = None
self.redirect_list: list[str] = []
self.raw_error_list: list[Exception] = []
"""All exception instances encountered during the plugin load."""
self.error_list: list[str] = []
"""Stringified exceptions, excluding import errors."""
self.import_error_list: list[Exception] = []
"""All ImportError exception instances encountered during the plugin load."""
self.load_attempts: list[str] = []
self.pending_redirect: str | None = None
self.exit_reason: str | None = None
self.plugin_resolved_path: str | None = None
self.plugin_resolved_name: str | None = None
"""For collection plugins, the resolved Python module FQ __name__; for non-collections, the short name."""
self.plugin_resolved_collection: str | None = None # empty string for resolved plugins from user-supplied paths
"""For collection plugins, the resolved collection {ns}.{col}; empty string for non-collection plugins."""
self.deprecated: bool = False
self.removal_date: str | None = None
self.removal_version: str | None = None
self.deprecation_warnings: list[str] = []
self.resolved: bool = False
self._resolved_fqcn: str | None = None
self.action_plugin: str | None = None
self._plugin_type: str = plugin_type
"""The type of the plugin."""
self._legacy_package_name = legacy_package_name
"""The legacy sys.modules package name from the plugin loader instance; stored to prevent potentially incorrect manual computation."""
self._python_module_name: str | None = None
"""
The fully qualified Python module name for the plugin (accessible via `sys.modules`).
For non-collection non-core plugins, this may include a non-existent synthetic package element with a hash of the file path to avoid collisions.
"""
@property
def resolved_fqcn(self):
def resolved_fqcn(self) -> str | None:
if not self.resolved:
return
return None
if not self._resolved_fqcn:
final_plugin = self.redirect_list[-1]
@@ -169,7 +184,7 @@ class PluginLoadContext(object):
return self._resolved_fqcn
def record_deprecation(self, name, deprecation, collection_name):
def record_deprecation(self, name: str, deprecation: dict[str, t.Any] | None, collection_name: str) -> t.Self:
if not deprecation:
return self
@@ -183,7 +198,12 @@ class PluginLoadContext(object):
removal_version = None
warning_text = '{0} has been deprecated.{1}{2}'.format(name, ' ' if warning_text else '', warning_text)
display.deprecated(warning_text, date=removal_date, version=removal_version, collection_name=collection_name)
display.deprecated( # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
msg=warning_text,
date=removal_date,
version=removal_version,
deprecator=PluginInfo._from_collection_name(collection_name),
)
self.deprecated = True
if removal_date:
@@ -193,28 +213,79 @@ class PluginLoadContext(object):
self.deprecation_warnings.append(warning_text)
return self
def resolve(self, resolved_name, resolved_path, resolved_collection, exit_reason, action_plugin):
def resolve(self, resolved_name: str, resolved_path: str, resolved_collection: str, exit_reason: str, action_plugin: str) -> t.Self:
"""Record a resolved collection plugin."""
self.pending_redirect = None
self.plugin_resolved_name = resolved_name
self.plugin_resolved_path = resolved_path
self.plugin_resolved_collection = resolved_collection
self.exit_reason = exit_reason
self._python_module_name = resolved_name
self.resolved = True
self.action_plugin = action_plugin
return self
def resolve_legacy(self, name: str, pull_cache: dict[str, PluginPathContext]) -> t.Self:
"""Record a resolved legacy plugin."""
plugin_path_context = pull_cache[name]
self.plugin_resolved_name = name
self.plugin_resolved_path = plugin_path_context.path
self.plugin_resolved_collection = 'ansible.builtin' if plugin_path_context.internal else ''
self._resolved_fqcn = 'ansible.builtin.' + name if plugin_path_context.internal else name
self._python_module_name = self._make_legacy_python_module_name()
self.resolved = True
return self
def resolve_legacy_jinja_plugin(self, name: str, known_plugin: AnsibleJinja2Plugin) -> t.Self:
"""Record a resolved legacy Jinja plugin."""
internal = known_plugin.ansible_name.startswith('ansible.builtin.')
self.plugin_resolved_name = name
self.plugin_resolved_path = known_plugin._original_path
self.plugin_resolved_collection = 'ansible.builtin' if internal else ''
self._resolved_fqcn = known_plugin.ansible_name
self._python_module_name = self._make_legacy_python_module_name()
self.resolved = True
return self
def redirect(self, redirect_name):
def redirect(self, redirect_name: str) -> t.Self:
self.pending_redirect = redirect_name
self.exit_reason = 'pending redirect resolution from {0} to {1}'.format(self.original_name, redirect_name)
self.resolved = False
return self
def nope(self, exit_reason):
def nope(self, exit_reason: str) -> t.Self:
self.pending_redirect = None
self.exit_reason = exit_reason
self.resolved = False
return self
def _make_legacy_python_module_name(self) -> str:
"""
Generate a fully-qualified Python module name for a legacy/builtin plugin.
The same package namespace is shared for builtin and legacy plugins.
Explicit requests for builtins via `ansible.builtin` are handled elsewhere with an aliased collection package resolved by the collection loader.
Only unqualified and `ansible.legacy`-qualified requests land here; whichever plugin is visible at the time will end up in sys.modules.
Filter and test plugin host modules receive special name suffixes to avoid collisions unrelated to the actual plugin name.
"""
name = os.path.splitext(self.plugin_resolved_path)[0]
basename = os.path.basename(name)
if self._plugin_type in ('filter', 'test'):
# Unlike other plugin types, filter and test plugin names are independent of the file where they are defined.
# As a result, the Python module name must be derived from the full path of the plugin.
# This prevents accidental shadowing of unrelated plugins of the same type.
basename += f'_{abs(hash(self.plugin_resolved_path))}'
return f'{self._legacy_package_name}.{basename}'
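An illustrative walk-through of the naming scheme described above; the loader package and plugin path are assumptions made for the example:

# a filter host file resolved from a user plugin directory
ctx = PluginLoadContext('filter', 'ansible.plugins.filter')
ctx.plugin_resolved_path = '/home/user/.ansible/plugins/filter/my_filters.py'
# yields something like 'ansible.plugins.filter.my_filters_<hash-of-path>'
print(ctx._make_legacy_python_module_name())

# any other legacy plugin type uses the bare basename, so a lookup file named
# items.py would map to '<legacy package>.items'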
class PluginLoader:
"""
@@ -224,7 +295,15 @@ class PluginLoader:
paths, and the python path. The first match is used.
"""
def __init__(self, class_name, package, config, subdir, aliases=None, required_base_class=None):
def __init__(
self,
class_name: str,
package: str,
config: str | list[str],
subdir: str,
aliases: dict[str, str] | None = None,
required_base_class: str | None = None,
) -> None:
aliases = {} if aliases is None else aliases
self.class_name = class_name
@@ -250,15 +329,15 @@ class PluginLoader:
PLUGIN_PATH_CACHE[class_name] = defaultdict(dict)
# hold dirs added at runtime outside of config
self._extra_dirs = []
self._extra_dirs: list[str] = []
# caches
self._module_cache = MODULE_CACHE[class_name]
self._paths = PATH_CACHE[class_name]
self._plugin_path_cache = PLUGIN_PATH_CACHE[class_name]
self._plugin_instance_cache = {} if self.subdir == 'vars_plugins' else None
self._plugin_instance_cache: dict[str, tuple[object, PluginLoadContext]] | None = {} if self.subdir == 'vars_plugins' else None
self._searched_paths = set()
self._searched_paths: set[str] = set()
@property
def type(self):
@@ -488,7 +567,13 @@ class PluginLoader:
entry = collection_meta.get('plugin_routing', {}).get(plugin_type, {}).get(subdir_qualified_resource, None)
return entry
def _find_fq_plugin(self, fq_name, extension, plugin_load_context, ignore_deprecated=False):
def _find_fq_plugin(
self,
fq_name: str,
extension: str | None,
plugin_load_context: PluginLoadContext,
ignore_deprecated: bool = False,
) -> PluginLoadContext:
"""Search builtin paths to find a plugin. No external paths are searched,
meaning plugins inside roles inside collections will be ignored.
"""
@@ -525,17 +610,13 @@ class PluginLoader:
version=removal_version,
date=removal_date,
removed=True,
plugin=PluginInfo(
requested_name=acr.collection,
resolved_name=acr.collection,
type='collection',
),
deprecator=PluginInfo._from_collection_name(acr.collection),
)
plugin_load_context.removal_date = removal_date
plugin_load_context.removal_version = removal_version
plugin_load_context.date = removal_date
plugin_load_context.version = removal_version
plugin_load_context.resolved = True
plugin_load_context.exit_reason = removed_msg
raise AnsiblePluginRemovedError(removed_msg, plugin_load_context=plugin_load_context)
raise AnsiblePluginRemovedError(message=removed_msg, plugin_load_context=plugin_load_context)
redirect = routing_metadata.get('redirect', None)
@@ -623,7 +704,7 @@ class PluginLoader:
collection_list: list[str] | None = None,
) -> PluginLoadContext:
""" Find a plugin named name, returning contextual info about the load, recursively resolving redirection """
plugin_load_context = PluginLoadContext()
plugin_load_context = PluginLoadContext(self.type, self.package)
plugin_load_context.original_name = name
while True:
result = self._resolve_plugin_step(name, mod_type, ignore_deprecated, check_aliases, collection_list, plugin_load_context=plugin_load_context)
@@ -636,11 +717,8 @@ class PluginLoader:
else:
break
# TODO: smuggle these to the controller when we're in a worker, reduce noise from normal things like missing plugin packages during collection search
if plugin_load_context.error_list:
display.warning("errors were encountered during the plugin load for {0}:\n{1}".format(name, plugin_load_context.error_list))
# TODO: display/return import_error_list? Only useful for forensics...
for ex in plugin_load_context.raw_error_list:
display.error_as_warning(f"Error loading plugin {name!r}.", ex)
# FIXME: store structured deprecation data in PluginLoadContext and use display.deprecate
# if plugin_load_context.deprecated and C.config.get_config_value('DEPRECATION_WARNINGS'):
@@ -650,9 +728,15 @@ class PluginLoader:
return plugin_load_context
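A hedged usage sketch of the load-context surface these hunks refine; it assumes the public find_plugin_with_context wrapper and uses the built-in lookup loader purely as an example:

from ansible.plugins.loader import lookup_loader

ctx = lookup_loader.find_plugin_with_context('ansible.builtin.items')
if ctx.resolved:
    print(ctx.resolved_fqcn)           # e.g. 'ansible.builtin.items'
    print(ctx.plugin_resolved_path)    # filesystem path of the resolved plugin file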
# FIXME: name bikeshed
def _resolve_plugin_step(self, name, mod_type='', ignore_deprecated=False,
check_aliases=False, collection_list=None, plugin_load_context=PluginLoadContext()):
def _resolve_plugin_step(
self,
name: str,
mod_type: str = '',
ignore_deprecated: bool = False,
check_aliases: bool = False,
collection_list: list[str] | None = None,
plugin_load_context: PluginLoadContext | None = None,
) -> PluginLoadContext:
if not plugin_load_context:
raise ValueError('A PluginLoadContext is required')
@@ -707,11 +791,14 @@ class PluginLoader:
except (AnsiblePluginRemovedError, AnsiblePluginCircularRedirect, AnsibleCollectionUnsupportedVersionError):
# these are generally fatal, let them fly
raise
except ImportError as ie:
plugin_load_context.import_error_list.append(ie)
except Exception as ex:
# FIXME: keep actual errors, not just assembled messages
plugin_load_context.error_list.append(to_native(ex))
plugin_load_context.raw_error_list.append(ex)
# DTFIX-RELEASE: can we deprecate/remove these stringified versions?
if isinstance(ex, ImportError):
plugin_load_context.import_error_list.append(ex)
else:
plugin_load_context.error_list.append(str(ex))
if plugin_load_context.error_list:
display.debug(msg='plugin lookup for {0} failed; errors: {1}'.format(name, '; '.join(plugin_load_context.error_list)))
@@ -737,13 +824,7 @@ class PluginLoader:
# requested mod_type
pull_cache = self._plugin_path_cache[suffix]
try:
path_with_context = pull_cache[name]
plugin_load_context.plugin_resolved_path = path_with_context.path
plugin_load_context.plugin_resolved_name = name
plugin_load_context.plugin_resolved_collection = 'ansible.builtin' if path_with_context.internal else ''
plugin_load_context._resolved_fqcn = ('ansible.builtin.' + name if path_with_context.internal else name)
plugin_load_context.resolved = True
return plugin_load_context
return plugin_load_context.resolve_legacy(name=name, pull_cache=pull_cache)
except KeyError:
# Cache miss. Now let's find the plugin
pass
@@ -796,13 +877,7 @@ class PluginLoader:
self._searched_paths.add(path)
try:
path_with_context = pull_cache[name]
plugin_load_context.plugin_resolved_path = path_with_context.path
plugin_load_context.plugin_resolved_name = name
plugin_load_context.plugin_resolved_collection = 'ansible.builtin' if path_with_context.internal else ''
plugin_load_context._resolved_fqcn = 'ansible.builtin.' + name if path_with_context.internal else name
plugin_load_context.resolved = True
return plugin_load_context
return plugin_load_context.resolve_legacy(name=name, pull_cache=pull_cache)
except KeyError:
# Didn't find the plugin in this directory. Load modules from the next one
pass
@@ -810,18 +885,18 @@ class PluginLoader:
# if nothing is found, try finding alias/deprecated
if not name.startswith('_'):
alias_name = '_' + name
# We've already cached all the paths at this point
if alias_name in pull_cache:
path_with_context = pull_cache[alias_name]
if not ignore_deprecated and not os.path.islink(path_with_context.path):
# FIXME: this is not always the case, some are just aliases
display.deprecated('%s is kept for backwards compatibility but usage is discouraged. ' # pylint: disable=ansible-deprecated-no-version
'The module documentation details page may explain more about this rationale.' % name.lstrip('_'))
plugin_load_context.plugin_resolved_path = path_with_context.path
plugin_load_context.plugin_resolved_name = alias_name
plugin_load_context.plugin_resolved_collection = 'ansible.builtin' if path_with_context.internal else ''
plugin_load_context._resolved_fqcn = 'ansible.builtin.' + alias_name if path_with_context.internal else alias_name
plugin_load_context.resolved = True
try:
plugin_load_context.resolve_legacy(name=alias_name, pull_cache=pull_cache)
except KeyError:
pass
else:
display.deprecated(
msg=f'Plugin {name!r} automatically redirected to {alias_name!r}.',
help_text=f'Use {alias_name!r} instead of {name!r} to refer to the plugin.',
version='2.23',
)
return plugin_load_context
# last ditch, if it's something that can be redirected, look for a builtin redirect before giving up
@@ -831,7 +906,7 @@ class PluginLoader:
return plugin_load_context.nope('{0} is not eligible for last-chance resolution'.format(name))
def has_plugin(self, name, collection_list=None):
def has_plugin(self, name: str, collection_list: list[str] | None = None) -> bool:
""" Checks if a plugin named name exists """
try:
@@ -842,41 +917,37 @@ class PluginLoader:
# log and continue, likely an innocuous type/package loading failure in collections import
display.debug('has_plugin error: {0}'.format(to_text(ex)))
__contains__ = has_plugin
def _load_module_source(self, name, path):
return False
# avoid collisions across plugins
if name.startswith('ansible_collections.'):
full_name = name
else:
full_name = '.'.join([self.package, name])
__contains__ = has_plugin
if full_name in sys.modules:
def _load_module_source(self, *, python_module_name: str, path: str) -> types.ModuleType:
if python_module_name in sys.modules:
# Avoids double loading, See https://github.com/ansible/ansible/issues/13110
return sys.modules[full_name]
return sys.modules[python_module_name]
with warnings.catch_warnings():
# FIXME: this still has issues if the module was previously imported but not "cached",
# we should bypass this entire codepath for things that are directly importable
warnings.simplefilter("ignore", RuntimeWarning)
spec = importlib.util.spec_from_file_location(to_native(full_name), to_native(path))
spec = importlib.util.spec_from_file_location(to_native(python_module_name), to_native(path))
module = importlib.util.module_from_spec(spec)
# mimic import machinery; make the module-being-loaded available in sys.modules during import
# and remove if there's a failure...
sys.modules[full_name] = module
sys.modules[python_module_name] = module
try:
spec.loader.exec_module(module)
except Exception:
del sys.modules[full_name]
del sys.modules[python_module_name]
raise
return module
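The method above follows the standard importlib recipe for executing a source file under an explicit module name. A standalone distillation of that pattern, for reference (the function name is illustrative):

import importlib.util
import sys


def load_source(python_module_name: str, path: str):
    # reuse an already-imported module to avoid double loading
    if python_module_name in sys.modules:
        return sys.modules[python_module_name]
    spec = importlib.util.spec_from_file_location(python_module_name, path)
    module = importlib.util.module_from_spec(spec)
    # like the real import machinery, expose the module during exec and roll back on failure
    sys.modules[python_module_name] = module
    try:
        spec.loader.exec_module(module)
    except Exception:
        del sys.modules[python_module_name]
        raise
    return module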
def _update_object(
self,
*,
obj: _AnsiblePluginInfoMixin,
name: str,
path: str,
@@ -907,9 +978,9 @@ class PluginLoader:
is_core_plugin = ctx.plugin_load_context.plugin_resolved_collection == 'ansible.builtin'
if self.class_name == 'StrategyModule' and not is_core_plugin:
display.deprecated( # pylint: disable=ansible-deprecated-no-version
'Use of strategy plugins not included in ansible.builtin are deprecated and do not carry '
msg='Use of strategy plugins not included in ansible.builtin are deprecated and do not carry '
'any backwards compatibility guarantees. No alternative for third party strategy plugins '
'is currently planned.'
'is currently planned.',
)
return ctx.object
@@ -936,8 +1007,6 @@ class PluginLoader:
return get_with_context_result(None, plugin_load_context)
fq_name = plugin_load_context.resolved_fqcn
if '.' not in fq_name and plugin_load_context.plugin_resolved_collection:
fq_name = '.'.join((plugin_load_context.plugin_resolved_collection, fq_name))
resolved_type_name = plugin_load_context.plugin_resolved_name
path = plugin_load_context.plugin_resolved_path
if (cached_result := (self._plugin_instance_cache or {}).get(fq_name)) and cached_result[1].resolved:
@ -947,7 +1016,7 @@ class PluginLoader:
redirected_names = plugin_load_context.redirect_list or []
if path not in self._module_cache:
self._module_cache[path] = self._load_module_source(resolved_type_name, path)
self._module_cache[path] = self._load_module_source(python_module_name=plugin_load_context._python_module_name, path=path)
found_in_cache = False
self._load_config_defs(resolved_type_name, self._module_cache[path], path)
@@ -974,7 +1043,7 @@ class PluginLoader:
# A plugin may need to use its _load_name in __init__ (for example, to set
# or get options from config), so update the object before using the constructor
instance = object.__new__(obj)
self._update_object(instance, resolved_type_name, path, redirected_names, fq_name)
self._update_object(obj=instance, name=resolved_type_name, path=path, redirected_names=redirected_names, resolved=fq_name)
obj.__init__(instance, *args, **kwargs) # pylint: disable=unnecessary-dunder-call
obj = instance
except TypeError as e:
@@ -984,12 +1053,12 @@ class PluginLoader:
return get_with_context_result(None, plugin_load_context)
raise
self._update_object(obj, resolved_type_name, path, redirected_names, fq_name)
self._update_object(obj=obj, name=resolved_type_name, path=path, redirected_names=redirected_names, resolved=fq_name)
if self._plugin_instance_cache is not None and getattr(obj, 'is_stateless', False):
self._plugin_instance_cache[fq_name] = (obj, plugin_load_context)
elif self._plugin_instance_cache is not None:
# The cache doubles as the load order, so record the FQCN even if the plugin hasn't set is_stateless = True
self._plugin_instance_cache[fq_name] = (None, PluginLoadContext())
self._plugin_instance_cache[fq_name] = (None, PluginLoadContext(self.type, self.package))
return get_with_context_result(obj, plugin_load_context)
def _display_plugin_load(self, class_name, name, searched_paths, path, found_in_cache=None, class_only=None):
@@ -1064,10 +1133,15 @@ class PluginLoader:
basename = os.path.basename(name)
is_j2 = isinstance(self, Jinja2Loader)
if path in legacy_excluding_builtin:
fqcn = basename
else:
fqcn = f"ansible.builtin.{basename}"
if is_j2:
ref_name = path
else:
ref_name = basename
ref_name = fqcn
if not is_j2 and basename in _PLUGIN_FILTERS[self.package]:
# j2 plugins get processed in own class, here they would just be container files
@@ -1090,26 +1164,18 @@ class PluginLoader:
yield path
continue
if path in legacy_excluding_builtin:
fqcn = basename
else:
fqcn = f"ansible.builtin.{basename}"
if (cached_result := (self._plugin_instance_cache or {}).get(fqcn)) and cached_result[1].resolved:
# Here just in case, but we don't call all() multiple times for vars plugins, so this should not be used.
yield cached_result[0]
continue
if path not in self._module_cache:
if self.type in ('filter', 'test'):
# filter and test plugin files can contain multiple plugins
# they must have a unique python module name to prevent them from shadowing each other
full_name = '{0}_{1}'.format(abs(hash(path)), basename)
else:
full_name = basename
path_context = PluginPathContext(path, path not in legacy_excluding_builtin)
load_context = PluginLoadContext(self.type, self.package)
load_context.resolve_legacy(basename, {basename: path_context})
try:
module = self._load_module_source(full_name, path)
module = self._load_module_source(python_module_name=load_context._python_module_name, path=path)
except Exception as e:
display.warning("Skipping plugin (%s), cannot load: %s" % (path, to_text(e)))
continue
@@ -1147,7 +1213,7 @@ class PluginLoader:
except TypeError as e:
display.warning("Skipping plugin (%s) as it seems to be incomplete: %s" % (path, to_text(e)))
self._update_object(obj, basename, path, resolved=fqcn)
self._update_object(obj=obj, name=basename, path=path, resolved=fqcn)
if self._plugin_instance_cache is not None:
needs_enabled = False
@@ -1239,7 +1305,7 @@ class Jinja2Loader(PluginLoader):
try:
# use 'parent' loader class to find files, but cannot return this as it can contain multiple plugins per file
if plugin_path not in self._module_cache:
self._module_cache[plugin_path] = self._load_module_source(full_name, plugin_path)
self._module_cache[plugin_path] = self._load_module_source(python_module_name=full_name, path=plugin_path)
module = self._module_cache[plugin_path]
obj = getattr(module, self.class_name)
except Exception as e:
@@ -1262,7 +1328,7 @@ class Jinja2Loader(PluginLoader):
plugin = self._plugin_wrapper_type(func)
if plugin in plugins:
continue
self._update_object(plugin, full, plugin_path, resolved=fq_name)
self._update_object(obj=plugin, name=full, path=plugin_path, resolved=fq_name)
plugins.append(plugin)
return plugins
@@ -1276,7 +1342,7 @@ class Jinja2Loader(PluginLoader):
requested_name = name
context = PluginLoadContext()
context = PluginLoadContext(self.type, self.package)
# avoid collection path for legacy
name = name.removeprefix('ansible.legacy.')
@@ -1288,11 +1354,8 @@ class Jinja2Loader(PluginLoader):
if isinstance(known_plugin, _DeferredPluginLoadFailure):
raise known_plugin.ex
context.resolved = True
context.plugin_resolved_name = name
context.plugin_resolved_path = known_plugin._original_path
context.plugin_resolved_collection = 'ansible.builtin' if known_plugin.ansible_name.startswith('ansible.builtin.') else ''
context._resolved_fqcn = known_plugin.ansible_name
context.resolve_legacy_jinja_plugin(name, known_plugin)
return get_with_context_result(known_plugin, context)
plugin = None
@@ -1328,7 +1391,12 @@ class Jinja2Loader(PluginLoader):
warning_text = f'{self.type.title()} "{key}" has been deprecated.{" " if warning_text else ""}{warning_text}'
display.deprecated(warning_text, version=removal_version, date=removal_date, collection_name=acr.collection)
display.deprecated( # pylint: disable=ansible-deprecated-date-not-permitted,ansible-deprecated-unnecessary-collection-name
msg=warning_text,
version=removal_version,
date=removal_date,
deprecator=PluginInfo._from_collection_name(acr.collection),
)
# check removal
tombstone_entry = routing_entry.get('tombstone')
@@ -1343,11 +1411,7 @@ class Jinja2Loader(PluginLoader):
version=removal_version,
date=removal_date,
removed=True,
plugin=PluginInfo(
requested_name=acr.collection,
resolved_name=acr.collection,
type='collection',
),
deprecator=PluginInfo._from_collection_name(acr.collection),
)
raise AnsiblePluginRemovedError(exc_msg)
@@ -1400,7 +1464,7 @@ class Jinja2Loader(PluginLoader):
plugin = self._plugin_wrapper_type(func)
if plugin:
context = plugin_impl.plugin_load_context
self._update_object(plugin, requested_name, plugin_impl.object._original_path, resolved=fq_name)
self._update_object(obj=plugin, name=requested_name, path=plugin_impl.object._original_path, resolved=fq_name)
# context will have filename, which for tests/filters might not be correct
context._resolved_fqcn = plugin.ansible_name
# FIXME: once we start caching these results, we'll be missing functions that would have loaded later

@@ -230,8 +230,8 @@ class LookupModule(LookupBase):
display.vvvv("url lookup connecting to %s" % term)
if self.get_option('follow_redirects') in ('yes', 'no'):
display.deprecated(
"Using 'yes' or 'no' for 'follow_redirects' parameter is deprecated.",
version='2.22'
msg="Using 'yes' or 'no' for 'follow_redirects' parameter is deprecated.",
version='2.22',
)
try:
response = open_url(

@@ -26,6 +26,7 @@ import sys
import threading
import time
import typing as t
import collections.abc as _c
from collections import deque
@@ -35,13 +36,13 @@ from ansible import context
from ansible.errors import AnsibleError, AnsibleFileNotFound, AnsibleParserError, AnsibleTemplateError
from ansible.executor.play_iterator import IteratingStates, PlayIterator
from ansible.executor.process.worker import WorkerProcess
from ansible.executor.task_result import TaskResult
from ansible.executor.task_result import _RawTaskResult, _WireTaskResult
from ansible.executor.task_queue_manager import CallbackSend, DisplaySend, PromptSend, TaskQueueManager
from ansible.module_utils.six import string_types
from ansible.module_utils.common.text.converters import to_text
from ansible.module_utils.connection import Connection, ConnectionError
from ansible.playbook.handler import Handler
from ansible.playbook.helpers import load_list_of_blocks
from ansible.playbook.included_file import IncludedFile
from ansible.playbook.task import Task
from ansible.playbook.task_include import TaskInclude
from ansible.plugins import loader as plugin_loader
@@ -89,7 +90,9 @@ def _get_item_vars(result, task):
return item_vars
def results_thread_main(strategy):
def results_thread_main(strategy: StrategyBase) -> None:
value: object
while True:
try:
result = strategy._final_q.get()
@@ -99,13 +102,10 @@ def results_thread_main(strategy):
dmethod = getattr(display, result.method)
dmethod(*result.args, **result.kwargs)
elif isinstance(result, CallbackSend):
for arg in result.args:
if isinstance(arg, TaskResult):
strategy.normalize_task_result(arg)
break
strategy._tqm.send_callback(result.method_name, *result.args, **result.kwargs)
elif isinstance(result, TaskResult):
strategy.normalize_task_result(result)
task_result = strategy._convert_wire_task_result_to_raw(result.wire_task_result)
strategy._tqm.send_callback(result.method_name, task_result)
elif isinstance(result, _WireTaskResult):
result = strategy._convert_wire_task_result_to_raw(result)
with strategy._results_lock:
strategy._results.append(result)
elif isinstance(result, PromptSend):
@@ -137,7 +137,7 @@ def results_thread_main(strategy):
def debug_closure(func):
"""Closure to wrap ``StrategyBase._process_pending_results`` and invoke the task debugger"""
@functools.wraps(func)
def inner(self, iterator, one_pass=False, max_passes=None):
def inner(self, iterator: PlayIterator, one_pass: bool = False, max_passes: int | None = None) -> list[_RawTaskResult]:
status_to_stats_map = (
('is_failed', 'failures'),
('is_unreachable', 'dark'),
@@ -148,12 +148,12 @@ def debug_closure(func):
# We don't know the host yet, copy the previous states, for lookup after we process new results
prev_host_states = iterator.host_states.copy()
results = func(self, iterator, one_pass=one_pass, max_passes=max_passes)
_processed_results = []
results: list[_RawTaskResult] = func(self, iterator, one_pass=one_pass, max_passes=max_passes)
_processed_results: list[_RawTaskResult] = []
for result in results:
task = result._task
host = result._host
task = result.task
host = result.host
_queued_task_args = self._queued_task_cache.pop((host.name, task._uuid), None)
task_vars = _queued_task_args['task_vars']
play_context = _queued_task_args['play_context']
@@ -239,7 +239,7 @@ class StrategyBase:
# outstanding tasks still in queue
self._blocked_hosts: dict[str, bool] = dict()
self._results: deque[TaskResult] = deque()
self._results: deque[_RawTaskResult] = deque()
self._results_lock = threading.Condition(threading.Lock())
# create the result processing thread for reading results in the background
@@ -249,7 +249,7 @@ class StrategyBase:
# holds the list of active (persistent) connections to be shutdown at
# play completion
self._active_connections: dict[str, str] = dict()
self._active_connections: dict[Host, str] = dict()
# Caches for get_host calls, to avoid calling excessively
# These values should be set at the top of the ``run`` method of each
@@ -447,39 +447,33 @@ class StrategyBase:
for target_host in host_list:
_set_host_facts(target_host, always_facts)
def normalize_task_result(self, task_result):
"""Normalize a TaskResult to reference actual Host and Task objects
when only given the ``Host.name``, or the ``Task._uuid``
Only the ``Host.name`` and ``Task._uuid`` are commonly sent back from
the ``TaskExecutor`` or ``WorkerProcess`` due to performance concerns
Mutates the original object
"""
if isinstance(task_result._host, string_types):
# If the value is a string, it is ``Host.name``
task_result._host = self._inventory.get_host(to_text(task_result._host))
def _convert_wire_task_result_to_raw(self, wire_task_result: _WireTaskResult) -> _RawTaskResult:
"""Return a `_RawTaskResult` created from a `_WireTaskResult`."""
host = self._inventory.get_host(wire_task_result.host_name)
queue_cache_entry = (host.name, wire_task_result.task_uuid)
if isinstance(task_result._task, string_types):
# If the value is a string, it is ``Task._uuid``
queue_cache_entry = (task_result._host.name, task_result._task)
try:
found_task = self._queued_task_cache[queue_cache_entry]['task']
except KeyError:
# This should only happen due to an implicit task created by the
# TaskExecutor, restrict this behavior to the explicit use case
# of an implicit async_status task
if task_result._task_fields.get('action') != 'async_status':
if wire_task_result.task_fields.get('action') != 'async_status':
raise
original_task = Task()
task = Task()
else:
original_task = found_task.copy(exclude_parent=True, exclude_tasks=True)
original_task._parent = found_task._parent
original_task.from_attrs(task_result._task_fields)
task_result._task = original_task
task = found_task.copy(exclude_parent=True, exclude_tasks=True)
task._parent = found_task._parent
task.from_attrs(wire_task_result.task_fields)
return task_result
return _RawTaskResult(
host=host,
task=task,
return_data=wire_task_result.return_data,
task_fields=wire_task_result.task_fields,
)
def search_handlers_by_notification(self, notification: str, iterator: PlayIterator) -> t.Generator[Handler, None, None]:
handlers = [h for b in reversed(iterator._play.handlers) for h in b.block]
@ -537,7 +531,7 @@ class StrategyBase:
yield handler
@debug_closure
def _process_pending_results(self, iterator: PlayIterator, one_pass: bool = False, max_passes: int | None = None) -> list[TaskResult]:
def _process_pending_results(self, iterator: PlayIterator, one_pass: bool = False, max_passes: int | None = None) -> list[_RawTaskResult]:
"""
Reads results off the final queue and takes appropriate action
based on the result (executing callbacks, updating state, etc.).
@ -553,8 +547,8 @@ class StrategyBase:
finally:
self._results_lock.release()
original_host = task_result._host
original_task: Task = task_result._task
original_host = task_result.host
original_task: Task = task_result.task
# all host status messages contain 2 entries: (msg, task_result)
role_ran = False
@ -588,7 +582,7 @@ class StrategyBase:
original_host.name,
dict(
ansible_failed_task=original_task.serialize(),
ansible_failed_result=task_result._result,
ansible_failed_result=task_result._return_data,
),
)
else:
@ -596,7 +590,7 @@ class StrategyBase:
else:
self._tqm._stats.increment('ok', original_host.name)
self._tqm._stats.increment('ignored', original_host.name)
if 'changed' in task_result._result and task_result._result['changed']:
if task_result.is_changed():
self._tqm._stats.increment('changed', original_host.name)
self._tqm.send_callback('v2_runner_on_failed', task_result, ignore_errors=ignore_errors)
elif task_result.is_unreachable():
@ -618,9 +612,9 @@ class StrategyBase:
if original_task.loop:
# this task had a loop, and has more than one result, so
# loop over all of them instead of a single result
result_items = task_result._result.get('results', [])
result_items = task_result._loop_results
else:
result_items = [task_result._result]
result_items = [task_result._return_data]
for result_item in result_items:
if '_ansible_notify' in result_item and task_result.is_changed():
@ -665,7 +659,7 @@ class StrategyBase:
if 'add_host' in result_item or 'add_group' in result_item:
item_vars = _get_item_vars(result_item, original_task)
found_task_vars = self._queued_task_cache.get((original_host.name, task_result._task._uuid))['task_vars']
found_task_vars = self._queued_task_cache.get((original_host.name, task_result.task._uuid))['task_vars']
if item_vars:
all_task_vars = combine_vars(found_task_vars, item_vars)
else:
@ -680,17 +674,17 @@ class StrategyBase:
original_task._resolve_conditional(original_task.failed_when, all_task_vars))
if original_task.loop or original_task.loop_with:
new_item_result = TaskResult(
task_result._host,
task_result._task,
new_item_result = _RawTaskResult(
task_result.host,
task_result.task,
result_item,
task_result._task_fields,
task_result.task_fields,
)
self._tqm.send_callback('v2_runner_item_on_ok', new_item_result)
if result_item.get('changed', False):
task_result._result['changed'] = True
task_result._return_data['changed'] = True
if result_item.get('failed', False):
task_result._result['failed'] = True
task_result._return_data['failed'] = True
if 'ansible_facts' in result_item and original_task.action not in C._ACTION_DEBUG:
# if delegated fact and we are delegating facts, we need to change target host for them
@ -738,13 +732,13 @@ class StrategyBase:
else:
self._tqm._stats.set_custom_stats(k, data[k], myhost)
if 'diff' in task_result._result:
if 'diff' in task_result._return_data:
if self._diff or getattr(original_task, 'diff', False):
self._tqm.send_callback('v2_on_file_diff', task_result)
if not isinstance(original_task, TaskInclude):
self._tqm._stats.increment('ok', original_host.name)
if 'changed' in task_result._result and task_result._result['changed']:
if task_result.is_changed():
self._tqm._stats.increment('changed', original_host.name)
# finally, send the ok for this task
@ -754,7 +748,7 @@ class StrategyBase:
if original_task.register:
host_list = self.get_task_hosts(iterator, original_host, original_task)
clean_copy = strip_internal_keys(module_response_deepcopy(task_result._result))
clean_copy = strip_internal_keys(module_response_deepcopy(task_result._return_data))
if 'invocation' in clean_copy:
del clean_copy['invocation']
@ -805,7 +799,7 @@ class StrategyBase:
return ret_results
def _copy_included_file(self, included_file):
def _copy_included_file(self, included_file: IncludedFile) -> IncludedFile:
"""
A proven safe and performant way to create a copy of an included file
"""
@ -818,7 +812,7 @@ class StrategyBase:
return ti_copy
def _load_included_file(self, included_file, iterator, is_handler=False, handle_stats_and_callbacks=True):
def _load_included_file(self, included_file: IncludedFile, iterator, is_handler=False, handle_stats_and_callbacks=True):
"""
Loads an included YAML file of tasks, applying the optional set of variables.
@ -828,12 +822,12 @@ class StrategyBase:
"""
if handle_stats_and_callbacks:
display.deprecated(
"Reporting play recap stats and running callbacks functionality for "
msg="Reporting play recap stats and running callbacks functionality for "
"``include_tasks`` in ``StrategyBase._load_included_file`` is deprecated. "
"See ``https://github.com/ansible/ansible/pull/79260`` for guidance on how to "
"move the reporting into specific strategy plugins to account for "
"``include_role`` tasks as well.",
version="2.21"
version="2.21",
)
display.debug("loading included file: %s" % included_file._filename)
try:
@ -865,11 +859,11 @@ class StrategyBase:
else:
reason = to_text(e)
if handle_stats_and_callbacks:
for r in included_file._results:
r._result['failed'] = True
for tr in included_file._results:
tr._return_data['failed'] = True
for host in included_file._hosts:
tr = TaskResult(host=host, task=included_file._task, return_data=dict(failed=True, reason=reason))
tr = _RawTaskResult(host=host, task=included_file._task, return_data=dict(failed=True, reason=reason), task_fields={})
self._tqm._stats.increment('failures', host.name)
self._tqm.send_callback('v2_runner_on_failed', tr)
raise AnsibleError(reason) from e
@ -905,7 +899,7 @@ class StrategyBase:
def _cond_not_supported_warn(self, task_name):
display.warning("%s task does not support when conditional" % task_name)
def _execute_meta(self, task: Task, play_context, iterator, target_host):
def _execute_meta(self, task: Task, play_context, iterator, target_host: Host):
task.resolved_action = 'ansible.builtin.meta' # _post_validate_args is never called for meta actions, so resolved_action hasn't been set
# meta tasks store their args in the _raw_params field of args,
@ -1083,7 +1077,7 @@ class StrategyBase:
else:
display.vv(f"META: {header}")
res = TaskResult(target_host, task, result)
res = _RawTaskResult(target_host, task, result, {})
if skipped:
self._tqm.send_callback('v2_runner_on_skipped', res)
return [res]
@ -1103,14 +1097,14 @@ class StrategyBase:
hosts_left.append(self._inventory.get_host(host))
return hosts_left
def update_active_connections(self, results):
def update_active_connections(self, results: _c.Iterable[_RawTaskResult]) -> None:
""" updates the current active persistent connections """
for r in results:
if 'args' in r._task_fields:
socket_path = r._task_fields['args'].get('_ansible_socket')
if 'args' in r.task_fields:
socket_path = r.task_fields['args'].get('_ansible_socket')
if socket_path:
if r._host not in self._active_connections:
self._active_connections[r._host] = socket_path
if r.host not in self._active_connections:
self._active_connections[r.host] = socket_path
class NextAction(object):

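The hunks above replace the old `normalize_task_result` path with `_convert_wire_task_result_to_raw`: workers send back only the host name, the task UUID, the raw return data, and the task fields, and the controller rehydrates full Host/Task objects from its inventory and queued-task cache. The sketch below illustrates that pattern with hypothetical stand-in classes (WireResult, RawResult, Host, Task); it is not the internal `_WireTaskResult`/`_RawTaskResult` implementation.

# Minimal, hypothetical sketch of the wire-result rehydration pattern shown above.
from __future__ import annotations

import dataclasses
import typing as t


@dataclasses.dataclass(frozen=True)
class Host:
    name: str


@dataclasses.dataclass(frozen=True)
class Task:
    uuid: str
    action: str


@dataclasses.dataclass(frozen=True)
class WireResult:
    # only cheap identifiers cross the worker -> controller boundary
    host_name: str
    task_uuid: str
    return_data: dict[str, t.Any]
    task_fields: dict[str, t.Any]


@dataclasses.dataclass(frozen=True)
class RawResult:
    # fully rehydrated objects, rebuilt controller-side from caches
    host: Host
    task: Task
    return_data: dict[str, t.Any]
    task_fields: dict[str, t.Any]


def convert(wire: WireResult, hosts: dict[str, Host], queued_tasks: dict[tuple[str, str], Task]) -> RawResult:
    host = hosts[wire.host_name]
    task = queued_tasks[(host.name, wire.task_uuid)]
    return RawResult(host=host, task=task, return_data=wire.return_data, task_fields=wire.task_fields)


if __name__ == '__main__':
    hosts = {'web1': Host('web1')}
    tasks = {('web1', 'abc123'): Task('abc123', 'ping')}
    wire = WireResult('web1', 'abc123', {'changed': False, 'ping': 'pong'}, {'action': 'ping'})
    print(convert(wire, hosts, tasks))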
@ -252,11 +252,11 @@ class StrategyModule(StrategyBase):
# FIXME: send the error to the callback; don't directly write to display here
display.error(ex)
for r in included_file._results:
r._result['failed'] = True
r._result['reason'] = str(ex)
self._tqm._stats.increment('failures', r._host.name)
r._return_data['failed'] = True
r._return_data['reason'] = str(ex)
self._tqm._stats.increment('failures', r.host.name)
self._tqm.send_callback('v2_runner_on_failed', r)
failed_includes_hosts.add(r._host)
failed_includes_hosts.add(r.host)
continue
else:
# since we skip incrementing the stats when the task result is

@ -40,6 +40,8 @@ from ansible.utils.display import Display
from ansible.inventory.host import Host
from ansible.playbook.task import Task
from ansible.executor.play_iterator import PlayIterator
from ansible.playbook.play_context import PlayContext
from ansible.executor import task_result as _task_result
display = Display()
@ -92,7 +94,7 @@ class StrategyModule(StrategyBase):
return host_tasks
def run(self, iterator, play_context):
def run(self, iterator, play_context: PlayContext): # type: ignore[override]
"""
The linear strategy is simple - get the next task and queue
it for all hosts, then wait for the queue to drain before
@ -100,7 +102,7 @@ class StrategyModule(StrategyBase):
"""
# iterate over each task, while there is one left to run
result = self._tqm.RUN_OK
result = int(self._tqm.RUN_OK)
work_to_do = True
self._set_hosts_cache(iterator._play)
@ -125,7 +127,7 @@ class StrategyModule(StrategyBase):
# flag set if task is set to any_errors_fatal
any_errors_fatal = False
results = []
results: list[_task_result._RawTaskResult] = []
for (host, task) in host_tasks:
if self._tqm._terminated:
break
@ -285,11 +287,11 @@ class StrategyModule(StrategyBase):
# FIXME: send the error to the callback; don't directly write to display here
display.error(ex)
for r in included_file._results:
r._result['failed'] = True
r._result['reason'] = str(ex)
self._tqm._stats.increment('failures', r._host.name)
r._return_data['failed'] = True
r._return_data['reason'] = str(ex)
self._tqm._stats.increment('failures', r.host.name)
self._tqm.send_callback('v2_runner_on_failed', r)
failed_includes_hosts.add(r._host)
failed_includes_hosts.add(r.host)
else:
# since we skip incrementing the stats when the task result is
# first processed, we do so now for each host in the list
@ -320,9 +322,9 @@ class StrategyModule(StrategyBase):
unreachable_hosts = []
for res in results:
if res.is_failed():
failed_hosts.append(res._host.name)
failed_hosts.append(res.host.name)
elif res.is_unreachable():
unreachable_hosts.append(res._host.name)
unreachable_hosts.append(res.host.name)
if any_errors_fatal and (failed_hosts or unreachable_hosts):
for host in hosts_left:

@ -49,7 +49,7 @@ def timedout(result):
""" Test if task result yields a time out"""
if not isinstance(result, MutableMapping):
raise errors.AnsibleFilterError("The 'timedout' test expects a dictionary")
return result.get('timedout', False) and result['timedout'].get('period', False)
return result.get('timedout', False) and bool(result['timedout'].get('period', False))
def failed(result):

@ -17,6 +17,6 @@
from __future__ import annotations
__version__ = '2.19.0.dev0'
__version__ = '2.19.0b3'
__author__ = 'Ansible, Inc.'
__codename__ = "What Is and What Should Never Be"

@ -28,7 +28,7 @@ if _t.TYPE_CHECKING: # pragma: nocover
_display: _t.Final[_Display] = _Display()
_UNSET = _t.cast(_t.Any, ...)
_UNSET = _t.cast(_t.Any, object())
_TTrustable = _t.TypeVar('_TTrustable', bound=str | _io.IOBase | _t.TextIO | _t.BinaryIO)
_TRUSTABLE_TYPES = (str, _io.IOBase)
@ -171,7 +171,8 @@ class Templar:
variables=self._engine._variables if available_variables is None else available_variables,
)
templar._overrides = self._overrides.merge(context_overrides)
# backward compatibility: filter out None values from overrides, even though it is a valid value for some of them
templar._overrides = self._overrides.merge({key: value for key, value in context_overrides.items() if value is not None})
if searchpath is not None:
templar._engine.environment.loader.searchpath = searchpath
@ -198,7 +199,7 @@ class Templar:
available_variables=self._engine,
)
kwargs = dict(
target_args = dict(
searchpath=searchpath,
available_variables=available_variables,
)
@ -207,13 +208,14 @@ class Templar:
previous_overrides = self._overrides
try:
for key, value in kwargs.items():
for key, value in target_args.items():
if value is not None:
target = targets[key]
original[key] = getattr(target, key)
setattr(target, key, value)
self._overrides = self._overrides.merge(context_overrides)
# backward compatibility: filter out None values from overrides, even though it is a valid value for some of them
self._overrides = self._overrides.merge({key: value for key, value in context_overrides.items() if value is not None})
yield
finally:
@ -386,7 +388,7 @@ def generate_ansible_template_vars(path: str, fullpath: str | None = None, dest_
value=ansible_managed,
msg="The `ansible_managed` variable is deprecated.",
help_text="Define and use a custom variable instead.",
removal_version='2.23',
version='2.23',
)
temp_vars = dict(

@ -24,11 +24,13 @@ def _meta_yml_to_dict(yaml_string_data: bytes | str, content_id):
import yaml
try:
from yaml import CSafeLoader as SafeLoader
from yaml import CBaseLoader as BaseLoader
except (ImportError, AttributeError):
from yaml import SafeLoader # type: ignore[assignment]
from yaml import BaseLoader # type: ignore[assignment]
routing_dict = yaml.load(yaml_string_data, Loader=SafeLoader)
# Using BaseLoader ensures that all scalars are strings.
# Doing so avoids parsing unquoted versions as floats, dates as datetime.date, etc.
routing_dict = yaml.load(yaml_string_data, Loader=BaseLoader)
if not routing_dict:
routing_dict = {}
if not isinstance(routing_dict, Mapping):

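The hunk above switches the collection routing metadata parse from SafeLoader to BaseLoader so every scalar comes back as a string. The short illustration below shows standard PyYAML behavior with made-up keys (not the real meta/runtime.yml schema):

# Why BaseLoader is used above: it keeps all scalars as strings,
# while SafeLoader coerces them into Python floats, dates, etc.
import yaml

doc = """
redirect_version: 2.10          # unquoted, looks numeric
deprecation_date: 2025-01-01    # unquoted, looks like a date
"""

print(yaml.load(doc, Loader=yaml.SafeLoader))
# {'redirect_version': 2.1, 'deprecation_date': datetime.date(2025, 1, 1)}

print(yaml.load(doc, Loader=yaml.BaseLoader))
# {'redirect_version': '2.10', 'deprecation_date': '2025-01-01'}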
@ -18,7 +18,6 @@
from __future__ import annotations
import dataclasses
import datetime
try:
import curses
@ -50,9 +49,10 @@ from functools import wraps
from struct import unpack, pack
from ansible import constants as C
from ansible.constants import config
from ansible.errors import AnsibleAssertionError, AnsiblePromptInterrupt, AnsiblePromptNoninteractive, AnsibleError
from ansible._internal._errors import _utils
from ansible.module_utils._internal import _ambient_context, _plugin_exec_context
from ansible.module_utils._internal import _ambient_context, _deprecator
from ansible.module_utils.common.text.converters import to_bytes, to_text
from ansible._internal._datatag._tags import TrustedAsTemplate
from ansible.module_utils.common.messages import ErrorSummary, WarningSummary, DeprecationSummary, Detail, SummaryBase, PluginInfo
@ -76,8 +76,6 @@ _LIBC.wcswidth.argtypes = (ctypes.c_wchar_p, ctypes.c_int)
# Max for c_int
_MAX_INT = 2 ** (ctypes.sizeof(ctypes.c_int) * 8 - 1) - 1
_UNSET = t.cast(t.Any, ...)
MOVE_TO_BOL = b'\r'
CLEAR_TO_EOL = b'\x1b[K'
@ -555,7 +553,7 @@ class Display(metaclass=Singleton):
msg: str,
version: str | None = None,
removed: bool = False,
date: str | datetime.date | None = None,
date: str | None = None,
collection_name: str | None = None,
) -> str:
"""Return a deprecation message and help text for non-display purposes (e.g., exception messages)."""
@ -570,7 +568,7 @@ class Display(metaclass=Singleton):
version=version,
removed=removed,
date=date,
plugin=_plugin_exec_context.PluginExecContext.get_current_plugin_info(),
deprecator=PluginInfo._from_collection_name(collection_name),
)
if removed:
@ -582,57 +580,63 @@ class Display(metaclass=Singleton):
def _get_deprecation_message_with_plugin_info(
self,
*,
msg: str,
version: str | None = None,
version: str | None,
removed: bool = False,
date: str | datetime.date | None = None,
plugin: PluginInfo | None = None,
date: str | None,
deprecator: PluginInfo | None,
) -> str:
"""Internal use only. Return a deprecation message and help text for display."""
msg = msg.strip()
if msg and msg[-1] not in ['!', '?', '.']:
msg += '.'
# DTFIX-RELEASE: the logic for omitting date/version doesn't apply to the payload, so it shows up in vars in some cases when it should not
if removed:
removal_fragment = 'This feature was removed'
help_text = 'Please update your playbooks.'
else:
removal_fragment = 'This feature will be removed'
help_text = ''
if plugin:
from_fragment = f'from the {self._describe_plugin_info(plugin)}'
if not deprecator or deprecator.type == _deprecator.INDETERMINATE_DEPRECATOR.type:
collection = None
plugin_fragment = ''
elif deprecator.type == _deprecator.PluginInfo._COLLECTION_ONLY_TYPE:
collection = deprecator.resolved_name
plugin_fragment = ''
else:
from_fragment = ''
parts = deprecator.resolved_name.split('.')
plugin_name = parts[-1]
# DTFIX-RELEASE: normalize 'modules' -> 'module' before storing it so we can eliminate the normalization here
plugin_type = "module" if deprecator.type in ("module", "modules") else f'{deprecator.type} plugin'
if date:
when = 'in a release after {0}.'.format(date)
elif version:
when = 'in version {0}.'.format(version)
else:
when = 'in a future release.'
collection = '.'.join(parts[:2]) if len(parts) > 2 else None
plugin_fragment = f'{plugin_type} {plugin_name!r}'
message_text = ' '.join(f for f in [msg, removal_fragment, from_fragment, when, help_text] if f)
if collection and plugin_fragment:
plugin_fragment += ' in'
return message_text
if collection == 'ansible.builtin':
collection_fragment = 'ansible-core'
elif collection:
collection_fragment = f'collection {collection!r}'
else:
collection_fragment = ''
@staticmethod
def _describe_plugin_info(plugin_info: PluginInfo) -> str:
"""Return a brief description of the plugin info, including name(s) and type."""
name = repr(plugin_info.resolved_name)
clarification = f' (requested as {plugin_info.requested_name!r})' if plugin_info.requested_name != plugin_info.resolved_name else ''
if plugin_info.type in ("module", "modules"):
# DTFIX-RELEASE: pluginloader or AnsiblePlugin needs a "type desc" property that doesn't suffer from legacy "inconsistencies" like this
plugin_type = "module"
elif plugin_info.type == "collection":
# not a real plugin type, but used for tombstone errors generated by plugin loader
plugin_type = plugin_info.type
if not collection:
when_fragment = 'in the future' if not removed else ''
elif date:
when_fragment = f'in a release after {date}'
elif version:
when_fragment = f'version {version}'
else:
when_fragment = 'in a future release' if not removed else ''
if plugin_fragment or collection_fragment:
from_fragment = 'from'
else:
plugin_type = f'{plugin_info.type} plugin'
from_fragment = ''
return f'{name} {plugin_type}{clarification}'
deprecation_msg = ' '.join(f for f in [removal_fragment, from_fragment, plugin_fragment, collection_fragment, when_fragment] if f) + '.'
return _join_sentences(msg, deprecation_msg)
def _wrap_message(self, msg: str, wrap_text: bool) -> str:
if wrap_text and self._wrap_stderr:
@ -661,20 +665,24 @@ class Display(metaclass=Singleton):
msg: str,
version: str | None = None,
removed: bool = False,
date: str | datetime.date | None = None,
collection_name: str | None = _UNSET,
date: str | None = None,
collection_name: str | None = None,
*,
deprecator: PluginInfo | None = None,
help_text: str | None = None,
obj: t.Any = None,
) -> None:
"""Display a deprecation warning message, if enabled."""
# deprecated: description='enable the deprecation message for collection_name' core_version='2.23'
# if collection_name is not _UNSET:
# self.deprecated('The `collection_name` argument to `deprecated` is deprecated.', version='2.27')
"""
Display a deprecation warning message, if enabled.
Most callers do not need to provide `collection_name` or `deprecator` -- but provide only one if needed.
Specify `version` or `date`, but not both.
If `date` is a string, it must be in the form `YYYY-MM-DD`.
"""
# DTFIX-RELEASE: are there any deprecation calls where the feature is switching from enabled to disabled, rather than being removed entirely?
# DTFIX-RELEASE: are there deprecated features which should going through deferred deprecation instead?
_skip_stackwalk = True
self._deprecated_with_plugin_info(
msg=msg,
version=version,
@ -682,37 +690,36 @@ class Display(metaclass=Singleton):
date=date,
help_text=help_text,
obj=obj,
plugin=_plugin_exec_context.PluginExecContext.get_current_plugin_info(),
deprecator=_deprecator.get_best_deprecator(deprecator=deprecator, collection_name=collection_name),
)
def _deprecated_with_plugin_info(
self,
*,
msg: str,
version: str | None = None,
version: str | None,
removed: bool = False,
date: str | datetime.date | None = None,
*,
help_text: str | None = None,
obj: t.Any = None,
plugin: PluginInfo | None = None,
date: str | None,
help_text: str | None,
obj: t.Any,
deprecator: PluginInfo | None,
) -> None:
"""
This is the internal pre-proxy half of the `deprecated` implementation.
Any logic that must occur on workers needs to be implemented here.
"""
_skip_stackwalk = True
if removed:
raise AnsibleError(self._get_deprecation_message_with_plugin_info(
formatted_msg = self._get_deprecation_message_with_plugin_info(
msg=msg,
version=version,
removed=removed,
date=date,
plugin=plugin,
))
deprecator=deprecator,
)
if not _DeferredWarningContext.deprecation_warnings_enabled():
return
self.warning('Deprecation warnings can be disabled by setting `deprecation_warnings=False` in ansible.cfg.')
raise AnsibleError(formatted_msg)
if source_context := _utils.SourceContext.from_value(obj):
formatted_source_context = str(source_context)
@ -728,8 +735,8 @@ class Display(metaclass=Singleton):
),
),
version=version,
date=str(date) if isinstance(date, datetime.date) else date,
plugin=plugin,
date=date,
deprecator=deprecator,
formatted_traceback=_traceback.maybe_capture_traceback(_traceback.TracebackEvent.DEPRECATED),
)
@ -746,6 +753,11 @@ class Display(metaclass=Singleton):
# This is the post-proxy half of the `deprecated` implementation.
# Any logic that must occur in the primary controller process needs to be implemented here.
if not _DeferredWarningContext.deprecation_warnings_enabled():
return
self.warning('Deprecation warnings can be disabled by setting `deprecation_warnings=False` in ansible.cfg.')
msg = format_message(warning)
msg = f'[DEPRECATION WARNING]: {msg}'
@ -983,8 +995,8 @@ class Display(metaclass=Singleton):
msg: str,
private: bool = False,
seconds: int | None = None,
interrupt_input: c.Container[bytes] | None = None,
complete_input: c.Container[bytes] | None = None,
interrupt_input: c.Iterable[bytes] | None = None,
complete_input: c.Iterable[bytes] | None = None,
) -> bytes:
if self._final_q:
from ansible.executor.process.worker import current_worker
@ -1039,8 +1051,8 @@ class Display(metaclass=Singleton):
self,
echo: bool = False,
seconds: int | None = None,
interrupt_input: c.Container[bytes] | None = None,
complete_input: c.Container[bytes] | None = None,
interrupt_input: c.Iterable[bytes] | None = None,
complete_input: c.Iterable[bytes] | None = None,
) -> bytes:
if self._final_q:
raise NotImplementedError
@ -1225,20 +1237,70 @@ def _get_message_lines(message: str, help_text: str | None, formatted_source_con
return message_lines
def _join_sentences(first: str | None, second: str | None) -> str:
"""Join two sentences together."""
first = (first or '').strip()
second = (second or '').strip()
if first and first[-1] not in ('!', '?', '.'):
first += '.'
if second and second[-1] not in ('!', '?', '.'):
second += '.'
if first and not second:
return first
if not first and second:
return second
return ' '.join((first, second))
def format_message(summary: SummaryBase) -> str:
details: t.Sequence[Detail]
details: c.Sequence[Detail] = summary.details
if isinstance(summary, DeprecationSummary):
details = [detail if idx else dataclasses.replace(
detail,
msg=_display._get_deprecation_message_with_plugin_info(
if isinstance(summary, DeprecationSummary) and details:
# augment the first detail element for deprecations to include additional diagnostic info and help text
detail_list = list(details)
detail = detail_list[0]
deprecation_msg = _display._get_deprecation_message_with_plugin_info(
msg=detail.msg,
version=summary.version,
date=summary.date,
plugin=summary.plugin,
),
) for idx, detail in enumerate(summary.details)]
else:
details = summary.details
deprecator=summary.deprecator,
)
detail_list[0] = dataclasses.replace(
detail,
msg=deprecation_msg,
help_text=detail.help_text,
)
details = detail_list
return _format_error_details(details, summary.formatted_traceback)
def _report_config_warnings(deprecator: PluginInfo) -> None:
"""Called by config to report warnings/deprecations collected during a config parse."""
while config.WARNINGS:
warn = config.WARNINGS.pop()
_display.warning(warn)
while config.DEPRECATED:
# tuple with name and options
dep = config.DEPRECATED.pop(0)
msg = config.get_deprecated_msg_from_config(dep[1]).replace("\t", "")
_display.deprecated( # pylint: disable=ansible-deprecated-unnecessary-collection-name,ansible-invalid-deprecated-version
msg=f"{dep[0]} option. {msg}",
version=dep[1]['version'],
deprecator=deprecator,
)
# emit any warnings or deprecations
# in the event config fails before display is up, we'll lose warnings -- but that's OK, since everything is broken anyway
_report_config_warnings(_deprecator.ANSIBLE_CORE_DEPRECATOR)

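Per the updated `Display.deprecated()` docstring above, callers pass `msg` plus either `version` or `date` (an ISO `YYYY-MM-DD` string), and usually neither `collection_name` nor `deprecator`. A hedged sketch of typical calls, assuming a module-level `display = Display()` instance as used elsewhere in this diff; the messages below are hypothetical:

# version-based deprecation (the common case in ansible-core code)
display.deprecated(
    msg="The 'smart' option for connections is deprecated. Set the connection plugin directly instead.",
    version='2.20',
)

# date-based deprecation; `date` is a string, never a datetime.date, and is
# mutually exclusive with `version`
display.deprecated(
    msg='The `foo` option is deprecated.',   # hypothetical message
    date='2026-01-01',
    help_text='Use the `bar` option instead.',
)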
@ -6,7 +6,6 @@
from __future__ import annotations
import inspect
import os
from ansible.utils.display import Display
@ -19,13 +18,8 @@ def __getattr__(name):
if name != 'environ':
raise AttributeError(name)
caller = inspect.stack()[1]
display.deprecated(
(
'ansible.utils.py3compat.environ is deprecated in favor of os.environ. '
f'Accessed by {caller.filename} line number {caller.lineno}'
),
msg='ansible.utils.py3compat.environ is deprecated in favor of os.environ.',
version='2.20',
)

@ -56,7 +56,10 @@ def set_default_transport():
# deal with 'smart' connection .. one time ..
if C.DEFAULT_TRANSPORT == 'smart':
display.deprecated("The 'smart' option for connections is deprecated. Set the connection plugin directly instead.", version='2.20')
display.deprecated(
msg="The 'smart' option for connections is deprecated. Set the connection plugin directly instead.",
version='2.20',
)
# see if SSH can support ControlPersist if not use paramiko
if not check_for_controlpersist('ssh') and paramiko is not None:

@ -27,6 +27,7 @@ from json import dumps
from ansible import constants as C
from ansible import context
from ansible._internal import _json
from ansible.errors import AnsibleError, AnsibleOptionsError
from ansible.module_utils.datatag import native_type_name
from ansible.module_utils.common.text.converters import to_native, to_text
@ -284,3 +285,25 @@ def validate_variable_name(name: object) -> None:
help_text='Variable names must be strings starting with a letter or underscore character, and contain only letters, numbers and underscores.',
obj=name,
)
def transform_to_native_types(
value: object,
redact: bool = True,
) -> object:
"""
Recursively transform the given value to Python native types.
Potentially sensitive values such as individually vaulted variables will be redacted unless ``redact=False`` is passed.
Which values are considered potentially sensitive may change in future releases.
Types which cannot be converted to Python native types will result in an error.
"""
avv = _json.AnsibleVariableVisitor(
convert_mapping_to_dict=True,
convert_sequence_to_list=True,
convert_custom_scalars=True,
convert_to_native_values=True,
apply_transforms=True,
encrypted_string_behavior=_json.EncryptedStringBehavior.REDACT if redact else _json.EncryptedStringBehavior.DECRYPT,
)
return avv.visit(value)
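A hedged usage sketch of the new `transform_to_native_types` helper above; the import path is assumed (it is not visible in this hunk), and the input data is made up:

# from ansible.utils.vars import transform_to_native_types  # path assumed, not shown in the diff

templated = {'port': 8080, 'hosts': ['web1', 'web2']}

# tagged/templated containers and scalars come back as plain dicts, lists, str, int, ...
plain = transform_to_native_types(templated)

# pass redact=False to decrypt individually vaulted values instead of redacting them
plain_with_secrets = transform_to_native_types(templated, redact=False)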

@ -42,7 +42,7 @@ def module_response_deepcopy(v):
backwards compatibility, in case we need to extend this function
to handle our specific needs:
* ``ansible.executor.task_result.TaskResult.clean_copy``
* ``ansible.executor.task_result._RawTaskResult.as_callback_task_result``
* ``ansible.vars.clean.clean_facts``
* ``ansible.vars.namespace_facts``
"""

@ -25,6 +25,8 @@ from collections import defaultdict
from collections.abc import Mapping, MutableMapping
from ansible import constants as C
from ansible.module_utils._internal import _deprecator
from ansible.module_utils._internal._datatag import _tags
from ansible.errors import (AnsibleError, AnsibleParserError, AnsibleUndefinedVariable, AnsibleFileNotFound,
AnsibleAssertionError, AnsibleValueOmittedError)
from ansible.inventory.host import Host
@ -32,7 +34,6 @@ from ansible.inventory.helpers import sort_groups, get_group_vars
from ansible.inventory.manager import InventoryManager
from ansible.module_utils.datatag import native_type_name
from ansible.module_utils.six import text_type
from ansible.module_utils.datatag import deprecate_value
from ansible.parsing.dataloader import DataLoader
from ansible._internal._templating._engine import TemplateEngine
from ansible.plugins.loader import cache_loader
@ -50,8 +51,12 @@ if t.TYPE_CHECKING:
display = Display()
# deprecated: description='enable top-level facts deprecation' core_version='2.20'
# _DEPRECATE_TOP_LEVEL_FACT_MSG = sys.intern('Top-level facts are deprecated, use `ansible_facts` instead.')
# _DEPRECATE_TOP_LEVEL_FACT_REMOVAL_VERSION = sys.intern('2.22')
# _DEPRECATE_TOP_LEVEL_FACT_TAG = _tags.Deprecated(
# msg='Top-level facts are deprecated.',
# version='2.24',
# deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
# help_text='Use `ansible_facts` instead.',
# )
def _deprecate_top_level_fact(value: t.Any) -> t.Any:
@ -61,7 +66,7 @@ def _deprecate_top_level_fact(value: t.Any) -> t.Any:
Unique tag instances are required to achieve the correct de-duplication within a top-level templating operation.
"""
# deprecated: description='enable top-level facts deprecation' core_version='2.20'
# return deprecate_value(value, _DEPRECATE_TOP_LEVEL_FACT_MSG, removal_version=_DEPRECATE_TOP_LEVEL_FACT_REMOVAL_VERSION)
# return _DEPRECATE_TOP_LEVEL_FACT_TAG.tag(value)
return value
@ -96,6 +101,13 @@ class VariableManager:
_ALLOWED = frozenset(['plugins_by_group', 'groups_plugins_play', 'groups_plugins_inventory', 'groups_inventory',
'all_plugins_play', 'all_plugins_inventory', 'all_inventory'])
_PLAY_HOSTS_DEPRECATED_TAG = _tags.Deprecated(
msg='The `play_hosts` magic variable is deprecated.',
version='2.23',
deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
help_text='Use `ansible_play_batch` instead.',
)
def __init__(self, loader: DataLoader | None = None, inventory: InventoryManager | None = None, version_info: dict[str, str] | None = None) -> None:
self._nonpersistent_fact_cache: defaultdict[str, dict] = defaultdict(dict)
self._vars_cache: defaultdict[str, dict] = defaultdict(dict)
@ -214,24 +226,6 @@ class VariableManager:
all_group = self._inventory.groups.get('all')
host_groups = sort_groups([g for g in host.get_groups() if g.name != 'all'])
def _get_plugin_vars(plugin, path, entities):
data = {}
try:
data = plugin.get_vars(self._loader, path, entities)
except AttributeError:
try:
for entity in entities:
if isinstance(entity, Host):
data |= plugin.get_host_vars(entity.name)
else:
data |= plugin.get_group_vars(entity.name)
except AttributeError:
if hasattr(plugin, 'run'):
raise AnsibleError("Cannot use v1 type vars plugin %s from %s" % (plugin._load_name, plugin._original_path))
else:
raise AnsibleError("Invalid vars plugin %s from %s" % (plugin._load_name, plugin._original_path))
return data
# internal functions that actually do the work
def _plugins_inventory(entities):
""" merges all entities by inventory source """
@ -495,11 +489,8 @@ class VariableManager:
variables['ansible_play_hosts'] = [x for x in variables['ansible_play_hosts_all'] if x not in play._removed_hosts]
variables['ansible_play_batch'] = [x for x in _hosts if x not in play._removed_hosts]
variables['play_hosts'] = deprecate_value(
value=variables['ansible_play_batch'],
msg='Use `ansible_play_batch` instead of `play_hosts`.',
removal_version='2.23',
)
# use a static tag instead of `deprecate_value` to avoid stackwalk in a hot code path
variables['play_hosts'] = self._PLAY_HOSTS_DEPRECATED_TAG.tag(variables['ansible_play_batch'])
# Set options vars
for option, option_value in self._options_vars.items():

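The VariableManager hunks above pre-build a single `_tags.Deprecated(...)` instance at class scope and apply it with `.tag(value)` in the hot path, instead of calling `deprecate_value()` per access (which walks the stack to infer the deprecator). A hedged sketch of that pattern, reusing only the calls visible in this diff; the tag contents mirror `_PLAY_HOSTS_DEPRECATED_TAG` for illustration:

from ansible.module_utils._internal import _deprecator
from ansible.module_utils._internal._datatag import _tags

# built once, outside the hot path
_EXAMPLE_DEPRECATED_TAG = _tags.Deprecated(
    msg='The `play_hosts` magic variable is deprecated.',
    version='2.23',
    deprecator=_deprecator.ANSIBLE_CORE_DEPRECATOR,
    help_text='Use `ansible_play_batch` instead.',
)


def expose_vars(variables: dict) -> dict:
    # apply the pre-built tag instead of calling deprecate_value() on every access
    variables['play_hosts'] = _EXAMPLE_DEPRECATED_TAG.tag(variables['ansible_play_batch'])
    return variables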
@ -34,7 +34,7 @@ def get_plugin_vars(loader, plugin, path, entities):
except AttributeError:
if hasattr(plugin, 'get_host_vars') or hasattr(plugin, 'get_group_vars'):
display.deprecated(
f"The vars plugin {plugin.ansible_name} from {plugin._original_path} is relying "
msg=f"The vars plugin {plugin.ansible_name} from {plugin._original_path} is relying "
"on the deprecated entrypoints 'get_host_vars' and 'get_group_vars'. "
"This plugin should be updated to inherit from BaseVarsPlugin and define "
"a 'get_vars' method as the main entrypoint instead.",

@ -10,7 +10,7 @@ ansible --task-timeout 5 localhost -m command -a '{"cmd": "whoami"}' | grep 'rc=
# ensure that legacy deserializer behaves as expected on JSON CLI args (https://github.com/ansible/ansible/issues/82600)
# also ensure that various templated args function (non-exhaustive)
_ANSIBLE_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR=warn ansible '{{"localhost"}}' -m '{{"debug"}}' -a var=fromcli -e '{"fromcli":{"no_trust":{"__ansible_unsafe":"{{\"hello\"}}"},"trust":"{{ 1 }}"}}' > "${OUTPUT_DIR}/output.txt" 2>&1
_ANSIBLE_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR=warning ansible '{{"localhost"}}' -m '{{"debug"}}' -a var=fromcli -e '{"fromcli":{"no_trust":{"__ansible_unsafe":"{{\"hello\"}}"},"trust":"{{ 1 }}"}}' > "${OUTPUT_DIR}/output.txt" 2>&1
grep '"no_trust": "{{."hello."}}"' "${OUTPUT_DIR}/output.txt" # ensure that the template was not rendered
grep '"trust": 1' "${OUTPUT_DIR}/output.txt" # ensure that the trusted template was rendered
grep "Encountered untrusted template" "${OUTPUT_DIR}/output.txt" # look for the untrusted template warning text

@ -8,6 +8,18 @@
set_fact:
generated_wrapper: "{{ (wrapper.stdout | regex_search('PUT .*? TO (/.*?/AnsiballZ_ping.py)', '\\1'))[0] }}"
- name: Check permissions
stat:
path: '{{ generated_wrapper }}'
register: wrapper_stats
- name: Ensure permissions
assert:
that:
- wrapper_stats.stat.executable is true
- wrapper_stats.stat.readable is true
- wrapper_stats.stat.writeable is true
- name: Explode the wrapper
command: "{{ generated_wrapper }} explode"
register: explode

@ -141,3 +141,17 @@
when:
- item[1] in gs[item[0]]
loop: '{{gs_keys | product(gs_all) }}'
- name: test ansible-config init for valid private and no hidden
vars:
config_keys: "{{ config_dump['stdout'] | from_json | selectattr('name', 'defined') }}"
block:
- name: run config full dump
shell: ansible-config dump -t all -f json
register: config_dump
- name: validate we get 'internal' but not hidden (_Z_)
assert:
that:
- config_keys | selectattr('name', 'match', '_.*') | length > 0
- config_keys | selectattr('name', 'match', '_Z_.*') | length == 0

@ -17,8 +17,8 @@ DEPRECATED:
OPTIONS (= indicates it is required):
- sub Suboptions. Contains `sub.subtest', which can be set to `123'.
You can use `TEST_ENV' to set this.
- sub Suboptions. Contains `sub.subtest', which can be set to
`123'. You can use `TEST_ENV' to set this.
set_via:
env:
- deprecated:
@ -31,15 +31,17 @@ OPTIONS (= indicates it is required):
type: dict
options:
- subtest2 Another suboption. Useful when [[ansible.builtin.shuffle]]
is used with value `[a,b,),d\]'.
- subtest2 Another suboption. Useful when
[[ansible.builtin.shuffle]] is used with value
`[a,b,),d\]'.
default: null
type: float
added in: version 1.1.0
suboptions:
- subtest A suboption. Not compatible to `path=c:\foo(1).txt' (of
module ansible.builtin.copy).
- subtest A suboption. Not compatible to
`path=c:\foo(1).txt' (of module
ansible.builtin.copy).
default: null
type: int
added in: version 1.1.0 of testns.testcol

@ -2,3 +2,4 @@ shippable/galaxy/group1
shippable/galaxy/smoketest
cloud/galaxy
context/controller
retry/never

@ -90,7 +90,7 @@ from multiprocessing import dummy as threading
from multiprocessing import TimeoutError, Lock
COLLECTIONS_BUILD_AND_PUBLISH_TIMEOUT = 180
COLLECTIONS_BUILD_AND_PUBLISH_TIMEOUT = 300
LOCK = Lock()
@ -256,7 +256,7 @@ def run_module():
start = datetime.datetime.now()
result = dict(changed=True, results=[], start=str(start))
pool = threading.Pool(4)
pool = threading.Pool(1)
publish_func = partial(publish_collection, module)
try:
result['results'] = pool.map_async(

@ -0,0 +1,36 @@
from __future__ import annotations
from ansible.plugins.action import ActionBase
from ansible.utils.display import _display
from ansible.module_utils.common.messages import PluginInfo
# extra lines below to allow for adding more imports without shifting the line numbers of the code that follows
#
#
#
#
#
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
result = super(ActionModule, self).run(tmp, task_vars)
deprecator = PluginInfo._from_collection_name('ns.col')
# ansible-deprecated-version - only ansible-core can encounter this
_display.deprecated(msg='ansible-deprecated-no-version')
# ansible-invalid-deprecated-version - only ansible-core can encounter this
_display.deprecated(msg='collection-deprecated-version', version='1.0.0')
_display.deprecated(msg='collection-invalid-deprecated-version', version='not-a-version')
# ansible-deprecated-no-collection-name - only a module_utils can encounter this
_display.deprecated(msg='wrong-collection-deprecated', collection_name='ns.wrong', version='3.0.0')
_display.deprecated(msg='ansible-expired-deprecated-date', date='2000-01-01')
_display.deprecated(msg='ansible-invalid-deprecated-date', date='not-a-date')
_display.deprecated(msg='ansible-deprecated-both-version-and-date', version='3.0.0', date='2099-01-01')
_display.deprecated(msg='removal-version-must-be-major', version='3.1.0')
# ansible-deprecated-date-not-permitted - only ansible-core can encounter this
_display.deprecated(msg='ansible-deprecated-unnecessary-collection-name', deprecator=deprecator, version='3.0.0')
# ansible-deprecated-collection-name-not-permitted - only ansible-core can encounter this
_display.deprecated(msg='ansible-deprecated-both-collection-name-and-deprecator', collection_name='ns.col', deprecator=deprecator, version='3.0.0')
return result

@ -1,3 +1,4 @@
"""This Python module calls deprecation functions in a variety of ways to validate call inference is supported in all common scenarios."""
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import annotations
@ -13,9 +14,72 @@ author:
EXAMPLES = """#"""
RETURN = """#"""
import ansible.utils.display
import ansible.module_utils.common.warnings
from ansible.module_utils import datatag
from ansible.module_utils.basic import AnsibleModule
from ansible.module_utils.basic import deprecate
from ansible.module_utils.common import warnings
from ansible.module_utils.common.warnings import deprecate as basic_deprecate
from ansible.module_utils.datatag import deprecate_value
from ansible.plugins.lookup import LookupBase
from ansible.utils import display as x_display
from ansible.utils.display import Display as XDisplay
from ansible.utils.display import _display
global_display = XDisplay()
other_global_display = x_display.Display()
foreign_global_display = x_display._display
# extra lines below to allow for adding more imports without shifting the line numbers of the code that follows
#
#
#
#
#
#
#
class LookupModule(LookupBase):
def run(self, **kwargs):
return []
class MyModule(AnsibleModule):
"""A class."""
do_deprecated = global_display.deprecated
def my_method(self) -> None:
"""A method."""
self.deprecate('', version='2.0.0', collection_name='ns.col')
def give_me_a_func():
return global_display.deprecated
def do_stuff() -> None:
"""A function."""
d1 = x_display.Display()
d2 = XDisplay()
MyModule.do_deprecated('', version='2.0.0', collection_name='ns.col')
basic_deprecate('', version='2.0.0', collection_name='ns.col')
ansible.utils.display._display.deprecated('', version='2.0.0', collection_name='ns.col')
d1.deprecated('', version='2.0.0', collection_name='ns.col')
d2.deprecated('', version='2.0.0', collection_name='ns.col')
x_display.Display().deprecated('', version='2.0.0', collection_name='ns.col')
XDisplay().deprecated('', version='2.0.0', collection_name='ns.col')
warnings.deprecate('', version='2.0.0', collection_name='ns.col')
deprecate('', version='2.0.0', collection_name='ns.col')
datatag.deprecate_value("thing", '', collection_name='ns.col', version='2.0.0')
deprecate_value("thing", '', collection_name='ns.col', version='2.0.0')
global_display.deprecated('', version='2.0.0', collection_name='ns.col')
other_global_display.deprecated('', version='2.0.0', collection_name='ns.col')
foreign_global_display.deprecated('', version='2.0.0', collection_name='ns.col')
_display.deprecated('', version='2.0.0', collection_name='ns.col')
give_me_a_func()("hello") # not detected

@ -0,0 +1,34 @@
from __future__ import annotations
from ansible.module_utils.common.messages import PluginInfo
from ansible.module_utils.common.warnings import deprecate
# extra lines below to allow for adding more imports without shifting the line numbers of the code that follows
#
#
#
#
#
#
#
#
def do_stuff() -> None:
deprecator = PluginInfo._from_collection_name('ns.col')
# ansible-deprecated-version - only ansible-core can encounter this
deprecate(msg='ansible-deprecated-no-version', collection_name='ns.col')
# ansible-invalid-deprecated-version - only ansible-core can encounter this
deprecate(msg='collection-deprecated-version', collection_name='ns.col', version='1.0.0')
deprecate(msg='collection-invalid-deprecated-version', collection_name='ns.col', version='not-a-version')
# ansible-deprecated-no-collection-name - module_utils cannot encounter this
deprecate(msg='wrong-collection-deprecated', collection_name='ns.wrong', version='3.0.0')
deprecate(msg='ansible-expired-deprecated-date', collection_name='ns.col', date='2000-01-01')
deprecate(msg='ansible-invalid-deprecated-date', collection_name='ns.col', date='not-a-date')
deprecate(msg='ansible-deprecated-both-version-and-date', collection_name='ns.col', version='3.0.0', date='2099-01-01')
deprecate(msg='removal-version-must-be-major', collection_name='ns.col', version='3.1.0')
# ansible-deprecated-date-not-permitted - only ansible-core can encounter this
# ansible-deprecated-unnecessary-collection-name - module_utils cannot encounter this
# ansible-deprecated-collection-name-not-permitted - only ansible-core can encounter this
deprecate(msg='ansible-deprecated-both-collection-name-and-deprecator', collection_name='ns.col', deprecator=deprecator, version='3.0.0')
