mirror of https://github.com/ansible/ansible.git
Templating overhaul, implement Data Tagging (#84621)
Co-authored-by: Matt Davis <mrd@redhat.com>
Co-authored-by: Matt Clay <matt@mystile.com>
pull/84982/head
parent 6fc592df9b
commit 35750ed321
@ -1,3 +0,0 @@
---
deprecated_features:
- fact_cache - deprecate first_order_merge API (https://github.com/ansible/ansible/pull/84568).
@ -0,0 +1,3 @@
bugfixes:
- Correctly return ``False`` when using the ``filter`` and ``test`` Jinja tests on plugin names which are not filters or tests, respectively.
  (resolves issue https://github.com/ansible/ansible/issues/82084)
@ -0,0 +1,179 @@
# DTFIX-RELEASE: document EncryptedString replacing AnsibleVaultEncryptedUnicode

major_changes:
- variables - The type system underlying Ansible's variable storage has been significantly overhauled and formalized.
  Attempts to store unsupported Python object types in variables will now result in an error. # DTFIX-RELEASE: link to type system docs TBD
- variables - To support new Ansible features, many variable objects are now represented by subclasses of their respective native Python types.
  In most cases, they behave indistinguishably from their original types, but some Python libraries do not handle builtin object subclasses properly.
  Custom plugins that interact with such libraries may require changes to convert and pass the native types. # DTFIX-RELEASE: link to plugin/data tagging API docs TBD
- ansible-test - Packages beneath ``module_utils`` can now contain ``__init__.py`` files.
- Jinja plugins - Jinja builtin filter and test plugins are now accessible via their fully-qualified names ``ansible.builtin.{name}``.

minor_changes:
- templating - Templating errors now provide more information about both the location and context of the error, especially for deeply-nested and/or indirected templating scenarios.
- templating - Handling of omitted values is now a first-class feature of the template engine, and is usable in all Ansible Jinja template contexts.
  Any template that resolves to ``omit`` is automatically removed from its parent container during templating. # DTFIX-RELEASE: porting guide entry
- templating - Unified ``omit`` behavior now requires that plugins calling ``Templar.template()`` handle cases where the entire template result is omitted,
  by catching the ``AnsibleValueOmittedError`` that is raised (see the sketch following this changelog fragment).
  Previously, this condition caused a randomly-generated string marker to appear in the template result. # DTFIX-RELEASE: porting guide entry?
- templating - Template evaluation is lazier than in previous versions.
  Template expressions which resolve only portions of a data structure no longer result in the entire structure being templated.
- handlers - Templated handler names that contain syntax errors or resolve to ``omit`` are now skipped, like handlers with undefined variables in their names.
- env lookup - The error message generated for a missing environment variable when ``default`` is an undefined value (e.g. ``undef('something')``) will contain the hint from that undefined value,
  except when the undefined value is the default of ``undef()`` with no arguments. Previously, any existing undefined hint would be ignored.
- templating - Embedding ``range()`` values in containers such as lists will result in an error on use.
  Previously, the value would be converted to a string representing the range parameters, such as ``range(0, 3)``.
- Jinja plugins - Plugins can declare support for undefined values. # DTFIX-RELEASE: examples, porting guide entry
- templating - Variables of type ``set`` and ``tuple`` are now converted to ``list`` when exiting the final pass of templating.
- templating - Access to an undefined variable from inside a lookup, filter, or test (which raises ``MarkerError``) no longer ends processing of the current template.
  The triggering undefined value is returned as the result of the offending plugin invocation, and the template continues to execute. # DTFIX-RELEASE: porting guide entry, samples needed
- plugin error handling - When raising exceptions in an exception handler, be sure to use ``raise ... from`` as appropriate (see the sketch after this list).
  This supersedes the use of the ``AnsibleError`` arg ``orig_exc`` to represent the cause.
  Specifying ``orig_exc`` as the cause is still permitted.
  Failure to use ``raise ... from`` when ``orig_exc`` is set will result in a warning.
  Additionally, if the two cause exceptions do not match, a warning will be issued. # DTFIX-RELEASE: this needs a porting guide entry
- ansible-test - The ``yamllint`` sanity test now enforces string values for the ``!vault`` tag.
- warnings - All warnings (including deprecation warnings) issued during a task's execution are now accessible via the ``warnings`` and ``deprecations`` keys on the task result.
- troubleshooting - Tracebacks can be collected and displayed for most errors, warnings, and deprecation warnings (including those generated by modules).
  Tracebacks are no longer enabled with ``-vvv``; the behavior is directly configurable via the ``DISPLAY_TRACEBACK`` config option.
  Module tracebacks passed to ``fail_json`` via the ``exception`` kwarg will not be included in the task result unless error tracebacks are configured.
- display - Deduplication of warning and error messages considers the full content of the message (including source and traceback contexts, if enabled).
  This may result in fewer messages being suppressed as duplicates.
- modules - Unhandled exceptions during Python module execution are now returned as structured data from the target.
  This allows the new traceback handling to be applied to exceptions raised on targets.
- modules - PowerShell modules can now receive ``datetime.date``, ``datetime.time`` and ``datetime.datetime`` values as ISO 8601 strings.
- modules - PowerShell modules can now receive strings sourced from inline vault-encrypted strings.
- from_json filter - The filter accepts a ``profile`` argument, which defaults to ``tagless``.
- to_json / to_nice_json filters - The filters accept a ``profile`` argument, which defaults to ``tagless``.
- undef jinja function - The ``undef`` jinja function now raises an error if a non-string hint is given.
  Attempting to use an undefined hint also results in an error, ensuring incorrect use of the function can be distinguished from the function's normal behavior.
- display - The ``collection_name`` arg to ``Display.deprecated`` no longer has any effect.
  Information about the calling plugin is automatically captured by the display infrastructure, included in the displayed messages, and made available to callbacks.
- modules - The ``collection_name`` arg to Python module-side ``deprecate`` methods no longer has any effect.
  Information about the calling module is automatically captured by the warning infrastructure and included in the module result.
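
A minimal editorial sketch of the ``raise ... from`` guidance in the plugin error handling entry above; the helper function and the caught exception are hypothetical, and only ``AnsibleError`` is taken from the existing API:

# Editorial sketch: chain the cause with ``raise ... from`` instead of passing ``orig_exc``.
from ansible.errors import AnsibleError


def load_widget_config(path):  # hypothetical helper, not part of ansible-core
    try:
        with open(path) as src:
            return src.read()
    except OSError as ex:
        # Chain the original exception as the cause rather than passing it via ``orig_exc``.
        raise AnsibleError(f"Unable to read widget config from {path!r}.") from ex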

breaking_changes:
- loops - Omit placeholders no longer leak between loop item templating and task templating.
  Previously, ``omit`` placeholders could remain embedded in loop items after templating and be used as an ``omit`` for task templating.
  Now, values resolving to ``omit`` are dropped immediately when loop items are templated.
  To turn missing values into an ``omit`` for task templating, use ``| default(omit)``.
  This solution is backwards compatible with previous versions of ansible-core. # DTFIX-RELEASE: porting guide entry with examples
- serialization of ``omit`` sentinel - Serialization of variables containing ``omit`` sentinels (e.g., by the ``to_json`` and ``to_yaml`` filters or ``ansible-inventory``) will fail if the variable has not completed templating.
  Previously, serialization succeeded with placeholder strings emitted in the serialized output.
- conditionals - Conditional expressions that result in non-boolean values are now an error by default.
  Such results often indicate unintentional use of templates where they are not supported, resulting in a conditional that is always true.
  The error can be temporarily changed to a deprecation warning by enabling the ``ALLOW_BROKEN_CONDITIONALS`` config option.
  When that option is enabled, conditional expressions which are a literal ``None`` or empty string will evaluate as true, for backwards compatibility.
- templating - Templates are always rendered in Jinja2 native mode.
  As a result, non-string values are no longer automatically converted to strings.
- templating - Templates with embedded inline templates that were not contained within a Jinja string constant now result in an error, as support for multi-pass templating was removed for security reasons.
  In most cases, such templates can be easily rewritten to avoid the use of embedded inline templates.
- templating - Conditionals and lookups which use embedded inline templates in Jinja string constants now display a warning.
  These templates should be converted to their expression equivalent.
- templating - Templates resulting in ``None`` are no longer automatically converted to an empty string.
- template lookup - The ``convert_data`` option is deprecated and no longer has any effect.
  Use the ``from_json`` filter on the lookup result instead.
- templating - ``#jinja2:`` overrides in templates with invalid override names or types are now templating errors.
- set_fact - The string values "yes", "no", "true" and "false" were previously converted (ignoring case) to boolean values when not using Jinja2 native mode.
  Since Jinja2 native mode is always used, this conversion no longer occurs.
  When boolean values are required, native boolean syntax should be used where variables are defined, such as in YAML.
  When native boolean syntax is not an option, the ``bool`` filter can be used to parse string values into booleans.
- templating - The ``allow_unsafe_lookups`` option no longer has any effect.
  Lookup plugins are responsible for tagging strings containing templates to allow evaluation as a template.
- assert - The ``quiet`` argument must be a commonly-accepted boolean value.
  Previously, unrecognized values were silently treated as ``False``.
- plugins - Any plugin that sources or creates templates must properly tag them as trusted. # DTFIX-RELEASE: porting guide entry for "how?" Don't forget to mention inventory plugin ``trusted_by_default`` config.
- first_found lookup - When specifying ``files`` or ``paths`` as a templated list containing undefined values, the undefined list elements will be discarded with a warning.
  Previously, the entire list would be discarded without any warning.
- templating - The result of the ``range()`` global function cannot be returned from a template; it should always be passed to a filter (e.g., ``random``).
  Previously, range objects returned from an intermediate template were always converted to a list, which is inconsistent with inline consumption of range objects.
- plugins - Custom Jinja plugins that accept undefined top-level arguments must opt in to receiving them. # DTFIX-RELEASE: porting guide entry + backcompat behavior description
- plugins - Custom Jinja plugins that use ``environment.getitem`` to retrieve undefined values will now trigger a ``MarkerError`` exception.
  This exception must be handled to allow the plugin to return a ``Marker``, or the plugin must opt in to accepting ``Marker`` values. # DTFIX-RELEASE: mention the decorator
- templating - Many Jinja plugins (filters, lookups, tests) and methods previously silently ignored undefined inputs, which often masked subtle errors.
  Passing an undefined argument to a Jinja plugin or method that does not declare undefined support now results in an undefined value. # DTFIX-RELEASE: common examples, porting guide, `is defined`, `is undefined`, etc; porting guide should also mention that overly-broad exception handling may mask Undefined errors; also that lazy handling of Undefined can invoke a plugin and bomb out in the middle where it was previously never invoked (plugins with side effects, just don't)
- lookup plugins - Lookup plugins called as ``with_(lookup)`` will no longer have the ``_subdir`` attribute set. # DTFIX-RELEASE: porting guide re: `ansible_lookup_context`
- lookup plugins - ``terms`` will always be passed to ``run`` as the first positional arg, where previously it was sometimes passed as a keyword arg when using ``with_`` syntax.
- callback plugins - The structure of the ``exception``, ``warnings`` and ``deprecations`` values visible to callbacks has changed. Callbacks that inspect or serialize these values may require special handling. # DTFIX-RELEASE: porting guide re ErrorDetail/WarningMessageDetail/DeprecationMessageDetail
- modules - Ansible modules using ``sys.excepthook`` must use a standard ``try/except`` instead.
- templating - Access to ``_``-prefixed attributes and methods, and methods with known side effects, is no longer permitted.
  In cases where a matching mapping key is present, the associated value will be returned instead of an error.
  This increases template environment isolation and ensures more consistent behavior between the ``.`` and ``[]`` operators.
- inventory - Invalid variable names provided by inventories now result in an inventory parse failure. This behavior is now consistent with other variable name usages throughout Ansible.
- internals - The ``ansible.utils.native_jinja`` Python module has been removed.
- internals - The ``AnsibleLoader`` and ``AnsibleDumper`` classes for working with YAML are now factory functions and cannot be extended.
- public API - The ``ansible.vars.fact_cache.FactCache`` wrapper has been removed.

security_fixes:
- templating - Ansible's template engine no longer processes Jinja templates in strings unless they are marked as coming from a trusted source.
  Untrusted strings containing Jinja template markers are ignored with a warning.
  Examples of trusted sources include playbooks, vars files, and many inventory sources.
  Examples of untrusted sources include module results and facts.
  Plugins which have not been updated to preserve trust while manipulating strings may inadvertently cause them to lose their trusted status.
- templating - Changes to conditional expression handling removed numerous instances of insecure multi-pass templating (which could result in execution of untrusted template expressions).

known_issues:
- variables - The values ``None``, ``True`` and ``False`` cannot be tagged because they are singletons. Attempts to apply tags to these values will be silently ignored.
- variables - Tagged values cannot be used for dictionary keys in many circumstances. # DTFIX-RELEASE: Explain this in more detail.
- templating - Any string value starting with ``#jinja2:`` which is templated will always be interpreted as Jinja2 configuration overrides.
  To include this literal value at the start of a string, a space or other character must precede it.

bugfixes:
- module defaults - Module defaults are no longer templated unless they are used by a task that does not override them.
  Previously, all module defaults for all modules were templated for every task.
- omitting task args - Use of ``omit`` for task args now properly falls back to args of lower precedence, such as module defaults.
  Previously, an omitted value would obliterate values of lower precedence. # DTFIX-RELEASE: do we need obliterate, is this a breaking change?
- regex_search filter - Corrected return value documentation to reflect ``None`` (not an empty string) when there is no match.
- first_found lookup - Corrected return value documentation to reflect ``None`` (not an empty string) when no files are found.
- vars lookup - The ``default`` substitution only applies when trying to look up a variable which is not defined.
  If the variable is defined, but templates to an undefined value, the ``default`` substitution will not apply.
  Use the ``default`` filter to coerce those values instead.
- to_yaml/to_nice_yaml filters - Eliminated possibility of keyword arg collisions with internally-set defaults.
- Jinja plugins - Errors raised will always be derived from ``AnsibleTemplatePluginError``.
- ansible-test - Fixed traceback when handling certain YAML errors in the ``yamllint`` sanity test.
- YAML parsing - The ``!unsafe`` tag no longer coerces non-string scalars to strings.
- default callback - Error context is now shown for failing tasks that use the ``debug`` action.
- module arg templating - When using a templated raw task arg and a templated ``args`` keyword, args are now merged.
  Previously, use of templated raw task args silently ignored all values from the templated ``args`` keyword.
- action plugins - Action plugins that raise unhandled exceptions no longer terminate playbook loops. Previously, exceptions raised by an action plugin caused abnormal loop termination and loss of loop iteration results.
- display - The ``Display.deprecated`` method once again properly handles the ``removed=True`` argument (https://github.com/ansible/ansible/issues/82358).
- stability - Fixed silent process failure on unhandled IOError/OSError under the ``linear`` strategy.
- lookup plugins - The ``terms`` arg to the ``run`` method is now always a list.
  Previously, there were cases where a non-list could be received.

deprecated_features:
- templating - The ``ansible_managed`` variable available for certain templating scenarios, such as the ``template`` action and ``template`` lookup, has been deprecated.
  Define and use a custom variable instead of relying on ``ansible_managed``.
- display - The ``Display.get_deprecation_message`` method has been deprecated.
  Call ``Display.deprecated`` to display a deprecation message, or call it with ``removed=True`` to raise an ``AnsibleError``.
- config - The ``DEFAULT_JINJA2_NATIVE`` option has no effect.
  Jinja2 native mode is now the default and only option.
- config - The ``DEFAULT_NULL_REPRESENTATION`` option has no effect.
  Null values are no longer automatically converted to another value during templating of single variable references.
- template lookup - The ``jinja2_native`` option is no longer used in the Ansible Core code base.
  Jinja2 native mode is now the default and only option.
- conditionals - Conditionals using Jinja templating delimiters (e.g., ``{{``, ``{%``) should be rewritten as expressions without delimiters, unless the entire conditional value is a single template that resolves to a trusted string expression.
  This is useful for dynamic indirection of conditional expressions, but is limited to trusted literal string expressions.
- templating - The ``disable_lookups`` option has no effect, since plugins must be updated to apply trust before any templating can be performed.
- to_yaml/to_nice_yaml filters - Implicit YAML dumping of vaulted value ciphertext is deprecated.
  Set ``dump_vault_tags`` to explicitly specify the desired behavior.
- plugins - The ``listify_lookup_plugin_terms`` function is obsolete and in most cases no longer needed. # DTFIX-RELEASE: add a porting guide entry for this
- plugin error handling - The ``AnsibleError`` constructor arg ``suppress_extended_error`` is deprecated.
  Using ``suppress_extended_error=True`` has the same effect as ``show_content=False``.
- config - The ``ACTION_WARNINGS`` config has no effect. It previously disabled command warnings, which have since been removed.
- templating - Support for enabling Jinja2 extensions (not plugins) has been deprecated.
- playbook variables - The ``play_hosts`` variable has been deprecated; use ``ansible_play_batch`` instead.
- bool filter - Support for coercing unrecognized input values (including ``None``) has been deprecated. Consult the filter documentation for acceptable values, or consider use of the ``truthy`` and ``falsy`` tests. # DTFIX-RELEASE: porting guide
- oneline callback - The ``oneline`` callback and its associated ad-hoc CLI args (``-o``, ``--one-line``) are deprecated.
- tree callback - The ``tree`` callback and its associated ad-hoc CLI args (``-t``, ``--tree``) are deprecated.
- CLI - The ``--inventory-file`` option alias is deprecated. Use the ``-i`` or ``--inventory`` option instead.
- first_found lookup - Splitting of file paths on ``,;:`` is deprecated. Pass a list of paths instead.
  The ``split`` method on strings can be used to split variables into a list as needed.
- cache plugins - The ``ansible.plugins.cache.base`` Python module is deprecated. Use ``ansible.plugins.cache`` instead.
- file loading - Loading text files with ``DataLoader`` containing data that cannot be decoded under the expected encoding is deprecated.
  In most cases the encoding must be UTF-8, although some plugins allow choosing a different encoding.
  Previously, invalid data was silently wrapped in Unicode surrogate escape sequences, often resulting in later errors or other data corruption.

removed_features:
- modules - Modules returning non-UTF8 strings now result in an error.
  The ``MODULE_STRICT_UTF8_RESPONSE`` setting can be used to disable this check.
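
A minimal editorial sketch of the omit handling described in the minor_changes entry above (not part of this commit). ``Templar.template()`` and ``AnsibleValueOmittedError`` are named by the entry; the ``ansible.errors`` import location and the helper function are assumptions:

# Editorial sketch: handling a template whose entire result resolves to ``omit``.
from ansible.errors import AnsibleValueOmittedError  # assumed import location


def render_optional(templar, expression, fallback=None):  # hypothetical helper
    try:
        return templar.template(expression)
    except AnsibleValueOmittedError:
        # Previously a randomly-generated marker string appeared in the result; now the omission must be handled.
        return fallback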
@ -0,0 +1,4 @@
breaking_changes:
- Support for the ``toml`` library has been removed from TOML inventory parsing and dumping.
  Use ``tomli`` for parsing on Python 3.10. Python 3.11 and later have built-in support for parsing.
  Use ``tomli-w`` to support outputting inventory in TOML format.
@ -0,0 +1,53 @@
from __future__ import annotations

import importlib
import typing as t

from ansible.module_utils import _internal
from ansible.module_utils._internal._json import _profiles


def get_controller_serialize_map() -> dict[type, t.Callable]:
    """
    Injected into module_utils code to augment serialization maps with controller-only types.
    This implementation replaces the no-op version in module_utils._internal in controller contexts.
    """
    from ansible._internal._templating import _lazy_containers
    from ansible.parsing.vault import EncryptedString

    return {
        _lazy_containers._AnsibleLazyTemplateDict: _profiles._JSONSerializationProfile.discard_tags,
        _lazy_containers._AnsibleLazyTemplateList: _profiles._JSONSerializationProfile.discard_tags,
        EncryptedString: str,  # preserves tags since this is an instance of EncryptedString; if tags should be discarded from str, another entry will handle it
    }


def import_controller_module(module_name: str, /) -> t.Any:
    """
    Injected into module_utils code to import and return the specified module.
    This implementation replaces the no-op version in module_utils._internal in controller contexts.
    """
    return importlib.import_module(module_name)


_T = t.TypeVar('_T')


def experimental(obj: _T) -> _T:
    """
    Decorator for experimental types and methods outside the `_internal` package which accept or expose internal types.
    As with internal APIs, these are subject to change at any time without notice.
    """
    return obj


def setup() -> None:
    """No-op function to ensure that side-effect only imports of this module are not flagged/removed as 'unused'."""


# DTFIX-RELEASE: this is really fragile; disordered/incorrect imports (among other things) can mess it up. Consider a hosting-env-managed context
# with an enum with at least Controller/Target/Unknown values, and possibly using lazy-init module shims or some other mechanism to allow controller-side
# notification/augmentation of this kind of metadata.
_internal.get_controller_serialize_map = get_controller_serialize_map
_internal.import_controller_module = import_controller_module
_internal.is_controller = True
@ -0,0 +1,265 @@
# shebang placeholder

from __future__ import annotations

import datetime

# For the test-module.py script to tell this is an ANSIBALLZ_WRAPPER
_ANSIBALLZ_WRAPPER = True

# This code is part of Ansible, but is an independent component.
# The code in this particular templatable string, and this templatable string
# only, is BSD licensed. Modules which end up using this snippet, which is
# dynamically combined together by Ansible still belong to the author of the
# module, and they may assign their own license to the complete work.
#
# Copyright (c), James Cammarata, 2016
# Copyright (c), Toshio Kuratomi, 2016
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright
#   notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
# IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


def _ansiballz_main(
    zipdata: str,
    ansible_module: str,
    module_fqn: str,
    params: str,
    profile: str,
    plugin_info_dict: dict[str, object],
    date_time: datetime.datetime,
    coverage_config: str | None,
    coverage_output: str | None,
    rlimit_nofile: int,
) -> None:
    import os
    import os.path

    # Access to the working directory is required by Python when using pipelining, as well as for the coverage module.
    # Some platforms, such as macOS, may not allow querying the working directory when using become to drop privileges.
    try:
        os.getcwd()
    except OSError:
        try:
            os.chdir(os.path.expanduser('~'))
        except OSError:
            os.chdir('/')

    if rlimit_nofile:
        import resource

        existing_soft, existing_hard = resource.getrlimit(resource.RLIMIT_NOFILE)

        # adjust soft limit subject to existing hard limit
        requested_soft = min(existing_hard, rlimit_nofile)

        if requested_soft != existing_soft:
            try:
                resource.setrlimit(resource.RLIMIT_NOFILE, (requested_soft, existing_hard))
            except ValueError:
                # some platforms (eg macOS) lie about their hard limit
                pass

    import sys
    import __main__

    # For some distros and python versions we pick up this script in the temporary
    # directory. This leads to problems when the ansible module masks a python
    # library that another import needs. We have not figured out what about the
    # specific distros and python versions causes this to behave differently.
    #
    # Tested distros:
    # Fedora23 with python3.4  Works
    # Ubuntu15.10 with python2.7  Works
    # Ubuntu15.10 with python3.4  Fails without this
    # Ubuntu16.04.1 with python3.5  Fails without this
    # To test on another platform:
    # * use the copy module (since this shadows the stdlib copy module)
    # * Turn off pipelining
    # * Make sure that the destination file does not exist
    # * ansible ubuntu16-test -m copy -a 'src=/etc/motd dest=/var/tmp/m'
    # This will traceback in shutil. Looking at the complete traceback will show
    # that shutil is importing copy which finds the ansible module instead of the
    # stdlib module
    scriptdir = None
    try:
        scriptdir = os.path.dirname(os.path.realpath(__main__.__file__))
    except (AttributeError, OSError):
        # Some platforms don't set __file__ when reading from stdin
        # OSX raises OSError if using abspath() in a directory we don't have
        # permission to read (realpath calls abspath)
        pass

    # Strip cwd from sys.path to avoid potential permissions issues
    excludes = {'', '.', scriptdir}
    sys.path = [p for p in sys.path if p not in excludes]

    import base64
    import shutil
    import tempfile
    import zipfile

    def invoke_module(modlib_path: str, json_params: bytes) -> None:
        # When installed via setuptools (including python setup.py install),
        # ansible may be installed with an easy-install.pth file. That file
        # may load the system-wide install of ansible rather than the one in
        # the module. sitecustomize is the only way to override that setting.
        z = zipfile.ZipFile(modlib_path, mode='a')

        # py3: modlib_path will be text, py2: it's bytes. Need bytes at the end
        sitecustomize = u'import sys\nsys.path.insert(0,"%s")\n' % modlib_path
        sitecustomize = sitecustomize.encode('utf-8')
        # Use a ZipInfo to work around zipfile limitation on hosts with
        # clocks set to a pre-1980 year (for instance, Raspberry Pi)
        zinfo = zipfile.ZipInfo()
        zinfo.filename = 'sitecustomize.py'
        zinfo.date_time = date_time.utctimetuple()[:6]
        z.writestr(zinfo, sitecustomize)
        z.close()

        # Put the zipped up module_utils we got from the controller first in the python path so that we
        # can monkeypatch the right basic
        sys.path.insert(0, modlib_path)

        from ansible.module_utils._internal._ansiballz import run_module

        run_module(
            json_params=json_params,
            profile=profile,
            plugin_info_dict=plugin_info_dict,
            module_fqn=module_fqn,
            modlib_path=modlib_path,
            coverage_config=coverage_config,
            coverage_output=coverage_output,
        )

    def debug(command: str, modlib_path: str, json_params: bytes) -> None:
        # The code here normally doesn't run. It's only used for debugging on the
        # remote machine.
        #
        # The subcommands in this function make it easier to debug ansiballz
        # modules. Here's the basic steps:
        #
        # Run ansible with the environment variable: ANSIBLE_KEEP_REMOTE_FILES=1 and -vvv
        # to save the module file remotely::
        #   $ ANSIBLE_KEEP_REMOTE_FILES=1 ansible host1 -m ping -a 'data=october' -vvv
        #
        # Part of the verbose output will tell you where on the remote machine the
        # module was written to::
        #   [...]
        #   <host1> SSH: EXEC ssh -C -q -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o
        #   PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o
        #   ControlPath=/home/badger/.ansible/cp/ansible-ssh-%h-%p-%r -tt rhel7 '/bin/sh -c '"'"'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8
        #   LC_MESSAGES=en_US.UTF-8 /usr/bin/python /home/badger/.ansible/tmp/ansible-tmp-1461173013.93-9076457629738/ping'"'"''
        #   [...]
        #
        # Login to the remote machine and run the module file from the previous
        # step with the explode subcommand to extract the module payload into
        # source files::
        #   $ ssh host1
        #   $ /usr/bin/python /home/badger/.ansible/tmp/ansible-tmp-1461173013.93-9076457629738/ping explode
        #   Module expanded into:
        #   /home/badger/.ansible/tmp/ansible-tmp-1461173408.08-279692652635227/ansible
        #
        # You can now edit the source files to instrument the code or experiment with
        # different parameter values. When you're ready to run the code you've modified
        # (instead of the code from the actual zipped module), use the execute subcommand like this::
        #   $ /usr/bin/python /home/badger/.ansible/tmp/ansible-tmp-1461173013.93-9076457629738/ping execute

        # Okay to use __file__ here because we're running from a kept file
        basedir = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'debug_dir')
        args_path = os.path.join(basedir, 'args')

        if command == 'explode':
            # transform the ZIPDATA into an exploded directory of code and then
            # print the path to the code. This is an easy way for people to look
            # at the code on the remote machine for debugging it in that
            # environment
            z = zipfile.ZipFile(modlib_path)
            for filename in z.namelist():
                if filename.startswith('/'):
                    raise Exception('Something wrong with this module zip file: should not contain absolute paths')

                dest_filename = os.path.join(basedir, filename)
                if dest_filename.endswith(os.path.sep) and not os.path.exists(dest_filename):
                    os.makedirs(dest_filename)
                else:
                    directory = os.path.dirname(dest_filename)
                    if not os.path.exists(directory):
                        os.makedirs(directory)
                    with open(dest_filename, 'wb') as writer:
                        writer.write(z.read(filename))

            # write the args file
            with open(args_path, 'wb') as writer:
                writer.write(json_params)

            print('Module expanded into:')
            print(basedir)

        elif command == 'execute':
            # Execute the exploded code instead of executing the module from the
            # embedded ZIPDATA. This allows people to easily run their modified
            # code on the remote machine to see how changes will affect it.

            # Set pythonpath to the debug dir
            sys.path.insert(0, basedir)

            # read in the args file which the user may have modified
            with open(args_path, 'rb') as reader:
                json_params = reader.read()

            from ansible.module_utils._internal._ansiballz import run_module

            run_module(
                json_params=json_params,
                profile=profile,
                plugin_info_dict=plugin_info_dict,
                module_fqn=module_fqn,
                modlib_path=modlib_path,
            )

        else:
            print('WARNING: Unknown debug command. Doing nothing.')

    #
    # See comments in the debug() method for information on debugging
    #

    encoded_params = params.encode()

    # There's a race condition with the controller removing the
    # remote_tmpdir and this module executing under async. So we cannot
    # store this in remote_tmpdir (use system tempdir instead)
    # Only need to use [ansible_module]_payload_ in the temp_path until we move to zipimport
    # (this helps ansible-test produce coverage stats)
    temp_path = tempfile.mkdtemp(prefix='ansible_' + ansible_module + '_payload_')

    try:
        zipped_mod = os.path.join(temp_path, 'ansible_' + ansible_module + '_payload.zip')

        with open(zipped_mod, 'wb') as modlib:
            modlib.write(base64.b64decode(zipdata))

        if len(sys.argv) == 2:
            debug(sys.argv[1], zipped_mod, encoded_params)
        else:
            invoke_module(zipped_mod, encoded_params)
    finally:
        shutil.rmtree(temp_path, ignore_errors=True)
@ -0,0 +1,130 @@
from __future__ import annotations

import dataclasses
import os
import types
import typing as t

from ansible.module_utils._internal._datatag import _tag_dataclass_kwargs, AnsibleDatatagBase, AnsibleSingletonTagBase


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class Origin(AnsibleDatatagBase):
    """
    A tag that stores origin metadata for a tagged value, intended for forensic/diagnostic use.
    Origin metadata should not be used to make runtime decisions, as it is not guaranteed to be present or accurate.
    Setting both `path` and `line_num` can result in diagnostic display of referenced file contents.
    Either `path` or `description` must be present.
    """

    path: str | None = None
    """The path from which the tagged content originated."""
    description: str | None = None
    """A description of the origin, for display to users."""
    line_num: int | None = None
    """An optional line number, starting at 1."""
    col_num: int | None = None
    """An optional column number, starting at 1."""

    UNKNOWN: t.ClassVar[t.Self]

    @classmethod
    def get_or_create_tag(cls, value: t.Any, path: str | os.PathLike | None) -> Origin:
        """Return the tag from the given value, creating a tag from the provided path if no tag was found."""
        if not (origin := cls.get_tag(value)):
            if path:
                origin = Origin(path=str(path))  # convert tagged strings and path-like values to a native str
            else:
                origin = Origin.UNKNOWN

        return origin

    def replace(
        self,
        path: str | types.EllipsisType = ...,
        description: str | types.EllipsisType = ...,
        line_num: int | None | types.EllipsisType = ...,
        col_num: int | None | types.EllipsisType = ...,
    ) -> t.Self:
        """Return a new origin based on an existing one, with the given fields replaced."""
        return dataclasses.replace(
            self,
            **{
                key: value
                for key, value in dict(
                    path=path,
                    description=description,
                    line_num=line_num,
                    col_num=col_num,
                ).items()
                if value is not ...
            },  # type: ignore[arg-type]
        )

    def _post_validate(self) -> None:
        if self.path:
            if not self.path.startswith('/'):
                raise RuntimeError('The `path` field must be an absolute path.')
        elif not self.description:
            raise RuntimeError('The `path` or `description` field must be specified.')

    def __str__(self) -> str:
        """Renders the origin in the form of path:line_num:col_num, omitting missing/invalid elements from the right."""
        if self.path:
            value = self.path
        else:
            value = self.description

        if self.line_num and self.line_num > 0:
            value += f':{self.line_num}'

            if self.col_num and self.col_num > 0:
                value += f':{self.col_num}'

        if self.path and self.description:
            value += f' ({self.description})'

        return value


Origin.UNKNOWN = Origin(description='<unknown>')


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class VaultedValue(AnsibleDatatagBase):
    """Tag for vault-encrypted strings that carries the original ciphertext for round-tripping."""

    ciphertext: str

    def _get_tag_to_propagate(self, src: t.Any, value: object, *, value_type: t.Optional[type] = None) -> t.Self | None:
        # Since VaultedValue stores the encrypted representation of the value on which it is tagged,
        # it is incorrect to propagate the tag to a value which is not equal to the original.
        # If the tag were copied to another value and subsequently serialized as the original encrypted value,
        # the result would then differ from the value on which the tag was applied.

        # Comparisons which can trigger an exception are indicative of a bug and should not be handled here.
        # For example:
        # * When `src` is an undecryptable `EncryptedString` -- it is not valid to apply this tag to that type.
        # * When `value` is a `Marker` -- this requires templating, but vaulted values do not support templating.

        if src == value:  # assume the tag was correctly applied to src
            return self  # same plaintext value, tag propagation with same ciphertext is safe

        return self.get_tag(value)  # different value, preserve the existing tag, if any


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class TrustedAsTemplate(AnsibleSingletonTagBase):
    """
    Indicates the tagged string is trusted to parse and render as a template.
    Do *NOT* apply this tag to data from untrusted sources, as this would allow code injection during templating.
    """


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class SourceWasEncrypted(AnsibleSingletonTagBase):
    """
    For internal use only.
    Indicates the tagged value was sourced from an encrypted file.
    Currently applied only by DataLoader.get_text_file_contents() and by extension DataLoader.load_from_file().
    """
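
A brief editorial usage sketch of the ``Origin`` tag defined above (not part of this commit); the path and line/column values are hypothetical:

# Editorial sketch: constructing and refining an Origin tag.
origin = Origin(path='/etc/ansible/hosts', line_num=12)
refined = origin.replace(col_num=3)  # same path and line, with a column added

print(str(refined))         # -> /etc/ansible/hosts:12:3
print(str(Origin.UNKNOWN))  # -> <unknown>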
@ -0,0 +1,19 @@
from __future__ import annotations

from ansible.module_utils._internal._datatag import AnsibleTagHelper


def str_problematic_strip(value: str) -> str:
    """
    Return a copy of `value` with leading and trailing whitespace removed.
    Used where `str.strip` is needed, but tags must be preserved *AND* the stripping behavior likely shouldn't exist.
    If the stripping behavior is non-problematic, use `AnsibleTagHelper.tag_copy` around `str.strip` instead.
    """
    if (stripped_value := value.strip()) == value:
        return value

    # FUTURE: consider deprecating some/all usages of this method; they generally imply a code smell or pattern we shouldn't be supporting

    stripped_value = AnsibleTagHelper.tag_copy(value, stripped_value)

    return stripped_value
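
A quick editorial sketch of the helper above; the input values are hypothetical, and tag preservation is handled by ``AnsibleTagHelper.tag_copy`` inside the function body:

# Editorial sketch: whitespace is stripped while any tags on the input are copied to the result.
assert str_problematic_strip('  {{ inventory_hostname }}  ') == '{{ inventory_hostname }}'
assert str_problematic_strip('already-clean') == 'already-clean'  # unchanged values are returned as-is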
@ -0,0 +1,33 @@
from __future__ import annotations

import io
import typing as _t

from .._wrapt import ObjectProxy
from ...module_utils._internal import _datatag


class TaggedStreamWrapper(ObjectProxy):
    """
    Janky proxy around IOBase to allow streams to carry tags and support basic interrogation by the tagging API.
    Most tagging operations will have undefined behavior for this type.
    """

    _self__ansible_tags_mapping: _datatag._AnsibleTagsMapping

    def __init__(self, stream: io.IOBase, tags: _datatag.AnsibleDatatagBase | _t.Iterable[_datatag.AnsibleDatatagBase]) -> None:
        super().__init__(stream)

        tag_list: list[_datatag.AnsibleDatatagBase]

        # noinspection PyProtectedMember
        if type(tags) in _datatag._known_tag_types:
            tag_list = [tags]  # type: ignore[list-item]
        else:
            tag_list = list(tags)  # type: ignore[arg-type]

        self._self__ansible_tags_mapping = _datatag._AnsibleTagsMapping((type(tag), tag) for tag in tag_list)

    @property
    def _ansible_tags_mapping(self) -> _datatag._AnsibleTagsMapping:
        return self._self__ansible_tags_mapping
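
A short editorial sketch of wrapping a stream with the proxy above; the stream contents are hypothetical and the absolute import path for ``Origin`` is an assumption:

# Editorial sketch: attach an Origin tag to an in-memory stream.
import io

from ansible._internal._datatag._tags import Origin  # assumed import path

stream = TaggedStreamWrapper(io.BytesIO(b'name: demo\n'), Origin(path='/tmp/demo.yml'))

print(stream.read())                 # proxied to the underlying BytesIO
print(stream._ansible_tags_mapping)  # tag type -> tag instance mapping carried by the stream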
@ -0,0 +1,128 @@
from __future__ import annotations

import dataclasses
import typing as t

from ansible.errors import AnsibleRuntimeError
from ansible.module_utils.common.messages import ErrorSummary, Detail, _dataclass_kwargs


class AnsibleCapturedError(AnsibleRuntimeError):
    """An exception representing error detail captured in another context where the error detail must be serialized to be preserved."""

    context: t.ClassVar[str]

    def __init__(
        self,
        *,
        obj: t.Any = None,
        error_summary: ErrorSummary,
    ) -> None:
        super().__init__(
            obj=obj,
        )

        self._error_summary = error_summary

    @property
    def error_summary(self) -> ErrorSummary:
        return self._error_summary


class AnsibleResultCapturedError(AnsibleCapturedError):
    """An exception representing error detail captured in a foreign context where an action/module result dictionary is involved."""

    def __init__(self, error_summary: ErrorSummary, result: dict[str, t.Any]) -> None:
        super().__init__(error_summary=error_summary)

        self._result = result

    @classmethod
    def maybe_raise_on_result(cls, result: dict[str, t.Any]) -> None:
        """Normalize the result and raise an exception if the result indicated failure."""
        if error_summary := cls.normalize_result_exception(result):
            raise error_summary.error_type(error_summary, result)

    @classmethod
    def find_first_remoted_error(cls, exception: BaseException) -> t.Self | None:
        """Find the first captured module error in the cause chain, starting with the given exception, returning None if not found."""
        while exception:
            if isinstance(exception, cls):
                return exception

            exception = exception.__cause__

        return None

    @classmethod
    def normalize_result_exception(cls, result: dict[str, t.Any]) -> CapturedErrorSummary | None:
        """
        Normalize the result `exception`, if any, to be a `CapturedErrorSummary` instance.
        If a new `CapturedErrorSummary` was created, the `error_type` will be `cls`.
        The `exception` key will be removed if falsey.
        A `CapturedErrorSummary` instance will be returned if `failed` is truthy.
        """
        if cls is AnsibleResultCapturedError:
            raise TypeError('The normalize_result_exception method cannot be called on the AnsibleResultCapturedError base type, use a derived type.')

        if not isinstance(result, dict):
            raise TypeError(f'Malformed result. Received {type(result)} instead of {dict}.')

        failed = result.get('failed')  # DTFIX-FUTURE: warn if failed is present and not a bool, or exception is present without failed being True
        exception = result.pop('exception', None)

        if not failed and not exception:
            return None

        if isinstance(exception, CapturedErrorSummary):
            error_summary = exception
        elif isinstance(exception, ErrorSummary):
            error_summary = CapturedErrorSummary(
                details=exception.details,
                formatted_traceback=cls._normalize_traceback(exception.formatted_traceback),
                error_type=cls,
            )
        else:
            # translate non-ErrorDetail errors
            error_summary = CapturedErrorSummary(
                details=(Detail(msg=str(result.get('msg', 'Unknown error.'))),),
                formatted_traceback=cls._normalize_traceback(exception),
                error_type=cls,
            )

        result.update(exception=error_summary)

        return error_summary if failed else None  # even though error detail was normalized, only return it if the result indicated failure

    @classmethod
    def _normalize_traceback(cls, value: object | None) -> str | None:
        """Normalize the provided traceback value, returning None if it is falsey."""
        if not value:
            return None

        value = str(value).rstrip()

        if not value:
            return None

        return value + '\n'


class AnsibleActionCapturedError(AnsibleResultCapturedError):
    """An exception representing error detail sourced directly by an action in its result dictionary."""

    _default_message = 'Action failed.'
    context = 'action'


class AnsibleModuleCapturedError(AnsibleResultCapturedError):
    """An exception representing error detail captured in a module context and returned from an action's result dictionary."""

    _default_message = 'Module failed.'
    context = 'target'


@dataclasses.dataclass(**_dataclass_kwargs)
class CapturedErrorSummary(ErrorSummary):
    # DTFIX-RELEASE: where to put this, name, etc. since it shows up in results, it's not exactly private (and contains a type ref to an internal type)
    error_type: type[AnsibleResultCapturedError] | None = None
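
A minimal editorial sketch of how the captured-error helpers above are meant to be used on a module result; the result dictionary is hypothetical:

# Editorial sketch: normalize a failed module result and surface it as a typed captured error.
result = {
    'failed': True,
    'msg': 'Unable to reach the configured endpoint.',
    'exception': 'Traceback (most recent call last):\n  ...\n',
}

try:
    AnsibleModuleCapturedError.maybe_raise_on_result(result)
except AnsibleModuleCapturedError as ex:
    print(ex.error_summary.details[0].msg)  # -> Unable to reach the configured endpoint.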
@ -0,0 +1,91 @@
from __future__ import annotations

import contextlib
import enum
import typing as t

from ansible.utils.display import Display
from ansible.constants import config

display = Display()

# FUTURE: add sanity test to detect use of skip_on_ignore without Skippable (and vice versa)


class ErrorAction(enum.Enum):
    """Action to take when an error is encountered."""

    IGNORE = enum.auto()
    WARN = enum.auto()
    FAIL = enum.auto()

    @classmethod
    def from_config(cls, setting: str, variables: dict[str, t.Any] | None = None) -> t.Self:
        """Return an `ErrorAction` enum from the specified Ansible config setting."""
        return cls[config.get_config_value(setting, variables=variables).upper()]


class _SkipException(BaseException):
    """Internal flow control exception for skipping code blocks within a `Skippable` context manager."""

    def __init__(self) -> None:
        super().__init__('Skipping ignored action due to use of `skip_on_ignore`. It is a bug to encounter this message outside of debugging.')


class _SkippableContextManager:
    """Internal context manager to support flow control for skipping code blocks."""

    def __enter__(self) -> None:
        pass

    def __exit__(self, exc_type, _exc_val, _exc_tb) -> bool:
        if exc_type is None:
            raise RuntimeError('A `Skippable` context manager was entered, but a `skip_on_ignore` handler was never invoked.')

        return exc_type is _SkipException  # only mask a _SkipException, allow all others to raise


Skippable = _SkippableContextManager()
"""Context manager singleton required to enclose `ErrorHandler.handle` invocations when `skip_on_ignore` is `True`."""


class ErrorHandler:
    """
    Provides a configurable error handler context manager for a specific list of exception types.
    Unhandled errors leaving the context manager can be ignored, treated as warnings, or allowed to raise by setting `ErrorAction`.
    """

    def __init__(self, action: ErrorAction) -> None:
        self.action = action

    @contextlib.contextmanager
    def handle(self, *args: type[BaseException], skip_on_ignore: bool = False) -> t.Iterator[None]:
        """
        Handle the specified exception(s) using the defined error action.
        If `skip_on_ignore` is `True`, the body of the context manager will be skipped for `ErrorAction.IGNORE`.
        Use of `skip_on_ignore` requires enclosure within the `Skippable` context manager.
        """
        if not args:
            raise ValueError('At least one exception type is required.')

        if skip_on_ignore and self.action == ErrorAction.IGNORE:
            raise _SkipException()  # skipping ignored action

        try:
            yield
        except args as ex:
            match self.action:
                case ErrorAction.WARN:
                    display.error_as_warning(msg=None, exception=ex)
                case ErrorAction.FAIL:
                    raise
                case _:  # ErrorAction.IGNORE
                    pass

        if skip_on_ignore:
            raise _SkipException()  # completed skippable action, ensures the `Skippable` context was used

    @classmethod
    def from_config(cls, setting: str, variables: dict[str, t.Any] | None = None) -> t.Self:
        """Return an `ErrorHandler` instance configured using the specified Ansible config setting."""
        return cls(ErrorAction.from_config(setting, variables=variables))
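
An editorial usage sketch of the ``Skippable``/``ErrorHandler`` pair defined above; the handled block and exception types are hypothetical:

# Editorial sketch: run an optional step whose failures are reported per the configured ErrorAction.
handler = ErrorHandler(ErrorAction.WARN)  # typically built via ErrorHandler.from_config(...)

with Skippable, handler.handle(ValueError, KeyError, skip_on_ignore=True):
    optional_value = int('not-a-number')  # this ValueError becomes a warning instead of a failure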
@ -0,0 +1,310 @@
from __future__ import annotations

import dataclasses
import itertools
import pathlib
import sys
import textwrap
import typing as t

from ansible.module_utils.common.messages import Detail, ErrorSummary
from ansible._internal._datatag._tags import Origin
from ansible.module_utils._internal import _ambient_context, _traceback
from ansible import errors

if t.TYPE_CHECKING:
    from ansible.utils.display import Display


class RedactAnnotatedSourceContext(_ambient_context.AmbientContextBase):
    """
    When active, this context will redact annotated source lines, showing only the origin.
    """


def _dedupe_and_concat_message_chain(message_parts: list[str]) -> str:
    message_parts = list(reversed(message_parts))

    message = message_parts.pop(0)

    for message_part in message_parts:
        # avoid duplicate messages where the cause was already concatenated to the exception message
        if message_part.endswith(message):
            message = message_part
        else:
            message = concat_message(message_part, message)

    return message


def _collapse_error_details(error_details: t.Sequence[Detail]) -> list[Detail]:
    """
    Return a potentially modified error chain, with redundant errors collapsed into previous error(s) in the chain.
    This reduces the verbosity of messages by eliminating repetition when multiple errors in the chain share the same contextual information.
    """
    previous_error = error_details[0]
    previous_warnings: list[str] = []
    collapsed_error_details: list[tuple[Detail, list[str]]] = [(previous_error, previous_warnings)]

    for error in error_details[1:]:
        details_present = error.formatted_source_context or error.help_text
        details_changed = error.formatted_source_context != previous_error.formatted_source_context or error.help_text != previous_error.help_text

        if details_present and details_changed:
            previous_error = error
            previous_warnings = []
            collapsed_error_details.append((previous_error, previous_warnings))
        else:
            previous_warnings.append(error.msg)

    final_error_details: list[Detail] = []

    for error, messages in collapsed_error_details:
        final_error_details.append(dataclasses.replace(error, msg=_dedupe_and_concat_message_chain([error.msg] + messages)))

    return final_error_details


def _get_cause(exception: BaseException) -> BaseException | None:
    # deprecated: description='remove support for orig_exc (deprecated in 2.23)' core_version='2.27'

    if not isinstance(exception, errors.AnsibleError):
        return exception.__cause__

    if exception.__cause__:
        if exception.orig_exc and exception.orig_exc is not exception.__cause__:
            _get_display().warning(
                msg=f"The `orig_exc` argument to `{type(exception).__name__}` was given, but differed from the cause given by `raise ... from`.",
            )

        return exception.__cause__

    if exception.orig_exc:
        # encourage the use of `raise ... from` before deprecating `orig_exc`
        _get_display().warning(msg=f"The `orig_exc` argument to `{type(exception).__name__}` was given without using `raise ... from orig_exc`.")

        return exception.orig_exc

    return None


class _TemporaryDisplay:
    # DTFIX-FUTURE: generalize this and hide it in the display module so all users of Display can benefit

    @staticmethod
    def warning(*args, **kwargs):
        print(f'FALLBACK WARNING: {args} {kwargs}', file=sys.stderr)

    @staticmethod
    def deprecated(*args, **kwargs):
        print(f'FALLBACK DEPRECATION: {args} {kwargs}', file=sys.stderr)


def _get_display() -> Display | _TemporaryDisplay:
    try:
        from ansible.utils.display import Display
    except ImportError:
        return _TemporaryDisplay()

    return Display()


def _create_error_summary(exception: BaseException, event: _traceback.TracebackEvent | None = None) -> ErrorSummary:
    from . import _captured  # avoid circular import due to AnsibleError import

    current_exception: BaseException | None = exception
    error_details: list[Detail] = []

    if event:
        formatted_traceback = _traceback.maybe_extract_traceback(exception, event)
    else:
        formatted_traceback = None

    while current_exception:
        if isinstance(current_exception, errors.AnsibleError):
            include_cause_message = current_exception._include_cause_message
            edc = Detail(
                msg=current_exception._original_message.strip(),
                formatted_source_context=current_exception._formatted_source_context,
                help_text=current_exception._help_text,
            )
        else:
            include_cause_message = True
            edc = Detail(
                msg=str(current_exception).strip(),
            )

        error_details.append(edc)

        if isinstance(current_exception, _captured.AnsibleCapturedError):
            detail = current_exception.error_summary
            error_details.extend(detail.details)

            if formatted_traceback and detail.formatted_traceback:
                formatted_traceback = (
                    f'{detail.formatted_traceback}\n'
                    f'The {current_exception.context} exception above was the direct cause of the following controller exception:\n\n'
                    f'{formatted_traceback}'
                )

        if not include_cause_message:
            break

        current_exception = _get_cause(current_exception)

    return ErrorSummary(details=tuple(error_details), formatted_traceback=formatted_traceback)


def concat_message(left: str, right: str) -> str:
    """Normalize `left` by removing trailing punctuation and spaces before appending new punctuation and `right`."""
    return f'{left.rstrip(". ")}: {right}'


def get_chained_message(exception: BaseException) -> str:
    """
    Return the full chain of exception messages by concatenating the cause(s) until all are exhausted.
    """
    error_summary = _create_error_summary(exception)
    message_parts = [edc.msg for edc in error_summary.details]

    return _dedupe_and_concat_message_chain(message_parts)


@dataclasses.dataclass(kw_only=True, frozen=True)
class SourceContext:
    origin: Origin
    annotated_source_lines: list[str]
    target_line: str | None

    def __str__(self) -> str:
        msg_lines = [f'Origin: {self.origin}']

        if self.annotated_source_lines:
            msg_lines.append('')
            msg_lines.extend(self.annotated_source_lines)

        return '\n'.join(msg_lines)

    @classmethod
    def from_value(cls, value: t.Any) -> SourceContext | None:
        """Attempt to retrieve source and render a contextual indicator from the value's origin (if any)."""
        if value is None:
            return None

        if isinstance(value, Origin):
            origin = value
            value = None
        else:
            origin = Origin.get_tag(value)

        if RedactAnnotatedSourceContext.current(optional=True):
            return cls.error('content redacted')

        if origin and origin.path:
            return cls.from_origin(origin)

        # DTFIX-RELEASE: redaction context may not be sufficient to avoid secret disclosure without SensitiveData and other enhancements
        if value is None:
            truncated_value = None
            annotated_source_lines = []
        else:
            # DTFIX-FUTURE: cleanup/share width
            try:
                value = str(value)
            except Exception as ex:
                value = f'<< context unavailable: {ex} >>'

            truncated_value = textwrap.shorten(value, width=120)
            annotated_source_lines = [truncated_value]

        return SourceContext(
            origin=origin or Origin.UNKNOWN,
            annotated_source_lines=annotated_source_lines,
            target_line=truncated_value,
        )

    @staticmethod
    def error(message: str | None, origin: Origin | None = None) -> SourceContext:
        return SourceContext(
            origin=origin,
            annotated_source_lines=[f'(source not shown: {message})'] if message else [],
            target_line=None,
        )

    @classmethod
    def from_origin(cls, origin: Origin) -> SourceContext:
        """Attempt to retrieve source and render a contextual indicator of an error location."""
        from ansible.parsing.vault import is_encrypted  # avoid circular import

        # DTFIX-FUTURE: support referencing the column after the end of the target line, so we can indicate where a missing character (quote) needs to be added
        #   this is also useful for cases like end-of-stream reported by the YAML parser

        # DTFIX-FUTURE: Implement line wrapping and match annotated line width to the terminal display width.

        context_line_count: t.Final = 2
        max_annotated_line_width: t.Final = 120
        truncation_marker: t.Final = '...'

        target_line_num = origin.line_num

        if RedactAnnotatedSourceContext.current(optional=True):
            return cls.error('content redacted', origin)

        if not target_line_num or target_line_num < 1:
            return cls.error(None, origin)  # message omitted since lack of line number is obvious from pos

        start_line_idx = max(0, (target_line_num - 1) - context_line_count)  # if near start of file
        target_col_num = origin.col_num

        try:
            with pathlib.Path(origin.path).open() as src:
                first_line = src.readline()
                lines = list(itertools.islice(itertools.chain((first_line,), src), start_line_idx, target_line_num))
        except Exception as ex:
            return cls.error(type(ex).__name__, origin)

        if is_encrypted(first_line):
            return cls.error('content encrypted', origin)

        if len(lines) != target_line_num - start_line_idx:
            return cls.error('file truncated', origin)

        annotated_source_lines = []

        line_label_width = len(str(target_line_num))
        max_src_line_len = max_annotated_line_width - line_label_width - 1

        usable_line_len = max_src_line_len

        for line_num, line in enumerate(lines, start_line_idx + 1):
            line = line.rstrip('\n')  # universal newline default mode on `open` ensures we'll never see anything but \n
            line = line.replace('\t', ' ')  # mixed tab/space handling is intentionally disabled since we're both format and display config agnostic

            if len(line) > max_src_line_len:
                line = line[: max_src_line_len - len(truncation_marker)] + truncation_marker
                usable_line_len = max_src_line_len - len(truncation_marker)

            annotated_source_lines.append(f'{str(line_num).rjust(line_label_width)}{" " if line else ""}{line}')

        if target_col_num and usable_line_len >= target_col_num >= 1:
            column_marker = f'column {target_col_num}'

            target_col_idx = target_col_num - 1

            if target_col_idx + 2 + len(column_marker) > max_src_line_len:
                column_marker = f'{" " * (target_col_idx - len(column_marker) - 1)}{column_marker} ^'
            else:
                column_marker = f'{" " * target_col_idx}^ {column_marker}'

            column_marker = f'{" " * line_label_width} {column_marker}'

            annotated_source_lines.append(column_marker)
        elif target_col_num is None:
            underline_length = len(annotated_source_lines[-1]) - line_label_width - 1
            annotated_source_lines.append(f'{" " * line_label_width} {"^" * underline_length}')

        return SourceContext(
            origin=origin,
            annotated_source_lines=annotated_source_lines,
            target_line=lines[-1].rstrip('\n'),  # universal newline default mode on `open` ensures we'll never see anything but \n
        )
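

# Illustrative usage sketch (hypothetical exception values).
# `get_chained_message` walks the cause chain (`__cause__`, and legacy `orig_exc`) and joins the de-duplicated
# messages with `concat_message`, producing a single flattened message string.
def _example_chained_message() -> str:
    try:
        try:
            raise ValueError('bad value')
        except ValueError as inner:
            raise RuntimeError('task failed') from inner
    except RuntimeError as outer:
        return get_chained_message(outer)  # -> 'task failed: bad value'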
@ -0,0 +1,160 @@
"""Internal utilities for serialization and deserialization."""

# DTFIX-RELEASE: most of this isn't JSON specific, find a better home

from __future__ import annotations

import json
import typing as t

from ansible.errors import AnsibleVariableTypeError

from ansible.module_utils._internal._datatag import (
    _ANSIBLE_ALLOWED_MAPPING_VAR_TYPES,
    _ANSIBLE_ALLOWED_NON_SCALAR_COLLECTION_VAR_TYPES,
    _ANSIBLE_ALLOWED_VAR_TYPES,
    _AnsibleTaggedStr,
    AnsibleTagHelper,
)
from ansible.module_utils._internal._json._profiles import _tagless
from ansible.parsing.vault import EncryptedString
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
from ansible.module_utils import _internal

_T = t.TypeVar('_T')
_sentinel = object()


class HasCurrent(t.Protocol):
    """Utility protocol for mixin type safety."""

    _current: t.Any


class StateTrackingMixIn(HasCurrent):
    """Mixin for use with `AnsibleVariableVisitor` to track current visitation context."""

    def __init__(self, *args, **kwargs) -> None:
        super().__init__(*args, **kwargs)

        self._stack: list[t.Any] = []

    def __enter__(self) -> None:
        self._stack.append(self._current)

    def __exit__(self, *_args, **_kwargs) -> None:
        self._stack.pop()

    def _get_stack(self) -> list[t.Any]:
        if not self._stack:
            return []

        return self._stack[1:] + [self._current]


class AnsibleVariableVisitor:
    """Utility visitor base class to recursively apply various behaviors and checks to variable object graphs."""

    def __init__(
        self,
        *,
        trusted_as_template: bool = False,
        origin: Origin | None = None,
        convert_mapping_to_dict: bool = False,
        convert_sequence_to_list: bool = False,
        convert_custom_scalars: bool = False,
        allow_encrypted_string: bool = False,
    ):
        super().__init__()  # supports StateTrackingMixIn

        self.trusted_as_template = trusted_as_template
        self.origin = origin
        self.convert_mapping_to_dict = convert_mapping_to_dict
        self.convert_sequence_to_list = convert_sequence_to_list
        self.convert_custom_scalars = convert_custom_scalars
        self.allow_encrypted_string = allow_encrypted_string

        self._current: t.Any = None  # supports StateTrackingMixIn

    def __enter__(self) -> t.Any:
        """No-op context manager dispatcher (delegates to mixin behavior if present)."""
        if func := getattr(super(), '__enter__', None):
            func()

    def __exit__(self, *args, **kwargs) -> t.Any:
        """No-op context manager dispatcher (delegates to mixin behavior if present)."""
        if func := getattr(super(), '__exit__', None):
            func(*args, **kwargs)

    def visit(self, value: _T) -> _T:
        """
        Enforces Ansible's variable type system restrictions before a var is accepted in inventory. Also, conditionally implements template trust
        compatibility, depending on the plugin's declared understanding (or lack thereof). This always recursively copies inputs to fully isolate
        inventory data from what the plugin provided, and prevent any later mutation.
        """
        return self._visit(None, value)

    def _early_visit(self, value, value_type) -> t.Any:
        """Overridable hook point to allow custom string handling in derived visitors."""
        if value_type in (str, _AnsibleTaggedStr):
            # apply compatibility behavior
            if self.trusted_as_template:
                result = TrustedAsTemplate().tag(value)
            else:
                result = value
        else:
            result = _sentinel

        return result

    def _visit(self, key: t.Any, value: _T) -> _T:
        """Internal implementation to recursively visit a data structure's contents."""
        self._current = key  # supports StateTrackingMixIn

        value_type = type(value)

        result: _T

        # DTFIX-RELEASE: the visitor is ignoring dict/mapping keys except for debugging and schema-aware checking, it should be doing type checks on keys
        # DTFIX-RELEASE: some type lists being consulted (the ones from datatag) are probably too permissive, and perhaps should not be dynamic

        if (result := self._early_visit(value, value_type)) is not _sentinel:
            pass
        # DTFIX-RELEASE: de-duplicate and optimize; extract inline generator expressions and fallback function or mapping for native type calculation?
        elif value_type in _ANSIBLE_ALLOWED_MAPPING_VAR_TYPES:  # check mappings first, because they're also collections
            with self:  # supports StateTrackingMixIn
                result = AnsibleTagHelper.tag_copy(value, ((k, self._visit(k, v)) for k, v in value.items()), value_type=value_type)
        elif value_type in _ANSIBLE_ALLOWED_NON_SCALAR_COLLECTION_VAR_TYPES:
            with self:  # supports StateTrackingMixIn
                result = AnsibleTagHelper.tag_copy(value, (self._visit(k, v) for k, v in enumerate(t.cast(t.Iterable, value))), value_type=value_type)
        elif self.allow_encrypted_string and isinstance(value, EncryptedString):
            return value  # type: ignore[return-value]  # DTFIX-RELEASE: this should probably only be allowed for values in dict, not keys (set, dict)
        elif self.convert_mapping_to_dict and _internal.is_intermediate_mapping(value):
            with self:  # supports StateTrackingMixIn
                result = {k: self._visit(k, v) for k, v in value.items()}  # type: ignore[assignment]
        elif self.convert_sequence_to_list and _internal.is_intermediate_iterable(value):
            with self:  # supports StateTrackingMixIn
                result = [self._visit(k, v) for k, v in enumerate(t.cast(t.Iterable, value))]  # type: ignore[assignment]
        elif self.convert_custom_scalars and isinstance(value, str):
            result = str(value)  # type: ignore[assignment]
        elif self.convert_custom_scalars and isinstance(value, float):
            result = float(value)  # type: ignore[assignment]
        elif self.convert_custom_scalars and isinstance(value, int) and not isinstance(value, bool):
            result = int(value)  # type: ignore[assignment]
        else:
            if value_type not in _ANSIBLE_ALLOWED_VAR_TYPES:
                raise AnsibleVariableTypeError.from_value(obj=value)

            # supported scalar type that requires no special handling, just return as-is
            result = value

        if self.origin and not Origin.is_tagged_on(result):
            # apply shared instance default origin tag
            result = self.origin.tag(result)

        return result


def json_dumps_formatted(value: object) -> str:
    """Return a JSON dump of `value` with formatting and keys sorted."""
    return json.dumps(value, cls=_tagless.Encoder, sort_keys=True, indent=4)
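

# Illustrative usage sketch (hypothetical input data; the `Origin` constructor arguments shown are assumptions).
# The visitor recursively copies a variable graph, rejecting unsupported types, optionally tagging strings as
# trusted templates, and applying a default `Origin` tag to untagged values.
def _example_visitor_usage() -> dict:
    visitor = AnsibleVariableVisitor(
        trusted_as_template=True,
        origin=Origin(description='example inventory source'),  # assumed keyword argument
        convert_mapping_to_dict=True,
    )

    return visitor.visit({'greeting': '{{ "hello" | upper }}', 'ports': [80, 443]})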
@ -0,0 +1,34 @@
from __future__ import annotations as _annotations

import typing as _t

from ansible.module_utils._internal._json import _profiles
from ansible._internal._json._profiles import _legacy
from ansible.parsing import vault as _vault


class LegacyControllerJSONEncoder(_legacy.Encoder):
    """Compatibility wrapper over `legacy` profile JSON encoder to support trust stripping and vault value plaintext conversion."""

    def __init__(self, preprocess_unsafe: bool = False, vault_to_text: bool = False, _decode_bytes: bool = False, **kwargs) -> None:
        self._preprocess_unsafe = preprocess_unsafe
        self._vault_to_text = vault_to_text
        self._decode_bytes = _decode_bytes

        super().__init__(**kwargs)

    def default(self, o: _t.Any) -> _t.Any:
        """Hooked default that can conditionally bypass base encoder behavior based on this instance's config."""
        if type(o) is _profiles._WrappedValue:  # pylint: disable=unidiomatic-typecheck
            o = o.wrapped

        if not self._preprocess_unsafe and type(o) is _legacy._Untrusted:  # pylint: disable=unidiomatic-typecheck
            return o.value  # if not emitting unsafe markers, bypass custom unsafe serialization and just return the raw value

        if self._vault_to_text and type(o) is _vault.EncryptedString:  # pylint: disable=unidiomatic-typecheck
            return str(o)  # decrypt and return the plaintext (or fail trying)

        if self._decode_bytes and isinstance(o, bytes):
            return o.decode(errors='surrogateescape')  # backward compatibility with `ansible.module_utils.basic.jsonify`

        return super().default(o)
@ -0,0 +1,55 @@
from __future__ import annotations

import datetime as _datetime

from ansible.module_utils._internal import _datatag
from ansible.module_utils._internal._json import _profiles
from ansible.parsing import vault as _vault
from ansible._internal._datatag import _tags


class _Profile(_profiles._JSONSerializationProfile):
    """Profile for external cache persistence of inventory/fact data that preserves most tags."""

    serialize_map = {}
    schema_id = 1

    @classmethod
    def post_init(cls, **kwargs):
        cls.allowed_ansible_serializable_types = (
            _profiles._common_module_types
            | _profiles._common_module_response_types
            | {
                _datatag._AnsibleTaggedDate,
                _datatag._AnsibleTaggedTime,
                _datatag._AnsibleTaggedDateTime,
                _datatag._AnsibleTaggedStr,
                _datatag._AnsibleTaggedInt,
                _datatag._AnsibleTaggedFloat,
                _datatag._AnsibleTaggedList,
                _datatag._AnsibleTaggedSet,
                _datatag._AnsibleTaggedTuple,
                _datatag._AnsibleTaggedDict,
                _tags.SourceWasEncrypted,
                _tags.Origin,
                _tags.TrustedAsTemplate,
                _vault.EncryptedString,
                _vault.VaultedValue,
            }
        )

        cls.serialize_map = {
            set: cls.serialize_as_list,
            tuple: cls.serialize_as_list,
            _datetime.date: _datatag.AnsibleSerializableDate,
            _datetime.time: _datatag.AnsibleSerializableTime,
            _datetime.datetime: _datatag.AnsibleSerializableDateTime,
        }


class Encoder(_profiles.AnsibleProfileJSONEncoder):
    _profile = _Profile


class Decoder(_profiles.AnsibleProfileJSONDecoder):
    _profile = _Profile
@ -0,0 +1,40 @@
"""
Backwards compatibility profile for serialization for persisted ansible-inventory output.
Behavior is equivalent to pre 2.18 `AnsibleJSONEncoder` with vault_to_text=True.
"""

from __future__ import annotations

from ... import _json
from . import _legacy


class _InventoryVariableVisitor(_legacy._LegacyVariableVisitor, _json.StateTrackingMixIn):
    """State-tracking visitor implementation that only applies trust to `_meta.hostvars` and `vars` inventory values."""

    # DTFIX-RELEASE: does the variable visitor need to support conversion of sequence/mapping for inventory?

    @property
    def _allow_trust(self) -> bool:
        stack = self._get_stack()

        if len(stack) >= 4 and stack[:2] == ['_meta', 'hostvars']:
            return True

        if len(stack) >= 3 and stack[1] == 'vars':
            return True

        return False


class _Profile(_legacy._Profile):
    visitor_type = _InventoryVariableVisitor
    encode_strings_as_utf8 = True


class Encoder(_legacy.Encoder):
    _profile = _Profile


class Decoder(_legacy.Decoder):
    _profile = _Profile
@ -0,0 +1,198 @@
"""
Backwards compatibility profile for serialization other than inventory (which should use inventory_legacy for backward-compatible trust behavior).
Behavior is equivalent to pre 2.18 `AnsibleJSONEncoder` with vault_to_text=True.
"""

from __future__ import annotations as _annotations

import datetime as _datetime
import typing as _t

from ansible._internal._datatag import _tags
from ansible.module_utils._internal import _datatag
from ansible.module_utils._internal._json import _profiles
from ansible.parsing import vault as _vault

from ... import _json


class _Untrusted:
    """
    Temporarily wraps strings which are not trusted for templating.
    Used before serialization of strings not tagged TrustedAsTemplate when trust inversion is enabled and trust is allowed in the string's context.
    Used during deserialization of `__ansible_unsafe` strings to indicate they should not be tagged TrustedAsTemplate.
    """

    __slots__ = ('value',)

    def __init__(self, value: str) -> None:
        self.value = value


class _LegacyVariableVisitor(_json.AnsibleVariableVisitor):
    """Variable visitor that supports optional trust inversion for legacy serialization."""

    def __init__(
        self,
        *,
        trusted_as_template: bool = False,
        invert_trust: bool = False,
        origin: _tags.Origin | None = None,
        convert_mapping_to_dict: bool = False,
        convert_sequence_to_list: bool = False,
        convert_custom_scalars: bool = False,
    ):
        super().__init__(
            trusted_as_template=trusted_as_template,
            origin=origin,
            convert_mapping_to_dict=convert_mapping_to_dict,
            convert_sequence_to_list=convert_sequence_to_list,
            convert_custom_scalars=convert_custom_scalars,
            allow_encrypted_string=True,
        )

        self.invert_trust = invert_trust

        if trusted_as_template and invert_trust:
            raise ValueError('trusted_as_template is mutually exclusive with invert_trust')

    @property
    def _allow_trust(self) -> bool:
        """
        This profile supports trust application in all contexts.
        Derived implementations can override this behavior for application-dependent/schema-aware trust.
        """
        return True

    def _early_visit(self, value, value_type) -> _t.Any:
        """Similar to base implementation, but supports an intermediate wrapper for trust inversion."""
        if value_type in (str, _datatag._AnsibleTaggedStr):
            # apply compatibility behavior
            if self.trusted_as_template and self._allow_trust:
                result = _tags.TrustedAsTemplate().tag(value)
            elif self.invert_trust and not _tags.TrustedAsTemplate.is_tagged_on(value) and self._allow_trust:
                result = _Untrusted(value)
            else:
                result = value
        elif value_type is _Untrusted:
            result = value.value
        else:
            result = _json._sentinel

        return result


class _Profile(_profiles._JSONSerializationProfile["Encoder", "Decoder"]):
    visitor_type = _LegacyVariableVisitor

    @classmethod
    def serialize_untrusted(cls, value: _Untrusted) -> dict[str, str] | str:
        return dict(
            __ansible_unsafe=_datatag.AnsibleTagHelper.untag(value.value),
        )

    @classmethod
    def serialize_tagged_str(cls, value: _datatag.AnsibleTaggedObject) -> _t.Any:
        if ciphertext := _vault.VaultHelper.get_ciphertext(value, with_tags=False):
            return dict(
                __ansible_vault=ciphertext,
            )

        return _datatag.AnsibleTagHelper.untag(value)

    @classmethod
    def deserialize_unsafe(cls, value: dict[str, _t.Any]) -> _Untrusted:
        ansible_unsafe = value['__ansible_unsafe']

        if type(ansible_unsafe) is not str:  # pylint: disable=unidiomatic-typecheck
            raise TypeError(f"__ansible_unsafe is {type(ansible_unsafe)} not {str}")

        return _Untrusted(ansible_unsafe)

    @classmethod
    def deserialize_vault(cls, value: dict[str, _t.Any]) -> _vault.EncryptedString:
        ansible_vault = value['__ansible_vault']

        if type(ansible_vault) is not str:  # pylint: disable=unidiomatic-typecheck
            raise TypeError(f"__ansible_vault is {type(ansible_vault)} not {str}")

        encrypted_string = _vault.EncryptedString(ciphertext=ansible_vault)

        return encrypted_string

    @classmethod
    def serialize_encrypted_string(cls, value: _vault.EncryptedString) -> dict[str, str]:
        return dict(
            __ansible_vault=_vault.VaultHelper.get_ciphertext(value, with_tags=False),
        )

    @classmethod
    def post_init(cls) -> None:
        cls.serialize_map = {
            set: cls.serialize_as_list,
            tuple: cls.serialize_as_list,
            _datetime.date: cls.serialize_as_isoformat,  # existing devel behavior
            _datetime.time: cls.serialize_as_isoformat,  # always failed pre-2.18, so okay to include for consistency
            _datetime.datetime: cls.serialize_as_isoformat,  # existing devel behavior
            _datatag._AnsibleTaggedDate: cls.discard_tags,
            _datatag._AnsibleTaggedTime: cls.discard_tags,
            _datatag._AnsibleTaggedDateTime: cls.discard_tags,
            _vault.EncryptedString: cls.serialize_encrypted_string,
            _datatag._AnsibleTaggedStr: cls.serialize_tagged_str,  # for VaultedValue tagged str
            _datatag._AnsibleTaggedInt: cls.discard_tags,
            _datatag._AnsibleTaggedFloat: cls.discard_tags,
            _datatag._AnsibleTaggedList: cls.discard_tags,
            _datatag._AnsibleTaggedSet: cls.discard_tags,
            _datatag._AnsibleTaggedTuple: cls.discard_tags,
            _datatag._AnsibleTaggedDict: cls.discard_tags,
            _Untrusted: cls.serialize_untrusted,  # equivalent to AnsibleJSONEncoder(preprocess_unsafe=True) in devel
        }

        cls.deserialize_map = {
            '__ansible_unsafe': cls.deserialize_unsafe,
            '__ansible_vault': cls.deserialize_vault,
        }

    @classmethod
    def pre_serialize(cls, encoder: Encoder, o: _t.Any) -> _t.Any:
        # DTFIX-RELEASE: these conversion args probably aren't needed
        avv = cls.visitor_type(invert_trust=True, convert_mapping_to_dict=True, convert_sequence_to_list=True, convert_custom_scalars=True)

        return avv.visit(o)

    @classmethod
    def post_deserialize(cls, decoder: Decoder, o: _t.Any) -> _t.Any:
        avv = cls.visitor_type(trusted_as_template=decoder._trusted_as_template, origin=decoder._origin)

        return avv.visit(o)

    @classmethod
    def handle_key(cls, k: _t.Any) -> _t.Any:
        if isinstance(k, str):
            return k

        # DTFIX-RELEASE: decide if this is a deprecation warning, error, or what?
        #   Non-string variable names have been disallowed by set_fact and other things since at least 2021.
        # DTFIX-RELEASE: document why this behavior is here, also verify the legacy tagless use case doesn't need this same behavior
        return str(k)


class Encoder(_profiles.AnsibleProfileJSONEncoder):
    _profile = _Profile


class Decoder(_profiles.AnsibleProfileJSONDecoder):
    _profile = _Profile

    def __init__(self, **kwargs) -> None:
        super().__init__(**kwargs)

        # NB: these can only be sampled properly when loading strings, eg, `json.loads`; the global `json.load` function does not expose the file-like to us
        self._origin: _tags.Origin | None = None
        self._trusted_as_template: bool = False

    def raw_decode(self, s: str, idx: int = 0) -> tuple[_t.Any, int]:
        self._origin = _tags.Origin.get_tag(s)
        self._trusted_as_template = _tags.TrustedAsTemplate.is_tagged_on(s)

        return super().raw_decode(s, idx)
@ -0,0 +1,21 @@
from __future__ import annotations

import contextlib
import fcntl
import typing as t


@contextlib.contextmanager
def named_mutex(path: str) -> t.Iterator[None]:
    """
    Lightweight context manager wrapper over `fcntl.flock` to provide IPC locking via a shared filename.
    Entering the context manager blocks until the lock is acquired.
    The lock file will be created automatically, but creation of the parent directory and deletion of the lockfile are the caller's responsibility.
    """
    with open(path, 'a') as file:
        fcntl.flock(file, fcntl.LOCK_EX)

        try:
            yield
        finally:
            fcntl.flock(file, fcntl.LOCK_UN)
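

# Illustrative usage sketch (hypothetical lock path). Concurrent callers passing the same path serialize on a shared
# flock; the lock file is created on demand and released when the block exits.
def _example_named_mutex_usage() -> None:
    with named_mutex('/tmp/example-ansible.lock'):
        pass  # only one holder of this path's lock executes here at a time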
@ -0,0 +1,57 @@
from __future__ import annotations

import functools
import json
import json.encoder
import json.decoder
import typing as t

from .._wrapt import ObjectProxy
from .._json._profiles import _cache_persistence


class PluginInterposer(ObjectProxy):
    """Proxies a Cache plugin instance to implement transparent encapsulation of serialized Ansible internal data types."""

    _PAYLOAD_KEY = '__payload__'
    """The key used to store the serialized payload."""

    def get(self, key: str) -> dict[str, object]:
        return self._decode(self.__wrapped__.get(self._get_key(key)))

    def set(self, key: str, value: dict[str, object]) -> None:
        self.__wrapped__.set(self._get_key(key), self._encode(value))

    def keys(self) -> t.Sequence[str]:
        return [k for k in (self._restore_key(k) for k in self.__wrapped__.keys()) if k is not None]

    def contains(self, key: t.Any) -> bool:
        return self.__wrapped__.contains(self._get_key(key))

    def delete(self, key: str) -> None:
        self.__wrapped__.delete(self._get_key(key))

    @classmethod
    def _restore_key(cls, wrapped_key: str) -> str | None:
        prefix = cls._get_wrapped_key_prefix()

        if not wrapped_key.startswith(prefix):
            return None

        return wrapped_key[len(prefix) :]

    @classmethod
    @functools.cache
    def _get_wrapped_key_prefix(cls) -> str:
        return f's{_cache_persistence._Profile.schema_id}_'

    @classmethod
    def _get_key(cls, key: str) -> str:
        """Augment the supplied key with a schema identifier to allow for side-by-side caching across incompatible schemas."""
        return f'{cls._get_wrapped_key_prefix()}{key}'

    def _encode(self, value: dict[str, object]) -> dict[str, object]:
        return {self._PAYLOAD_KEY: json.dumps(value, cls=_cache_persistence.Encoder)}

    def _decode(self, value: dict[str, t.Any]) -> dict[str, object]:
        return json.loads(value[self._PAYLOAD_KEY], cls=_cache_persistence.Decoder)
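

# Illustrative usage sketch (`base_plugin` is a hypothetical cache plugin instance).
# The interposer prefixes keys with the serialization schema id (e.g. 's1_facts') and stores each value as a single
# JSON payload produced by the tag-preserving `_cache_persistence` profile, so tagged values survive a cache round-trip.
def _example_interposer_usage(base_plugin: t.Any) -> dict[str, object]:
    cache = PluginInterposer(base_plugin)
    cache.set('facts', {'ansible_distribution': 'Fedora'})

    return cache.get('facts')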
@ -0,0 +1,78 @@
from __future__ import annotations

import dataclasses
import typing as t

from collections import abc as c

from ansible import constants
from ansible._internal._templating import _engine
from ansible._internal._templating._chain_templar import ChainTemplar
from ansible.errors import AnsibleError
from ansible.module_utils._internal._ambient_context import AmbientContextBase
from ansible.module_utils.datatag import native_type_name
from ansible.parsing import vault as _vault
from ansible.utils.display import Display

if t.TYPE_CHECKING:
    from ansible.playbook.task import Task


@dataclasses.dataclass
class TaskContext(AmbientContextBase):
    """Ambient context that wraps task execution on workers. It provides access to the currently executing task."""

    task: Task


TaskArgsFinalizerCallback = t.Callable[[str, t.Any, _engine.TemplateEngine, t.Any], t.Any]
"""Type alias for the shape of the `ActionBase.finalize_task_arg` method."""


class TaskArgsChainTemplar(ChainTemplar):
    """
    A ChainTemplar that carries a user-provided context object, optionally provided by `ActionBase.get_finalize_task_args_context`.
    TaskArgsFinalizer provides the context to each `ActionBase.finalize_task_arg` call to allow for more complex/stateful customization.
    """

    def __init__(self, *sources: c.Mapping, templar: _engine.TemplateEngine, callback: TaskArgsFinalizerCallback, context: t.Any) -> None:
        super().__init__(*sources, templar=templar)

        self.callback = callback
        self.context = context

    def template(self, key: t.Any, value: t.Any) -> t.Any:
        return self.callback(key, value, self.templar, self.context)


class TaskArgsFinalizer:
    """Invoked during task args finalization; allows actions to override default arg processing (e.g., templating)."""

    def __init__(self, *args: c.Mapping[str, t.Any] | str | None, templar: _engine.TemplateEngine) -> None:
        self._args_layers = [arg for arg in args if arg is not None]
        self._templar = templar

    def finalize(self, callback: TaskArgsFinalizerCallback, context: t.Any) -> dict[str, t.Any]:
        resolved_layers: list[c.Mapping[str, t.Any]] = []

        for layer in self._args_layers:
            if isinstance(layer, (str, _vault.EncryptedString)):  # EncryptedString can hide a template
                if constants.config.get_config_value('INJECT_FACTS_AS_VARS'):
                    Display().warning(
                        "Using a template for task args is unsafe in some situations "
                        "(see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe).",
                        obj=layer,
                    )

                resolved_layer = self._templar.resolve_to_container(layer, options=_engine.TemplateOptions(value_for_omit={}))
            else:
                resolved_layer = layer

            if not isinstance(resolved_layer, dict):
                raise AnsibleError(f'Task args must resolve to a {native_type_name(dict)!r} not {native_type_name(resolved_layer)!r}.', obj=layer)

            resolved_layers.append(resolved_layer)

        ct = TaskArgsChainTemplar(*reversed(resolved_layers), templar=self._templar, callback=callback, context=context)

        return ct.as_dict()
@ -0,0 +1,10 @@
from __future__ import annotations

from jinja2 import __version__ as _jinja2_version

# DTFIX-FUTURE: sanity test to ensure this doesn't drift from requirements
_MINIMUM_JINJA_VERSION = (3, 1)
_CURRENT_JINJA_VERSION = tuple(map(int, _jinja2_version.split('.', maxsplit=2)[:2]))

if _CURRENT_JINJA_VERSION < _MINIMUM_JINJA_VERSION:
    raise RuntimeError(f'Jinja version {".".join(map(str, _MINIMUM_JINJA_VERSION))} or higher is required (current version {_jinja2_version}).')
@ -0,0 +1,86 @@
from __future__ import annotations

import abc
import typing as t

from contextvars import ContextVar

from ansible.module_utils._internal._datatag import AnsibleTagHelper


class NotifiableAccessContextBase(metaclass=abc.ABCMeta):
    """Base class for a context manager that, when active, receives notification of managed access for types/tags in which it has registered an interest."""

    _type_interest: t.FrozenSet[type] = frozenset()
    """Set of types (including tag types) for which this context will be notified upon access."""

    _mask: t.ClassVar[bool] = False
    """When true, only the innermost (most recently created) context of this type will be notified."""

    def __enter__(self):
        # noinspection PyProtectedMember
        AnsibleAccessContext.current()._register_interest(self)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        # noinspection PyProtectedMember
        AnsibleAccessContext.current()._unregister_interest(self)
        return None

    @abc.abstractmethod
    def _notify(self, o: t.Any) -> t.Any:
        """Derived classes implement custom notification behavior when a registered type or tag is accessed."""


class AnsibleAccessContext:
    """
    Broker object for managed access registration and notification.
    Each thread or other logical callstack has a dedicated `AnsibleAccessContext` object with which `NotifiableAccessContext` objects can register interest.
    When a managed access occurs on an object, each active `NotifiableAccessContext` within the current callstack that has registered interest in that
    object's type or a tag present on it will be notified.
    """

    _contextvar: t.ClassVar[ContextVar[AnsibleAccessContext]] = ContextVar('AnsibleAccessContext')

    @staticmethod
    def current() -> AnsibleAccessContext:
        """Creates or retrieves an `AnsibleAccessContext` for the current logical callstack."""
        try:
            ctx: AnsibleAccessContext = AnsibleAccessContext._contextvar.get()
        except LookupError:
            # didn't exist; create it
            ctx = AnsibleAccessContext()
            AnsibleAccessContext._contextvar.set(ctx)  # we ignore the token, since this should live for the life of the thread/async ctx

        return ctx

    def __init__(self) -> None:
        self._notify_contexts: list[NotifiableAccessContextBase] = []

    def _register_interest(self, context: NotifiableAccessContextBase) -> None:
        self._notify_contexts.append(context)

    def _unregister_interest(self, context: NotifiableAccessContextBase) -> None:
        ctx = self._notify_contexts.pop()

        if ctx is not context:
            raise RuntimeError(f'Out-of-order context deactivation detected. Found {ctx} instead of {context}.')

    def access(self, value: t.Any) -> None:
        """Notify all contexts which have registered interest in the given value that it is being accessed."""
        if not self._notify_contexts:
            return

        value_types = AnsibleTagHelper.tag_types(value) | frozenset((type(value),))
        masked: set[type] = set()

        for ctx in reversed(self._notify_contexts):
            if ctx._mask:
                if (ctx_type := type(ctx)) in masked:
                    continue

                masked.add(ctx_type)

            # noinspection PyProtectedMember
            if ctx._type_interest.intersection(value_types):
                ctx._notify(value)
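

# Illustrative sketch of a concrete notifiable context (hypothetical class, registering interest in plain `str` values).
# Entering the context registers it with the current `AnsibleAccessContext`; a managed `access(value)` call then
# notifies every active context whose `_type_interest` intersects the value's type or tags.
class _ExampleStringAuditContext(NotifiableAccessContextBase):
    _type_interest = frozenset([str])

    def __init__(self) -> None:
        self.seen: list[t.Any] = []

    def _notify(self, o: t.Any) -> t.Any:
        self.seen.append(o)  # record each string observed via AnsibleAccessContext.current().access(...)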
@ -0,0 +1,63 @@
from __future__ import annotations

import collections.abc as c
import itertools
import typing as t

from ansible.errors import AnsibleValueOmittedError, AnsibleError

from ._engine import TemplateEngine


class ChainTemplar:
    """A basic variable layering mechanism that supports templating and obliteration of `omit` values."""

    def __init__(self, *sources: c.Mapping, templar: TemplateEngine) -> None:
        self.sources = sources
        self.templar = templar

    def template(self, key: t.Any, value: t.Any) -> t.Any:
        """
        Render the given value using the templar.
        Intended to be overridden by subclasses.
        """
        return self.templar.template(value)

    def get(self, key: t.Any) -> t.Any:
        """Get the value for the given key, templating the result before returning it."""
        for source in self.sources:
            if key not in source:
                continue

            value = source[key]

            try:
                return self.template(key, value)
            except AnsibleValueOmittedError:
                break  # omit == obliterate - matches historical behavior where dict layers were squashed before templating was applied
            except Exception as ex:
                raise AnsibleError(f'Error while resolving value for {key!r}.', obj=value) from ex

        raise KeyError(key)

    def keys(self) -> t.Iterable[t.Any]:
        """
        Returns a sorted iterable of all keys present in all source layers, without templating associated values.
        Values that resolve to `omit` are thus included.
        """
        return sorted(set(itertools.chain.from_iterable(self.sources)))

    def items(self) -> t.Iterable[t.Tuple[t.Any, t.Any]]:
        """
        Returns a sorted iterable of (key, templated value) tuples.
        Any tuple where the templated value resolves to `omit` will not be included in the result.
        """
        for key in self.keys():
            try:
                yield key, self.get(key)
            except KeyError:
                pass

    def as_dict(self) -> dict[t.Any, t.Any]:
        """Returns a dict representing all layers, squashed and templated, with `omit` values dropped."""
        return dict(self.items())
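

# Illustrative usage sketch (hypothetical variable layers; assumes the string values are trusted templates and that
# `package_name` and `omit` resolve in the supplied templar's variables). Earlier sources win for duplicate keys,
# values are templated on access, and keys whose values resolve to `omit` drop out of `as_dict()`.
def _example_chain_templar(templar: TemplateEngine) -> dict[t.Any, t.Any]:
    task_args = {'name': '{{ package_name }}', 'state': 'present'}
    defaults = {'state': 'latest', 'update_cache': '{{ omit }}'}

    return ChainTemplar(task_args, defaults, templar=templar).as_dict()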
@ -0,0 +1,95 @@
from __future__ import annotations

import contextlib as _contextlib
import dataclasses
import typing as t

from ansible.module_utils._internal._datatag import AnsibleSingletonTagBase, _tag_dataclass_kwargs
from ansible.module_utils._internal._datatag._tags import Deprecated
from ansible._internal._datatag._tags import Origin
from ansible.utils.display import Display

from ._access import NotifiableAccessContextBase
from ._utils import TemplateContext


display = Display()


@dataclasses.dataclass(**_tag_dataclass_kwargs)
class _JinjaConstTemplate(AnsibleSingletonTagBase):
    # deprecated: description='embedded Jinja constant string template support' core_version='2.23'
    pass


@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
class _TrippedDeprecationInfo:
    template: str
    deprecated: Deprecated


class DeprecatedAccessAuditContext(NotifiableAccessContextBase):
    """When active, captures metadata about managed accesses to `Deprecated` tagged objects."""

    _type_interest = frozenset([Deprecated])

    @classmethod
    def when(cls, condition: bool, /) -> t.Self | _contextlib.nullcontext:
        """Returns a new instance if `condition` is True (usually `TemplateContext.is_top_level`), otherwise a `nullcontext` instance."""
        if condition:
            return cls()

        return _contextlib.nullcontext()

    def __init__(self) -> None:
        self._tripped_deprecation_info: dict[int, _TrippedDeprecationInfo] = {}

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        result = super().__exit__(exc_type, exc_val, exc_tb)

        for item in self._tripped_deprecation_info.values():
            if Origin.is_tagged_on(item.template):
                msg = item.deprecated.msg
            else:
                # without an origin, we need to include what context we do have (the template)
                msg = f'While processing {item.template!r}: {item.deprecated.msg}'

            display._deprecated_with_plugin_info(
                msg=msg,
                help_text=item.deprecated.help_text,
                version=item.deprecated.removal_version,
                date=item.deprecated.removal_date,
                obj=item.template,
                plugin=item.deprecated.plugin,
            )

        return result

    def _notify(self, o: t.Any) -> None:
        deprecated = Deprecated.get_required_tag(o)
        deprecated_key = id(deprecated)

        if deprecated_key in self._tripped_deprecation_info:
            return  # record only the first access for each deprecated tag in a given context

        template_ctx = TemplateContext.current(optional=True)
        template = template_ctx.template_value if template_ctx else None

        # when the current template input is a container, provide a descriptive string with origin propagated (if possible)
        if not isinstance(template, str):
            # DTFIX-FUTURE: ascend the template stack to try and find the nearest string source template
            origin = Origin.get_tag(template)

            # DTFIX-RELEASE: this should probably use a synthesized description value on the tag
            #   it is reachable from the data_tagging_controller test: ../playbook_output_validator/filter.py actual_stdout.txt actual_stderr.txt
            #   -[DEPRECATION WARNING]: `something_old` is deprecated, don't use it! This feature will be removed in version 1.2.3.
            #   +[DEPRECATION WARNING]: While processing '<<container>>': `something_old` is deprecated, don't use it! This feature will be removed in ...
            template = '<<container>>'

            if origin:
                origin.tag(template)

        self._tripped_deprecation_info[deprecated_key] = _TrippedDeprecationInfo(
            template=template,
            deprecated=deprecated,
        )
@ -0,0 +1,588 @@
|
|||||||
|
# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
|
||||||
|
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import copy
|
||||||
|
import dataclasses
|
||||||
|
import enum
|
||||||
|
import textwrap
|
||||||
|
import typing as t
|
||||||
|
import collections.abc as c
|
||||||
|
import re
|
||||||
|
|
||||||
|
from collections import ChainMap
|
||||||
|
|
||||||
|
from ansible.errors import (
|
||||||
|
AnsibleError,
|
||||||
|
AnsibleValueOmittedError,
|
||||||
|
AnsibleUndefinedVariable,
|
||||||
|
AnsibleTemplateSyntaxError,
|
||||||
|
AnsibleBrokenConditionalError,
|
||||||
|
AnsibleTemplateTransformLimitError,
|
||||||
|
TemplateTrustCheckFailedError,
|
||||||
|
)
|
||||||
|
|
||||||
|
from ansible.module_utils._internal._datatag import AnsibleTaggedObject, NotTaggableError, AnsibleTagHelper
|
||||||
|
from ansible._internal._errors._handler import Skippable
|
||||||
|
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
|
||||||
|
from ansible.utils.display import Display
|
||||||
|
from ansible.utils.vars import validate_variable_name
|
||||||
|
from ansible.parsing.dataloader import DataLoader
|
||||||
|
|
||||||
|
from ._datatag import DeprecatedAccessAuditContext
|
||||||
|
from ._jinja_bits import (
|
||||||
|
AnsibleTemplate,
|
||||||
|
_TemplateCompileContext,
|
||||||
|
TemplateOverrides,
|
||||||
|
AnsibleEnvironment,
|
||||||
|
defer_template_error,
|
||||||
|
create_template_error,
|
||||||
|
is_possibly_template,
|
||||||
|
is_possibly_all_template,
|
||||||
|
AnsibleTemplateExpression,
|
||||||
|
_finalize_template_result,
|
||||||
|
FinalizeMode,
|
||||||
|
)
|
||||||
|
from ._jinja_common import _TemplateConfig, MarkerError, ExceptionMarker
|
||||||
|
from ._lazy_containers import _AnsibleLazyTemplateMixin
|
||||||
|
from ._marker_behaviors import MarkerBehavior, FAIL_ON_UNDEFINED
|
||||||
|
from ._transform import _type_transform_mapping
|
||||||
|
from ._utils import Omit, TemplateContext, IGNORE_SCALAR_VAR_TYPES, LazyOptions
|
||||||
|
from ...module_utils.datatag import native_type_name
|
||||||
|
|
||||||
|
_display = Display()
|
||||||
|
|
||||||
|
|
||||||
|
_shared_empty_unmask_type_names: frozenset[str] = frozenset()
|
||||||
|
|
||||||
|
TRANSFORM_CHAIN_LIMIT: int = 10
|
||||||
|
"""Arbitrary limit for chained transforms to prevent cycles; an exception will be raised if exceeded."""


class TemplateMode(enum.Enum):
    # DTFIX-FUTURE: this enum ideally wouldn't exist - revisit/rename before making public
    DEFAULT = enum.auto()
    STOP_ON_TEMPLATE = enum.auto()
    STOP_ON_CONTAINER = enum.auto()
    ALWAYS_FINALIZE = enum.auto()


@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
class TemplateOptions:
    DEFAULT: t.ClassVar[t.Self]

    value_for_omit: object = Omit
    escape_backslashes: bool = True
    preserve_trailing_newlines: bool = True
    # DTFIX-RELEASE: these aren't really overrides anymore, rename the dataclass and this field
    # also mention in docstring this has no effect unless used to template a string
    overrides: TemplateOverrides = TemplateOverrides.DEFAULT


TemplateOptions.DEFAULT = TemplateOptions()
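# Illustrative sketch: a minimal, standalone model of the options pattern above, that is a frozen,
# keyword-only dataclass with a shared DEFAULT instance that callers reuse instead of building new
# options per call. The names below are hypothetical and are not Ansible API.
import dataclasses
import typing as t


@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
class RenderOptions:
    DEFAULT: t.ClassVar["RenderOptions"]

    escape_backslashes: bool = True
    preserve_trailing_newlines: bool = True


RenderOptions.DEFAULT = RenderOptions()


def render(text: str, *, options: RenderOptions = RenderOptions.DEFAULT) -> str:
    # frozen instances are immutable, so the shared DEFAULT can be passed around safely
    return text if options.preserve_trailing_newlines else text.rstrip("\n")


assert render("hi\n") == "hi\n"
assert render("hi\n", options=RenderOptions(preserve_trailing_newlines=False)) == "hi"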
|
||||||
|
|
||||||
|
class TemplateEncountered(Exception):
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
class TemplateEngine:
|
||||||
|
"""
|
||||||
|
The main class for templating, with the main entry-point of template().
|
||||||
|
"""
|
||||||
|
|
||||||
|
_sentinel = object()
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
loader: DataLoader | None = None,
|
||||||
|
variables: dict[str, t.Any] | ChainMap[str, t.Any] | None = None,
|
||||||
|
variables_factory: t.Callable[[], dict[str, t.Any] | ChainMap[str, t.Any]] | None = None,
|
||||||
|
marker_behavior: MarkerBehavior | None = None,
|
||||||
|
):
|
||||||
|
self._loader = loader
|
||||||
|
self._variables = variables
|
||||||
|
self._variables_factory = variables_factory
|
||||||
|
self._environment: AnsibleEnvironment | None = None
|
||||||
|
|
||||||
|
# inherit marker behavior from the active template context's templar unless otherwise specified
|
||||||
|
if not marker_behavior:
|
||||||
|
if template_ctx := TemplateContext.current(optional=True):
|
||||||
|
marker_behavior = template_ctx.templar.marker_behavior
|
||||||
|
else:
|
||||||
|
marker_behavior = FAIL_ON_UNDEFINED
|
||||||
|
|
||||||
|
self._marker_behavior = marker_behavior
|
||||||
|
|
||||||
|
def copy(self) -> t.Self:
|
||||||
|
new_engine = copy.copy(self)
|
||||||
|
new_engine._environment = None
|
||||||
|
|
||||||
|
return new_engine
|
||||||
|
|
||||||
|
def extend(self, marker_behavior: MarkerBehavior | None = None) -> t.Self:
|
||||||
|
# DTFIX-RELEASE: bikeshed name, supported features
|
||||||
|
new_templar = type(self)(
|
||||||
|
loader=self._loader,
|
||||||
|
variables=self._variables,
|
||||||
|
variables_factory=self._variables_factory,
|
||||||
|
marker_behavior=marker_behavior or self._marker_behavior,
|
||||||
|
)
|
||||||
|
|
||||||
|
if self._environment:
|
||||||
|
new_templar._environment = self._environment
|
||||||
|
|
||||||
|
return new_templar
|
||||||
|
|
||||||
|
@property
|
||||||
|
def marker_behavior(self) -> MarkerBehavior:
|
||||||
|
return self._marker_behavior
|
||||||
|
|
||||||
|
@property
|
||||||
|
def basedir(self) -> str:
|
||||||
|
"""The basedir from DataLoader."""
|
||||||
|
return self._loader.get_basedir() if self._loader else '.'
|
||||||
|
|
||||||
|
@property
|
||||||
|
def environment(self) -> AnsibleEnvironment:
|
||||||
|
if not self._environment:
|
||||||
|
self._environment = AnsibleEnvironment(ansible_basedir=self.basedir)
|
||||||
|
|
||||||
|
return self._environment
|
||||||
|
|
||||||
|
def _create_overlay(self, template: str, overrides: TemplateOverrides) -> tuple[str, AnsibleEnvironment]:
|
||||||
|
try:
|
||||||
|
template, overrides = overrides._extract_template_overrides(template)
|
||||||
|
except Exception as ex:
|
||||||
|
raise AnsibleTemplateSyntaxError("Syntax error in template.", obj=template) from ex
|
||||||
|
|
||||||
|
env = self.environment
|
||||||
|
|
||||||
|
if overrides is not TemplateOverrides.DEFAULT and (overlay_kwargs := overrides.overlay_kwargs()):
|
||||||
|
env = t.cast(AnsibleEnvironment, env.overlay(**overlay_kwargs))
|
||||||
|
|
||||||
|
return template, env
|
||||||
|
|
||||||
|
    @staticmethod
    def _count_newlines_from_end(in_str):
        """
        Counts the number of newlines at the end of a string. This is used during
        the jinja2 templating to ensure the count matches the input, since some newlines
        may be thrown away during the templating.
        """

        i = len(in_str)
        j = i - 1

        try:
            while in_str[j] == '\n':
                j -= 1
        except IndexError:
            # Uncommon cases: zero length string and string containing only newlines
            return i

        return i - 1 - j
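# Illustrative sketch: a standalone demonstration of the trailing-newline bookkeeping described in
# the docstring above: count the newlines on the input, compare with the rendered output, and
# re-append the difference. `render` below is a stand-in for the real template call, not Ansible API.
def count_newlines_from_end(text: str) -> int:
    stripped = text.rstrip('\n')
    return len(text) - len(stripped)


def render(text: str) -> str:
    return text.rstrip('\n')  # pretend the template engine dropped the trailing newlines


source = "hello\n\n"
result = render(source)
missing = count_newlines_from_end(source) - count_newlines_from_end(result)

if missing > 0:
    result += '\n' * missing

assert result == source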
|
||||||
|
@property
|
||||||
|
def available_variables(self) -> dict[str, t.Any] | ChainMap[str, t.Any]:
|
||||||
|
"""Available variables this instance will use when templating."""
|
||||||
|
# DTFIX-RELEASE: ensure that we're always accessing this as a shallow container-level snapshot, and eliminate uses of anything
|
||||||
|
# that directly mutates this value. _new_context may resolve this for us?
|
||||||
|
if self._variables is None:
|
||||||
|
self._variables = self._variables_factory() if self._variables_factory else {}
|
||||||
|
|
||||||
|
return self._variables
|
||||||
|
|
||||||
|
@available_variables.setter
|
||||||
|
def available_variables(self, variables: dict[str, t.Any]) -> None:
|
||||||
|
self._variables = variables
|
||||||
|
|
||||||
|
    def resolve_variable_expression(
        self,
        expression: str,
        *,
        local_variables: dict[str, t.Any] | None = None,
    ) -> t.Any:
        """
        Resolve a potentially untrusted string variable expression consisting only of valid identifiers, integers, dots, and indexing containing these.
        Optional local variables may be provided, which can only be referenced directly by the given expression.
        Valid: x, x.y, x[y].z, x[1], 1, x[y.z]
        Error: 'x', x['y'], q('env')
        """
        components = re.split(r'[.\[\]]', expression)

        try:
            for component in components:
                if re.fullmatch('[0-9]*', component):
                    continue  # allow empty strings and integers

                validate_variable_name(component)
        except Exception as ex:
            raise AnsibleError(f'Invalid variable expression: {expression}', obj=expression) from ex

        return self.evaluate_expression(TrustedAsTemplate().tag(expression), local_variables=local_variables)
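# Illustrative sketch: a minimal model of the validation step above. Split the expression on dots
# and brackets, then require each non-numeric component to be a plain identifier. This mirrors the
# shape of the check only; `validate_variable_name` in the real code enforces Ansible's own rules.
import re


def check_variable_expression(expression: str) -> None:
    for component in re.split(r'[.\[\]]', expression):
        if re.fullmatch('[0-9]*', component):
            continue  # empty strings (from adjacent delimiters) and integer indexes are fine

        if not component.isidentifier():
            raise ValueError(f'invalid variable expression: {expression!r}')


check_variable_expression('x[y].z')  # ok
check_variable_expression('x[1]')    # ok

try:
    check_variable_expression("q('env')")  # function calls are rejected
except ValueError as ex:
    print(ex)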
|
||||||
|
@staticmethod
|
||||||
|
def variable_name_as_template(name: str) -> str:
|
||||||
|
"""Return a trusted template string that will resolve the provided variable name. Raises an error if `name` is not a valid identifier."""
|
||||||
|
validate_variable_name(name)
|
||||||
|
return AnsibleTagHelper.tag('{{' + name + '}}', (AnsibleTagHelper.tags(name) | {TrustedAsTemplate()}))
|
||||||
|
|
||||||
|
def transform(self, variable: t.Any) -> t.Any:
|
||||||
|
"""Recursively apply transformations to the given value and return the result."""
|
||||||
|
return self.template(variable, mode=TemplateMode.ALWAYS_FINALIZE, lazy_options=LazyOptions.SKIP_TEMPLATES_AND_ACCESS)
|
||||||
|
|
||||||
|
def template(
|
||||||
|
self,
|
||||||
|
variable: t.Any, # DTFIX-RELEASE: once we settle the new/old API boundaries, rename this (here and in other methods)
|
||||||
|
*,
|
||||||
|
options: TemplateOptions = TemplateOptions.DEFAULT,
|
||||||
|
mode: TemplateMode = TemplateMode.DEFAULT,
|
||||||
|
lazy_options: LazyOptions = LazyOptions.DEFAULT,
|
||||||
|
) -> t.Any:
|
||||||
|
"""Templates (possibly recursively) any given data as input."""
|
||||||
|
original_variable = variable
|
||||||
|
|
||||||
|
for _attempt in range(TRANSFORM_CHAIN_LIMIT):
|
||||||
|
if variable is None or (value_type := type(variable)) in IGNORE_SCALAR_VAR_TYPES:
|
||||||
|
return variable # quickly ignore supported scalar types which are not be templated
|
||||||
|
|
||||||
|
value_is_str = isinstance(variable, str)
|
||||||
|
|
||||||
|
if template_ctx := TemplateContext.current(optional=True):
|
||||||
|
stop_on_template = template_ctx.stop_on_template
|
||||||
|
else:
|
||||||
|
stop_on_template = False
|
||||||
|
|
||||||
|
if mode is TemplateMode.STOP_ON_TEMPLATE:
|
||||||
|
stop_on_template = True
|
||||||
|
|
||||||
|
with (
|
||||||
|
TemplateContext(template_value=variable, templar=self, options=options, stop_on_template=stop_on_template) as ctx,
|
||||||
|
DeprecatedAccessAuditContext.when(ctx.is_top_level),
|
||||||
|
):
|
||||||
|
try:
|
||||||
|
if not value_is_str:
|
||||||
|
# transforms are currently limited to non-str types as an optimization
|
||||||
|
if (transform := _type_transform_mapping.get(value_type)) and value_type.__name__ not in lazy_options.unmask_type_names:
|
||||||
|
variable = transform(variable)
|
||||||
|
continue
|
||||||
|
|
||||||
|
template_result = _AnsibleLazyTemplateMixin._try_create(variable, lazy_options)
|
||||||
|
elif not lazy_options.template:
|
||||||
|
template_result = variable
|
||||||
|
elif not is_possibly_template(variable, options.overrides):
|
||||||
|
template_result = variable
|
||||||
|
elif not self._trust_check(variable, skip_handler=stop_on_template):
|
||||||
|
template_result = variable
|
||||||
|
elif stop_on_template:
|
||||||
|
raise TemplateEncountered()
|
||||||
|
else:
|
||||||
|
compiled_template = self._compile_template(variable, options)
|
||||||
|
|
||||||
|
template_result = compiled_template(self.available_variables)
|
||||||
|
template_result = self._post_render_mutation(variable, template_result, options)
|
||||||
|
except TemplateEncountered:
|
||||||
|
raise
|
||||||
|
except Exception as ex:
|
||||||
|
template_result = defer_template_error(ex, variable, is_expression=False)
|
||||||
|
|
||||||
|
if ctx.is_top_level or mode is TemplateMode.ALWAYS_FINALIZE:
|
||||||
|
template_result = self._finalize_top_level_template_result(
|
||||||
|
variable, options, template_result, stop_on_container=mode is TemplateMode.STOP_ON_CONTAINER
|
||||||
|
)
|
||||||
|
|
||||||
|
return template_result
|
||||||
|
|
||||||
|
raise AnsibleTemplateTransformLimitError(obj=original_variable)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _finalize_top_level_template_result(
|
||||||
|
variable: t.Any,
|
||||||
|
options: TemplateOptions,
|
||||||
|
template_result: t.Any,
|
||||||
|
is_expression: bool = False,
|
||||||
|
stop_on_container: bool = False,
|
||||||
|
) -> t.Any:
|
||||||
|
"""
|
||||||
|
This method must be called for expressions and top-level templates to recursively finalize the result.
|
||||||
|
This renders any embedded templates and triggers `Marker` and omit behaviors.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
if template_result is Omit:
|
||||||
|
# When the template result is Omit, raise an AnsibleValueOmittedError if value_for_omit is Omit, otherwise return value_for_omit.
|
||||||
|
# Other occurrences of Omit will simply drop out of containers during _finalize_template_result.
|
||||||
|
if options.value_for_omit is Omit:
|
||||||
|
raise AnsibleValueOmittedError()
|
||||||
|
|
||||||
|
return options.value_for_omit # trust that value_for_omit is an allowed type
|
||||||
|
|
||||||
|
if stop_on_container and type(template_result) in AnsibleTaggedObject._collection_types:
|
||||||
|
# Use of stop_on_container implies the caller will perform necessary checks on values,
|
||||||
|
# most likely by passing them back into the templating system.
|
||||||
|
try:
|
||||||
|
return template_result._non_lazy_copy()
|
||||||
|
except AttributeError:
|
||||||
|
return template_result # non-lazy containers are returned as-is
|
||||||
|
|
||||||
|
return _finalize_template_result(template_result, FinalizeMode.TOP_LEVEL)
|
||||||
|
except TemplateEncountered:
|
||||||
|
raise
|
||||||
|
except Exception as ex:
|
||||||
|
raise_from: BaseException
|
||||||
|
|
||||||
|
if isinstance(ex, MarkerError):
|
||||||
|
exception_to_raise = ex.source._as_exception()
|
||||||
|
|
||||||
|
# MarkerError is never suitable for use as the cause of another exception, it is merely a raiseable container for the source marker
|
||||||
|
# used for flow control (so its stack trace is rarely useful). However, if the source derives from a ExceptionMarker, its contained
|
||||||
|
# exception (previously raised) should be used as the cause. Other sources do not contain exceptions, so cannot provide a cause.
|
||||||
|
raise_from = exception_to_raise if isinstance(ex.source, ExceptionMarker) else None
|
||||||
|
else:
|
||||||
|
exception_to_raise = ex
|
||||||
|
raise_from = ex
|
||||||
|
|
||||||
|
exception_to_raise = create_template_error(exception_to_raise, variable, is_expression)
|
||||||
|
|
||||||
|
if exception_to_raise is ex:
|
||||||
|
raise # when the exception to raise is the active exception, just re-raise it
|
||||||
|
|
||||||
|
if exception_to_raise is raise_from:
|
||||||
|
raise_from = exception_to_raise.__cause__ # preserve the exception's cause, if any, otherwise no cause will be used
|
||||||
|
|
||||||
|
raise exception_to_raise from raise_from # always raise from something to avoid the currently active exception becoming __context__
|
||||||
|
|
||||||
|
def _compile_template(self, template: str, options: TemplateOptions) -> t.Callable[[c.Mapping[str, t.Any]], t.Any]:
|
||||||
|
# NOTE: Creating an overlay that lives only inside _compile_template means that overrides are not applied
|
||||||
|
# when templating nested variables, where Templar.environment is used, not the overlay. They are, however,
|
||||||
|
# applied to includes and imports.
|
||||||
|
try:
|
||||||
|
stripped_template, env = self._create_overlay(template, options.overrides)
|
||||||
|
|
||||||
|
with _TemplateCompileContext(escape_backslashes=options.escape_backslashes):
|
||||||
|
return t.cast(AnsibleTemplate, env.from_string(stripped_template))
|
||||||
|
except Exception as ex:
|
||||||
|
return self._defer_jinja_compile_error(ex, template, False)
|
||||||
|
|
||||||
|
def _compile_expression(self, expression: str, options: TemplateOptions) -> t.Callable[[c.Mapping[str, t.Any]], t.Any]:
|
||||||
|
"""
|
||||||
|
Compile a Jinja expression, applying optional compile-time behavior via an environment overlay (if needed). The overlay is
|
||||||
|
necessary to avoid mutating settings on the Templar's shared environment, which could be visible to other code running concurrently.
|
||||||
|
In the specific case of escape_backslashes, the setting only applies to a top-level template at compile-time, not runtime, to
|
||||||
|
ensure that any nested template calls (e.g., include and import) do not inherit the (lack of) escaping behavior.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
with _TemplateCompileContext(escape_backslashes=options.escape_backslashes):
|
||||||
|
return AnsibleTemplateExpression(self.environment.compile_expression(expression, False))
|
||||||
|
except Exception as ex:
|
||||||
|
return self._defer_jinja_compile_error(ex, expression, True)
|
||||||
|
|
||||||
|
def _defer_jinja_compile_error(self, ex: Exception, variable: str, is_expression: bool) -> t.Callable[[c.Mapping[str, t.Any]], t.Any]:
|
||||||
|
deferred_error = defer_template_error(ex, variable, is_expression=is_expression)
|
||||||
|
|
||||||
|
def deferred_exception(_jinja_vars: c.Mapping[str, t.Any]) -> t.Any:
|
||||||
|
# a template/expression compile error always results in a single node representing the compile error
|
||||||
|
return self.marker_behavior.handle_marker(deferred_error)
|
||||||
|
|
||||||
|
return deferred_exception
|
||||||
|
|
||||||
|
def _post_render_mutation(self, template: str, result: t.Any, options: TemplateOptions) -> t.Any:
|
||||||
|
if options.preserve_trailing_newlines and isinstance(result, str):
|
||||||
|
# The low level calls above do not preserve the newline
|
||||||
|
# characters at the end of the input data, so we
|
||||||
|
# calculate the difference in newlines and append them
|
||||||
|
# to the resulting output for parity
|
||||||
|
#
|
||||||
|
# Using AnsibleEnvironment's keep_trailing_newline instead would
|
||||||
|
# result in change in behavior when trailing newlines
|
||||||
|
# would be kept also for included templates, for example:
|
||||||
|
# "Hello {% include 'world.txt' %}!" would render as
|
||||||
|
# "Hello world\n!\n" instead of "Hello world!\n".
|
||||||
|
data_newlines = self._count_newlines_from_end(template)
|
||||||
|
res_newlines = self._count_newlines_from_end(result)
|
||||||
|
|
||||||
|
if data_newlines > res_newlines:
|
||||||
|
newlines = options.overrides.newline_sequence * (data_newlines - res_newlines)
|
||||||
|
result = AnsibleTagHelper.tag_copy(result, result + newlines)
|
||||||
|
|
||||||
|
# If the input string template was source-tagged and the result is not, propagate the source tag to the new value.
|
||||||
|
# This provides further contextual information when a template-derived value/var causes an error.
|
||||||
|
if not Origin.is_tagged_on(result) and (origin := Origin.get_tag(template)):
|
||||||
|
try:
|
||||||
|
result = origin.tag(result)
|
||||||
|
except NotTaggableError:
|
||||||
|
pass # best effort- if we can't, oh well
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
def is_template(self, data: t.Any, overrides: TemplateOverrides = TemplateOverrides.DEFAULT) -> bool:
|
||||||
|
"""
|
||||||
|
Evaluate the input data to determine if it contains a template, even if that template is invalid. Containers will be recursively searched.
|
||||||
|
Objects subject to template-time transforms that do not yield a template are not considered templates by this method.
|
||||||
|
Gating a conditional call to `template` with this method is redundant and inefficient -- request templating unconditionally instead.
|
||||||
|
"""
|
||||||
|
options = TemplateOptions(overrides=overrides) if overrides is not TemplateOverrides.DEFAULT else TemplateOptions.DEFAULT
|
||||||
|
|
||||||
|
try:
|
||||||
|
self.template(data, options=options, mode=TemplateMode.STOP_ON_TEMPLATE)
|
||||||
|
except TemplateEncountered:
|
||||||
|
return True
|
||||||
|
else:
|
||||||
|
return False
|
||||||
|
|
||||||
|
def resolve_to_container(self, variable: t.Any, options: TemplateOptions = TemplateOptions.DEFAULT) -> t.Any:
|
||||||
|
"""
|
||||||
|
Recursively resolve scalar string template input, stopping at the first container encountered (if any).
|
||||||
|
Used for e.g., partial templating of task arguments, where the plugin needs to handle final resolution of some args internally.
|
||||||
|
"""
|
||||||
|
return self.template(variable, options=options, mode=TemplateMode.STOP_ON_CONTAINER)
|
||||||
|
|
||||||
|
def evaluate_expression(
|
||||||
|
self,
|
||||||
|
expression: str,
|
||||||
|
*,
|
||||||
|
local_variables: dict[str, t.Any] | None = None,
|
||||||
|
escape_backslashes: bool = True,
|
||||||
|
_render_jinja_const_template: bool = False,
|
||||||
|
) -> t.Any:
|
||||||
|
"""
|
||||||
|
Evaluate a trusted string expression and return its result.
|
||||||
|
Optional local variables may be provided, which can only be referenced directly by the given expression.
|
||||||
|
"""
|
||||||
|
if not isinstance(expression, str):
|
||||||
|
raise TypeError(f"Expressions must be {str!r}, got {type(expression)!r}.")
|
||||||
|
|
||||||
|
options = TemplateOptions(escape_backslashes=escape_backslashes, preserve_trailing_newlines=False)
|
||||||
|
|
||||||
|
with (
|
||||||
|
TemplateContext(template_value=expression, templar=self, options=options, _render_jinja_const_template=_render_jinja_const_template) as ctx,
|
||||||
|
DeprecatedAccessAuditContext.when(ctx.is_top_level),
|
||||||
|
):
|
||||||
|
try:
|
||||||
|
if not TrustedAsTemplate.is_tagged_on(expression):
|
||||||
|
raise TemplateTrustCheckFailedError(obj=expression)
|
||||||
|
|
||||||
|
template_variables = ChainMap(local_variables, self.available_variables) if local_variables else self.available_variables
|
||||||
|
compiled_template = self._compile_expression(expression, options)
|
||||||
|
|
||||||
|
template_result = compiled_template(template_variables)
|
||||||
|
template_result = self._post_render_mutation(expression, template_result, options)
|
||||||
|
except Exception as ex:
|
||||||
|
template_result = defer_template_error(ex, expression, is_expression=True)
|
||||||
|
|
||||||
|
return self._finalize_top_level_template_result(expression, options, template_result, is_expression=True)
|
||||||
|
|
||||||
|
_BROKEN_CONDITIONAL_ALLOWED_FRAGMENT = 'Broken conditionals are currently allowed because the `ALLOW_BROKEN_CONDITIONALS` configuration option is enabled.'
|
||||||
|
_CONDITIONAL_AS_TEMPLATE_MSG = 'Conditionals should not be surrounded by templating delimiters such as {{ }} or {% %}.'
|
||||||
|
|
||||||
|
def _strip_conditional_handle_empty(self, conditional) -> t.Any:
|
||||||
|
"""
|
||||||
|
Strips leading/trailing whitespace from the input expression.
|
||||||
|
If `ALLOW_BROKEN_CONDITIONALS` is enabled, None/empty is coerced to True (legacy behavior, deprecated).
|
||||||
|
Otherwise, None/empty results in a broken conditional error being raised.
|
||||||
|
"""
|
||||||
|
if isinstance(conditional, str):
|
||||||
|
# Leading/trailing whitespace on conditional expressions is not a problem, except we can't tell if the expression is empty (which *is* a problem).
|
||||||
|
# Always strip conditional input strings. Neither conditional expressions nor all-template conditionals have legit reasons to preserve
|
||||||
|
# surrounding whitespace, and they complicate detection and processing of all-template fallback cases.
|
||||||
|
conditional = AnsibleTagHelper.tag_copy(conditional, conditional.strip())
|
||||||
|
|
||||||
|
if conditional in (None, ''):
|
||||||
|
# deprecated backward-compatible behavior; None/empty input conditionals are always True
|
||||||
|
if _TemplateConfig.allow_broken_conditionals:
|
||||||
|
_display.deprecated(
|
||||||
|
msg='Empty conditional expression was evaluated as True.',
|
||||||
|
help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT,
|
||||||
|
obj=conditional,
|
||||||
|
version='2.23',
|
||||||
|
)
|
||||||
|
|
||||||
|
return True
|
||||||
|
|
||||||
|
raise AnsibleBrokenConditionalError("Empty conditional expressions are not allowed.", obj=conditional)
|
||||||
|
|
||||||
|
return conditional
|
||||||
|
|
||||||
|
def _normalize_and_evaluate_conditional(self, conditional: str | bool) -> t.Any:
|
||||||
|
"""Validate and normalize a conditional input value, resolving allowed embedded template cases and evaluating the resulting expression."""
|
||||||
|
conditional = self._strip_conditional_handle_empty(conditional)
|
||||||
|
|
||||||
|
# this must follow `_strip_conditional_handle_empty`, since None/empty are coerced to bool (deprecated)
|
||||||
|
if type(conditional) is bool: # pylint: disable=unidiomatic-typecheck
|
||||||
|
return conditional
|
||||||
|
|
||||||
|
try:
|
||||||
|
if not isinstance(conditional, str):
|
||||||
|
if _TemplateConfig.allow_broken_conditionals:
|
||||||
|
# because the input isn't a string, the result will never be a bool; the broken conditional warning in the caller will apply on the result
|
||||||
|
return self.template(conditional, mode=TemplateMode.ALWAYS_FINALIZE)
|
||||||
|
|
||||||
|
raise AnsibleBrokenConditionalError(message="Conditional expressions must be strings.", obj=conditional)
|
||||||
|
|
||||||
|
if is_possibly_all_template(conditional):
|
||||||
|
# Indirection of trusted expressions is always allowed. If the expression appears to be entirely wrapped in template delimiters,
|
||||||
|
# we must resolve it. e.g. `when: "{{ some_var_resolving_to_a_trusted_expression_string }}"`.
|
||||||
|
# Some invalid meta-templating corner cases may sneak through here (e.g., `when: '{{ "foo" }} == {{ "bar" }}'`); these will
|
||||||
|
# result in an untrusted expression error.
|
||||||
|
result = self.template(conditional, mode=TemplateMode.ALWAYS_FINALIZE)
|
||||||
|
result = self._strip_conditional_handle_empty(result)
|
||||||
|
|
||||||
|
if not isinstance(result, str):
|
||||||
|
_display.deprecated(msg=self._CONDITIONAL_AS_TEMPLATE_MSG, obj=conditional, version='2.23')
|
||||||
|
|
||||||
|
return result # not an expression
|
||||||
|
|
||||||
|
# The only allowed use of templates for conditionals is for indirect usage of an expression.
|
||||||
|
# Any other usage should simply be an expression, not an attempt at meta templating.
|
||||||
|
expression = result
|
||||||
|
else:
|
||||||
|
expression = conditional
|
||||||
|
|
||||||
|
# Disable escape_backslashes when processing conditionals, to maintain backwards compatibility.
|
||||||
|
# This is necessary because conditionals were previously evaluated using {% %}, which was *NOT* affected by escape_backslashes.
|
||||||
|
# Now that conditionals use expressions, they would be affected by escape_backslashes if it was not disabled.
|
||||||
|
return self.evaluate_expression(expression, escape_backslashes=False, _render_jinja_const_template=True)
|
||||||
|
|
||||||
|
except AnsibleUndefinedVariable as ex:
|
||||||
|
# DTFIX-FUTURE: we're only augmenting the message for context here; once we have proper contextual tracking, we can dump the re-raise
|
||||||
|
raise AnsibleUndefinedVariable("Error while evaluating conditional.", obj=conditional) from ex
|
||||||
|
|
||||||
|
    def evaluate_conditional(self, conditional: str | bool) -> bool:
        """
        Evaluate a trusted string expression or boolean and return its boolean result. A non-boolean result will raise `AnsibleBrokenConditionalError`.
        The ALLOW_BROKEN_CONDITIONALS configuration option can temporarily relax this requirement, allowing truthy conditionals to succeed.
        """
        result = self._normalize_and_evaluate_conditional(conditional)

        if isinstance(result, bool):
            return result

        bool_result = bool(result)

        msg = (
            f'Conditional result was {textwrap.shorten(str(result), width=40)!r} of type {native_type_name(result)!r}, '
            f'which evaluates to {bool_result}. Conditionals must have a boolean result.'
        )

        if _TemplateConfig.allow_broken_conditionals:
            _display.deprecated(msg=msg, obj=conditional, help_text=self._BROKEN_CONDITIONAL_ALLOWED_FRAGMENT, version='2.23')

            return bool_result

        raise AnsibleBrokenConditionalError(msg, obj=conditional)
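# Illustrative sketch: a standalone model of the strict-boolean rule above. A conditional whose
# result is merely truthy (for example a non-empty list) is rejected unless an opt-in compatibility
# flag is set. The names below are hypothetical; only the decision flow mirrors evaluate_conditional().
import textwrap


def check_conditional_result(result, *, allow_broken: bool = False) -> bool:
    if isinstance(result, bool):
        return result

    msg = (
        f'Conditional result was {textwrap.shorten(str(result), width=40)!r} of type {type(result).__name__!r}, '
        f'which evaluates to {bool(result)}. Conditionals must have a boolean result.'
    )

    if allow_broken:
        print(f'DEPRECATION: {msg}')  # stand-in for Display.deprecated()
        return bool(result)

    raise ValueError(msg)


assert check_conditional_result(1 + 1 == 2) is True
assert check_conditional_result(['item'], allow_broken=True) is True  # warns, then coerces

try:
    check_conditional_result(['item'])  # strict mode: non-boolean results are an error
except ValueError as ex:
    print(ex)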
|
||||||
|
    @staticmethod
    def _trust_check(value: str, skip_handler: bool = False) -> bool:
        """
        Return True if the given value is trusted for templating, otherwise return False.
        When the value is not trusted, a warning or error may be generated, depending on configuration.
        """
        if TrustedAsTemplate.is_tagged_on(value):
            return True

        if not skip_handler:
            with Skippable, _TemplateConfig.untrusted_template_handler.handle(TemplateTrustCheckFailedError, skip_on_ignore=True):
                raise TemplateTrustCheckFailedError(obj=value)

        return False
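# Illustrative sketch: a toy model of the trust check above. Only strings explicitly marked as
# trusted are eligible for template rendering; everything else is returned verbatim. The real
# implementation uses Ansible's data tagging (TrustedAsTemplate) rather than a str subclass.
class TrustedStr(str):
    """A string that has been explicitly marked as trusted template input."""


def render_if_trusted(value: str) -> str:
    if isinstance(value, TrustedStr):
        return f"<rendered {value!r}>"  # stand-in for actual template rendering

    return value  # untrusted input is never treated as a template


assert render_if_trusted("{{ secret }}") == "{{ secret }}"  # untrusted: left alone
assert render_if_trusted(TrustedStr("{{ 1 + 1 }}")).startswith("<rendered")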
@@ -0,0 +1,28 @@
from __future__ import annotations

from ansible.errors import AnsibleTemplatePluginError


class AnsibleTemplatePluginRuntimeError(AnsibleTemplatePluginError):
    """The specified template plugin (lookup/filter/test) raised an exception during execution."""

    def __init__(self, plugin_type: str, plugin_name: str) -> None:
        super().__init__(f'The {plugin_type} plugin {plugin_name!r} failed.')


class AnsibleTemplatePluginLoadError(AnsibleTemplatePluginError):
    """The specified template plugin (lookup/filter/test) failed to load."""

    def __init__(self, plugin_type: str, plugin_name: str) -> None:
        super().__init__(f'The {plugin_type} plugin {plugin_name!r} failed to load.')


class AnsibleTemplatePluginNotFoundError(AnsibleTemplatePluginError, KeyError):
    """
    The specified template plugin (lookup/filter/test) was not found.
    This exception extends KeyError since Jinja filter/test resolution requires a KeyError to detect missing plugins.
    Jinja compilation fails if a non-KeyError is raised for a missing filter/test, even if the plugin will not be invoked (inconsistent with stock Jinja).
    """

    def __init__(self, plugin_type: str, plugin_name: str) -> None:
        super().__init__(f'The {plugin_type} plugin {plugin_name!r} was not found.')
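# Illustrative sketch: why the not-found error above also derives from KeyError. Lookups done
# through a mapping interface (as Jinja does for filters/tests) can keep using `except KeyError`,
# while Ansible code can catch the more specific domain error. The names below are stand-ins,
# not real Ansible API.
class TemplatePluginError(Exception):
    """Base class for template plugin failures (stand-in for AnsibleTemplatePluginError)."""


class TemplatePluginNotFoundError(TemplatePluginError, KeyError):
    """Raised when a plugin name cannot be resolved; still satisfies `except KeyError`."""


class PluginTable:
    def __getitem__(self, name: str):
        raise TemplatePluginNotFoundError(f'The filter plugin {name!r} was not found.')


plugins = PluginTable()

try:
    plugins['no_such_filter']
except KeyError as ex:  # generic mapping-style handling still works
    print('missing:', ex)

try:
    plugins['no_such_filter']
except TemplatePluginError as ex:  # domain-specific handling also works
    print('plugin error:', ex)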
File diff suppressed because it is too large
@@ -0,0 +1,332 @@
from __future__ import annotations
|
||||||
|
|
||||||
|
import abc
|
||||||
|
import collections.abc as c
|
||||||
|
import inspect
|
||||||
|
import itertools
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from jinja2 import UndefinedError, StrictUndefined, TemplateRuntimeError
|
||||||
|
from jinja2.utils import missing
|
||||||
|
|
||||||
|
from ansible.module_utils.common.messages import ErrorSummary, Detail
|
||||||
|
from ansible.constants import config
|
||||||
|
from ansible.errors import AnsibleUndefinedVariable, AnsibleTypeError
|
||||||
|
from ansible._internal._errors._handler import ErrorHandler
|
||||||
|
from ansible.module_utils._internal._datatag import Tripwire, _untaggable_types
|
||||||
|
|
||||||
|
from ._access import NotifiableAccessContextBase
|
||||||
|
from ._jinja_patches import _patch_jinja
|
||||||
|
from ._utils import TemplateContext
|
||||||
|
from .._errors import _captured
|
||||||
|
from ...module_utils.datatag import native_type_name
|
||||||
|
|
||||||
|
_patch_jinja() # apply Jinja2 patches before types are declared that are dependent on the changes
|
||||||
|
|
||||||
|
|
||||||
|
class _TemplateConfig:
|
||||||
|
allow_embedded_templates: bool = config.get_config_value("ALLOW_EMBEDDED_TEMPLATES")
|
||||||
|
allow_broken_conditionals: bool = config.get_config_value('ALLOW_BROKEN_CONDITIONALS')
|
||||||
|
jinja_extensions: list[str] = config.get_config_value('DEFAULT_JINJA2_EXTENSIONS')
|
||||||
|
|
||||||
|
unknown_type_encountered_handler = ErrorHandler.from_config('_TEMPLAR_UNKNOWN_TYPE_ENCOUNTERED')
|
||||||
|
unknown_type_conversion_handler = ErrorHandler.from_config('_TEMPLAR_UNKNOWN_TYPE_CONVERSION')
|
||||||
|
untrusted_template_handler = ErrorHandler.from_config('_TEMPLAR_UNTRUSTED_TEMPLATE_BEHAVIOR')
|
||||||
|
|
||||||
|
|
||||||
|
class MarkerError(UndefinedError):
|
||||||
|
"""
|
||||||
|
An Ansible specific subclass of Jinja's UndefinedError, used to preserve and later restore the original Marker instance that raised the error.
|
||||||
|
This error is only raised by Marker and should never escape the templating system.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, message: str, source: Marker) -> None:
|
||||||
|
super().__init__(message)
|
||||||
|
|
||||||
|
self.source = source
|
||||||
|
|
||||||
|
|
||||||
|
class Marker(StrictUndefined, Tripwire):
|
||||||
|
"""
|
||||||
|
Extends Jinja's `StrictUndefined`, allowing any kind of error occurring during recursive templating operations to be captured and deferred.
|
||||||
|
Direct or managed access to most `Marker` attributes will raise a `MarkerError`, which usually ends the current innermost templating
|
||||||
|
operation and converts the `MarkerError` back to the origin Marker instance (subject to the `MarkerBehavior` in effect at the time).
|
||||||
|
"""
|
||||||
|
|
||||||
|
__slots__ = ('_marker_template_source',)
|
||||||
|
|
||||||
|
concrete_subclasses: t.ClassVar[set[type[Marker]]] = set()
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
hint: t.Optional[str] = None,
|
||||||
|
obj: t.Any = missing,
|
||||||
|
name: t.Optional[str] = None,
|
||||||
|
exc: t.Type[TemplateRuntimeError] = UndefinedError, # Ansible doesn't set this argument or consume the attribute it is stored under.
|
||||||
|
*args,
|
||||||
|
_no_template_source=False,
|
||||||
|
**kwargs,
|
||||||
|
) -> None:
|
||||||
|
if not hint and name and obj is not missing:
|
||||||
|
hint = f"object of type {native_type_name(obj)!r} has no attribute {name!r}"
|
||||||
|
|
||||||
|
kwargs.update(
|
||||||
|
hint=hint,
|
||||||
|
obj=obj,
|
||||||
|
name=name,
|
||||||
|
exc=exc,
|
||||||
|
)
|
||||||
|
|
||||||
|
super().__init__(*args, **kwargs)
|
||||||
|
|
||||||
|
if _no_template_source:
|
||||||
|
self._marker_template_source = None
|
||||||
|
else:
|
||||||
|
self._marker_template_source = TemplateContext.current().template_value
|
||||||
|
|
||||||
|
def _as_exception(self) -> Exception:
|
||||||
|
"""Return the exception instance to raise in a top-level templating context."""
|
||||||
|
return AnsibleUndefinedVariable(self._undefined_message, obj=self._marker_template_source)
|
||||||
|
|
||||||
|
    def _as_message(self) -> str:
        """Return the error message to show when this marker must be represented as a string, such as for substitutions or warnings."""
        return self._undefined_message
|
||||||
|
def _fail_with_undefined_error(self, *args: t.Any, **kwargs: t.Any) -> t.NoReturn:
|
||||||
|
"""Ansible-specific replacement for Jinja's _fail_with_undefined_error tripwire on dunder methods."""
|
||||||
|
self.trip()
|
||||||
|
|
||||||
|
def trip(self) -> t.NoReturn:
|
||||||
|
"""Raise an internal exception which can be converted back to this instance."""
|
||||||
|
raise MarkerError(self._undefined_message, self)
|
||||||
|
|
||||||
|
def __setattr__(self, name: str, value: t.Any) -> None:
|
||||||
|
"""
|
||||||
|
Any attempt to set an unknown attribute on a `Marker` should invoke the trip method to propagate the original context.
|
||||||
|
This does not protect against mutation of known attributes, but the implementation is fairly simple.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
super().__setattr__(name, value)
|
||||||
|
except AttributeError:
|
||||||
|
pass
|
||||||
|
else:
|
||||||
|
return
|
||||||
|
|
||||||
|
self.trip()
|
||||||
|
|
||||||
|
def __getattr__(self, name: str) -> t.Any:
|
||||||
|
"""Raises AttributeError for dunder-looking accesses, self-propagates otherwise."""
|
||||||
|
if name.startswith('__') and name.endswith('__'):
|
||||||
|
raise AttributeError(name)
|
||||||
|
|
||||||
|
return self
|
||||||
|
|
||||||
|
def __getitem__(self, key):
|
||||||
|
"""Self-propagates on all item accesses."""
|
||||||
|
return self
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def __init_subclass__(cls, **kwargs) -> None:
|
||||||
|
if not inspect.isabstract(cls):
|
||||||
|
_untaggable_types.add(cls)
|
||||||
|
cls.concrete_subclasses.add(cls)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _init_class(cls):
|
||||||
|
_untaggable_types.add(cls)
|
||||||
|
|
||||||
|
# These are the methods StrictUndefined already intercepts.
|
||||||
|
jinja_method_names = (
|
||||||
|
'__add__',
|
||||||
|
'__bool__',
|
||||||
|
'__call__',
|
||||||
|
'__complex__',
|
||||||
|
'__contains__',
|
||||||
|
'__div__',
|
||||||
|
'__eq__',
|
||||||
|
'__float__',
|
||||||
|
'__floordiv__',
|
||||||
|
'__ge__',
|
||||||
|
# '__getitem__', # using a custom implementation that propagates self instead
|
||||||
|
'__gt__',
|
||||||
|
'__hash__',
|
||||||
|
'__int__',
|
||||||
|
'__iter__',
|
||||||
|
'__le__',
|
||||||
|
'__len__',
|
||||||
|
'__lt__',
|
||||||
|
'__mod__',
|
||||||
|
'__mul__',
|
||||||
|
'__ne__',
|
||||||
|
'__neg__',
|
||||||
|
'__pos__',
|
||||||
|
'__pow__',
|
||||||
|
'__radd__',
|
||||||
|
'__rdiv__',
|
||||||
|
'__rfloordiv__',
|
||||||
|
'__rmod__',
|
||||||
|
'__rmul__',
|
||||||
|
'__rpow__',
|
||||||
|
'__rsub__',
|
||||||
|
'__rtruediv__',
|
||||||
|
'__str__',
|
||||||
|
'__sub__',
|
||||||
|
'__truediv__',
|
||||||
|
)
|
||||||
|
|
||||||
|
# These additional methods should be intercepted, even though they are not intercepted by StrictUndefined.
|
||||||
|
additional_method_names = (
|
||||||
|
'__aiter__',
|
||||||
|
'__delattr__',
|
||||||
|
'__format__',
|
||||||
|
'__repr__',
|
||||||
|
'__setitem__',
|
||||||
|
)

        for name in jinja_method_names + additional_method_names:
            setattr(cls, name, cls._fail_with_undefined_error)


Marker._init_class()
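# Illustrative sketch: the pattern used by Marker._init_class() above. A single "tripwire"
# callable is installed on a list of dunder methods with setattr, so that almost any use of the
# object raises a controlled error. This toy class is hypothetical; only the setattr pattern is
# the point.
class Tripwire:
    def _trip(self, *args, **kwargs):
        raise RuntimeError('tripwire: this placeholder value was used')

    @classmethod
    def _init_class(cls):
        for name in ('__str__', '__int__', '__bool__', '__add__', '__len__', '__iter__'):
            setattr(cls, name, cls._trip)


Tripwire._init_class()

value = Tripwire()

try:
    str(value)  # any intercepted operation raises the controlled error
except RuntimeError as ex:
    print(ex)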
|
||||||
|
|
||||||
|
class TruncationMarker(Marker):
    """
    A `Marker` value was previously encountered and reported.
    A subsequent `Marker` value (this instance) indicates the template may have been truncated as a result.
    It will only be visible if the previous `Marker` was ignored/replaced instead of being tripped, which would raise an exception.
    """
|
||||||
|
|
||||||
|
# DTFIX-RELEASE: make this a singleton?
|
||||||
|
|
||||||
|
__slots__ = ()
|
||||||
|
|
||||||
|
def __init__(self) -> None:
|
||||||
|
super().__init__(hint='template potentially truncated')
|
||||||
|
|
||||||
|
|
||||||
|
class UndefinedMarker(Marker):
|
||||||
|
"""A `Marker` value that represents an undefined value encountered during templating."""
|
||||||
|
|
||||||
|
__slots__ = ()
|
||||||
|
|
||||||
|
|
||||||
|
class ExceptionMarker(Marker, metaclass=abc.ABCMeta):
|
||||||
|
"""Base `Marker` class that represents exceptions encountered and deferred during templating."""
|
||||||
|
|
||||||
|
__slots__ = ()
|
||||||
|
|
||||||
|
@abc.abstractmethod
|
||||||
|
def _as_exception(self) -> Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
def _as_message(self) -> str:
|
||||||
|
return str(self._as_exception())
|
||||||
|
|
||||||
|
def trip(self) -> t.NoReturn:
|
||||||
|
"""Raise an internal exception which can be converted back to this instance while maintaining the cause for callers that follow them."""
|
||||||
|
raise MarkerError(self._undefined_message, self) from self._as_exception()
|
||||||
|
|
||||||
|
|
||||||
|
class CapturedExceptionMarker(ExceptionMarker):
|
||||||
|
"""A `Marker` value that represents an exception raised during templating."""
|
||||||
|
|
||||||
|
__slots__ = ('_marker_captured_exception',)
|
||||||
|
|
||||||
|
def __init__(self, exception: Exception) -> None:
|
||||||
|
super().__init__(hint=f'A captured exception marker was tripped: {exception}')
|
||||||
|
|
||||||
|
self._marker_captured_exception = exception
|
||||||
|
|
||||||
|
def _as_exception(self) -> Exception:
|
||||||
|
return self._marker_captured_exception
|
||||||
|
|
||||||
|
|
||||||
|
class UndecryptableVaultError(_captured.AnsibleCapturedError):
|
||||||
|
"""Template-external error raised by VaultExceptionMarker when an undecryptable variable is accessed."""
|
||||||
|
|
||||||
|
context = 'vault'
|
||||||
|
_default_message = "Attempt to use undecryptable variable."
|
||||||
|
|
||||||
|
|
||||||
|
class VaultExceptionMarker(ExceptionMarker):
|
||||||
|
"""A `Marker` value that represents an error accessing a vaulted value during templating."""
|
||||||
|
|
||||||
|
__slots__ = ('_marker_undecryptable_ciphertext', '_marker_undecryptable_reason', '_marker_undecryptable_traceback')
|
||||||
|
|
||||||
|
def __init__(self, ciphertext: str, reason: str, traceback: str | None) -> None:
|
||||||
|
# DTFIX-RELEASE: when does this show up, should it contain more details?
|
||||||
|
# see also CapturedExceptionMarker for a similar issue
|
||||||
|
super().__init__(hint='A vault exception marker was tripped.')
|
||||||
|
|
||||||
|
self._marker_undecryptable_ciphertext = ciphertext
|
||||||
|
self._marker_undecryptable_reason = reason
|
||||||
|
self._marker_undecryptable_traceback = traceback
|
||||||
|
|
||||||
|
def _as_exception(self) -> Exception:
|
||||||
|
return UndecryptableVaultError(
|
||||||
|
obj=self._marker_undecryptable_ciphertext,
|
||||||
|
error_summary=ErrorSummary(
|
||||||
|
details=(
|
||||||
|
Detail(
|
||||||
|
msg=self._marker_undecryptable_reason,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
formatted_traceback=self._marker_undecryptable_traceback,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
def _disarm(self) -> str:
|
||||||
|
return self._marker_undecryptable_ciphertext
|
||||||
|
|
||||||
|
|
||||||
|
def get_first_marker_arg(args: c.Sequence, kwargs: dict[str, t.Any]) -> Marker | None:
|
||||||
|
"""Utility method to inspect plugin args and return the first `Marker` encountered, otherwise `None`."""
|
||||||
|
# DTFIX-RELEASE: this may or may not need to be public API, move back to utils or once usage is wrapped in a decorator?
|
||||||
|
for arg in iter_marker_args(args, kwargs):
|
||||||
|
return arg
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def iter_marker_args(args: c.Sequence, kwargs: dict[str, t.Any]) -> t.Generator[Marker]:
|
||||||
|
"""Utility method to iterate plugin args and yield any `Marker` encountered."""
|
||||||
|
# DTFIX-RELEASE: this may or may not need to be public API, move back to utils or once usage is wrapped in a decorator?
|
||||||
|
for arg in itertools.chain(args, kwargs.values()):
|
||||||
|
if isinstance(arg, Marker):
|
||||||
|
yield arg
|
||||||
|
|
||||||
|
|
||||||
|
class JinjaCallContext(NotifiableAccessContextBase):
|
||||||
|
"""
|
||||||
|
An audit context that wraps all Jinja (template/filter/test/lookup/method/function) calls.
|
||||||
|
While active, calls `trip()` on managed access of `Marker` objects unless the callee declares an understanding of markers.
|
||||||
|
"""
|
||||||
|
|
||||||
|
_mask = True
|
||||||
|
|
||||||
|
def __init__(self, accept_lazy_markers: bool) -> None:
|
||||||
|
self._type_interest = frozenset() if accept_lazy_markers else frozenset(Marker.concrete_subclasses)
|
||||||
|
|
||||||
|
def _notify(self, o: Marker) -> t.NoReturn:
|
||||||
|
o.trip()
|
||||||
|
|
||||||
|
|
||||||
|
def validate_arg_type(name: str, value: t.Any, allowed_type_or_types: type | tuple[type, ...], /) -> None:
    """Validate the type of the given argument while preserving context for Marker values."""
    # DTFIX-RELEASE: find a home for this as a general-purpose utility method and expose it after some API review
    if isinstance(value, allowed_type_or_types):
        return

    if isinstance(allowed_type_or_types, type):
        arg_type_description = repr(native_type_name(allowed_type_or_types))
    else:
        arg_type_description = ' or '.join(repr(native_type_name(item)) for item in allowed_type_or_types)

    if isinstance(value, Marker):
        try:
            value.trip()
        except Exception as ex:
            raise AnsibleTypeError(f"The {name!r} argument must be of type {arg_type_description}.", obj=value) from ex

    raise AnsibleTypeError(f"The {name!r} argument must be of type {arg_type_description}, not {native_type_name(value)!r}.", obj=value)
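# Illustrative sketch: a trimmed-down model of the helper above, without the Marker handling.
# Accept a type or a tuple of types, build a readable description, and raise a descriptive error.
# The names are hypothetical; the real helper additionally preserves error context for Marker values.
def require_type(name, value, allowed):
    if isinstance(value, allowed):
        return

    if isinstance(allowed, type):
        description = repr(allowed.__name__)
    else:
        description = ' or '.join(repr(item.__name__) for item in allowed)

    raise TypeError(f"The {name!r} argument must be of type {description}, not {type(value).__name__!r}.")


require_type('count', 3, int)

try:
    require_type('count', '3', (int, float))
except TypeError as ex:
    print(ex)  # The 'count' argument must be of type 'int' or 'float', not 'str'.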
@@ -0,0 +1,44 @@
"""Runtime patches for Jinja bugs affecting Ansible."""

from __future__ import annotations

import jinja2
import jinja2.utils


def _patch_jinja_undefined_slots() -> None:
    """
    Fix the broken __slots__ on Jinja's Undefined and StrictUndefined if they're missing in the current version.
    This will no longer be necessary once the fix is included in the minimum supported Jinja version.
    See: https://github.com/pallets/jinja/issues/2025
    """
    if not hasattr(jinja2.Undefined, '__slots__'):
        jinja2.Undefined.__slots__ = (
            "_undefined_hint",
            "_undefined_obj",
            "_undefined_name",
            "_undefined_exception",
        )

    if not hasattr(jinja2.StrictUndefined, '__slots__'):
        jinja2.StrictUndefined.__slots__ = ()


def _patch_jinja_missing_type() -> None:
    """
    Fix the `jinja2.utils.missing` type to support pickling while remaining a singleton.
    This will no longer be necessary once the fix is included in the minimum supported Jinja version.
    See: https://github.com/pallets/jinja/issues/2027
    """
    if getattr(jinja2.utils.missing, '__reduce__')() != 'missing':

        def __reduce__(*_args):
            return 'missing'

        type(jinja2.utils.missing).__reduce__ = __reduce__


def _patch_jinja() -> None:
    """Apply Jinja2 patches."""
    _patch_jinja_undefined_slots()
    _patch_jinja_missing_type()
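# Illustrative sketch: the trick used by _patch_jinja_missing_type() above. When __reduce__
# returns a plain string, pickle records a reference to that module-level name, so the unpickled
# object is the very same singleton instance. This toy sentinel is hypothetical; run it as a
# script so pickle can find `missing` at module level.
import pickle


class _MissingType:
    def __repr__(self):
        return 'missing'

    def __reduce__(self):
        return 'missing'  # tell pickle to look up the module-level name instead of rebuilding


missing = _MissingType()

restored = pickle.loads(pickle.dumps(missing))
assert restored is missing  # identity is preserved, so `x is missing` checks keep working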
@@ -0,0 +1,351 @@
"""Jinja template plugins (filters, tests, lookups) and custom global functions."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import collections.abc as c
|
||||||
|
import dataclasses
|
||||||
|
import datetime
|
||||||
|
import functools
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from ansible.errors import (
|
||||||
|
AnsibleTemplatePluginError,
|
||||||
|
)
|
||||||
|
|
||||||
|
from ansible.module_utils._internal._ambient_context import AmbientContextBase
|
||||||
|
from ansible.module_utils._internal._plugin_exec_context import PluginExecContext
|
||||||
|
from ansible.module_utils.common.collections import is_sequence
|
||||||
|
from ansible.module_utils._internal._datatag import AnsibleTagHelper
|
||||||
|
from ansible._internal._datatag._tags import TrustedAsTemplate
|
||||||
|
from ansible.plugins import AnsibleJinja2Plugin
|
||||||
|
from ansible.plugins.loader import lookup_loader, Jinja2Loader
|
||||||
|
from ansible.plugins.lookup import LookupBase
|
||||||
|
from ansible.utils.display import Display
|
||||||
|
|
||||||
|
from ._datatag import _JinjaConstTemplate
|
||||||
|
from ._errors import AnsibleTemplatePluginRuntimeError, AnsibleTemplatePluginLoadError, AnsibleTemplatePluginNotFoundError
|
||||||
|
from ._jinja_common import MarkerError, _TemplateConfig, get_first_marker_arg, Marker, JinjaCallContext
|
||||||
|
from ._lazy_containers import lazify_container_kwargs, lazify_container_args, lazify_container, _AnsibleLazyTemplateMixin
|
||||||
|
from ._utils import LazyOptions, TemplateContext
|
||||||
|
|
||||||
|
_display = Display()
|
||||||
|
|
||||||
|
_TCallable = t.TypeVar("_TCallable", bound=t.Callable)
|
||||||
|
_ITERATOR_TYPES: t.Final = (c.Iterator, c.ItemsView, c.KeysView, c.ValuesView, range)
|
||||||
|
|
||||||
|
|
||||||
|
class JinjaPluginIntercept(c.MutableMapping):
    """
    Simulated dict class that loads Jinja2 plugins on request;
    otherwise all plugins would need to be loaded up front.

    NOTE: plugin_loader still loads all 'builtin/legacy' plugins at
    startup, so only collection plugins are truly loaded on request.
    """
|
||||||
|
|
||||||
|
def __init__(self, jinja_builtins: c.Mapping[str, AnsibleJinja2Plugin], plugin_loader: Jinja2Loader):
|
||||||
|
super(JinjaPluginIntercept, self).__init__()
|
||||||
|
|
||||||
|
self._plugin_loader = plugin_loader
|
||||||
|
self._jinja_builtins = jinja_builtins
|
||||||
|
self._wrapped_funcs: dict[str, t.Callable] = {}
|
||||||
|
|
||||||
|
def _wrap_and_set_func(self, instance: AnsibleJinja2Plugin) -> t.Callable:
|
||||||
|
if self._plugin_loader.type == 'filter':
|
||||||
|
plugin_func = self._wrap_filter(instance)
|
||||||
|
else:
|
||||||
|
plugin_func = self._wrap_test(instance)
|
||||||
|
|
||||||
|
self._wrapped_funcs[instance._load_name] = plugin_func
|
||||||
|
|
||||||
|
return plugin_func
|
||||||
|
|
||||||
|
def __getitem__(self, key: str) -> t.Callable:
|
||||||
|
instance: AnsibleJinja2Plugin | None = None
|
||||||
|
plugin_func: t.Callable[..., t.Any] | None
|
||||||
|
|
||||||
|
if plugin_func := self._wrapped_funcs.get(key):
|
||||||
|
return plugin_func
|
||||||
|
|
||||||
|
try:
|
||||||
|
instance = self._plugin_loader.get(key)
|
||||||
|
except KeyError:
|
||||||
|
# The plugin name was invalid or no plugin was found by that name.
|
||||||
|
pass
|
||||||
|
except Exception as ex:
|
||||||
|
# An unexpected exception occurred.
|
||||||
|
raise AnsibleTemplatePluginLoadError(self._plugin_loader.type, key) from ex
|
||||||
|
|
||||||
|
if not instance:
|
||||||
|
try:
|
||||||
|
instance = self._jinja_builtins[key]
|
||||||
|
except KeyError:
|
||||||
|
raise AnsibleTemplatePluginNotFoundError(self._plugin_loader.type, key) from None
|
||||||
|
|
||||||
|
plugin_func = self._wrap_and_set_func(instance)
|
||||||
|
|
||||||
|
return plugin_func
|
||||||
|
|
||||||
|
def __setitem__(self, key: str, value: t.Callable) -> None:
|
||||||
|
self._wrap_and_set_func(self._plugin_loader._wrap_func(key, key, value))
|
||||||
|
|
||||||
|
def __delitem__(self, key):
|
||||||
|
raise NotImplementedError()
|
||||||
|
|
||||||
|
def __contains__(self, item: t.Any) -> bool:
|
||||||
|
try:
|
||||||
|
self.__getitem__(item)
|
||||||
|
except AnsibleTemplatePluginLoadError:
|
||||||
|
return True
|
||||||
|
except AnsibleTemplatePluginNotFoundError:
|
||||||
|
return False
|
||||||
|
|
||||||
|
return True
|
||||||
|
|
||||||
|
def __iter__(self):
|
||||||
|
raise NotImplementedError() # dynamic container
|
||||||
|
|
||||||
|
def __len__(self):
|
||||||
|
raise NotImplementedError() # dynamic container
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _invoke_plugin(instance: AnsibleJinja2Plugin, *args, **kwargs) -> t.Any:
|
||||||
|
if not instance.accept_args_markers:
|
||||||
|
if (first_marker := get_first_marker_arg(args, kwargs)) is not None:
|
||||||
|
return first_marker
|
||||||
|
|
||||||
|
try:
|
||||||
|
with JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers), PluginExecContext(executing_plugin=instance):
|
||||||
|
return instance.j2_function(*lazify_container_args(args), **lazify_container_kwargs(kwargs))
|
||||||
|
except MarkerError as ex:
|
||||||
|
return ex.source
|
||||||
|
except Exception as ex:
|
||||||
|
raise AnsibleTemplatePluginRuntimeError(instance.plugin_type, instance.ansible_name) from ex # DTFIX-RELEASE: which name to use? use plugin info?
|
||||||
|
|
||||||
|
def _wrap_test(self, instance: AnsibleJinja2Plugin) -> t.Callable:
|
||||||
|
"""Intercept point for all test plugins to ensure that args are properly templated/lazified."""
|
||||||
|
|
||||||
|
@functools.wraps(instance.j2_function)
|
||||||
|
def wrapper(*args, **kwargs) -> bool | Marker:
|
||||||
|
result = self._invoke_plugin(instance, *args, **kwargs)
|
||||||
|
|
||||||
|
if not isinstance(result, bool):
|
||||||
|
template = TemplateContext.current().template_value
|
||||||
|
|
||||||
|
# DTFIX-RELEASE: which name to use? use plugin info?
|
||||||
|
_display.deprecated(
|
||||||
|
msg=f"The test plugin {instance.ansible_name!r} returned a non-boolean result of type {type(result)!r}. "
|
||||||
|
"Test plugins must have a boolean result.",
|
||||||
|
obj=template,
|
||||||
|
version="2.23",
|
||||||
|
)
|
||||||
|
|
||||||
|
result = bool(result)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
def _wrap_filter(self, instance: AnsibleJinja2Plugin) -> t.Callable:
|
||||||
|
"""Intercept point for all filter plugins to ensure that args are properly templated/lazified."""
|
||||||
|
|
||||||
|
@functools.wraps(instance.j2_function)
|
||||||
|
def wrapper(*args, **kwargs) -> t.Any:
|
||||||
|
result = self._invoke_plugin(instance, *args, **kwargs)
|
||||||
|
result = _wrap_plugin_output(result)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
|
||||||
|
class _DirectCall:
|
||||||
|
"""Functions/methods marked `_DirectCall` bypass Jinja Environment checks for `Marker`."""
|
||||||
|
|
||||||
|
_marker_attr: str = "_directcall"
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def mark(cls, src: _TCallable) -> _TCallable:
|
||||||
|
setattr(src, cls._marker_attr, True)
|
||||||
|
return src
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def is_marked(cls, value: t.Callable) -> bool:
|
||||||
|
return callable(value) and getattr(value, "_directcall", False)
|
||||||
|
|
||||||
|
|
||||||
|
@_DirectCall.mark
def _query(plugin_name: str, /, *args, **kwargs) -> t.Any:
    """Wrapper for lookup, forcing wantlist=True."""
    kwargs['wantlist'] = True
    return _invoke_lookup(plugin_name=plugin_name, lookup_terms=list(args), lookup_kwargs=kwargs)


@_DirectCall.mark
def _lookup(plugin_name: str, /, *args, **kwargs) -> t.Any:
    # convert the args tuple to a list, since some plugins make a poor assumption that `run.args` is a list
    return _invoke_lookup(plugin_name=plugin_name, lookup_terms=list(args), lookup_kwargs=kwargs)
|
||||||
|
|
||||||
|
@dataclasses.dataclass
|
||||||
|
class _LookupContext(AmbientContextBase):
|
||||||
|
"""Ambient context that wraps lookup execution, providing information about how it was invoked."""
|
||||||
|
|
||||||
|
invoked_as_with: bool
|
||||||
|
|
||||||
|
|
||||||
|
@_DirectCall.mark
|
||||||
|
def _invoke_lookup(*, plugin_name: str, lookup_terms: list, lookup_kwargs: dict[str, t.Any], invoked_as_with: bool = False) -> t.Any:
|
||||||
|
templar = TemplateContext.current().templar
|
||||||
|
|
||||||
|
from ansible import template as _template
|
||||||
|
|
||||||
|
try:
|
||||||
|
instance: LookupBase | None = lookup_loader.get(plugin_name, loader=templar._loader, templar=_template.Templar._from_template_engine(templar))
|
||||||
|
except Exception as ex:
|
||||||
|
raise AnsibleTemplatePluginLoadError('lookup', plugin_name) from ex
|
||||||
|
|
||||||
|
if instance is None:
|
||||||
|
raise AnsibleTemplatePluginNotFoundError('lookup', plugin_name)
|
||||||
|
|
||||||
|
# if the lookup doesn't understand `Marker` and there's at least one in the top level, short-circuit by returning the first one we found
|
||||||
|
if not instance.accept_args_markers and (first_marker := get_first_marker_arg(lookup_terms, lookup_kwargs)) is not None:
|
||||||
|
return first_marker
|
||||||
|
|
||||||
|
# don't pass these through to the lookup
|
||||||
|
wantlist = lookup_kwargs.pop('wantlist', False)
|
||||||
|
errors = lookup_kwargs.pop('errors', 'strict')
|
||||||
|
|
||||||
|
with (
|
||||||
|
JinjaCallContext(accept_lazy_markers=instance.accept_lazy_markers),
|
||||||
|
PluginExecContext(executing_plugin=instance),
|
||||||
|
):
|
||||||
|
try:
|
||||||
|
if _TemplateConfig.allow_embedded_templates:
|
||||||
|
# for backwards compat, only trust constant templates in lookup terms
|
||||||
|
with JinjaCallContext(accept_lazy_markers=True):
|
||||||
|
# Force lazy marker support on for this call; the plugin's understanding is irrelevant, as is any existing context, since this backward
|
||||||
|
# compat code always understands markers.
|
||||||
|
lookup_terms = [templar.template(value) for value in _trust_jinja_constants(lookup_terms)]
|
||||||
|
|
||||||
|
# since embedded template support is enabled, repeat the check for `Marker` on lookup_terms, since a template may render as a `Marker`
|
||||||
|
if not instance.accept_args_markers and (first_marker := get_first_marker_arg(lookup_terms, {})) is not None:
|
||||||
|
return first_marker
|
||||||
|
else:
|
||||||
|
lookup_terms = AnsibleTagHelper.tag_copy(lookup_terms, (lazify_container(value) for value in lookup_terms), value_type=list)
|
||||||
|
|
||||||
|
with _LookupContext(invoked_as_with=invoked_as_with):
|
||||||
|
# The lookup context currently only supports the internal use-case where `first_found` requires extra info when invoked via `with_first_found`.
|
||||||
|
# The context may be public API in the future, but for now, other plugins should not implement this kind of dynamic behavior,
|
||||||
|
# though we're stuck with it for backward compatibility on `first_found`.
|
||||||
|
lookup_res = instance.run(lookup_terms, variables=templar.available_variables, **lazify_container_kwargs(lookup_kwargs))
|
||||||
|
|
||||||
|
# DTFIX-FUTURE: Consider allowing/requiring lookup plugins to declare how their result should be handled.
|
||||||
|
# Currently, there are multiple behaviors that are less than ideal and poorly documented (or not at all):
|
||||||
|
# * When `errors=warn` or `errors=ignore` the result is `None` unless `wantlist=True`, in which case the result is `[]`.
|
||||||
|
# * The user must specify `wantlist=True` to receive the plugin return value unmodified.
|
||||||
|
# A plugin can achieve similar results by wrapping its result in a list -- unless of course the user specifies `wantlist=True`.
|
||||||
|
# * When `wantlist=True` is specified, the result is not guaranteed to be a list as the option implies (except on plugin error).
|
||||||
|
# * Sequences are munged unless the user specifies `wantlist=True`:
|
||||||
|
# * len() == 0 - Return an empty sequence.
|
||||||
|
# * len() == 1 - Return the only element in the sequence.
|
||||||
|
# * len() >= 2 when all elements are `str` - Return all the values joined into a single comma separated string.
|
||||||
|
# * len() >= 2 when at least one element is not `str` - Return the sequence as-is.
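# Illustrative examples of the current munging (a hedged sketch; the environment values shown are hypothetical):
#   lookup('env', 'HOME')                 -> '/home/user'           (single element is unwrapped)
#   lookup('env', 'HOME', 'PATH')         -> '/home/user,/usr/bin'  (all-string sequence joined with commas)
#   lookup('env', 'HOME', wantlist=True)  -> ['/home/user']         (plugin return value passed through unmodified)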
|
||||||
|
|
||||||
|
if not is_sequence(lookup_res):
|
||||||
|
# DTFIX-FUTURE: deprecate return types which are not a list
|
||||||
|
# previously non-Sequence return types were deprecated and then became an error in 2.18
|
||||||
|
# however, the deprecation message (and this error) mention `list` specifically rather than `Sequence`
|
||||||
|
# letting non-list values through will trigger variable type checking warnings/errors
|
||||||
|
raise TypeError(f'returned {type(lookup_res)} instead of {list}')
|
||||||
|
|
||||||
|
except MarkerError as ex:
|
||||||
|
return ex.source
|
||||||
|
except Exception as ex:
|
||||||
|
# DTFIX-RELEASE: convert this to the new error/warn/ignore context manager
|
||||||
|
if isinstance(ex, AnsibleTemplatePluginError):
|
||||||
|
msg = f'Lookup failed but the error is being ignored: {ex}'
|
||||||
|
else:
|
||||||
|
msg = f'An unhandled exception occurred while running the lookup plugin {plugin_name!r}. Error was a {type(ex)}, original message: {ex}'
|
||||||
|
|
||||||
|
if errors == 'warn':
|
||||||
|
_display.warning(msg)
|
||||||
|
elif errors == 'ignore':
|
||||||
|
_display.display(msg, log_only=True)
|
||||||
|
else:
|
||||||
|
raise AnsibleTemplatePluginRuntimeError('lookup', plugin_name) from ex
|
||||||
|
|
||||||
|
return [] if wantlist else None
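# Illustrative behavior of the `errors` option (hedged; the path below is hypothetical):
#   lookup('ansible.builtin.file', '/does/not/exist', errors='ignore')               -> None
#   lookup('ansible.builtin.file', '/does/not/exist', errors='warn', wantlist=True)  -> []
#   lookup('ansible.builtin.file', '/does/not/exist')                                -> raises AnsibleTemplatePluginRuntimeError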
|
||||||
|
|
||||||
|
if not wantlist and lookup_res:
|
||||||
|
# when wantlist=False the lookup result is either partially de-lazified (single element) or fully de-lazified (multiple elements)
|
||||||
|
|
||||||
|
if len(lookup_res) == 1:
|
||||||
|
lookup_res = lookup_res[0]
|
||||||
|
else:
|
||||||
|
try:
|
||||||
|
lookup_res = ",".join(lookup_res) # for backwards compatibility, attempt to join the result into a single string
|
||||||
|
except TypeError:
|
||||||
|
pass # for backwards compatibility, return the result as-is when the sequence contains non-string values
|
||||||
|
|
||||||
|
return _wrap_plugin_output(lookup_res)
|
||||||
|
|
||||||
|
|
||||||
|
def _now(utc=False, fmt=None):
|
||||||
|
"""Jinja2 global function (now) to return current datetime, potentially formatted via strftime."""
|
||||||
|
if utc:
|
||||||
|
now = datetime.datetime.now(datetime.timezone.utc).replace(tzinfo=None)
|
||||||
|
else:
|
||||||
|
now = datetime.datetime.now()
|
||||||
|
|
||||||
|
if fmt:
|
||||||
|
return now.strftime(fmt)
|
||||||
|
|
||||||
|
return now
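# Example template usage (hedged; the rendered value is illustrative):
#   "{{ now(utc=True, fmt='%Y-%m-%dT%H:%M:%SZ') }}"  ->  '2025-01-01T00:00:00Z'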
|
||||||
|
|
||||||
|
|
||||||
|
def _jinja_const_template_warning(value: object, is_conditional: bool) -> None:
|
||||||
|
"""Issue a warning regarding embedded template usage."""
|
||||||
|
help_text = "Use inline expressions, for example: "
|
||||||
|
|
||||||
|
if is_conditional:
|
||||||
|
help_text += """`when: "{{ a_var }}" == 42` becomes `when: a_var == 42`"""
|
||||||
|
else:
|
||||||
|
help_text += """`msg: "{{ lookup('env', '{{ a_var }}') }}"` becomes `msg: "{{ lookup('env', a_var) }}"`"""
|
||||||
|
|
||||||
|
# deprecated: description='disable embedded templates by default and deprecate the feature' core_version='2.23'
|
||||||
|
_display.warning(
|
||||||
|
msg="Jinja constant strings should not contain embedded templates. This feature will be disabled by default in ansible-core 2.23.",
|
||||||
|
obj=value,
|
||||||
|
help_text=help_text,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _trust_jinja_constants(o: t.Any) -> t.Any:
|
||||||
|
"""
|
||||||
|
Recursively apply TrustedAsTemplate to values tagged with _JinjaConstTemplate and remove the tag.
|
||||||
|
Only container types emitted by the Jinja compiler are checked, since others do not contain constants.
|
||||||
|
This is used to provide backwards compatibility with historical lookup behavior for positional arguments.
|
||||||
|
"""
|
||||||
|
if _JinjaConstTemplate.is_tagged_on(o):
|
||||||
|
_jinja_const_template_warning(o, is_conditional=False)
|
||||||
|
|
||||||
|
return TrustedAsTemplate().tag(_JinjaConstTemplate.untag(o))
|
||||||
|
|
||||||
|
o_type = type(o)
|
||||||
|
|
||||||
|
if o_type is dict:
|
||||||
|
return {k: _trust_jinja_constants(v) for k, v in o.items()}
|
||||||
|
|
||||||
|
if o_type in (list, tuple):
|
||||||
|
return o_type(_trust_jinja_constants(v) for v in o)
|
||||||
|
|
||||||
|
return o
|
||||||
|
|
||||||
|
|
||||||
|
def _wrap_plugin_output(o: t.Any) -> t.Any:
"""Utility method to ensure that iterators/generators returned from a plugin are consumed."""
|
||||||
|
if isinstance(o, _ITERATOR_TYPES):
|
||||||
|
o = list(o)
|
||||||
|
|
||||||
|
return _AnsibleLazyTemplateMixin._try_create(o, LazyOptions.SKIP_TEMPLATES)
|
||||||
@ -0,0 +1,633 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import copy
|
||||||
|
import dataclasses
|
||||||
|
import functools
|
||||||
|
import types
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from jinja2.environment import TemplateModule
|
||||||
|
|
||||||
|
from ansible.module_utils._internal._datatag import (
|
||||||
|
AnsibleTagHelper,
|
||||||
|
AnsibleTaggedObject,
|
||||||
|
_AnsibleTaggedDict,
|
||||||
|
_AnsibleTaggedList,
|
||||||
|
_AnsibleTaggedTuple,
|
||||||
|
_NO_INSTANCE_STORAGE,
|
||||||
|
_try_get_internal_tags_mapping,
|
||||||
|
)
|
||||||
|
|
||||||
|
from ansible.utils.sentinel import Sentinel
|
||||||
|
from ansible.errors import AnsibleVariableTypeError
|
||||||
|
from ansible._internal._errors._handler import Skippable
|
||||||
|
from ansible.vars.hostvars import HostVarsVars, HostVars
|
||||||
|
|
||||||
|
from ._access import AnsibleAccessContext
|
||||||
|
from ._jinja_common import Marker, _TemplateConfig
|
||||||
|
from ._utils import TemplateContext, PASS_THROUGH_SCALAR_VAR_TYPES, LazyOptions
|
||||||
|
|
||||||
|
if t.TYPE_CHECKING:
|
||||||
|
from ._engine import TemplateEngine
|
||||||
|
|
||||||
|
_KNOWN_TYPES: t.Final[set[type]] = (
|
||||||
|
{
|
||||||
|
HostVars, # example: hostvars
|
||||||
|
HostVarsVars, # example: hostvars.localhost | select
|
||||||
|
type, # example: range(20) | list # triggered on retrieval of `range` type from globals
|
||||||
|
range, # example: range(20) | list # triggered when returning a `range` instance from a call
|
||||||
|
types.FunctionType, # example: undef() | default("blah")
|
||||||
|
types.MethodType, # example: ansible_facts.get | type_debug
|
||||||
|
functools.partial,
|
||||||
|
type(''.startswith), # example: inventory_hostname.upper | type_debug # using `startswith` to resolve `builtin_function_or_method`
|
||||||
|
TemplateModule, # example: '{% import "importme.j2" as im %}{{ im | type_debug }}'
|
||||||
|
}
|
||||||
|
| set(PASS_THROUGH_SCALAR_VAR_TYPES)
|
||||||
|
| set(Marker.concrete_subclasses)
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
These types are known to the templating system.
|
||||||
|
In addition to the statically defined types, additional types will be added at runtime.
|
||||||
|
When enabled in config, this set will be used to determine if an encountered type should trigger a warning or error.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
def register_known_types(*args: type) -> None:
|
||||||
|
"""Register a type with the template engine so it will not trigger warnings or errors when encountered."""
|
||||||
|
_KNOWN_TYPES.update(args)
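# Minimal sketch of plugin-side registration (hedged; `MyWrapper` is a hypothetical type, and the
# import path for `register_known_types` depends on where this module is published):
#
#   class MyWrapper:
#       """Hypothetical object type emitted by a custom plugin."""
#
#   register_known_types(MyWrapper)  # values of this type no longer trigger unknown-type handling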
|
||||||
|
|
||||||
|
|
||||||
|
class UnsupportedConstructionMethodError(RuntimeError):
|
||||||
|
"""Error raised when attempting to construct a lazy container with unsupported arguments."""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
super().__init__("Direct construction of lazy containers is not supported.")
|
||||||
|
|
||||||
|
|
||||||
|
@t.final
|
||||||
|
@dataclasses.dataclass(frozen=True, slots=True)
|
||||||
|
class _LazyValue:
|
||||||
|
"""Wrapper around values to indicate lazy behavior has not yet been applied."""
|
||||||
|
|
||||||
|
value: t.Any
|
||||||
|
|
||||||
|
|
||||||
|
@t.final
|
||||||
|
@dataclasses.dataclass(frozen=True, kw_only=True, slots=True)
|
||||||
|
class _LazyValueSource:
|
||||||
|
"""Intermediate value source for lazy-eligible collection copy operations."""
|
||||||
|
|
||||||
|
source: t.Iterable
|
||||||
|
templar: TemplateEngine
|
||||||
|
lazy_options: LazyOptions
|
||||||
|
|
||||||
|
|
||||||
|
@t.final
|
||||||
|
class _NoKeySentinel(Sentinel):
|
||||||
|
"""Sentinel used to indicate a requested key was not found."""
|
||||||
|
|
||||||
|
|
||||||
|
# There are several operations performed by lazy containers, with some variation between types.
|
||||||
|
#
|
||||||
|
# Columns: D=dict, L=list, T=tuple
|
||||||
|
# Cells: l=lazy (upon access), n=non-lazy (__init__/__new__)
|
||||||
|
#
|
||||||
|
# D L T Feature Description
|
||||||
|
# - - - ----------- ---------------------------------------------------------------
|
||||||
|
# l l n propagation when container items which are containers become lazy instances
|
||||||
|
# l l n transform when transforms are applied to container items
|
||||||
|
# l l n templating when templating is performed on container items
|
||||||
|
# l l l access when access calls are performed on container items
|
||||||
|
|
||||||
|
|
||||||
|
class _AnsibleLazyTemplateMixin:
|
||||||
|
__slots__ = _NO_INSTANCE_STORAGE
|
||||||
|
|
||||||
|
_dispatch_types: t.ClassVar[dict[type, type[_AnsibleLazyTemplateMixin]]] = {} # populated by __init_subclass__
|
||||||
|
_container_types: t.ClassVar[set[type]] = set() # populated by __init_subclass__
|
||||||
|
|
||||||
|
_native_type: t.ClassVar[type] # from AnsibleTaggedObject
|
||||||
|
|
||||||
|
_SLOTS: t.Final = (
|
||||||
|
'_templar',
|
||||||
|
'_lazy_options',
|
||||||
|
)
|
||||||
|
|
||||||
|
_templar: TemplateEngine
|
||||||
|
_lazy_options: LazyOptions
|
||||||
|
|
||||||
|
def __init_subclass__(cls, **kwargs) -> None:
|
||||||
|
tagged_type = cls.__mro__[1]
|
||||||
|
native_type = tagged_type.__mro__[1]
|
||||||
|
|
||||||
|
for check_type in (tagged_type, native_type):
|
||||||
|
if conflicting_type := cls._dispatch_types.get(check_type):
|
||||||
|
raise TypeError(f"Lazy mixin {cls.__name__!r} type {check_type.__name__!r} conflicts with {conflicting_type.__name__!r}.")
|
||||||
|
|
||||||
|
cls._dispatch_types[native_type] = cls
|
||||||
|
cls._dispatch_types[tagged_type] = cls
|
||||||
|
cls._container_types.add(native_type)
|
||||||
|
cls._empty_tags_as_native = False # never revert to the native type when no tags remain
|
||||||
|
|
||||||
|
register_known_types(cls)
|
||||||
|
|
||||||
|
def __init__(self, contents: t.Iterable | _LazyValueSource) -> None:
|
||||||
|
if isinstance(contents, _LazyValueSource):
|
||||||
|
self._templar = contents.templar
|
||||||
|
self._lazy_options = contents.lazy_options
|
||||||
|
elif isinstance(contents, _AnsibleLazyTemplateMixin):
|
||||||
|
self._templar = contents._templar
|
||||||
|
self._lazy_options = contents._lazy_options
|
||||||
|
else:
|
||||||
|
raise UnsupportedConstructionMethodError()
|
||||||
|
|
||||||
|
def __reduce_ex__(self, protocol):
|
||||||
|
raise NotImplementedError("Pickling of Ansible lazy objects is not permitted.")
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _try_create(item: t.Any, lazy_options: LazyOptions = LazyOptions.DEFAULT) -> t.Any:
|
||||||
|
"""
|
||||||
|
If `item` is a container type which supports lazy access and/or templating, return a lazy wrapped version -- otherwise return it as-is.
|
||||||
|
When returning as-is, a warning or error may be generated for unknown types.
|
||||||
|
The `lazy_options.skip_templates` argument should be set to `True` when `item` is sourced from a plugin instead of Ansible variable storage.
|
||||||
|
This provides backwards compatibility and reduces lazy overhead, as plugins do not normally introduce templates.
|
||||||
|
If a plugin needs to introduce templates, the plugin is responsible for invoking the templar and returning the result.
|
||||||
|
"""
|
||||||
|
item_type = type(item)
|
||||||
|
|
||||||
|
# Try to use exact type match first to determine which wrapper (if any) to apply; isinstance checks
|
||||||
|
# are extremely expensive, so try to avoid them for our commonly-supported types.
|
||||||
|
if (dispatcher := _AnsibleLazyTemplateMixin._dispatch_types.get(item_type)) is not None:
|
||||||
|
# Create a generator that yields the elements of `item` wrapped in a `_LazyValue` wrapper.
|
||||||
|
# The wrapper is used to signal to the lazy container that the value must be processed before being returned.
|
||||||
|
# Values added to the lazy container later through other means will be returned as-is, without any special processing.
|
||||||
|
lazy_values = dispatcher._lazy_values(item, lazy_options)
|
||||||
|
tags_mapping = _try_get_internal_tags_mapping(item)
|
||||||
|
value = t.cast(AnsibleTaggedObject, dispatcher)._instance_factory(lazy_values, tags_mapping)
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
with Skippable, _TemplateConfig.unknown_type_encountered_handler.handle(AnsibleVariableTypeError, skip_on_ignore=True):
|
||||||
|
if item_type not in _KNOWN_TYPES:
|
||||||
|
raise AnsibleVariableTypeError(
|
||||||
|
message=f"Encountered unknown type {item_type.__name__!r} during template operation.",
|
||||||
|
help_text="Use supported types to avoid unexpected behavior.",
|
||||||
|
obj=TemplateContext.current().template_value,
|
||||||
|
)
|
||||||
|
|
||||||
|
return item
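# Illustrative dispatch behavior (hedged examples):
#   _AnsibleLazyTemplateMixin._try_create({'a': 1})   -> lazy dict wrapper (_AnsibleLazyTemplateDict)
#   _AnsibleLazyTemplateMixin._try_create([1, 2])     -> lazy list wrapper (_AnsibleLazyTemplateList)
#   _AnsibleLazyTemplateMixin._try_create('plain')    -> 'plain' (str is a known pass-through scalar type)
#   _AnsibleLazyTemplateMixin._try_create(object())   -> returned as-is, with a warning or error per config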
|
||||||
|
|
||||||
|
def _is_not_lazy_combine_candidate(self, other: object) -> bool:
|
||||||
|
"""Returns `True` if `other` cannot be lazily combined with the current instance due to differing templar/options, otherwise returns `False`."""
|
||||||
|
return isinstance(other, _AnsibleLazyTemplateMixin) and (self._templar is not other._templar or self._lazy_options != other._lazy_options)
|
||||||
|
|
||||||
|
def _non_lazy_copy(self) -> t.Collection:
|
||||||
|
"""
|
||||||
|
Return a non-lazy copy of this collection.
|
||||||
|
Any remaining lazy wrapped values will be unwrapped without further processing.
|
||||||
|
Tags on this instance will be preserved on the returned copy.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError() # pragma: nocover
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _lazy_values(values: t.Any, lazy_options: LazyOptions) -> _LazyValueSource:
|
||||||
|
"""
|
||||||
|
Return an iterable that wraps each of the given elements in a lazy wrapper.
|
||||||
|
Only elements wrapped this way will receive lazy processing when retrieved from the collection.
|
||||||
|
"""
|
||||||
|
# DTFIX-RELEASE: check relative performance of method-local vs stored generator expressions on implementations of this method
|
||||||
|
raise NotImplementedError() # pragma: nocover
|
||||||
|
|
||||||
|
def _proxy_or_render_lazy_value(self, key: t.Any, value: t.Any) -> t.Any:
|
||||||
|
"""
|
||||||
|
Ensure that the value is lazy-proxied or rendered, and if a key is provided, replace the original value with the result.
|
||||||
|
"""
|
||||||
|
if type(value) is not _LazyValue: # pylint: disable=unidiomatic-typecheck
|
||||||
|
if self._lazy_options.access:
|
||||||
|
AnsibleAccessContext.current().access(value)
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
original_value = value.value
|
||||||
|
|
||||||
|
if self._lazy_options.access:
|
||||||
|
AnsibleAccessContext.current().access(original_value)
|
||||||
|
|
||||||
|
new_value = self._templar.template(original_value, lazy_options=self._lazy_options)
|
||||||
|
|
||||||
|
if new_value is not original_value and self._lazy_options.access:
|
||||||
|
AnsibleAccessContext.current().access(new_value)
|
||||||
|
|
||||||
|
if key is not _NoKeySentinel:
|
||||||
|
self._native_type.__setitem__(self, key, new_value) # type: ignore # pylint: disable=unnecessary-dunder-call
|
||||||
|
|
||||||
|
return new_value
|
||||||
|
|
||||||
|
|
||||||
|
@t.final # consumers of lazy collections rely heavily on the concrete types being final
|
||||||
|
class _AnsibleLazyTemplateDict(_AnsibleTaggedDict, _AnsibleLazyTemplateMixin):
|
||||||
|
__slots__ = _AnsibleLazyTemplateMixin._SLOTS
|
||||||
|
|
||||||
|
def __init__(self, contents: t.Iterable | _LazyValueSource, /, **kwargs) -> None:
|
||||||
|
_AnsibleLazyTemplateMixin.__init__(self, contents)
|
||||||
|
|
||||||
|
if isinstance(contents, _AnsibleLazyTemplateDict):
|
||||||
|
super().__init__(dict.items(contents), **kwargs)
|
||||||
|
elif isinstance(contents, _LazyValueSource):
|
||||||
|
super().__init__(contents.source, **kwargs)
|
||||||
|
else:
|
||||||
|
raise UnsupportedConstructionMethodError()
|
||||||
|
|
||||||
|
def get(self, key: t.Any, default: t.Any = None) -> t.Any:
|
||||||
|
if (value := super().get(key, _NoKeySentinel)) is _NoKeySentinel:
|
||||||
|
return default
|
||||||
|
|
||||||
|
return self._proxy_or_render_lazy_value(key, value)
|
||||||
|
|
||||||
|
def __getitem__(self, key: t.Any, /) -> t.Any:
|
||||||
|
return self._proxy_or_render_lazy_value(key, super().__getitem__(key))
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return str(self.copy()._native_copy()) # inefficient, but avoids mutating the current instance (to make debugging practical)
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return repr(self.copy()._native_copy()) # inefficient, but avoids mutating the current instance (to make debugging practical)
|
||||||
|
|
||||||
|
def __iter__(self):
|
||||||
|
# We're using the base implementation, but must override `__iter__` to skip `dict` fast-path copy, which would bypass lazy behavior.
|
||||||
|
# See: https://github.com/python/cpython/blob/ffcc450a9b8b6927549b501eff7ac14abc238448/Objects/dictobject.c#L3861-L3864
|
||||||
|
return super().__iter__()
|
||||||
|
|
||||||
|
def setdefault(self, key, default=None, /) -> t.Any:
|
||||||
|
if (value := self.get(key, _NoKeySentinel)) is not _NoKeySentinel:
|
||||||
|
return value
|
||||||
|
|
||||||
|
super().__setitem__(key, default)
|
||||||
|
|
||||||
|
return default
|
||||||
|
|
||||||
|
def items(self):
|
||||||
|
for key, value in super().items():
|
||||||
|
yield key, self._proxy_or_render_lazy_value(key, value)
|
||||||
|
|
||||||
|
def values(self):
|
||||||
|
for _key, value in self.items():
|
||||||
|
yield value
|
||||||
|
|
||||||
|
def pop(self, key, default=_NoKeySentinel, /) -> t.Any:
|
||||||
|
if (value := super().get(key, _NoKeySentinel)) is _NoKeySentinel:
|
||||||
|
if default is _NoKeySentinel:
|
||||||
|
raise KeyError(key)
|
||||||
|
|
||||||
|
return default
|
||||||
|
|
||||||
|
value = self._proxy_or_render_lazy_value(_NoKeySentinel, value)
|
||||||
|
|
||||||
|
del self[key]
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
def popitem(self) -> t.Any:
|
||||||
|
try:
|
||||||
|
key = next(reversed(self))
|
||||||
|
except StopIteration:
|
||||||
|
raise KeyError("popitem(): dictionary is empty")
|
||||||
|
|
||||||
|
value = self._proxy_or_render_lazy_value(_NoKeySentinel, self[key])
|
||||||
|
|
||||||
|
del self[key]
|
||||||
|
|
||||||
|
return key, value
|
||||||
|
|
||||||
|
def _native_copy(self) -> dict:
|
||||||
|
return dict(self.items())
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _item_source(value: dict) -> dict | _LazyValueSource:
|
||||||
|
if isinstance(value, _AnsibleLazyTemplateDict):
|
||||||
|
return _LazyValueSource(source=dict.items(value), templar=value._templar, lazy_options=value._lazy_options)
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
def _yield_non_lazy_dict_items(self) -> t.Iterator[tuple[str, t.Any]]:
|
||||||
|
"""
|
||||||
|
Delegate to the base collection items iterator to yield the raw contents.
|
||||||
|
As of Python 3.13, generator functions are significantly faster than inline generator expressions.
|
||||||
|
"""
|
||||||
|
for k, v in dict.items(self):
|
||||||
|
yield k, v.value if type(v) is _LazyValue else v # pylint: disable=unidiomatic-typecheck
|
||||||
|
|
||||||
|
def _non_lazy_copy(self) -> dict:
|
||||||
|
return AnsibleTagHelper.tag_copy(self, self._yield_non_lazy_dict_items(), value_type=dict)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _lazy_values(values: dict, lazy_options: LazyOptions) -> _LazyValueSource:
|
||||||
|
return _LazyValueSource(source=((k, _LazyValue(v)) for k, v in values.items()), templar=TemplateContext.current().templar, lazy_options=lazy_options)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _proxy_or_render_other(other: t.Any | None) -> None:
|
||||||
|
"""Call `_proxy_or_render_lazy_values` if `other` is a lazy dict. Used internally by comparison methods."""
|
||||||
|
if type(other) is _AnsibleLazyTemplateDict: # pylint: disable=unidiomatic-typecheck
|
||||||
|
other._proxy_or_render_lazy_values()
|
||||||
|
|
||||||
|
def _proxy_or_render_lazy_values(self) -> None:
|
||||||
|
"""Ensure all `_LazyValue` wrapped values have been processed."""
|
||||||
|
for _unused in self.values():
|
||||||
|
pass
|
||||||
|
|
||||||
|
def __eq__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__eq__(other)
|
||||||
|
|
||||||
|
def __ne__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__ne__(other)
|
||||||
|
|
||||||
|
def __or__(self, other):
|
||||||
|
# DTFIX-RELEASE: support preservation of laziness when possible like we do for list
|
||||||
|
# Both sides end up going through _proxy_or_render_lazy_value, so there's no Templar preservation needed.
|
||||||
|
# In the future this could be made more lazy when both Templar instances are the same, or if per-value Templar tracking was used.
|
||||||
|
return super().__or__(other)
|
||||||
|
|
||||||
|
def __ror__(self, other):
|
||||||
|
# DTFIX-RELEASE: support preservation of laziness when possible like we do for list
|
||||||
|
# Both sides end up going through _proxy_or_render_lazy_value, so there's no Templar preservation needed.
|
||||||
|
# In the future this could be made more lazy when both Templar instances are the same, or if per-value Templar tracking was used.
|
||||||
|
return super().__ror__(other)
|
||||||
|
|
||||||
|
def __deepcopy__(self, memo):
|
||||||
|
return _AnsibleLazyTemplateDict(
|
||||||
|
_LazyValueSource(
|
||||||
|
source=((copy.deepcopy(k), copy.deepcopy(v)) for k, v in super().items()),
|
||||||
|
templar=copy.deepcopy(self._templar),
|
||||||
|
lazy_options=copy.deepcopy(self._lazy_options),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@t.final # consumers of lazy collections rely heavily on the concrete types being final
|
||||||
|
class _AnsibleLazyTemplateList(_AnsibleTaggedList, _AnsibleLazyTemplateMixin):
|
||||||
|
__slots__ = _AnsibleLazyTemplateMixin._SLOTS
|
||||||
|
|
||||||
|
def __init__(self, contents: t.Iterable | _LazyValueSource, /) -> None:
|
||||||
|
_AnsibleLazyTemplateMixin.__init__(self, contents)
|
||||||
|
|
||||||
|
if isinstance(contents, _AnsibleLazyTemplateList):
|
||||||
|
super().__init__(list.__iter__(contents))
|
||||||
|
elif isinstance(contents, _LazyValueSource):
|
||||||
|
super().__init__(contents.source)
|
||||||
|
else:
|
||||||
|
raise UnsupportedConstructionMethodError()
|
||||||
|
|
||||||
|
def __getitem__(self, key: t.SupportsIndex | slice, /) -> t.Any:
|
||||||
|
if type(key) is slice: # pylint: disable=unidiomatic-typecheck
|
||||||
|
return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__getitem__(key), templar=self._templar, lazy_options=self._lazy_options))
|
||||||
|
|
||||||
|
return self._proxy_or_render_lazy_value(key, super().__getitem__(key))
|
||||||
|
|
||||||
|
def __iter__(self):
|
||||||
|
for key, value in enumerate(super().__iter__()):
|
||||||
|
yield self._proxy_or_render_lazy_value(key, value)
|
||||||
|
|
||||||
|
def pop(self, idx: t.SupportsIndex = -1, /) -> t.Any:
|
||||||
|
if not self:
|
||||||
|
raise IndexError('pop from empty list')
|
||||||
|
|
||||||
|
try:
|
||||||
|
value = self[idx]
|
||||||
|
except IndexError:
|
||||||
|
raise IndexError('pop index out of range')
|
||||||
|
|
||||||
|
value = self._proxy_or_render_lazy_value(_NoKeySentinel, value)
|
||||||
|
|
||||||
|
del self[idx]
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return str(self.copy()._native_copy()) # inefficient, but avoids mutating the current instance (to make debugging practical)
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return repr(self.copy()._native_copy()) # inefficient, but avoids mutating the current instance (to make debugging practical)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _item_source(value: list) -> list | _LazyValueSource:
|
||||||
|
if isinstance(value, _AnsibleLazyTemplateList):
|
||||||
|
return _LazyValueSource(source=list.__iter__(value), templar=value._templar, lazy_options=value._lazy_options)
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
def _yield_non_lazy_list_items(self):
|
||||||
|
"""
|
||||||
|
Delegate to the base collection iterator to yield the raw contents.
|
||||||
|
As of Python 3.13, generator functions are significantly faster than inline generator expressions.
|
||||||
|
"""
|
||||||
|
for v in list.__iter__(self):
|
||||||
|
yield v.value if type(v) is _LazyValue else v # pylint: disable=unidiomatic-typecheck
|
||||||
|
|
||||||
|
def _non_lazy_copy(self) -> list:
|
||||||
|
return AnsibleTagHelper.tag_copy(self, self._yield_non_lazy_list_items(), value_type=list)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _lazy_values(values: list, lazy_options: LazyOptions) -> _LazyValueSource:
|
||||||
|
return _LazyValueSource(source=(_LazyValue(v) for v in values), templar=TemplateContext.current().templar, lazy_options=lazy_options)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _proxy_or_render_other(other: t.Any | None) -> None:
|
||||||
|
"""Call `_proxy_or_render_lazy_values` if `other` is a lazy list. Used internally by comparison methods."""
|
||||||
|
if type(other) is _AnsibleLazyTemplateList: # pylint: disable=unidiomatic-typecheck
|
||||||
|
other._proxy_or_render_lazy_values()
|
||||||
|
|
||||||
|
def _proxy_or_render_lazy_values(self) -> None:
|
||||||
|
"""Ensure all `_LazyValue` wrapped values have been processed."""
|
||||||
|
for _unused in self:
|
||||||
|
pass
|
||||||
|
|
||||||
|
def __eq__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__eq__(other)
|
||||||
|
|
||||||
|
def __ne__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__ne__(other)
|
||||||
|
|
||||||
|
def __gt__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__gt__(other)
|
||||||
|
|
||||||
|
def __ge__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__ge__(other)
|
||||||
|
|
||||||
|
def __lt__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__lt__(other)
|
||||||
|
|
||||||
|
def __le__(self, other):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
self._proxy_or_render_other(other)
|
||||||
|
return super().__le__(other)
|
||||||
|
|
||||||
|
def __contains__(self, item):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
return super().__contains__(item)
|
||||||
|
|
||||||
|
def __reversed__(self):
|
||||||
|
for idx in range(self.__len__() - 1, -1, -1):
|
||||||
|
yield self[idx]
|
||||||
|
|
||||||
|
def __add__(self, other):
|
||||||
|
if self._is_not_lazy_combine_candidate(other):
|
||||||
|
# When other is lazy with a different templar/options, it cannot be lazily combined with self and a plain list must be returned.
|
||||||
|
# If other is a list, de-lazify both, otherwise just let the operation fail.
|
||||||
|
|
||||||
|
if isinstance(other, _AnsibleLazyTemplateList):
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
other._proxy_or_render_lazy_values()
|
||||||
|
|
||||||
|
return super().__add__(other)
|
||||||
|
|
||||||
|
# For all other cases, the new list inherits our templar and all values stay lazy.
|
||||||
|
# We use list.__add__ to avoid implementing all its error behavior.
|
||||||
|
return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__add__(other), templar=self._templar, lazy_options=self._lazy_options))
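# Illustrative results (hedged; `lazy_a`/`lazy_b` are hypothetical lazy lists):
#   lazy_a + [1, 2]                               -> new lazy list inheriting lazy_a's templar; values stay lazy
#   lazy_a + lazy_b  (same templar/options)       -> new lazy list; values stay lazy
#   lazy_a + lazy_b  (different templar/options)  -> both sides rendered first, then combined non-lazily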
|
||||||
|
|
||||||
|
def __radd__(self, other):
|
||||||
|
if not (other_add := getattr(other, '__add__', None)):
|
||||||
|
raise TypeError(f'unsupported operand type(s) for +: {type(other).__name__!r} and {type(self).__name__!r}') from None
|
||||||
|
|
||||||
|
return _AnsibleLazyTemplateList(_LazyValueSource(source=other_add(self), templar=self._templar, lazy_options=self._lazy_options))
|
||||||
|
|
||||||
|
def __mul__(self, other):
|
||||||
|
return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__mul__(other), templar=self._templar, lazy_options=self._lazy_options))
|
||||||
|
|
||||||
|
def __rmul__(self, other):
|
||||||
|
return _AnsibleLazyTemplateList(_LazyValueSource(source=super().__rmul__(other), templar=self._templar, lazy_options=self._lazy_options))
|
||||||
|
|
||||||
|
def index(self, *args, **kwargs) -> int:
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
return super().index(*args, **kwargs)
|
||||||
|
|
||||||
|
def remove(self, *args, **kwargs) -> None:
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
super().remove(*args, **kwargs)
|
||||||
|
|
||||||
|
def sort(self, *args, **kwargs) -> None:
|
||||||
|
self._proxy_or_render_lazy_values()
|
||||||
|
super().sort(*args, **kwargs)
|
||||||
|
|
||||||
|
def __deepcopy__(self, memo):
|
||||||
|
return _AnsibleLazyTemplateList(
|
||||||
|
_LazyValueSource(
|
||||||
|
source=(copy.deepcopy(v) for v in super().__iter__()),
|
||||||
|
templar=copy.deepcopy(self._templar),
|
||||||
|
lazy_options=copy.deepcopy(self._lazy_options),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@t.final # consumers of lazy collections rely heavily on the concrete types being final
|
||||||
|
class _AnsibleLazyAccessTuple(_AnsibleTaggedTuple, _AnsibleLazyTemplateMixin):
|
||||||
|
"""
|
||||||
|
A tagged tuple subclass that provides only managed access for existing lazy values.
|
||||||
|
|
||||||
|
Since tuples are immutable, they cannot support lazy templating (which would change the tuple's value as templates were resolved).
|
||||||
|
When this type is created, each value in the source tuple is lazified:
|
||||||
|
|
||||||
|
* template strings are templated immediately (possibly resulting in lazy containers)
|
||||||
|
* non-tuple containers are lazy-wrapped
|
||||||
|
* tuples are immediately recursively lazy-wrapped
|
||||||
|
* transformations are applied immediately
|
||||||
|
|
||||||
|
The resulting object provides only managed access to its values (e.g., deprecation warnings, tripwires), and propagates to new lazy containers
|
||||||
|
created as a result of managed access.
|
||||||
|
"""
|
||||||
|
|
||||||
|
# DTFIX-RELEASE: ensure we have tests that explicitly verify this behavior
|
||||||
|
|
||||||
|
# nonempty __slots__ not supported for subtype of 'tuple'
|
||||||
|
|
||||||
|
def __new__(cls, contents: t.Iterable | _LazyValueSource, /) -> t.Self:
|
||||||
|
if isinstance(contents, _AnsibleLazyAccessTuple):
|
||||||
|
return super().__new__(cls, tuple.__iter__(contents))
|
||||||
|
|
||||||
|
if isinstance(contents, _LazyValueSource):
|
||||||
|
return super().__new__(cls, contents.source)
|
||||||
|
|
||||||
|
raise UnsupportedConstructionMethodError()
|
||||||
|
|
||||||
|
def __init__(self, contents: t.Iterable | _LazyValueSource, /) -> None:
|
||||||
|
_AnsibleLazyTemplateMixin.__init__(self, contents)
|
||||||
|
|
||||||
|
def __getitem__(self, key: t.SupportsIndex | slice, /) -> t.Any:
|
||||||
|
if type(key) is slice: # pylint: disable=unidiomatic-typecheck
|
||||||
|
return _AnsibleLazyAccessTuple(super().__getitem__(key))
|
||||||
|
|
||||||
|
value = super().__getitem__(key)
|
||||||
|
|
||||||
|
if self._lazy_options.access:
|
||||||
|
AnsibleAccessContext.current().access(value)
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _item_source(value: tuple) -> tuple | _LazyValueSource:
|
||||||
|
if isinstance(value, _AnsibleLazyAccessTuple):
|
||||||
|
return _LazyValueSource(source=tuple.__iter__(value), templar=value._templar, lazy_options=value._lazy_options)
|
||||||
|
|
||||||
|
return value
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _lazy_values(values: t.Any, lazy_options: LazyOptions) -> _LazyValueSource:
|
||||||
|
templar = TemplateContext.current().templar
|
||||||
|
|
||||||
|
return _LazyValueSource(source=(templar.template(value, lazy_options=lazy_options) for value in values), templar=templar, lazy_options=lazy_options)
|
||||||
|
|
||||||
|
def _non_lazy_copy(self) -> tuple:
|
||||||
|
return AnsibleTagHelper.tag_copy(self, self, value_type=tuple)
|
||||||
|
|
||||||
|
def __deepcopy__(self, memo):
|
||||||
|
return _AnsibleLazyAccessTuple(
|
||||||
|
_LazyValueSource(
|
||||||
|
source=(copy.deepcopy(v) for v in super().__iter__()),
|
||||||
|
templar=copy.deepcopy(self._templar),
|
||||||
|
lazy_options=copy.deepcopy(self._lazy_options),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def lazify_container(value: t.Any) -> t.Any:
|
||||||
|
"""
|
||||||
|
If the given value is a supported container type, return its lazy version, otherwise return the value as-is.
|
||||||
|
This is used to ensure that managed access and templating occur on args and kwargs to a callable, even if they were sourced from Jinja constants.
|
||||||
|
|
||||||
|
Since both variable access and plugin output are already lazified, this mostly affects Jinja constant containers.
|
||||||
|
However, plugins that directly invoke other plugins (e.g., `Environment.call_filter`) are another potential source of non-lazy containers.
|
||||||
|
In these cases, templating will occur for trusted templates automatically upon access.
|
||||||
|
|
||||||
|
Sets, tuples, and dictionary keys cannot be lazy, since their correct operation requires hashability and equality.
|
||||||
|
These properties are mutually exclusive with the following lazy features:
|
||||||
|
|
||||||
|
- managed access on encrypted strings - may raise errors on both operations when decryption fails
|
||||||
|
- managed access on markers - must raise errors on both operations
|
||||||
|
- templating - mutates values
|
||||||
|
|
||||||
|
That leaves non-raising managed access as the only remaining feature, which is insufficient to warrant lazy support.
|
||||||
|
"""
|
||||||
|
return _AnsibleLazyTemplateMixin._try_create(value)
|
||||||
|
|
||||||
|
|
||||||
|
def lazify_container_args(item: tuple) -> tuple:
|
||||||
|
"""Return the given args with values converted to lazy containers as needed."""
|
||||||
|
return tuple(lazify_container(value) for value in item)
|
||||||
|
|
||||||
|
|
||||||
|
def lazify_container_kwargs(item: dict[str, t.Any]) -> dict[str, t.Any]:
|
||||||
|
"""Return the given kwargs with values converted to lazy containers as needed."""
|
||||||
|
return {key: lazify_container(value) for key, value in item.items()}
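# Minimal sketch (hedged; `_invoke_plugin` is a hypothetical helper): how a caller dispatching to a
# plugin callable might apply these helpers so constant containers passed as arguments still receive
# managed access and templating when retrieved.
#
#   def _invoke_plugin(plugin_callable, args: tuple, kwargs: dict):
#       return plugin_callable(*lazify_container_args(args), **lazify_container_kwargs(kwargs))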
|
||||||
@ -0,0 +1,103 @@
|
|||||||
|
"""Handling of `Marker` values."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import abc
|
||||||
|
import contextlib
|
||||||
|
import dataclasses
|
||||||
|
import itertools
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from ansible.utils.display import Display
|
||||||
|
|
||||||
|
from ._jinja_common import Marker
|
||||||
|
|
||||||
|
|
||||||
|
class MarkerBehavior(metaclass=abc.ABCMeta):
|
||||||
|
"""Base class to support custom handling of `Marker` values encountered during concatenation or finalization."""
|
||||||
|
|
||||||
|
@abc.abstractmethod
|
||||||
|
def handle_marker(self, value: Marker) -> t.Any:
|
||||||
|
"""Handle the given `Marker` value."""
|
||||||
|
|
||||||
|
|
||||||
|
class FailingMarkerBehavior(MarkerBehavior):
|
||||||
|
"""
|
||||||
|
The default behavior when encountering a `Marker` value during concatenation or finalization.
|
||||||
|
This always raises the template-internal `MarkerError` exception.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def handle_marker(self, value: Marker) -> t.Any:
|
||||||
|
value.trip()
|
||||||
|
|
||||||
|
|
||||||
|
# FAIL_ON_MARKER_BEHAVIOR
|
||||||
|
# _DETONATE_MARKER_BEHAVIOR - internal singleton since it's the default and nobody should need to reference it, or make it an actual singleton
|
||||||
|
FAIL_ON_UNDEFINED: t.Final = FailingMarkerBehavior() # no sense in making many instances...
|
||||||
|
|
||||||
|
|
||||||
|
@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
|
||||||
|
class _MarkerTracker:
|
||||||
|
"""A numbered occurrence of a `Marker` value for later conversion to a warning."""
|
||||||
|
|
||||||
|
number: int
|
||||||
|
value: Marker
|
||||||
|
|
||||||
|
|
||||||
|
class ReplacingMarkerBehavior(MarkerBehavior):
|
||||||
|
"""All `Marker` values are replaced with a numbered string placeholder and the message from the value."""
|
||||||
|
|
||||||
|
def __init__(self) -> None:
|
||||||
|
self._trackers: list[_MarkerTracker] = []
|
||||||
|
|
||||||
|
def record_marker(self, value: Marker) -> t.Any:
|
||||||
|
"""Assign a sequence number to the given value and record it for later generation of warnings."""
|
||||||
|
number = len(self._trackers) + 1
|
||||||
|
|
||||||
|
self._trackers.append(_MarkerTracker(number=number, value=value))
|
||||||
|
|
||||||
|
return number
|
||||||
|
|
||||||
|
def emit_warnings(self) -> None:
|
||||||
|
"""Emit warning messages caused by Marker values, aggregated by unique template."""
|
||||||
|
|
||||||
|
display = Display()
|
||||||
|
grouped_templates = itertools.groupby(self._trackers, key=lambda tracker: tracker.value._marker_template_source)
|
||||||
|
|
||||||
|
for template, items in grouped_templates:
|
||||||
|
item_list = list(items)
|
||||||
|
|
||||||
|
msg = f'Encountered {len(item_list)} template error{"s" if len(item_list) > 1 else ""}.'
|
||||||
|
|
||||||
|
for item in item_list:
|
||||||
|
msg += f'\nerror {item.number} - {item.value._as_message()}'
|
||||||
|
|
||||||
|
display.warning(msg=msg, obj=template)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
@contextlib.contextmanager
|
||||||
|
def warning_context(cls) -> t.Generator[t.Self, None, None]:
|
||||||
|
"""Collect warnings for `Marker` values and emit warnings when the context exits."""
|
||||||
|
instance = cls()
|
||||||
|
|
||||||
|
try:
|
||||||
|
yield instance
|
||||||
|
finally:
|
||||||
|
instance.emit_warnings()
|
||||||
|
|
||||||
|
def handle_marker(self, value: Marker) -> t.Any:
|
||||||
|
number = self.record_marker(value)
|
||||||
|
|
||||||
|
return f"<< error {number} - {value._as_message()} >>"
|
||||||
|
|
||||||
|
|
||||||
|
class RoutingMarkerBehavior(MarkerBehavior):
|
||||||
|
"""Routes instances of Marker (by type reference) to another MarkerBehavior, defaulting to FailingMarkerBehavior."""
|
||||||
|
|
||||||
|
def __init__(self, dispatch_table: dict[type[Marker], MarkerBehavior]) -> None:
|
||||||
|
self._dispatch_table = dispatch_table
|
||||||
|
|
||||||
|
def handle_marker(self, value: Marker) -> t.Any:
|
||||||
|
behavior = self._dispatch_table.get(type(value), FAIL_ON_UNDEFINED)
|
||||||
|
|
||||||
|
return behavior.handle_marker(value)
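# Minimal sketch of a custom behavior and routing (hedged; `SomeMarkerType` stands in for a concrete
# Marker subclass, which this module does not enumerate):
#
#   class PlaceholderMarkerBehavior(MarkerBehavior):
#       """Replace markers with an empty string instead of failing."""
#
#       def handle_marker(self, value: Marker) -> t.Any:
#           return ''
#
#   behavior = RoutingMarkerBehavior({SomeMarkerType: PlaceholderMarkerBehavior()})
#   # any Marker type not present in the dispatch table falls back to FAIL_ON_UNDEFINED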
|
||||||
@ -0,0 +1,63 @@
|
|||||||
|
"""Runtime projections to provide template/var-visible views of objects that are not natively allowed in Ansible's type system."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import dataclasses
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from ansible.module_utils._internal import _traceback
|
||||||
|
from ansible.module_utils.common.messages import PluginInfo, ErrorSummary, WarningSummary, DeprecationSummary
|
||||||
|
from ansible.parsing.vault import EncryptedString, VaultHelper
|
||||||
|
from ansible.utils.display import Display
|
||||||
|
|
||||||
|
from ._jinja_common import VaultExceptionMarker
|
||||||
|
from .._errors import _captured, _utils
|
||||||
|
|
||||||
|
display = Display()
|
||||||
|
|
||||||
|
|
||||||
|
def plugin_info(value: PluginInfo) -> dict[str, str]:
|
||||||
|
"""Render PluginInfo as a dictionary."""
|
||||||
|
return dataclasses.asdict(value)
|
||||||
|
|
||||||
|
|
||||||
|
def error_summary(value: ErrorSummary) -> str:
|
||||||
|
"""Render ErrorSummary as a formatted traceback for backward-compatibility with pre-2.19 TaskResult.exception."""
|
||||||
|
return value.formatted_traceback or '(traceback unavailable)'
|
||||||
|
|
||||||
|
|
||||||
|
def warning_summary(value: WarningSummary) -> str:
|
||||||
|
"""Render WarningSummary as a simple message string for backward-compatibility with pre-2.19 TaskResult.warnings."""
|
||||||
|
return value._format()
|
||||||
|
|
||||||
|
|
||||||
|
def deprecation_summary(value: DeprecationSummary) -> dict[str, t.Any]:
|
||||||
|
"""Render DeprecationSummary as dict values for backward-compatibility with pre-2.19 TaskResult.deprecations."""
|
||||||
|
# DTFIX-RELEASE: reconsider which deprecation fields should be exposed here, taking into account that collection_name is to be deprecated
|
||||||
|
result = value._as_simple_dict()
|
||||||
|
result.pop('details')
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
def encrypted_string(value: EncryptedString) -> str | VaultExceptionMarker:
|
||||||
|
"""Decrypt an encrypted string and return its value, or a VaultExceptionMarker if decryption fails."""
|
||||||
|
try:
|
||||||
|
return value._decrypt()
|
||||||
|
except Exception as ex:
|
||||||
|
return VaultExceptionMarker(
|
||||||
|
ciphertext=VaultHelper.get_ciphertext(value, with_tags=True),
|
||||||
|
reason=_utils.get_chained_message(ex),
|
||||||
|
traceback=_traceback.maybe_extract_traceback(ex, _traceback.TracebackEvent.ERROR),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
_type_transform_mapping: dict[type, t.Callable[[t.Any], t.Any]] = {
|
||||||
|
_captured.CapturedErrorSummary: error_summary,
|
||||||
|
PluginInfo: plugin_info,
|
||||||
|
ErrorSummary: error_summary,
|
||||||
|
WarningSummary: warning_summary,
|
||||||
|
DeprecationSummary: deprecation_summary,
|
||||||
|
EncryptedString: encrypted_string,
|
||||||
|
}
|
||||||
|
"""This mapping is consulted by `Templar.template` to provide custom views of some objects."""
|
||||||
@ -0,0 +1,107 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import dataclasses
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from ansible.module_utils._internal import _ambient_context, _datatag
|
||||||
|
|
||||||
|
if t.TYPE_CHECKING:
|
||||||
|
from ._engine import TemplateEngine, TemplateOptions
|
||||||
|
|
||||||
|
|
||||||
|
@dataclasses.dataclass(kw_only=True, slots=True, frozen=True)
|
||||||
|
class LazyOptions:
|
||||||
|
"""Templating options that apply to lazy containers, which are inherited by descendent lazy containers."""
|
||||||
|
|
||||||
|
DEFAULT: t.ClassVar[t.Self]
|
||||||
|
"""A shared instance with the default options to minimize instance creation for arg defaults."""
|
||||||
|
SKIP_TEMPLATES: t.ClassVar[t.Self]
|
||||||
|
"""A shared instance with only `template=False` set to minimize instance creation for arg defaults."""
|
||||||
|
SKIP_TEMPLATES_AND_ACCESS: t.ClassVar[t.Self]
|
||||||
|
"""A shared instance with both `template=False` and `access=False` set to minimize instance creation for arg defaults."""
|
||||||
|
|
||||||
|
template: bool = True
|
||||||
|
"""Enable/disable templating."""
|
||||||
|
|
||||||
|
access: bool = True
"""Enable/disable access calls."""
|
||||||
|
|
||||||
|
unmask_type_names: frozenset[str] = frozenset()
|
||||||
|
"""Disables template transformations for the provided type names."""
|
||||||
|
|
||||||
|
|
||||||
|
LazyOptions.DEFAULT = LazyOptions()
|
||||||
|
LazyOptions.SKIP_TEMPLATES = LazyOptions(template=False)
|
||||||
|
LazyOptions.SKIP_TEMPLATES_AND_ACCESS = LazyOptions(template=False, access=False)
|
||||||
|
|
||||||
|
|
||||||
|
class TemplateContext(_ambient_context.AmbientContextBase):
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
*,
|
||||||
|
template_value: t.Any,
|
||||||
|
templar: TemplateEngine,
|
||||||
|
options: TemplateOptions,
|
||||||
|
stop_on_template: bool = False,
|
||||||
|
_render_jinja_const_template: bool = False,
|
||||||
|
):
|
||||||
|
self._template_value = template_value
|
||||||
|
self._templar = templar
|
||||||
|
self._options = options
|
||||||
|
self._stop_on_template = stop_on_template
|
||||||
|
self._parent_ctx = TemplateContext.current(optional=True)
|
||||||
|
self._render_jinja_const_template = _render_jinja_const_template
|
||||||
|
|
||||||
|
@property
|
||||||
|
def is_top_level(self) -> bool:
|
||||||
|
return not self._parent_ctx
|
||||||
|
|
||||||
|
@property
|
||||||
|
def template_value(self) -> t.Any:
|
||||||
|
return self._template_value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def templar(self) -> TemplateEngine:
|
||||||
|
return self._templar
|
||||||
|
|
||||||
|
@property
|
||||||
|
def options(self) -> TemplateOptions:
|
||||||
|
return self._options
|
||||||
|
|
||||||
|
@property
|
||||||
|
def stop_on_template(self) -> bool:
|
||||||
|
return self._stop_on_template
|
||||||
|
|
||||||
|
|
||||||
|
class _OmitType:
|
||||||
|
"""
|
||||||
|
A placeholder singleton used to dynamically omit items from a dict/list/tuple/set when the value is `Omit`.
|
||||||
|
|
||||||
|
The `Omit` singleton is accessible from all Ansible templating contexts via the Jinja global name `omit`.
|
||||||
|
The `Omit` placeholder value will be visible to Jinja plugins during templating.
|
||||||
|
Jinja plugins requiring omit behavior are responsible for handling encountered `Omit` values.
|
||||||
|
`Omit` values remaining in template results will be automatically dropped during template finalization.
|
||||||
|
When a finalized template renders to a scalar `Omit`, `AnsibleValueOmittedError` will be raised.
|
||||||
|
Passing a value other than `Omit` for `value_for_omit` to the `template` call allows that value to be substituted instead of raising.
|
||||||
|
"""
|
||||||
|
|
||||||
|
__slots__ = ()
|
||||||
|
|
||||||
|
def __new__(cls):
|
||||||
|
return Omit
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return "<<Omit>>"
|
||||||
|
|
||||||
|
|
||||||
|
Omit = object.__new__(_OmitType)
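# Illustrative playbook usage (hedged; the task values are hypothetical): a parameter whose value
# templates to `omit` is dropped from its parent container before the module sees it.
#
#   - ansible.builtin.file:
#       path: /tmp/example
#       mode: "{{ file_mode | default(omit) }}"   # dropped entirely when `file_mode` is undefined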
|
||||||
|
|
||||||
|
_datatag._untaggable_types.add(_OmitType)
|
||||||
|
|
||||||
|
|
||||||
|
# DTFIX-RELEASE: review these type sets to ensure they're not overly permissive/dynamic
|
||||||
|
IGNORE_SCALAR_VAR_TYPES = {value for value in _datatag._ANSIBLE_ALLOWED_SCALAR_VAR_TYPES if not issubclass(value, str)}
|
||||||
|
|
||||||
|
PASS_THROUGH_SCALAR_VAR_TYPES = _datatag._ANSIBLE_ALLOWED_SCALAR_VAR_TYPES | {
|
||||||
|
_OmitType, # allow pass through of omit for later handling after top-level finalize completes
|
||||||
|
}
|
||||||
File diff suppressed because it is too large
@ -0,0 +1,240 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import abc
|
||||||
|
import copy
|
||||||
|
import typing as t
|
||||||
|
|
||||||
|
from yaml import Node
|
||||||
|
from yaml.constructor import SafeConstructor
|
||||||
|
from yaml.resolver import BaseResolver
|
||||||
|
|
||||||
|
from ansible import constants as C
|
||||||
|
from ansible.module_utils.common.text.converters import to_text
|
||||||
|
from ansible.module_utils._internal._datatag import AnsibleTagHelper
|
||||||
|
from ansible._internal._datatag._tags import Origin, TrustedAsTemplate
|
||||||
|
from ansible.parsing.vault import EncryptedString
|
||||||
|
from ansible.utils.display import Display
|
||||||
|
|
||||||
|
from ._errors import AnsibleConstructorError
|
||||||
|
|
||||||
|
display = Display()
|
||||||
|
|
||||||
|
_TRUSTED_AS_TEMPLATE: t.Final[TrustedAsTemplate] = TrustedAsTemplate()
|
||||||
|
|
||||||
|
|
||||||
|
class _BaseConstructor(SafeConstructor, metaclass=abc.ABCMeta):
|
||||||
|
"""Base class for Ansible YAML constructors."""
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
@abc.abstractmethod
|
||||||
|
def _register_constructors(cls) -> None:
|
||||||
|
"""Method used to register constructors to derived types during class initialization."""
|
||||||
|
|
||||||
|
def __init_subclass__(cls, **kwargs) -> None:
|
||||||
|
"""Initialization for derived types."""
|
||||||
|
cls._register_constructors()
|
||||||
|
|
||||||
|
|
||||||
|
class AnsibleInstrumentedConstructor(_BaseConstructor):
|
||||||
|
"""Ansible constructor which supports Ansible custom behavior such as `Origin` tagging, but no Ansible-specific YAML tags."""
|
||||||
|
|
||||||
|
name: t.Any # provided by the YAML parser, which retrieves it from the stream
|
||||||
|
|
||||||
|
def __init__(self, origin: Origin, trusted_as_template: bool) -> None:
|
||||||
|
if not origin.line_num:
|
||||||
|
origin = origin.replace(line_num=1)
|
||||||
|
|
||||||
|
self._origin = origin
|
||||||
|
self._trusted_as_template = trusted_as_template
|
||||||
|
self._duplicate_key_mode = C.config.get_config_value('DUPLICATE_YAML_DICT_KEY')
|
||||||
|
|
||||||
|
super().__init__()
|
||||||
|
|
||||||
|
@property
|
||||||
|
def trusted_as_template(self) -> bool:
|
||||||
|
return self._trusted_as_template
|
||||||
|
|
||||||
|
def construct_yaml_map(self, node):
|
||||||
|
data = self._node_position_info(node).tag({}) # always an ordered dictionary on py3.7+
|
||||||
|
yield data
|
||||||
|
value = self.construct_mapping(node)
|
||||||
|
data.update(value)
|
||||||
|
|
||||||
|
def construct_mapping(self, node, deep=False):
|
||||||
|
# Delegate to built-in implementation to construct the mapping.
|
||||||
|
# This is done before checking for duplicates to leverage existing error checking on the input node.
|
||||||
|
mapping = super().construct_mapping(node, deep)
|
||||||
|
keys = set()
|
||||||
|
|
||||||
|
# Now that the node is known to be a valid mapping, handle any duplicate keys.
|
||||||
|
for key_node, _value_node in node.value:
|
||||||
|
if (key := self.construct_object(key_node, deep=deep)) in keys:
|
||||||
|
msg = f'Found duplicate mapping key {key!r}.'
|
||||||
|
|
||||||
|
if self._duplicate_key_mode == 'error':
|
||||||
|
raise AnsibleConstructorError(problem=msg, problem_mark=key_node.start_mark)
|
||||||
|
|
||||||
|
if self._duplicate_key_mode == 'warn':
|
||||||
|
display.warning(msg=msg, obj=key, help_text='Using last defined value only.')
|
||||||
|
|
||||||
|
keys.add(key)
|
||||||
|
|
||||||
|
return mapping
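# Illustrative input (hedged): given the mapping below, `warn` mode keeps the last value of `name`
# and warns, while `error` mode raises AnsibleConstructorError.
#
#   name: first
#   name: second   # duplicate key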
|
||||||
|
|
||||||
|
def construct_yaml_int(self, node):
|
||||||
|
value = super().construct_yaml_int(node)
|
||||||
|
return self._node_position_info(node).tag(value)
|
||||||
|
|
||||||
|
def construct_yaml_float(self, node):
|
||||||
|
value = super().construct_yaml_float(node)
|
||||||
|
return self._node_position_info(node).tag(value)
|
||||||
|
|
||||||
|
def construct_yaml_timestamp(self, node):
|
||||||
|
value = super().construct_yaml_timestamp(node)
|
||||||
|
return self._node_position_info(node).tag(value)
|
||||||
|
|
||||||
|
def construct_yaml_omap(self, node):
|
||||||
|
origin = self._node_position_info(node)
|
||||||
|
display.deprecated(
|
||||||
|
msg='Use of the YAML `!!omap` tag is deprecated.',
|
||||||
|
version='2.23',
|
||||||
|
obj=origin,
|
||||||
|
help_text='Use a standard mapping instead, as key order is always preserved.',
|
||||||
|
)
|
||||||
|
items = list(super().construct_yaml_omap(node))[0]
|
||||||
|
items = [origin.tag(item) for item in items]
|
||||||
|
yield origin.tag(items)
|
||||||
|
|
||||||
|
def construct_yaml_pairs(self, node):
|
||||||
|
origin = self._node_position_info(node)
|
||||||
|
display.deprecated(
|
||||||
|
msg='Use of the YAML `!!pairs` tag is deprecated.',
|
||||||
|
version='2.23',
|
||||||
|
obj=origin,
|
||||||
|
help_text='Use a standard mapping instead.',
|
||||||
|
)
|
||||||
|
items = list(super().construct_yaml_pairs(node))[0]
|
||||||
|
items = [origin.tag(item) for item in items]
|
||||||
|
yield origin.tag(items)
|
||||||
|
|
||||||
|
def construct_yaml_str(self, node):
|
||||||
|
# Override the default string handling function
|
||||||
|
# to always return unicode objects
|
||||||
|
# DTFIX-FUTURE: is this to_text conversion still necessary under Py3?
|
||||||
|
value = to_text(self.construct_scalar(node))
|
||||||
|
|
||||||
|
tags = [self._node_position_info(node)]
|
||||||
|
|
||||||
|
if self.trusted_as_template:
|
||||||
|
# NB: since we're not context aware, this will happily add trust to dictionary keys; this is actually necessary for
|
||||||
|
# certain backward compat scenarios, though might be accomplished in other ways if we wanted to avoid trusting keys in
|
||||||
|
# the general scenario
|
||||||
|
tags.append(_TRUSTED_AS_TEMPLATE)
|
||||||
|
|
||||||
|
return AnsibleTagHelper.tag(value, tags)
|
||||||
|
|
||||||
|
def construct_yaml_binary(self, node):
|
||||||
|
value = super().construct_yaml_binary(node)
|
||||||
|
|
||||||
|
return AnsibleTagHelper.tag(value, self._node_position_info(node))
|
||||||
|
|
||||||
|
def construct_yaml_set(self, node):
|
||||||
|
data = AnsibleTagHelper.tag(set(), self._node_position_info(node))
|
||||||
|
yield data
|
||||||
|
value = self.construct_mapping(node)
|
||||||
|
data.update(value)
|
||||||
|
|
||||||
|
def construct_yaml_seq(self, node):
|
||||||
|
data = self._node_position_info(node).tag([])
|
||||||
|
yield data
|
||||||
|
data.extend(self.construct_sequence(node))
|
||||||
|
|
||||||
|
def _resolve_and_construct_object(self, node):
|
||||||
|
# use a copied node to avoid mutating the existing node and tripping the recursion check in construct_object
|
||||||
|
copied_node = copy.copy(node)
|
||||||
|
# repeat implicit resolution process to determine the proper tag for the value in the unsafe node
|
||||||
|
copied_node.tag = t.cast(BaseResolver, self).resolve(type(node), node.value, (True, False))
|
||||||
|
|
||||||
|
# re-entrant call using the correct tag
|
||||||
|
# non-deferred construction of hierarchical nodes so the result is a fully realized object, and so our stateful unsafe propagation behavior works
|
||||||
|
return self.construct_object(copied_node, deep=True)
|
||||||
|
|
||||||
|
def _node_position_info(self, node) -> Origin:
|
||||||
|
# the line number where the previous token has ended (plus empty lines)
|
||||||
|
# Add one so that the first line is line 1 rather than line 0
|
||||||
|
return self._origin.replace(line_num=node.start_mark.line + self._origin.line_num, col_num=node.start_mark.column + 1)
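# Worked example (hedged): with the default Origin line_num of 1, a node whose start_mark sits at
# 0-based line 0, column 4 is reported as line 1, column 5.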
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _register_constructors(cls) -> None:
|
||||||
|
constructors: dict[str, t.Callable] = {
|
||||||
|
'tag:yaml.org,2002:binary': cls.construct_yaml_binary,
|
||||||
|
'tag:yaml.org,2002:float': cls.construct_yaml_float,
|
||||||
|
'tag:yaml.org,2002:int': cls.construct_yaml_int,
|
||||||
|
'tag:yaml.org,2002:map': cls.construct_yaml_map,
|
||||||
|
'tag:yaml.org,2002:omap': cls.construct_yaml_omap,
|
||||||
|
'tag:yaml.org,2002:pairs': cls.construct_yaml_pairs,
|
||||||
|
'tag:yaml.org,2002:python/dict': cls.construct_yaml_map,
|
||||||
|
'tag:yaml.org,2002:python/unicode': cls.construct_yaml_str,
|
||||||
|
'tag:yaml.org,2002:seq': cls.construct_yaml_seq,
|
||||||
|
'tag:yaml.org,2002:set': cls.construct_yaml_set,
|
||||||
|
'tag:yaml.org,2002:str': cls.construct_yaml_str,
|
||||||
|
'tag:yaml.org,2002:timestamp': cls.construct_yaml_timestamp,
|
||||||
|
}
|
||||||
|
|
||||||
|
for tag, constructor in constructors.items():
|
||||||
|
cls.add_constructor(tag, constructor)
|
||||||
|
|
||||||
|
|
||||||
|
class AnsibleConstructor(AnsibleInstrumentedConstructor):
|
||||||
|
"""Ansible constructor which supports Ansible custom behavior such as `Origin` tagging, as well as Ansible-specific YAML tags."""
|
||||||
|
|
||||||
|
def __init__(self, origin: Origin, trusted_as_template: bool) -> None:
|
||||||
|
self._unsafe_depth = 0 # volatile state var used during recursive construction of a value tagged unsafe
|
||||||
|
|
||||||
|
super().__init__(origin=origin, trusted_as_template=trusted_as_template)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def trusted_as_template(self) -> bool:
|
||||||
|
return self._trusted_as_template and not self._unsafe_depth
|
||||||
|
|
||||||
|
def construct_yaml_unsafe(self, node):
|
||||||
|
self._unsafe_depth += 1
|
||||||
|
|
||||||
|
try:
|
||||||
|
return self._resolve_and_construct_object(node)
|
||||||
|
finally:
|
||||||
|
self._unsafe_depth -= 1
|
||||||
|
|
||||||
|
def construct_yaml_vault(self, node: Node) -> EncryptedString:
|
||||||
|
ciphertext = self._resolve_and_construct_object(node)
|
||||||
|
|
||||||
|
if not isinstance(ciphertext, str):
|
||||||
|
raise AnsibleConstructorError(problem=f"the {node.tag!r} tag requires a string value", problem_mark=node.start_mark)
|
||||||
|
|
||||||
|
encrypted_string = AnsibleTagHelper.tag_copy(ciphertext, EncryptedString(ciphertext=AnsibleTagHelper.untag(ciphertext)))
|
||||||
|
|
||||||
|
return encrypted_string
|
||||||
|
|
||||||
|
def construct_yaml_vault_encrypted(self, node: Node) -> EncryptedString:
|
||||||
|
origin = self._node_position_info(node)
|
||||||
|
display.deprecated(
|
||||||
|
msg='Use of the YAML `!vault-encrypted` tag is deprecated.',
|
||||||
|
version='2.23',
|
||||||
|
obj=origin,
|
||||||
|
help_text='Use the `!vault` tag instead.',
|
||||||
|
)
|
||||||
|
|
||||||
|
return self.construct_yaml_vault(node)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _register_constructors(cls) -> None:
|
||||||
|
super()._register_constructors()
|
||||||
|
|
||||||
|
constructors: dict[str, t.Callable] = {
|
||||||
|
'!unsafe': cls.construct_yaml_unsafe,
|
||||||
|
'!vault': cls.construct_yaml_vault,
|
||||||
|
'!vault-encrypted': cls.construct_yaml_vault_encrypted,
|
||||||
|
}
|
||||||
|
|
||||||
|
for tag, constructor in constructors.items():
|
||||||
|
cls.add_constructor(tag, constructor)
|
||||||
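A minimal standalone sketch (not part of the change itself, using hypothetical values) of the position arithmetic used by `_node_position_info` above, where YAML marks are zero-based and `Origin` line/column numbers are one-based:

# Sketch of the _node_position_info arithmetic; values are illustrative only.
def node_position(origin_line_num: int, mark_line: int, mark_column: int) -> tuple[int, int]:
    """Combine a document-level starting line with a zero-based YAML mark."""
    line_num = mark_line + origin_line_num  # mark_line is zero-based, origin_line_num is one-based
    col_num = mark_column + 1               # shift the zero-based column to one-based
    return line_num, col_num


# A node whose mark reports line 2, column 4 in a document that begins at line 1
# resolves to line 3, column 5.
assert node_position(1, 2, 4) == (3, 5)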
@ -0,0 +1,62 @@
from __future__ import annotations

import abc
import collections.abc as c
import typing as t

from yaml.representer import SafeRepresenter

from ansible.module_utils._internal._datatag import AnsibleTaggedObject, Tripwire, AnsibleTagHelper
from ansible.parsing.vault import VaultHelper
from ansible.module_utils.common.yaml import HAS_LIBYAML

if HAS_LIBYAML:
    from yaml.cyaml import CSafeDumper as SafeDumper
else:
    from yaml import SafeDumper  # type: ignore[assignment]


class _BaseDumper(SafeDumper, metaclass=abc.ABCMeta):
    """Base class for Ansible YAML dumpers."""

    @classmethod
    @abc.abstractmethod
    def _register_representers(cls) -> None:
        """Method used to register representers to derived types during class initialization."""

    def __init_subclass__(cls, **kwargs) -> None:
        """Initialization for derived types."""
        cls._register_representers()


class AnsibleDumper(_BaseDumper):
    """A simple stub class that allows us to add representers for our custom types."""

    # DTFIX-RELEASE: need a better way to handle serialization controls during YAML dumping
    def __init__(self, *args, dump_vault_tags: bool | None = None, **kwargs):
        super().__init__(*args, **kwargs)

        self._dump_vault_tags = dump_vault_tags

    @classmethod
    def _register_representers(cls) -> None:
        cls.add_multi_representer(AnsibleTaggedObject, cls.represent_ansible_tagged_object)
        cls.add_multi_representer(Tripwire, cls.represent_tripwire)
        cls.add_multi_representer(c.Mapping, SafeRepresenter.represent_dict)
        cls.add_multi_representer(c.Sequence, SafeRepresenter.represent_list)

    def represent_ansible_tagged_object(self, data):
        if self._dump_vault_tags is not False and (ciphertext := VaultHelper.get_ciphertext(data, with_tags=False)):
            # deprecated: description='enable the deprecation warning below' core_version='2.23'
            # if self._dump_vault_tags is None:
            #     Display().deprecated(
            #         msg="Implicit YAML dumping of vaulted value ciphertext is deprecated. Set `dump_vault_tags` to explicitly specify the desired behavior",
            #         version="2.27",
            #     )

            return self.represent_scalar('!vault', ciphertext, style='|')

        return self.represent_data(AnsibleTagHelper.as_native_type(data))  # automatically decrypts encrypted strings

    def represent_tripwire(self, data: Tripwire) -> t.NoReturn:
        data.trip()
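A minimal usage sketch, not part of the change itself, assuming `AnsibleDumper` (defined above) is already in scope; it relies only on the public `yaml.dump` API:

import yaml

# Plain mappings and sequences go through SafeRepresenter and dump as ordinary YAML.
document = {'greeting': 'hello', 'numbers': [1, 2, 3]}
print(yaml.dump(document, Dumper=AnsibleDumper, default_flow_style=False))

# Values carrying vault ciphertext are emitted as `!vault` block scalars by default;
# constructing the dumper with dump_vault_tags=False decrypts them and emits plaintext instead.
# Tripwire values deliberately raise when represented.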
@ -0,0 +1,166 @@
from __future__ import annotations

import re
import typing as t

from yaml import MarkedYAMLError
from yaml.constructor import ConstructorError

from ansible._internal._errors import _utils
from ansible.errors import AnsibleParserError
from ansible._internal._datatag._tags import Origin


class AnsibleConstructorError(ConstructorError):
    """Ansible-specific ConstructorError used to bypass exception analysis during wrapping in AnsibleYAMLParserError."""


class AnsibleYAMLParserError(AnsibleParserError):
    """YAML-specific parsing failure wrapping an exception raised by the YAML parser."""

    _default_message = 'YAML parsing failed.'

    _include_cause_message = False  # hide the underlying cause message, it's included by `handle_exception` as needed

    _formatted_source_context_value: str | None = None

    @property
    def _formatted_source_context(self) -> str | None:
        return self._formatted_source_context_value

    @classmethod
    def handle_exception(cls, exception: Exception, origin: Origin) -> t.NoReturn:
        if isinstance(exception, MarkedYAMLError):
            origin = origin.replace(line_num=exception.problem_mark.line + 1, col_num=exception.problem_mark.column + 1)

        source_context = _utils.SourceContext.from_origin(origin)

        target_line = source_context.target_line or ''  # for these cases, we don't need to distinguish between None and empty string

        message: str | None = None
        help_text = None

        # FIXME: Do all this by walking the parsed YAML doc stream. Using regexes is a dead-end; YAML's just too flexible to not have a
        #  raft of false-positives and corner cases. If we directly consume either the YAML parse stream or override the YAML composer, we can
        #  better catch these things without worrying about duplicating YAML's scalar parsing logic around quoting/escaping. At first, we can
        #  replace the regex logic below with tiny special-purpose parse consumers to catch specific issues, but ideally, we could do a lot of this
        #  inline with the actual doc parse, since our rules are a lot more strict than YAML's (eg, no support for non-scalar keys), and a lot of the
        #  problem cases where that comes into play are around expression quoting and Jinja {{ syntax looking like weird YAML values we don't support.
        #  Some common examples, where -> is "what YAML actually sees":
        #    foo: {{ bar }} -> {"foo": {{"bar": None}: None}} - a mapping with a mapping as its key (legal YAML, but not legal Python/Ansible)
        #
        #    - copy: src=foo.txt  # kv syntax (kv could be on following line(s), too- implicit multi-line block scalar)
        #      dest: bar.txt      # orphaned mapping, since the value of `copy` is the scalar "src=foo.txt"
        #
        #    - msg == "Error: 'dude' was not found"  # unquoted scalar has a : in it -> {'msg == "Error"': 'dude'} [ was not found" ] is garbage orphan scalar

        # noinspection PyUnboundLocalVariable
        if not isinstance(exception, MarkedYAMLError):
            pass  # unexpected exception, don't use special analysis of exception
        elif isinstance(exception, AnsibleConstructorError):
            pass  # raised internally by ansible code, don't use special analysis of exception
        # Check for tabs.
        # There may be cases where there is a valid tab in a line that has other errors.
        # That's OK, users should "fix" their tab usage anyway -- at which point later error handling logic will hopefully find the real issue.
        elif (tab_idx := target_line.find('\t')) >= 0:
            source_context = _utils.SourceContext.from_origin(origin.replace(col_num=tab_idx + 1))
            message = "Tabs are usually invalid in YAML."
        # Check for unquoted templates.
        elif match := re.search(r'^\s*(?:-\s+)*(?:[\w\s]+:\s+)?(?P<value>\{\{.*}})', target_line):
            source_context = _utils.SourceContext.from_origin(origin.replace(col_num=match.start('value') + 1))
            message = 'This may be an issue with missing quotes around a template block.'
            # FIXME: Use the captured value to show the actual fix required.
            help_text = """
            For example:

                raw: {{ some_var }}

            Should be:

                raw: "{{ some_var }}"
            """
        # Check for common unquoted colon mistakes.
        elif (
            # ignore lines starting with only whitespace and a colon
            not target_line.lstrip().startswith(':')
            # find the value after list/dict preamble
            and (value_match := re.search(r'^\s*(?:-\s+)*(?:[\w\s\[\]{}]+:\s+)?(?P<value>.*)$', target_line))
            # ignore properly quoted values
            and (target_fragment := _replace_quoted_value(value_match.group('value')))
            # look for an unquoted colon in the value
            and (colon_match := re.search(r':($| )', target_fragment))
        ):
            source_context = _utils.SourceContext.from_origin(origin.replace(col_num=value_match.start('value') + colon_match.start() + 1))
            message = 'Colons in unquoted values must be followed by a non-space character.'
            # FIXME: Use the captured value to show the actual fix required.
            help_text = """
            For example:

                raw: echo 'name: ansible'

            Should be:

                raw: "echo 'name: ansible'"
            """
        # Check for common quoting mistakes.
        elif match := re.search(r'^\s*(?:-\s+)*(?:[\w\s]+:\s+)?(?P<value>[\"\'].*?\s*)$', target_line):
            suspected_value = match.group('value')
            first, last = suspected_value[0], suspected_value[-1]

            if first != last:  # "foo" in bar
                source_context = _utils.SourceContext.from_origin(origin.replace(col_num=match.start('value') + 1))
                message = 'Values starting with a quote must end with the same quote.'
                # FIXME: Use the captured value to show the actual fix required, and use that same logic to improve the origin further.
                help_text = """
                For example:

                    raw: "foo" in bar

                Should be:

                    raw: '"foo" in bar'
                """
            elif first == last and target_line.count(first) > 2:  # "foo" and "bar"
                source_context = _utils.SourceContext.from_origin(origin.replace(col_num=match.start('value') + 1))
                message = 'Values starting with a quote must end with the same quote, and not contain that quote.'
                # FIXME: Use the captured value to show the actual fix required, and use that same logic to improve the origin further.
                help_text = """
                For example:

                    raw: "foo" in "bar"

                Should be:

                    raw: '"foo" in "bar"'
                """

        if not message:
            if isinstance(exception, MarkedYAMLError):
                # marked YAML error, pull out the useful messages while omitting the noise
                message = ' '.join(filter(None, (exception.context, exception.problem, exception.note)))
                message = message.strip()
                message = f'{message[0].upper()}{message[1:]}'

                if not message.endswith('.'):
                    message += '.'
            else:
                # unexpected error, use the exception message (normally hidden by overriding include_cause_message)
                message = str(exception)

            message = re.sub(r'\s+', ' ', message).strip()

        error = cls(message, obj=source_context.origin)
        error._formatted_source_context_value = str(source_context)
        error._help_text = help_text

        raise error from exception


def _replace_quoted_value(value: str, replacement='.') -> str:
    return re.sub(r"""^\s*('[^']*'|"[^"]*")\s*$""", lambda match: replacement * len(match.group(0)), value)
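The quoting heuristics above can be checked in isolation; this standalone sketch (not part of the change itself) reuses the same regular expressions on the problem lines named in the comments:

import re

def replace_quoted_value(value: str, replacement: str = '.') -> str:
    # Same masking helper as _replace_quoted_value above: a fully quoted value is blanked
    # out so the unquoted-colon check will not fire on it.
    return re.sub(r"""^\s*('[^']*'|"[^"]*")\s*$""", lambda m: replacement * len(m.group(0)), value)

# A properly quoted value is masked, so the colon check skips it...
quoted = '"echo name: ansible"'
assert replace_quoted_value(quoted) == '.' * len(quoted)

# ...while an unquoted value is returned as-is and the colon check can flag it.
unquoted = "echo 'name: ansible'"
assert replace_quoted_value(unquoted) == unquoted
assert re.search(r':($| )', unquoted) is not None

# The unquoted-template check matches lines like `raw: {{ some_var }}`.
assert re.search(r'^\s*(?:-\s+)*(?:[\w\s]+:\s+)?(?P<value>\{\{.*}})', 'raw: {{ some_var }}') is not None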
@ -0,0 +1,66 @@
from __future__ import annotations

import io as _io

from yaml.resolver import Resolver

from ansible.module_utils._internal._datatag import AnsibleTagHelper
from ansible.module_utils.common.yaml import HAS_LIBYAML
from ansible._internal._datatag import _tags

from ._constructor import AnsibleConstructor, AnsibleInstrumentedConstructor

if HAS_LIBYAML:
    from yaml.cyaml import CParser

    class _YamlParser(CParser):
        def __init__(self, stream: str | bytes | _io.IOBase) -> None:
            if isinstance(stream, (str, bytes)):
                stream = AnsibleTagHelper.untag(stream)  # PyYAML + libyaml barfs on str/bytes subclasses

            CParser.__init__(self, stream)

            self.name = getattr(stream, 'name', None)  # provide feature parity with the Python implementation (yaml.reader.Reader provides name)

else:
    from yaml.composer import Composer
    from yaml.reader import Reader
    from yaml.scanner import Scanner
    from yaml.parser import Parser

    class _YamlParser(Reader, Scanner, Parser, Composer):  # type: ignore[no-redef]
        def __init__(self, stream: str | bytes | _io.IOBase) -> None:
            Reader.__init__(self, stream)
            Scanner.__init__(self)
            Parser.__init__(self)
            Composer.__init__(self)


class AnsibleInstrumentedLoader(_YamlParser, AnsibleInstrumentedConstructor, Resolver):
    """Ansible YAML loader which supports Ansible custom behavior such as `Origin` tagging, but no Ansible-specific YAML tags."""

    def __init__(self, stream: str | bytes | _io.IOBase) -> None:
        _YamlParser.__init__(self, stream)

        AnsibleInstrumentedConstructor.__init__(
            self,
            origin=_tags.Origin.get_or_create_tag(stream, self.name),
            trusted_as_template=_tags.TrustedAsTemplate.is_tagged_on(stream),
        )

        Resolver.__init__(self)


class AnsibleLoader(_YamlParser, AnsibleConstructor, Resolver):
    """Ansible loader which supports Ansible custom behavior such as `Origin` tagging, as well as Ansible-specific YAML tags."""

    def __init__(self, stream: str | bytes | _io.IOBase) -> None:
        _YamlParser.__init__(self, stream)

        AnsibleConstructor.__init__(
            self,
            origin=_tags.Origin.get_or_create_tag(stream, self.name),
            trusted_as_template=_tags.TrustedAsTemplate.is_tagged_on(stream),
        )

        Resolver.__init__(self)
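A minimal usage sketch, not part of the change itself, assuming `AnsibleLoader` and `_tags` (imported above) are in scope; `get_single_data()` is the standard PyYAML constructor entry point:

data = AnsibleLoader('greeting: hello\nnumbers: [1, 2, 3]\n').get_single_data()

# Values come back as tagged subclasses of their native types; the Origin tag (if present)
# records where in the stream each value was parsed.
print(data['greeting'])                        # -> hello
print(_tags.Origin.get_tag(data['greeting']))  # -> origin information for the scalar, or None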
@ -0,0 +1,11 @@
"Protomatter - an unstable substance which every ethical scientist in the galaxy has denounced as dangerously unpredictable."

"But it was the only way to solve certain problems..."

This Ansible Collection is embedded within ansible-core.
It contains plugins useful for ansible-core's own integration tests.
They have been made available, completely unsupported,
in case they prove useful for debugging and troubleshooting purposes.

> CAUTION: This collection is not supported, and may be changed or removed in any version without prior notice.
Use of these plugins outside ansible-core is highly discouraged.
@ -0,0 +1,36 @@
from __future__ import annotations

import typing as t

from ansible.module_utils.common.validation import _check_type_str_no_conversion, _check_type_list_strict
from ansible.plugins.action import ActionBase
from ansible._internal._templating._engine import TemplateEngine
from ansible._internal._templating._marker_behaviors import ReplacingMarkerBehavior


class ActionModule(ActionBase):
    TRANSFERS_FILES = False
    _requires_connection = False

    @classmethod
    def finalize_task_arg(cls, name: str, value: t.Any, templar: TemplateEngine, context: t.Any) -> t.Any:
        if name == 'expression':
            return value

        return super().finalize_task_arg(name, value, templar, context)

    def run(self, tmp=None, task_vars=None):
        # accepts a list of literal expressions (no templating), evaluates with no failure on undefined, returns all results
        _vr, args = self.validate_argument_spec(
            argument_spec=dict(
                expression=dict(type=_check_type_list_strict, elements=_check_type_str_no_conversion, required=True),
            ),
        )

        with ReplacingMarkerBehavior.warning_context() as replacing_behavior:
            templar = self._templar._engine.extend(marker_behavior=replacing_behavior)

            return dict(
                _ansible_verbose_always=True,
                expression_result=[templar.evaluate_expression(expression) for expression in args['expression']],
            )
@ -0,0 +1,19 @@
from __future__ import annotations

import typing as t

from ansible._internal._datatag._tags import TrustedAsTemplate


def apply_trust(value: object) -> object:
    """
    Filter that returns a tagged copy of the input string with TrustedAsTemplate.
    Containers and other non-string values are returned unmodified.
    """
    return TrustedAsTemplate().tag(value) if isinstance(value, str) else value


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(apply_trust=apply_trust)
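As a quick illustration of the pass-through behavior described in the docstring, the filter function can be exercised directly in Python (a sketch, assuming the imports above):

# Strings come back as a TrustedAsTemplate-tagged copy; non-strings are returned unchanged.
trusted = apply_trust("{{ inventory_hostname }}")
assert TrustedAsTemplate.is_tagged_on(trusted)

untouched = apply_trust(["{{ inventory_hostname }}"])  # containers are not tagged
assert not TrustedAsTemplate.is_tagged_on(untouched)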
@ -0,0 +1,18 @@
from __future__ import annotations

import dataclasses
import typing as t


def dump_object(value: t.Any) -> object:
    """Internal filter to convert objects not supported by JSON to types which are."""
    if dataclasses.is_dataclass(value):
        return dataclasses.asdict(value)  # type: ignore[arg-type]

    return value


class FilterModule(object):
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(dump_object=dump_object)
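A small sketch of the projection the filter performs (not part of the change itself; the `Point` type is hypothetical):

import dataclasses

@dataclasses.dataclass
class Point:  # hypothetical example type, used only for illustration
    x: int
    y: int

# Dataclass instances are projected to plain dicts so they can be JSON-encoded; other values pass through.
assert dump_object(Point(1, 2)) == {'x': 1, 'y': 2}
assert dump_object('already serializable') == 'already serializable'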
@ -0,0 +1,16 @@
from __future__ import annotations

import typing as t

from ansible._internal._templating._engine import _finalize_template_result, FinalizeMode


def finalize(value: t.Any) -> t.Any:
    """Perform an explicit top-level template finalize operation on the supplied value."""
    return _finalize_template_result(value, mode=FinalizeMode.TOP_LEVEL)


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(finalize=finalize)
@ -0,0 +1,18 @@
from __future__ import annotations

import typing as t

from ansible._internal._datatag._tags import Origin


def origin(value: object) -> str | None:
    """Return the origin of the value, if any, otherwise `None`."""
    origin_tag = Origin.get_tag(value)

    return str(origin_tag) if origin_tag else None


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(origin=origin)
@ -0,0 +1,24 @@
from __future__ import annotations

import ast

from ansible.errors import AnsibleTypeError


def python_literal_eval(value: object, ignore_errors=False) -> object:
    try:
        if isinstance(value, str):
            return ast.literal_eval(value)

        raise AnsibleTypeError("The `value` to eval must be a string.", obj=value)
    except Exception:
        if ignore_errors:
            return value

        raise


class FilterModule(object):
    @staticmethod
    def filters():
        return dict(python_literal_eval=python_literal_eval)
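The behavior of the two code paths above can be seen with a couple of direct calls (a sketch, not part of the change itself):

# A string holding a Python literal is evaluated into the corresponding structure.
assert python_literal_eval("[1, 2]") == [1, 2]

# With ignore_errors=True, anything that cannot be evaluated is returned unchanged
# (including non-string input, which otherwise raises AnsibleTypeError).
assert python_literal_eval("not a literal", ignore_errors=True) == "not a literal"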
@ -0,0 +1,33 @@
DOCUMENTATION:
  name: python_literal_eval
  version_added: "2.19"
  short_description: evaluate a Python literal expression string
  description:
    - Evaluates the input string as a Python literal expression, returning the resulting data structure.
    - Previous versions of Ansible applied this behavior to all template results in non-native Jinja mode.
    - This filter provides a way to emulate the previous behavior.
  notes:
    - Directly calls Python's C(ast.literal_eval).
  positional: _input
  options:
    _input:
      description: Python literal string expression.
      type: str
      required: true
    ignore_errors:
      description: Whether to silently ignore all errors resulting from the literal_eval operation. If true, the input is silently returned unmodified when an error occurs.
      type: bool
      default: false

EXAMPLES: |
  - name: evaluate an expression comprised only of Python literals
    assert:
      that: (another_var | ansible._protomatter.python_literal_eval)[1] == 2  # in 2.19 and later, the explicit python_literal_eval emulates the old templating behavior
    vars:
      another_var: "{{ some_var }}"  # in 2.18 and earlier, indirection through templating caused implicit literal_eval, converting the value to a list
      some_var: "[1, 2]"  # a value that looks like a Python list literal embedded in a string

RETURN:
  _value:
    description: Resulting data structure.
    type: raw
@ -0,0 +1,16 @@
from __future__ import annotations

import typing as t

from ansible.module_utils._internal._datatag import AnsibleTagHelper


def tag_names(value: object) -> list[str]:
    """Return a list of tag type names (if any) present on the given object."""
    return sorted(tag_type.__name__ for tag_type in AnsibleTagHelper.tag_types(value))


class FilterModule:
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(tag_names=tag_names)
@ -0,0 +1,17 @@
from __future__ import annotations

import typing as t

from ansible.plugins import accept_args_markers


@accept_args_markers
def true_type(obj: object) -> str:
    """Internal filter to show the true type name of the given object, not just the base type name like the `debug` filter."""
    return obj.__class__.__name__


class FilterModule(object):
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(true_type=true_type)
@ -0,0 +1,49 @@
from __future__ import annotations

import copy
import dataclasses
import typing as t

from ansible._internal._templating._jinja_common import validate_arg_type
from ansible._internal._templating._lazy_containers import _AnsibleLazyTemplateMixin
from ansible._internal._templating._transform import _type_transform_mapping
from ansible.errors import AnsibleError


def unmask(value: object, type_names: str | list[str]) -> object:
    """
    Internal filter to suppress automatic type transformation in Jinja (e.g., WarningMessageDetail, DeprecationMessageDetail, ErrorDetail).
    Lazy collection caching is in play - the first attempt to access a value in a given lazy container must be with unmasking in place, or the transformed value
    will already be cached.
    """
    validate_arg_type("type_names", type_names, (str, list))

    if isinstance(type_names, str):
        check_type_names = [type_names]
    else:
        check_type_names = type_names

    valid_type_names = {key.__name__ for key in _type_transform_mapping}
    invalid_type_names = [type_name for type_name in check_type_names if type_name not in valid_type_names]

    if invalid_type_names:
        raise AnsibleError(f'Unknown type name(s): {", ".join(invalid_type_names)}', obj=type_names)

    result: object

    if isinstance(value, _AnsibleLazyTemplateMixin):
        result = copy.copy(value)
        result._lazy_options = dataclasses.replace(
            result._lazy_options,
            unmask_type_names=result._lazy_options.unmask_type_names | frozenset(check_type_names),
        )
    else:
        result = value

    return result


class FilterModule(object):
    @staticmethod
    def filters() -> dict[str, t.Callable]:
        return dict(unmask=unmask)
@ -0,0 +1,21 @@
from __future__ import annotations

from ansible.plugins.lookup import LookupBase


class LookupModule(LookupBase):
    """Specialized config lookup that applies data transformations on values that config cannot."""

    def run(self, terms, variables=None, **kwargs):
        if not terms or not (config_name := terms[0]):
            raise ValueError("config name is required")

        match config_name:
            case 'DISPLAY_TRACEBACK':
                # since config can't expand this yet, we need the post-processed version
                from ansible.module_utils._internal._traceback import traceback_for

                return traceback_for()
            # DTFIX-FUTURE: plumb through normal config fallback
            case _:
                raise ValueError(f"Unknown config name {config_name!r}.")
@ -0,0 +1,2 @@
DOCUMENTATION:
  name: config
@ -0,0 +1,15 @@
from __future__ import annotations

import typing as t

from ansible.module_utils._internal import _datatag


def tagged(value: t.Any) -> bool:
    return bool(_datatag.AnsibleTagHelper.tag_types(value))


class TestModule:
    @staticmethod
    def tests() -> dict[str, t.Callable]:
        return dict(tagged=tagged)
@ -0,0 +1,19 @@
DOCUMENTATION:
  name: tagged
  author: Ansible Core
  version_added: "2.19"
  short_description: does the value have a data tag
  description:
    - Check if the provided value has a data tag.
  options:
    _input:
      description: A value.
      type: raw

EXAMPLES: |
  is_data_tagged: "{{ my_variable is ansible._protomatter.tagged }}"

RETURN:
  _value:
    description: Returns C(True) if the value has one or more data tags, otherwise C(False).
    type: boolean
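Since the `tagged` test is a thin wrapper over `AnsibleTagHelper.tag_types`, its behavior is easy to confirm outside of Jinja as well; a sketch combining it with the `apply_trust` and `tag_names` plugins defined earlier in this change (assuming those modules are importable together):

plain = "{{ not_trusted }}"
trusted = apply_trust(plain)

assert tagged(trusted) is True   # the copy carries a TrustedAsTemplate tag
assert tagged(plain) is False    # the original string remains untagged
assert tag_names(trusted) == ['TrustedAsTemplate']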
@ -0,0 +1,18 @@
from __future__ import annotations

import typing as t

from ansible.module_utils._internal import _datatag


def tagged_with(value: t.Any, tag_name: str) -> bool:
    if tag_type := _datatag._known_tag_type_map.get(tag_name):
        return tag_type.is_tagged_on(value)

    raise ValueError(f"Unknown tag name {tag_name!r}.")


class TestModule:
    @staticmethod
    def tests() -> dict[str, t.Callable]:
        return dict(tagged_with=tagged_with)
@ -0,0 +1,19 @@
DOCUMENTATION:
  name: tagged_with
  author: Ansible Core
  version_added: "2.19"
  short_description: does the value have the specified data tag
  description:
    - Check if the provided value has the specified data tag.
  options:
    _input:
      description: A value.
      type: raw

EXAMPLES: |
  is_data_tagged: "{{ my_variable is ansible._protomatter.tagged_with('Origin') }}"

RETURN:
  _value:
    description: Returns C(True) if the value has the specified data tag, otherwise C(False).
    type: boolean
@ -1,138 +0,0 @@
# (c) 2012-2014, Michael DeHaan <michael.dehaan@gmail.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.

from __future__ import annotations

__all__ = [
    'YAML_SYNTAX_ERROR',
    'YAML_POSITION_DETAILS',
    'YAML_COMMON_DICT_ERROR',
    'YAML_COMMON_UNQUOTED_VARIABLE_ERROR',
    'YAML_COMMON_UNQUOTED_COLON_ERROR',
    'YAML_COMMON_PARTIALLY_QUOTED_LINE_ERROR',
    'YAML_COMMON_UNBALANCED_QUOTES_ERROR',
]

YAML_SYNTAX_ERROR = """\
Syntax Error while loading YAML.
  %s"""

YAML_POSITION_DETAILS = """\
The error appears to be in '%s': line %s, column %s, but may
be elsewhere in the file depending on the exact syntax problem.
"""

YAML_COMMON_DICT_ERROR = """\
This one looks easy to fix. YAML thought it was looking for the start of a
hash/dictionary and was confused to see a second "{". Most likely this was
meant to be an ansible template evaluation instead, so we have to give the
parser a small hint that we wanted a string instead. The solution here is to
just quote the entire value.

For instance, if the original line was:

    app_path: {{ base_path }}/foo

It should be written as:

    app_path: "{{ base_path }}/foo"
"""

YAML_COMMON_UNQUOTED_VARIABLE_ERROR = """\
We could be wrong, but this one looks like it might be an issue with
missing quotes. Always quote template expression brackets when they
start a value. For instance:

    with_items:
      - {{ foo }}

Should be written as:

    with_items:
      - "{{ foo }}"
"""

YAML_COMMON_UNQUOTED_COLON_ERROR = """\
This one looks easy to fix. There seems to be an extra unquoted colon in the line
and this is confusing the parser. It was only expecting to find one free
colon. The solution is just add some quotes around the colon, or quote the
entire line after the first colon.

For instance, if the original line was:

    copy: src=file.txt dest=/path/filename:with_colon.txt

It can be written as:

    copy: src=file.txt dest='/path/filename:with_colon.txt'

Or:

    copy: 'src=file.txt dest=/path/filename:with_colon.txt'
"""

YAML_COMMON_PARTIALLY_QUOTED_LINE_ERROR = """\
This one looks easy to fix. It seems that there is a value started
with a quote, and the YAML parser is expecting to see the line ended
with the same kind of quote. For instance:

    when: "ok" in result.stdout

Could be written as:

    when: '"ok" in result.stdout'

Or equivalently:

    when: "'ok' in result.stdout"
"""

YAML_COMMON_UNBALANCED_QUOTES_ERROR = """\
We could be wrong, but this one looks like it might be an issue with
unbalanced quotes. If starting a value with a quote, make sure the
line ends with the same set of quotes. For instance this arbitrary
example:

    foo: "bad" "wolf"

Could be written as:

    foo: '"bad" "wolf"'
"""

YAML_COMMON_LEADING_TAB_ERROR = """\
There appears to be a tab character at the start of the line.

YAML does not use tabs for formatting. Tabs should be replaced with spaces.

For example:
    - name: update tooling
      vars:
        version: 1.2.3
# ^--- there is a tab there.

Should be written as:
    - name: update tooling
      vars:
        version: 1.2.3
# ^--- all spaces here.
"""

YAML_AND_SHORTHAND_ERROR = """\
There appears to be both 'k=v' shorthand syntax and YAML in this task. \
Only one syntax may be used.
"""
@ -1,44 +0,0 @@
# (c) 2016 - Red Hat, Inc. <info@ansible.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.

from __future__ import annotations

import multiprocessing.synchronize

from ansible.utils.multiprocessing import context as multiprocessing_context

from ansible.module_utils.facts.system.pkg_mgr import PKG_MGRS

if 'action_write_locks' not in globals():
    # Do not initialize this more than once because it seems to bash
    # the existing one. multiprocessing must be reloading the module
    # when it forks?
    action_write_locks: dict[str | None, multiprocessing.synchronize.Lock] = dict()

    # Below is a Lock for use when we weren't expecting a named module. It gets used when an action
    # plugin invokes a module whose name does not match with the action's name. Slightly less
    # efficient as all processes with unexpected module names will wait on this lock
    action_write_locks[None] = multiprocessing_context.Lock()

    # These plugins are known to be called directly by action plugins with names differing from the
    # action plugin name. We precreate them here as an optimization.
    # If a list of service managers is created in the future we can do the same for them.
    mods = set(p['name'] for p in PKG_MGRS)

    mods.update(('copy', 'file', 'setup', 'slurp', 'stat'))
    for mod_name in mods:
        action_write_locks[mod_name] = multiprocessing_context.Lock()
@ -0,0 +1,55 @@
from __future__ import annotations

import collections.abc as c
import typing as t


# DTFIX-RELEASE: bikeshed "intermediate"
INTERMEDIATE_MAPPING_TYPES = (c.Mapping,)
"""
Mapping types which are supported for recursion and runtime usage, such as in serialization and templating.
These will be converted to a simple Python `dict` before serialization or storage as a variable.
"""

INTERMEDIATE_ITERABLE_TYPES = (tuple, set, frozenset, c.Sequence)
"""
Iterable types which are supported for recursion and runtime usage, such as in serialization and templating.
These will be converted to a simple Python `list` before serialization or storage as a variable.
CAUTION: Scalar types which are sequences should be excluded when using this.
"""

ITERABLE_SCALARS_NOT_TO_ITERATE_FIXME = (str, bytes)
"""Scalars which are also iterable, and should thus be excluded from iterable checks."""


def is_intermediate_mapping(value: object) -> bool:
    """Returns `True` if `value` is a type supported for projection to a Python `dict`, otherwise returns `False`."""
    # DTFIX-RELEASE: bikeshed name
    return isinstance(value, INTERMEDIATE_MAPPING_TYPES)


def is_intermediate_iterable(value: object) -> bool:
    """Returns `True` if `value` is a type supported for projection to a Python `list`, otherwise returns `False`."""
    # DTFIX-RELEASE: bikeshed name
    return isinstance(value, INTERMEDIATE_ITERABLE_TYPES) and not isinstance(value, ITERABLE_SCALARS_NOT_TO_ITERATE_FIXME)


is_controller: bool = False
"""Set to True automatically when this module is imported into an Ansible controller context."""


def get_controller_serialize_map() -> dict[type, t.Callable]:
    """
    Called to augment serialization maps.
    This implementation is replaced with the one from ansible._internal in controller contexts.
    """
    return {}


def import_controller_module(_module_name: str, /) -> t.Any:
    """
    Called to conditionally import the named module in a controller context, otherwise returns `None`.
    This implementation is replaced with the one from ansible._internal in controller contexts.
    """
    return None
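The docstrings above describe which shapes qualify for projection; the distinction is easiest to see with a few direct calls (a sketch, not part of the change itself):

# Mappings and non-scalar iterables qualify for projection; iterable scalars (str/bytes) do not.
assert is_intermediate_mapping({'a': 1}) is True
assert is_intermediate_iterable((1, 2)) is True
assert is_intermediate_iterable(frozenset({1, 2})) is True
assert is_intermediate_iterable('abc') is False   # str is a Sequence, but is treated as a scalar
assert is_intermediate_mapping([1, 2]) is False   # a list is iterable, not a mapping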
@ -0,0 +1,58 @@
# Copyright (c) 2024 Ansible Project
# Simplified BSD License (see licenses/simplified_bsd.txt or https://opensource.org/licenses/BSD-2-Clause)

from __future__ import annotations

import contextlib
import contextvars

# deprecated: description='typing.Self exists in Python 3.11+' python_version='3.10'
from ..compat import typing as t


class AmbientContextBase:
    """
    An abstract base context manager that, once entered, will be accessible via its `current` classmethod to any code in the same
    `contextvars` context (e.g. same thread/coroutine), until it is exited.
    """

    __slots__ = ('_contextvar_token',)

    # DTFIX-FUTURE: subclasses need to be able to opt-in to blocking nested contexts of the same type (basically optional per-callstack singleton behavior)
    # DTFIX-RELEASE: this class should enforce strict nesting of contexts; overlapping context lifetimes leads to incredibly difficult to
    #  debug situations with undefined behavior, so it should fail fast.
    # DTFIX-RELEASE: make frozen=True dataclass subclasses work (fix the mutability of the contextvar instance)

    _contextvar: t.ClassVar[contextvars.ContextVar]  # pylint: disable=declare-non-slot  # pylint bug, see https://github.com/pylint-dev/pylint/issues/9950
    _contextvar_token: contextvars.Token

    def __init_subclass__(cls, **kwargs) -> None:
        cls._contextvar = contextvars.ContextVar(cls.__name__)

    @classmethod
    def when(cls, condition: bool, /, *args, **kwargs) -> t.Self | contextlib.nullcontext:
        """Return an instance of the context if `condition` is `True`, otherwise return a `nullcontext` instance."""
        return cls(*args, **kwargs) if condition else contextlib.nullcontext()

    @classmethod
    def current(cls, optional: bool = False) -> t.Self | None:
        """
        Return the currently active context value for the current thread or coroutine.
        Raises ReferenceError if a context is not active, unless `optional` is `True`.
        """
        try:
            return cls._contextvar.get()
        except LookupError:
            if optional:
                return None

            raise ReferenceError(f"A required {cls.__name__} context is not active.") from None

    def __enter__(self) -> t.Self:
        # DTFIX-RELEASE: actively block multiple entry
        self._contextvar_token = self.__class__._contextvar.set(self)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        self.__class__._contextvar.reset(self._contextvar_token)
        del self._contextvar_token
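The class docstring above describes the intended pattern; this sketch (not part of the change itself, with a hypothetical `RequestContext` subclass) shows the shape of it, assuming `AmbientContextBase` is in scope:

class RequestContext(AmbientContextBase):
    __slots__ = ('request_id',)

    def __init__(self, request_id: str) -> None:
        self.request_id = request_id


def who_am_i() -> str:
    # Any code running while the context manager is entered can reach the active instance.
    return RequestContext.current().request_id


with RequestContext('req-42'):
    assert who_am_i() == 'req-42'

# Outside the `with` block the context is gone: current() raises ReferenceError unless optional=True.
assert RequestContext.current(optional=True) is None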
@ -0,0 +1,133 @@
# Copyright (c) 2024 Ansible Project
# Simplified BSD License (see licenses/simplified_bsd.txt or https://opensource.org/licenses/BSD-2-Clause)

"""Support code for exclusive use by the AnsiballZ wrapper."""

from __future__ import annotations

import atexit
import dataclasses
import importlib.util
import json
import os
import runpy
import sys
import typing as t

from . import _errors
from ._plugin_exec_context import PluginExecContext, HasPluginInfo
from .. import basic
from ..common.json import get_module_encoder, Direction
from ..common.messages import PluginInfo


def run_module(
    *,
    json_params: bytes,
    profile: str,
    plugin_info_dict: dict[str, object],
    module_fqn: str,
    modlib_path: str,
    init_globals: dict[str, t.Any] | None = None,
    coverage_config: str | None = None,
    coverage_output: str | None = None,
) -> None:  # pragma: nocover
    """Used internally by the AnsiballZ wrapper to run an Ansible module."""
    try:
        _enable_coverage(coverage_config, coverage_output)
        _run_module(
            json_params=json_params,
            profile=profile,
            plugin_info_dict=plugin_info_dict,
            module_fqn=module_fqn,
            modlib_path=modlib_path,
            init_globals=init_globals,
        )
    except Exception as ex:  # not BaseException, since modules are expected to raise SystemExit
        _handle_exception(ex, profile)


def _enable_coverage(coverage_config: str | None, coverage_output: str | None) -> None:  # pragma: nocover
    """Bootstrap `coverage` for the current Ansible module invocation."""
    if not coverage_config:
        return

    if coverage_output:
        # Enable code coverage analysis of the module.
        # This feature is for internal testing and may change without notice.
        python_version_string = '.'.join(str(v) for v in sys.version_info[:2])
        os.environ['COVERAGE_FILE'] = f'{coverage_output}=python-{python_version_string}=coverage'

        import coverage

        cov = coverage.Coverage(config_file=coverage_config)

        def atexit_coverage():
            cov.stop()
            cov.save()

        atexit.register(atexit_coverage)

        cov.start()
    else:
        # Verify coverage is available without importing it.
        # This will detect when a module would fail with coverage enabled with minimal overhead.
        if importlib.util.find_spec('coverage') is None:
            raise RuntimeError('Could not find the `coverage` Python module.')


def _run_module(
    *,
    json_params: bytes,
    profile: str,
    plugin_info_dict: dict[str, object],
    module_fqn: str,
    modlib_path: str,
    init_globals: dict[str, t.Any] | None = None,
) -> None:
    """Used internally by `run_module` to run an Ansible module after coverage has been enabled (if applicable)."""
    basic._ANSIBLE_ARGS = json_params
    basic._ANSIBLE_PROFILE = profile

    init_globals = init_globals or {}
    init_globals.update(_module_fqn=module_fqn, _modlib_path=modlib_path)

    with PluginExecContext(_ModulePluginWrapper(PluginInfo._from_dict(plugin_info_dict))):
        # Run the module. By importing it as '__main__', it executes as a script.
        runpy.run_module(mod_name=module_fqn, init_globals=init_globals, run_name='__main__', alter_sys=True)

    # An Ansible module must print its own results and exit. If execution reaches this point, that did not happen.
    raise RuntimeError('New-style module did not handle its own exit.')


def _handle_exception(exception: BaseException, profile: str) -> t.NoReturn:
    """Handle the given exception."""
    result = dict(
        failed=True,
        exception=_errors.create_error_summary(exception),
    )

    encoder = get_module_encoder(profile, Direction.MODULE_TO_CONTROLLER)

    print(json.dumps(result, cls=encoder))  # pylint: disable=ansible-bad-function

    sys.exit(1)  # pylint: disable=ansible-bad-function


@dataclasses.dataclass(frozen=True)
class _ModulePluginWrapper(HasPluginInfo):
    """Modules aren't plugin instances; this adapter implements the `HasPluginInfo` protocol to allow `PluginExecContext` infra to work with modules."""

    plugin: PluginInfo

    @property
    def _load_name(self) -> str:
        return self.plugin.requested_name

    @property
    def ansible_name(self) -> str:
        return self.plugin.resolved_name

    @property
    def plugin_type(self) -> str:
        return self.plugin.type
@ -0,0 +1,64 @@
"""Patch broken ClassVar support in dataclasses when ClassVar is accessed via a module other than `typing`."""

# deprecated: description='verify ClassVar support in dataclasses has been fixed in Python before removing this patching code', python_version='3.12'

from __future__ import annotations

import dataclasses
import sys
import typing as t

# trigger the bug by exposing typing.ClassVar via a module reference that is not `typing`
_ts = sys.modules[__name__]
ClassVar = t.ClassVar


def patch_dataclasses_is_type() -> None:
    if not _is_patch_needed():
        return  # pragma: nocover

    try:
        real_is_type = dataclasses._is_type  # type: ignore[attr-defined]
    except AttributeError:  # pragma: nocover
        raise RuntimeError("unable to patch broken dataclasses ClassVar support") from None

    # patch dataclasses._is_type - impl from https://github.com/python/cpython/blob/4c6d4f5cb33e48519922d635894eef356faddba2/Lib/dataclasses.py#L709-L765
    def _is_type(annotation, cls, a_module, a_type, is_type_predicate):
        match = dataclasses._MODULE_IDENTIFIER_RE.match(annotation)  # type: ignore[attr-defined]
        if match:
            ns = None
            module_name = match.group(1)
            if not module_name:
                # No module name, assume the class's module did
                # "from dataclasses import InitVar".
                ns = sys.modules.get(cls.__module__).__dict__
            else:
                # Look up module_name in the class's module.
                module = sys.modules.get(cls.__module__)
                if module and module.__dict__.get(module_name):  # this is the patched line; removed `is a_module`
                    ns = sys.modules.get(a_type.__module__).__dict__
            if ns and is_type_predicate(ns.get(match.group(2)), a_module):
                return True
        return False

    _is_type._orig_impl = real_is_type  # type: ignore[attr-defined]  # stash this away to allow unit tests to undo the patch

    dataclasses._is_type = _is_type  # type: ignore[attr-defined]

    try:
        if _is_patch_needed():
            raise RuntimeError("patching had no effect")  # pragma: nocover
    except Exception as ex:  # pragma: nocover
        dataclasses._is_type = real_is_type  # type: ignore[attr-defined]
        raise RuntimeError("dataclasses ClassVar support is still broken after patching") from ex


def _is_patch_needed() -> bool:
    @dataclasses.dataclass
    class CheckClassVar:
        # this is the broken case requiring patching: ClassVar dot-referenced from a module that is not `typing` is treated as an instance field
        # DTFIX-RELEASE: add link to CPython bug report to-be-filed (or update associated deprecation comments if we don't)
        a_classvar: _ts.ClassVar[int]  # type: ignore[name-defined]
        a_field: int

    return len(dataclasses.fields(CheckClassVar)) != 1