Compare commits

...

24 Commits

Author SHA1 Message Date
ShIRann 9335ef980b update: latest devel branch 1 month ago
ShIRann fa6df9950b change: extract algo relative options to docfragment 1 month ago
Abhijeet Kasurde 802e95f580 distro: remove pep8 ignore
* Remove unnecessary pep8 from ignore.txt

Fixes: #80840

Signed-off-by: Abhijeet Kasurde <akasurde@redhat.com>
1 month ago
Abhijeet Kasurde 628ce5a62e assemble: update argument_spec with 'decrypt' option
* decrypt option is used by assemble action plugin.
  Add this parameter to remove failure raised by
  validate-modules:nonexistent-parameter-documented

Fixes: #80840

Signed-off-by: Abhijeet Kasurde <akasurde@redhat.com>
1 month ago
Alex Willmer d5edb77db4 Add description of ansible.utils.path.unfrackpath() basedir argument 1 month ago
Brian Coca c18e755b82 removed extra ansible_managed 1 month ago
Sloane Hertel e84240db84
Fix installing roles containing symlinks (#82911)
* Fix installing roles containing symlinks

Fix sanitizing tarfile symlinks relative to the link directory instead of the archive

For example:

role
├── handlers
│   └── utils.yml -> ../tasks/utils/suite.yml

The link ../tasks/utils/suite.yml will resolve to a path outside of the link's directory, but within the role

role/handlers/../tasks/utils/suite.yml

The resolved path relative to the role is tasks/utils/suite.yml, but if the symlink is set to that value, tarfile would extract it from role/handlers/tasks/utils/suite.yml.
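A minimal sketch of this path arithmetic, assuming a hypothetical role root at /tmp/role: resolving the link target relative to its containing directory stays inside the role, while naively re-joining the role-relative path under the link directory points at a nonexistent location.

```python
import os

role = "/tmp/role"                       # hypothetical role root
link_dir = os.path.join(role, "handlers")
target = "../tasks/utils/suite.yml"      # linkname stored in the tar member

# Symlinks resolve relative to the directory containing the link
resolved = os.path.normpath(os.path.join(link_dir, target))
# -> "/tmp/role/tasks/utils/suite.yml", inside the role

# But setting the linkname to the role-relative path makes tarfile
# resolve it under the link's directory instead:
wrong = os.path.join(link_dir, "tasks/utils/suite.yml")
# -> "/tmp/role/handlers/tasks/utils/suite.yml", which does not exist
```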

* Replace overly forgiving test case with tests for a symlink in a subdirectory of the archive and a symlink in the archive dir when these are not equivalent.

* Build test case from role files to make it easier to add test cases

Fixes #82702
Fixes #81965
Fixes #82051
1 month ago
Brian Coca 124d03145c
inspect components, ansible_managed templatable (#83053)
* inspect components, ansible_managed templatable

fixes #82322
1 month ago
Hrishikesh Mishra 718edde465
Update distro version in __init__.py (#83114)
Update the bundled package distro from 1.6.0 to 1.9.0 in __init__.py

Co-authored-by: Abhijeet Kasurde <akasurde@redhat.com>
1 month ago
Sven 33d1224e83
password_hash: update docs about bcrypt (#81675)
Signed-off-by: Sven Kieske <kieske@osism.tech>
1 month ago
Martin Krizek 87bead3dcf
setup_rpm_repo/create_repo: "Arch dependent binaries in noarch package" (#83108)
This fixes the "Arch dependent binaries in noarch package" error caused by
including files created by the make_elf function in noarch packages. While the
error only manifests itself on EL 7 and 8, it is better to use files
suitable for noarch packages to prevent the error from potentially
re-occurring in the future.
2 months ago
Thomas Sjögren 7f93f6171d
add systemd version and features fact (#83083)
Signed-off-by: Thomas Sjögren <konstruktoid@users.noreply.github.com>
2 months ago
Abhijeet Kasurde 5dac5d365a
systemd_service: handle failure when mask operation fails (#83079)
* Handle the mask operation failure instead of just
  marking state changed.

Fixes: #81649

Signed-off-by: Abhijeet Kasurde <akasurde@redhat.com>
2 months ago
Thomas Sjögren ae2234f185
add countme option to yum_repository (#82831)
* add countme option to yum_repository

Signed-off-by: Thomas Sjögren <konstruktoid@users.noreply.github.com>

* Update lib/ansible/modules/yum_repository.py

Co-authored-by: Abhijeet Kasurde <akasurde@redhat.com>

* add changelog fragment

Signed-off-by: Thomas Sjögren <konstruktoid@users.noreply.github.com>

* add version_added

Signed-off-by: Thomas Sjögren <konstruktoid@users.noreply.github.com>

* Update lib/ansible/modules/yum_repository.py

Co-authored-by: Sloane Hertel <19572925+s-hertel@users.noreply.github.com>

* Update lib/ansible/modules/yum_repository.py

* Update lib/ansible/modules/yum_repository.py

Co-authored-by: Martin Krizek <martin.krizek@gmail.com>

* Update lib/ansible/modules/yum_repository.py

---------

Signed-off-by: Thomas Sjögren <konstruktoid@users.noreply.github.com>
Co-authored-by: Abhijeet Kasurde <akasurde@redhat.com>
Co-authored-by: Sloane Hertel <19572925+s-hertel@users.noreply.github.com>
Co-authored-by: Martin Krizek <martin.krizek@gmail.com>
2 months ago
Abhijeet Kasurde 6b3bab6476
plugin: fixed examples of csv lookup plugin (#83068)
Fixes: #83031

Signed-off-by: Abhijeet Kasurde <akasurde@redhat.com>
2 months ago
Abhijeet Kasurde 8c3c694472
test: remove ansible-examples.git repo (#81600)
* test: remove ansible-examples.git repo

* To speed up git tests remove reference to ansible-examples.git

Fixes: #81327

* Make CI green

Signed-off-by: Ansible Test Runner <noreply@example.com>
Co-authored-by: Ansible Test Runner <noreply@example.com>
Co-authored-by: Sviatoslav Sydorenko <wk.cvs.github@sydorenko.org.ua>
2 months ago
Brian Coca 0c51a30d93
ansible-config: add 'validate' option (#83007)
We can now validate that both ansible.cfg and 'ANSIBLE_' env vars
match either core (-t base), installed plugin(s) (-t <plugin_type>), or both (-t all)
2 months ago
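The env-var side of that validation reduces to a name filter; this sketch mirrors the helper added in the PR's diff (shown later on this page), with the same prefixes and exclusions:

```python
def is_ansible_env_var(varname):
    # A variable is a candidate 'configurable' Ansible env var when it
    # starts with ANSIBLE_, is not a test/lint tooling variable, and is
    # not one of the meta variables that configure ansible itself.
    return all([
        varname.startswith("ANSIBLE_"),
        not varname.startswith(("ANSIBLE_TEST_", "ANSIBLE_LINT_")),
        varname not in ("ANSIBLE_CONFIG", "ANSIBLE_DEV_HOME"),
    ])

is_ansible_env_var("ANSIBLE_TIMEOUT")   # -> True
is_ansible_env_var("ANSIBLE_CONFIG")    # -> False (meta variable)
```

Anything passing the filter but unknown to the loaded configuration definitions is reported as an unknown environment variable.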
Matt Clay 8bc0d809a6 Update sdist path in release tool
The latest setuptools package uses a normalized package name for the sdist.
2 months ago
Martin Krizek a3cdd831b3
handlers: fix executing in lockstep using linear (#83030)
Fixes #82307
2 months ago
Ernie Hershey 82d91f0952
Make inventory --list help text wording consistent (#83035) 2 months ago
Matt Martz 57750e2cf7
Fallbacks for brand new APIs that don't exist in released dnf5 (#83022) 2 months ago
Martin Krizek 3a6f825a8e
dnf - honor installroot and substitutes in paths (#83011)
In #80094 support for var substitution for cachedir was added, but there
are more options that should be supported. Using the prepend_installroot
API, which should be used anyway, provides that feature, so use it.
In addition, perform the operation once all substitutes
are in place (releasever as well).
2 months ago
Martin Krizek 4e57249d59
dnf5: replace removed API calls (#83020)
* dnf5: replace removed API calls

bfb6f32e15
96c9188f9c

* call set_group_with_name instead of setting group_with_name

c7b88428f3

---------

Co-authored-by: Matt Martz <matt@sivel.net>
2 months ago
Abhijeet Kasurde cb81801124
facts: Add a generic detection for VMware product name (#83012)
* Use startswith instead of hardcoded values in VMware product
  detection

Signed-off-by: Abhijeet Kasurde <akasurde@redhat.com>
2 months ago

@@ -0,0 +1,2 @@
bugfixes:
- Fix handlers not being executed in lockstep using the linear strategy in some cases (https://github.com/ansible/ansible/issues/82307)

@@ -0,0 +1,2 @@
bugfixes:
- support the countme option when using yum_repository

@@ -0,0 +1,3 @@
---
bugfixes:
- lookup - Fixed examples of csv lookup plugin (https://github.com/ansible/ansible/issues/83031).

@@ -0,0 +1,2 @@
minor_changes:
- facts - add systemd version and features

@@ -0,0 +1,2 @@
minor_changes:
- ansible-config has a new 'validate' option to find misspelled/foreign configurations in ini file or environment variables.

@@ -0,0 +1,2 @@
bugfixes:
- ansible-galaxy role install - fix symlinks (https://github.com/ansible/ansible/issues/82702, https://github.com/ansible/ansible/issues/81965).

@@ -0,0 +1,2 @@
bugfixes:
- ansible_managed restored its 'templatability' by ensuring the possible injection routes are cut off earlier in the process.

@@ -0,0 +1,3 @@
---
bugfixes:
- assemble - update argument_spec with 'decrypt' option which is required by action plugin (https://github.com/ansible/ansible/issues/80840).

@@ -0,0 +1,3 @@
bugfixes:
- dnf - honor installroot for ``cachedir``, ``logdir`` and ``persistdir``
- dnf - perform variable substitutions in ``logdir`` and ``persistdir``

@@ -0,0 +1,2 @@
bugfixes:
- dnf5 - replace removed API calls

@@ -0,0 +1,3 @@
---
bugfixes:
- systemd_service - handle mask operation failure (https://github.com/ansible/ansible/issues/81649).

@@ -0,0 +1,3 @@
---
bugfixes:
- facts - add a generic detection for VMware in product name.

@@ -9,9 +9,10 @@ from __future__ import annotations
 from ansible.cli import CLI
 import os
-import yaml
 import shlex
 import subprocess
+import sys
+import yaml
 from collections.abc import Mapping
@@ -49,6 +50,37 @@ def get_constants():
     return get_constants.cvars


+def _ansible_env_vars(varname):
+    ''' return true or false depending if variable name is possibly a 'configurable' ansible env variable '''
+    return all(
+        [
+            varname.startswith("ANSIBLE_"),
+            not varname.startswith(("ANSIBLE_TEST_", "ANSIBLE_LINT_")),
+            varname not in ("ANSIBLE_CONFIG", "ANSIBLE_DEV_HOME"),
+        ]
+    )
+
+
+def _get_evar_list(settings):
+    data = []
+    for setting in settings:
+        if 'env' in settings[setting] and settings[setting]['env']:
+            for varname in settings[setting]['env']:
+                data.append(varname.get('name'))
+    return data
+
+
+def _get_ini_entries(settings):
+    data = {}
+    for setting in settings:
+        if 'ini' in settings[setting] and settings[setting]['ini']:
+            for kv in settings[setting]['ini']:
+                if not kv['section'] in data:
+                    data[kv['section']] = set()
+                data[kv['section']].add(kv['key'])
+    return data
+
+
 class ConfigCLI(CLI):

     """ Config command line class """
@@ -99,9 +131,13 @@ class ConfigCLI(CLI):
         init_parser.add_argument('--disabled', dest='commented', action='store_true', default=False,
                                  help='Prefixes all entries with a comment character to disable them')

-        # search_parser = subparsers.add_parser('find', help='Search configuration')
-        # search_parser.set_defaults(func=self.execute_search)
-        # search_parser.add_argument('args', help='Search term', metavar='<search term>')
+        validate_parser = subparsers.add_parser('validate',
+                                                help='Validate the configuration file and environment variables. '
+                                                     'By default it only checks the base settings without accounting for plugins (see -t).',
+                                                parents=[common])
+        validate_parser.set_defaults(func=self.execute_validate)
+        validate_parser.add_argument('--format', '-f', dest='format', action='store', choices=['ini', 'env'], default='ini',
+                                     help='Output format for init')

     def post_process_args(self, options):
         options = super(ConfigCLI, self).post_process_args(options)
@@ -239,6 +275,7 @@ class ConfigCLI(CLI):
             for ptype in C.CONFIGURABLE_PLUGINS:
                 config_entries['PLUGINS'][ptype.upper()] = self._list_plugin_settings(ptype)
         elif context.CLIARGS['type'] != 'base':
+            # only for requested types
             config_entries['PLUGINS'][context.CLIARGS['type']] = self._list_plugin_settings(context.CLIARGS['type'], context.CLIARGS['args'])

         return config_entries
@@ -358,7 +395,7 @@ class ConfigCLI(CLI):
             elif default is None:
                 default = ''

-            if context.CLIARGS['commented']:
+            if context.CLIARGS.get('commented', False):
                 entry['key'] = ';%s' % entry['key']

             key = desc + '\n%s=%s' % (entry['key'], default)
@@ -552,6 +589,64 @@ class ConfigCLI(CLI):

         self.pager(to_text(text, errors='surrogate_or_strict'))

+    def execute_validate(self):
+
+        found = False
+        config_entries = self._list_entries_from_args()
+        plugin_types = config_entries.pop('PLUGINS', None)
+
+        if context.CLIARGS['format'] == 'ini':
+            if C.CONFIG_FILE is not None:
+                # validate ini config since it is found
+                sections = _get_ini_entries(config_entries)
+                # Also from plugins
+                if plugin_types:
+                    for ptype in plugin_types:
+                        for plugin in plugin_types[ptype].keys():
+                            plugin_sections = _get_ini_entries(plugin_types[ptype][plugin])
+                            for s in plugin_sections:
+                                if s in sections:
+                                    sections[s].update(plugin_sections[s])
+                                else:
+                                    sections[s] = plugin_sections[s]
+
+                if sections:
+                    p = C.config._parsers[C.CONFIG_FILE]
+                    for s in p.sections():
+                        # check for valid sections
+                        if s not in sections:
+                            display.error(f"Found unknown section '{s}' in '{C.CONFIG_FILE}'.")
+                            found = True
+                            continue
+
+                        # check keys in valid sections
+                        for k in p.options(s):
+                            if k not in sections[s]:
+                                display.error(f"Found unknown key '{k}' in section '{s}' in '{C.CONFIG_FILE}'.")
+                                found = True
+
+        elif context.CLIARGS['format'] == 'env':
+            # validate any 'ANSIBLE_' env vars found
+            evars = [varname for varname in os.environ.keys() if _ansible_env_vars(varname)]
+            if evars:
+                data = _get_evar_list(config_entries)
+                if plugin_types:
+                    for ptype in plugin_types:
+                        for plugin in plugin_types[ptype].keys():
+                            data.extend(_get_evar_list(plugin_types[ptype][plugin]))
+
+                for evar in evars:
+                    if evar not in data:
+                        display.error(f"Found unknown environment variable '{evar}'.")
+                        found = True
+
+        # we found discrepancies!
+        if found:
+            sys.exit(1)
+
+        # allsgood
+        display.display("All configurations seem valid!")
+

 def main(args=None):
     ConfigCLI.cli_executor(args)

@@ -73,12 +73,12 @@ class InventoryCLI(CLI):
         # list
         self.parser.add_argument("--export", action="store_true", default=C.INVENTORY_EXPORT, dest='export',
-                                 help="When doing an --list, represent in a way that is optimized for export,"
+                                 help="When doing --list, represent in a way that is optimized for export,"
                                  "not as an accurate representation of how Ansible has processed it")
         self.parser.add_argument('--output', default=None, dest='output_file',
                                  help="When doing --list, send the inventory to a file instead of to the screen")
         # self.parser.add_argument("--ignore-vars-plugins", action="store_true", default=False, dest='ignore_vars_plugins',
-        #                          help="When doing an --list, skip vars data from vars plugins, by default, this would include group_vars/ and host_vars/")
+        #                          help="When doing --list, skip vars data from vars plugins, by default, this would include group_vars/ and host_vars/")

     def post_process_args(self, options):
         options = super(InventoryCLI, self).post_process_args(options)

@@ -427,13 +427,13 @@ class PlayIterator:
                 # might be there from previous flush
                 state.handlers = self.handlers[:]
                 state.update_handlers = False
-                state.cur_handlers_task = 0

             while True:
                 try:
                     task = state.handlers[state.cur_handlers_task]
                 except IndexError:
                     task = None
+                    state.cur_handlers_task = 0
                     state.run_state = state.pre_flushing_run_state
                     state.update_handlers = True
                     break

@@ -386,6 +386,8 @@ class GalaxyRole(object):
                 else:
                     os.makedirs(self.path)

+                resolved_archive = unfrackpath(archive_parent_dir, follow=False)
+
                 # We strip off any higher-level directories for all of the files
                 # contained within the tar file here. The default is 'github_repo-target'.
                 # Gerrit instances, on the other hand, does not have a parent directory at all.
@@ -400,33 +402,29 @@ class GalaxyRole(object):
                     if not (attr_value := getattr(member, attr, None)):
                         continue

-                    if attr_value.startswith(os.sep) and not is_subpath(attr_value, archive_parent_dir):
-                        err = f"Invalid {attr} for tarfile member: path {attr_value} is not a subpath of the role {archive_parent_dir}"
-                        raise AnsibleError(err)
-
                     if attr == 'linkname':
                         # Symlinks are relative to the link
-                        relative_to_archive_dir = os.path.dirname(getattr(member, 'name', ''))
-                        archive_dir_path = os.path.join(archive_parent_dir, relative_to_archive_dir, attr_value)
+                        relative_to = os.path.dirname(getattr(member, 'name', ''))
                     else:
                         # Normalize paths that start with the archive dir
                         attr_value = attr_value.replace(archive_parent_dir, "", 1)
                         attr_value = os.path.join(*attr_value.split(os.sep))  # remove leading os.sep
-                        archive_dir_path = os.path.join(archive_parent_dir, attr_value)
+                        relative_to = ''

-                    resolved_archive = unfrackpath(archive_parent_dir)
-                    resolved_path = unfrackpath(archive_dir_path)
-                    if not is_subpath(resolved_path, resolved_archive):
-                        err = f"Invalid {attr} for tarfile member: path {resolved_path} is not a subpath of the role {resolved_archive}"
+                    full_path = os.path.join(resolved_archive, relative_to, attr_value)
+                    if not is_subpath(full_path, resolved_archive, real=True):
+                        err = f"Invalid {attr} for tarfile member: path {full_path} is not a subpath of the role {resolved_archive}"
                         raise AnsibleError(err)

-                    relative_path = os.path.join(*resolved_path.replace(resolved_archive, "", 1).split(os.sep)) or '.'
+                    relative_path_dir = os.path.join(resolved_archive, relative_to)
+                    relative_path = os.path.join(*full_path.replace(relative_path_dir, "", 1).split(os.sep))
                     setattr(member, attr, relative_path)

                     if _check_working_data_filter():
                         # deprecated: description='extract fallback without filter' python_version='3.11'
                         role_tar_file.extract(member, to_native(self.path), filter='data')  # type: ignore[call-arg]
                     else:
+                        # Remove along with manual path filter once Python 3.12 is minimum supported version
                         role_tar_file.extract(member, to_native(self.path))

             # write out the install info file for later use

@@ -22,7 +22,7 @@ Compat distro library.
 from __future__ import annotations

 # The following makes it easier for us to script updates of the bundled code
-_BUNDLED_METADATA = {"pypi_name": "distro", "version": "1.6.0"}
+_BUNDLED_METADATA = {"pypi_name": "distro", "version": "1.8.0"}

 # The following additional changes have been made:
 # * Remove optparse since it is not needed for our use.

@@ -53,6 +53,7 @@ from ansible.module_utils.facts.system.python import PythonFactCollector
 from ansible.module_utils.facts.system.selinux import SelinuxFactCollector
 from ansible.module_utils.facts.system.service_mgr import ServiceMgrFactCollector
 from ansible.module_utils.facts.system.ssh_pub_keys import SshPubKeyFactCollector
+from ansible.module_utils.facts.system.systemd import SystemdFactCollector
 from ansible.module_utils.facts.system.user import UserFactCollector

 from ansible.module_utils.facts.hardware.base import HardwareCollector
@@ -118,7 +119,8 @@ _general = [
     EnvFactCollector,
     LoadAvgFactCollector,
     SshPubKeyFactCollector,
-    UserFactCollector
+    UserFactCollector,
+    SystemdFactCollector
 ]  # type: t.List[t.Type[BaseFactCollector]]

 # virtual, this might also limit hardware/networking

@@ -0,0 +1,47 @@
# Get systemd version and features
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.

from __future__ import annotations

import ansible.module_utils.compat.typing as t

from ansible.module_utils.facts.collector import BaseFactCollector
from ansible.module_utils.facts.system.service_mgr import ServiceMgrFactCollector


class SystemdFactCollector(BaseFactCollector):
    name = "systemd"
    _fact_ids = set()  # type: t.Set[str]

    def collect(self, module=None, collected_facts=None):
        systemctl_bin = module.get_bin_path("systemctl")
        if systemctl_bin and ServiceMgrFactCollector.is_systemd_managed(module):
            rc, stdout, stderr = module.run_command(
                [systemctl_bin, "--version"],
                check_rc=False,
            )

            systemd_facts = {}

            if rc != 0:
                return systemd_facts

            systemd_facts["systemd"] = {}
            systemd_facts["systemd"]["features"] = str(stdout.split("\n")[1])
            systemd_facts["systemd"]["version"] = int(stdout.split(" ")[1])

            return systemd_facts
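The collector reads only the first two lines of `systemctl --version`: the second whitespace-separated token of line one is the version, and line two is the feature-flag list. A minimal sketch of that parsing, with a hypothetical sample of the command's output:

```python
# Hypothetical `systemctl --version` output (format assumed from the collector above)
stdout = (
    "systemd 252 (252.22-1)\n"
    "+PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP default-hierarchy=unified"
)

version = int(stdout.split(" ")[1])    # "252" -> 252
features = str(stdout.split("\n")[1])  # the +FLAG/-FLAG feature line
```

Note the version parse assumes the first line always has the form `systemd <number> ...`; a distro-patched version string with a non-numeric second token would raise ValueError.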

@@ -175,7 +175,7 @@ class LinuxVirtual(Virtual):
                 virtual_facts['virtualization_type'] = 'RHEV'
                 found_virt = True

-            if product_name in ('VMware Virtual Platform', 'VMware7,1', 'VMware20,1'):
+            if product_name and product_name.startswith(("VMware",)):
                 guest_tech.add('VMware')
                 if not found_virt:
                     virtual_facts['virtualization_type'] = 'VMware'

@@ -205,6 +205,11 @@ def main():
             regexp=dict(type='str'),
             ignore_hidden=dict(type='bool', default=False),
             validate=dict(type='str'),
+            # Options that are for the action plugin, but ignored by the module itself.
+            # We have them here so that the tests pass without ignores, which
+            # reduces the likelihood of further bugs added.
+            decrypt=dict(type='bool', default=True),
         ),
         add_file_common_args=True,
     )

@@ -403,7 +403,6 @@ from ansible.module_utils.yumdnf import YumDnf, yumdnf_argument_spec
 # to set proper locale before importing dnf to be able to scrape
 # the output in some cases (FIXME?).
 dnf = None
-libdnf = None


 class DnfModule(YumDnf):
@@ -484,7 +483,6 @@ class DnfModule(YumDnf):
         os.environ['LANGUAGE'] = os.environ['LANG'] = locale

         global dnf
-        global libdnf
         try:
             import dnf
             import dnf.const
@@ -492,7 +490,6 @@ class DnfModule(YumDnf):
             import dnf.package
             import dnf.subject
             import dnf.util
-            import libdnf
             HAS_DNF = True
         except ImportError:
             HAS_DNF = False
@@ -560,9 +557,6 @@ class DnfModule(YumDnf):
         # Load substitutions from the filesystem
         conf.substitutions.update_from_etc(installroot)

-        # Substitute variables in cachedir path
-        conf.cachedir = libdnf.conf.ConfigParser.substitute(conf.cachedir, conf.substitutions)
-
         # Handle different DNF versions immutable mutable datatypes and
         # dnf v1/v2/v3
         #
@@ -596,6 +590,11 @@ class DnfModule(YumDnf):
             # setting this to an empty string instead of None appears to mimic the DNF CLI behavior
             conf.substitutions['releasever'] = ''

+        # Honor installroot for dnf directories
+        # This will also perform variable substitutions in the paths
+        for opt in ('cachedir', 'logdir', 'persistdir'):
+            conf.prepend_installroot(opt)
+
         # Set skip_broken (in dnf this is strict=0)
         if self.skip_broken:
             conf.strict = 0

@@ -496,7 +496,7 @@ class Dnf5Module(YumDnf):
             conf.config_file_path = self.conf_file

         try:
-            base.load_config_from_file()
+            base.load_config()
         except RuntimeError as e:
             self.module.fail_json(
                 msg=str(e),
@@ -536,7 +536,8 @@ class Dnf5Module(YumDnf):
         log_router = base.get_logger()
         global_logger = libdnf5.logger.GlobalLogger()
         global_logger.set(log_router.get(), libdnf5.logger.Logger.Level_DEBUG)
-        logger = libdnf5.logger.create_file_logger(base)
+        # FIXME hardcoding the filename does not seem right, should libdnf5 expose the default file name?
+        logger = libdnf5.logger.create_file_logger(base, "dnf5.log")
         log_router.add_logger(logger)

         if self.update_cache:
@@ -561,7 +562,11 @@ class Dnf5Module(YumDnf):
             for repo in repo_query:
                 repo.enable()

-        sack.update_and_load_enabled_repos(True)
+        try:
+            sack.load_repos()
+        except AttributeError:
+            # dnf5 < 5.2.0.0
+            sack.update_and_load_enabled_repos(True)

         if self.update_cache and not self.names and not self.list:
             self.module.exit_json(
@@ -593,7 +598,11 @@ class Dnf5Module(YumDnf):
             self.module.exit_json(msg="", results=results, rc=0)

         settings = libdnf5.base.GoalJobSettings()
-        settings.group_with_name = True
+        try:
+            settings.set_group_with_name(True)
+        except AttributeError:
+            # dnf5 < 5.2.0.0
+            settings.group_with_name = True

         if self.bugfix or self.security:
             advisory_query = libdnf5.advisory.AdvisoryQuery(base)
             types = []
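The try/except AttributeError pattern above (prefer the new setter, fall back for older dnf5) can be reduced to a generic sketch; the `OldSettings` stub here is hypothetical, standing in for a pre-5.2 libdnf5 settings object that lacks the setter:

```python
class OldSettings:
    # Hypothetical stand-in for an older libdnf5 GoalJobSettings:
    # exposes the plain attribute but not the set_* method.
    group_with_name = False

def enable_group_with_name(settings):
    try:
        settings.set_group_with_name(True)   # newer API (dnf5 >= 5.2.0.0)
    except AttributeError:
        settings.group_with_name = True      # older API fallback
    return settings

s = enable_group_with_name(OldSettings())
```

Probing with AttributeError keeps one code path working across library versions without importing version metadata.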

@@ -29,6 +29,9 @@ options:
       - You can choose seconds, minutes, hours, days, or weeks by specifying the
         first letter of any of those words (e.g., "1w").
     type: str
+  get_checksum:
+    default: no
+    version_added: 2.18
   patterns:
     default: []
     description:
@@ -131,11 +134,6 @@ options:
       - Set this to V(true) to follow symlinks in path for systems with python 2.6+.
     type: bool
     default: no
-  get_checksum:
-    description:
-      - Set this to V(true) to retrieve a file's SHA1 checksum.
-    type: bool
-    default: no
   use_regex:
     description:
       - If V(false), the patterns are file globs (shell).
@@ -154,19 +152,7 @@ options:
       - When doing a C(contains) search, determine the encoding of the files to be searched.
     type: str
     version_added: "2.17"
-  checksum_algorithm:
-    description:
-      - Algorithm to determine checksum of file.
-      - Will throw an error if the host is unable to use specified algorithm.
-      - The remote host has to support the hashing method specified, V(md5)
-        can be unavailable if the host is FIPS-140 compliant.
-      - Availability might be restricted by the target system, for example FIPS systems won't allow md5 use
-    type: str
-    choices: [ md5, sha1, sha224, sha256, sha384, sha512 ]
-    default: sha1
-    aliases: [ checksum, checksum_algo ]
-    version_added: "2.18"
-extends_documentation_fragment: action_common_attributes
+extends_documentation_fragment: [action_common_attributes, checksum_common]
 attributes:
   check_mode:
     details: since this action does not modify the target it just executes normally during check mode

@@ -25,7 +25,7 @@ options:
         V(processor_count), V(python), V(python_version), V(real_user_id), V(selinux), V(service_mgr),
         V(ssh_host_key_dsa_public), V(ssh_host_key_ecdsa_public), V(ssh_host_key_ed25519_public),
         V(ssh_host_key_rsa_public), V(ssh_host_pub_keys), V(ssh_pub_keys), V(system), V(system_capabilities),
-        V(system_capabilities_enforced), V(user), V(user_dir), V(user_gecos), V(user_gid), V(user_id),
+        V(system_capabilities_enforced), V(systemd), V(user), V(user_dir), V(user_gecos), V(user_gid), V(user_id),
         V(user_shell), V(user_uid), V(virtual), V(virtualization_role), V(virtualization_type).
         Can specify a list of values to specify a larger subset.
         Values can also be used with an initial C(!) to specify that

@@ -26,22 +26,8 @@ options:
     type: bool
     default: no
   get_checksum:
-    description:
-      - Whether to return a checksum of the file.
-    type: bool
     default: yes
     version_added: "1.8"
-  checksum_algorithm:
-    description:
-      - Algorithm to determine checksum of file.
-      - Will throw an error if the host is unable to use specified algorithm.
-      - The remote host has to support the hashing method specified, V(md5)
-        can be unavailable if the host is FIPS-140 compliant.
-    type: str
-    choices: [ md5, sha1, sha224, sha256, sha384, sha512 ]
-    default: sha1
-    aliases: [ checksum, checksum_algo ]
-    version_added: "2.0"
   get_mime:
     description:
       - Use file magic and return data about the nature of the file. this uses
@@ -61,6 +47,7 @@ options:
     version_added: "2.3"
 extends_documentation_fragment:
   - action_common_attributes
+  - checksum_common
 attributes:
   check_mode:
     support: full

@@ -495,6 +495,8 @@ def main():
             if rc != 0:
                 # some versions of system CAN mask/unmask non existing services, we only fail on missing if they don't
                 fail_if_missing(module, found, unit, msg='host')
+                # here if service was not missing, but failed for other reasons
+                module.fail_json(msg=f"Failed to {action} the service ({unit}): {err.strip()}")

     # Enable/disable service startup at boot if requested
     if module.params['enabled'] is not None:

@@ -50,6 +50,13 @@ options:
       - Relative cost of accessing this repository. Useful for weighing one
         repo's packages as greater/less than any other.
     type: str
+  countme:
+    description:
+      - Whether a special flag should be added to a randomly chosen metalink/mirrorlist query each week.
+        This allows the repository owner to estimate the number of systems consuming it.
+    default: ~
+    type: bool
+    version_added: '2.18'
   deltarpm_metadata_percentage:
     description:
       - When the relative size of deltarpm metadata vs pkgs is larger than
@@ -432,6 +439,7 @@ class YumRepo(object):
         'bandwidth',
         'baseurl',
         'cost',
+        'countme',
         'deltarpm_metadata_percentage',
         'deltarpm_percentage',
         'enabled',
@@ -581,6 +589,7 @@ def main():
         bandwidth=dict(),
         baseurl=dict(type='list', elements='str'),
         cost=dict(),
+        countme=dict(type='bool'),
         deltarpm_metadata_percentage=dict(),
         deltarpm_percentage=dict(),
         description=dict(),
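The countme change above adds a plain boolean repo option, and dnf expects booleans serialized as 0/1 in the generated .repo INI file. A minimal standalone sketch of that bool-to-INI mapping using configparser; the helper name `set_repo_option` is illustrative, not the module's actual code:

```python
import configparser

def set_repo_option(cfg, section, key, value):
    # dnf .repo files store booleans as 0/1, so countme: true becomes countme = 1
    if isinstance(value, bool):
        value = '1' if value else '0'
    cfg.set(section, key, str(value))

cfg = configparser.ConfigParser()
cfg.add_section('example-repo')
set_repo_option(cfg, 'example-repo', 'countme', True)
set_repo_option(cfg, 'example-repo', 'cost', 1000)
```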

@@ -0,0 +1,25 @@
+# Copyright (c) 2024 ShIRann Chen <shirannx@gmail.com>
+# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
+
+from __future__ import annotations
+
+
+class ModuleDocFragment(object):
+    DOCUMENTATION = r"""
+options:
+  checksum_algorithm:
+    description:
+      - Algorithm to determine checksum of file.
+      - Will throw an error if the host is unable to use specified algorithm.
+      - The remote host has to support the hashing method specified, V(md5)
+        can be unavailable if the host is FIPS-140 compliant.
+      - Availability might be restricted by the target system, for example FIPS systems won't allow md5 use.
+    type: str
+    choices: [ md5, sha1, sha224, sha256, sha384, sha512 ]
+    default: sha1
+    aliases: [ checksum, checksum_algo ]
+  get_checksum:
+    description:
+      - Whether to return a checksum of the file.
+    type: bool
+"""

@@ -17,7 +17,7 @@ DOCUMENTATION:
       description: Hashing algorithm to use.
       type: string
       default: sha512
-      choices: [ md5, blowfish, sha256, sha512 ]
+      choices: [ md5, blowfish, sha256, sha512, bcrypt ]
     salt:
       description: Secret string used for the hashing. If none is provided a random one can be generated. Use only numbers and letters (characters matching V([./0-9A-Za-z]+)).
       type: string

@ -51,18 +51,30 @@ EXAMPLES = """
- name: msg="Match 'Li' on the first column, but return the 3rd column (columns start counting after the match)" - name: msg="Match 'Li' on the first column, but return the 3rd column (columns start counting after the match)"
ansible.builtin.debug: msg="The atomic mass of Lithium is {{ lookup('ansible.builtin.csvfile', 'Li file=elements.csv delimiter=, col=2') }}" ansible.builtin.debug: msg="The atomic mass of Lithium is {{ lookup('ansible.builtin.csvfile', 'Li file=elements.csv delimiter=, col=2') }}"
- name: Define Values From CSV File, this reads file in one go, but you could also use col= to read each in it's own lookup. # Contents of bgp_neighbors.csv
# 127.0.0.1,10.0.0.1,24,nones,lola,pepe,127.0.0.2
# 128.0.0.1,10.1.0.1,20,notes,lolita,pepito,128.0.0.2
# 129.0.0.1,10.2.0.1,23,nines,aayush,pepete,129.0.0.2
- name: Define values from CSV file, this reads file in one go, but you could also use col= to read each in it's own lookup.
ansible.builtin.set_fact: ansible.builtin.set_fact:
loop_ip: "{{ csvline[0] }}" '{{ columns[item|int] }}': "{{ csvline }}"
int_ip: "{{ csvline[1] }}"
int_mask: "{{ csvline[2] }}"
int_name: "{{ csvline[3] }}"
local_as: "{{ csvline[4] }}"
neighbor_as: "{{ csvline[5] }}"
neigh_int_ip: "{{ csvline[6] }}"
vars: vars:
csvline: "{{ lookup('ansible.builtin.csvfile', bgp_neighbor_ip, file='bgp_neighbors.csv', delimiter=',') }}" csvline: "{{ lookup('csvfile', bgp_neighbor_ip, file='bgp_neighbors.csv', delimiter=',', col=item) }}"
columns: ['loop_ip', 'int_ip', 'int_mask', 'int_name', 'local_as', 'neighbour_as', 'neight_int_ip']
bgp_neighbor_ip: '127.0.0.1'
loop: '{{ range(columns|length|int) }}'
delegate_to: localhost delegate_to: localhost
delegate_facts: true
# Contents of people.csv
# # Last,First,Email,Extension
# Smith,Jane,jsmith@example.com,1234
- name: Specify the column (by keycol) in which the string should be searched
assert:
that:
- lookup('ansible.builtin.csvfile', 'Jane', file='people.csv', delimiter=',', col=0, keycol=1) == "Smith"
""" """
RETURN = """ RETURN = """

@@ -85,26 +85,26 @@ def generate_ansible_template_vars(path, fullpath=None, dest_path=None):
     template_uid = os.stat(b_path).st_uid
     temp_vars = {
-        'template_host': to_text(os.uname()[1]),
-        'template_path': path,
+        'template_host': to_unsafe_text(os.uname()[1]),
+        'template_path': to_unsafe_text(path),
         'template_mtime': datetime.datetime.fromtimestamp(os.path.getmtime(b_path)),
-        'template_uid': to_text(template_uid),
+        'template_uid': to_unsafe_text(template_uid),
         'template_run_date': datetime.datetime.now(),
-        'template_destpath': to_native(dest_path) if dest_path else None,
+        'template_destpath': wrap_var(to_native(dest_path)) if dest_path else None,
     }

     if fullpath is None:
-        temp_vars['template_fullpath'] = os.path.abspath(path)
+        temp_vars['template_fullpath'] = wrap_var(os.path.abspath(path))
     else:
-        temp_vars['template_fullpath'] = fullpath
+        temp_vars['template_fullpath'] = wrap_var(fullpath)

     managed_default = C.DEFAULT_MANAGED_STR
     managed_str = managed_default.format(
-        host=temp_vars['template_host'],
-        uid=temp_vars['template_uid'],
-        file=temp_vars['template_path'].replace('%', '%%'),
+        host="{{ template_host }}",
+        uid="{{ template_uid }}",
+        file="{{ template_path }}"
     )
-    temp_vars['ansible_managed'] = to_unsafe_text(time.strftime(to_native(managed_str), time.localtime(os.path.getmtime(b_path))))
+    temp_vars['ansible_managed'] = time.strftime(to_native(managed_str), time.localtime(os.path.getmtime(b_path)))

     return temp_vars
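The change above stops baking resolved values into ansible_managed and instead substitutes literal Jinja placeholders for the templar to resolve later, which also removes the need to escape % in paths before strftime runs. A standalone sketch of that two-stage expansion; the managed string below is illustrative, not the real configurable C.DEFAULT_MANAGED_STR:

```python
import time

# Illustrative stand-in for the configurable managed string.
DEFAULT_MANAGED_STR = 'Ansible managed: {file} modified on %Y by {uid} on {host}'

# Stage 1: str.format() now inserts literal Jinja placeholders,
# so no value needs its % characters escaped first.
managed_str = DEFAULT_MANAGED_STR.format(
    host='{{ template_host }}',
    uid='{{ template_uid }}',
    file='{{ template_path }}',
)

# Stage 2: strftime expands the %-directives; the Jinja braces pass through
# untouched and are resolved later by the template engine, which is what
# makes ansible_managed itself templatable.
ansible_managed = time.strftime(managed_str, time.gmtime(0))
```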

@@ -33,6 +33,9 @@ def unfrackpath(path, follow=True, basedir=None):
     :arg path: A byte or text string representing a path to be canonicalized
     :arg follow: A boolean to indicate of symlinks should be resolved or not
+    :arg basedir: A byte string, text string, PathLike object, or `None`
+        representing where a relative path should be resolved from.
+        `None` will be substituted for the current working directory.
     :raises UnicodeDecodeError: If the canonicalized version of the path
         contains non-utf8 byte sequences.
     :rtype: A text string (unicode on python2, str on python3).
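The basedir semantics documented above (relative paths resolved against basedir, absolute paths unaffected, None meaning the current working directory) can be sketched with os.path alone. `unfrack` below is an illustrative stand-in, not the real unfrackpath (which also handles bytes, expands vars, and optionally follows symlinks):

```python
import os

def unfrack(path, basedir=None):
    # None falls back to the current working directory.
    base = basedir if basedir is not None else os.getcwd()
    # os.path.join discards base when path is already absolute,
    # so basedir only affects relative paths.
    return os.path.normpath(os.path.join(base, os.path.expanduser(path)))
```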

@@ -856,7 +856,7 @@ def test_built_artifact(path: pathlib.Path) -> None:

 def get_sdist_path(version: Version, dist_dir: pathlib.Path = DIST_DIR) -> pathlib.Path:
     """Return the path to the sdist file."""
-    return dist_dir / f"ansible-core-{version}.tar.gz"
+    return dist_dir / f"ansible_core-{version}.tar.gz"

 def get_wheel_path(version: Version, dist_dir: pathlib.Path = DIST_DIR) -> pathlib.Path:

@@ -0,0 +1,5 @@
+[defaults]
+cow_selection=random
+
+[ssh_connection]
+control_path=/var/tmp

@@ -1,4 +1,4 @@
-- name: test ansible-config for valid output and no dupes
+- name: test ansible-config init for valid output and no dupes
   block:
     - name: Create temporary file
       tempfile:
@@ -12,3 +12,47 @@
     - name: run ini tester, for correctness and dupes
       shell: "{{ansible_playbook_python}} '{{role_path}}/files/ini_dupes.py' '{{ini_tempfile.path}}'"

+- name: test ansible-config validate
+  block:
+    # not testing w/o -t all as ansible-test uses its own plugins and would give false positives
+    - name: validate config files
+      shell: ansible-config validate -t all -v
+      register: valid_cfg
+      loop:
+        - empty.cfg
+        - base_valid.cfg
+        - base_all_valid.cfg
+        - invalid_base.cfg
+        - invalid_plugins_config.ini
+      ignore_errors: true
+      environment:
+        ANSIBLE_CONFIG: "{{role_path ~ '/files/' ~ item}}"
+
+    - name: ensure expected cfg check results
+      assert:
+        that:
+          - valid_cfg['results'][0] is success
+          - valid_cfg['results'][1] is success
+          - valid_cfg['results'][2] is success
+          - valid_cfg['results'][3] is failed
+          - valid_cfg['results'][4] is failed
+
+    - name: validate env vars
+      shell: ansible-config validate -t all -v -f env
+      register: valid_env
+      environment:
+        ANSIBLE_COW_SELECTION: 1
+
+    - name: validate env vars
+      shell: ansible-config validate -t all -v -f env
+      register: invalid_env
+      ignore_errors: true
+      environment:
+        ANSIBLE_COW_DESTRUCTION: 1
+
+    - name: ensure env check is what we expected
+      assert:
+        that:
+          - valid_env is success
+          - invalid_env is failed

@@ -1,78 +1,38 @@
-- name: create test directories
-  file:
-    path: '{{ remote_tmp_dir }}/dir-traversal/{{ item }}'
-    state: directory
-  loop:
-    - source
-    - target
-    - roles
-
-- name: create subdir in the role content to test relative symlinks
-  file:
-    dest: '{{ remote_tmp_dir }}/dir-traversal/source/role_subdir'
-    state: directory
-
-- copy:
-    dest: '{{ remote_tmp_dir }}/dir-traversal/source/role_subdir/.keep'
-    content: ''
-
-- set_fact:
-    installed_roles: "{{ remote_tmp_dir | realpath }}/dir-traversal/roles"
-
-- name: build role with symlink to a directory in the role
-  script:
-    chdir: '{{ remote_tmp_dir }}/dir-traversal/source'
-    cmd: create-role-archive.py safe-link-dir.tar ./ role_subdir/..
-    executable: '{{ ansible_playbook_python }}'
-
-- name: install role successfully
-  command:
-    cmd: 'ansible-galaxy role install --roles-path {{ remote_tmp_dir }}/dir-traversal/roles safe-link-dir.tar'
-    chdir: '{{ remote_tmp_dir }}/dir-traversal/source'
-  register: galaxy_install_ok
-
-- name: check for the directory symlink in the role
-  stat:
-    path: "{{ installed_roles }}/safe-link-dir.tar/symlink"
-  register: symlink_in_role
-
-- assert:
-    that:
-      - symlink_in_role.stat.exists
-      - symlink_in_role.stat.lnk_source == installed_roles + '/safe-link-dir.tar'
-
-- name: remove tarfile for next test
-  file:
-    path: '{{ remote_tmp_dir }}/dir-traversal/source/safe-link-dir.tar'
-    state: absent
-
-- name: build role with safe relative symlink
-  script:
-    chdir: '{{ remote_tmp_dir }}/dir-traversal/source'
-    cmd: create-role-archive.py safe.tar ./ role_subdir/../context.txt
-    executable: '{{ ansible_playbook_python }}'
-
-- name: install role successfully
-  command:
-    cmd: 'ansible-galaxy role install --roles-path {{ remote_tmp_dir }}/dir-traversal/roles safe.tar'
-    chdir: '{{ remote_tmp_dir }}/dir-traversal/source'
-  register: galaxy_install_ok
-
-- name: check for symlink in role
-  stat:
-    path: "{{ installed_roles }}/safe.tar/symlink"
-  register: symlink_in_role
-
-- assert:
-    that:
-      - symlink_in_role.stat.exists
-      - symlink_in_role.stat.lnk_source == installed_roles + '/safe.tar/context.txt'
-
-- name: remove test directories
-  file:
-    path: '{{ remote_tmp_dir }}/dir-traversal/{{ item }}'
-    state: absent
-  loop:
-    - source
-    - target
-    - roles
+- delegate_to: localhost
+  block:
+    - name: Create archive
+      command: "tar -cf safe-symlinks.tar {{ role_path }}/files/safe-symlinks"
+      args:
+        chdir: "{{ remote_tmp_dir }}"
+
+    - name: Install role successfully
+      command: ansible-galaxy role install --roles-path '{{ remote_tmp_dir }}/roles' safe-symlinks.tar
+      args:
+        chdir: "{{ remote_tmp_dir }}"
+
+    - name: Validate each of the symlinks exists
+      stat:
+        path: "{{ remote_tmp_dir }}/roles/safe-symlinks.tar/{{ item }}"
+      loop:
+        - defaults/main.yml
+        - handlers/utils.yml
+      register: symlink_stat
+
+    - assert:
+        that:
+          - symlink_stat.results[0].stat.exists
+          - symlink_stat.results[0].stat.lnk_source == ((dest, 'roles/safe-symlinks.tar/defaults/common_vars/subdir/group0/main.yml') | path_join)
+          - symlink_stat.results[1].stat.exists
+          - symlink_stat.results[1].stat.lnk_source == ((dest, 'roles/safe-symlinks.tar/tasks/utils/suite.yml') | path_join)
+      vars:
+        dest: "{{ remote_tmp_dir | realpath }}"
+
+  always:
+    - name: Clean up
+      file:
+        path: "{{ item }}"
+        state: absent
+      delegate_to: localhost
+      loop:
+        - "{{ remote_tmp_dir }}/roles/"
+        - "{{ remote_tmp_dir }}/safe-symlinks.tar"
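The fix this test exercises (the "Fix installing roles containing symlinks" commit) sanitizes tarfile symlinks relative to the link's own directory rather than the archive root. A minimal sketch of that rule, with an illustrative function name and the PR's own handlers/utils.yml example in mind; assuming POSIX path separators:

```python
import os

def sanitize_symlink(member_name, link_target):
    """Validate a tar symlink against the archive root, keeping it link-relative."""
    link_dir = os.path.dirname(member_name)
    # Resolve the target relative to the directory containing the link.
    resolved = os.path.normpath(os.path.join(link_dir, link_target))
    # Reject anything that escapes the archive root.
    if resolved == '..' or resolved.startswith('..' + os.sep):
        raise ValueError(f'unsafe symlink: {link_target!r}')
    # Re-express the safe target relative to the link's directory. Using the
    # archive-relative path here was the original bug: tarfile would then
    # extract role/handlers/tasks/... instead of role/tasks/...
    return os.path.relpath(resolved, link_dir or '.')
```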

@@ -448,7 +448,7 @@
       - present

 - dnf:
-    name: /foo.so
+    name: /foo.gif
     state: present
   register: dnf_result

@@ -56,6 +56,16 @@
   args:
     chdir: "{{ repo_dir }}/shallow_branches"

+- name: SETUP-LOCAL-REPOS | get ref head for test_branch
+  shell: git checkout test_branch && git rev-parse HEAD
+  args:
+    chdir: "{{ repo_dir }}/shallow_branches"
+  register: ref_head_id
+
+- name: SETUP-LOCAL-REPOS | store ref head for test_branch
+  set_fact:
+    test_branch_ref_head_id: "{{ ref_head_id.stdout }}"
+
 # Make this a bare one, we need to be able to push to it from clones
 # We make the repo here for consistency with the other repos,
 # but we finish setting it up in forcefully-fetch-tag.yml.

@@ -70,11 +70,11 @@
 # Same as the previous test, but this time we specify which ref
 # contains the SHA1
 - name: SPECIFIC-REVISION | update to revision by specifying the refspec
-  git:
-    repo: https://github.com/ansible/ansible-examples.git
+  git: &git_ref_spec
+    repo: "{{ repo_dir }}/shallow_branches/.git"
     dest: '{{ checkout_dir }}'
-    version: 5473e343e33255f2da0b160f53135c56921d875c
-    refspec: refs/pull/7/merge
+    version: "{{ test_branch_ref_head_id }}"
+    refspec: refs/heads/test_branch

 - name: SPECIFIC-REVISION | check HEAD after update with refspec
   command: git rev-parse HEAD
@@ -84,7 +84,7 @@
 - assert:
     that:
-      - 'git_result.stdout == "5473e343e33255f2da0b160f53135c56921d875c"'
+      - 'git_result.stdout == test_branch_ref_head_id'

 # try out combination of refspec and depth
 - name: SPECIFIC-REVISION | clear checkout_dir
@@ -94,11 +94,8 @@
 - name: SPECIFIC-REVISION | update to revision by specifying the refspec with depth=1
   git:
-    repo: https://github.com/ansible/ansible-examples.git
-    dest: '{{ checkout_dir }}'
-    version: 5473e343e33255f2da0b160f53135c56921d875c
-    refspec: refs/pull/7/merge
     depth: 1
+    <<: *git_ref_spec

 - name: SPECIFIC-REVISION | check HEAD after update with refspec
   command: git rev-parse HEAD
@@ -108,7 +105,7 @@
 - assert:
     that:
-      - 'git_result.stdout == "5473e343e33255f2da0b160f53135c56921d875c"'
+      - 'git_result.stdout == test_branch_ref_head_id'

 - name: SPECIFIC-REVISION | try to access other commit
   shell: git checkout 0ce1096
@@ -130,11 +127,7 @@
     path: "{{ checkout_dir }}"

 - name: SPECIFIC-REVISION | clone to revision by specifying the refspec
-  git:
-    repo: https://github.com/ansible/ansible-examples.git
-    dest: "{{ checkout_dir }}"
-    version: 5473e343e33255f2da0b160f53135c56921d875c
-    refspec: refs/pull/7/merge
+  git: *git_ref_spec

 - name: SPECIFIC-REVISION | check HEAD after update with refspec
   command: git rev-parse HEAD
@@ -144,7 +137,7 @@
 - assert:
     that:
-      - 'git_result.stdout == "5473e343e33255f2da0b160f53135c56921d875c"'
+      - 'git_result.stdout == test_branch_ref_head_id'

 # Test that a forced shallow checkout referencing branch only always fetches latest head

@@ -0,0 +1,25 @@
+- hosts: A,B
+  gather_facts: false
+  tasks:
+    - block:
+        - command: echo
+          notify:
+            - handler1
+            - handler2
+        - fail:
+          when: inventory_hostname == "B"
+        - meta: flush_handlers
+      always:
+        - name: always
+          debug:
+            msg: always
+  handlers:
+    - name: handler1
+      debug:
+        msg: handler1
+    - name: handler2
+      debug:
+        msg: handler2

@@ -216,3 +216,6 @@ ansible-playbook nested_flush_handlers_failure_force.yml -i inventory.handlers "
 ansible-playbook 82241.yml -i inventory.handlers "$@" 2>&1 | tee out.txt
 [ "$(grep out.txt -ce 'included_task_from_tasks_dir')" = "1" ]
+
+ansible-playbook handlers_lockstep_82307.yml -i inventory.handlers "$@" 2>&1 | tee out.txt
+[ "$(grep out.txt -ce 'TASK \[handler2\]')" = "0" ]

@@ -12,11 +12,11 @@ from ansible.module_utils.common.respawn import has_respawned, probe_interpreter
 HAS_RPMFLUFF = True
 can_use_rpm_weak_deps = None
 try:
-    from rpmfluff import SimpleRpmBuild, GeneratedSourceFile, make_elf
+    from rpmfluff import SimpleRpmBuild, GeneratedSourceFile, make_gif
     from rpmfluff import YumRepoBuild
 except ImportError:
     try:
-        from rpmfluff.make import make_elf
+        from rpmfluff.make import make_gif
         from rpmfluff.sourcefile import GeneratedSourceFile
         from rpmfluff.rpmbuild import SimpleRpmBuild
         from rpmfluff.yumrepobuild import YumRepoBuild
@@ -47,8 +47,8 @@ SPECS = [
     RPM('dinginessentail-with-weak-dep', '1.0', '1', None, ['dinginessentail-weak-dep'], None, None),
     RPM('dinginessentail-weak-dep', '1.0', '1', None, None, None, None),
     RPM('noarchfake', '1.0', '1', None, None, None, 'noarch'),
-    RPM('provides_foo_a', '1.0', '1', None, None, 'foo.so', 'noarch'),
-    RPM('provides_foo_b', '1.0', '1', None, None, 'foo.so', 'noarch'),
+    RPM('provides_foo_a', '1.0', '1', None, None, 'foo.gif', 'noarch'),
+    RPM('provides_foo_b', '1.0', '1', None, None, 'foo.gif', 'noarch'),
     RPM('number-11-name', '11.0', '1', None, None, None, None),
     RPM('number-11-name', '11.1', '1', None, None, None, None),
     RPM('epochone', '1.0', '1', '1', None, None, "noarch"),
@@ -74,7 +74,7 @@ def create_repo(arch='x86_64'):
         pkg.add_installed_file(
             "/" + spec.file,
             GeneratedSourceFile(
-                spec.file, make_elf()
+                spec.file, make_gif()
             )
         )

@@ -15,3 +15,8 @@
   file:
     path: /etc/systemd/system/baz.service
     state: absent
+
+- name: remove mask unit file
+  file:
+    path: /etc/systemd/system/mask_me.service
+    state: absent

@@ -121,3 +121,5 @@
 - import_tasks: test_unit_template.yml
 - import_tasks: test_indirect_service.yml
 - import_tasks: test_enabled_runtime.yml
+- import_tasks: test_systemd_version.yml
+- import_tasks: test_mask.yml

@@ -0,0 +1,25 @@
+- name: Copy service file for mask operation
+  template:
+    src: mask_me.service
+    dest: /etc/systemd/system/mask_me.service
+    owner: root
+    group: root
+    mode: '0644'
+  notify: remove unit file
+
+- name: Reload systemd
+  systemd:
+    daemon_reload: true
+
+- name: Try to mask already masked service
+  systemd:
+    name: mask_me.service
+    masked: true
+  register: mask_test_1
+  ignore_errors: true
+
+- name: Test mask service test
+  assert:
+    that:
+      - mask_test_1 is not changed
+      - "'Failed to mask' in mask_test_1.msg"

@@ -0,0 +1,11 @@
+---
+- name: Show Gathered Facts
+  ansible.builtin.debug:
+    msg: "{{ ansible_systemd }}"
+
+- name: Assert the systemd version fact
+  ansible.builtin.assert:
+    that:
+      - ansible_systemd.version | int
+      - ansible_systemd.version is match('^[1-9][0-9][0-9]$')
+      - ansible_systemd.features | regex_search('(\\+|-)(PAM|AUDIT)')

@@ -0,0 +1,9 @@
+[Unit]
+Description=Mask Me Server
+Documentation=Mask
+
+[Service]
+ExecStart=/bin/yes
+
+[Install]
+WantedBy=default.target

@@ -2,13 +2,51 @@
 - hosts: testhost
   gather_facts: False
   tasks:
-    - set_fact:
+    - name: set output_dir
+      set_fact:
         output_dir: "{{ lookup('env', 'OUTPUT_DIR') }}"
-    - file:
-        path: '{{ output_dir }}/café.txt'
-        state: 'absent'
-    # Smoketest that ansible_managed with non-ascii chars works:
-    # https://github.com/ansible/ansible/issues/27262
-    - template:
-        src: 'templates/café.j2'
-        dest: '{{ output_dir }}/café.txt'
+      tags: ['always']
+
+    - name: Smoketest that ansible_managed with non-ascii chars works, https://github.com/ansible/ansible/issues/27262
+      tags: ['27262']
+      block:
+        - name: ensure output file does not exist
+          file:
+            path: '{{ output_dir }}/café.txt'
+            state: 'absent'
+
+        - name: test templating with unicode in template name
+          template:
+            src: 'templates/café.j2'
+            dest: '{{ output_dir }}/café.txt'
+      always:
+        - name: clean up!
+          file:
+            path: '{{ output_dir }}/café.txt'
+            state: 'absent'
+
+    - name: check strftime resolution in ansible_managed, https://github.com/ansible/ansible/pull/79129
+      tags: ['79129']
+      block:
+        - template:
+            src: "templates/%necho Onii-chan help Im stuck;exit 1%n.j2"
+            dest: "{{ output_dir }}/strftime.sh"
+            mode: '0755'
+        - shell: "exec {{ output_dir | quote }}/strftime.sh"
+
+        - name: Avoid templating 'injections' via file names
+          template:
+            src: !unsafe "templates/completely{{ 1 % 0 }} safe template.j2"
+            dest: "{{ output_dir }}/jinja.sh"
+            mode: '0755'
+        - shell: "exec {{ output_dir | quote }}/jinja.sh"
+          register: result
+
+        - assert:
+            that:
+              - "'Hello' in result.stdout"
+              - "'uname' not in lookup('file', output_dir ~ '/strftime.sh')"
+              - "'uname' not in lookup('file', output_dir ~ '/jinja.sh')"

@@ -1,29 +0,0 @@
----
-- hosts: testhost
-  gather_facts: false
-  tasks:
-    - set_fact:
-        output_dir: "{{ lookup('env', 'OUTPUT_DIR') }}"
-
-    - name: check strftime
-      block:
-        - template:
-            src: "templates/%necho Onii-chan help Im stuck;exit 1%n.j2"
-            dest: "{{ output_dir }}/79129-strftime.sh"
-            mode: '0755'
-        - shell: "exec {{ output_dir | quote }}/79129-strftime.sh"
-
-    - name: check jinja template
-      block:
-        - template:
-            src: !unsafe "templates/completely{{ 1 % 0 }} safe template.j2"
-            dest: "{{ output_dir }}/79129-jinja.sh"
-            mode: '0755'
-        - shell: "exec {{ output_dir | quote }}/79129-jinja.sh"
-          register: result
-
-        - assert:
-            that:
-              - "'Hello' in result.stdout"

@@ -0,0 +1,2 @@
+[defaults]
+ansible_managed = Ansible managed: {file} modified on %Y-%m-%d %H:%M:%S by {uid} on {host}({{{{q('pipe', 'uname -a')}}}})

@@ -7,11 +7,11 @@ ANSIBLE_ROLES_PATH=../ ansible-playbook template.yml -i ../../inventory -v "$@"
 # Test for https://github.com/ansible/ansible/pull/35571
 ansible testhost -i testhost, -m debug -a 'msg={{ hostvars["localhost"] }}' -e "vars1={{ undef() }}" -e "vars2={{ vars1 }}"

-# Test for https://github.com/ansible/ansible/issues/27262
+# ansible_managed tests
 ANSIBLE_CONFIG=ansible_managed.cfg ansible-playbook ansible_managed.yml -i ../../inventory -v "$@"
-# Test for https://github.com/ansible/ansible/pull/79129
-ANSIBLE_CONFIG=ansible_managed.cfg ansible-playbook ansible_managed_79129.yml -i ../../inventory -v "$@"
+# same as above but with ansible_managed j2 template
+ANSIBLE_CONFIG=ansible_managed_templated.cfg ansible-playbook ansible_managed.yml -i ../../inventory -v "$@"

 # Test for #42585
 ANSIBLE_ROLES_PATH=../ ansible-playbook custom_template.yml -i ../../inventory -v "$@"

@@ -97,6 +97,7 @@
     baseurl: "{{ yum_repository_test_repo.baseurl }}"
     description: New description
     async: no
+    countme: yes
     enablegroups: no
     file: "{{ yum_repository_test_repo.name ~ 2 }}"
     ip_resolve: 4
@@ -114,6 +115,7 @@
     that:
       - "'async = 0' in repo_file_contents"
      - "'name = New description' in repo_file_contents"
+      - "'countme = 1' in repo_file_contents"
       - "'enablegroups = 0' in repo_file_contents"
       - "'ip_resolve = 4' in repo_file_contents"
       - "'keepalive = 0' in repo_file_contents"
@@ -127,6 +129,7 @@
     baseurl: "{{ yum_repository_test_repo.baseurl }}"
     description: New description
     async: no
+    countme: yes
     enablegroups: no
     file: "{{ yum_repository_test_repo.name ~ 2 }}"
     ip_resolve: 4

@@ -58,9 +58,6 @@ ignore_missing_imports = True
 [mypy-dnf.*]
 ignore_missing_imports = True

-[mypy-libdnf.*]
-ignore_missing_imports = True
-
 [mypy-apt.*]
 ignore_missing_imports = True

@@ -25,9 +25,6 @@ ignore_missing_imports = True
 [mypy-dnf.*]
 ignore_missing_imports = True

-[mypy-libdnf.*]
-ignore_missing_imports = True
-
 [mypy-apt.*]
 ignore_missing_imports = True

@@ -15,7 +15,6 @@ lib/ansible/parsing/yaml/constructor.py mypy-3.12:type-var  # too many occurrenc
 lib/ansible/keyword_desc.yml no-unwanted-files
 lib/ansible/modules/apt.py validate-modules:parameter-invalid
 lib/ansible/modules/apt_repository.py validate-modules:parameter-invalid
-lib/ansible/modules/assemble.py validate-modules:nonexistent-parameter-documented
 lib/ansible/modules/async_status.py validate-modules!skip
 lib/ansible/modules/async_wrapper.py ansible-doc!skip  # not an actual module
 lib/ansible/modules/async_wrapper.py pylint:ansible-bad-function  # ignore, required
@@ -56,7 +55,6 @@ lib/ansible/module_utils/compat/selinux.py import-3.11!skip  # pass/fail depends
 lib/ansible/module_utils/compat/selinux.py import-3.12!skip  # pass/fail depends on presence of libselinux.so
 lib/ansible/module_utils/compat/selinux.py pylint:unidiomatic-typecheck
 lib/ansible/module_utils/distro/_distro.py no-assert
-lib/ansible/module_utils/distro/_distro.py pep8!skip  # bundled code we don't want to modify
 lib/ansible/module_utils/distro/__init__.py empty-init  # breaks namespacing, bundled, do not override
 lib/ansible/module_utils/facts/__init__.py empty-init  # breaks namespacing, deprecate and eventually remove
 lib/ansible/module_utils/powershell/Ansible.ModuleUtils.ArgvParser.psm1 pslint:PSUseApprovedVerbs
