[stable-2.13] ansible-test - Multiple backports (#77951)

* ansible-test - Backport `InternalError`

NOTE: This is a partial backport, including only one new class.

(cherry picked from commit b960641759)

* ansible-test - Fix subprocess management. (#77641)

* Run code-smell sanity tests in UTF-8 Mode.
* Update subprocess use in sanity test programs.
* Use raw_command instead of run_command with always=True set.
* Add more capture=True usage.
* Don't expose stdin to subprocesses.
* Capture more output. Warn on retry.
* Add more captures.
* Capture coverage cli output.
* Capture windows and network host checks.
* Be explicit about interactive usage.
* Use a shell for non-captured, non-interactive subprocesses.
* Add integration test to assert no TTY.
* Add unit test to assert no TTY.
* Require blocking stdin/stdout/stderr (see the sketch below).
* Use subprocess.run in ansible-core sanity tests.
* Remove unused arg.
* Be explicit with subprocess.run check=False.
* Add changelog.
* Use a Python subprocess instead of a shell.
* Use InternalError instead of Exception.
* Require capture argument.
* Check for invalid raw_command arguments.
* Removed pointless communicate=True usage.
* Relocate stdout w/o capture check.
* Use threads instead of a subprocess for IO.

(cherry picked from commit 5c2d830dea)
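The "Require blocking stdin/stdout/stderr" item above boils down to a startup guard. A minimal sketch of such a check, using only the standard library on POSIX (illustrative, not the actual ansible-test implementation):

#!/usr/bin/env python
"""Exit early if any standard stream was handed to us in non-blocking mode."""
import os
import sys

for name, handle in (('stdin', sys.stdin), ('stdout', sys.stdout), ('stderr', sys.stderr)):
    fd = handle.fileno()
    if not os.get_blocking(fd):  # tools such as SSH can leave inherited descriptors non-blocking
        raise SystemExit(f'{name} (fd {fd}) is non-blocking; blocking stdio is required')

Failing before any subprocess is spawned keeps the error obvious, instead of surfacing later as a hung or truncated read.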

* ansible-test - Add support for remote Ubuntu VMs.

(cherry picked from commit 6513453310)

* ansible-test - Fix remote completion validation.

(cherry picked from commit e2200e8dfc)

* ansible-test - Add multi-arch remote support.

(cherry picked from commit 2cc74b04c4)

* ansible-test - Enhance the shell command. (#77734)

* ansible-test - Add shell --export option.

* ansible-test - Support cmd args for shell command.

Also allow shell to be used without a valid layout if no delegation is required.

* ansible-test - Improve stderr/stdout consistency.

By default all output goes to stdout only, with the exception of a fatal error.

When using any of the following, all output defaults to stderr instead:

* sanity with the `--lint` option -- sanity messages to stdout
* coverage analyze -- output to stdout if the output file is `/dev/stdout`
* shell -- shell output to stdout

This fixes two main issues:

* Unpredictable output order when using both info and error/warning messages.
* Mixing of lint/command/shell output with bootstrapping messages on stdout.

* ansible-test - Add changelog fragment.

(cherry picked from commit fe349a1ccd)
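One consequence of the shell changes above (visible in the commands/shell/__init__.py hunk later in this diff): an interactive shell is only offered when stdin is a TTY, while --export and positional commands work without one. A condensed sketch of that dispatch, with stand-in helpers rather than the real ansible-test functions:

import sys
import typing as t

def write_inventory(path):  # stand-in for create_controller_inventory / create_posix_inventory
    print(f'would write inventory to {path}')

def run(cmd, capture, interactive):  # stand-in for Connection.run()
    print(f'would run {cmd} (capture={capture}, interactive={interactive})')

def command_shell(export_path, cmd):  # type: (t.Optional[str], t.List[str]) -> None
    """Export inventory, run a one-off command, or open an interactive shell."""
    if not export_path and not cmd and not sys.stdin.isatty():
        raise SystemExit('Standard input must be a TTY to launch a shell.')
    if export_path:
        write_inventory(export_path)
        return
    if cmd:
        run(cmd, capture=False, interactive=False)  # one-off command: no TTY allocation needed
        return
    run(['sh', '-i'], capture=False, interactive=True)  # interactive shell: request a TTY (-tt / -it)

command_shell(export_path=None, cmd=['echo', 'hello'])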

* ansible-test - Fix remote args restriction.

The platform-specific and global fallbacks were not working with the `--remote` option.

This regression was introduced by https://github.com/ansible/ansible/pull/77711

(cherry picked from commit 76ead1e768)
Matt Clay (committed by GitHub), commit ae380e3bef, parent f3b56ec661

@ -0,0 +1,2 @@
minor_changes:
- ansible-test - Add support for multi-arch remotes.

@ -0,0 +1,2 @@
bugfixes:
- ansible-test - Fix internal validation of remote completion configuration.

@ -0,0 +1,7 @@
minor_changes:
- ansible-test - Add support for running non-interactive commands with ``ansible-test shell``.
- ansible-test - Add support for exporting inventory with ``ansible-test shell --export {path}``.
- ansible-test - The ``shell`` command can be used outside a collection if no controller delegation is required.
- ansible-test - Improve consistency of output messages by using stdout or stderr for most output, but not both.
bugfixes:
- ansible-test - Sanity test output with the ``--lint`` option is no longer mixed in with bootstrapping output.
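A rough illustration of the "stdout or stderr, but not both" rule from the fragment above, assuming a small Display-style helper (a sketch, not the actual ansible-test Display class):

import sys

class Display:
    """Send informational output to a single stream chosen at startup."""
    def __init__(self, use_stderr=False):  # type: (bool) -> None
        # commands whose stdout is a payload (sanity --lint, coverage analyze, shell) move info to stderr
        self.fd = sys.stderr if use_stderr else sys.stdout

    def info(self, message):  # type: (str) -> None
        print(message, file=self.fd)

display = Display(use_stderr=True)
display.info('bootstrapping remote instance...')   # bookkeeping stays on stderr
print('plugins/modules/foo.py:10:1: missing-doc')  # lint results own stdout, in order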

@ -0,0 +1,10 @@
bugfixes:
- ansible-test - Subprocesses are now isolated from the stdin, stdout and stderr of ansible-test.
This avoids issues with subprocesses tampering with the file descriptors, such as SSH making them non-blocking.
As a result of this change, subprocess output from unit and integration tests on stderr now goes to stdout.
- ansible-test - Subprocesses no longer have access to the TTY ansible-test is connected to, if any.
This maintains consistent behavior between local testing and CI systems, which typically do not provide a TTY.
Tests which require a TTY should use pexpect or another mechanism to create a PTY.
minor_changes:
- ansible-test - Blocking mode is now enforced for stdin, stdout and stderr.
If any of these are non-blocking then ansible-test will exit during startup with an error.
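The isolation described in this fragment comes down to never handing the parent's stdio (or TTY) to a child process. A minimal standard-library sketch of the pattern (illustrative, not the ansible-test internals):

import subprocess
import sys

def run_isolated(cmd):  # type: (list) -> str
    """Run cmd without exposing the parent's stdin, stdout or stderr to it."""
    completed = subprocess.run(
        cmd,
        stdin=subprocess.DEVNULL,  # the child cannot read from, or tamper with, our stdin
        capture_output=True,       # its stdout/stderr are captured, not shared with ours
        text=True,
        check=True,
    )
    if completed.stderr:
        print(f'warning: {completed.stderr.strip()}')  # surfaced by the parent on its own terms
    return completed.stdout

print(run_isolated([sys.executable, '-c', 'import sys; print(sys.stdin.isatty(), sys.stdout.isatty())']))
# prints "False False" even when the parent process is attached to a terminal

Tests that genuinely need a terminal can allocate their own PTY, for example with pexpect, as the fragment notes.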

@ -0,0 +1,2 @@
minor_changes:
- ansible-test - Add support for Ubuntu VMs using the ``--remote`` option.

@ -0,0 +1,2 @@
context/controller
shippable/posix/group1

@ -0,0 +1,7 @@
#!/usr/bin/env python
import sys
assert not sys.stdin.isatty()
assert not sys.stdout.isatty()
assert not sys.stderr.isatty()

@ -0,0 +1,5 @@
#!/usr/bin/env bash
set -eux
./runme.py

@ -1,2 +1,2 @@
-ios/csr1000v collection=cisco.ios connection=ansible.netcommon.network_cli provider=aws
-vyos/1.1.8 collection=vyos.vyos connection=ansible.netcommon.network_cli provider=aws
+ios/csr1000v collection=cisco.ios connection=ansible.netcommon.network_cli provider=aws arch=x86_64
+vyos/1.1.8 collection=vyos.vyos connection=ansible.netcommon.network_cli provider=aws arch=x86_64

@ -1,9 +1,11 @@
-freebsd/12.3 python=3.8 python_dir=/usr/local/bin provider=aws
-freebsd/13.0 python=3.7,2.7,3.8,3.9 python_dir=/usr/local/bin provider=aws
-freebsd python_dir=/usr/local/bin provider=aws
-macos/12.0 python=3.10 python_dir=/usr/local/bin provider=parallels
-macos python_dir=/usr/local/bin provider=parallels
-rhel/7.9 python=2.7 provider=aws
-rhel/8.5 python=3.6,3.8,3.9 provider=aws
-rhel/9.0 python=3.9 provider=aws
-rhel provider=aws
+freebsd/12.3 python=3.8 python_dir=/usr/local/bin provider=aws arch=x86_64
+freebsd/13.0 python=3.7,2.7,3.8,3.9 python_dir=/usr/local/bin provider=aws arch=x86_64
+freebsd python_dir=/usr/local/bin provider=aws arch=x86_64
+macos/12.0 python=3.10 python_dir=/usr/local/bin provider=parallels arch=x86_64
+macos python_dir=/usr/local/bin provider=parallels arch=x86_64
+rhel/7.9 python=2.7 provider=aws arch=x86_64
+rhel/8.5 python=3.6,3.8,3.9 provider=aws arch=x86_64
+rhel/9.0 python=3.9 provider=aws arch=x86_64
+rhel provider=aws arch=x86_64
+ubuntu/22.04 python=3.10 provider=aws arch=x86_64
+ubuntu provider=aws arch=x86_64

@ -1,6 +1,6 @@
-windows/2012 provider=aws
-windows/2012-R2 provider=aws
-windows/2016 provider=aws
-windows/2019 provider=aws
-windows/2022 provider=aws
-windows provider=aws
+windows/2012 provider=aws arch=x86_64
+windows/2012-R2 provider=aws arch=x86_64
+windows/2016 provider=aws arch=x86_64
+windows/2019 provider=aws arch=x86_64
+windows/2022 provider=aws arch=x86_64
+windows provider=aws arch=x86_64

@ -57,7 +57,7 @@ def main(cli_args=None): # type: (t.Optional[t.List[str]]) -> None
    display.truncate = config.truncate
    display.redact = config.redact
    display.color = config.color
-   display.info_stderr = config.info_stderr
+   display.fd = sys.stderr if config.display_stderr else sys.stdout
    configure_timeout(config)
    display.info('RLIMIT_NOFILE: %s' % (CURRENT_RLIMIT_NOFILE,), verbosity=2)
@ -66,7 +66,9 @@ def main(cli_args=None): # type: (t.Optional[t.List[str]]) -> None
    target_names = None
    try:
-       data_context().check_layout()
+       if config.check_layout:
+           data_context().check_layout()
        args.func(config)
    except PrimeContainers:
        pass
@ -82,7 +84,7 @@ def main(cli_args=None): # type: (t.Optional[t.List[str]]) -> None
        if target_names:
            for target_name in target_names:
-               print(target_name)  # info goes to stderr, this should be on stdout
+               print(target_name)  # display goes to stderr, this should be on stdout
        display.review_warnings()
        config.success = True
@ -90,7 +92,7 @@ def main(cli_args=None): # type: (t.Optional[t.List[str]]) -> None
        display.warning(u'%s' % ex)
        sys.exit(0)
    except ApplicationError as ex:
-       display.error(u'%s' % ex)
+       display.fatal(u'%s' % ex)
        sys.exit(1)
    except KeyboardInterrupt:
        sys.exit(2)

@ -22,11 +22,11 @@ from .util import (
    ANSIBLE_SOURCE_ROOT,
    ANSIBLE_TEST_TOOLS_ROOT,
    get_ansible_version,
+   raw_command,
)
from .util_common import (
    create_temp_dir,
-   run_command,
    ResultType,
    intercept_python,
    get_injector_path,
@ -258,12 +258,12 @@ class CollectionDetailError(ApplicationError):
        self.reason = reason
-def get_collection_detail(args, python):  # type: (EnvironmentConfig, PythonConfig) -> CollectionDetail
+def get_collection_detail(python):  # type: (PythonConfig) -> CollectionDetail
    """Return collection detail."""
    collection = data_context().content.collection
    directory = os.path.join(collection.root, collection.directory)
-   stdout = run_command(args, [python.path, os.path.join(ANSIBLE_TEST_TOOLS_ROOT, 'collection_detail.py'), directory], capture=True, always=True)[0]
+   stdout = raw_command([python.path, os.path.join(ANSIBLE_TEST_TOOLS_ROOT, 'collection_detail.py'), directory], capture=True)[0]
    result = json.loads(stdout)
    error = result.get('error')
@ -282,15 +282,15 @@ def run_playbook(
        args,  # type: EnvironmentConfig
        inventory_path,  # type: str
        playbook,  # type: str
-       run_playbook_vars=None,  # type: t.Optional[t.Dict[str, t.Any]]
-       capture=False,  # type: bool
+       capture,  # type: bool
+       variables=None,  # type: t.Optional[t.Dict[str, t.Any]]
):  # type: (...) -> None
    """Run the specified playbook using the given inventory file and playbook variables."""
    playbook_path = os.path.join(ANSIBLE_TEST_DATA_ROOT, 'playbooks', playbook)
    cmd = ['ansible-playbook', '-i', inventory_path, playbook_path]
-   if run_playbook_vars:
-       cmd.extend(['-e', json.dumps(run_playbook_vars)])
+   if variables:
+       cmd.extend(['-e', json.dumps(variables)])
    if args.verbosity:
        cmd.append('-%s' % ('v' * args.verbosity))

@ -38,10 +38,22 @@ def do_shell(
    shell = parser.add_argument_group(title='shell arguments')
+   shell.add_argument(
+       'cmd',
+       nargs='*',
+       help='run the specified command',
+   )
    shell.add_argument(
        '--raw',
        action='store_true',
        help='direct to shell with no setup',
    )
+   shell.add_argument(
+       '--export',
+       metavar='PATH',
+       help='export inventory instead of opening a shell',
+   )
    add_environments(parser, completer, ControllerMode.DELEGATED, TargetMode.SHELL)  # shell

@ -115,6 +115,7 @@ class LegacyHostOptions:
    venv_system_site_packages: t.Optional[bool] = None
    remote: t.Optional[str] = None
    remote_provider: t.Optional[str] = None
+   remote_arch: t.Optional[str] = None
    docker: t.Optional[str] = None
    docker_privileged: t.Optional[bool] = None
    docker_seccomp: t.Optional[str] = None
@ -374,33 +375,34 @@ def get_legacy_host_config(
        if remote_config.controller_supported:
            if controller_python(options.python) or not options.python:
-               controller = PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider)
+               controller = PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider,
+                                              arch=options.remote_arch)
                targets = controller_targets(mode, options, controller)
            else:
                controller_fallback = f'remote:{options.remote}', f'--remote {options.remote} --python {options.python}', FallbackReason.PYTHON
-               controller = PosixRemoteConfig(name=options.remote, provider=options.remote_provider)
+               controller = PosixRemoteConfig(name=options.remote, provider=options.remote_provider, arch=options.remote_arch)
                targets = controller_targets(mode, options, controller)
        else:
            context, reason = f'--remote {options.remote}', FallbackReason.ENVIRONMENT
            controller = None
-           targets = [PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider)]
+           targets = [PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider, arch=options.remote_arch)]
    elif mode == TargetMode.SHELL and options.remote.startswith('windows/'):
        if options.python and options.python not in CONTROLLER_PYTHON_VERSIONS:
            raise ControllerNotSupportedError(f'--python {options.python}')
        controller = OriginConfig(python=native_python(options))
-       targets = [WindowsRemoteConfig(name=options.remote, provider=options.remote_provider)]
+       targets = [WindowsRemoteConfig(name=options.remote, provider=options.remote_provider, arch=options.remote_arch)]
    else:
        if not options.python:
            raise PythonVersionUnspecifiedError(f'--remote {options.remote}')
        if controller_python(options.python):
-           controller = PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider)
+           controller = PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider, arch=options.remote_arch)
            targets = controller_targets(mode, options, controller)
        else:
            context, reason = f'--remote {options.remote} --python {options.python}', FallbackReason.PYTHON
            controller = None
-           targets = [PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider)]
+           targets = [PosixRemoteConfig(name=options.remote, python=native_python(options), provider=options.remote_provider, arch=options.remote_arch)]
    if not controller:
        if docker_available():
@ -458,12 +460,13 @@ def handle_non_posix_targets(
    """Return a list of non-POSIX targets if the target mode is non-POSIX."""
    if mode == TargetMode.WINDOWS_INTEGRATION:
        if options.windows:
-           targets = [WindowsRemoteConfig(name=f'windows/{version}', provider=options.remote_provider) for version in options.windows]
+           targets = [WindowsRemoteConfig(name=f'windows/{version}', provider=options.remote_provider, arch=options.remote_arch)
+                      for version in options.windows]
        else:
            targets = [WindowsInventoryConfig(path=options.inventory)]
    elif mode == TargetMode.NETWORK_INTEGRATION:
        if options.platform:
-           network_targets = [NetworkRemoteConfig(name=platform, provider=options.remote_provider) for platform in options.platform]
+           network_targets = [NetworkRemoteConfig(name=platform, provider=options.remote_provider, arch=options.remote_arch) for platform in options.platform]
            for platform, collection in options.platform_collection or []:
                for entry in network_targets:

@ -13,6 +13,10 @@ from ..constants import (
SUPPORTED_PYTHON_VERSIONS, SUPPORTED_PYTHON_VERSIONS,
) )
from ..util import (
REMOTE_ARCHITECTURES,
)
from ..completion import ( from ..completion import (
docker_completion, docker_completion,
network_completion, network_completion,
@ -532,6 +536,13 @@ def add_environment_remote(
help=suppress or 'remote provider to use: %(choices)s', help=suppress or 'remote provider to use: %(choices)s',
) )
environments_parser.add_argument(
'--remote-arch',
metavar='ARCH',
choices=REMOTE_ARCHITECTURES,
help=suppress or 'remote arch to use: %(choices)s',
)
def complete_remote_stage(prefix: str, **_) -> t.List[str]: def complete_remote_stage(prefix: str, **_) -> t.List[str]:
"""Return a list of supported stages matching the given prefix.""" """Return a list of supported stages matching the given prefix."""

@ -10,6 +10,10 @@ from ...constants import (
SUPPORTED_PYTHON_VERSIONS, SUPPORTED_PYTHON_VERSIONS,
) )
from ...util import (
REMOTE_ARCHITECTURES,
)
from ...host_configs import ( from ...host_configs import (
OriginConfig, OriginConfig,
) )
@ -126,6 +130,7 @@ class PosixRemoteKeyValueParser(KeyValueParser):
"""Return a dictionary of key names and value parsers.""" """Return a dictionary of key names and value parsers."""
return dict( return dict(
provider=ChoicesParser(REMOTE_PROVIDERS), provider=ChoicesParser(REMOTE_PROVIDERS),
arch=ChoicesParser(REMOTE_ARCHITECTURES),
python=PythonParser(versions=self.versions, allow_venv=False, allow_default=self.allow_default), python=PythonParser(versions=self.versions, allow_venv=False, allow_default=self.allow_default),
) )
@ -137,6 +142,7 @@ class PosixRemoteKeyValueParser(KeyValueParser):
state.sections[f'{"controller" if self.controller else "target"} {section_name} (comma separated):'] = '\n'.join([ state.sections[f'{"controller" if self.controller else "target"} {section_name} (comma separated):'] = '\n'.join([
f' provider={ChoicesParser(REMOTE_PROVIDERS).document(state)}', f' provider={ChoicesParser(REMOTE_PROVIDERS).document(state)}',
f' arch={ChoicesParser(REMOTE_ARCHITECTURES).document(state)}',
f' python={python_parser.document(state)}', f' python={python_parser.document(state)}',
]) ])
@ -149,6 +155,7 @@ class WindowsRemoteKeyValueParser(KeyValueParser):
"""Return a dictionary of key names and value parsers.""" """Return a dictionary of key names and value parsers."""
return dict( return dict(
provider=ChoicesParser(REMOTE_PROVIDERS), provider=ChoicesParser(REMOTE_PROVIDERS),
arch=ChoicesParser(REMOTE_ARCHITECTURES),
) )
def document(self, state): # type: (DocumentationState) -> t.Optional[str] def document(self, state): # type: (DocumentationState) -> t.Optional[str]
@ -157,6 +164,7 @@ class WindowsRemoteKeyValueParser(KeyValueParser):
state.sections[f'target {section_name} (comma separated):'] = '\n'.join([ state.sections[f'target {section_name} (comma separated):'] = '\n'.join([
f' provider={ChoicesParser(REMOTE_PROVIDERS).document(state)}', f' provider={ChoicesParser(REMOTE_PROVIDERS).document(state)}',
f' arch={ChoicesParser(REMOTE_ARCHITECTURES).document(state)}',
]) ])
return f'{{{section_name}}}' return f'{{{section_name}}}'
@ -168,6 +176,7 @@ class NetworkRemoteKeyValueParser(KeyValueParser):
"""Return a dictionary of key names and value parsers.""" """Return a dictionary of key names and value parsers."""
return dict( return dict(
provider=ChoicesParser(REMOTE_PROVIDERS), provider=ChoicesParser(REMOTE_PROVIDERS),
arch=ChoicesParser(REMOTE_ARCHITECTURES),
collection=AnyParser(), collection=AnyParser(),
connection=AnyParser(), connection=AnyParser(),
) )
@ -178,7 +187,8 @@ class NetworkRemoteKeyValueParser(KeyValueParser):
state.sections[f'target {section_name} (comma separated):'] = '\n'.join([ state.sections[f'target {section_name} (comma separated):'] = '\n'.join([
f' provider={ChoicesParser(REMOTE_PROVIDERS).document(state)}', f' provider={ChoicesParser(REMOTE_PROVIDERS).document(state)}',
' collection={collecton}', f' arch={ChoicesParser(REMOTE_ARCHITECTURES).document(state)}',
' collection={collection}',
' connection={connection}', ' connection={connection}',
]) ])

@ -95,7 +95,16 @@ def run_coverage(args, host_state, output_file, command, cmd): # type: (Coverag
    cmd = ['python', '-m', 'coverage.__main__', command, '--rcfile', COVERAGE_CONFIG_PATH] + cmd
-   intercept_python(args, host_state.controller_profile.python, cmd, env)
+   stdout, stderr = intercept_python(args, host_state.controller_profile.python, cmd, env, capture=True)
+   stdout = (stdout or '').strip()
+   stderr = (stderr or '').strip()
+   if stdout:
+       display.info(stdout)
+   if stderr:
+       display.warning(stderr)
def get_all_coverage_files():  # type: () -> t.List[str]

@ -14,4 +14,4 @@ class CoverageAnalyzeConfig(CoverageConfig):
        # avoid mixing log messages with file output when using `/dev/stdout` for the output file on commands
        # this may be worth considering as the default behavior in the future, instead of being dependent on the command or options used
-       self.info_stderr = True
+       self.display_stderr = True

@ -32,7 +32,7 @@ class CoverageAnalyzeTargetsConfig(CoverageAnalyzeConfig):
    def __init__(self, args):  # type: (t.Any) -> None
        super().__init__(args)
-       self.info_stderr = True
+       self.display_stderr = True
def make_report(target_indexes, arcs, lines): # type: (TargetIndexes, Arcs, Lines) -> t.Dict[str, t.Any] def make_report(target_indexes, arcs, lines): # type: (TargetIndexes, Arcs, Lines) -> t.Dict[str, t.Any]

@ -18,11 +18,11 @@ from ...util import (
    ANSIBLE_TEST_TOOLS_ROOT,
    display,
    ApplicationError,
+   raw_command,
)
from ...util_common import (
    ResultType,
-   run_command,
    write_json_file,
    write_json_test_results,
)
@ -194,7 +194,7 @@ def _command_coverage_combine_powershell(args): # type: (CoverageCombineConfig)
    cmd = ['pwsh', os.path.join(ANSIBLE_TEST_TOOLS_ROOT, 'coverage_stub.ps1')]
    cmd.extend(source_paths)
-   stubs = json.loads(run_command(args, cmd, capture=True, always=True)[0])
+   stubs = json.loads(raw_command(cmd, capture=True)[0])
    return dict((d['Path'], dict((line, 0) for line in d['Lines'])) for d in stubs)

@ -619,7 +619,7 @@ def command_integration_script(
            cmd += ['-e', '@%s' % config_path]
        env.update(coverage_manager.get_environment(target.name, target.aliases))
-       cover_python(args, host_state.controller_profile.python, cmd, target.name, env, cwd=cwd)
+       cover_python(args, host_state.controller_profile.python, cmd, target.name, env, cwd=cwd, capture=False)
def command_integration_role(
@ -738,7 +738,7 @@ def command_integration_role(
        env['ANSIBLE_ROLES_PATH'] = test_env.targets_dir
        env.update(coverage_manager.get_environment(target.name, target.aliases))
-       cover_python(args, host_state.controller_profile.python, cmd, target.name, env, cwd=cwd)
+       cover_python(args, host_state.controller_profile.python, cmd, target.name, env, cwd=cwd, capture=False)
def run_setup_targets( def run_setup_targets(

@ -21,6 +21,7 @@ from ....target import (
from ....core_ci import ( from ....core_ci import (
AnsibleCoreCI, AnsibleCoreCI,
CloudResource,
) )
from ....host_configs import ( from ....host_configs import (
@ -91,7 +92,7 @@ class AwsCloudProvider(CloudProvider):
def _create_ansible_core_ci(self): # type: () -> AnsibleCoreCI def _create_ansible_core_ci(self): # type: () -> AnsibleCoreCI
"""Return an AWS instance of AnsibleCoreCI.""" """Return an AWS instance of AnsibleCoreCI."""
return AnsibleCoreCI(self.args, 'aws', 'aws', 'aws', persist=False) return AnsibleCoreCI(self.args, CloudResource(platform='aws'))
class AwsCloudEnvironment(CloudEnvironment): class AwsCloudEnvironment(CloudEnvironment):

@ -19,6 +19,7 @@ from ....target import (
from ....core_ci import ( from ....core_ci import (
AnsibleCoreCI, AnsibleCoreCI,
CloudResource,
) )
from . import ( from . import (
@ -97,7 +98,7 @@ class AzureCloudProvider(CloudProvider):
def _create_ansible_core_ci(self): # type: () -> AnsibleCoreCI def _create_ansible_core_ci(self): # type: () -> AnsibleCoreCI
"""Return an Azure instance of AnsibleCoreCI.""" """Return an Azure instance of AnsibleCoreCI."""
return AnsibleCoreCI(self.args, 'azure', 'azure', 'azure', persist=False) return AnsibleCoreCI(self.args, CloudResource(platform='azure'))
class AzureCloudEnvironment(CloudEnvironment): class AzureCloudEnvironment(CloudEnvironment):

@ -106,7 +106,7 @@ class CsCloudProvider(CloudProvider):
# apply work-around for OverlayFS issue # apply work-around for OverlayFS issue
# https://github.com/docker/for-linux/issues/72#issuecomment-319904698 # https://github.com/docker/for-linux/issues/72#issuecomment-319904698
docker_exec(self.args, self.DOCKER_SIMULATOR_NAME, ['find', '/var/lib/mysql', '-type', 'f', '-exec', 'touch', '{}', ';']) docker_exec(self.args, self.DOCKER_SIMULATOR_NAME, ['find', '/var/lib/mysql', '-type', 'f', '-exec', 'touch', '{}', ';'], capture=True)
if self.args.explain: if self.args.explain:
values = dict( values = dict(

@ -18,6 +18,7 @@ from ....target import (
from ....core_ci import ( from ....core_ci import (
AnsibleCoreCI, AnsibleCoreCI,
CloudResource,
) )
from . import ( from . import (
@ -78,7 +79,7 @@ class HcloudCloudProvider(CloudProvider):
def _create_ansible_core_ci(self): # type: () -> AnsibleCoreCI def _create_ansible_core_ci(self): # type: () -> AnsibleCoreCI
"""Return a Heztner instance of AnsibleCoreCI.""" """Return a Heztner instance of AnsibleCoreCI."""
return AnsibleCoreCI(self.args, 'hetzner', 'hetzner', 'hetzner', persist=False) return AnsibleCoreCI(self.args, CloudResource(platform='hetzner'))
class HcloudCloudEnvironment(CloudEnvironment): class HcloudCloudEnvironment(CloudEnvironment):

@ -118,7 +118,7 @@ class CoverageHandler(t.Generic[THostConfig], metaclass=abc.ABCMeta):
    def run_playbook(self, playbook, variables):  # type: (str, t.Dict[str, str]) -> None
        """Run the specified playbook using the current inventory."""
        self.create_inventory()
-       run_playbook(self.args, self.inventory_path, playbook, variables)
+       run_playbook(self.args, self.inventory_path, playbook, capture=False, variables=variables)
class PosixCoverageHandler(CoverageHandler[PosixConfig]):

@ -10,6 +10,7 @@ from ...config import (
from ...util import ( from ...util import (
cache, cache,
detect_architecture,
display, display,
get_type_map, get_type_map,
) )
@ -223,6 +224,14 @@ class NetworkInventoryTargetFilter(TargetFilter[NetworkInventoryConfig]):
class OriginTargetFilter(PosixTargetFilter[OriginConfig]): class OriginTargetFilter(PosixTargetFilter[OriginConfig]):
"""Target filter for localhost.""" """Target filter for localhost."""
def filter_targets(self, targets, exclude): # type: (t.List[IntegrationTarget], t.Set[str]) -> None
"""Filter the list of targets, adding any which this host profile cannot support to the provided exclude list."""
super().filter_targets(targets, exclude)
arch = detect_architecture(self.config.python.path)
if arch:
self.skip(f'skip/{arch}', f'which are not supported by {arch}', targets, exclude)
@cache @cache
@ -247,10 +256,7 @@ def get_target_filter(args, configs, controller): # type: (IntegrationConfig, t
def get_remote_skip_aliases(config):  # type: (RemoteConfig) -> t.Dict[str, str]
    """Return a dictionary of skip aliases and the reason why they apply."""
-   if isinstance(config, PosixRemoteConfig):
-       return get_platform_skip_aliases(config.platform, config.version, config.arch)
-   return get_platform_skip_aliases(config.platform, config.version, None)
+   return get_platform_skip_aliases(config.platform, config.version, config.arch)
def get_platform_skip_aliases(platform, version, arch):  # type: (str, str, t.Optional[str]) -> t.Dict[str, str]

@ -179,7 +179,7 @@ def command_sanity(args): # type: (SanityConfig) -> None
    for test in tests:
        if args.list_tests:
-           display.info(test.name)
+           print(test.name)  # display goes to stderr, this should be on stdout
            continue
        for version in SUPPORTED_PYTHON_VERSIONS:
@ -952,6 +952,7 @@ class SanityCodeSmellTest(SanitySingleVersion):
        cmd = [python.path, self.path]
        env = ansible_environment(args, color=False)
+       env.update(PYTHONUTF8='1')  # force all code-smell sanity tests to run with Python UTF-8 Mode enabled
        pattern = None
        data = None

@ -141,7 +141,7 @@ class PylintTest(SanitySingleVersion):
        if data_context().content.collection:
            try:
-               collection_detail = get_collection_detail(args, python)
+               collection_detail = get_collection_detail(python)
                if not collection_detail.version:
                    display.warning('Skipping pylint collection version checks since no collection version was found.')

@ -121,7 +121,7 @@ class ValidateModulesTest(SanitySingleVersion):
            cmd.extend(['--collection', data_context().content.collection.directory])
            try:
-               collection_detail = get_collection_detail(args, python)
+               collection_detail = get_collection_detail(python)
                if collection_detail.version:
                    cmd.extend(['--collection-version', collection_detail.version])

@ -2,6 +2,7 @@
from __future__ import annotations from __future__ import annotations
import os import os
import sys
import typing as t import typing as t
from ...util import ( from ...util import (
@ -38,12 +39,20 @@ from ...host_configs import (
OriginConfig, OriginConfig,
) )
from ...inventory import (
create_controller_inventory,
create_posix_inventory,
)
def command_shell(args): # type: (ShellConfig) -> None def command_shell(args): # type: (ShellConfig) -> None
"""Entry point for the `shell` command.""" """Entry point for the `shell` command."""
if args.raw and isinstance(args.targets[0], ControllerConfig): if args.raw and isinstance(args.targets[0], ControllerConfig):
raise ApplicationError('The --raw option has no effect on the controller.') raise ApplicationError('The --raw option has no effect on the controller.')
if not args.export and not args.cmd and not sys.stdin.isatty():
raise ApplicationError('Standard input must be a TTY to launch a shell.')
host_state = prepare_profiles(args, skip_setup=args.raw) # shell host_state = prepare_profiles(args, skip_setup=args.raw) # shell
if args.delegate: if args.delegate:
@ -57,10 +66,25 @@ def command_shell(args): # type: (ShellConfig) -> None
if isinstance(target_profile, ControllerProfile): if isinstance(target_profile, ControllerProfile):
# run the shell locally unless a target was requested # run the shell locally unless a target was requested
con = LocalConnection(args) # type: Connection con = LocalConnection(args) # type: Connection
if args.export:
display.info('Configuring controller inventory.', verbosity=1)
create_controller_inventory(args, args.export, host_state.controller_profile)
else: else:
# a target was requested, connect to it over SSH # a target was requested, connect to it over SSH
con = target_profile.get_controller_target_connections()[0] con = target_profile.get_controller_target_connections()[0]
if args.export:
display.info('Configuring target inventory.', verbosity=1)
create_posix_inventory(args, args.export, host_state.target_profiles, True)
if args.export:
return
if args.cmd:
con.run(args.cmd, capture=False, interactive=False, force_stdout=True)
return
if isinstance(con, SshConnection) and args.raw: if isinstance(con, SshConnection) and args.raw:
cmd = [] # type: t.List[str] cmd = [] # type: t.List[str]
elif isinstance(target_profile, PosixProfile): elif isinstance(target_profile, PosixProfile):
@ -87,4 +111,4 @@ def command_shell(args): # type: (ShellConfig) -> None
else: else:
cmd = [] cmd = []
con.run(cmd) con.run(cmd, capture=False, interactive=True)

@ -280,7 +280,7 @@ def command_units(args): # type: (UnitsConfig) -> None
        display.info('Unit test %s with Python %s' % (test_context, python.version))
        try:
-           cover_python(args, python, cmd, test_context, env)
+           cover_python(args, python, cmd, test_context, env, capture=False)
        except SubprocessError as ex:
            # pytest exits with status code 5 when all tests are skipped, which isn't an error for our use case
            if ex.status != 5:

@ -79,6 +79,7 @@ class PythonCompletionConfig(PosixCompletionConfig, metaclass=abc.ABCMeta):
class RemoteCompletionConfig(CompletionConfig): class RemoteCompletionConfig(CompletionConfig):
"""Base class for completion configuration of remote environments provisioned through Ansible Core CI.""" """Base class for completion configuration of remote environments provisioned through Ansible Core CI."""
provider: t.Optional[str] = None provider: t.Optional[str] = None
arch: t.Optional[str] = None
@property @property
def platform(self): def platform(self):
@ -99,6 +100,9 @@ class RemoteCompletionConfig(CompletionConfig):
if not self.provider: if not self.provider:
raise Exception(f'Remote completion entry "{self.name}" must provide a "provider" setting.') raise Exception(f'Remote completion entry "{self.name}" must provide a "provider" setting.')
if not self.arch:
raise Exception(f'Remote completion entry "{self.name}" must provide a "arch" setting.')
@dataclasses.dataclass(frozen=True) @dataclasses.dataclass(frozen=True)
class InventoryCompletionConfig(CompletionConfig): class InventoryCompletionConfig(CompletionConfig):
@ -152,6 +156,11 @@ class NetworkRemoteCompletionConfig(RemoteCompletionConfig):
"""Configuration for remote network platforms.""" """Configuration for remote network platforms."""
collection: str = '' collection: str = ''
connection: str = '' connection: str = ''
placeholder: bool = False
def __post_init__(self):
if not self.placeholder:
super().__post_init__()
@dataclasses.dataclass(frozen=True) @dataclasses.dataclass(frozen=True)
@ -160,6 +169,9 @@ class PosixRemoteCompletionConfig(RemoteCompletionConfig, PythonCompletionConfig
placeholder: bool = False placeholder: bool = False
def __post_init__(self): def __post_init__(self):
if not self.placeholder:
super().__post_init__()
if not self.supported_pythons: if not self.supported_pythons:
if self.version and not self.placeholder: if self.version and not self.placeholder:
raise Exception(f'POSIX remote completion entry "{self.name}" must provide a "python" setting.') raise Exception(f'POSIX remote completion entry "{self.name}" must provide a "python" setting.')

@ -48,29 +48,6 @@ class TerminateMode(enum.Enum):
return self.name.lower() return self.name.lower()
class ParsedRemote:
"""A parsed version of a "remote" string."""
def __init__(self, arch, platform, version): # type: (t.Optional[str], str, str) -> None
self.arch = arch
self.platform = platform
self.version = version
@staticmethod
def parse(value): # type: (str) -> t.Optional['ParsedRemote']
"""Return a ParsedRemote from the given value or None if the syntax is invalid."""
parts = value.split('/')
if len(parts) == 2:
arch = None
platform, version = parts
elif len(parts) == 3:
arch, platform, version = parts
else:
return None
return ParsedRemote(arch, platform, version)
class EnvironmentConfig(CommonConfig): class EnvironmentConfig(CommonConfig):
"""Configuration common to all commands which execute in an environment.""" """Configuration common to all commands which execute in an environment."""
def __init__(self, args, command): # type: (t.Any, str) -> None def __init__(self, args, command): # type: (t.Any, str) -> None
@ -237,7 +214,12 @@ class ShellConfig(EnvironmentConfig):
def __init__(self, args): # type: (t.Any) -> None def __init__(self, args): # type: (t.Any) -> None
super().__init__(args, 'shell') super().__init__(args, 'shell')
self.cmd = args.cmd # type: t.List[str]
self.raw = args.raw # type: bool self.raw = args.raw # type: bool
self.check_layout = self.delegate # allow shell to be used without a valid layout as long as no delegation is required
self.interactive = True
self.export = args.export # type: t.Optional[str]
self.display_stderr = True
class SanityConfig(TestConfig): class SanityConfig(TestConfig):
@ -253,7 +235,7 @@ class SanityConfig(TestConfig):
self.keep_git = args.keep_git # type: bool self.keep_git = args.keep_git # type: bool
self.prime_venvs = args.prime_venvs # type: bool self.prime_venvs = args.prime_venvs # type: bool
self.info_stderr = self.lint self.display_stderr = self.lint or self.list_tests
if self.keep_git: if self.keep_git:
def git_callback(files): # type: (t.List[t.Tuple[str, str]]) -> None def git_callback(files): # type: (t.List[t.Tuple[str, str]]) -> None
@ -292,7 +274,7 @@ class IntegrationConfig(TestConfig):
if self.list_targets: if self.list_targets:
self.explain = True self.explain = True
self.info_stderr = True self.display_stderr = True
def get_ansible_config(self): # type: () -> str def get_ansible_config(self): # type: () -> str
"""Return the path to the Ansible config for the given config.""" """Return the path to the Ansible config for the given config."""

@ -3,7 +3,6 @@ from __future__ import annotations
import abc import abc
import shlex import shlex
import sys
import tempfile import tempfile
import typing as t import typing as t
@ -46,10 +45,12 @@ class Connection(metaclass=abc.ABCMeta):
@abc.abstractmethod @abc.abstractmethod
def run(self, def run(self,
command, # type: t.List[str] command, # type: t.List[str]
capture=False, # type: bool capture, # type: bool
interactive=False, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
force_stdout=False, # type: bool
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified command and return the result.""" """Run the specified command and return the result."""
@ -60,7 +61,7 @@ class Connection(metaclass=abc.ABCMeta):
"""Extract the given archive file stream in the specified directory.""" """Extract the given archive file stream in the specified directory."""
tar_cmd = ['tar', 'oxzf', '-', '-C', chdir] tar_cmd = ['tar', 'oxzf', '-', '-C', chdir]
retry(lambda: self.run(tar_cmd, stdin=src)) retry(lambda: self.run(tar_cmd, stdin=src, capture=True))
def create_archive(self, def create_archive(self,
chdir, # type: str chdir, # type: str
@ -82,7 +83,7 @@ class Connection(metaclass=abc.ABCMeta):
sh_cmd = ['sh', '-c', ' | '.join(' '.join(shlex.quote(cmd) for cmd in command) for command in commands)] sh_cmd = ['sh', '-c', ' | '.join(' '.join(shlex.quote(cmd) for cmd in command) for command in commands)]
retry(lambda: self.run(sh_cmd, stdout=dst)) retry(lambda: self.run(sh_cmd, stdout=dst, capture=True))
class LocalConnection(Connection): class LocalConnection(Connection):
@ -92,10 +93,12 @@ class LocalConnection(Connection):
def run(self, def run(self,
command, # type: t.List[str] command, # type: t.List[str]
capture=False, # type: bool capture, # type: bool
interactive=False, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
force_stdout=False, # type: bool
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified command and return the result.""" """Run the specified command and return the result."""
return run_command( return run_command(
@ -105,6 +108,8 @@ class LocalConnection(Connection):
data=data, data=data,
stdin=stdin, stdin=stdin,
stdout=stdout, stdout=stdout,
interactive=interactive,
force_stdout=force_stdout,
) )
@ -130,10 +135,12 @@ class SshConnection(Connection):
def run(self, def run(self,
command, # type: t.List[str] command, # type: t.List[str]
capture=False, # type: bool capture, # type: bool
interactive=False, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
force_stdout=False, # type: bool
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified command and return the result.""" """Run the specified command and return the result."""
options = list(self.options) options = list(self.options)
@ -143,7 +150,7 @@ class SshConnection(Connection):
options.append('-q') options.append('-q')
if not data and not stdin and not stdout and sys.stdin.isatty(): if interactive:
options.append('-tt') options.append('-tt')
with tempfile.NamedTemporaryFile(prefix='ansible-test-ssh-debug-', suffix='.log') as ssh_logfile: with tempfile.NamedTemporaryFile(prefix='ansible-test-ssh-debug-', suffix='.log') as ssh_logfile:
@ -166,6 +173,8 @@ class SshConnection(Connection):
data=data, data=data,
stdin=stdin, stdin=stdin,
stdout=stdout, stdout=stdout,
interactive=interactive,
force_stdout=force_stdout,
error_callback=error_callback, error_callback=error_callback,
) )
@ -208,10 +217,12 @@ class DockerConnection(Connection):
def run(self, def run(self,
command, # type: t.List[str] command, # type: t.List[str]
capture=False, # type: bool capture, # type: bool
interactive=False, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
force_stdout=False, # type: bool
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified command and return the result.""" """Run the specified command and return the result."""
options = [] options = []
@ -219,7 +230,7 @@ class DockerConnection(Connection):
if self.user: if self.user:
options.extend(['--user', self.user]) options.extend(['--user', self.user])
if not data and not stdin and not stdout and sys.stdin.isatty(): if interactive:
options.append('-it') options.append('-it')
return docker_exec( return docker_exec(
@ -231,6 +242,8 @@ class DockerConnection(Connection):
data=data, data=data,
stdin=stdin, stdin=stdin,
stdout=stdout, stdout=stdout,
interactive=interactive,
force_stdout=force_stdout,
) )
def inspect(self): # type: () -> DockerInspect def inspect(self): # type: () -> DockerInspect

@ -794,7 +794,7 @@ def forward_ssh_ports(
    inventory = generate_ssh_inventory(ssh_connections)
    with named_temporary_file(args, 'ssh-inventory-', '.json', None, inventory) as inventory_path:  # type: str
-       run_playbook(args, inventory_path, playbook, dict(hosts_entries=hosts_entries))
+       run_playbook(args, inventory_path, playbook, capture=False, variables=dict(hosts_entries=hosts_entries))
    ssh_processes = []  # type: t.List[SshProcess]
@ -827,7 +827,7 @@ def cleanup_ssh_ports(
    inventory = generate_ssh_inventory(ssh_connections)
    with named_temporary_file(args, 'ssh-inventory-', '.json', None, inventory) as inventory_path:  # type: str
-       run_playbook(args, inventory_path, playbook, dict(hosts_entries=hosts_entries))
+       run_playbook(args, inventory_path, playbook, capture=False, variables=dict(hosts_entries=hosts_entries))
    if ssh_processes:
        for process in ssh_processes:

@ -1,6 +1,8 @@
"""Access Ansible Core CI remote services.""" """Access Ansible Core CI remote services."""
from __future__ import annotations from __future__ import annotations
import abc
import dataclasses
import json import json
import os import os
import re import re
@ -48,6 +50,65 @@ from .data import (
) )
@dataclasses.dataclass(frozen=True)
class Resource(metaclass=abc.ABCMeta):
"""Base class for Ansible Core CI resources."""
@abc.abstractmethod
def as_tuple(self) -> t.Tuple[str, str, str, str]:
"""Return the resource as a tuple of platform, version, architecture and provider."""
@abc.abstractmethod
def get_label(self) -> str:
"""Return a user-friendly label for this resource."""
@property
@abc.abstractmethod
def persist(self) -> bool:
"""True if the resource is persistent, otherwise false."""
@dataclasses.dataclass(frozen=True)
class VmResource(Resource):
"""Details needed to request a VM from Ansible Core CI."""
platform: str
version: str
architecture: str
provider: str
tag: str
def as_tuple(self) -> t.Tuple[str, str, str, str]:
"""Return the resource as a tuple of platform, version, architecture and provider."""
return self.platform, self.version, self.architecture, self.provider
def get_label(self) -> str:
"""Return a user-friendly label for this resource."""
return f'{self.platform} {self.version} ({self.architecture}) [{self.tag}] @{self.provider}'
@property
def persist(self) -> bool:
"""True if the resource is persistent, otherwise false."""
return True
@dataclasses.dataclass(frozen=True)
class CloudResource(Resource):
"""Details needed to request cloud credentials from Ansible Core CI."""
platform: str
def as_tuple(self) -> t.Tuple[str, str, str, str]:
"""Return the resource as a tuple of platform, version, architecture and provider."""
return self.platform, '', '', self.platform
def get_label(self) -> str:
"""Return a user-friendly label for this resource."""
return self.platform
@property
def persist(self) -> bool:
"""True if the resource is persistent, otherwise false."""
return False
class AnsibleCoreCI: class AnsibleCoreCI:
"""Client for Ansible Core CI services.""" """Client for Ansible Core CI services."""
DEFAULT_ENDPOINT = 'https://ansible-core-ci.testing.ansible.com' DEFAULT_ENDPOINT = 'https://ansible-core-ci.testing.ansible.com'
@ -55,16 +116,12 @@ class AnsibleCoreCI:
def __init__( def __init__(
self, self,
args, # type: EnvironmentConfig args, # type: EnvironmentConfig
platform, # type: str resource, # type: Resource
version, # type: str
provider, # type: str
persist=True, # type: bool
load=True, # type: bool load=True, # type: bool
suffix=None, # type: t.Optional[str]
): # type: (...) -> None ): # type: (...) -> None
self.args = args self.args = args
self.platform = platform self.resource = resource
self.version = version self.platform, self.version, self.arch, self.provider = self.resource.as_tuple()
self.stage = args.remote_stage self.stage = args.remote_stage
self.client = HttpClient(args) self.client = HttpClient(args)
self.connection = None self.connection = None
@ -73,35 +130,33 @@ class AnsibleCoreCI:
self.default_endpoint = args.remote_endpoint or self.DEFAULT_ENDPOINT self.default_endpoint = args.remote_endpoint or self.DEFAULT_ENDPOINT
self.retries = 3 self.retries = 3
self.ci_provider = get_ci_provider() self.ci_provider = get_ci_provider()
self.provider = provider self.label = self.resource.get_label()
self.name = '%s-%s' % (self.platform, self.version)
if suffix: stripped_label = re.sub('[^A-Za-z0-9_.]+', '-', self.label).strip('-')
self.name += '-' + suffix
self.path = os.path.expanduser('~/.ansible/test/instances/%s-%s-%s' % (self.name, self.provider, self.stage)) self.name = f"{stripped_label}-{self.stage}" # turn the label into something suitable for use as a filename
self.path = os.path.expanduser(f'~/.ansible/test/instances/{self.name}')
self.ssh_key = SshKey(args) self.ssh_key = SshKey(args)
if persist and load and self._load(): if self.resource.persist and load and self._load():
try: try:
display.info('Checking existing %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Checking existing {self.label} instance using: {self._uri}', verbosity=1)
verbosity=1)
self.connection = self.get(always_raise_on=[404]) self.connection = self.get(always_raise_on=[404])
display.info('Loaded existing %s/%s from: %s' % (self.platform, self.version, self._uri), verbosity=1) display.info(f'Loaded existing {self.label} instance.', verbosity=1)
except HttpError as ex: except HttpError as ex:
if ex.status != 404: if ex.status != 404:
raise raise
self._clear() self._clear()
display.info('Cleared stale %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Cleared stale {self.label} instance.', verbosity=1)
verbosity=1)
self.instance_id = None self.instance_id = None
self.endpoint = None self.endpoint = None
elif not persist: elif not self.resource.persist:
self.instance_id = None self.instance_id = None
self.endpoint = None self.endpoint = None
self._clear() self._clear()
@ -126,8 +181,7 @@ class AnsibleCoreCI:
def start(self): def start(self):
"""Start instance.""" """Start instance."""
if self.started: if self.started:
display.info('Skipping started %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Skipping started {self.label} instance.', verbosity=1)
verbosity=1)
return None return None
return self._start(self.ci_provider.prepare_core_ci_auth()) return self._start(self.ci_provider.prepare_core_ci_auth())
@ -135,22 +189,19 @@ class AnsibleCoreCI:
def stop(self): def stop(self):
"""Stop instance.""" """Stop instance."""
if not self.started: if not self.started:
display.info('Skipping invalid %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Skipping invalid {self.label} instance.', verbosity=1)
verbosity=1)
return return
response = self.client.delete(self._uri) response = self.client.delete(self._uri)
if response.status_code == 404: if response.status_code == 404:
self._clear() self._clear()
display.info('Cleared invalid %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Cleared invalid {self.label} instance.', verbosity=1)
verbosity=1)
return return
if response.status_code == 200: if response.status_code == 200:
self._clear() self._clear()
display.info('Stopped running %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Stopped running {self.label} instance.', verbosity=1)
verbosity=1)
return return
raise self._create_http_error(response) raise self._create_http_error(response)
@ -158,8 +209,7 @@ class AnsibleCoreCI:
def get(self, tries=3, sleep=15, always_raise_on=None): # type: (int, int, t.Optional[t.List[int]]) -> t.Optional[InstanceConnection] def get(self, tries=3, sleep=15, always_raise_on=None): # type: (int, int, t.Optional[t.List[int]]) -> t.Optional[InstanceConnection]
"""Get instance connection information.""" """Get instance connection information."""
if not self.started: if not self.started:
display.info('Skipping invalid %s/%s instance %s.' % (self.platform, self.version, self.instance_id), display.info(f'Skipping invalid {self.label} instance.', verbosity=1)
verbosity=1)
return None return None
if not always_raise_on: if not always_raise_on:
@ -180,7 +230,7 @@ class AnsibleCoreCI:
if not tries or response.status_code in always_raise_on: if not tries or response.status_code in always_raise_on:
raise error raise error
display.warning('%s. Trying again after %d seconds.' % (error, sleep)) display.warning(f'{error}. Trying again after {sleep} seconds.')
time.sleep(sleep) time.sleep(sleep)
if self.args.explain: if self.args.explain:
@ -216,9 +266,7 @@ class AnsibleCoreCI:
status = 'running' if self.connection.running else 'starting' status = 'running' if self.connection.running else 'starting'
display.info('Status update: %s/%s on instance %s is %s.' % display.info(f'The {self.label} instance is {status}.', verbosity=1)
(self.platform, self.version, self.instance_id, status),
verbosity=1)
return self.connection return self.connection
@ -229,16 +277,15 @@ class AnsibleCoreCI:
return return
time.sleep(10) time.sleep(10)
raise ApplicationError('Timeout waiting for %s/%s instance %s.' % raise ApplicationError(f'Timeout waiting for {self.label} instance.')
(self.platform, self.version, self.instance_id))
@property @property
def _uri(self): def _uri(self):
return '%s/%s/%s/%s' % (self.endpoint, self.stage, self.provider, self.instance_id) return f'{self.endpoint}/{self.stage}/{self.provider}/{self.instance_id}'
def _start(self, auth): def _start(self, auth):
"""Start instance.""" """Start instance."""
display.info('Initializing new %s/%s instance %s.' % (self.platform, self.version, self.instance_id), verbosity=1) display.info(f'Initializing new {self.label} instance using: {self._uri}', verbosity=1)
if self.platform == 'windows': if self.platform == 'windows':
winrm_config = read_text_file(os.path.join(ANSIBLE_TEST_TARGET_ROOT, 'setup', 'ConfigureRemotingForAnsible.ps1')) winrm_config = read_text_file(os.path.join(ANSIBLE_TEST_TARGET_ROOT, 'setup', 'ConfigureRemotingForAnsible.ps1'))
@ -249,6 +296,7 @@ class AnsibleCoreCI:
config=dict( config=dict(
platform=self.platform, platform=self.platform,
version=self.version, version=self.version,
architecture=self.arch,
public_key=self.ssh_key.pub_contents, public_key=self.ssh_key.pub_contents,
query=False, query=False,
winrm_config=winrm_config, winrm_config=winrm_config,
@ -266,7 +314,7 @@ class AnsibleCoreCI:
self.started = True self.started = True
self._save() self._save()
display.info('Started %s/%s from: %s' % (self.platform, self.version, self._uri), verbosity=1) display.info(f'Started {self.label} instance.', verbosity=1)
if self.args.explain: if self.args.explain:
return {} return {}
@ -277,8 +325,6 @@ class AnsibleCoreCI:
tries = self.retries tries = self.retries
sleep = 15 sleep = 15
display.info('Trying endpoint: %s' % self.endpoint, verbosity=1)
while True: while True:
tries -= 1 tries -= 1
response = self.client.put(self._uri, data=json.dumps(data), headers=headers) response = self.client.put(self._uri, data=json.dumps(data), headers=headers)
@ -294,7 +340,7 @@ class AnsibleCoreCI:
if not tries: if not tries:
raise error raise error
-display.warning('%s. Trying again after %d seconds.' % (error, sleep))
+display.warning(f'{error}. Trying again after {sleep} seconds.')
time.sleep(sleep) time.sleep(sleep)
def _clear(self): def _clear(self):
@ -345,14 +391,14 @@ class AnsibleCoreCI:
def save(self): # type: () -> t.Dict[str, str] def save(self): # type: () -> t.Dict[str, str]
"""Save instance details and return as a dictionary.""" """Save instance details and return as a dictionary."""
return dict( return dict(
-platform_version='%s/%s' % (self.platform, self.version),
+label=self.resource.get_label(),
instance_id=self.instance_id, instance_id=self.instance_id,
endpoint=self.endpoint, endpoint=self.endpoint,
) )
@staticmethod @staticmethod
def _create_http_error(response): # type: (HttpResponse) -> ApplicationError def _create_http_error(response): # type: (HttpResponse) -> ApplicationError
"""Return an exception created from the given HTTP resposne.""" """Return an exception created from the given HTTP response."""
response_json = response.json() response_json = response.json()
stack_trace = '' stack_trace = ''
@ -369,7 +415,7 @@ class AnsibleCoreCI:
traceback_lines = traceback.format_list(traceback_lines) traceback_lines = traceback.format_list(traceback_lines)
trace = '\n'.join([x.rstrip() for x in traceback_lines]) trace = '\n'.join([x.rstrip() for x in traceback_lines])
-stack_trace = ('\nTraceback (from remote server):\n%s' % trace)
+stack_trace = f'\nTraceback (from remote server):\n{trace}'
else: else:
message = str(response_json) message = str(response_json)
@ -379,7 +425,7 @@ class AnsibleCoreCI:
class CoreHttpError(HttpError): class CoreHttpError(HttpError):
"""HTTP response as an error.""" """HTTP response as an error."""
def __init__(self, status, remote_message, remote_stack_trace): # type: (int, str, str) -> None def __init__(self, status, remote_message, remote_stack_trace): # type: (int, str, str) -> None
-super().__init__(status, '%s%s' % (remote_message, remote_stack_trace))
+super().__init__(status, f'{remote_message}{remote_stack_trace}')
self.remote_message = remote_message self.remote_message = remote_message
self.remote_stack_trace = remote_stack_trace self.remote_stack_trace = remote_stack_trace
@ -388,8 +434,8 @@ class CoreHttpError(HttpError):
class SshKey: class SshKey:
"""Container for SSH key used to connect to remote instances.""" """Container for SSH key used to connect to remote instances."""
KEY_TYPE = 'rsa' # RSA is used to maintain compatibility with paramiko and EC2 KEY_TYPE = 'rsa' # RSA is used to maintain compatibility with paramiko and EC2
-KEY_NAME = 'id_%s' % KEY_TYPE
-PUB_NAME = '%s.pub' % KEY_NAME
+KEY_NAME = f'id_{KEY_TYPE}'
+PUB_NAME = f'{KEY_NAME}.pub'
@mutex @mutex
def __init__(self, args): # type: (EnvironmentConfig) -> None def __init__(self, args): # type: (EnvironmentConfig) -> None
@ -469,7 +515,7 @@ class SshKey:
make_dirs(os.path.dirname(key)) make_dirs(os.path.dirname(key))
if not os.path.isfile(key) or not os.path.isfile(pub): if not os.path.isfile(key) or not os.path.isfile(pub):
-run_command(args, ['ssh-keygen', '-m', 'PEM', '-q', '-t', self.KEY_TYPE, '-N', '', '-f', key])
+run_command(args, ['ssh-keygen', '-m', 'PEM', '-q', '-t', self.KEY_TYPE, '-N', '', '-f', key], capture=True)
if args.explain: if args.explain:
return key, pub return key, pub
@ -502,6 +548,6 @@ class InstanceConnection:
def __str__(self): def __str__(self):
if self.password: if self.password:
-return '%s:%s [%s:%s]' % (self.hostname, self.port, self.username, self.password)
+return f'{self.hostname}:{self.port} [{self.username}:{self.password}]'
-return '%s:%s [%s]' % (self.hostname, self.port, self.username)
+return f'{self.hostname}:{self.port} [{self.username}]'

@ -48,7 +48,7 @@ def cover_python(
cmd, # type: t.List[str] cmd, # type: t.List[str]
target_name, # type: str target_name, # type: str
env, # type: t.Dict[str, str] env, # type: t.Dict[str, str]
-capture=False, # type: bool
+capture, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
cwd=None, # type: t.Optional[str] cwd=None, # type: t.Optional[str]
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]

@ -160,12 +160,13 @@ def delegate_command(args, host_state, exclude, require): # type: (EnvironmentC
os.path.join(content_root, ResultType.COVERAGE.relative_path), os.path.join(content_root, ResultType.COVERAGE.relative_path),
] ]
-con.run(['mkdir', '-p'] + writable_dirs)
-con.run(['chmod', '777'] + writable_dirs)
-con.run(['chmod', '755', working_directory])
-con.run(['chmod', '644', os.path.join(content_root, args.metadata_path)])
-con.run(['useradd', pytest_user, '--create-home'])
-con.run(insert_options(command, options + ['--requirements-mode', 'only']))
+con.run(['mkdir', '-p'] + writable_dirs, capture=True)
+con.run(['chmod', '777'] + writable_dirs, capture=True)
+con.run(['chmod', '755', working_directory], capture=True)
+con.run(['chmod', '644', os.path.join(content_root, args.metadata_path)], capture=True)
+con.run(['useradd', pytest_user, '--create-home'], capture=True)
+con.run(insert_options(command, options + ['--requirements-mode', 'only']), capture=False)
container = con.inspect() container = con.inspect()
networks = container.get_network_names() networks = container.get_network_names()
@ -191,7 +192,7 @@ def delegate_command(args, host_state, exclude, require): # type: (EnvironmentC
success = False success = False
try: try:
-con.run(insert_options(command, options))
+con.run(insert_options(command, options), capture=False, interactive=args.interactive)
success = True success = True
finally: finally:
if host_delegation: if host_delegation:

@ -268,7 +268,7 @@ def docker_pull(args, image): # type: (EnvironmentConfig, str) -> None
for _iteration in range(1, 10): for _iteration in range(1, 10):
try: try:
-docker_command(args, ['pull', image])
+docker_command(args, ['pull', image], capture=False)
return return
except SubprocessError: except SubprocessError:
display.warning('Failed to pull docker image "%s". Waiting a few seconds before trying again.' % image) display.warning('Failed to pull docker image "%s". Waiting a few seconds before trying again.' % image)
@ -279,7 +279,7 @@ def docker_pull(args, image): # type: (EnvironmentConfig, str) -> None
def docker_cp_to(args, container_id, src, dst): # type: (EnvironmentConfig, str, str, str) -> None def docker_cp_to(args, container_id, src, dst): # type: (EnvironmentConfig, str, str, str) -> None
"""Copy a file to the specified container.""" """Copy a file to the specified container."""
-docker_command(args, ['cp', src, '%s:%s' % (container_id, dst)])
+docker_command(args, ['cp', src, '%s:%s' % (container_id, dst)], capture=True)
def docker_run( def docker_run(
@ -510,10 +510,12 @@ def docker_exec(
args, # type: EnvironmentConfig args, # type: EnvironmentConfig
container_id, # type: str container_id, # type: str
cmd, # type: t.List[str] cmd, # type: t.List[str]
capture, # type: bool
options=None, # type: t.Optional[t.List[str]] options=None, # type: t.Optional[t.List[str]]
capture=False, # type: bool
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
interactive=False, # type: bool
force_stdout=False, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Execute the given command in the specified container.""" """Execute the given command in the specified container."""
@ -523,7 +525,8 @@ def docker_exec(
if data or stdin or stdout: if data or stdin or stdout:
options.append('-i') options.append('-i')
-return docker_command(args, ['exec'] + options + [container_id] + cmd, capture=capture, stdin=stdin, stdout=stdout, data=data)
+return docker_command(args, ['exec'] + options + [container_id] + cmd, capture=capture, stdin=stdin, stdout=stdout, interactive=interactive,
+                      force_stdout=force_stdout, data=data)
def docker_info(args): # type: (CommonConfig) -> t.Dict[str, t.Any] def docker_info(args): # type: (CommonConfig) -> t.Dict[str, t.Any]
@ -541,18 +544,23 @@ def docker_version(args): # type: (CommonConfig) -> t.Dict[str, t.Any]
def docker_command( def docker_command(
args, # type: CommonConfig args, # type: CommonConfig
cmd, # type: t.List[str] cmd, # type: t.List[str]
-capture=False, # type: bool
+capture, # type: bool
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
interactive=False, # type: bool
force_stdout=False, # type: bool
always=False, # type: bool always=False, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified docker command.""" """Run the specified docker command."""
env = docker_environment() env = docker_environment()
command = [require_docker().command] command = [require_docker().command]
if command[0] == 'podman' and _get_podman_remote(): if command[0] == 'podman' and _get_podman_remote():
command.append('--remote') command.append('--remote')
-return run_command(args, command + cmd, env=env, capture=capture, stdin=stdin, stdout=stdout, always=always, data=data)
+return run_command(args, command + cmd, env=env, capture=capture, stdin=stdin, stdout=stdout, interactive=interactive, always=always,
+                   force_stdout=force_stdout, data=data)
def docker_environment(): # type: () -> t.Dict[str, str] def docker_environment(): # type: () -> t.Dict[str, str]

@ -39,6 +39,7 @@ from .util import (
get_available_python_versions, get_available_python_versions,
str_to_version, str_to_version,
version_to_str, version_to_str,
Architecture,
) )
@ -206,6 +207,7 @@ class RemoteConfig(HostConfig, metaclass=abc.ABCMeta):
"""Base class for remote host configuration.""" """Base class for remote host configuration."""
name: t.Optional[str] = None name: t.Optional[str] = None
provider: t.Optional[str] = None provider: t.Optional[str] = None
arch: t.Optional[str] = None
@property @property
def platform(self): # type: () -> str def platform(self): # type: () -> str
@ -227,6 +229,7 @@ class RemoteConfig(HostConfig, metaclass=abc.ABCMeta):
self.provider = None self.provider = None
self.provider = self.provider or defaults.provider or 'aws' self.provider = self.provider or defaults.provider or 'aws'
self.arch = self.arch or defaults.arch or Architecture.X86_64
@property @property
def is_managed(self): # type: () -> bool def is_managed(self): # type: () -> bool
@ -330,8 +333,6 @@ class DockerConfig(ControllerHostConfig, PosixConfig):
@dataclasses.dataclass @dataclasses.dataclass
class PosixRemoteConfig(RemoteConfig, ControllerHostConfig, PosixConfig): class PosixRemoteConfig(RemoteConfig, ControllerHostConfig, PosixConfig):
"""Configuration for a POSIX remote host.""" """Configuration for a POSIX remote host."""
arch: t.Optional[str] = None
def get_defaults(self, context): # type: (HostContext) -> PosixRemoteCompletionConfig def get_defaults(self, context): # type: (HostContext) -> PosixRemoteCompletionConfig
"""Return the default settings.""" """Return the default settings."""
return filter_completion(remote_completion()).get(self.name) or remote_completion().get(self.platform) or PosixRemoteCompletionConfig( return filter_completion(remote_completion()).get(self.name) or remote_completion().get(self.platform) or PosixRemoteCompletionConfig(
@ -388,6 +389,7 @@ class NetworkRemoteConfig(RemoteConfig, NetworkConfig):
"""Return the default settings.""" """Return the default settings."""
return filter_completion(network_completion()).get(self.name) or NetworkRemoteCompletionConfig( return filter_completion(network_completion()).get(self.name) or NetworkRemoteCompletionConfig(
name=self.name, name=self.name,
placeholder=True,
) )
def apply_defaults(self, context, defaults): # type: (HostContext, CompletionConfig) -> None def apply_defaults(self, context, defaults): # type: (HostContext, CompletionConfig) -> None

@ -40,6 +40,7 @@ from .host_configs import (
from .core_ci import ( from .core_ci import (
AnsibleCoreCI, AnsibleCoreCI,
SshKey, SshKey,
VmResource,
) )
from .util import ( from .util import (
@ -50,6 +51,7 @@ from .util import (
get_type_map, get_type_map,
sanitize_host_name, sanitize_host_name,
sorted_versions, sorted_versions,
InternalError,
) )
from .util_common import ( from .util_common import (
@ -148,7 +150,7 @@ class Inventory:
inventory_text = inventory_text.strip() inventory_text = inventory_text.strip()
if not args.explain: if not args.explain:
-write_text_file(path, inventory_text)
+write_text_file(path, inventory_text + '\n')
display.info(f'>>> Inventory\n{inventory_text}', verbosity=3) display.info(f'>>> Inventory\n{inventory_text}', verbosity=3)
@ -295,12 +297,18 @@ class RemoteProfile(SshTargetHostProfile[TRemoteConfig], metaclass=abc.ABCMeta):
def create_core_ci(self, load): # type: (bool) -> AnsibleCoreCI def create_core_ci(self, load): # type: (bool) -> AnsibleCoreCI
"""Create and return an AnsibleCoreCI instance.""" """Create and return an AnsibleCoreCI instance."""
if not self.config.arch:
raise InternalError(f'No arch specified for config: {self.config}')
return AnsibleCoreCI( return AnsibleCoreCI(
args=self.args, args=self.args,
-platform=self.config.platform,
-version=self.config.version,
-provider=self.config.provider,
-suffix='controller' if self.controller else 'target',
+resource=VmResource(
+    platform=self.config.platform,
+    version=self.config.version,
+    architecture=self.config.arch,
+    provider=self.config.provider,
+    tag='controller' if self.controller else 'target',
+),
load=load, load=load,
) )
@ -362,7 +370,7 @@ class DockerProfile(ControllerHostProfile[DockerConfig], SshTargetHostProfile[Do
setup_sh = bootstrapper.get_script() setup_sh = bootstrapper.get_script()
shell = setup_sh.splitlines()[0][2:] shell = setup_sh.splitlines()[0][2:]
-docker_exec(self.args, self.container_name, [shell], data=setup_sh)
+docker_exec(self.args, self.container_name, [shell], data=setup_sh, capture=False)
def deprovision(self): # type: () -> None def deprovision(self): # type: () -> None
"""Deprovision the host after delegation has completed.""" """Deprovision the host after delegation has completed."""
@ -484,8 +492,9 @@ class NetworkRemoteProfile(RemoteProfile[NetworkRemoteConfig]):
for dummy in range(1, 90): for dummy in range(1, 90):
try: try:
-intercept_python(self.args, self.args.controller_python, cmd, env)
-except SubprocessError:
+intercept_python(self.args, self.args.controller_python, cmd, env, capture=True)
+except SubprocessError as ex:
+    display.warning(str(ex))
time.sleep(10) time.sleep(10)
else: else:
return return
@ -547,7 +556,7 @@ class PosixRemoteProfile(ControllerHostProfile[PosixRemoteConfig], RemoteProfile
shell = setup_sh.splitlines()[0][2:] shell = setup_sh.splitlines()[0][2:]
ssh = self.get_origin_controller_connection() ssh = self.get_origin_controller_connection()
-ssh.run([shell], data=setup_sh)
+ssh.run([shell], data=setup_sh, capture=False)
def get_ssh_connection(self): # type: () -> SshConnection def get_ssh_connection(self): # type: () -> SshConnection
"""Return an SSH connection for accessing the host.""" """Return an SSH connection for accessing the host."""
@ -570,6 +579,8 @@ class PosixRemoteProfile(ControllerHostProfile[PosixRemoteConfig], RemoteProfile
become = Sudo() become = Sudo()
elif self.config.platform == 'rhel': elif self.config.platform == 'rhel':
become = Sudo() become = Sudo()
elif self.config.platform == 'ubuntu':
become = Sudo()
else: else:
raise NotImplementedError(f'Become support has not been implemented for platform "{self.config.platform}" and user "{settings.user}" is not root.') raise NotImplementedError(f'Become support has not been implemented for platform "{self.config.platform}" and user "{settings.user}" is not root.')
@ -717,8 +728,9 @@ class WindowsRemoteProfile(RemoteProfile[WindowsRemoteConfig]):
for dummy in range(1, 120): for dummy in range(1, 120):
try: try:
-intercept_python(self.args, self.args.controller_python, cmd, env)
-except SubprocessError:
+intercept_python(self.args, self.args.controller_python, cmd, env, capture=True)
+except SubprocessError as ex:
+    display.warning(str(ex))
time.sleep(10) time.sleep(10)
else: else:
return return

@ -126,7 +126,8 @@ def configure_target_pypi_proxy(args, profile, pypi_endpoint, pypi_hostname): #
force = 'yes' if profile.config.is_managed else 'no' force = 'yes' if profile.config.is_managed else 'no'
-run_playbook(args, inventory_path, 'pypi_proxy_prepare.yml', dict(pypi_endpoint=pypi_endpoint, pypi_hostname=pypi_hostname, force=force), capture=True)
+run_playbook(args, inventory_path, 'pypi_proxy_prepare.yml', capture=True, variables=dict(
+    pypi_endpoint=pypi_endpoint, pypi_hostname=pypi_hostname, force=force))
atexit.register(cleanup_pypi_proxy) atexit.register(cleanup_pypi_proxy)

@ -261,7 +261,7 @@ def run_pip(
if not args.explain: if not args.explain:
try: try:
-connection.run([python.path], data=script)
+connection.run([python.path], data=script, capture=False)
except SubprocessError: except SubprocessError:
script = prepare_pip_script([PipVersion()]) script = prepare_pip_script([PipVersion()])

@ -265,10 +265,10 @@ class TestFailure(TestResult):
message = 'The test `%s` failed. See stderr output for details.' % command message = 'The test `%s` failed. See stderr output for details.' % command
path = '' path = ''
message = TestMessage(message, path) message = TestMessage(message, path)
-print(message)
+print(message) # display goes to stderr, this should be on stdout
else: else:
for message in self.messages: for message in self.messages:
-print(message)
+print(message) # display goes to stderr, this should be on stdout
def write_junit(self, args): # type: (TestConfig) -> None def write_junit(self, args): # type: (TestConfig) -> None
"""Write results to a junit XML file.""" """Write results to a junit XML file."""

@ -1,12 +1,15 @@
"""Miscellaneous utility functions and classes.""" """Miscellaneous utility functions and classes."""
from __future__ import annotations from __future__ import annotations
import abc
import errno import errno
import fcntl import fcntl
import importlib.util import importlib.util
import inspect import inspect
import json
import keyword import keyword
import os import os
import platform
import pkgutil import pkgutil
import random import random
import re import re
@ -41,6 +44,7 @@ from .io import (
from .thread import ( from .thread import (
mutex, mutex,
WrappedThread,
) )
from .constants import ( from .constants import (
@ -96,6 +100,18 @@ MODE_DIRECTORY = MODE_READ | stat.S_IWUSR | stat.S_IXUSR | stat.S_IXGRP | stat.S
MODE_DIRECTORY_WRITE = MODE_DIRECTORY | stat.S_IWGRP | stat.S_IWOTH MODE_DIRECTORY_WRITE = MODE_DIRECTORY | stat.S_IWGRP | stat.S_IWOTH
class Architecture:
"""
Normalized architecture names.
These are the architectures supported by ansible-test, such as when provisioning remote instances.
"""
X86_64 = 'x86_64'
AARCH64 = 'aarch64'
REMOTE_ARCHITECTURES = list(value for key, value in Architecture.__dict__.items() if not key.startswith('__'))
def is_valid_identifier(value: str) -> bool: def is_valid_identifier(value: str) -> bool:
"""Return True if the given value is a valid non-keyword Python identifier, otherwise return False.""" """Return True if the given value is a valid non-keyword Python identifier, otherwise return False."""
return value.isidentifier() and not keyword.iskeyword(value) return value.isidentifier() and not keyword.iskeyword(value)
@ -119,6 +135,58 @@ def cache(func): # type: (t.Callable[[], TValue]) -> t.Callable[[], TValue]
return wrapper return wrapper
@mutex
def detect_architecture(python: str) -> t.Optional[str]:
"""Detect the architecture of the specified Python and return a normalized version, or None if it cannot be determined."""
results: t.Dict[str, t.Optional[str]]
try:
results = detect_architecture.results # type: ignore[attr-defined]
except AttributeError:
results = detect_architecture.results = {} # type: ignore[attr-defined]
if python in results:
return results[python]
if python == sys.executable or os.path.realpath(python) == os.path.realpath(sys.executable):
uname = platform.uname()
else:
data = raw_command([python, '-c', 'import json, platform; print(json.dumps(platform.uname()));'], capture=True)[0]
uname = json.loads(data)
translation = {
'x86_64': Architecture.X86_64, # Linux, macOS
'amd64': Architecture.X86_64, # FreeBSD
'aarch64': Architecture.AARCH64, # Linux, FreeBSD
'arm64': Architecture.AARCH64, # FreeBSD
}
candidates = []
if len(uname) >= 5:
candidates.append(uname[4])
if len(uname) >= 6:
candidates.append(uname[5])
candidates = sorted(set(candidates))
architectures = sorted(set(arch for arch in [translation.get(candidate) for candidate in candidates] if arch))
architecture: t.Optional[str] = None
if not architectures:
display.warning(f'Unable to determine architecture for Python interpreter "{python}" from: {candidates}')
elif len(architectures) == 1:
architecture = architectures[0]
display.info(f'Detected architecture {architecture} for Python interpreter: {python}', verbosity=1)
else:
display.warning(f'Conflicting architectures detected ({architectures}) for Python interpreter "{python}" from: {candidates}')
results[python] = architecture
return architecture
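For reference, the `detect_architecture()` normalization added above boils down to mapping the machine/processor fields reported by `platform.uname()` onto the two supported labels. A minimal standalone sketch of the same idea (hypothetical helper, current interpreter only, not the ansible-test API):

```python
import platform
import typing as t

# Hypothetical standalone helper: map the machine/processor fields reported by
# platform.uname() onto the two architecture labels used above.
UNAME_TO_ARCH = {
    'x86_64': 'x86_64',    # Linux, macOS
    'amd64': 'x86_64',     # FreeBSD
    'aarch64': 'aarch64',  # Linux, FreeBSD
    'arm64': 'aarch64',    # FreeBSD
}


def normalize_local_arch() -> t.Optional[str]:
    """Return the normalized architecture of the current interpreter, or None if it is unknown or ambiguous."""
    uname = platform.uname()
    candidates = {uname.machine, uname.processor}
    matches = {UNAME_TO_ARCH[value] for value in candidates if value in UNAME_TO_ARCH}
    return matches.pop() if len(matches) == 1 else None


if __name__ == '__main__':
    print(normalize_local_arch())
```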
def filter_args(args, filters): # type: (t.List[str], t.Dict[str, int]) -> t.List[str] def filter_args(args, filters): # type: (t.List[str], t.Dict[str, int]) -> t.List[str]
"""Return a filtered version of the given command line arguments.""" """Return a filtered version of the given command line arguments."""
remaining = 0 remaining = 0
@ -254,18 +322,44 @@ def get_available_python_versions(): # type: () -> t.Dict[str, str]
def raw_command( def raw_command(
cmd, # type: t.Iterable[str] cmd, # type: t.Iterable[str]
-capture=False, # type: bool
+capture, # type: bool
env=None, # type: t.Optional[t.Dict[str, str]] env=None, # type: t.Optional[t.Dict[str, str]]
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
cwd=None, # type: t.Optional[str] cwd=None, # type: t.Optional[str]
explain=False, # type: bool explain=False, # type: bool
stdin=None, # type: t.Optional[t.Union[t.IO[bytes], int]] stdin=None, # type: t.Optional[t.Union[t.IO[bytes], int]]
stdout=None, # type: t.Optional[t.Union[t.IO[bytes], int]] stdout=None, # type: t.Optional[t.Union[t.IO[bytes], int]]
interactive=False, # type: bool
force_stdout=False, # type: bool
cmd_verbosity=1, # type: int cmd_verbosity=1, # type: int
str_errors='strict', # type: str str_errors='strict', # type: str
error_callback=None, # type: t.Optional[t.Callable[[SubprocessError], None]] error_callback=None, # type: t.Optional[t.Callable[[SubprocessError], None]]
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified command and return stdout and stderr as a tuple.""" """Run the specified command and return stdout and stderr as a tuple."""
if capture and interactive:
raise InternalError('Cannot combine capture=True with interactive=True.')
if data and interactive:
raise InternalError('Cannot combine data with interactive=True.')
if stdin and interactive:
raise InternalError('Cannot combine stdin with interactive=True.')
if stdout and interactive:
raise InternalError('Cannot combine stdout with interactive=True.')
if stdin and data:
raise InternalError('Cannot combine stdin with data.')
if stdout and not capture:
raise InternalError('Redirection of stdout requires capture=True to avoid redirection of stderr to stdout.')
if force_stdout and capture:
raise InternalError('Cannot combine force_stdout=True with capture=True.')
if force_stdout and interactive:
raise InternalError('Cannot combine force_stdout=True with interactive=True.')
if not cwd: if not cwd:
cwd = os.getcwd() cwd = os.getcwd()
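The net effect of the new guards above is that each caller must pick exactly one I/O mode up front. A simplified sketch of the same validation pattern (stand-in class and function, condensed checks, not the real `raw_command` signature):

```python
import io
import typing as t


class InternalError(Exception):
    """Simplified stand-in for the ansible-test InternalError used above."""


def check_io_mode(capture: bool, interactive: bool = False, data: t.Optional[str] = None,
                  stdin: t.Optional[t.IO[bytes]] = None, stdout: t.Optional[t.IO[bytes]] = None) -> None:
    """Reject the argument combinations that the checks above treat as programming errors."""
    if interactive and (capture or data or stdin or stdout):
        raise InternalError('Interactive commands cannot capture output or redirect stdin/stdout/data.')
    if stdin and data:
        raise InternalError('Cannot combine stdin with data.')
    if stdout and not capture:
        raise InternalError('Redirection of stdout requires capture=True to avoid redirection of stderr to stdout.')


check_io_mode(capture=True, stdout=io.BytesIO())  # ok: captured output with redirected stdout
check_io_mode(capture=False, interactive=True)    # ok: fully interactive
# check_io_mode(capture=True, interactive=True)   # would raise InternalError
```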
@ -276,7 +370,30 @@ def raw_command(
escaped_cmd = ' '.join(shlex.quote(c) for c in cmd) escaped_cmd = ' '.join(shlex.quote(c) for c in cmd)
-display.info('Run command: %s' % escaped_cmd, verbosity=cmd_verbosity, truncate=True)
+if capture:
+    description = 'Run'
+elif interactive:
+    description = 'Interactive'
+else:
+    description = 'Stream'
+description += ' command'
+with_types = []
+if data:
+    with_types.append('data')
+if stdin:
+    with_types.append('stdin')
+if stdout:
+    with_types.append('stdout')
+if with_types:
+    description += f' with {"/".join(with_types)}'
+display.info(f'{description}: {escaped_cmd}', verbosity=cmd_verbosity, truncate=True)
display.info('Working directory: %s' % cwd, verbosity=2) display.info('Working directory: %s' % cwd, verbosity=2)
program = find_executable(cmd[0], cwd=cwd, path=env['PATH'], required='warning') program = find_executable(cmd[0], cwd=cwd, path=env['PATH'], required='warning')
@ -294,17 +411,23 @@ def raw_command(
if stdin is not None: if stdin is not None:
data = None data = None
-    communicate = True
elif data is not None:
    stdin = subprocess.PIPE
    communicate = True
+elif interactive:
+    pass  # allow the subprocess access to our stdin
+else:
+    stdin = subprocess.DEVNULL
-if stdout:
-    communicate = True
-if capture:
+if not interactive:
+    # When not running interactively, send subprocess stdout/stderr through a pipe.
+    # This isolates the stdout/stderr of the subprocess from the current process, and also hides the current TTY from it, if any.
+    # This prevents subprocesses from sharing stdout/stderr with the current process or each other.
+    # Doing so allows subprocesses to safely make changes to their file handles, such as making them non-blocking (ssh does this).
+    # This also maintains consistency between local testing and CI systems, which typically do not provide a TTY.
+    # To maintain output ordering, a single pipe is used for both stdout/stderr when not capturing output.
    stdout = stdout or subprocess.PIPE
-    stderr = subprocess.PIPE
+    stderr = subprocess.PIPE if capture else subprocess.STDOUT
    communicate = True
else:
    stderr = None
@ -324,7 +447,8 @@ def raw_command(
if communicate: if communicate:
data_bytes = to_optional_bytes(data) data_bytes = to_optional_bytes(data)
-stdout_bytes, stderr_bytes = process.communicate(data_bytes)
+stdout_bytes, stderr_bytes = communicate_with_process(process, data_bytes, stdout == subprocess.PIPE, stderr == subprocess.PIPE, capture=capture,
+                                                      force_stdout=force_stdout)
stdout_text = to_optional_text(stdout_bytes, str_errors) or u'' stdout_text = to_optional_text(stdout_bytes, str_errors) or u''
stderr_text = to_optional_text(stderr_bytes, str_errors) or u'' stderr_text = to_optional_text(stderr_bytes, str_errors) or u''
else: else:
@ -347,6 +471,122 @@ def raw_command(
raise SubprocessError(cmd, status, stdout_text, stderr_text, runtime, error_callback) raise SubprocessError(cmd, status, stdout_text, stderr_text, runtime, error_callback)
def communicate_with_process(
process: subprocess.Popen,
stdin: t.Optional[bytes],
stdout: bool,
stderr: bool,
capture: bool,
force_stdout: bool
) -> t.Tuple[bytes, bytes]:
"""Communicate with the specified process, handling stdin/stdout/stderr as requested."""
threads: t.List[WrappedThread] = []
reader: t.Type[ReaderThread]
if capture:
reader = CaptureThread
else:
reader = OutputThread
if stdin is not None:
threads.append(WriterThread(process.stdin, stdin))
if stdout:
stdout_reader = reader(process.stdout, force_stdout)
threads.append(stdout_reader)
else:
stdout_reader = None
if stderr:
stderr_reader = reader(process.stderr, force_stdout)
threads.append(stderr_reader)
else:
stderr_reader = None
for thread in threads:
thread.start()
for thread in threads:
try:
thread.wait_for_result()
except Exception as ex: # pylint: disable=broad-except
display.error(str(ex))
if isinstance(stdout_reader, ReaderThread):
stdout_bytes = b''.join(stdout_reader.lines)
else:
stdout_bytes = b''
if isinstance(stderr_reader, ReaderThread):
stderr_bytes = b''.join(stderr_reader.lines)
else:
stderr_bytes = b''
process.wait()
return stdout_bytes, stderr_bytes
class WriterThread(WrappedThread):
"""Thread to write data to stdin of a subprocess."""
def __init__(self, handle: t.IO[bytes], data: bytes) -> None:
super().__init__(self._run)
self.handle = handle
self.data = data
def _run(self) -> None:
"""Workload to run on a thread."""
try:
self.handle.write(self.data)
self.handle.flush()
finally:
self.handle.close()
class ReaderThread(WrappedThread, metaclass=abc.ABCMeta):
"""Thread to read stdout from a subprocess."""
def __init__(self, handle: t.IO[bytes], force_stdout: bool) -> None:
super().__init__(self._run)
self.handle = handle
self.force_stdout = force_stdout
self.lines = [] # type: t.List[bytes]
@abc.abstractmethod
def _run(self) -> None:
"""Workload to run on a thread."""
class CaptureThread(ReaderThread):
"""Thread to capture stdout from a subprocess into a buffer."""
def _run(self) -> None:
"""Workload to run on a thread."""
src = self.handle
dst = self.lines
try:
for line in src:
dst.append(line)
finally:
src.close()
class OutputThread(ReaderThread):
"""Thread to pass stdout from a subprocess to stdout."""
def _run(self) -> None:
"""Workload to run on a thread."""
src = self.handle
dst = sys.stdout.buffer if self.force_stdout else display.fd.buffer
try:
for line in src:
dst.write(line)
dst.flush()
finally:
src.close()
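The writer/reader threads above follow the standard pattern for avoiding pipe deadlocks: every pipe gets its own thread, so a full OS buffer on one stream can never block the other. A minimal generic sketch of the same idea (hypothetical helper using plain `threading`, not the ansible-test API; the example command assumes a POSIX `echo`):

```python
import subprocess
import threading
import typing as t


def run_and_drain(cmd: t.List[str]) -> t.Tuple[bytes, bytes]:
    """Run a command, draining stdout and stderr on separate threads to avoid pipe deadlocks."""
    process = subprocess.Popen(cmd, stdin=subprocess.DEVNULL, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    chunks: t.Dict[str, bytes] = {}

    def drain(name: str, handle: t.IO[bytes]) -> None:
        chunks[name] = handle.read()  # read until EOF, then close our end of the pipe
        handle.close()

    threads = [threading.Thread(target=drain, args=(name, handle))
               for name, handle in (('out', process.stdout), ('err', process.stderr))]

    for thread in threads:
        thread.start()

    for thread in threads:
        thread.join()

    process.wait()
    return chunks['out'], chunks['err']


print(run_and_drain(['echo', 'hello']))
```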
def common_environment(): def common_environment():
"""Common environment used for executing all programs.""" """Common environment used for executing all programs."""
env = dict( env = dict(
@ -516,7 +756,7 @@ class Display:
self.color = sys.stdout.isatty() self.color = sys.stdout.isatty()
self.warnings = [] self.warnings = []
self.warnings_unique = set() self.warnings_unique = set()
-self.info_stderr = False
+self.fd = sys.stderr # default to stderr until config is initialized to avoid early messages going to stdout
self.rows = 0 self.rows = 0
self.columns = 0 self.columns = 0
self.truncate = 0 self.truncate = 0
@ -528,7 +768,7 @@ class Display:
def __warning(self, message): # type: (str) -> None def __warning(self, message): # type: (str) -> None
"""Internal implementation for displaying a warning message.""" """Internal implementation for displaying a warning message."""
-self.print_message('WARNING: %s' % message, color=self.purple, fd=sys.stderr)
+self.print_message('WARNING: %s' % message, color=self.purple)
def review_warnings(self): # type: () -> None def review_warnings(self): # type: () -> None
"""Review all warnings which previously occurred.""" """Review all warnings which previously occurred."""
@ -556,23 +796,27 @@ class Display:
def notice(self, message): # type: (str) -> None def notice(self, message): # type: (str) -> None
"""Display a notice level message.""" """Display a notice level message."""
-self.print_message('NOTICE: %s' % message, color=self.purple, fd=sys.stderr)
+self.print_message('NOTICE: %s' % message, color=self.purple)
def error(self, message): # type: (str) -> None def error(self, message): # type: (str) -> None
"""Display an error level message.""" """Display an error level message."""
-self.print_message('ERROR: %s' % message, color=self.red, fd=sys.stderr)
+self.print_message('ERROR: %s' % message, color=self.red)
def fatal(self, message): # type: (str) -> None
"""Display a fatal level message."""
self.print_message('FATAL: %s' % message, color=self.red, stderr=True)
def info(self, message, verbosity=0, truncate=False): # type: (str, int, bool) -> None def info(self, message, verbosity=0, truncate=False): # type: (str, int, bool) -> None
"""Display an info level message.""" """Display an info level message."""
if self.verbosity >= verbosity: if self.verbosity >= verbosity:
color = self.verbosity_colors.get(verbosity, self.yellow) color = self.verbosity_colors.get(verbosity, self.yellow)
-self.print_message(message, color=color, fd=sys.stderr if self.info_stderr else sys.stdout, truncate=truncate)
+self.print_message(message, color=color, truncate=truncate)
def print_message( # pylint: disable=locally-disabled, invalid-name def print_message( # pylint: disable=locally-disabled, invalid-name
self, self,
message, # type: str message, # type: str
color=None, # type: t.Optional[str] color=None, # type: t.Optional[str]
-fd=sys.stdout, # type: t.IO[str]
+stderr=False, # type: bool
truncate=False, # type: bool truncate=False, # type: bool
): # type: (...) -> None ): # type: (...) -> None
"""Display a message.""" """Display a message."""
@ -592,10 +836,18 @@ class Display:
message = message.replace(self.clear, color) message = message.replace(self.clear, color)
message = '%s%s%s' % (color, message, self.clear) message = '%s%s%s' % (color, message, self.clear)
fd = sys.stderr if stderr else self.fd
print(message, file=fd) print(message, file=fd)
fd.flush() fd.flush()
class InternalError(Exception):
"""An unhandled internal error indicating a bug in the code."""
def __init__(self, message: str) -> None:
super().__init__(f'An internal error has occurred in ansible-test: {message}')
class ApplicationError(Exception): class ApplicationError(Exception):
"""General application error.""" """General application error."""
@ -648,12 +900,15 @@ class MissingEnvironmentVariable(ApplicationError):
self.name = name self.name = name
-def retry(func, ex_type=SubprocessError, sleep=10, attempts=10):
+def retry(func, ex_type=SubprocessError, sleep=10, attempts=10, warn=True):
"""Retry the specified function on failure.""" """Retry the specified function on failure."""
for dummy in range(1, attempts): for dummy in range(1, attempts):
try: try:
return func() return func()
-except ex_type:
+except ex_type as ex:
+    if warn:
+        display.warning(str(ex))
time.sleep(sleep) time.sleep(sleep)
return func() return func()
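As a usage note, the new `warn` flag only controls whether intermediate failures are surfaced; the final attempt is still made outside the loop, so its exception propagates. A self-contained sketch of the same retry-with-warning pattern (stand-in names, `print` in place of `display.warning`):

```python
import time


def retry(func, ex_type=Exception, sleep=1, attempts=3, warn=True):
    """Retry func, optionally warning on each failure, letting the final attempt raise."""
    for _ in range(1, attempts):
        try:
            return func()
        except ex_type as ex:
            if warn:
                print(f'WARNING: {ex}')
            time.sleep(sleep)
    return func()  # final attempt; any exception propagates to the caller


calls = iter([ValueError('boom'), ValueError('boom'), 'ok'])


def flaky():
    """Fail twice, then succeed, to demonstrate the retry behavior."""
    result = next(calls)
    if isinstance(result, Exception):
        raise result
    return result


print(retry(flaky, ex_type=ValueError, sleep=0))  # prints two warnings, then 'ok'
```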

@ -126,6 +126,8 @@ class CommonConfig:
"""Configuration common to all commands.""" """Configuration common to all commands."""
def __init__(self, args, command): # type: (t.Any, str) -> None def __init__(self, args, command): # type: (t.Any, str) -> None
self.command = command self.command = command
self.interactive = False
self.check_layout = True
self.success = None # type: t.Optional[bool] self.success = None # type: t.Optional[bool]
self.color = args.color # type: bool self.color = args.color # type: bool
@ -135,7 +137,7 @@ class CommonConfig:
self.truncate = args.truncate # type: int self.truncate = args.truncate # type: int
self.redact = args.redact # type: bool self.redact = args.redact # type: bool
-self.info_stderr = False # type: bool
+self.display_stderr = False # type: bool
self.session_name = generate_name() self.session_name = generate_name()
@ -369,7 +371,7 @@ def intercept_python(
python, # type: PythonConfig python, # type: PythonConfig
cmd, # type: t.List[str] cmd, # type: t.List[str]
env, # type: t.Dict[str, str] env, # type: t.Dict[str, str]
-capture=False, # type: bool
+capture, # type: bool
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
cwd=None, # type: t.Optional[str] cwd=None, # type: t.Optional[str]
always=False, # type: bool always=False, # type: bool
@ -399,21 +401,23 @@ def intercept_python(
def run_command( def run_command(
args, # type: CommonConfig args, # type: CommonConfig
cmd, # type: t.Iterable[str] cmd, # type: t.Iterable[str]
-capture=False, # type: bool
+capture, # type: bool
env=None, # type: t.Optional[t.Dict[str, str]] env=None, # type: t.Optional[t.Dict[str, str]]
data=None, # type: t.Optional[str] data=None, # type: t.Optional[str]
cwd=None, # type: t.Optional[str] cwd=None, # type: t.Optional[str]
always=False, # type: bool always=False, # type: bool
stdin=None, # type: t.Optional[t.IO[bytes]] stdin=None, # type: t.Optional[t.IO[bytes]]
stdout=None, # type: t.Optional[t.IO[bytes]] stdout=None, # type: t.Optional[t.IO[bytes]]
interactive=False, # type: bool
force_stdout=False, # type: bool
cmd_verbosity=1, # type: int cmd_verbosity=1, # type: int
str_errors='strict', # type: str str_errors='strict', # type: str
error_callback=None, # type: t.Optional[t.Callable[[SubprocessError], None]] error_callback=None, # type: t.Optional[t.Callable[[SubprocessError], None]]
): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]] ): # type: (...) -> t.Tuple[t.Optional[str], t.Optional[str]]
"""Run the specified command and return stdout and stderr as a tuple.""" """Run the specified command and return stdout and stderr as a tuple."""
explain = args.explain and not always explain = args.explain and not always
-return raw_command(cmd, capture=capture, env=env, data=data, cwd=cwd, explain=explain, stdin=stdin, stdout=stdout,
-                   cmd_verbosity=cmd_verbosity, str_errors=str_errors, error_callback=error_callback)
+return raw_command(cmd, capture=capture, env=env, data=data, cwd=cwd, explain=explain, stdin=stdin, stdout=stdout, interactive=interactive,
+                   force_stdout=force_stdout, cmd_verbosity=cmd_verbosity, str_errors=str_errors, error_callback=error_callback)
def yamlcheck(python): def yamlcheck(python):

@ -20,6 +20,7 @@ from .util import (
remove_tree, remove_tree,
ApplicationError, ApplicationError,
str_to_version, str_to_version,
raw_command,
) )
from .util_common import ( from .util_common import (
@ -92,7 +93,7 @@ def create_virtual_environment(args, # type: EnvironmentConfig
# creating a virtual environment using 'venv' when running in a virtual environment created by 'virtualenv' results # creating a virtual environment using 'venv' when running in a virtual environment created by 'virtualenv' results
# in a copy of the original virtual environment instead of creation of a new one # in a copy of the original virtual environment instead of creation of a new one
# avoid this issue by only using "real" python interpreters to invoke 'venv' # avoid this issue by only using "real" python interpreters to invoke 'venv'
-for real_python in iterate_real_pythons(args, python.version):
+for real_python in iterate_real_pythons(python.version):
if run_venv(args, real_python, system_site_packages, pip, path): if run_venv(args, real_python, system_site_packages, pip, path):
display.info('Created Python %s virtual environment using "venv": %s' % (python.version, path), verbosity=1) display.info('Created Python %s virtual environment using "venv": %s' % (python.version, path), verbosity=1)
return True return True
@ -128,7 +129,7 @@ def create_virtual_environment(args, # type: EnvironmentConfig
return False return False
-def iterate_real_pythons(args, version): # type: (EnvironmentConfig, str) -> t.Iterable[str]
+def iterate_real_pythons(version): # type: (str) -> t.Iterable[str]
""" """
Iterate through available real python interpreters of the requested version. Iterate through available real python interpreters of the requested version.
The current interpreter will be checked and then the path will be searched. The current interpreter will be checked and then the path will be searched.
@ -138,7 +139,7 @@ def iterate_real_pythons(args, version): # type: (EnvironmentConfig, str) -> t.
if version_info == sys.version_info[:len(version_info)]: if version_info == sys.version_info[:len(version_info)]:
current_python = sys.executable current_python = sys.executable
-real_prefix = get_python_real_prefix(args, current_python)
+real_prefix = get_python_real_prefix(current_python)
if real_prefix: if real_prefix:
current_python = find_python(version, os.path.join(real_prefix, 'bin')) current_python = find_python(version, os.path.join(real_prefix, 'bin'))
@ -159,7 +160,7 @@ def iterate_real_pythons(args, version): # type: (EnvironmentConfig, str) -> t.
if found_python == current_python: if found_python == current_python:
return return
-real_prefix = get_python_real_prefix(args, found_python)
+real_prefix = get_python_real_prefix(found_python)
if real_prefix: if real_prefix:
found_python = find_python(version, os.path.join(real_prefix, 'bin')) found_python = find_python(version, os.path.join(real_prefix, 'bin'))
@ -168,12 +169,12 @@ def iterate_real_pythons(args, version): # type: (EnvironmentConfig, str) -> t.
yield found_python yield found_python
-def get_python_real_prefix(args, python_path): # type: (EnvironmentConfig, str) -> t.Optional[str]
+def get_python_real_prefix(python_path): # type: (str) -> t.Optional[str]
""" """
Return the real prefix of the specified interpreter or None if the interpreter is not a virtual environment created by 'virtualenv'. Return the real prefix of the specified interpreter or None if the interpreter is not a virtual environment created by 'virtualenv'.
""" """
cmd = [python_path, os.path.join(os.path.join(ANSIBLE_TEST_TARGET_TOOLS_ROOT, 'virtualenvcheck.py'))] cmd = [python_path, os.path.join(os.path.join(ANSIBLE_TEST_TARGET_TOOLS_ROOT, 'virtualenvcheck.py'))]
-check_result = json.loads(run_command(args, cmd, capture=True, always=True)[0])
+check_result = json.loads(raw_command(cmd, capture=True)[0])
real_prefix = check_result['real_prefix'] real_prefix = check_result['real_prefix']
return real_prefix return real_prefix

@ -47,7 +47,11 @@ def main():
env = os.environ.copy() env = os.environ.copy()
env.update(PYTHONPATH='%s:%s' % (os.path.join(os.path.dirname(__file__), 'changelog'), env['PYTHONPATH'])) env.update(PYTHONPATH='%s:%s' % (os.path.join(os.path.dirname(__file__), 'changelog'), env['PYTHONPATH']))
-subprocess.call(cmd, env=env) # ignore the return code, rely on the output instead
+# ignore the return code, rely on the output instead
+process = subprocess.run(cmd, stdin=subprocess.DEVNULL, capture_output=True, text=True, env=env, check=False)
+sys.stdout.write(process.stdout)
+sys.stderr.write(process.stderr)
if __name__ == '__main__': if __name__ == '__main__':
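The same conversion is applied to the remaining sanity scripts below: `subprocess.Popen` plus `communicate()` becomes a single `subprocess.run()` call with `stdin=subprocess.DEVNULL`, `capture_output=True` and `check=False`. In the general case the before/after looks roughly like this (illustrative command only):

```python
import subprocess

cmd = ['git', 'status', '--porcelain']  # illustrative command only

# Before: manual pipe management and an explicit communicate() call.
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
stdout, stderr = proc.communicate()

# After: one call that also detaches stdin so the child can never wait on a TTY.
result = subprocess.run(cmd, stdin=subprocess.DEVNULL, capture_output=True, text=True, check=False)
stdout, stderr = result.stdout, result.stderr
```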

@ -436,14 +436,13 @@ class ModuleValidator(Validator):
base_path = self._get_base_branch_module_path() base_path = self._get_base_branch_module_path()
command = ['git', 'show', '%s:%s' % (self.base_branch, base_path or self.path)] command = ['git', 'show', '%s:%s' % (self.base_branch, base_path or self.path)]
-p = subprocess.Popen(command, stdout=subprocess.PIPE,
-                     stderr=subprocess.PIPE)
-stdout, stderr = p.communicate()
+p = subprocess.run(command, stdin=subprocess.DEVNULL, capture_output=True, check=False)
if int(p.returncode) != 0: if int(p.returncode) != 0:
return None return None
t = tempfile.NamedTemporaryFile(delete=False) t = tempfile.NamedTemporaryFile(delete=False)
-t.write(stdout)
+t.write(p.stdout)
t.close() t.close()
return t.name return t.name
@ -2456,11 +2455,12 @@ class GitCache:
@staticmethod @staticmethod
def _git(args): def _git(args):
cmd = ['git'] + args cmd = ['git'] + args
-p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
-stdout, stderr = p.communicate()
+p = subprocess.run(cmd, stdin=subprocess.DEVNULL, capture_output=True, text=True, check=False)
if p.returncode != 0: if p.returncode != 0:
-raise GitError(stderr, p.returncode)
+raise GitError(p.stderr, p.returncode)
-return stdout.decode('utf-8').splitlines()
+return p.stdout.splitlines()
class GitError(Exception): class GitError(Exception):

@ -122,14 +122,12 @@ def get_ps_argument_spec(filename, collection):
}) })
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'ps_argspec.ps1') script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'ps_argspec.ps1')
-proc = subprocess.Popen(['pwsh', script_path, util_manifest], stdout=subprocess.PIPE, stderr=subprocess.PIPE,
-                        shell=False)
-stdout, stderr = proc.communicate()
+proc = subprocess.run(['pwsh', script_path, util_manifest], stdin=subprocess.DEVNULL, capture_output=True, text=True, check=False)
if proc.returncode != 0: if proc.returncode != 0:
raise AnsibleModuleImportError("STDOUT:\n%s\nSTDERR:\n%s" % (stdout.decode('utf-8'), stderr.decode('utf-8'))) raise AnsibleModuleImportError("STDOUT:\n%s\nSTDERR:\n%s" % (proc.stdout, proc.stderr))
kwargs = json.loads(stdout) kwargs = json.loads(proc.stdout)
# the validate-modules code expects the options spec to be under the argument_spec key not options as set in PS # the validate-modules code expects the options spec to be under the argument_spec key not options as set in PS
kwargs['argument_spec'] = kwargs.pop('options', {}) kwargs['argument_spec'] = kwargs.pop('options', {})

@ -27,6 +27,9 @@ def main(args=None):
raise SystemExit('This version of ansible-test cannot be executed with Python version %s. Supported Python versions are: %s' % ( raise SystemExit('This version of ansible-test cannot be executed with Python version %s. Supported Python versions are: %s' % (
version_to_str(sys.version_info[:3]), ', '.join(CONTROLLER_PYTHON_VERSIONS))) version_to_str(sys.version_info[:3]), ', '.join(CONTROLLER_PYTHON_VERSIONS)))
if any(not os.get_blocking(handle.fileno()) for handle in (sys.stdin, sys.stdout, sys.stderr)):
raise SystemExit('Standard input, output and error file handles must be blocking to run ansible-test.')
# noinspection PyProtectedMember # noinspection PyProtectedMember
from ansible_test._internal import main as cli_main from ansible_test._internal import main as cli_main
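The blocking check added above can be reproduced in isolation; on POSIX systems, `os.get_blocking()` reports whether a file descriptor is in blocking mode, which is what ansible-test now requires of its standard handles. A minimal sketch:

```python
import os
import sys

# Report the blocking state of the three standard handles, mirroring the new startup check.
for name, handle in (('stdin', sys.stdin), ('stdout', sys.stdout), ('stderr', sys.stderr)):
    print(f'{name}: blocking={os.get_blocking(handle.fileno())}')
```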

@ -281,6 +281,39 @@ bootstrap_remote_rhel_pinned_pip_packages()
pip_install "${pip_packages}" pip_install "${pip_packages}"
} }
bootstrap_remote_ubuntu()
{
py_pkg_prefix="python3"
packages="
gcc
${py_pkg_prefix}-dev
${py_pkg_prefix}-pip
${py_pkg_prefix}-venv
"
if [ "${controller}" ]; then
# The resolvelib package is not listed here because the available version (0.8.1) is incompatible with ansible.
# Instead, ansible-test will install it using pip.
packages="
${packages}
${py_pkg_prefix}-cryptography
${py_pkg_prefix}-jinja2
${py_pkg_prefix}-packaging
${py_pkg_prefix}-yaml
"
fi
while true; do
# shellcheck disable=SC2086
apt-get update -qq -y && \
DEBIAN_FRONTEND=noninteractive apt-get install -qq -y --no-install-recommends ${packages} \
&& break
echo "Failed to install packages. Sleeping before trying again..."
sleep 10
done
}
bootstrap_docker() bootstrap_docker()
{ {
# Required for newer mysql-server packages to install/upgrade on Ubuntu 16.04. # Required for newer mysql-server packages to install/upgrade on Ubuntu 16.04.
@ -299,6 +332,7 @@ bootstrap_remote()
"freebsd") bootstrap_remote_freebsd ;; "freebsd") bootstrap_remote_freebsd ;;
"macos") bootstrap_remote_macos ;; "macos") bootstrap_remote_macos ;;
"rhel") bootstrap_remote_rhel ;; "rhel") bootstrap_remote_rhel ;;
"ubuntu") bootstrap_remote_ubuntu ;;
esac esac
done done
} }

@ -29,13 +29,12 @@ def main():
try: try:
cmd = ['make', 'core_singlehtmldocs'] cmd = ['make', 'core_singlehtmldocs']
-sphinx = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=docs_dir)
-stdout, stderr = sphinx.communicate()
+sphinx = subprocess.run(cmd, stdin=subprocess.DEVNULL, capture_output=True, cwd=docs_dir, check=False, text=True)
finally: finally:
shutil.move(tmp, requirements_txt) shutil.move(tmp, requirements_txt)
-stdout = stdout.decode('utf-8')
-stderr = stderr.decode('utf-8')
+stdout = sphinx.stdout
+stderr = sphinx.stderr
if sphinx.returncode != 0: if sphinx.returncode != 0:
sys.stderr.write("Command '%s' failed with status code: %d\n" % (' '.join(cmd), sphinx.returncode)) sys.stderr.write("Command '%s' failed with status code: %d\n" % (' '.join(cmd), sphinx.returncode))

@ -172,14 +172,15 @@ def clean_repository(file_list):
def create_sdist(tmp_dir): def create_sdist(tmp_dir):
"""Create an sdist in the repository""" """Create an sdist in the repository"""
-create = subprocess.Popen(
+create = subprocess.run(
    ['make', 'snapshot', 'SDIST_DIR=%s' % tmp_dir],
-    stdout=subprocess.PIPE,
-    stderr=subprocess.PIPE,
-    universal_newlines=True,
+    stdin=subprocess.DEVNULL,
+    capture_output=True,
+    text=True,
+    check=False,
)
-stderr = create.communicate()[1]
+stderr = create.stderr
if create.returncode != 0: if create.returncode != 0:
raise Exception('make snapshot failed:\n%s' % stderr) raise Exception('make snapshot failed:\n%s' % stderr)
@ -220,15 +221,16 @@ def extract_sdist(sdist_path, tmp_dir):
def install_sdist(tmp_dir, sdist_dir): def install_sdist(tmp_dir, sdist_dir):
"""Install the extracted sdist into the temporary directory""" """Install the extracted sdist into the temporary directory"""
-install = subprocess.Popen(
+install = subprocess.run(
    ['python', 'setup.py', 'install', '--root=%s' % tmp_dir],
-    stdout=subprocess.PIPE,
-    stderr=subprocess.PIPE,
-    universal_newlines=True,
+    stdin=subprocess.DEVNULL,
+    capture_output=True,
+    text=True,
    cwd=os.path.join(tmp_dir, sdist_dir),
+    check=False,
)
-stdout, stderr = install.communicate()
+stdout, stderr = install.stdout, install.stderr
if install.returncode != 0: if install.returncode != 0:
raise Exception('sdist install failed:\n%s' % stderr) raise Exception('sdist install failed:\n%s' % stderr)

@ -0,0 +1,7 @@
import sys
def test_no_tty():
assert not sys.stdin.isatty()
assert not sys.stdout.isatty()
assert not sys.stderr.isatty()