Merge branch 'python24' into dmw

* python24:
  WIP first run of py24 CI
  issue #477: initial Python 2.4.6 build for CI.
  issue #477: enable git-lfs for tests/data/*.tar.bz2.
  issue #477: import build script for Python 2.4.6.
  issue #477: add mitogen_py24 CI test type.
  issue #477: disable Django parts of module_finder_test on 2.4.
  issue #477: clean up globals after unix_test.
  issue #477: remove unused pytest bits from importer_test.
  issue #477: remove fork use from unix_test.
  parent: don't kill child when profiling=True
  issue #485: import new throughput bench
  issue #477: more fork removal
  issue #477: Py2.4 startswith() did not support tuples.
  issue #477: util/fakessh/two_three_compat fixes.
  issue #477: call_function_test fixes for 2.4.
  issue #477: promote setup_gil() to mitogen.utils
  issue #477: fix lxc_test any polyfill import.
  issue #477: stop using fork in responder_test.
  issue #477: stop using fork in service_test.
  issue #477: Python<2.5 ioctl() request parameter was signed.
  issue #477: stop using fork() in parent_test, compatible enumerate().
  issue #477: Popen.terminate() polyfill for Py2.4.
  issue #477: stop using .fork() in router_test, one small 2.4 fix.
  issue #477: document master.Router.max_message_size.
  issue #477: old Py zlib did not include extended exception text.
  issue #477: stop using router.fork() in receiver_test
  issue #477: any() polyfill for lxc_test.
  issue #477: replace type(e) -> __class__ for an exception
  issue #477: old Mock does not throw side_effect exceptions from a list
  issue #477: 2.4 stat() returned int timestamps not float.
  issue #477: set().union(a, b, ..) unsupported on Py2.4.
  issue #477: Logger.log(extra=) unsupported on Py2.4.
  issue #477: fix another Threading.getName() call.
  issue #477: %f date format requires Py2.6 or newer.
  issue #477: make mitogen.fork unsupported on Py<2.6.
  issue #477: Py2.4 dep scanner bytecode difference
  Drop 'alpha' trove classifier
  issue #477: fix another str/bytes mixup.
  issue #477: blacklist 'thread' module to avoid roundtrip on 2.x->3.x
  issue #477: fix 3.x failure in new target.set_file_mode() function.
  issue #477: fix 3.x failure in new target.set_file_mode() function.
  issue #477: fix 2 runner tests on Ansible 2.7.
  issue #477: fix 3.x test regressions.
  issue #477: fix new KwargsTest on Python 3.x.
  issue #477: ModuleFinder now returns Unicode module names.
  issue #477: Python3 does not have Pickler.dispatch.
  issue #477: ModuleFinder test fixes.
  issue #477: Ansible 2.3 compatible regression/all.yml.
  issue #477: Ansible 2.3 requires placeholder module for assert_equals
  issue #477: build a CentOS 5/Py2.4 container + playbook compat fixes.
  issue #477: use PY24 constant rather than explicit test.
  issue #477: backport mitogen.master to Python 2.4.
  issue #477: parent: make iter_read() log disconnect reason.
  issue #477: backport ansible_mitogen.runner to 2.4.
  issue #477: backport various test modules to Python 2.4.
  issue #477: backport ansible_mitogen/target.py to Python2.4
  issue #477: add all() polyfill to custom_python_detect_environment
  issue #477: polyfill partition() use in mitogen.parent.
  issue #477: polyfill partition() use in mitogen.service.
  issue #477: polyfill partition() use in mitogen.ssh.
  issue #477: vendorize the last 2.4-compatible simplejson
  issue #477: _update_linecache() must append newlines.
  issue #415, #477: Poller must handle POLLHUP too.
  issue #477: Python 2.5 needs next() polyfill too.
  issue #477: explicitly populate Py2.4 linecache from Importer.
  issue #477: rename and add tests for polyfill functions.
  issue #477: various core.py docstring cleanups.
  issue #477: Ansible 2.3 module output format difference.
  issue #477: Ansible 2.3 cannot use when: on an include.
  issue #477: tests: use Ansible 2.3-compatible include rather than import
  issue #477: serve up junk ansible/__init__.py just like Ansible.
  issue #477: testlib: Py2.4 did not have BaseException.
  issue #477: master: ability to override ModuleResponder output.
  issue #477: yet another bug in core._partition().
  issue #477: 2.4.x compat fixes for mitogen.service.
  issue #477: Py2.4 lacks all().
  issue #477: Ansible 2.3 had stricter arg spec format.
  issue #477: make CallError serializable on 2.4.
  issue #477: log full module name when SyntaxError occurs.
  issue #477: more Py2.4 (str|unicode).partition().
  issue #477: Py2.4 cannot tolerate unicode kwargs.
  issue #477: Py2.4 lacks BaseException.
  issue #477: Py2.4: enumerate() may return stopped threads.
  issue #477: Py2.4: more unicode.rpartition() usage.
  issue #477: Python 2.4 type(exc) returns old-style instance.
  issue #477: Python 2.4 lacked str.partition.
  issue #477: Python 2.4 lacked Thread.name.
  issue #477: Python 2.4 lacked context managers.
  issue #477: Python <2.5 did not have combined try/finally/except.
  issue #477: older Ansibles had no vars plugin base class.
  issue #477: Python <2.5 lacked any().
  issue #477: Python <2.6 lacked rpartition().
  issue #477: make CallError inherit from object for 2.4/2.5.
  issue #477: 2.4/2.5 had no better poller than poll().
issue510
David Wilson 6 years ago
commit 104e7a963f

@@ -0,0 +1,14 @@
#!/usr/bin/env python
import ci_lib

batches = [
    [
        'docker pull %s' % (ci_lib.image_for_distro(ci_lib.DISTRO),),
    ],
    [
        'sudo tar -C / -jxvf tests/data/ubuntu-python-2.4.6.tar.bz2',
    ],
]

ci_lib.run_batches(batches)
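`ci_lib.run_batches()` is the repository's own CI helper; as a hypothetical stand-in (the real helper's behaviour, e.g. any parallelism, may differ), a minimal sequential version could be:

```python
import subprocess

def run_batches(batches):
    # Hypothetical sketch of ci_lib.run_batches(): execute each batch
    # of shell commands in order, aborting on the first failure.
    for batch in batches:
        for cmd in batch:
            subprocess.check_call(cmd, shell=True)
```

With this contract, the script above first pulls the distro image, then unpacks the prebuilt Python 2.4.6 tarball into `/`.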

@@ -0,0 +1,17 @@
#!/usr/bin/env python
# Mitogen tests for Python 2.4.

import os

import ci_lib

os.environ.update({
    'NOCOVERAGE': '1',
    'UNIT2': '/usr/local/python2.4.6/bin/unit2',
    'MITOGEN_TEST_DISTRO': ci_lib.DISTRO,
    'MITOGEN_LOG_LEVEL': 'debug',
    'SKIP_ANSIBLE': '1',
})

ci_lib.run('./run_tests -v')

@@ -24,6 +24,9 @@ script:
matrix:
  include:
    # Mitogen tests.
    # 2.4 -> 2.4
    - language: c
      env: MODE=mitogen_py24 DISTRO=centos5
    # 2.7 -> 2.7
    - python: "2.7"
      env: MODE=mitogen DISTRO=debian

@@ -0,0 +1,318 @@
r"""JSON (JavaScript Object Notation) <http://json.org> is a subset of
JavaScript syntax (ECMA-262 3rd edition) used as a lightweight data
interchange format.
:mod:`simplejson` exposes an API familiar to users of the standard library
:mod:`marshal` and :mod:`pickle` modules. It is the externally maintained
version of the :mod:`json` library contained in Python 2.6, but maintains
compatibility with Python 2.4 and Python 2.5 and (currently) has
significant performance advantages, even without using the optional C
extension for speedups.
Encoding basic Python object hierarchies::
>>> import simplejson as json
>>> json.dumps(['foo', {'bar': ('baz', None, 1.0, 2)}])
'["foo", {"bar": ["baz", null, 1.0, 2]}]'
>>> print json.dumps("\"foo\bar")
"\"foo\bar"
>>> print json.dumps(u'\u1234')
"\u1234"
>>> print json.dumps('\\')
"\\"
>>> print json.dumps({"c": 0, "b": 0, "a": 0}, sort_keys=True)
{"a": 0, "b": 0, "c": 0}
>>> from StringIO import StringIO
>>> io = StringIO()
>>> json.dump(['streaming API'], io)
>>> io.getvalue()
'["streaming API"]'
Compact encoding::
>>> import simplejson as json
>>> json.dumps([1,2,3,{'4': 5, '6': 7}], separators=(',',':'))
'[1,2,3,{"4":5,"6":7}]'
Pretty printing::
>>> import simplejson as json
>>> s = json.dumps({'4': 5, '6': 7}, sort_keys=True, indent=4)
>>> print '\n'.join([l.rstrip() for l in s.splitlines()])
{
"4": 5,
"6": 7
}
Decoding JSON::
>>> import simplejson as json
>>> obj = [u'foo', {u'bar': [u'baz', None, 1.0, 2]}]
>>> json.loads('["foo", {"bar":["baz", null, 1.0, 2]}]') == obj
True
>>> json.loads('"\\"foo\\bar"') == u'"foo\x08ar'
True
>>> from StringIO import StringIO
>>> io = StringIO('["streaming API"]')
>>> json.load(io)[0] == 'streaming API'
True
Specializing JSON object decoding::
>>> import simplejson as json
>>> def as_complex(dct):
... if '__complex__' in dct:
... return complex(dct['real'], dct['imag'])
... return dct
...
>>> json.loads('{"__complex__": true, "real": 1, "imag": 2}',
... object_hook=as_complex)
(1+2j)
>>> import decimal
>>> json.loads('1.1', parse_float=decimal.Decimal) == decimal.Decimal('1.1')
True
Specializing JSON object encoding::
>>> import simplejson as json
>>> def encode_complex(obj):
... if isinstance(obj, complex):
... return [obj.real, obj.imag]
... raise TypeError(repr(obj) + " is not JSON serializable")
...
>>> json.dumps(2 + 1j, default=encode_complex)
'[2.0, 1.0]'
>>> json.JSONEncoder(default=encode_complex).encode(2 + 1j)
'[2.0, 1.0]'
>>> ''.join(json.JSONEncoder(default=encode_complex).iterencode(2 + 1j))
'[2.0, 1.0]'
Using simplejson.tool from the shell to validate and pretty-print::
$ echo '{"json":"obj"}' | python -m simplejson.tool
{
"json": "obj"
}
$ echo '{ 1.2:3.4}' | python -m simplejson.tool
Expecting property name: line 1 column 2 (char 2)
"""
__version__ = '2.0.9'
__all__ = [
    'dump', 'dumps', 'load', 'loads',
    'JSONDecoder', 'JSONEncoder',
]
__author__ = 'Bob Ippolito <bob@redivi.com>'
from decoder import JSONDecoder
from encoder import JSONEncoder
_default_encoder = JSONEncoder(
    skipkeys=False,
    ensure_ascii=True,
    check_circular=True,
    allow_nan=True,
    indent=None,
    separators=None,
    encoding='utf-8',
    default=None,
)
def dump(obj, fp, skipkeys=False, ensure_ascii=True, check_circular=True,
allow_nan=True, cls=None, indent=None, separators=None,
encoding='utf-8', default=None, **kw):
"""Serialize ``obj`` as a JSON formatted stream to ``fp`` (a
``.write()``-supporting file-like object).
If ``skipkeys`` is true then ``dict`` keys that are not basic types
(``str``, ``unicode``, ``int``, ``long``, ``float``, ``bool``, ``None``)
will be skipped instead of raising a ``TypeError``.
If ``ensure_ascii`` is false, then the some chunks written to ``fp``
may be ``unicode`` instances, subject to normal Python ``str`` to
``unicode`` coercion rules. Unless ``fp.write()`` explicitly
understands ``unicode`` (as in ``codecs.getwriter()``) this is likely
to cause an error.
If ``check_circular`` is false, then the circular reference check
for container types will be skipped and a circular reference will
result in an ``OverflowError`` (or worse).
If ``allow_nan`` is false, then it will be a ``ValueError`` to
serialize out of range ``float`` values (``nan``, ``inf``, ``-inf``)
in strict compliance of the JSON specification, instead of using the
JavaScript equivalents (``NaN``, ``Infinity``, ``-Infinity``).
If ``indent`` is a non-negative integer, then JSON array elements and object
members will be pretty-printed with that indent level. An indent level
of 0 will only insert newlines. ``None`` is the most compact representation.
If ``separators`` is an ``(item_separator, dict_separator)`` tuple
then it will be used instead of the default ``(', ', ': ')`` separators.
``(',', ':')`` is the most compact JSON representation.
``encoding`` is the character encoding for str instances, default is UTF-8.
``default(obj)`` is a function that should return a serializable version
of obj or raise TypeError. The default simply raises TypeError.
To use a custom ``JSONEncoder`` subclass (e.g. one that overrides the
``.default()`` method to serialize additional types), specify it with
the ``cls`` kwarg.
"""
# cached encoder
if (not skipkeys and ensure_ascii and
check_circular and allow_nan and
cls is None and indent is None and separators is None and
encoding == 'utf-8' and default is None and not kw):
iterable = _default_encoder.iterencode(obj)
else:
if cls is None:
cls = JSONEncoder
iterable = cls(skipkeys=skipkeys, ensure_ascii=ensure_ascii,
check_circular=check_circular, allow_nan=allow_nan, indent=indent,
separators=separators, encoding=encoding,
default=default, **kw).iterencode(obj)
# could accelerate with writelines in some versions of Python, at
# a debuggability cost
for chunk in iterable:
fp.write(chunk)
def dumps(obj, skipkeys=False, ensure_ascii=True, check_circular=True,
allow_nan=True, cls=None, indent=None, separators=None,
encoding='utf-8', default=None, **kw):
"""Serialize ``obj`` to a JSON formatted ``str``.
If ``skipkeys`` is true then ``dict`` keys that are not basic types
(``str``, ``unicode``, ``int``, ``long``, ``float``, ``bool``, ``None``)
will be skipped instead of raising a ``TypeError``.
If ``ensure_ascii`` is false, then the return value will be a
``unicode`` instance subject to normal Python ``str`` to ``unicode``
coercion rules instead of being escaped to an ASCII ``str``.
If ``check_circular`` is false, then the circular reference check
for container types will be skipped and a circular reference will
result in an ``OverflowError`` (or worse).
If ``allow_nan`` is false, then it will be a ``ValueError`` to
serialize out of range ``float`` values (``nan``, ``inf``, ``-inf``) in
strict compliance of the JSON specification, instead of using the
JavaScript equivalents (``NaN``, ``Infinity``, ``-Infinity``).
If ``indent`` is a non-negative integer, then JSON array elements and
object members will be pretty-printed with that indent level. An indent
level of 0 will only insert newlines. ``None`` is the most compact
representation.
If ``separators`` is an ``(item_separator, dict_separator)`` tuple
then it will be used instead of the default ``(', ', ': ')`` separators.
``(',', ':')`` is the most compact JSON representation.
``encoding`` is the character encoding for str instances, default is UTF-8.
``default(obj)`` is a function that should return a serializable version
of obj or raise TypeError. The default simply raises TypeError.
To use a custom ``JSONEncoder`` subclass (e.g. one that overrides the
``.default()`` method to serialize additional types), specify it with
the ``cls`` kwarg.
"""
# cached encoder
if (not skipkeys and ensure_ascii and
check_circular and allow_nan and
cls is None and indent is None and separators is None and
encoding == 'utf-8' and default is None and not kw):
return _default_encoder.encode(obj)
if cls is None:
cls = JSONEncoder
return cls(
skipkeys=skipkeys, ensure_ascii=ensure_ascii,
check_circular=check_circular, allow_nan=allow_nan, indent=indent,
separators=separators, encoding=encoding, default=default,
**kw).encode(obj)
_default_decoder = JSONDecoder(encoding=None, object_hook=None)
def load(fp, encoding=None, cls=None, object_hook=None, parse_float=None,
parse_int=None, parse_constant=None, **kw):
"""Deserialize ``fp`` (a ``.read()``-supporting file-like object containing
a JSON document) to a Python object.
If the contents of ``fp`` is encoded with an ASCII based encoding other
than utf-8 (e.g. latin-1), then an appropriate ``encoding`` name must
be specified. Encodings that are not ASCII based (such as UCS-2) are
not allowed, and should be wrapped with
``codecs.getreader(encoding)(fp)``, or simply decoded to a ``unicode``
object and passed to ``loads()``.
``object_hook`` is an optional function that will be called with the
result of any object literal decode (a ``dict``). The return value of
``object_hook`` will be used instead of the ``dict``. This feature
can be used to implement custom decoders (e.g. JSON-RPC class hinting).
To use a custom ``JSONDecoder`` subclass, specify it with the ``cls``
kwarg.
"""
return loads(fp.read(),
encoding=encoding, cls=cls, object_hook=object_hook,
parse_float=parse_float, parse_int=parse_int,
parse_constant=parse_constant, **kw)
def loads(s, encoding=None, cls=None, object_hook=None, parse_float=None,
parse_int=None, parse_constant=None, **kw):
"""Deserialize ``s`` (a ``str`` or ``unicode`` instance containing a JSON
document) to a Python object.
If ``s`` is a ``str`` instance and is encoded with an ASCII based encoding
other than utf-8 (e.g. latin-1) then an appropriate ``encoding`` name
must be specified. Encodings that are not ASCII based (such as UCS-2)
are not allowed and should be decoded to ``unicode`` first.
``object_hook`` is an optional function that will be called with the
result of any object literal decode (a ``dict``). The return value of
``object_hook`` will be used instead of the ``dict``. This feature
can be used to implement custom decoders (e.g. JSON-RPC class hinting).
``parse_float``, if specified, will be called with the string
of every JSON float to be decoded. By default this is equivalent to
float(num_str). This can be used to use another datatype or parser
for JSON floats (e.g. decimal.Decimal).
``parse_int``, if specified, will be called with the string
of every JSON int to be decoded. By default this is equivalent to
int(num_str). This can be used to use another datatype or parser
for JSON integers (e.g. float).
``parse_constant``, if specified, will be called with one of the
following strings: -Infinity, Infinity, NaN, null, true, false.
This can be used to raise an exception if invalid JSON numbers
are encountered.
To use a custom ``JSONDecoder`` subclass, specify it with the ``cls``
kwarg.
"""
if (cls is None and encoding is None and object_hook is None and
parse_int is None and parse_float is None and
parse_constant is None and not kw):
return _default_decoder.decode(s)
if cls is None:
cls = JSONDecoder
if object_hook is not None:
kw['object_hook'] = object_hook
if parse_float is not None:
kw['parse_float'] = parse_float
if parse_int is not None:
kw['parse_int'] = parse_int
if parse_constant is not None:
kw['parse_constant'] = parse_constant
return cls(encoding=encoding, **kw).decode(s)
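The ``parse_float``/``parse_int`` hooks hand the decoder's raw numeral text to a callable of your choice; the same contract survives unchanged in the modern stdlib ``json`` module, so it can be exercised there:

```python
import decimal
import json

# parse_float receives the raw numeral text '1.1', so it can be
# decoded losslessly into Decimal instead of a binary float.
value = json.loads('1.1', parse_float=decimal.Decimal)
assert value == decimal.Decimal('1.1')
assert isinstance(value, decimal.Decimal)
```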

@@ -0,0 +1,354 @@
"""Implementation of JSONDecoder
"""
import re
import sys
import struct
from simplejson.scanner import make_scanner
try:
    from simplejson._speedups import scanstring as c_scanstring
except ImportError:
    c_scanstring = None
__all__ = ['JSONDecoder']
FLAGS = re.VERBOSE | re.MULTILINE | re.DOTALL
def _floatconstants():
    _BYTES = '7FF80000000000007FF0000000000000'.decode('hex')
    if sys.byteorder != 'big':
        _BYTES = _BYTES[:8][::-1] + _BYTES[8:][::-1]
    nan, inf = struct.unpack('dd', _BYTES)
    return nan, inf, -inf
NaN, PosInf, NegInf = _floatconstants()
def linecol(doc, pos):
    lineno = doc.count('\n', 0, pos) + 1
    if lineno == 1:
        colno = pos
    else:
        colno = pos - doc.rindex('\n', 0, pos)
    return lineno, colno
def errmsg(msg, doc, pos, end=None):
    # Note that this function is called from _speedups
    lineno, colno = linecol(doc, pos)
    if end is None:
        #fmt = '{0}: line {1} column {2} (char {3})'
        #return fmt.format(msg, lineno, colno, pos)
        fmt = '%s: line %d column %d (char %d)'
        return fmt % (msg, lineno, colno, pos)
    endlineno, endcolno = linecol(doc, end)
    #fmt = '{0}: line {1} column {2} - line {3} column {4} (char {5} - {6})'
    #return fmt.format(msg, lineno, colno, endlineno, endcolno, pos, end)
    fmt = '%s: line %d column %d - line %d column %d (char %d - %d)'
    return fmt % (msg, lineno, colno, endlineno, endcolno, pos, end)
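``linecol()`` derives a 1-based line number and a column from a flat character offset: count newlines before the offset, then measure the distance past the last one. The same arithmetic, checked in isolation:

```python
def linecol(doc, pos):
    # Line is 1-based; column is the offset past the last newline
    # (or the raw offset on the first line, matching the code above).
    lineno = doc.count('\n', 0, pos) + 1
    if lineno == 1:
        colno = pos
    else:
        colno = pos - doc.rindex('\n', 0, pos)
    return lineno, colno

# Offset 10 lands on line 2, two characters past the newline.
assert linecol('{"a": 1,\n "b"}', 10) == (2, 2)
```

This is what produces the ``line 1 column 2 (char 2)`` shape seen in the ``simplejson.tool`` error message quoted in the module docstring.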
_CONSTANTS = {
    '-Infinity': NegInf,
    'Infinity': PosInf,
    'NaN': NaN,
}
STRINGCHUNK = re.compile(r'(.*?)(["\\\x00-\x1f])', FLAGS)
BACKSLASH = {
    '"': u'"', '\\': u'\\', '/': u'/',
    'b': u'\b', 'f': u'\f', 'n': u'\n', 'r': u'\r', 't': u'\t',
}
DEFAULT_ENCODING = "utf-8"
def py_scanstring(s, end, encoding=None, strict=True, _b=BACKSLASH, _m=STRINGCHUNK.match):
"""Scan the string s for a JSON string. End is the index of the
character in s after the quote that started the JSON string.
Unescapes all valid JSON string escape sequences and raises ValueError
on attempt to decode an invalid string. If strict is False then literal
control characters are allowed in the string.
Returns a tuple of the decoded string and the index of the character in s
after the end quote."""
if encoding is None:
encoding = DEFAULT_ENCODING
chunks = []
_append = chunks.append
begin = end - 1
while 1:
chunk = _m(s, end)
if chunk is None:
raise ValueError(
errmsg("Unterminated string starting at", s, begin))
end = chunk.end()
content, terminator = chunk.groups()
# Content contains zero or more unescaped string characters
if content:
if not isinstance(content, unicode):
content = unicode(content, encoding)
_append(content)
# Terminator is the end of string, a literal control character,
# or a backslash denoting that an escape sequence follows
if terminator == '"':
break
elif terminator != '\\':
if strict:
msg = "Invalid control character %r at" % (terminator,)
#msg = "Invalid control character {0!r} at".format(terminator)
raise ValueError(errmsg(msg, s, end))
else:
_append(terminator)
continue
try:
esc = s[end]
except IndexError:
raise ValueError(
errmsg("Unterminated string starting at", s, begin))
# If not a unicode escape sequence, must be in the lookup table
if esc != 'u':
try:
char = _b[esc]
except KeyError:
msg = "Invalid \\escape: " + repr(esc)
raise ValueError(errmsg(msg, s, end))
end += 1
else:
# Unicode escape sequence
esc = s[end + 1:end + 5]
next_end = end + 5
if len(esc) != 4:
msg = "Invalid \\uXXXX escape"
raise ValueError(errmsg(msg, s, end))
uni = int(esc, 16)
# Check for surrogate pair on UCS-4 systems
if 0xd800 <= uni <= 0xdbff and sys.maxunicode > 65535:
msg = "Invalid \\uXXXX\\uXXXX surrogate pair"
if not s[end + 5:end + 7] == '\\u':
raise ValueError(errmsg(msg, s, end))
esc2 = s[end + 7:end + 11]
if len(esc2) != 4:
raise ValueError(errmsg(msg, s, end))
uni2 = int(esc2, 16)
uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00))
next_end += 6
char = unichr(uni)
end = next_end
# Append the unescaped character
_append(char)
return u''.join(chunks), end
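The surrogate-pair branch above combines two ``\uXXXX`` escapes into a single supplementary-plane code point. Its arithmetic, extracted on its own:

```python
def combine_surrogates(hi, lo):
    # Map a UTF-16 surrogate pair (hi in D800-DBFF, lo in DC00-DFFF)
    # back to the supplementary-plane code point it encodes, exactly
    # as py_scanstring() does for "\uXXXX\uXXXX" input.
    return 0x10000 + (((hi - 0xd800) << 10) | (lo - 0xdc00))

# U+1D11E (musical G clef) is escaped in JSON as "\ud834\udd1e".
assert combine_surrogates(0xd834, 0xdd1e) == 0x1D11E
```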
# Use speedup if available
scanstring = c_scanstring or py_scanstring
WHITESPACE = re.compile(r'[ \t\n\r]*', FLAGS)
WHITESPACE_STR = ' \t\n\r'
def JSONObject((s, end), encoding, strict, scan_once, object_hook, _w=WHITESPACE.match, _ws=WHITESPACE_STR):
pairs = {}
# Use a slice to prevent IndexError from being raised, the following
# check will raise a more specific ValueError if the string is empty
nextchar = s[end:end + 1]
# Normally we expect nextchar == '"'
if nextchar != '"':
if nextchar in _ws:
end = _w(s, end).end()
nextchar = s[end:end + 1]
# Trivial empty object
if nextchar == '}':
return pairs, end + 1
elif nextchar != '"':
raise ValueError(errmsg("Expecting property name", s, end))
end += 1
while True:
key, end = scanstring(s, end, encoding, strict)
# To skip some function call overhead we optimize the fast paths where
# the JSON key separator is ": " or just ":".
if s[end:end + 1] != ':':
end = _w(s, end).end()
if s[end:end + 1] != ':':
raise ValueError(errmsg("Expecting : delimiter", s, end))
end += 1
try:
if s[end] in _ws:
end += 1
if s[end] in _ws:
end = _w(s, end + 1).end()
except IndexError:
pass
try:
value, end = scan_once(s, end)
except StopIteration:
raise ValueError(errmsg("Expecting object", s, end))
pairs[key] = value
try:
nextchar = s[end]
if nextchar in _ws:
end = _w(s, end + 1).end()
nextchar = s[end]
except IndexError:
nextchar = ''
end += 1
if nextchar == '}':
break
elif nextchar != ',':
raise ValueError(errmsg("Expecting , delimiter", s, end - 1))
try:
nextchar = s[end]
if nextchar in _ws:
end += 1
nextchar = s[end]
if nextchar in _ws:
end = _w(s, end + 1).end()
nextchar = s[end]
except IndexError:
nextchar = ''
end += 1
if nextchar != '"':
raise ValueError(errmsg("Expecting property name", s, end - 1))
if object_hook is not None:
pairs = object_hook(pairs)
return pairs, end
def JSONArray((s, end), scan_once, _w=WHITESPACE.match, _ws=WHITESPACE_STR):
values = []
nextchar = s[end:end + 1]
if nextchar in _ws:
end = _w(s, end + 1).end()
nextchar = s[end:end + 1]
# Look-ahead for trivial empty array
if nextchar == ']':
return values, end + 1
_append = values.append
while True:
try:
value, end = scan_once(s, end)
except StopIteration:
raise ValueError(errmsg("Expecting object", s, end))
_append(value)
nextchar = s[end:end + 1]
if nextchar in _ws:
end = _w(s, end + 1).end()
nextchar = s[end:end + 1]
end += 1
if nextchar == ']':
break
elif nextchar != ',':
raise ValueError(errmsg("Expecting , delimiter", s, end))
try:
if s[end] in _ws:
end += 1
if s[end] in _ws:
end = _w(s, end + 1).end()
except IndexError:
pass
return values, end
class JSONDecoder(object):
"""Simple JSON <http://json.org> decoder
Performs the following translations in decoding by default:
+---------------+-------------------+
| JSON | Python |
+===============+===================+
| object | dict |
+---------------+-------------------+
| array | list |
+---------------+-------------------+
| string | unicode |
+---------------+-------------------+
| number (int) | int, long |
+---------------+-------------------+
| number (real) | float |
+---------------+-------------------+
| true | True |
+---------------+-------------------+
| false | False |
+---------------+-------------------+
| null | None |
+---------------+-------------------+
It also understands ``NaN``, ``Infinity``, and ``-Infinity`` as
their corresponding ``float`` values, which is outside the JSON spec.
"""
def __init__(self, encoding=None, object_hook=None, parse_float=None,
parse_int=None, parse_constant=None, strict=True):
"""``encoding`` determines the encoding used to interpret any ``str``
objects decoded by this instance (utf-8 by default). It has no
effect when decoding ``unicode`` objects.
Note that currently only encodings that are a superset of ASCII work,
strings of other encodings should be passed in as ``unicode``.
``object_hook``, if specified, will be called with the result
of every JSON object decoded and its return value will be used in
place of the given ``dict``. This can be used to provide custom
deserializations (e.g. to support JSON-RPC class hinting).
``parse_float``, if specified, will be called with the string
of every JSON float to be decoded. By default this is equivalent to
float(num_str). This can be used to use another datatype or parser
for JSON floats (e.g. decimal.Decimal).
``parse_int``, if specified, will be called with the string
of every JSON int to be decoded. By default this is equivalent to
int(num_str). This can be used to use another datatype or parser
for JSON integers (e.g. float).
``parse_constant``, if specified, will be called with one of the
following strings: -Infinity, Infinity, NaN.
This can be used to raise an exception if invalid JSON numbers
are encountered.
"""
self.encoding = encoding
self.object_hook = object_hook
self.parse_float = parse_float or float
self.parse_int = parse_int or int
self.parse_constant = parse_constant or _CONSTANTS.__getitem__
self.strict = strict
self.parse_object = JSONObject
self.parse_array = JSONArray
self.parse_string = scanstring
self.scan_once = make_scanner(self)
def decode(self, s, _w=WHITESPACE.match):
"""Return the Python representation of ``s`` (a ``str`` or ``unicode``
instance containing a JSON document)
"""
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
end = _w(s, end).end()
if end != len(s):
raise ValueError(errmsg("Extra data", s, end, len(s)))
return obj
def raw_decode(self, s, idx=0):
"""Decode a JSON document from ``s`` (a ``str`` or ``unicode`` beginning
with a JSON document) and return a 2-tuple of the Python
representation and the index in ``s`` where the document ended.
This can be used to decode a JSON document from a string that may
have extraneous data at the end.
"""
try:
obj, end = self.scan_once(s, idx)
except StopIteration:
raise ValueError("No JSON object could be decoded")
return obj, end
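``raw_decode()``'s two-tuple contract — the decoded object plus the index where the document ended — also exists on the stdlib ``json.JSONDecoder``, which makes the trailing-data behaviour easy to demonstrate:

```python
import json

dec = json.JSONDecoder()
# raw_decode tolerates trailing garbage and reports where it stopped,
# unlike decode(), which raises on extra data.
obj, end = dec.raw_decode('{"a": 1}  trailing junk')
assert obj == {'a': 1}
assert end == 8  # index just past the closing brace
```

``decode()`` above is simply ``raw_decode()`` plus a check that only whitespace remains.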

@@ -0,0 +1,440 @@
"""Implementation of JSONEncoder
"""
import re
try:
    from simplejson._speedups import encode_basestring_ascii as c_encode_basestring_ascii
except ImportError:
    c_encode_basestring_ascii = None
try:
    from simplejson._speedups import make_encoder as c_make_encoder
except ImportError:
    c_make_encoder = None
ESCAPE = re.compile(r'[\x00-\x1f\\"\b\f\n\r\t]')
ESCAPE_ASCII = re.compile(r'([\\"]|[^\ -~])')
HAS_UTF8 = re.compile(r'[\x80-\xff]')
ESCAPE_DCT = {
    '\\': '\\\\',
    '"': '\\"',
    '\b': '\\b',
    '\f': '\\f',
    '\n': '\\n',
    '\r': '\\r',
    '\t': '\\t',
}
for i in range(0x20):
    #ESCAPE_DCT.setdefault(chr(i), '\\u{0:04x}'.format(i))
    ESCAPE_DCT.setdefault(chr(i), '\\u%04x' % (i,))
# Assume this produces an infinity on all machines (probably not guaranteed)
INFINITY = float('1e66666')
FLOAT_REPR = repr
def encode_basestring(s):
    """Return a JSON representation of a Python string
    """
    def replace(match):
        return ESCAPE_DCT[match.group(0)]
    return '"' + ESCAPE.sub(replace, s) + '"'
def py_encode_basestring_ascii(s):
    """Return an ASCII-only JSON representation of a Python string
    """
    if isinstance(s, str) and HAS_UTF8.search(s) is not None:
        s = s.decode('utf-8')
    def replace(match):
        s = match.group(0)
        try:
            return ESCAPE_DCT[s]
        except KeyError:
            n = ord(s)
            if n < 0x10000:
                #return '\\u{0:04x}'.format(n)
                return '\\u%04x' % (n,)
            else:
                # surrogate pair
                n -= 0x10000
                s1 = 0xd800 | ((n >> 10) & 0x3ff)
                s2 = 0xdc00 | (n & 0x3ff)
                #return '\\u{0:04x}\\u{1:04x}'.format(s1, s2)
                return '\\u%04x\\u%04x' % (s1, s2)
    return '"' + str(ESCAPE_ASCII.sub(replace, s)) + '"'
encode_basestring_ascii = c_encode_basestring_ascii or py_encode_basestring_ascii
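The non-BMP branch of ``replace()`` performs the inverse of the decoder's surrogate arithmetic: split one supplementary-plane code point into a UTF-16 surrogate pair and emit both as ``\uXXXX`` escapes. Extracted as a standalone helper:

```python
def escape_nonbmp(ch):
    # Escape one character as JSON \uXXXX escapes, splitting
    # supplementary-plane characters into a surrogate pair exactly
    # as py_encode_basestring_ascii() does above.
    n = ord(ch)
    if n < 0x10000:
        return '\\u%04x' % (n,)
    n -= 0x10000
    s1 = 0xd800 | ((n >> 10) & 0x3ff)
    s2 = 0xdc00 | (n & 0x3ff)
    return '\\u%04x\\u%04x' % (s1, s2)

assert escape_nonbmp('\u1234') == '\\u1234'
assert escape_nonbmp('\U0001D11E') == '\\ud834\\udd1e'
```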
class JSONEncoder(object):
"""Extensible JSON <http://json.org> encoder for Python data structures.
Supports the following objects and types by default:
+-------------------+---------------+
| Python | JSON |
+===================+===============+
| dict | object |
+-------------------+---------------+
| list, tuple | array |
+-------------------+---------------+
| str, unicode | string |
+-------------------+---------------+
| int, long, float | number |
+-------------------+---------------+
| True | true |
+-------------------+---------------+
| False | false |
+-------------------+---------------+
| None | null |
+-------------------+---------------+
To extend this to recognize other objects, subclass and implement a
``.default()`` method that returns a serializable object for ``o`` if
possible; otherwise it should call the superclass implementation
(to raise ``TypeError``).
"""
item_separator = ', '
key_separator = ': '
def __init__(self, skipkeys=False, ensure_ascii=True,
check_circular=True, allow_nan=True, sort_keys=False,
indent=None, separators=None, encoding='utf-8', default=None):
"""Constructor for JSONEncoder, with sensible defaults.
If skipkeys is false, then it is a TypeError to attempt
encoding of keys that are not str, int, long, float or None. If
skipkeys is True, such items are simply skipped.
If ensure_ascii is true, the output is guaranteed to be str
objects with all incoming unicode characters escaped. If
ensure_ascii is false, the output will be a unicode object.
If check_circular is true, then lists, dicts, and custom encoded
objects will be checked for circular references during encoding to
prevent an infinite recursion (which would cause an OverflowError).
Otherwise, no such check takes place.
If allow_nan is true, then NaN, Infinity, and -Infinity will be
encoded as such. This behavior is not JSON specification compliant,
but is consistent with most JavaScript based encoders and decoders.
Otherwise, it will be a ValueError to encode such floats.
If sort_keys is true, then the output of dictionaries will be
sorted by key; this is useful for regression tests to ensure
that JSON serializations can be compared on a day-to-day basis.
If indent is a non-negative integer, then JSON array
elements and object members will be pretty-printed with that
indent level. An indent level of 0 will only insert newlines.
None is the most compact representation.
If specified, separators should be a (item_separator, key_separator)
tuple. The default is (', ', ': '). To get the most compact JSON
representation you should specify (',', ':') to eliminate whitespace.
If specified, default is a function that gets called for objects
that can't otherwise be serialized. It should return a JSON encodable
version of the object or raise a ``TypeError``.
If encoding is not None, then all input strings will be
transformed into unicode using that encoding prior to JSON-encoding.
The default is UTF-8.
"""
self.skipkeys = skipkeys
self.ensure_ascii = ensure_ascii
self.check_circular = check_circular
self.allow_nan = allow_nan
self.sort_keys = sort_keys
self.indent = indent
if separators is not None:
self.item_separator, self.key_separator = separators
if default is not None:
self.default = default
self.encoding = encoding
def default(self, o):
"""Implement this method in a subclass such that it returns
a serializable object for ``o``, or calls the base implementation
(to raise a ``TypeError``).
For example, to support arbitrary iterators, you could
implement default like this::
def default(self, o):
try:
iterable = iter(o)
except TypeError:
pass
else:
return list(iterable)
return JSONEncoder.default(self, o)
"""
raise TypeError(repr(o) + " is not JSON serializable")
def encode(self, o):
"""Return a JSON string representation of a Python data structure.
>>> JSONEncoder().encode({"foo": ["bar", "baz"]})
'{"foo": ["bar", "baz"]}'
"""
# This is for extremely simple cases and benchmarks.
if isinstance(o, basestring):
if isinstance(o, str):
_encoding = self.encoding
if (_encoding is not None
and not (_encoding == 'utf-8')):
o = o.decode(_encoding)
if self.ensure_ascii:
return encode_basestring_ascii(o)
else:
return encode_basestring(o)
# This doesn't pass the iterator directly to ''.join() because the
# exceptions aren't as detailed. The list call should be roughly
# equivalent to the PySequence_Fast that ''.join() would do.
chunks = self.iterencode(o, _one_shot=True)
if not isinstance(chunks, (list, tuple)):
chunks = list(chunks)
return ''.join(chunks)
def iterencode(self, o, _one_shot=False):
"""Encode the given object and yield each string
representation as available.
For example::
for chunk in JSONEncoder().iterencode(bigobject):
mysocket.write(chunk)
"""
if self.check_circular:
markers = {}
else:
markers = None
if self.ensure_ascii:
_encoder = encode_basestring_ascii
else:
_encoder = encode_basestring
if self.encoding != 'utf-8':
def _encoder(o, _orig_encoder=_encoder, _encoding=self.encoding):
if isinstance(o, str):
o = o.decode(_encoding)
return _orig_encoder(o)
def floatstr(o, allow_nan=self.allow_nan, _repr=FLOAT_REPR, _inf=INFINITY, _neginf=-INFINITY):
# Check for specials. Note that this type of test is processor- and/or
# platform-specific, so do tests which don't depend on the internals.
if o != o:
text = 'NaN'
elif o == _inf:
text = 'Infinity'
elif o == _neginf:
text = '-Infinity'
else:
return _repr(o)
if not allow_nan:
raise ValueError(
"Out of range float values are not JSON compliant: " +
repr(o))
return text
if _one_shot and c_make_encoder is not None and not self.indent and not self.sort_keys:
_iterencode = c_make_encoder(
markers, self.default, _encoder, self.indent,
self.key_separator, self.item_separator, self.sort_keys,
self.skipkeys, self.allow_nan)
else:
_iterencode = _make_iterencode(
markers, self.default, _encoder, self.indent, floatstr,
self.key_separator, self.item_separator, self.sort_keys,
self.skipkeys, _one_shot)
return _iterencode(o, 0)
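The encode()/iterencode() split above can be exercised end to end with the standard library's `json` module, which descends from this simplejson code and exposes the same constructor parameters (used here as a stand-in, since this vendored copy targets Python 2.4):

```python
import json  # stdlib descendant of this encoder; same constructor API

# sort_keys plus compact separators, as described in the __init__ docstring.
enc = json.JSONEncoder(sort_keys=True, separators=(',', ':'))
compact = enc.encode({'b': 1, 'a': [True, None]})

# iterencode() yields chunks suitable for streaming to a socket or file.
chunks = list(json.JSONEncoder().iterencode({'k': 'v'}))
joined = ''.join(chunks)
```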
def _make_iterencode(markers, _default, _encoder, _indent, _floatstr, _key_separator, _item_separator, _sort_keys, _skipkeys, _one_shot,
## HACK: hand-optimized bytecode; turn globals into locals
False=False,
True=True,
ValueError=ValueError,
basestring=basestring,
dict=dict,
float=float,
id=id,
int=int,
isinstance=isinstance,
list=list,
long=long,
str=str,
tuple=tuple,
):
def _iterencode_list(lst, _current_indent_level):
if not lst:
yield '[]'
return
if markers is not None:
markerid = id(lst)
if markerid in markers:
raise ValueError("Circular reference detected")
markers[markerid] = lst
buf = '['
if _indent is not None:
_current_indent_level += 1
newline_indent = '\n' + (' ' * (_indent * _current_indent_level))
separator = _item_separator + newline_indent
buf += newline_indent
else:
newline_indent = None
separator = _item_separator
first = True
for value in lst:
if first:
first = False
else:
buf = separator
if isinstance(value, basestring):
yield buf + _encoder(value)
elif value is None:
yield buf + 'null'
elif value is True:
yield buf + 'true'
elif value is False:
yield buf + 'false'
elif isinstance(value, (int, long)):
yield buf + str(value)
elif isinstance(value, float):
yield buf + _floatstr(value)
else:
yield buf
if isinstance(value, (list, tuple)):
chunks = _iterencode_list(value, _current_indent_level)
elif isinstance(value, dict):
chunks = _iterencode_dict(value, _current_indent_level)
else:
chunks = _iterencode(value, _current_indent_level)
for chunk in chunks:
yield chunk
if newline_indent is not None:
_current_indent_level -= 1
yield '\n' + (' ' * (_indent * _current_indent_level))
yield ']'
if markers is not None:
del markers[markerid]
def _iterencode_dict(dct, _current_indent_level):
if not dct:
yield '{}'
return
if markers is not None:
markerid = id(dct)
if markerid in markers:
raise ValueError("Circular reference detected")
markers[markerid] = dct
yield '{'
if _indent is not None:
_current_indent_level += 1
newline_indent = '\n' + (' ' * (_indent * _current_indent_level))
item_separator = _item_separator + newline_indent
yield newline_indent
else:
newline_indent = None
item_separator = _item_separator
first = True
if _sort_keys:
items = dct.items()
items.sort(key=lambda kv: kv[0])
else:
items = dct.iteritems()
for key, value in items:
if isinstance(key, basestring):
pass
# JavaScript is weakly typed for these, so it makes sense to
# also allow them. Many encoders seem to do something like this.
elif isinstance(key, float):
key = _floatstr(key)
elif key is True:
key = 'true'
elif key is False:
key = 'false'
elif key is None:
key = 'null'
elif isinstance(key, (int, long)):
key = str(key)
elif _skipkeys:
continue
else:
raise TypeError("key " + repr(key) + " is not a string")
if first:
first = False
else:
yield item_separator
yield _encoder(key)
yield _key_separator
if isinstance(value, basestring):
yield _encoder(value)
elif value is None:
yield 'null'
elif value is True:
yield 'true'
elif value is False:
yield 'false'
elif isinstance(value, (int, long)):
yield str(value)
elif isinstance(value, float):
yield _floatstr(value)
else:
if isinstance(value, (list, tuple)):
chunks = _iterencode_list(value, _current_indent_level)
elif isinstance(value, dict):
chunks = _iterencode_dict(value, _current_indent_level)
else:
chunks = _iterencode(value, _current_indent_level)
for chunk in chunks:
yield chunk
if newline_indent is not None:
_current_indent_level -= 1
yield '\n' + (' ' * (_indent * _current_indent_level))
yield '}'
if markers is not None:
del markers[markerid]
def _iterencode(o, _current_indent_level):
if isinstance(o, basestring):
yield _encoder(o)
elif o is None:
yield 'null'
elif o is True:
yield 'true'
elif o is False:
yield 'false'
elif isinstance(o, (int, long)):
yield str(o)
elif isinstance(o, float):
yield _floatstr(o)
elif isinstance(o, (list, tuple)):
for chunk in _iterencode_list(o, _current_indent_level):
yield chunk
elif isinstance(o, dict):
for chunk in _iterencode_dict(o, _current_indent_level):
yield chunk
else:
if markers is not None:
markerid = id(o)
if markerid in markers:
raise ValueError("Circular reference detected")
markers[markerid] = o
o = _default(o)
for chunk in _iterencode(o, _current_indent_level):
yield chunk
if markers is not None:
del markers[markerid]
return _iterencode
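The `markers` dict threaded through the three generators above is what check_circular buys: each container is registered on entry and deleted on exit, so revisiting one mid-encode raises ValueError. Demonstrated with the stdlib descendant of this code:

```python
import json

lst = []
lst.append(lst)  # a list that contains itself

try:
    json.dumps(lst)  # check_circular=True is the default
    raised = False
except ValueError:
    raised = True
```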

@@ -0,0 +1,65 @@
"""JSON token scanner
"""
import re
try:
from simplejson._speedups import make_scanner as c_make_scanner
except ImportError:
c_make_scanner = None
__all__ = ['make_scanner']
NUMBER_RE = re.compile(
r'(-?(?:0|[1-9]\d*))(\.\d+)?([eE][-+]?\d+)?',
(re.VERBOSE | re.MULTILINE | re.DOTALL))
def py_make_scanner(context):
parse_object = context.parse_object
parse_array = context.parse_array
parse_string = context.parse_string
match_number = NUMBER_RE.match
encoding = context.encoding
strict = context.strict
parse_float = context.parse_float
parse_int = context.parse_int
parse_constant = context.parse_constant
object_hook = context.object_hook
def _scan_once(string, idx):
try:
nextchar = string[idx]
except IndexError:
raise StopIteration
if nextchar == '"':
return parse_string(string, idx + 1, encoding, strict)
elif nextchar == '{':
return parse_object((string, idx + 1), encoding, strict, _scan_once, object_hook)
elif nextchar == '[':
return parse_array((string, idx + 1), _scan_once)
elif nextchar == 'n' and string[idx:idx + 4] == 'null':
return None, idx + 4
elif nextchar == 't' and string[idx:idx + 4] == 'true':
return True, idx + 4
elif nextchar == 'f' and string[idx:idx + 5] == 'false':
return False, idx + 5
m = match_number(string, idx)
if m is not None:
integer, frac, exp = m.groups()
if frac or exp:
res = parse_float(integer + (frac or '') + (exp or ''))
else:
res = parse_int(integer)
return res, m.end()
elif nextchar == 'N' and string[idx:idx + 3] == 'NaN':
return parse_constant('NaN'), idx + 3
elif nextchar == 'I' and string[idx:idx + 8] == 'Infinity':
return parse_constant('Infinity'), idx + 8
elif nextchar == '-' and string[idx:idx + 9] == '-Infinity':
return parse_constant('-Infinity'), idx + 9
else:
raise StopIteration
return _scan_once
make_scanner = c_make_scanner or py_make_scanner
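NUMBER_RE's three capture groups (integer part, optional fraction, optional exponent) are exactly what _scan_once uses to choose between parse_int and parse_float; the regex can be checked in isolation:

```python
import re

# Same pattern as NUMBER_RE above.
NUMBER_RE = re.compile(
    r'(-?(?:0|[1-9]\d*))(\.\d+)?([eE][-+]?\d+)?',
    (re.VERBOSE | re.MULTILINE | re.DOTALL))

m = NUMBER_RE.match('-12.5e3, ...')
integer, frac, exp = m.groups()  # frac and exp are None for plain integers
```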

@@ -50,6 +50,7 @@ import mitogen.service
 import mitogen.unix
 import mitogen.utils
+import ansible
 import ansible.constants as C
 import ansible_mitogen.logging
 import ansible_mitogen.services
@@ -59,6 +60,11 @@ from mitogen.core import b
 LOG = logging.getLogger(__name__)
+ANSIBLE_PKG_OVERRIDE = (
+    u"__version__ = %r\n"
+    u"__author__ = %r\n"
+)
 def clean_shutdown(sock):
     """
@@ -87,27 +93,6 @@ def getenv_int(key, default=0):
     return default
-def setup_gil():
-    """
-    Set extremely long GIL release interval to let threads naturally progress
-    through CPU-heavy sequences without forcing the wake of another thread that
-    may contend trying to run the same CPU-heavy code. For the new-style work,
-    this drops runtime ~33% and involuntary context switches by >80%,
-    essentially making threads cooperatively scheduled.
-    """
-    try:
-        # Python 2.
-        sys.setcheckinterval(100000)
-    except AttributeError:
-        pass
-    try:
-        # Python 3.
-        sys.setswitchinterval(10)
-    except AttributeError:
-        pass
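The helper removed here (promoted to `mitogen.utils.setup_gil()` per the commit log) probes for the right knob by catching AttributeError, since `sys.setcheckinterval` exists on Python 2 and `sys.setswitchinterval` on Python 3. A self-contained copy; note that calling it lengthens the interpreter's thread switch interval globally:

```python
import sys

def setup_gil():
    # Python 2: run many more bytecodes between GIL check intervals.
    try:
        sys.setcheckinterval(100000)
    except AttributeError:
        pass
    # Python 3: lengthen the thread switch interval (in seconds).
    try:
        sys.setswitchinterval(10)
    except AttributeError:
        pass

setup_gil()
```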
 class MuxProcess(object):
     """
     Implement a subprocess forked from the Ansible top-level, as a safe place
@@ -171,7 +156,7 @@ class MuxProcess(object):
         if faulthandler is not None:
             faulthandler.enable()
-        setup_gil()
+        mitogen.utils.setup_gil()
         cls.unix_listener_path = mitogen.unix.make_socket_path()
         cls.worker_sock, cls.child_sock = socket.socketpair()
         atexit.register(lambda: clean_shutdown(cls.worker_sock))
@@ -222,13 +207,37 @@ class MuxProcess(object):
         if secs:
             mitogen.debug.dump_to_logger(secs=secs)
+    def _setup_responder(self, responder):
+        """
+        Configure :class:`mitogen.master.ModuleResponder` to only permit
+        certain packages, and to generate custom responses for certain modules.
+        """
+        responder.whitelist_prefix('ansible')
+        responder.whitelist_prefix('ansible_mitogen')
+        responder.whitelist_prefix('simplejson')
+        simplejson_path = os.path.join(os.path.dirname(__file__), 'compat')
+        sys.path.insert(0, simplejson_path)
+        # Ansible 2.3 is compatible with Python 2.4 targets, however
+        # ansible/__init__.py is not. Instead, executor/module_common.py writes
+        # out a 2.4-compatible namespace package for unknown reasons. So we
+        # copy it here.
+        responder.add_source_override(
+            fullname='ansible',
+            path=ansible.__file__,
+            source=(ANSIBLE_PKG_OVERRIDE % (
+                ansible.__version__,
+                ansible.__author__,
+            )).encode(),
+            is_pkg=True,
+        )
     def _setup_master(self):
         """
         Construct a Router, Broker, and mitogen.unix listener
         """
         self.router = mitogen.master.Router(max_message_size=4096 * 1048576)
-        self.router.responder.whitelist_prefix('ansible')
-        self.router.responder.whitelist_prefix('ansible_mitogen')
+        self._setup_responder(self.router.responder)
         mitogen.core.listen(self.router.broker, 'shutdown', self.on_broker_shutdown)
         mitogen.core.listen(self.router.broker, 'exit', self.on_broker_exit)
         self.listener = mitogen.unix.Listener(

@@ -26,7 +26,6 @@
 # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 # POSSIBILITY OF SUCH DAMAGE.
-
 """
 These classes implement execution for each style of Ansible module. They are
 instantiated in the target context by way of target.py::run_module().
@@ -35,14 +34,9 @@ Each class in here has a corresponding Planner class in planners.py that knows
 how to build arguments for it, preseed related data, etc.
 """
-from __future__ import absolute_import
-from __future__ import unicode_literals
 import atexit
-import ctypes
+import codecs
 import imp
-import json
-import logging
 import os
 import shlex
 import shutil
@@ -52,6 +46,23 @@ import types
 import mitogen.core
 import ansible_mitogen.target  # TODO: circular import
+from mitogen.core import b
+from mitogen.core import bytes_partition
+from mitogen.core import str_partition
+from mitogen.core import str_rpartition
+from mitogen.core import to_text
+try:
+    import ctypes
+except ImportError:
+    # Python 2.4
+    ctypes = None
+try:
+    import json
+except ImportError:
+    # Python 2.4
+    import simplejson as json
 try:
     # Cannot use cStringIO as it does not support Unicode.
@@ -64,6 +75,10 @@ try:
 except ImportError:
     from pipes import quote as shlex_quote
+# Absolute imports for <2.5.
+logging = __import__('logging')
 # Prevent accidental import of an Ansible module from hanging on stdin read.
 import ansible.module_utils.basic
 ansible.module_utils.basic._ANSIBLE_ARGS = '{}'
@@ -72,9 +87,10 @@ ansible.module_utils.basic._ANSIBLE_ARGS = '{}'
 # resolv.conf at startup and never implicitly reload it. Cope with that via an
 # explicit call to res_init() on each task invocation. BSD-alikes export it
 # directly, Linux #defines it as "__res_init".
-libc = ctypes.CDLL(None)
 libc__res_init = None
-for symbol in 'res_init', '__res_init':
+if ctypes:
+    libc = ctypes.CDLL(None)
+    for symbol in 'res_init', '__res_init':
     try:
         libc__res_init = getattr(libc, symbol)
     except AttributeError:
@@ -118,8 +134,11 @@ class EnvironmentFileWatcher(object):
     def _load(self):
         try:
-            with open(self.path, 'r') as fp:
+            fp = codecs.open(self.path, 'r', encoding='utf-8')
+            try:
                 return list(self._parse(fp))
+            finally:
+                fp.close()
         except IOError:
             return []
@@ -133,10 +152,10 @@ class EnvironmentFileWatcher(object):
             if (not bits) or bits[0].startswith('#'):
                 continue
-            if bits[0] == 'export':
+            if bits[0] == u'export':
                 bits.pop(0)
-            key, sep, value = (' '.join(bits)).partition('=')
+            key, sep, value = str_partition(u' '.join(bits), u'=')
             if key and sep:
                 yield key, value
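`str_partition` above is mitogen.core's backport of `str.partition`, which Python 2.4 lacked; the parsing logic itself is plain. A standalone sketch (`parse_line` is a hypothetical free-function rendering of `_parse`, using the built-in partition):

```python
def parse_line(line):
    # Mirror of the loop body above: skip comments, drop a leading
    # "export", then split KEY=VALUE at the first '='.
    bits = line.split()
    if (not bits) or bits[0].startswith('#'):
        return None
    if bits[0] == 'export':
        bits.pop(0)
    key, sep, value = ' '.join(bits).partition('=')
    if key and sep:
        return (key, value)
    return None

pair = parse_line('export PATH=/usr/bin')
```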
@@ -437,7 +456,7 @@ class ModuleUtilsImporter(object):
             mod.__path__ = []
             mod.__package__ = str(fullname)
         else:
-            mod.__package__ = str(fullname.rpartition('.')[0])
+            mod.__package__ = str(str_rpartition(to_text(fullname), '.')[0])
         exec(code, mod.__dict__)
         self._loaded.add(fullname)
         return mod
@@ -581,7 +600,7 @@ class ProgramRunner(Runner):
         Return the final argument vector used to execute the program.
         """
         return [
-            self.args['_ansible_shell_executable'],
+            self.args.get('_ansible_shell_executable', '/bin/sh'),
             '-c',
             self._get_shell_fragment(),
         ]
@@ -598,18 +617,19 @@ class ProgramRunner(Runner):
                 args=self._get_argv(),
                 emulate_tty=self.emulate_tty,
             )
-        except Exception as e:
+        except Exception:
             LOG.exception('While running %s', self._get_argv())
+            e = sys.exc_info()[1]
             return {
-                'rc': 1,
-                'stdout': '',
-                'stderr': '%s: %s' % (type(e), e),
+                u'rc': 1,
+                u'stdout': u'',
+                u'stderr': u'%s: %s' % (type(e), e),
             }
         return {
-            'rc': rc,
-            'stdout': mitogen.core.to_text(stdout),
-            'stderr': mitogen.core.to_text(stderr),
+            u'rc': rc,
+            u'stdout': mitogen.core.to_text(stdout),
+            u'stderr': mitogen.core.to_text(stderr),
         }
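The `e = sys.exc_info()[1]` idiom appears throughout this branch because `except Exception as e` is a syntax error before Python 2.6, while the old `except Exception, e` form is one on Python 3; only `sys.exc_info()` parses everywhere. In isolation:

```python
import sys

def capture():
    try:
        raise ValueError('boom')
    except ValueError:
        # Works on Python 2.4 through 3.x; no `as e` / `, e` syntax needed.
        e = sys.exc_info()[1]
        return '%s: %s' % (e.__class__.__name__, e)

msg = capture()
```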
@@ -659,7 +679,7 @@ class ScriptRunner(ProgramRunner):
         self.interpreter_fragment = interpreter_fragment
         self.is_python = is_python
-    b_ENCODING_STRING = b'# -*- coding: utf-8 -*-'
+    b_ENCODING_STRING = b('# -*- coding: utf-8 -*-')
     def _get_program(self):
         return self._rewrite_source(
@@ -668,7 +688,7 @@ class ScriptRunner(ProgramRunner):
     def _get_argv(self):
         return [
-            self.args['_ansible_shell_executable'],
+            self.args.get('_ansible_shell_executable', '/bin/sh'),
             '-c',
             self._get_shell_fragment(),
         ]
@@ -692,13 +712,13 @@ class ScriptRunner(ProgramRunner):
         # While Ansible rewrites the #! using ansible_*_interpreter, it is
         # never actually used to execute the script, instead it is a shell
         # fragment consumed by shell/__init__.py::build_module_command().
-        new = [b'#!' + utf8(self.interpreter_fragment)]
+        new = [b('#!') + utf8(self.interpreter_fragment)]
         if self.is_python:
             new.append(self.b_ENCODING_STRING)
-        _, _, rest = s.partition(b'\n')
+        _, _, rest = bytes_partition(s, b('\n'))
         new.append(rest)
-        return b'\n'.join(new)
+        return b('\n').join(new)
 class NewStyleRunner(ScriptRunner):
@@ -786,16 +806,18 @@ class NewStyleRunner(ScriptRunner):
             return self._code_by_path[self.path]
         except KeyError:
             return self._code_by_path.setdefault(self.path, compile(
-                source=self.source,
-                filename="master:" + self.path,
-                mode='exec',
-                dont_inherit=True,
+                # Py2.4 doesn't support kwargs.
+                self.source,            # source
+                "master:" + self.path,  # filename
+                'exec',                 # mode
+                0,                      # flags
+                True,                   # dont_inherit
             ))
     if mitogen.core.PY3:
         main_module_name = '__main__'
     else:
-        main_module_name = b'__main__'
+        main_module_name = b('__main__')
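compile() rejected keyword arguments on Python 2.4, hence the positional call in the hunk above; the same positional form runs unchanged on modern interpreters:

```python
# Arguments, in order: source, filename, mode, flags, dont_inherit.
code = compile(
    'x = 1 + 1',
    'master:example',
    'exec',
    0,
    True,
)
ns = {}
exec(code, ns)
```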
     def _handle_magic_exception(self, mod, exc):
         """
@@ -817,8 +839,8 @@ class NewStyleRunner(ScriptRunner):
             exec(code, vars(mod))
         else:
             exec('exec code in vars(mod)')
-        except Exception as e:
-            self._handle_magic_exception(mod, e)
+        except Exception:
+            self._handle_magic_exception(mod, sys.exc_info()[1])
             raise
     def _run(self):
@@ -834,24 +856,25 @@ class NewStyleRunner(ScriptRunner):
         )
         code = self._get_code()
-        exc = None
+        rc = 2
         try:
             try:
                 self._run_code(code, mod)
-            except SystemExit as e:
-                exc = e
+            except SystemExit:
+                exc = sys.exc_info()[1]
+                rc = exc.args[0]
         finally:
             self.atexit_wrapper.run_callbacks()
         return {
-            'rc': exc.args[0] if exc else 2,
-            'stdout': mitogen.core.to_text(sys.stdout.getvalue()),
-            'stderr': mitogen.core.to_text(sys.stderr.getvalue()),
+            u'rc': rc,
+            u'stdout': mitogen.core.to_text(sys.stdout.getvalue()),
+            u'stderr': mitogen.core.to_text(sys.stderr.getvalue()),
         }
 class JsonArgsRunner(ScriptRunner):
-    JSON_ARGS = b'<<INCLUDE_ANSIBLE_MODULE_JSON_ARGS>>'
+    JSON_ARGS = b('<<INCLUDE_ANSIBLE_MODULE_JSON_ARGS>>')
     def _get_args_contents(self):
         return json.dumps(self.args).encode()

@@ -31,14 +31,8 @@ Helper functions intended to be executed on the target. These are entrypoints
 for file transfer, module execution and sundry bits like changing file modes.
 """
-from __future__ import absolute_import
-from __future__ import unicode_literals
 import errno
-import functools
 import grp
-import json
-import logging
 import operator
 import os
 import pwd
@@ -51,10 +45,32 @@ import tempfile
 import traceback
 import types
+# Absolute imports for <2.5.
+logging = __import__('logging')
 import mitogen.core
 import mitogen.fork
 import mitogen.parent
 import mitogen.service
+from mitogen.core import b
+try:
+    import json
+except ImportError:
+    import simplejson as json
+try:
+    reduce
+except NameError:
+    # Python 3.x.
+    from functools import reduce
+try:
+    BaseException
+except NameError:
+    # Python 2.4
+    BaseException = Exception
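The probe-and-polyfill pattern used above for `reduce` and `BaseException` covers any name that moved or appeared across versions, e.g. the `any()` polyfill mentioned in the commit log:

```python
try:
    any
except NameError:
    # Python 2.4: provide a pure-Python equivalent of the 2.5+ builtin.
    def any(it):
        for elem in it:
            if elem:
                return True
        return False

result = any([0, '', 3])
```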
 # Ansible since PR #41749 inserts "import __main__" into
 # ansible.module_utils.basic. Mitogen's importer will refuse such an import, so
@@ -70,14 +86,14 @@ import ansible_mitogen.runner
 LOG = logging.getLogger(__name__)
 MAKE_TEMP_FAILED_MSG = (
-    "Unable to find a useable temporary directory. This likely means no\n"
-    "system-supplied TMP directory can be written to, or all directories\n"
-    "were mounted on 'noexec' filesystems.\n"
-    "\n"
-    "The following paths were tried:\n"
-    "    %(namelist)s\n"
-    "\n"
-    "Please check '-vvv' output for a log of individual path errors."
+    u"Unable to find a useable temporary directory. This likely means no\n"
+    u"system-supplied TMP directory can be written to, or all directories\n"
+    u"were mounted on 'noexec' filesystems.\n"
+    u"\n"
+    u"The following paths were tried:\n"
+    u"    %(namelist)s\n"
+    u"\n"
+    u"Please check '-vvv' output for a log of individual path errors."
 )
@@ -99,7 +115,7 @@ def subprocess__Popen__close_fds(self, but):
     a version that is O(fds) rather than O(_SC_OPEN_MAX).
     """
     try:
-        names = os.listdir('/proc/self/fd')
+        names = os.listdir(u'/proc/self/fd')
     except OSError:
         # May fail if acting on a container that does not have /proc mounted.
         self._original_close_fds(but)
@@ -118,9 +134,9 @@ def subprocess__Popen__close_fds(self, but):
 if (
-    sys.platform.startswith('linux') and
-    sys.version < '3.0' and
-    hasattr(subprocess.Popen, '_close_fds') and
+    sys.platform.startswith(u'linux') and
+    sys.version < u'3.0' and
+    hasattr(subprocess.Popen, u'_close_fds') and
     not mitogen.is_master
 ):
     subprocess.Popen._original_close_fds = subprocess.Popen._close_fds
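The patched `close_fds` above works by listing `/proc/self/fd`, visiting only descriptors that are actually open rather than looping up to `_SC_OPEN_MAX`; the enumeration itself is trivial where procfs is mounted:

```python
import os

try:
    # Each directory entry is the numeric name of one open descriptor.
    fds = sorted(int(name) for name in os.listdir('/proc/self/fd'))
except OSError:
    # No /proc: non-Linux, or a container without procfs mounted.
    fds = []
```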
@@ -142,7 +158,7 @@ def get_small_file(context, path):
         Bytestring file data.
     """
     pool = mitogen.service.get_or_create_pool(router=context.router)
-    service = pool.get_service('mitogen.service.PushFileService')
+    service = pool.get_service(u'mitogen.service.PushFileService')
     return service.get(path)
@@ -184,9 +200,10 @@ def transfer_file(context, in_path, out_path, sync=False, set_owner=False):
             if not ok:
                 raise IOError('transfer of %r was interrupted.' % (in_path,))
-            os.fchmod(fp.fileno(), metadata['mode'])
+            set_file_mode(tmp_path, metadata['mode'], fd=fp.fileno())
             if set_owner:
-                set_fd_owner(fp.fileno(), metadata['owner'], metadata['group'])
+                set_file_owner(tmp_path, metadata['owner'], metadata['group'],
+                               fd=fp.fileno())
     finally:
         fp.close()
@@ -209,7 +226,8 @@ def prune_tree(path):
     try:
         os.unlink(path)
         return
-    except OSError as e:
+    except OSError:
+        e = sys.exc_info()[1]
         if not (os.path.isdir(path) and
                 e.args[0] in (errno.EPERM, errno.EISDIR)):
             LOG.error('prune_tree(%r): %s', path, e)
@@ -219,7 +237,8 @@ def prune_tree(path):
         # Ensure write access for readonly directories. Ignore error in case
         # path is on a weird filesystem (e.g. vfat).
         os.chmod(path, int('0700', 8))
-    except OSError as e:
+    except OSError:
+        e = sys.exc_info()[1]
         LOG.warning('prune_tree(%r): %s', path, e)
     try:
@@ -227,7 +246,8 @@ def prune_tree(path):
         if name not in ('.', '..'):
             prune_tree(os.path.join(path, name))
         os.rmdir(path)
-    except OSError as e:
+    except OSError:
+        e = sys.exc_info()[1]
         LOG.error('prune_tree(%r): %s', path, e)
@@ -248,7 +268,8 @@ def is_good_temp_dir(path):
     if not os.path.exists(path):
         try:
             os.makedirs(path, mode=int('0700', 8))
-        except OSError as e:
+        except OSError:
+            e = sys.exc_info()[1]
             LOG.debug('temp dir %r unusable: did not exist and attempting '
                       'to create it failed: %s', path, e)
             return False
@@ -258,14 +279,16 @@ def is_good_temp_dir(path):
             prefix='ansible_mitogen_is_good_temp_dir',
             dir=path,
         )
-    except (OSError, IOError) as e:
+    except (OSError, IOError):
+        e = sys.exc_info()[1]
         LOG.debug('temp dir %r unusable: %s', path, e)
         return False
     try:
         try:
             os.chmod(tmp.name, int('0700', 8))
-        except OSError as e:
+        except OSError:
+            e = sys.exc_info()[1]
             LOG.debug('temp dir %r unusable: chmod failed: %s', path, e)
             return False
@@ -273,7 +296,8 @@ def is_good_temp_dir(path):
             # access(.., X_OK) is sufficient to detect noexec.
             if not os.access(tmp.name, os.X_OK):
                 raise OSError('filesystem appears to be mounted noexec')
-        except OSError as e:
+        except OSError:
+            e = sys.exc_info()[1]
             LOG.debug('temp dir %r unusable: %s', path, e)
             return False
     finally:
@@ -351,9 +375,9 @@ def init_child(econtext, log_level, candidate_temp_dirs):
     good_temp_dir = find_good_temp_dir(candidate_temp_dirs)
     return {
-        'fork_context': _fork_parent,
-        'home_dir': mitogen.core.to_text(os.path.expanduser('~')),
-        'good_temp_dir': good_temp_dir,
+        u'fork_context': _fork_parent,
+        u'home_dir': mitogen.core.to_text(os.path.expanduser('~')),
+        u'good_temp_dir': good_temp_dir,
     }
@@ -379,7 +403,7 @@ def run_module(kwargs):
     """
     runner_name = kwargs.pop('runner_name')
     klass = getattr(ansible_mitogen.runner, runner_name)
-    impl = klass(**kwargs)
+    impl = klass(**mitogen.core.Kwargs(kwargs))
     return impl.run()
@@ -412,8 +436,11 @@ class AsyncRunner(object):
         dct.setdefault('ansible_job_id', self.job_id)
         dct.setdefault('data', '')
-        with open(self.path + '.tmp', 'w') as fp:
+        fp = open(self.path + '.tmp', 'w')
+        try:
             fp.write(json.dumps(dct))
+        finally:
+            fp.close()
         os.rename(self.path + '.tmp', self.path)
     def _on_sigalrm(self, signum, frame):
@@ -565,8 +592,8 @@ def exec_args(args, in_data='', chdir=None, shell=None, emulate_tty=False):
     stdout, stderr = proc.communicate(in_data)
     if emulate_tty:
-        stdout = stdout.replace(b'\n', b'\r\n')
-    return proc.returncode, stdout, stderr or b''
+        stdout = stdout.replace(b('\n'), b('\r\n'))
+    return proc.returncode, stdout, stderr or b('')
 def exec_command(cmd, in_data='', chdir=None, shell=None, emulate_tty=False):
@@ -598,7 +625,7 @@ def read_path(path):
     return open(path, 'rb').read()
-def set_fd_owner(fd, owner, group=None):
+def set_file_owner(path, owner, group=None, fd=None):
     if owner:
         uid = pwd.getpwnam(owner).pw_uid
     else:
@@ -609,7 +636,11 @@ def set_fd_owner(fd, owner, group=None):
     else:
         gid = os.getegid()
+    if fd is not None and hasattr(os, 'fchown'):
         os.fchown(fd, (uid, gid))
+    else:
+        # Python<2.6
+        os.chown(path, (uid, gid))
 def write_path(path, s, owner=None, group=None, mode=None,
@@ -627,9 +658,9 @@ def write_path(path, s, owner=None, group=None, mode=None,
     try:
         try:
             if mode:
-                os.fchmod(fp.fileno(), mode)
+                set_file_mode(tmp_path, mode, fd=fp.fileno())
             if owner or group:
-                set_fd_owner(fp.fileno(), owner, group)
+                set_file_owner(tmp_path, owner, group, fd=fp.fileno())
             fp.write(s)
         finally:
             fp.close()
@@ -676,7 +707,7 @@ def apply_mode_spec(spec, mode):
         mask = CHMOD_MASKS[ch]
         bits = CHMOD_BITS[ch]
         cur_perm_bits = mode & mask
-        new_perm_bits = functools.reduce(operator.or_, (bits[p] for p in perms), 0)
+        new_perm_bits = reduce(operator.or_, (bits[p] for p in perms), 0)
         mode &= ~mask
         if op == '=':
             mode |= new_perm_bits
@@ -687,15 +718,21 @@ def apply_mode_spec(spec, mode):
     return mode
-def set_file_mode(path, spec):
+def set_file_mode(path, spec, fd=None):
     """
     Update the permissions of a file using the same syntax as chmod(1).
     """
-    mode = os.stat(path).st_mode
-    if spec.isdigit():
+    if isinstance(spec, int):
+        new_mode = spec
+    elif not mitogen.core.PY3 and isinstance(spec, long):
+        new_mode = spec
+    elif spec.isdigit():
         new_mode = int(spec, 8)
     else:
+        mode = os.stat(path).st_mode
         new_mode = apply_mode_spec(spec, mode)
+    if fd is not None and hasattr(os, 'fchmod'):
+        os.fchmod(fd, new_mode)
+    else:
         os.chmod(path, new_mode)

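The symbolic-mode arithmetic above can be illustrated in isolation. The tables below are a hypothetical subset of the real `CHMOD_MASKS`/`CHMOD_BITS`, and `apply_one` simplifies the loop body (the real version also preserves `cur_perm_bits` when combining):

```python
import operator
from functools import reduce

# Hypothetical subset of the CHMOD_MASKS/CHMOD_BITS tables indexed above.
CHMOD_MASKS = {'u': 0o4700, 'g': 0o2070, 'o': 0o1007}
CHMOD_BITS = {
    'u': {'r': 0o400, 'w': 0o200, 'x': 0o100},
    'g': {'r': 0o040, 'w': 0o020, 'x': 0o010},
    'o': {'r': 0o004, 'w': 0o002, 'x': 0o001},
}

def apply_one(ch, op, perms, mode):
    # Fold the requested permission bits together, then combine per operator.
    mask = CHMOD_MASKS[ch]
    new_perm_bits = reduce(operator.or_, (CHMOD_BITS[ch][p] for p in perms), 0)
    if op == '=':
        mode = (mode & ~mask) | new_perm_bits
    elif op == '+':
        mode |= new_perm_bits
    else:  # '-'
        mode &= ~new_perm_bits
    return mode

print(oct(apply_one('u', '+', 'x', 0o644)))   # 0o744
print(oct(apply_one('g', '=', 'rw', 0o777)))  # 0o767
```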
@@ -115,6 +115,9 @@ Connection Methods
    and router, and responds to function calls identically to children
    created using other methods.

+   The use of this method is strongly discouraged. It requires Python 2.6 or
+   newer, as older Pythons made no effort to reset threading state upon fork.
+
    For long-lived processes, :meth:`local` is always better as it
    guarantees a pristine interpreter state that inherited little from the
    parent. Forking should only be used in performance-sensitive scenarios
@@ -158,7 +161,9 @@ Connection Methods
    * Locks held in the parent causing random deadlocks in the child, such
      as when another thread emits a log entry via the :mod:`logging`
-     package concurrent to another thread calling :meth:`fork`.
+     package concurrent to another thread calling :meth:`fork`, or when a C
+     extension module calls the C library allocator, or when a thread is using
+     the C library DNS resolver, for example via :func:`socket.gethostbyname`.

    * Objects existing in Thread-Local Storage of every non-:meth:`fork`
      thread becoming permanently inaccessible, and never having their
@@ -612,6 +617,7 @@ A random assortment of utility functions useful on masters and children.

 .. currentmodule:: mitogen.utils

+.. autofunction:: setup_gil
 .. autofunction:: disable_site_packages
 .. autofunction:: log_to_file
 .. autofunction:: run_with_router(func, \*args, \**kwargs)

@@ -37,7 +37,9 @@ import encodings.latin_1
 import errno
 import fcntl
 import itertools
+import linecache
 import logging
+import pickle as py_pickle
 import os
 import signal
 import socket
@@ -72,6 +74,11 @@ try:
 except ImportError:
     from io import BytesIO

+try:
+    BaseException
+except NameError:
+    BaseException = Exception
+
 try:
     ModuleNotFoundError
 except NameError:
@@ -122,6 +129,7 @@ except NameError:
     BaseException = Exception

 IS_WSL = 'Microsoft' in os.uname()[2]
+PY24 = sys.version_info < (2, 5)
 PY3 = sys.version_info > (3,)
 if PY3:
     b = str.encode
@@ -139,9 +147,12 @@ else:

 AnyTextType = (BytesType, UnicodeType)

-if sys.version_info < (2, 5):
+try:
+    next
+except NameError:
     next = lambda it: it.next()

 #: Default size for calls to :meth:`Side.read` or :meth:`Side.write`, and the
 #: size of buffers configured by :func:`mitogen.parent.create_socketpair`. This
 #: value has many performance implications, 128KiB seems to be a sweet spot.
@@ -231,10 +242,15 @@ class Secret(UnicodeType):

 class Kwargs(dict):
-    """A serializable dict subclass that indicates the contained keys should be
-    be coerced to Unicode on Python 3 as required. Python 2 produces keyword
-    argument dicts whose keys are bytestrings, requiring a helper to ensure
-    compatibility with Python 3."""
+    """
+    A serializable dict subclass that indicates its keys should be coerced to
+    Unicode on Python 3 and bytes on Python<2.6.
+
+    Python 2 produces keyword argument dicts whose keys are bytes, requiring a
+    helper to ensure compatibility with Python 3 where Unicode is required,
+    whereas Python 3 produces keyword argument dicts whose keys are Unicode,
+    requiring a helper for Python 2.4/2.5, where bytes are required.
+    """
     if PY3:
         def __init__(self, dct):
             for k, v in dct.items():
@@ -242,6 +258,13 @@ class Kwargs(dict):
                     self[k.decode()] = v
                 else:
                     self[k] = v
+    elif sys.version_info < (2, 6):
+        def __init__(self, dct):
+            for k, v in dct.iteritems():
+                if type(k) is unicode:
+                    self[k.encode()] = v
+                else:
+                    self[k] = v

     def __repr__(self):
         return 'Kwargs(%s)' % (dict.__repr__(self),)
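A minimal sketch of the Python 3 branch above, showing why the coercion matters when a Python 2 peer sends byte keys; the `greet` function and the key names are illustrative only:

```python
class Kwargs(dict):
    # Python 3 branch only: coerce byte keys (as produced by a Python 2 peer)
    # back to str so the dict is usable as **kwargs.
    def __init__(self, dct):
        for k, v in dct.items():
            if isinstance(k, bytes):
                self[k.decode()] = v
            else:
                self[k] = v

def greet(name=None):
    return 'hi %s' % (name,)

# greet(**{b'name': 'dmw'}) would raise TypeError: keywords must be strings.
kw = Kwargs({b'name': 'dmw'})
print(greet(**kw))  # hi dmw
```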
@@ -251,16 +274,18 @@ class Kwargs(dict):

 class CallError(Error):
-    """Serializable :class:`Error` subclass raised when
-    :meth:`Context.call() <mitogen.parent.Context.call>` fails. A copy of
-    the traceback from the external context is appended to the exception
-    message."""
+    """
+    Serializable :class:`Error` subclass raised when :meth:`Context.call()
+    <mitogen.parent.Context.call>` fails. A copy of the traceback from the
+    external context is appended to the exception message.
+    """
     def __init__(self, fmt=None, *args):
         if not isinstance(fmt, BaseException):
             Error.__init__(self, fmt, *args)
         else:
             e = fmt
-            fmt = '%s.%s: %s' % (type(e).__module__, type(e).__name__, e)
+            cls = e.__class__
+            fmt = '%s.%s: %s' % (cls.__module__, cls.__name__, e)
             tb = sys.exc_info()[2]
             if tb:
                 fmt += '\n'
@@ -274,9 +299,7 @@ class CallError(Error):
 def _unpickle_call_error(s):
     if not (type(s) is UnicodeType and len(s) < 10000):
         raise TypeError('cannot unpickle CallError: bad input')
-    inst = CallError.__new__(CallError)
-    Exception.__init__(inst, s)
-    return inst
+    return CallError(s)
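The `CallError`/`_unpickle_call_error` pairing can be sketched standalone. This simplified version flattens the exception to a string at construction and omits the traceback-appending logic of the real class:

```python
import pickle

class Error(Exception):
    pass

class CallError(Error):
    # Simplified: collapse a wrapped exception into a plain message string.
    def __init__(self, fmt=None, *args):
        if not isinstance(fmt, BaseException):
            Error.__init__(self, fmt, *args)
        else:
            e = fmt
            cls = e.__class__
            Error.__init__(self, '%s.%s: %s' % (cls.__module__, cls.__name__, e))

    def __reduce__(self):
        # Route unpickling through the validating helper, never the class.
        return (_unpickle_call_error, (str(self),))

def _unpickle_call_error(s):
    if not (type(s) is str and len(s) < 10000):
        raise TypeError('cannot unpickle CallError: bad input')
    return CallError(s)

e = CallError(ValueError('boom'))
e2 = pickle.loads(pickle.dumps(e, protocol=2))
print(str(e2))  # builtins.ValueError: boom
```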
 class ChannelError(Error):
@@ -304,6 +327,39 @@ def to_text(o):
     return UnicodeType(o)


+# Python 2.4
+try:
+    any
+except NameError:
+    def any(it):
+        for elem in it:
+            if elem:
+                return True
+
+
+def _partition(s, sep, find):
+    """
+    (str|unicode).(partition|rpartition) for Python 2.4/2.5.
+    """
+    idx = find(sep)
+    if idx != -1:
+        left = s[0:idx]
+        return left, sep, s[len(left)+len(sep):]
+
+
+if hasattr(UnicodeType, 'rpartition'):
+    str_partition = UnicodeType.partition
+    str_rpartition = UnicodeType.rpartition
+    bytes_partition = BytesType.partition
+else:
+    def str_partition(s, sep):
+        return _partition(s, sep, s.find) or (s, u'', u'')
+
+    def str_rpartition(s, sep):
+        return _partition(s, sep, s.rfind) or (u'', u'', s)
+
+    def bytes_partition(s, sep):
+        return _partition(s, sep, s.find) or (s, '', '')
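The polyfill's behaviour can be checked against the built-ins it emulates; on modern Pythons the two must agree for every input:

```python
def _partition(s, sep, find):
    # Shared helper: locate sep using the given find function and split,
    # returning None when the separator is absent.
    idx = find(sep)
    if idx != -1:
        left = s[0:idx]
        return left, sep, s[len(left) + len(sep):]

def str_partition(s, sep):
    # Missing separator keeps the whole string on the left, like s.partition.
    return _partition(s, sep, s.find) or (s, u'', u'')

def str_rpartition(s, sep):
    # Missing separator keeps the whole string on the right, like s.rpartition.
    return _partition(s, sep, s.rfind) or (u'', u'', s)

print(str_partition(u'mitogen.core', u'.'))  # ('mitogen', '.', 'core')
print(str_rpartition(u'a.b.c', u'.'))        # ('a.b', '.', 'c')
print(str_partition(u'nosep', u'.'))         # ('nosep', '', '')
```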
 def has_parent_authority(msg, _stream=None):
     """Policy function for use with :class:`Receiver` and
     :meth:`Router.add_handler` that requires incoming messages to originate
@@ -399,20 +455,20 @@ def io_op(func, *args):
     signalled by :data:`errno.EPIPE`.

     :returns:
-        Tuple of `(return_value, disconnected)`, where `return_value` is the
-        return value of `func(*args)`, and `disconnected` is :data:`True` if
-        disconnection was detected, otherwise :data:`False`.
+        Tuple of `(return_value, disconnect_reason)`, where `return_value` is
+        the return value of `func(*args)`, and `disconnect_reason` is an
+        exception instance when disconnection was detected, otherwise
+        :data:`None`.
     """
     while True:
         try:
-            return func(*args), False
+            return func(*args), None
         except (select.error, OSError, IOError):
             e = sys.exc_info()[1]
             _vv and IOLOG.debug('io_op(%r) -> OSError: %s', func, e)
             if e.args[0] == errno.EINTR:
                 continue
             if e.args[0] in (errno.EIO, errno.ECONNRESET, errno.EPIPE):
-                return None, True
+                return None, e
             raise
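The revised contract can be exercised against a plain pipe; this is an illustrative sketch, not the module's test suite:

```python
import errno
import os
import select
import sys

def io_op(func, *args):
    # Retry on EINTR; map disconnection errno values to (None, exc) so the
    # caller can log the precise reason rather than a bare boolean.
    while True:
        try:
            return func(*args), None
        except (select.error, OSError, IOError):
            e = sys.exc_info()[1]
            if e.args[0] == errno.EINTR:
                continue
            if e.args[0] in (errno.EIO, errno.ECONNRESET, errno.EPIPE):
                return None, e
            raise

r, w = os.pipe()
n, disconnected = io_op(os.write, w, b'hi')
print(n, disconnected)   # 2 None
os.close(r)
# Writing with the read end closed raises EPIPE (CPython ignores SIGPIPE),
# which io_op() translates into a disconnect reason.
n, disconnected = io_op(os.write, w, b'hi')
print(n, disconnected)
os.close(w)
```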
@@ -501,13 +557,49 @@ def import_module(modname):
     return __import__(modname, None, None, [''])


+class Py24Pickler(py_pickle.Pickler):
+    """
+    Exceptions were classic classes until Python 2.5. Sadly for 2.4, cPickle
+    offers little control over how a classic instance is pickled. Therefore
+    2.4 uses a pure-Python pickler, so CallError can be made to look as it
+    does on newer Pythons.
+
+    This mess will go away once proper serialization exists.
+    """
+    @classmethod
+    def dumps(cls, obj, protocol):
+        bio = BytesIO()
+        self = cls(bio, protocol=protocol)
+        self.dump(obj)
+        return bio.getvalue()
+
+    def save_exc_inst(self, obj):
+        if isinstance(obj, CallError):
+            func, args = obj.__reduce__()
+            self.save(func)
+            self.save(args)
+            self.write(py_pickle.REDUCE)
+        else:
+            py_pickle.Pickler.save_inst(self, obj)
+
+    if PY24:
+        dispatch = py_pickle.Pickler.dispatch.copy()
+        dispatch[py_pickle.InstanceType] = save_exc_inst
+
+
 if PY3:
     # In 3.x Unpickler is a class exposing find_class as an overridable, but it
     # cannot be overridden without subclassing.
     class _Unpickler(pickle.Unpickler):
         def find_class(self, module, func):
             return self.find_global(module, func)
+    pickle__dumps = pickle.dumps
+elif PY24:
+    # On Python 2.4, we must use a pure-Python pickler.
+    pickle__dumps = Py24Pickler.dumps
+    _Unpickler = pickle.Unpickler
 else:
+    pickle__dumps = pickle.dumps
     # In 2.x Unpickler is a function exposing a writeable find_global
     # attribute.
     _Unpickler = pickle.Unpickler
@@ -580,7 +672,7 @@ class Message(object):
         """Return the class implementing `module_name.class_name` or raise
         `StreamError` if the module is not whitelisted."""
         if module == __name__:
-            if func == '_unpickle_call_error':
+            if func == '_unpickle_call_error' or func == 'CallError':
                 return _unpickle_call_error
             elif func == '_unpickle_sender':
                 return self._unpickle_sender
@@ -627,10 +719,10 @@ class Message(object):
         """
         self = cls(**kwargs)
         try:
-            self.data = pickle.dumps(obj, protocol=2)
+            self.data = pickle__dumps(obj, protocol=2)
         except pickle.PicklingError:
             e = sys.exc_info()[1]
-            self.data = pickle.dumps(CallError(e), protocol=2)
+            self.data = pickle__dumps(CallError(e), protocol=2)
         return self

     def reply(self, msg, router=None, **kwargs):
@@ -986,6 +1078,8 @@ class Importer(object):
         # a negative round-trip.
         'builtins',
         '__builtin__',
+        'thread',
+
         # org.python.core imported by copy, pickle, xml.sax; breaks Jython, but
         # very unlikely to trigger a bug report.
         'org',
@@ -1005,15 +1099,31 @@ class Importer(object):
         self._callbacks = {}
         self._cache = {}
         if core_src:
+            self._update_linecache('x/mitogen/core.py', core_src)
             self._cache['mitogen.core'] = (
                 'mitogen.core',
                 None,
-                'mitogen/core.py',
+                'x/mitogen/core.py',
                 zlib.compress(core_src, 9),
                 [],
             )
         self._install_handler(router)

+    def _update_linecache(self, path, data):
+        """
+        The Python 2.4 linecache module, used to fetch source code for
+        tracebacks and :func:`inspect.getsource`, does not support PEP-302,
+        meaning it needs extra help for Mitogen-loaded modules. Directly
+        populate its cache if a loaded module belongs to the Mitogen package.
+        """
+        if PY24 and 'mitogen' in path:
+            linecache.cache[path] = (
+                len(data),
+                0.0,
+                [line+'\n' for line in data.splitlines()],
+                path,
+            )
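The cache entry format `_update_linecache` writes is the standard `(size, mtime, lines, fullname)` tuple, which can be demonstrated directly; the virtual path here is hypothetical:

```python
import linecache

# Hand-populate linecache for a virtual path, mirroring _update_linecache.
path = 'master:x/mitogen/fakemod.py'
data = 'def f():\n    return 42\n'
linecache.cache[path] = (
    len(data),      # size
    0.0,            # mtime placeholder; no real file backs this entry
    [line + '\n' for line in data.splitlines()],
    path,           # fullname
)
print(linecache.getline(path, 2).strip())  # return 42
```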
     def _install_handler(self, router):
         router.add_handler(
             fn=self._on_load_module,
@@ -1031,7 +1141,7 @@ class Importer(object):
         if fullname == '__main__':
             raise ModuleNotFoundError()

-        parent, _, modname = fullname.rpartition('.')
+        parent, _, modname = str_rpartition(fullname, '.')
         if parent:
             path = sys.modules[parent].__path__
         else:
@@ -1048,7 +1158,8 @@ class Importer(object):
         _tls.running = True
         try:
             _v and LOG.debug('%r.find_module(%r)', self, fullname)
-            pkgname, dot, _ = fullname.rpartition('.')
+            fullname = to_text(fullname)
+            pkgname, dot, _ = str_rpartition(fullname, '.')
             pkg = sys.modules.get(pkgname)
             if pkgname and getattr(pkg, '__loader__', None) is not self:
                 LOG.debug('%r: %r is submodule of a package we did not load',
@@ -1127,6 +1238,11 @@ class Importer(object):
         self._lock.acquire()
         try:
             self._cache[fullname] = tup
+            if tup[2] is not None and PY24:
+                self._update_linecache(
+                    path='master:' + tup[2],
+                    data=zlib.decompress(tup[3])
+                )
             callbacks = self._callbacks.pop(fullname, [])
         finally:
             self._lock.release()
@@ -1177,14 +1293,19 @@ class Importer(object):
             mod.__package__ = fullname
             self._present[fullname] = pkg_present
         else:
-            mod.__package__ = fullname.rpartition('.')[0] or None
+            mod.__package__ = str_rpartition(fullname, '.')[0] or None

         if mod.__package__ and not PY3:
             # 2.x requires __package__ to be exactly a string.
             mod.__package__ = mod.__package__.encode()

         source = self.get_source(fullname)
-        code = compile(source, mod.__file__, 'exec', 0, 1)
+        try:
+            code = compile(source, mod.__file__, 'exec', 0, 1)
+        except SyntaxError:
+            LOG.exception('while importing %r', fullname)
+            raise
+
         if PY3:
             exec(code, vars(mod))
         else:
@@ -1222,6 +1343,11 @@ class LogHandler(logging.Handler):
         self._buffer = []

     def uncork(self):
+        """
+        #305: during startup :class:`LogHandler` may be installed before it is
+        possible to route messages, therefore messages are buffered until
+        :meth:`uncork` is called by :class:`ExternalContext`.
+        """
         self._send = self.context.send
         for msg in self._buffer:
             self._send(msg)
@@ -1777,8 +1903,10 @@ class Poller(object):
         """
         pass

+    _readmask = select.POLLIN | select.POLLHUP
+
     def _update(self, fd):
-        mask = (((fd in self._rfds) and select.POLLIN) |
+        mask = (((fd in self._rfds) and self._readmask) |
                 ((fd in self._wfds) and select.POLLOUT))
         if mask:
             self._pollobj.register(fd, mask)
@@ -1828,8 +1956,8 @@ class Poller(object):
         events, _ = io_op(self._pollobj.poll, timeout)
         for fd, event in events:
-            if event & select.POLLIN:
-                _vv and IOLOG.debug('%r: POLLIN for %r', self, fd)
+            if event & self._readmask:
+                _vv and IOLOG.debug('%r: POLLIN|POLLHUP for %r', self, fd)
                 data, gen = self._rfds.get(fd, (None, None))
                 if gen and gen < self._generation:
                     yield data
@@ -2094,7 +2222,7 @@ class Latch(object):
         return 'Latch(%#x, size=%d, t=%r)' % (
             id(self),
             len(self._queue),
-            threading.currentThread().name,
+            threading.currentThread().getName(),
         )
@@ -2230,7 +2358,7 @@ class IoLogger(BasicStream):
     def _log_lines(self):
         while self._buf.find('\n') != -1:
-            line, _, self._buf = self._buf.partition('\n')
+            line, _, self._buf = str_partition(self._buf, '\n')
             self._log.info('%s', line.rstrip('\n'))

     def on_shutdown(self, broker):
@@ -2312,7 +2440,7 @@ class Router(object):
         """
         LOG.error('%r._on_del_route() %r', self, msg)
         if not msg.is_dead:
-            target_id_s, _, name = msg.data.partition(b(':'))
+            target_id_s, _, name = bytes_partition(msg.data, b(':'))
             target_id = int(target_id_s, 10)
             if target_id not in self._context_by_id:
                 LOG.debug('DEL_ROUTE for unknown ID %r: %r', target_id, msg)

@@ -39,6 +39,18 @@ import mitogen.parent

 LOG = logging.getLogger('mitogen')

+# Python 2.4/2.5 cannot support fork+threads whatsoever, it doesn't even fix up
+# interpreter state. So 2.4/2.5 interpreters start .local() contexts for
+# isolation instead. Since we don't have any crazy memory sharing problems to
+# avoid, there is no virginal fork parent either. The child is started directly
+# from the login/become process. In future this will be default everywhere,
+# fork is brainwrong from the stone age.
+FORK_SUPPORTED = sys.version_info >= (2, 6)
+
+
+class Error(mitogen.core.StreamError):
+    pass
+

 def fixup_prngs():
     """
@@ -113,9 +125,19 @@ class Stream(mitogen.parent.Stream):
     #: User-supplied function for cleaning up child process state.
     on_fork = None

+    python_version_msg = (
+        "The mitogen.fork method is not supported on Python versions "
+        "prior to 2.6, since those versions made no attempt to repair "
+        "critical interpreter state following a fork. Please use the "
+        "local() method instead."
+    )
+
     def construct(self, old_router, max_message_size, on_fork=None,
                   debug=False, profiling=False, unidirectional=False,
                   on_start=None):
+        if not FORK_SUPPORTED:
+            raise Error(self.python_version_msg)
+
         # fork method only supports a tiny subset of options.
         super(Stream, self).construct(max_message_size=max_message_size,
                                       debug=debug, profiling=profiling,
@@ -183,6 +205,7 @@ class Stream(mitogen.parent.Stream):
             if self.on_start:
                 config['on_start'] = self.on_start

+            try:
                 try:
                     mitogen.core.ExternalContext(config).main()
                 except Exception:

@@ -59,13 +59,25 @@ import mitogen.minify
 import mitogen.parent

 from mitogen.core import b
-from mitogen.core import to_text
-from mitogen.core import LOG
 from mitogen.core import IOLOG
+from mitogen.core import LOG
+from mitogen.core import str_partition
+from mitogen.core import str_rpartition
+from mitogen.core import to_text

 imap = getattr(itertools, 'imap', map)
 izip = getattr(itertools, 'izip', zip)

+try:
+    any
+except NameError:
+    from mitogen.core import any
+
+try:
+    next
+except NameError:
+    from mitogen.core import next
+

 RLOG = logging.getLogger('mitogen.ctx')
@@ -146,7 +158,7 @@ IMPORT_NAME = dis.opname.index('IMPORT_NAME')

 def _getarg(nextb, c):
-    if c > dis.HAVE_ARGUMENT:
+    if c >= dis.HAVE_ARGUMENT:
         return nextb() | (nextb() << 8)
@@ -172,9 +184,10 @@ else:

 def scan_code_imports(co):
-    """Given a code object `co`, scan its bytecode yielding any
-    ``IMPORT_NAME`` and associated prior ``LOAD_CONST`` instructions
-    representing an `Import` statement or `ImportFrom` statement.
+    """
+    Given a code object `co`, scan its bytecode yielding any ``IMPORT_NAME``
+    and associated prior ``LOAD_CONST`` instructions representing an `Import`
+    statement or `ImportFrom` statement.

     :return:
         Generator producing `(level, modname, namelist)` tuples, where:
@@ -188,6 +201,7 @@ def scan_code_imports(co):
     """
     opit = iter_opcodes(co)
     opit, opit2, opit3 = itertools.tee(opit, 3)
+
     try:
         next(opit2)
         next(opit3)
@@ -195,6 +209,7 @@ def scan_code_imports(co):
     except StopIteration:
         return

+    if sys.version_info >= (2, 5):
         for oparg1, oparg2, (op3, arg3) in izip(opit, opit2, opit3):
             if op3 == IMPORT_NAME:
                 op2, arg2 = oparg2
@@ -203,6 +218,13 @@ def scan_code_imports(co):
                     yield (co.co_consts[arg1],
                            co.co_names[arg3],
                            co.co_consts[arg2] or ())
+    else:
+        # Python 2.4 did not yet have 'level', so stack format differs.
+        for oparg1, (op2, arg2) in izip(opit, opit2):
+            if op2 == IMPORT_NAME:
+                op1, arg1 = oparg1
+                if op1 == LOAD_CONST:
+                    yield (-1, co.co_names[arg2], co.co_consts[arg1] or ())
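The same idea can be sketched on modern Pythons with `dis.get_instructions`, which hides the bytecode-format differences the 2.4/2.5 branches above handle by hand; `scan_imports` is a simplified stand-in, not the module's scanner:

```python
import dis

def scan_imports(co):
    # Yield (level, modname, namelist) for each IMPORT_NAME, reading the two
    # preceding LOAD_CONSTs (relative import level, fromlist) like the real
    # scanner does with raw opcode windows.
    window = []
    for inst in dis.get_instructions(co):
        window.append(inst)
        if inst.opname == 'IMPORT_NAME' and len(window) >= 3:
            lvl, names = window[-3], window[-2]
            if lvl.opname == names.opname == 'LOAD_CONST':
                yield lvl.argval, inst.argval, names.argval or ()

co = compile('import os\nfrom json import dumps, loads\n', '<demo>', 'exec')
print(list(scan_imports(co)))
# [(0, 'os', ()), (0, 'json', ('dumps', 'loads'))]
```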
 class ThreadWatcher(object):
@@ -324,11 +346,21 @@ class LogForwarder(object):
             self._cache[msg.src_id] = logger = logging.getLogger(name)

         name, level_s, s = msg.data.decode('latin1').split('\x00', 2)
-        logger.log(int(level_s), '%s: %s', name, s, extra={
-            'mitogen_message': s,
-            'mitogen_context': self._router.context_by_id(msg.src_id),
-            'mitogen_name': name,
-        })
+
+        # See logging.Handler.makeRecord()
+        record = logging.LogRecord(
+            name=logger.name,
+            level=int(level_s),
+            pathname='(unknown file)',
+            lineno=0,
+            msg=('%s: %s' % (name, s)),
+            args=(),
+            exc_info=None,
+        )
+        record.mitogen_message = s
+        record.mitogen_context = self._router.context_by_id(msg.src_id)
+        record.mitogen_name = name
+        logger.handle(record)

     def __repr__(self):
         return 'LogForwarder(%r)' % (self._router,)
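Constructing the `LogRecord` by hand, as the forwarder now does, sidesteps `Logger.log(extra=)`, which Python 2.4 lacked. A standalone sketch with illustrative names:

```python
import logging

logger = logging.getLogger('mitogen.demo')

# Build the record directly instead of passing extra= to Logger.log().
record = logging.LogRecord(
    name=logger.name,
    level=logging.INFO,
    pathname='(unknown file)',
    lineno=0,
    msg='%s: %s' % ('ssh.host', 'connected'),
    args=(),
    exc_info=None,
)
# Attributes that Logger.log(extra=...) would have merged in.
record.mitogen_message = 'connected'
record.mitogen_name = 'ssh.host'

print(record.getMessage())  # ssh.host: connected
```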
@@ -464,7 +496,7 @@ class ModuleFinder(object):
             # else we could return junk.
             return

-        pkgname, _, modname = fullname.rpartition('.')
+        pkgname, _, modname = str_rpartition(to_text(fullname), u'.')
         pkg = sys.modules.get(pkgname)
         if pkg is None or not hasattr(pkg, '__file__'):
             return
@@ -480,6 +512,7 @@ class ModuleFinder(object):
                 source = fp.read()
             finally:
+                if fp:
                     fp.close()

         if isinstance(source, mitogen.core.UnicodeType):
def add_source_override(self, fullname, path, source, is_pkg):
"""
Explicitly install a source cache entry, preventing usual lookup
methods from being used.
Beware the value of `path` is critical when `is_pkg` is specified,
since it directs where submodules are searched for.
:param str fullname:
Name of the module to override.
:param str path:
Module's path as it will appear in the cache.
:param bytes source:
Module source code as a bytestring.
:param bool is_pkg:
:data:`True` if the module is a package.
"""
self._found_cache[fullname] = (path, source, is_pkg)
get_module_methods = [_get_module_via_pkgutil, get_module_methods = [_get_module_via_pkgutil,
_get_module_via_sys_modules, _get_module_via_sys_modules,
_get_module_via_parent_enumeration] _get_module_via_parent_enumeration]
@@ -540,7 +592,7 @@ class ModuleFinder(object):
     def generate_parent_names(self, fullname):
         while '.' in fullname:
-            fullname, _, _ = fullname.rpartition('.')
+            fullname, _, _ = str_rpartition(to_text(fullname), u'.')
             yield fullname

     def find_related_imports(self, fullname):
@@ -583,7 +635,7 @@ class ModuleFinder(object):
         return self._related_cache.setdefault(fullname, sorted(
             set(
-                name
+                mitogen.core.to_text(name)
                 for name in maybe_names
                 if sys.modules.get(name) is not None
                 and not is_stdlib_name(name)
@@ -609,7 +661,7 @@ class ModuleFinder(object):
         while stack:
             name = stack.pop(0)
             names = self.find_related_imports(name)
-            stack.extend(set(names).difference(found, stack))
+            stack.extend(set(names).difference(set(found).union(stack)))
             found.update(names)

         found.discard(fullname)
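The rewrite above exists because Python 2.4's `set.difference()` accepted only a single iterable; building the union explicitly first is equivalent, as this sketch with illustrative data shows:

```python
names = {'a', 'b', 'c', 'd'}
found = {'a'}
stack = ['b']

# Multi-argument form (Python >= 2.6) and the 2.4-safe explicit union agree.
modern = names.difference(found, stack)
compat = set(names).difference(set(found).union(stack))
print(sorted(compat))  # ['c', 'd']
```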
@@ -643,6 +695,12 @@ class ModuleResponder(object):
     def __repr__(self):
         return 'ModuleResponder(%r)' % (self._router,)

+    def add_source_override(self, fullname, path, source, is_pkg):
+        """
+        See :meth:`ModuleFinder.add_source_override`.
+        """
+        self._finder.add_source_override(fullname, path, source, is_pkg)
+
     MAIN_RE = re.compile(b(r'^if\s+__name__\s*==\s*.__main__.\s*:'), re.M)
     main_guard_msg = (
         "A child context attempted to import __main__, however the main "
@@ -759,7 +817,7 @@ class ModuleResponder(object):
             return

         for name in tup[4]:  # related
-            parent, _, _ = name.partition('.')
+            parent, _, _ = str_partition(name, '.')
             if parent != fullname and parent not in stream.sent_modules:
                 # Parent hasn't been sent, so don't load submodule yet.
                 continue
@@ -802,7 +860,7 @@ class ModuleResponder(object):
         path = []
         while fullname:
             path.append(fullname)
-            fullname, _, _ = fullname.rpartition('.')
+            fullname, _, _ = str_rpartition(fullname, u'.')

         for fullname in reversed(path):
             stream = self._router.stream_by_id(context.context_id)
@@ -812,7 +870,7 @@ class ModuleResponder(object):
     def _forward_modules(self, context, fullnames):
         IOLOG.debug('%r._forward_modules(%r, %r)', self, context, fullnames)
         for fullname in fullnames:
-            self._forward_one_module(context, fullname)
+            self._forward_one_module(context, mitogen.core.to_text(fullname))

     def forward_modules(self, context, fullnames):
         self._router.broker.defer(self._forward_modules, context, fullnames)
@@ -873,7 +931,18 @@ class Router(mitogen.parent.Router):

     :param mitogen.master.Broker broker:
         Broker to use. If not specified, a private :class:`Broker` is created.
+
+    :param int max_message_size:
+        Override the maximum message size this router is willing to receive or
+        transmit. Any value set here is automatically inherited by any children
+        created by the router.
+
+        This has a liberal default of 128 MiB, but may be set much lower.
+        Beware that setting it below 64 KiB may encourage unexpected failures
+        as parents and children can no longer route large Python modules that
+        may be required by your application.
     """

     broker_class = Broker

     #: When :data:`True`, cause the broker thread and any subsequent broker and

@@ -60,17 +60,26 @@ except ImportError:

 import mitogen.core
 from mitogen.core import b
+from mitogen.core import bytes_partition
 from mitogen.core import LOG
 from mitogen.core import IOLOG

+try:
+    next
+except NameError:
+    # Python 2.4/2.5
+    from mitogen.core import next
+

 itervalues = getattr(dict, 'itervalues', dict.values)

 if mitogen.core.PY3:
     xrange = range
     closure_attr = '__closure__'
+    IM_SELF_ATTR = '__self__'
 else:
     closure_attr = 'func_closure'
+    IM_SELF_ATTR = 'im_self'


 try:
@@ -93,8 +102,25 @@ SYS_EXECUTABLE_MSG = (
 )
 _sys_executable_warning_logged = False

-LINUX_TIOCGPTN = 2147767344     # Get PTY number; asm-generic/ioctls.h
-LINUX_TIOCSPTLCK = 1074025521   # Lock/unlock PTY; asm-generic/ioctls.h
+
+def _ioctl_cast(n):
+    """
+    Linux ioctl() request parameter is unsigned, whereas on BSD/Darwin it is
+    signed. Until 2.5 Python exclusively implemented the BSD behaviour,
+    preventing use of large unsigned int requests like the TTY layer uses
+    below. So on 2.4, we cast our unsigned to look like signed for Python.
+    """
+    if sys.version_info < (2, 5):
+        n, = struct.unpack('i', struct.pack('I', n))
+    return n
+
+
+# Get PTY number; asm-generic/ioctls.h
+LINUX_TIOCGPTN = _ioctl_cast(2147767344)
+# Lock/unlock PTY; asm-generic/ioctls.h
+LINUX_TIOCSPTLCK = _ioctl_cast(1074025521)

 IS_LINUX = os.uname()[0] == 'Linux'

 SIGNAL_BY_NUM = dict(
@@ -537,7 +563,8 @@ class IteratingRead(object):
             for fd in self.poller.poll(self.timeout):
                 s, disconnected = mitogen.core.io_op(os.read, fd, 4096)
                 if disconnected or not s:
-                    IOLOG.debug('iter_read(%r) -> disconnected', fd)
+                    LOG.debug('iter_read(%r) -> disconnected: %s',
+                              fd, disconnected)
                     self.poller.stop_receive(fd)
                 else:
                     IOLOG.debug('iter_read(%r) -> %r', fd, s)
@@ -761,8 +788,9 @@ class CallSpec(object):
     def _get_name(self):
         bits = [self.func.__module__]
         if inspect.ismethod(self.func):
-            bits.append(getattr(self.func.__self__, '__name__', None) or
-                        getattr(type(self.func.__self__), '__name__', None))
+            im_self = getattr(self.func, IM_SELF_ATTR)
+            bits.append(getattr(im_self, '__name__', None) or
+                        getattr(type(im_self), '__name__', None))
         bits.append(self.func.__name__)
         return u'.'.join(bits)
@ -936,11 +964,15 @@ class EpollPoller(mitogen.core.Poller):
yield data yield data
POLLER_BY_SYSNAME = { if sys.version_info < (2, 6):
# 2.4 and 2.5 only had select.select() and select.poll().
POLLER_BY_SYSNAME = {}
else:
POLLER_BY_SYSNAME = {
'Darwin': KqueuePoller, 'Darwin': KqueuePoller,
'FreeBSD': KqueuePoller, 'FreeBSD': KqueuePoller,
'Linux': EpollPoller, 'Linux': EpollPoller,
} }
PREFERRED_POLLER = POLLER_BY_SYSNAME.get( PREFERRED_POLLER = POLLER_BY_SYSNAME.get(
os.uname()[0], os.uname()[0],
@ -1091,6 +1123,10 @@ class Stream(mitogen.core.Stream):
LOG.debug('%r: immediate child is detached, won\'t reap it', self) LOG.debug('%r: immediate child is detached, won\'t reap it', self)
return return
if self.profiling:
LOG.info('%r: wont kill child because profiling=True', self)
return
if self._reaped: if self._reaped:
# on_disconnect() may be invoked more than once, for example, if # on_disconnect() may be invoked more than once, for example, if
# there is still a pending message to be sent after the first # there is still a pending message to be sent after the first
@ -1443,9 +1479,10 @@ class CallChain(object):
raise TypeError(self.lambda_msg) raise TypeError(self.lambda_msg)
if inspect.ismethod(fn): if inspect.ismethod(fn):
if not inspect.isclass(fn.__self__): im_self = getattr(fn, IM_SELF_ATTR)
if not inspect.isclass(im_self):
raise TypeError(self.method_msg) raise TypeError(self.method_msg)
klass = mitogen.core.to_text(fn.__self__.__name__) klass = mitogen.core.to_text(im_self.__name__)
else: else:
klass = None klass = None
@ -1775,7 +1812,7 @@ class RouteMonitor(object):
if msg.is_dead: if msg.is_dead:
return return
target_id_s, _, target_name = msg.data.partition(b(':')) target_id_s, _, target_name = bytes_partition(msg.data, b(':'))
target_name = target_name.decode() target_name = target_name.decode()
target_id = int(target_id_s) target_id = int(target_id_s)
self.router.context_by_id(target_id).name = target_name self.router.context_by_id(target_id).name = target_name
@ -1931,8 +1968,10 @@ class Router(mitogen.core.Router):
via = kwargs.pop(u'via', None) via = kwargs.pop(u'via', None)
if via is not None: if via is not None:
return self.proxy_connect(via, method_name, name=name, **kwargs) return self.proxy_connect(via, method_name, name=name,
return self._connect(klass, name=name, **kwargs) **mitogen.core.Kwargs(kwargs))
return self._connect(klass, name=name,
**mitogen.core.Kwargs(kwargs))
def proxy_connect(self, via_context, method_name, name=None, **kwargs): def proxy_connect(self, via_context, method_name, name=None, **kwargs):
resp = via_context.call(_proxy_connect, resp = via_context.call(_proxy_connect,
@ -2054,7 +2093,7 @@ class ModuleForwarder(object):
if msg.is_dead: if msg.is_dead:
return return
context_id_s, _, fullname = msg.data.partition(b('\x00')) context_id_s, _, fullname = bytes_partition(msg.data, b('\x00'))
fullname = mitogen.core.to_text(fullname) fullname = mitogen.core.to_text(fullname)
context_id = int(context_id_s) context_id = int(context_id_s)
stream = self.router.stream_by_id(context_id) stream = self.router.stream_by_id(context_id)
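For reference, the hunks above swap `bytes.partition()` (a 2.5+ method) for a `mitogen.core.bytes_partition()` helper. A sketch of what such a helper must guarantee — this is an assumption about its shape, not the actual implementation, since only `find()` and slicing exist on 2.4 strings:

```python
def bytes_partition(s, sep):
    # Emulate s.partition(sep) using only find() and slicing, which exist on
    # Python 2.4 str/bytes. (Sketch; mitogen.core's real helper may differ.)
    idx = s.find(sep)
    if idx == -1:
        return s, type(s)(), type(s)()
    return s[:idx], sep, s[idx + len(sep):]

assert bytes_partition(b'123:name', b':') == (b'123', b':', b'name')
assert bytes_partition(b'no-sep', b':') == (b'no-sep', b'', b'')
```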

@@ -40,6 +40,16 @@ import mitogen.core
 import mitogen.select
 from mitogen.core import b
 from mitogen.core import LOG
+from mitogen.core import str_rpartition
+try:
+    all
+except NameError:
+    def all(it):
+        for elem in it:
+            if not elem:
+                return False
+        return True
 DEFAULT_POOL_SIZE = 16
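The `all()` polyfill above matches the documented behaviour of the 2.5+ built-in, including the vacuously-true empty case. A quick illustrative check:

```python
def all(it):
    # Backport of the 2.5+ built-in: true only if every element is truthy.
    for elem in it:
        if not elem:
            return False
    return True

assert all([]) is True             # vacuously true, like the built-in
assert all([1, 'x', [0]]) is True  # [0] is a non-empty list, hence truthy
assert all([1, 0, 2]) is False     # short-circuits at the first falsy element
```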
@@ -192,7 +202,7 @@ class Activator(object):
         )
     def activate(self, pool, service_name, msg):
-        mod_name, _, class_name = service_name.rpartition('.')
+        mod_name, _, class_name = str_rpartition(service_name, '.')
         if msg and not self.is_permitted(mod_name, class_name, msg):
             raise mitogen.core.CallError(self.not_active_msg, service_name)
@@ -556,7 +566,7 @@ class Pool(object):
                 self._worker_run()
             except Exception:
                 th = threading.currentThread()
-                LOG.exception('%r: worker %r crashed', self, th.name)
+                LOG.exception('%r: worker %r crashed', self, th.getName())
                 raise
     def __repr__(self):
@@ -564,7 +574,7 @@ class Pool(object):
         return 'mitogen.service.Pool(%#x, size=%d, th=%r)' % (
             id(self),
             len(self._threads),
-            th.name,
+            th.getName(),
         )
@@ -817,8 +827,8 @@ class FileService(Service):
             u'mode': st.st_mode,
             u'owner': self._name_or_none(pwd.getpwuid, 0, 'pw_name'),
             u'group': self._name_or_none(grp.getgrgid, 0, 'gr_name'),
-            u'mtime': st.st_mtime,
-            u'atime': st.st_atime,
+            u'mtime': float(st.st_mtime),  # Python 2.4 uses int.
+            u'atime': float(st.st_atime),  # Python 2.4 uses int.
         }
     def on_shutdown(self):

@@ -40,6 +40,12 @@ except ImportError:
 import mitogen.parent
 from mitogen.core import b
+from mitogen.core import bytes_partition
+try:
+    any
+except NameError:
+    from mitogen.core import any
 LOG = logging.getLogger('mitogen')
@@ -91,19 +97,19 @@ def filter_debug(stream, it):
                 # interesting token from above or the bootstrap
                 # ('password', 'MITO000\n').
                 break
-            elif buf.startswith(DEBUG_PREFIXES):
+            elif any(buf.startswith(p) for p in DEBUG_PREFIXES):
                 state = 'in_debug'
             else:
                 state = 'in_plain'
         elif state == 'in_debug':
             if b('\n') not in buf:
                 break
-            line, _, buf = buf.partition(b('\n'))
+            line, _, buf = bytes_partition(buf, b('\n'))
             LOG.debug('%r: %s', stream,
                       mitogen.core.to_text(line.rstrip()))
             state = 'start_of_line'
         elif state == 'in_plain':
-            line, nl, buf = buf.partition(b('\n'))
+            line, nl, buf = bytes_partition(buf, b('\n'))
             yield line + nl, not (nl or buf)
             if nl:
                 state = 'start_of_line'
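On 2.5+ the same test could be spelled `buf.startswith(tuple(DEBUG_PREFIXES))`, since `startswith()` learned to accept a tuple in 2.5; 2.4 accepted only a single prefix, hence the per-prefix `any()` spelling above. A sketch with hypothetical prefix values:

```python
DEBUG_PREFIXES = (b'debug:', b'trace:')  # hypothetical values for illustration

def is_debug(buf):
    # The per-prefix test works on every version, including Python 2.4.
    return any(buf.startswith(p) for p in DEBUG_PREFIXES)

assert is_debug(b'debug: hello')
assert not is_debug(b'MITO000\n')
# Equivalent on modern Pythons, where startswith() accepts a tuple directly:
assert is_debug(b'trace: x') == b'trace: x'.startswith(DEBUG_PREFIXES)
```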

@@ -45,6 +45,27 @@ else:
     iteritems = dict.iteritems
+def setup_gil():
+    """
+    Set extremely long GIL release interval to let threads naturally progress
+    through CPU-heavy sequences without forcing the wake of another thread
+    that may contend trying to run the same CPU-heavy code. For the new-style
+    Ansible work, this drops runtime ~33% and involuntary context switches by
+    >80%, essentially making threads cooperatively scheduled.
+    """
+    try:
+        # Python 2.
+        sys.setcheckinterval(100000)
+    except AttributeError:
+        pass
+    try:
+        # Python 3.
+        sys.setswitchinterval(10)
+    except AttributeError:
+        pass
 def disable_site_packages():
     """
     Remove all entries mentioning ``site-packages`` or ``Extras`` from
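The try/except dance exists because each interpreter line has its own knob: `sys.setcheckinterval()` on Python 2 (bytecode instructions between GIL checks) and `sys.setswitchinterval()` on Python 3 (seconds between forced switches); modern 3.x removed `setcheckinterval()` entirely. A standalone sketch of the same pattern:

```python
import sys

def setup_gil():
    # Same dance as the promoted mitogen.utils.setup_gil(): try each knob and
    # ignore the one this interpreter does not provide.
    try:
        sys.setcheckinterval(100000)   # Python 2 (removed in 3.9)
    except AttributeError:
        pass
    try:
        sys.setswitchinterval(10)      # Python 3: seconds between switches
    except AttributeError:
        pass

setup_gil()
if hasattr(sys, 'getswitchinterval'):
    assert sys.getswitchinterval() == 10
```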
@@ -62,7 +83,9 @@ def _formatTime(record, datefmt=None):
 def log_get_formatter():
-    datefmt = '%H:%M:%S.%f'
+    datefmt = '%H:%M:%S'
+    if sys.version_info > (2, 6):
+        datefmt += '.%f'
     fmt = '%(asctime)s %(levelname).1s %(name)s: %(message)s'
     formatter = logging.Formatter(fmt, datefmt)
     formatter.formatTime = _formatTime
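The reason for the version gate: `%f` (zero-padded microseconds) is a `datetime.strftime()` feature added in 2.6 — and `time.strftime()` never learned it at all, which is presumably why the formatter above overrides `formatTime`. Illustration on a modern interpreter:

```python
import datetime

# %f expands to six zero-padded microsecond digits on Python 2.6+.
dt = datetime.datetime(2018, 1, 2, 3, 4, 5, 678900)
assert dt.strftime('%H:%M:%S.%f') == '03:04:05.678900'
# Without %f, the pre-2.6 fallback format still works everywhere.
assert dt.strftime('%H:%M:%S') == '03:04:05'
```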

@@ -8,16 +8,25 @@ echo
 set -o errexit
 set -o pipefail
-UNIT2="$(which unit2)"
+if [ ! "$UNIT2" ]; then
+    UNIT2="$(which unit2)"
+fi
-coverage erase
+[ "$NOCOVERAGE" ] || coverage erase
 # First run overwites coverage output.
 [ "$SKIP_MITOGEN" ] || {
-    coverage run "${UNIT2}" discover \
-        --start-directory "tests" \
-        --pattern '*_test.py' \
-        "$@"
+    if [ ! "$NOCOVERAGE" ]; then
+        coverage run "${UNIT2}" discover \
+            --start-directory "tests" \
+            --pattern '*_test.py' \
+            "$@"
+    else
+        "${UNIT2}" discover \
+            --start-directory "tests" \
+            --pattern '*_test.py' \
+            "$@"
+    fi
 }
 # Second run appends. This is since 'discover' treats subdirs as packages and
@@ -27,11 +36,18 @@ coverage erase
 # mess of Git history.
 [ "$SKIP_ANSIBLE" ] || {
     export PYTHONPATH=`pwd`/tests:$PYTHONPATH
-    coverage run -a "${UNIT2}" discover \
-        --start-directory "tests/ansible" \
-        --pattern '*_test.py' \
-        "$@"
+    if [ ! "$NOCOVERAGE" ]; then
+        coverage run -a "${UNIT2}" discover \
+            --start-directory "tests/ansible" \
+            --pattern '*_test.py' \
+            "$@"
+    else
+        coverage run -a "${UNIT2}" discover \
+            --start-directory "tests/ansible" \
+            --pattern '*_test.py' \
+            "$@"
+    fi
 }
-coverage html
-echo coverage report is at "file://$(pwd)/htmlcov/index.html"
+[ "$NOCOVERAGE" ] || coverage html
+[ "$NOCOVERAGE" ] || echo coverage report is at "file://$(pwd)/htmlcov/index.html"
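The `[ "$VAR" ] ||` idiom the script now relies on runs its right-hand side only when the variable is empty or unset, which is how a bare `NOCOVERAGE=1` in the environment skips every coverage step. A minimal demonstration:

```shell
#!/bin/sh
# When NOCOVERAGE is set to any non-empty value, the guard short-circuits
# and the command after || never runs.
NOCOVERAGE=1
[ "$NOCOVERAGE" ] || echo "this line is skipped"

# When it is unset (or empty), the test fails and the command runs.
unset NOCOVERAGE
[ "$NOCOVERAGE" ] || echo "coverage path taken"
```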

@@ -50,7 +50,6 @@ setup(
     packages = find_packages(exclude=['tests', 'examples']),
     zip_safe = False,
     classifiers = [
-        'Development Status :: 3 - Alpha',
         'Environment :: Console',
         'Intended Audience :: System Administrators',
         'License :: OSI Approved :: BSD License',

@@ -1,3 +1,3 @@
-- import_playbook: regression/all.yml
-- import_playbook: integration/all.yml
+- include: regression/all.yml
+- include: integration/all.yml

@@ -1,9 +1,9 @@
-- import_playbook: copy.yml
-- import_playbook: fixup_perms2__copy.yml
-- import_playbook: low_level_execute_command.yml
-- import_playbook: make_tmp_path.yml
-- import_playbook: remote_expand_user.yml
-- import_playbook: remote_file_exists.yml
-- import_playbook: remove_tmp_path.yml
-- import_playbook: synchronize.yml
-- import_playbook: transfer_data.yml
+- include: copy.yml
+- include: fixup_perms2__copy.yml
+- include: low_level_execute_command.yml
+- include: make_tmp_path.yml
+- include: remote_expand_user.yml
+- include: remote_file_exists.yml
+- include: remove_tmp_path.yml
+- include: synchronize.yml
+- include: transfer_data.yml

@@ -3,18 +3,18 @@
 # This playbook imports all tests that are known to work at present.
 #
-- import_playbook: action/all.yml
-- import_playbook: async/all.yml
-- import_playbook: become/all.yml
-- import_playbook: connection/all.yml
-- import_playbook: connection_delegation/all.yml
-- import_playbook: connection_loader/all.yml
-- import_playbook: context_service/all.yml
-- import_playbook: glibc_caches/all.yml
-- import_playbook: local/all.yml
-- import_playbook: module_utils/all.yml
-- import_playbook: playbook_semantics/all.yml
-- import_playbook: runner/all.yml
-- import_playbook: ssh/all.yml
-- import_playbook: strategy/all.yml
-- import_playbook: stub_connections/all.yml
+- include: action/all.yml
+- include: async/all.yml
+- include: become/all.yml
+- include: connection/all.yml
+- include: connection_delegation/all.yml
+- include: connection_loader/all.yml
+- include: context_service/all.yml
+- include: glibc_caches/all.yml
+- include: local/all.yml
+- include: module_utils/all.yml
+- include: playbook_semantics/all.yml
+- include: runner/all.yml
+- include: ssh/all.yml
+- include: strategy/all.yml
+- include: stub_connections/all.yml

@@ -1,9 +1,9 @@
-- import_playbook: multiple_items_loop.yml
-- import_playbook: result_binary_producing_json.yml
-- import_playbook: result_binary_producing_junk.yml
-- import_playbook: result_shell_echo_hi.yml
-- import_playbook: runner_new_process.yml
-- import_playbook: runner_one_job.yml
-- import_playbook: runner_timeout_then_polling.yml
-- import_playbook: runner_two_simultaneous_jobs.yml
-- import_playbook: runner_with_polling_and_timeout.yml
+- include: multiple_items_loop.yml
+- include: result_binary_producing_json.yml
+- include: result_binary_producing_junk.yml
+- include: result_shell_echo_hi.yml
+- include: runner_new_process.yml
+- include: runner_one_job.yml
+- include: runner_timeout_then_polling.yml
+- include: runner_two_simultaneous_jobs.yml
+- include: runner_with_polling_and_timeout.yml

@@ -1,7 +1,7 @@
-- import_playbook: su_password.yml
-- import_playbook: sudo_flags_failure.yml
-- import_playbook: sudo_nonexistent.yml
-- import_playbook: sudo_nopassword.yml
-- import_playbook: sudo_password.yml
-- import_playbook: sudo_requiretty.yml
+- include: su_password.yml
+- include: sudo_flags_failure.yml
+- include: sudo_nonexistent.yml
+- include: sudo_nopassword.yml
+- include: sudo_password.yml
+- include: sudo_requiretty.yml

@@ -1,8 +1,8 @@
 ---
-- import_playbook: disconnect_during_module.yml
-- import_playbook: disconnect_resets_connection.yml
-- import_playbook: exec_command.yml
-- import_playbook: put_large_file.yml
-- import_playbook: put_small_file.yml
-- import_playbook: reset.yml
+- include: disconnect_during_module.yml
+- include: disconnect_resets_connection.yml
+- include: exec_command.yml
+- include: put_large_file.yml
+- include: put_small_file.yml
+- include: reset.yml

@@ -9,4 +9,4 @@
     file_name: large-file
     file_size: 512
   tasks:
-    - include_tasks: _put_file.yml
+    - include: _put_file.yml

@@ -9,4 +9,4 @@
     file_name: small-file
     file_size: 123
   tasks:
-    - include_tasks: _put_file.yml
+    - include: _put_file.yml

@@ -1,5 +1,5 @@
-- import_playbook: delegate_to_template.yml
-- import_playbook: local_action.yml
-- import_playbook: osa_container_standalone.yml
-- import_playbook: osa_delegate_to_self.yml
-- import_playbook: stack_construction.yml
+- include: delegate_to_template.yml
+- include: local_action.yml
+- include: osa_container_standalone.yml
+- include: osa_delegate_to_self.yml
+- include: stack_construction.yml

@@ -1,3 +1,3 @@
-- import_playbook: local_blemished.yml
-- import_playbook: paramiko_unblemished.yml
-- import_playbook: ssh_blemished.yml
+- include: local_blemished.yml
+- include: paramiko_unblemished.yml
+- include: ssh_blemished.yml

@@ -1,3 +1,3 @@
-- import_playbook: disconnect_cleanup.yml
-- import_playbook: lru_one_target.yml
-- import_playbook: reconnection.yml
+- include: disconnect_cleanup.yml
+- include: lru_one_target.yml
+- include: reconnection.yml

@@ -1,2 +1,2 @@
-- import_playbook: resolv_conf.yml
+- include: resolv_conf.yml

@@ -1,4 +1,4 @@
-- import_playbook: cwd_preserved.yml
-- import_playbook: env_preserved.yml
+- include: cwd_preserved.yml
+- include: env_preserved.yml

@@ -1,6 +1,6 @@
-#- import_playbook: from_config_path.yml
-#- import_playbook: from_config_path_pkg.yml
-#- import_playbook: adjacent_to_playbook.yml
-- import_playbook: adjacent_to_role.yml
-#- import_playbook: overrides_builtin.yml
+#- include: from_config_path.yml
+#- include: from_config_path_pkg.yml
+#- include: adjacent_to_playbook.yml
+- include: adjacent_to_role.yml
+#- include: overrides_builtin.yml

@@ -1,4 +1,4 @@
-- import_playbook: become_flags.yml
-- import_playbook: delegate_to.yml
-- import_playbook: environment.yml
-- import_playbook: with_items.yml
+- include: become_flags.yml
+- include: delegate_to.yml
+- include: environment.yml
+- include: with_items.yml

@@ -17,7 +17,7 @@
       MAGIC_ETC_ENV=555
     become: true
-  - include_tasks: _reset_conn.yml
+  - mitogen_shutdown_all:
     when: not is_mitogen
   - shell: echo $MAGIC_ETC_ENV
@@ -31,7 +31,7 @@
     state: absent
     become: true
-  - include_tasks: _reset_conn.yml
+  - mitogen_shutdown_all:
     when: not is_mitogen
   - shell: echo $MAGIC_ETC_ENV

@@ -1,21 +1,21 @@
-- import_playbook: atexit.yml
-- import_playbook: builtin_command_module.yml
-- import_playbook: custom_bash_hashbang_argument.yml
-- import_playbook: custom_bash_old_style_module.yml
-- import_playbook: custom_bash_want_json_module.yml
-- import_playbook: custom_binary_producing_json.yml
-- import_playbook: custom_binary_producing_junk.yml
-- import_playbook: custom_binary_single_null.yml
-- import_playbook: custom_perl_json_args_module.yml
-- import_playbook: custom_perl_want_json_module.yml
-- import_playbook: custom_python_json_args_module.yml
-- import_playbook: custom_python_new_style_missing_interpreter.yml
-- import_playbook: custom_python_new_style_module.yml
-- import_playbook: custom_python_want_json_module.yml
-- import_playbook: custom_script_interpreter.yml
-- import_playbook: environment_isolation.yml
-- import_playbook: etc_environment.yml
-- import_playbook: forking_active.yml
-- import_playbook: forking_correct_parent.yml
-- import_playbook: forking_inactive.yml
-- import_playbook: missing_module.yml
+- include: atexit.yml
+- include: builtin_command_module.yml
+- include: custom_bash_hashbang_argument.yml
+- include: custom_bash_old_style_module.yml
+- include: custom_bash_want_json_module.yml
+- include: custom_binary_producing_json.yml
+- include: custom_binary_producing_junk.yml
+- include: custom_binary_single_null.yml
+- include: custom_perl_json_args_module.yml
+- include: custom_perl_want_json_module.yml
+- include: custom_python_json_args_module.yml
+- include: custom_python_new_style_missing_interpreter.yml
+- include: custom_python_new_style_module.yml
+- include: custom_python_want_json_module.yml
+- include: custom_script_interpreter.yml
+- include: environment_isolation.yml
+- include: etc_environment.yml
+- include: forking_active.yml
+- include: forking_correct_parent.yml
+- include: forking_inactive.yml
+- include: missing_module.yml

@@ -25,8 +25,8 @@
     any_errors_fatal: true
   tasks:
     - assert:
-        that: |
-          out.failed and
-          out.results[0].failed and
-          out.results[0].msg == 'MODULE FAILURE' and
-          out.results[0].rc == 0
+        that:
+        - out.failed
+        - out.results[0].failed
+        - out.results[0].msg.startswith('MODULE FAILURE')
+        - out.results[0].rc == 0

@@ -14,7 +14,7 @@
       that:
       - "out.failed"
       - "out.results[0].failed"
-      - "out.results[0].msg == 'MODULE FAILURE'"
+      - "out.results[0].msg.startswith('MODULE FAILURE')"
       - "out.results[0].module_stdout.startswith('/bin/sh: ')"
       - |
         out.results[0].module_stdout.endswith('/custom_binary_single_null: cannot execute binary file\r\n') or

@@ -8,8 +8,12 @@
     register: out
   - assert:
-      that: |
-        (not out.changed) and
-        (not out.results[0].changed) and
-        out.results[0].input[0].foo and
-        out.results[0].message == 'I am a perl script! Here is my input.'
+      that:
+      - out.results[0].input[0].foo
+      - out.results[0].message == 'I am a perl script! Here is my input.'
+  - when: ansible_version.full > '2.4'
+    assert:
+      that:
+      - (not out.changed)
+      - (not out.results[0].changed)

@@ -8,8 +8,12 @@
     register: out
   - assert:
-      that: |
-        (not out.changed) and
-        (not out.results[0].changed) and
-        out.results[0].input[0].foo and
-        out.results[0].message == 'I am a want JSON perl script! Here is my input.'
+      that:
+      - out.results[0].input[0].foo
+      - out.results[0].message == 'I am a want JSON perl script! Here is my input.'
+  - when: ansible_version.full > '2.4'
+    assert:
+      that:
+      - (not out.changed)
+      - (not out.results[0].changed)

@@ -7,9 +7,9 @@
     any_errors_fatal: true
     gather_facts: true
   tasks:
-    - include_tasks: _etc_environment_user.yml
+    - include: _etc_environment_user.yml
       when: ansible_system == "Linux" and is_mitogen
-    - include_tasks: _etc_environment_global.yml
+    - include: _etc_environment_global.yml
       # Don't destroy laptops.
       when: ansible_virtualization_type == "docker"

@@ -1,3 +1,3 @@
-- import_playbook: config.yml
-- import_playbook: timeouts.yml
-- import_playbook: variables.yml
+- include: config.yml
+- include: timeouts.yml
+- include: variables.yml

@@ -1 +1 @@
-- import_playbook: mixed_vanilla_mitogen.yml
+- include: mixed_vanilla_mitogen.yml

@@ -1,7 +1,7 @@
-- import_playbook: kubectl.yml
-- import_playbook: lxc.yml
-- import_playbook: lxd.yml
-- import_playbook: mitogen_doas.yml
-- import_playbook: mitogen_sudo.yml
-- import_playbook: setns_lxc.yml
-- import_playbook: setns_lxd.yml
+- include: kubectl.yml
+- include: lxc.yml
+- include: lxd.yml
+- include: mitogen_doas.yml
+- include: mitogen_sudo.yml
+- include: setns_lxc.yml
+- include: setns_lxd.yml

@@ -11,7 +11,7 @@
   - meta: end_play
     when: not is_mitogen
-  - include_tasks: _end_play_if_not_sudo_linux.yml
+  - include: _end_play_if_not_sudo_linux.yml
   - command: |
       sudo -nE "{{lookup('env', 'VIRTUAL_ENV')}}/bin/ansible"

@@ -11,7 +11,7 @@
   - meta: end_play
     when: not is_mitogen
-  - include_tasks: _end_play_if_not_sudo_linux.yml
+  - include: _end_play_if_not_sudo_linux.yml
   - command: |
       sudo -nE "{{lookup('env', 'VIRTUAL_ENV')}}/bin/ansible"

@@ -1,2 +1,2 @@
-- import_playbook: kubectl.yml
+- include: kubectl.yml

@@ -12,6 +12,17 @@ import socket
 import sys
+try:
+    all
+except NameError:
+    # Python 2.4
+    def all(it):
+        for elem in it:
+            if not elem:
+                return False
+        return True
 def main():
     module = AnsibleModule(argument_spec={})
     module.exit_json(

@@ -1,7 +1,6 @@
 #!/usr/bin/python
 # I am an Ansible Python JSONARGS module. I should receive an encoding string.
-import json
 import sys
 json_arguments = """<<INCLUDE_ANSIBLE_MODULE_JSON_ARGS>>"""

@@ -12,8 +12,8 @@ import sys
 def main():
     module = AnsibleModule(argument_spec={
-        'key': {'type': str},
-        'val': {'type': str}
+        'key': {'type': 'str'},
+        'val': {'type': 'str'}
     })
     os.environ[module.params['key']] = module.params['val']
     module.exit_json(msg='Muahahaha!')

@@ -1,6 +1,5 @@
 # I am an Ansible new-style Python module, but I lack an interpreter.
-import json
 import sys
 # This is the magic marker Ansible looks for:

@@ -1,7 +1,6 @@
 #!/usr/bin/python
 # I am an Ansible new-style Python module. I should receive an encoding string.
-import json
 import sys
 # This is the magic marker Ansible looks for:

@@ -22,7 +22,7 @@ def execute(s, gbls, lcls):
 def main():
     module = AnsibleModule(argument_spec={
         'script': {
-            'type': str
+            'type': 'str'
         }
     })

@@ -1,9 +1,14 @@
 #!/usr/bin/python
-# I am an Ansible Python WANT_JSON module. I should receive an encoding string.
-import json
+# I am an Ansible Python WANT_JSON module. I should receive a JSON-encoded file.
 import sys
+try:
+    import json
+except ImportError:
+    import simplejson as json
 WANT_JSON = 1
@@ -16,12 +21,18 @@ if len(sys.argv) < 2:
 # Also must slurp in our own source code, to verify the encoding string was
 # added.
-with open(sys.argv[0]) as fp:
+fp = open(sys.argv[0])
+try:
     me = fp.read()
+finally:
+    fp.close()
 try:
-    with open(sys.argv[1]) as fp:
+    fp = open(sys.argv[1])
+    try:
         input_json = fp.read()
+    finally:
+        fp.close()
 except IOError:
     usage()

@@ -1,16 +1,12 @@
 # https://github.com/dw/mitogen/issues/297
 from __future__ import (absolute_import, division, print_function)
-__metaclass__ = type
-from ansible.plugins.vars import BaseVarsPlugin
 import os
-class VarsModule(BaseVarsPlugin):
+class VarsModule(object):
     def __init__(self, *args):
-        super(VarsModule, self).__init__(*args)
         os.environ['EVIL_VARS_PLUGIN'] = 'YIPEEE'
     def get_vars(self, loader, path, entities, cache=True):
-        super(VarsModule, self).get_vars(loader, path, entities)
         return {}

@@ -1,10 +1,10 @@
-- import_playbook: issue_109__target_has_old_ansible_installed.yml
-- import_playbook: issue_113__duplicate_module_imports.yml
-- import_playbook: issue_118__script_not_marked_exec.yml
-- import_playbook: issue_122__environment_difference.yml
-- import_playbook: issue_140__thread_pileup.yml
-- import_playbook: issue_152__local_action_wrong_interpreter.yml
-- import_playbook: issue_152__virtualenv_python_fails.yml
-- import_playbook: issue_154__module_state_leaks.yml
-- import_playbook: issue_177__copy_module_failing.yml
-- import_playbook: issue_332_ansiblemoduleerror_first_occurrence.yml
+- include: issue_109__target_has_old_ansible_installed.yml
+- include: issue_113__duplicate_module_imports.yml
+- include: issue_118__script_not_marked_exec.yml
+- include: issue_122__environment_difference.yml
+- include: issue_140__thread_pileup.yml
+- include: issue_152__local_action_wrong_interpreter.yml
+- include: issue_152__virtualenv_python_fails.yml
+- include: issue_154__module_state_leaks.yml
+- include: issue_177__copy_module_failing.yml
+- include: issue_332_ansiblemoduleerror_first_occurrence.yml

@@ -2,5 +2,5 @@
   tasks:
     - set_fact:
         content: "{% for x in range(126977) %}x{% endfor %}"
-    - include_tasks: _file_service_loop.yml
+    - include: _file_service_loop.yml
       with_sequence: start=1 end=100

@@ -6,9 +6,9 @@ import threading
 import time
 import mitogen
-import ansible_mitogen.process
-ansible_mitogen.process.setup_gil()
+import mitogen.utils
+mitogen.utils.setup_gil()
 X = 20000

@@ -2,11 +2,12 @@
 Measure latency of local RPC.
 """
-import mitogen
 import time
-import ansible_mitogen.process
-ansible_mitogen.process.setup_gil()
+import mitogen
+import mitogen.utils
+mitogen.utils.setup_gil()
 try:
     xrange

@@ -0,0 +1,74 @@
+# Verify throughput over sudo and SSH at various compression levels.
+
+import os
+import random
+import socket
+import subprocess
+import tempfile
+import time
+
+import mitogen
+import mitogen.service
+
+
+def prepare():
+    pass
+
+
+def transfer(context, path):
+    fp = open('/dev/null', 'wb')
+    mitogen.service.FileService.get(context, path, fp)
+    fp.close()
+
+
+def fill_with_random(fp, size):
+    n = 0
+    s = os.urandom(1048576*16)
+    while n < size:
+        fp.write(s)
+        n += len(s)
+
+
+def run_test(router, fp, s, context):
+    fp.seek(0, 2)
+    size = fp.tell()
+    print('Testing %s...' % (s,))
+    context.call(prepare)
+    t0 = time.time()
+    context.call(transfer, router.myself(), fp.name)
+    t1 = time.time()
+    print('%s took %.2f ms to transfer %.2f MiB, %.2f MiB/s' % (
+        s, 1000 * (t1 - t0), size / 1048576.0,
+        (size / (t1 - t0) / 1048576.0),
+    ))
+
+
+@mitogen.main()
+def main(router):
+    bigfile = tempfile.NamedTemporaryFile()
+    fill_with_random(bigfile, 1048576*512)
+
+    file_service = mitogen.service.FileService(router)
+    pool = mitogen.service.Pool(router, ())
+    file_service.register(bigfile.name)
+    pool.add(file_service)
+    try:
+        context = router.local()
+        run_test(router, bigfile, 'local()', context)
+        context.shutdown(wait=True)
+
+        context = router.sudo()
+        run_test(router, bigfile, 'sudo()', context)
+        context.shutdown(wait=True)
+
+        context = router.ssh(hostname='localhost', compression=False)
+        run_test(router, bigfile, 'ssh(compression=False)', context)
+        context.shutdown(wait=True)
+
+        context = router.ssh(hostname='localhost', compression=True)
+        run_test(router, bigfile, 'ssh(compression=True)', context)
+        context.shutdown(wait=True)
+    finally:
+        pool.stop()
+        bigfile.close()

@@ -31,9 +31,10 @@ class ConstructorTest(testlib.TestCase):
     def test_form_base_exc(self):
         ve = SystemExit('eek')
         e = self.klass(ve)
+        cls = ve.__class__
         self.assertEquals(e.args[0],
                           # varies across 2/3.
-                          '%s.%s: eek' % (type(ve).__module__, type(ve).__name__))
+                          '%s.%s: eek' % (cls.__module__, cls.__name__))
         self.assertTrue(isinstance(e.args[0], mitogen.core.UnicodeType))
     def test_from_exc_tb(self):
@@ -72,7 +73,7 @@ class UnpickleCallErrorTest(testlib.TestCase):
     def test_reify(self):
         e = self.func(u'some error')
-        self.assertEquals(mitogen.core.CallError, type(e))
+        self.assertEquals(mitogen.core.CallError, e.__class__)
         self.assertEquals(1, len(e.args))
         self.assertEquals(mitogen.core.UnicodeType, type(e.args[0]))
         self.assertEquals(u'some error', e.args[0])

@ -6,6 +6,7 @@ import unittest2
import mitogen.core import mitogen.core
import mitogen.parent import mitogen.parent
import mitogen.master import mitogen.master
from mitogen.core import str_partition
import testlib import testlib
import plain_old_module import plain_old_module
@ -50,7 +51,7 @@ class CallFunctionTest(testlib.RouterMixin, testlib.TestCase):
def setUp(self): def setUp(self):
super(CallFunctionTest, self).setUp() super(CallFunctionTest, self).setUp()
self.local = self.router.fork() self.local = self.router.local()
def test_succeeds(self): def test_succeeds(self):
self.assertEqual(3, self.local.call(function_that_adds_numbers, 1, 2)) self.assertEqual(3, self.local.call(function_that_adds_numbers, 1, 2))
@ -65,11 +66,11 @@ class CallFunctionTest(testlib.RouterMixin, testlib.TestCase):
exc = self.assertRaises(mitogen.core.CallError, exc = self.assertRaises(mitogen.core.CallError,
lambda: self.local.call(function_that_fails)) lambda: self.local.call(function_that_fails))
s = str(exc) s = mitogen.core.to_text(exc)
etype, _, s = s.partition(': ') etype, _, s = str_partition(s, u': ')
self.assertEqual(etype, 'plain_old_module.MyError') self.assertEqual(etype, u'plain_old_module.MyError')
msg, _, s = s.partition('\n') msg, _, s = str_partition(s, u'\n')
self.assertEqual(msg, 'exception text') self.assertEqual(msg, 'exception text')
# Traceback # Traceback
@ -127,7 +128,7 @@ class CallChainTest(testlib.RouterMixin, testlib.TestCase):
def setUp(self): def setUp(self):
super(CallChainTest, self).setUp() super(CallChainTest, self).setUp()
self.local = self.router.fork() self.local = self.router.local()
def test_subsequent_calls_produce_same_error(self): def test_subsequent_calls_produce_same_error(self):
chain = self.klass(self.local, pipelined=True) chain = self.klass(self.local, pipelined=True)
@@ -162,7 +163,7 @@ class UnsupportedCallablesTest(testlib.RouterMixin, testlib.TestCase):
     def setUp(self):
         super(UnsupportedCallablesTest, self).setUp()
-        self.local = self.router.fork()
+        self.local = self.router.local()

     def test_closures_unsuppored(self):
         a = 1

@@ -0,0 +1 @@
+*.tar.bz2 filter=lfs diff=lfs merge=lfs -text

@@ -1,9 +0,0 @@
-# https://www.toofishes.net/blog/trouble-sudoers-or-last-entry-wins/
-%mitogen__sudo_nopw ALL=(ALL:ALL) NOPASSWD:ALL
-mitogen__has_sudo_nopw ALL = (mitogen__pw_required) ALL
-mitogen__has_sudo_nopw ALL = (mitogen__require_tty_pw_required) ALL
-Defaults>mitogen__pw_required targetpw
-Defaults>mitogen__require_tty requiretty
-Defaults>mitogen__require_tty_pw_required requiretty,targetpw

@@ -0,0 +1,6 @@
+def ping(*args):
+    return args

@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:123ddbd9055745d37e8f14bf1c8352541ff4d500e6daa4aa3165e604fb7e8b6a
+size 6176131

@@ -2,7 +2,6 @@
 import os
 import shutil
-import timeoutcontext
 import unittest2

 import mitogen.fakessh
@@ -11,7 +10,6 @@ import testlib

 class RsyncTest(testlib.DockerMixin, testlib.TestCase):
-    @timeoutcontext.timeout(5)
     @unittest2.skip('broken')
     def test_rsync_from_master(self):
         context = self.docker_ssh_any()
@@ -28,7 +26,6 @@ class RsyncTest(testlib.DockerMixin, testlib.TestCase):
         self.assertTrue(context.call(os.path.exists, '/tmp/data'))
         self.assertTrue(context.call(os.path.exists, '/tmp/data/simple_pkg/a.py'))

-    @timeoutcontext.timeout(5)
     @unittest2.skip('broken')
     def test_rsync_between_direct_children(self):
         # master -> SSH -> mitogen__has_sudo_pubkey -> rsync(.ssh) -> master ->

@@ -30,6 +30,7 @@ class CommandLineTest(testlib.RouterMixin, testlib.TestCase):
         # success.
         fp = open("/dev/null", "r")
+        try:
             proc = subprocess.Popen(args,
                 stdin=fp,
                 stdout=subprocess.PIPE,
@@ -38,7 +39,9 @@ class CommandLineTest(testlib.RouterMixin, testlib.TestCase):
             stdout, stderr = proc.communicate()
             self.assertEquals(0, proc.returncode)
             self.assertEquals(mitogen.parent.Stream.EC0_MARKER, stdout)
-            self.assertIn(b("Error -5 while decompressing data: incomplete or truncated stream"), stderr)
+            self.assertIn(b("Error -5 while decompressing data"), stderr)
+        finally:
+            fp.close()

 if __name__ == '__main__':

@@ -1,12 +1,25 @@
-import _ssl
-import ctypes
 import os
 import random
-import ssl
 import struct
 import sys

+try:
+    import _ssl
+except ImportError:
+    _ssl = None
+
+try:
+    import ssl
+except ImportError:
+    ssl = None
+
+try:
+    import ctypes
+except ImportError:
+    # Python 2.4
+    ctypes = None
+
 import mitogen
 import unittest2
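The hunk above is the guarded-import pattern: modules that may be missing on old interpreters are bound to `None` instead of failing at import time, and every later use checks the sentinel first. A self-contained sketch (names here are illustrative, not part of the diff):

```python
# Bind an optional dependency to None when absent, as the diff does for
# _ssl, ssl and ctypes on Python 2.4.
try:
    import ctypes
except ImportError:
    ctypes = None  # Python 2.4: no ctypes in the stdlib

def load_library(path):
    # Gate every use on the sentinel: return None rather than raising
    # when the optional dependency (or the path) is unavailable.
    if ctypes is None or path is None:
        return None
    return ctypes.CDLL(path)
```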
@@ -29,16 +42,17 @@ def _find_ssl_darwin():
         return bits[1]

-if sys.platform.startswith('linux'):
+if ctypes and sys.platform.startswith('linux'):
     LIBSSL_PATH = _find_ssl_linux()
-elif sys.platform == 'darwin':
+elif ctypes and sys.platform == 'darwin':
     LIBSSL_PATH = _find_ssl_darwin()
 else:
-    assert 0, "Don't know how to find libssl on this platform"
-
-c_ssl = ctypes.CDLL(LIBSSL_PATH)
-c_ssl.RAND_pseudo_bytes.argtypes = [ctypes.c_char_p, ctypes.c_int]
-c_ssl.RAND_pseudo_bytes.restype = ctypes.c_int
+    LIBSSL_PATH = None
+
+if ctypes and LIBSSL_PATH:
+    c_ssl = ctypes.CDLL(LIBSSL_PATH)
+    c_ssl.RAND_pseudo_bytes.argtypes = [ctypes.c_char_p, ctypes.c_int]
+    c_ssl.RAND_pseudo_bytes.restype = ctypes.c_int

 def ping():
@@ -64,6 +78,12 @@ def exercise_importer(n):
     return simple_pkg.a.subtract_one_add_two(n)

+skipIfUnsupported = unittest2.skipIf(
+    condition=(not mitogen.fork.FORK_SUPPORTED),
+    reason="mitogen.fork unsupported on this platform"
+)
+
 class ForkTest(testlib.RouterMixin, testlib.TestCase):
     def test_okay(self):
         context = self.router.fork()
@@ -74,6 +94,10 @@ class ForkTest(testlib.RouterMixin, testlib.TestCase):
         context = self.router.fork()
         self.assertNotEqual(context.call(random_random), random_random())

+    @unittest2.skipIf(
+        condition=LIBSSL_PATH is None or ctypes is None,
+        reason='cant test libssl on this platform',
+    )
     def test_ssl_module_diverges(self):
         # Ensure generator state is initialized.
         RAND_pseudo_bytes()
@@ -93,6 +117,8 @@ class ForkTest(testlib.RouterMixin, testlib.TestCase):
         context = self.router.fork(on_start=on_start)
         self.assertEquals(123, recv.get().unpickle())

+ForkTest = skipIfUnsupported(ForkTest)
+
 class DoubleChildTest(testlib.RouterMixin, testlib.TestCase):
     def test_okay(self):
@@ -115,6 +141,8 @@ class DoubleChildTest(testlib.RouterMixin, testlib.TestCase):
         c2 = self.router.fork(name='c2', via=c1)
         self.assertEqual(2, c2.call(exercise_importer, 1))

+DoubleChildTest = skipIfUnsupported(DoubleChildTest)
+
 if __name__ == '__main__':
     unittest2.main()
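`ForkTest = skipIfUnsupported(ForkTest)` applies the skip as a plain function call rather than with `@skipIfUnsupported` above the class, because class decorator syntax only parses on Python 2.6+. A minimal sketch of the same trick using stdlib `unittest` in place of `unittest2`:

```python
import unittest

class DemoTest(unittest.TestCase):
    def test_noop(self):
        pass

# Equivalent of writing '@unittest.skipIf(...)' above the class body,
# but expressed as a call so Python < 2.6 can still parse the file.
DemoTest = unittest.skipIf(True, 'demonstration skip')(DemoTest)
```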

@@ -22,7 +22,6 @@
 packages:
   common:
-    - git
     - openssh-server
     - rsync
     - strace
@@ -32,6 +31,9 @@
     - libjson-perl
     - python-virtualenv
   CentOS:
+    "5":
+      - sudo
+      #- perl-JSON -- skipped on CentOS 5, packages are a pain.
     "6":
       - perl-JSON
     "7":
@@ -91,9 +93,23 @@
         dest: /etc/ssh/banner.txt
         src: ../data/docker/ssh_login_banner.txt

-    - copy:
-        dest: /etc/sudoers.d/001-mitogen
-        src: ../data/docker/001-mitogen.sudo
+    - name: Ensure /etc/sudoers.d exists
+      file:
+        state: directory
+        path: /etc/sudoers.d
+        mode: 'u=rwx,go='
+
+    - blockinfile:
+        path: /etc/sudoers
+        block: |
+          # https://www.toofishes.net/blog/trouble-sudoers-or-last-entry-wins/
+          %mitogen__sudo_nopw ALL=(ALL:ALL) NOPASSWD:ALL
+          mitogen__has_sudo_nopw ALL = (mitogen__pw_required) ALL
+          mitogen__has_sudo_nopw ALL = (mitogen__require_tty_pw_required) ALL
+          Defaults>mitogen__pw_required targetpw
+          Defaults>mitogen__require_tty requiretty
+          Defaults>mitogen__require_tty_pw_required requiretty,targetpw

     - lineinfile:
         path: /etc/sudoers

@@ -67,18 +67,18 @@
         shell: /bin/bash
         groups: "{{user_groups[item]|default(['mitogen__group'])}}"
         password: "{{ (item + '_password') | password_hash('sha256') }}"
-      loop: "{{all_users}}"
+      with_items: "{{all_users}}"
       when: ansible_system != 'Darwin'

     - user:
         name: "mitogen__{{item}}"
         shell: /bin/bash
         groups: "{{user_groups[item]|default(['mitogen__group'])}}"
         password: "{{item}}_password"
-      loop: "{{all_users}}"
+      with_items: "{{all_users}}"
       when: ansible_system == 'Darwin'

     - name: Hide users from login window.
-      loop: "{{all_users}}"
+      with_items: "{{all_users}}"
       when: ansible_system == 'Darwin'
       osx_defaults:
         array_add: true
@@ -149,4 +149,4 @@
       lineinfile:
         path: /etc/sudoers
         line: "{{lookup('pipe', 'whoami')}} ALL = (mitogen__{{item}}) NOPASSWD:ALL"
-      loop: "{{normal_users}}"
+      with_items: "{{normal_users}}"

@@ -25,6 +25,7 @@ def sh(s, *args):
 label_by_id = {}

 for base_image, label in [
+        ('astj/centos5-vault', 'centos5'),  # Python 2.4.3
         ('debian:stretch', 'debian'),       # Python 2.7.13, 3.5.3
         ('centos:6', 'centos6'),            # Python 2.6.6
         ('centos:7', 'centos7')             # Python 2.7.5

@@ -10,5 +10,5 @@
     Ubuntu: sudo
     CentOS: wheel

-- import_playbook: _container_setup.yml
-- import_playbook: _user_accounts.yml
+- include: _container_setup.yml
+- include: _user_accounts.yml

@@ -0,0 +1,32 @@
+#!/bin/bash
+# Build the tests/data/ubuntu-python-2.4.6.tar.bz2 tarball.
+set -ex
+
+[ "" ] && {
+    wget -cO setuptools-1.4.2.tar.gz https://files.pythonhosted.org/packages/source/s/setuptools/setuptools-1.4.2.tar.gz
+    wget -cO ez_setup.py https://raw.githubusercontent.com/pypa/setuptools/bootstrap-py24/ez_setup.py
+    wget -cO simplejson-2.0.9.tar.gz https://github.com/simplejson/simplejson/archive/v2.0.9.tar.gz
+    wget -cO psutil-2.1.3.tar.gz https://github.com/giampaolo/psutil/archive/release-2.1.3.tar.gz
+    wget -cO unittest2-0.5.1.zip http://voidspace.org.uk/downloads/unittest2-0.5.1-python2.3.zip
+    wget -cO cpython-2.4.6.tar.gz https://github.com/python/cpython/archive/v2.4.6.tar.gz
+    wget -cO mock-0.8.0.tar.gz https://github.com/testing-cabal/mock/archive/0.8.0.tar.gz
+    tar xzvf cpython-2.4.6.tar.gz
+}
+
+(
+    cd cpython-2.4.6
+    ./configure --prefix=/usr/local/python2.4.6 --with-pydebug --enable-debug CFLAGS="-g -O0"
+    echo 'zlib zlibmodule.c -I$(prefix)/include -L$(exec_prefix)/lib -lz' >> Modules/Setup.config
+    make -j 8
+    sudo make install
+)
+
+sudo /usr/local/python2.4.6/bin/python2.4 ez_setup.py
+sudo /usr/local/python2.4.6/bin/easy_install psutil-2.1.3.tar.gz
+sudo /usr/local/python2.4.6/bin/easy_install simplejson-2.0.9.tar.gz
+sudo /usr/local/python2.4.6/bin/easy_install unittest2-0.5.1.zip
+sudo /usr/local/python2.4.6/bin/easy_install mock-0.8.0.tar.gz
+sudo find /usr/local/python2.4.6 -name '*.py[co]' -delete
+tar jcvf ubuntu-python-2.4.6.tar.bz2 /usr/local/python2.4.6

@@ -1,12 +1,10 @@
-import email.utils
 import sys
 import threading
 import types
 import zlib

 import mock
-import pytest
 import unittest2

 import mitogen.core
@@ -144,7 +142,6 @@ class LoadModulePackageTest(ImporterMixin, testlib.TestCase):

 class EmailParseAddrSysTest(testlib.RouterMixin, testlib.TestCase):
-    @pytest.fixture(autouse=True)
     def initdir(self, caplog):
         self.caplog = caplog
@@ -212,5 +209,10 @@ class ImporterBlacklistTest(testlib.TestCase):
         self.assertTrue(mitogen.core.is_blacklisted_import(importer, 'builtins'))

+class Python24LineCacheTest(testlib.TestCase):
+    # TODO: mitogen.core.Importer._update_linecache()
+    pass
+
 if __name__ == '__main__':
     unittest2.main()

@@ -9,6 +9,15 @@ import testlib

 import mitogen.core

+def py24_mock_fix(m):
+    def wrapper(*args, **kwargs):
+        ret = m(*args, **kwargs)
+        if isinstance(ret, Exception):
+            raise ret
+        return ret
+    return wrapper
+
 class RestartTest(object):
     func = staticmethod(mitogen.core.io_op)
     exception_class = None
@@ -21,7 +30,7 @@ class RestartTest(object):
             self.exception_class(errno.EINTR),
             'yay',
         ]
-        rc, disconnected = self.func(m, 'input')
+        rc, disconnected = self.func(py24_mock_fix(m), 'input')
         self.assertEquals(rc, 'yay')
         self.assertFalse(disconnected)
         self.assertEquals(4, m.call_count)
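`py24_mock_fix` exists because mock 0.8.0, the newest release that still runs on Python 2.4, returned exception instances from a `side_effect` list instead of raising them. A self-contained sketch, with a plain closure standing in for the old `Mock` object (the stand-in is an assumption for illustration):

```python
def py24_mock_fix(m):
    # Re-raise exception instances that ancient mock releases merely
    # *returned* from a side_effect list.
    def wrapper(*args, **kwargs):
        ret = m(*args, **kwargs)
        if isinstance(ret, Exception):
            raise ret
        return ret
    return wrapper

def old_style_side_effect(results):
    # Hypothetical stand-in for mock 0.8.0: each call returns the next
    # element of the list, even when that element is an exception.
    it = iter(results)
    return lambda *a, **kw: next(it)

stub = py24_mock_fix(old_style_side_effect([OSError(4, 'EINTR'), 'yay']))
try:
    stub()
    raised = False
except OSError:
    raised = True
```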

@@ -3,6 +3,11 @@ import os
 import mitogen
 import mitogen.lxc

+try:
+    any
+except NameError:
+    from mitogen.core import any
+
 import unittest2
 import testlib
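The `NameError` guard above covers Python 2.4, where the `any()` builtin (added in 2.5) does not exist, by importing a polyfill from `mitogen.core`. An equivalent definition, as a sketch:

```python
def any_compat(iterable):
    # Equivalent of the any() builtin added in Python 2.5: true if any
    # element is truthy, short-circuiting on the first hit.
    for element in iterable:
        if element:
            return True
    return False
```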

@@ -162,16 +162,63 @@ class ResolveRelPathTest(testlib.TestCase):
         self.assertEquals('', self.call('email.utils', 3))

+class FakeSshTest(testlib.TestCase):
+    klass = mitogen.master.ModuleFinder
+
+    def call(self, fullname):
+        return self.klass().find_related_imports(fullname)
+
+    def test_simple(self):
+        import mitogen.fakessh
+        related = self.call('mitogen.fakessh')
+        self.assertEquals(related, [
+            'mitogen',
+            'mitogen.core',
+            'mitogen.master',
+            'mitogen.parent',
+        ])
+
+
+class FindRelatedTest(testlib.TestCase):
+    klass = mitogen.master.ModuleFinder
+
+    def call(self, fullname):
+        return self.klass().find_related(fullname)
+
+    SIMPLE_EXPECT = set([
+        u'mitogen',
+        u'mitogen.core',
+        u'mitogen.master',
+        u'mitogen.minify',
+        u'mitogen.parent',
+    ])
+    if sys.version_info < (3, 2):
+        SIMPLE_EXPECT.add('mitogen.compat')
+        SIMPLE_EXPECT.add('mitogen.compat.functools')
+    if sys.version_info < (2, 7):
+        SIMPLE_EXPECT.add('mitogen.compat.tokenize')
+    if sys.version_info < (2, 6):
+        SIMPLE_EXPECT.add('mitogen.compat.pkgutil')
+
+    def test_simple(self):
+        import mitogen.fakessh
+        related = self.call('mitogen.fakessh')
+        self.assertEquals(set(related), self.SIMPLE_EXPECT)
+
+
-class DjangoMixin(object):
-    WEBPROJECT_PATH = testlib.data_path('webproject')
+if sys.version_info > (2, 6):
+    class DjangoMixin(object):
+        WEBPROJECT_PATH = testlib.data_path('webproject')

-    # TODO: rip out Django and replace with a static tree of weird imports that
-    # don't depend on .. Django! The hack below is because the version of
-    # Django we need to test against 2.6 doesn't actually run on 3.6. But we
-    # don't care, we just need to be able to import it.
-    #
-    #   File "django/utils/html_parser.py", line 12, in <module>
-    #   AttributeError: module 'html.parser' has no attribute 'HTMLParseError'
-    #
-    import pkg_resources._vendor.six
-    from django.utils.six.moves import html_parser as _html_parser
+        # TODO: rip out Django and replace with a static tree of weird imports
+        # that don't depend on .. Django! The hack below is because the version
+        # of Django we need to test against 2.6 doesn't actually run on 3.6.
+        # But we don't care, we just need to be able to import it.
+        #
+        #   File "django/utils/html_parser.py", line 12, in <module>
+        #   AttributeError: module 'html.parser' has no attribute
+        #   'HTMLParseError'
+        #
+        import pkg_resources._vendor.six
+        from django.utils.six.moves import html_parser as _html_parser
@@ -190,22 +237,12 @@ class DjangoMixin(object):
             super(DjangoMixin, cls).tearDownClass()

     class FindRelatedImportsTest(DjangoMixin, testlib.TestCase):
         klass = mitogen.master.ModuleFinder

         def call(self, fullname):
             return self.klass().find_related_imports(fullname)

-        def test_simple(self):
-            import mitogen.fakessh
-            related = self.call('mitogen.fakessh')
-            self.assertEquals(related, [
-                'mitogen',
-                'mitogen.core',
-                'mitogen.master',
-                'mitogen.parent',
-            ])
-
         def test_django_db(self):
             import django.db
             related = self.call('django.db')
@@ -222,52 +259,26 @@ class FindRelatedImportsTest(DjangoMixin, testlib.TestCase):
             related = self.call('django.db.models')
             self.maxDiff=None
             self.assertEquals(related, [
-                'django',
-                'django.core.exceptions',
-                'django.db',
-                'django.db.models',
-                'django.db.models.aggregates',
-                'django.db.models.base',
-                'django.db.models.deletion',
-                'django.db.models.expressions',
-                'django.db.models.fields',
-                'django.db.models.fields.files',
-                'django.db.models.fields.related',
-                'django.db.models.fields.subclassing',
-                'django.db.models.loading',
-                'django.db.models.manager',
-                'django.db.models.query',
-                'django.db.models.signals',
+                u'django',
+                u'django.core.exceptions',
+                u'django.db',
+                u'django.db.models',
+                u'django.db.models.aggregates',
+                u'django.db.models.base',
+                u'django.db.models.deletion',
+                u'django.db.models.expressions',
+                u'django.db.models.fields',
+                u'django.db.models.fields.files',
+                u'django.db.models.fields.related',
+                u'django.db.models.fields.subclassing',
+                u'django.db.models.loading',
+                u'django.db.models.manager',
+                u'django.db.models.query',
+                u'django.db.models.signals',
             ])
-class FindRelatedTest(DjangoMixin, testlib.TestCase):
-    klass = mitogen.master.ModuleFinder
-
-    def call(self, fullname):
-        return self.klass().find_related(fullname)
-
-    SIMPLE_EXPECT = set([
-        'mitogen',
-        'mitogen.core',
-        'mitogen.master',
-        'mitogen.minify',
-        'mitogen.parent',
-    ])
-    if sys.version_info < (3, 2):
-        SIMPLE_EXPECT.add('mitogen.compat')
-        SIMPLE_EXPECT.add('mitogen.compat.functools')
-    if sys.version_info < (2, 7):
-        SIMPLE_EXPECT.add('mitogen.compat.tokenize')
-
-    def test_simple(self):
-        import mitogen.fakessh
-        related = self.call('mitogen.fakessh')
-        self.assertEquals(set(related), self.SIMPLE_EXPECT)
-
     class DjangoFindRelatedTest(DjangoMixin, testlib.TestCase):
         klass = mitogen.master.ModuleFinder
         maxDiff = None
@@ -278,119 +289,120 @@ class DjangoFindRelatedTest(DjangoMixin, testlib.TestCase):
         import django.db
         related = self.call('django.db')
         self.assertEquals(related, [
-            'django',
-            'django.conf',
-            'django.conf.global_settings',
-            'django.core',
-            'django.core.exceptions',
-            'django.core.signals',
-            'django.db.utils',
-            'django.dispatch',
-            'django.dispatch.dispatcher',
-            'django.dispatch.saferef',
-            'django.utils',
-            'django.utils._os',
-            'django.utils.encoding',
-            'django.utils.functional',
-            'django.utils.importlib',
-            'django.utils.module_loading',
-            'django.utils.six',
+            u'django',
+            u'django.conf',
+            u'django.conf.global_settings',
+            u'django.core',
+            u'django.core.exceptions',
+            u'django.core.signals',
+            u'django.db.utils',
+            u'django.dispatch',
+            u'django.dispatch.dispatcher',
+            u'django.dispatch.saferef',
+            u'django.utils',
+            u'django.utils._os',
+            u'django.utils.encoding',
+            u'django.utils.functional',
+            u'django.utils.importlib',
+            u'django.utils.module_loading',
+            u'django.utils.six',
         ])
+    @unittest2.skipIf(
+        condition=(sys.version_info >= (3, 0)),
+        reason='broken due to ancient vendored six.py'
+    )
     def test_django_db_models(self):
-        if sys.version_info >= (3, 0):
-            raise unittest2.SkipTest('broken due to ancient vendored six.py')
         import django.db.models
         related = self.call('django.db.models')
         self.assertEquals(related, [
-            'django',
-            'django.conf',
-            'django.conf.global_settings',
-            'django.core',
-            'django.core.exceptions',
-            'django.core.files',
-            'django.core.files.base',
-            'django.core.files.images',
-            'django.core.files.locks',
-            'django.core.files.move',
-            'django.core.files.storage',
-            'django.core.files.utils',
-            'django.core.signals',
-            'django.core.validators',
-            'django.db',
-            'django.db.backends',
-            'django.db.backends.signals',
-            'django.db.backends.util',
-            'django.db.models.aggregates',
-            'django.db.models.base',
-            'django.db.models.constants',
-            'django.db.models.deletion',
-            'django.db.models.expressions',
-            'django.db.models.fields',
-            'django.db.models.fields.files',
-            'django.db.models.fields.proxy',
-            'django.db.models.fields.related',
-            'django.db.models.fields.subclassing',
-            'django.db.models.loading',
-            'django.db.models.manager',
-            'django.db.models.options',
-            'django.db.models.query',
-            'django.db.models.query_utils',
-            'django.db.models.related',
-            'django.db.models.signals',
-            'django.db.models.sql',
-            'django.db.models.sql.aggregates',
-            'django.db.models.sql.constants',
-            'django.db.models.sql.datastructures',
-            'django.db.models.sql.expressions',
-            'django.db.models.sql.query',
-            'django.db.models.sql.subqueries',
-            'django.db.models.sql.where',
-            'django.db.transaction',
-            'django.db.utils',
-            'django.dispatch',
-            'django.dispatch.dispatcher',
-            'django.dispatch.saferef',
-            'django.forms',
-            'django.forms.fields',
-            'django.forms.forms',
-            'django.forms.formsets',
-            'django.forms.models',
-            'django.forms.util',
-            'django.forms.widgets',
-            'django.utils',
-            'django.utils._os',
-            'django.utils.crypto',
-            'django.utils.datastructures',
-            'django.utils.dateformat',
-            'django.utils.dateparse',
-            'django.utils.dates',
-            'django.utils.datetime_safe',
-            'django.utils.decorators',
-            'django.utils.deprecation',
-            'django.utils.encoding',
-            'django.utils.formats',
-            'django.utils.functional',
-            'django.utils.html',
-            'django.utils.html_parser',
-            'django.utils.importlib',
-            'django.utils.ipv6',
-            'django.utils.itercompat',
-            'django.utils.module_loading',
-            'django.utils.numberformat',
-            'django.utils.safestring',
-            'django.utils.six',
-            'django.utils.text',
-            'django.utils.timezone',
-            'django.utils.translation',
-            'django.utils.tree',
-            'django.utils.tzinfo',
-            'pytz',
-            'pytz.exceptions',
-            'pytz.lazy',
-            'pytz.tzfile',
-            'pytz.tzinfo',
+            u'django',
+            u'django.conf',
+            u'django.conf.global_settings',
+            u'django.core',
+            u'django.core.exceptions',
+            u'django.core.files',
+            u'django.core.files.base',
+            u'django.core.files.images',
+            u'django.core.files.locks',
+            u'django.core.files.move',
+            u'django.core.files.storage',
+            u'django.core.files.utils',
+            u'django.core.signals',
+            u'django.core.validators',
+            u'django.db',
+            u'django.db.backends',
+            u'django.db.backends.signals',
+            u'django.db.backends.util',
+            u'django.db.models.aggregates',
+            u'django.db.models.base',
+            u'django.db.models.constants',
+            u'django.db.models.deletion',
+            u'django.db.models.expressions',
+            u'django.db.models.fields',
+            u'django.db.models.fields.files',
+            u'django.db.models.fields.proxy',
+            u'django.db.models.fields.related',
+            u'django.db.models.fields.subclassing',
+            u'django.db.models.loading',
+            u'django.db.models.manager',
+            u'django.db.models.options',
+            u'django.db.models.query',
+            u'django.db.models.query_utils',
+            u'django.db.models.related',
+            u'django.db.models.signals',
+            u'django.db.models.sql',
+            u'django.db.models.sql.aggregates',
+            u'django.db.models.sql.constants',
+            u'django.db.models.sql.datastructures',
+            u'django.db.models.sql.expressions',
+            u'django.db.models.sql.query',
+            u'django.db.models.sql.subqueries',
+            u'django.db.models.sql.where',
+            u'django.db.transaction',
+            u'django.db.utils',
+            u'django.dispatch',
+            u'django.dispatch.dispatcher',
+            u'django.dispatch.saferef',
+            u'django.forms',
+            u'django.forms.fields',
+            u'django.forms.forms',
+            u'django.forms.formsets',
+            u'django.forms.models',
+            u'django.forms.util',
+            u'django.forms.widgets',
+            u'django.utils',
+            u'django.utils._os',
+            u'django.utils.crypto',
+            u'django.utils.datastructures',
+            u'django.utils.dateformat',
+            u'django.utils.dateparse',
+            u'django.utils.dates',
+            u'django.utils.datetime_safe',
+            u'django.utils.decorators',
+            u'django.utils.deprecation',
+            u'django.utils.encoding',
+            u'django.utils.formats',
+            u'django.utils.functional',
+            u'django.utils.html',
+            u'django.utils.html_parser',
+            u'django.utils.importlib',
+            u'django.utils.ipv6',
+            u'django.utils.itercompat',
+            u'django.utils.module_loading',
+            u'django.utils.numberformat',
+            u'django.utils.safestring',
+            u'django.utils.six',
+            u'django.utils.text',
+            u'django.utils.timezone',
+            u'django.utils.translation',
+            u'django.utils.tree',
+            u'django.utils.tzinfo',
+            u'pytz',
+            u'pytz.exceptions',
+            u'pytz.lazy',
+            u'pytz.tzfile',
+            u'pytz.tzinfo',
         ])

 if __name__ == '__main__':

@@ -10,6 +10,7 @@ import time
 import mock
 import unittest2

 import testlib
+from testlib import Popen__terminate

 import mitogen.parent
@@ -137,7 +138,7 @@ class StreamErrorTest(testlib.RouterMixin, testlib.TestCase):
     def test_via_eof(self):
         # Verify FD leakage does not keep failed process open.
-        local = self.router.fork()
+        local = self.router.local()
         e = self.assertRaises(mitogen.core.StreamError,
             lambda: self.router.local(
                 via=local,
@@ -159,7 +160,7 @@ class StreamErrorTest(testlib.RouterMixin, testlib.TestCase):
         self.assertTrue(e.args[0].startswith(prefix))

     def test_via_enoent(self):
-        local = self.router.fork()
+        local = self.router.local()
         e = self.assertRaises(mitogen.core.StreamError,
             lambda: self.router.local(
                 via=local,
@@ -264,12 +265,12 @@ class IterReadTest(testlib.TestCase):
         proc = self.make_proc()
         try:
             reader = self.func([proc.stdout.fileno()])
-            for i, chunk in enumerate(reader, 1):
-                self.assertEqual(i, int(chunk))
-                if i > 3:
+            for i, chunk in enumerate(reader):
+                self.assertEqual(1+i, int(chunk))
+                if i > 2:
                     break
         finally:
-            proc.terminate()
+            Popen__terminate(proc)
             proc.stdout.close()

     def test_deadline_exceeded_before_call(self):
@@ -284,7 +285,7 @@ class IterReadTest(testlib.TestCase):
         except mitogen.core.TimeoutError:
             self.assertEqual(len(got), 0)
         finally:
-            proc.terminate()
+            Popen__terminate(proc)
             proc.stdout.close()

     def test_deadline_exceeded_during_call(self):
@@ -306,7 +307,7 @@ class IterReadTest(testlib.TestCase):
             self.assertLess(1, len(got))
             self.assertLess(len(got), 20)
         finally:
-            proc.terminate()
+            Popen__terminate(proc)
             proc.stdout.close()
@@ -326,7 +327,7 @@ class WriteAllTest(testlib.TestCase):
         try:
             self.func(proc.stdin.fileno(), self.ten_ms_chunk)
         finally:
-            proc.terminate()
+            Popen__terminate(proc)
             proc.stdin.close()

     def test_deadline_exceeded_before_call(self):
@@ -336,7 +337,7 @@ class WriteAllTest(testlib.TestCase):
                 lambda: self.func(proc.stdin.fileno(), self.ten_ms_chunk, 0)
             ))
         finally:
-            proc.terminate()
+            Popen__terminate(proc)
             proc.stdin.close()

     def test_deadline_exceeded_during_call(self):
@@ -349,7 +350,7 @@ class WriteAllTest(testlib.TestCase):
                     deadline)
             ))
         finally:
-            proc.terminate()
+            Popen__terminate(proc)
             proc.stdin.close()
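`Popen.terminate()` only appeared in Python 2.6, which is why the hunks above route through a `testlib.Popen__terminate` helper. A sketch of what such a helper can look like (the fallback branch is the Py2.4/2.5 path and is POSIX-only; the helper name below is illustrative):

```python
import os
import signal
import subprocess
import sys

def popen_terminate(proc):
    # Use Popen.terminate() where it exists (Python >= 2.6); otherwise
    # signal the PID directly, which is what terminate() does on POSIX.
    terminate = getattr(proc, 'terminate', None)
    if terminate is not None:
        terminate()
    else:
        os.kill(proc.pid, signal.SIGTERM)

proc = subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(30)'])
popen_terminate(proc)
proc.wait()
```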
@@ -357,7 +358,7 @@ class DisconnectTest(testlib.RouterMixin, testlib.TestCase):
     def test_child_disconnected(self):
         # Easy mode: process notices its own directly connected child is
         # disconnected.
-        c1 = self.router.fork()
+        c1 = self.router.local()
         recv = c1.call_async(time.sleep, 9999)
         c1.shutdown(wait=True)
         e = self.assertRaises(mitogen.core.ChannelError,
@@ -367,8 +368,8 @@ class DisconnectTest(testlib.RouterMixin, testlib.TestCase):
     def test_indirect_child_disconnected(self):
         # Achievement unlocked: process notices an indirectly connected child
         # is disconnected.
-        c1 = self.router.fork()
-        c2 = self.router.fork(via=c1)
+        c1 = self.router.local()
+        c2 = self.router.local(via=c1)
         recv = c2.call_async(time.sleep, 9999)
         c2.shutdown(wait=True)
         e = self.assertRaises(mitogen.core.ChannelError,
@@ -378,8 +379,8 @@ class DisconnectTest(testlib.RouterMixin, testlib.TestCase):
     def test_indirect_child_intermediary_disconnected(self):
         # Battlefield promotion: process notices indirect child disconnected
         # due to an intermediary child disconnecting.
-        c1 = self.router.fork()
-        c2 = self.router.fork(via=c1)
+        c1 = self.router.local()
+        c2 = self.router.local(via=c1)
         recv = c2.call_async(time.sleep, 9999)
         c1.shutdown(wait=True)
         e = self.assertRaises(mitogen.core.ChannelError,
@@ -389,8 +390,8 @@ class DisconnectTest(testlib.RouterMixin, testlib.TestCase):
     def test_near_sibling_disconnected(self):
         # Hard mode: child notices sibling connected to same parent has
         # disconnected.
-        c1 = self.router.fork()
-        c2 = self.router.fork()
+        c1 = self.router.local()
+        c2 = self.router.local()

         # Let c1 call functions in c2.
         self.router.stream_by_id(c1.context_id).auth_id = mitogen.context_id
@@ -411,11 +412,11 @@ class DisconnectTest(testlib.RouterMixin, testlib.TestCase):
     def test_far_sibling_disconnected(self):
         # God mode: child of child notices child of child of parent has
         # disconnected.
-        c1 = self.router.fork()
-        c11 = self.router.fork(via=c1)
-        c2 = self.router.fork()
-        c22 = self.router.fork(via=c2)
+        c1 = self.router.local()
+        c11 = self.router.local(via=c1)
+        c2 = self.router.local()
+        c22 = self.router.local(via=c2)

         # Let c1 call functions in c2.
         self.router.stream_by_id(c1.context_id).auth_id = mitogen.context_id

@@ -13,6 +13,12 @@ import mitogen.parent

 import testlib

+try:
+    next
+except NameError:
+    # Python 2.4
+    from mitogen.core import next
+
 class SockMixin(object):
     def tearDown(self):
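The guard above falls back to a `mitogen.core` polyfill because the `next()` builtin only arrived in Python 2.6. An equivalent definition, sketched with the same semantics as the real builtin (including the optional default):

```python
_SENTINEL = object()

def next_compat(it, default=_SENTINEL):
    # Equivalent of the next() builtin added in Python 2.6: advance the
    # iterator, returning `default` (when given) on exhaustion.
    try:
        get = getattr(it, '__next__', None)
        if get is None:
            get = it.next  # Python 2 iterator protocol
        return get()
    except StopIteration:
        if default is _SENTINEL:
            raise
        return default
```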
@@ -345,6 +351,22 @@ class FileClosedMixin(PollerMixin, SockMixin):
         pass

+class TtyHangupMixin(PollerMixin):
+    def test_tty_hangup_detected(self):
+        # bug in initial select.poll() implementation failed to detect POLLHUP.
+        master_fd, slave_fd = mitogen.parent.openpty()
+        try:
+            self.p.start_receive(master_fd)
+            self.assertEquals([], list(self.p.poll(0)))
+            os.close(slave_fd)
+            slave_fd = None
+            self.assertEquals([master_fd], list(self.p.poll(0)))
+        finally:
+            if slave_fd is not None:
+                os.close(slave_fd)
+            os.close(master_fd)
+
 class DistinctDataMixin(PollerMixin, SockMixin):
     # Verify different data is yielded for the same FD according to the event
     # being raised.
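`TtyHangupMixin` exercises hangup detection through the poller abstraction; the raw kernel behaviour it relies on can be seen with `select.poll()` directly (a sketch assuming a POSIX platform that reports `POLLHUP` on a pty whose slave side closed, e.g. Linux):

```python
import os
import pty
import select

# Open a pseudo-terminal pair and watch the master for readability.
master_fd, slave_fd = pty.openpty()
poller = select.poll()
poller.register(master_fd, select.POLLIN)
assert poller.poll(0) == []      # no data and no hangup yet

os.close(slave_fd)               # closing the slave hangs up the tty
events = poller.poll(0)          # POLLHUP is reported even though only
os.close(master_fd)              # POLLIN was requested
```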
@ -368,29 +390,39 @@ class AllMixin(ReceiveStateMixin,
FileClosedMixin, FileClosedMixin,
DistinctDataMixin, DistinctDataMixin,
PollMixin, PollMixin,
TtyHangupMixin,
CloseMixin): CloseMixin):
""" """
Helper to avoid cutpasting mixin names below. Helper to avoid cutpasting mixin names below.
""" """
@unittest2.skipIf(condition=not hasattr(select, 'select'),
reason='select.select() not supported')
class SelectTest(AllMixin, testlib.TestCase): class SelectTest(AllMixin, testlib.TestCase):
klass = mitogen.core.Poller klass = mitogen.core.Poller
SelectTest = unittest2.skipIf(
condition=not hasattr(select, 'select'),
reason='select.select() not supported'
)(SelectTest)
@unittest2.skipIf(condition=not hasattr(select, 'kqueue'),
reason='select.kqueue() not supported')
class KqueueTest(AllMixin, testlib.TestCase): class KqueueTest(AllMixin, testlib.TestCase):
klass = mitogen.parent.KqueuePoller klass = mitogen.parent.KqueuePoller
KqueueTest = unittest2.skipIf(
condition=not hasattr(select, 'kqueue'),
reason='select.kqueue() not supported'
)(KqueueTest)
@unittest2.skipIf(condition=not hasattr(select, 'epoll'),
reason='select.epoll() not supported')
class EpollTest(AllMixin, testlib.TestCase): class EpollTest(AllMixin, testlib.TestCase):
klass = mitogen.parent.EpollPoller klass = mitogen.parent.EpollPoller
EpollTest = unittest2.skipIf(
condition=not hasattr(select, 'epoll'),
reason='select.epoll() not supported'
)(EpollTest)
if __name__ == '__main__': if __name__ == '__main__':
unittest2.main() unittest2.main()
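The rewrite above converts `@unittest2.skipIf(...)` class decorators into explicit calls because class decorator syntax only arrived in Python 2.6; the decorator is just a callable that returns the (possibly skip-wrapped) class, so applying it by hand is equivalent. A minimal sketch of the pattern using the stdlib `unittest` (the `MathTest` class here is illustrative, not from the diff):

```python
import unittest

class MathTest(unittest.TestCase):
    def test_add(self):
        self.assertEqual(2, 1 + 1)

# Equivalent of writing "@unittest.skipIf(False, 'never skipped')" above the
# class, expressed as a plain call -- the only form available before class
# decorator syntax was added in Python 2.6.
MathTest = unittest.skipIf(False, 'never skipped')(MathTest)

# With a False condition the decorator returns the class unchanged.
result = MathTest('test_add').run()
```
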

@@ -0,0 +1,103 @@
+import testlib
+import unittest2
+
+import mitogen.core
+from mitogen.core import b
+
+
+class BytesPartitionTest(testlib.TestCase):
+    func = staticmethod(mitogen.core.bytes_partition)
+
+    def test_no_sep(self):
+        left, sep, right = self.func(b('dave'), b('x'))
+        self.assertTrue(isinstance(left, mitogen.core.BytesType))
+        self.assertTrue(isinstance(sep, mitogen.core.BytesType))
+        self.assertTrue(isinstance(right, mitogen.core.BytesType))
+        self.assertEquals(left, b('dave'))
+        self.assertEquals(sep, b(''))
+        self.assertEquals(right, b(''))
+
+    def test_one_sep(self):
+        left, sep, right = self.func(b('davexdave'), b('x'))
+        self.assertTrue(isinstance(left, mitogen.core.BytesType))
+        self.assertTrue(isinstance(sep, mitogen.core.BytesType))
+        self.assertTrue(isinstance(right, mitogen.core.BytesType))
+        self.assertEquals(left, b('dave'))
+        self.assertEquals(sep, b('x'))
+        self.assertEquals(right, b('dave'))
+
+    def test_two_seps(self):
+        left, sep, right = self.func(b('davexdavexdave'), b('x'))
+        self.assertTrue(isinstance(left, mitogen.core.BytesType))
+        self.assertTrue(isinstance(sep, mitogen.core.BytesType))
+        self.assertTrue(isinstance(right, mitogen.core.BytesType))
+        self.assertEquals(left, b('dave'))
+        self.assertEquals(sep, b('x'))
+        self.assertEquals(right, b('davexdave'))
+
+
+class StrPartitionTest(testlib.TestCase):
+    func = staticmethod(mitogen.core.str_partition)
+
+    def test_no_sep(self):
+        left, sep, right = self.func(u'dave', u'x')
+        self.assertTrue(isinstance(left, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(sep, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(right, mitogen.core.UnicodeType))
+        self.assertEquals(left, u'dave')
+        self.assertEquals(sep, u'')
+        self.assertEquals(right, u'')
+
+    def test_one_sep(self):
+        left, sep, right = self.func(u'davexdave', u'x')
+        self.assertTrue(isinstance(left, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(sep, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(right, mitogen.core.UnicodeType))
+        self.assertEquals(left, u'dave')
+        self.assertEquals(sep, u'x')
+        self.assertEquals(right, u'dave')
+
+    def test_two_seps(self):
+        left, sep, right = self.func(u'davexdavexdave', u'x')
+        self.assertTrue(isinstance(left, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(sep, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(right, mitogen.core.UnicodeType))
+        self.assertEquals(left, u'dave')
+        self.assertEquals(sep, u'x')
+        self.assertEquals(right, u'davexdave')
+
+
+class StrRpartitionTest(testlib.TestCase):
+    func = staticmethod(mitogen.core.str_rpartition)
+
+    def test_no_sep(self):
+        left, sep, right = self.func(u'dave', u'x')
+        self.assertTrue(isinstance(left, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(sep, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(right, mitogen.core.UnicodeType))
+        self.assertEquals(left, u'')
+        self.assertEquals(sep, u'')
+        self.assertEquals(right, u'dave')
+
+    def test_one_sep(self):
+        left, sep, right = self.func(u'davexdave', u'x')
+        self.assertTrue(isinstance(left, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(sep, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(right, mitogen.core.UnicodeType))
+        self.assertEquals(left, u'dave')
+        self.assertEquals(sep, u'x')
+        self.assertEquals(right, u'dave')
+
+    def test_two_seps(self):
+        left, sep, right = self.func(u'davexdavexdave', u'x')
+        self.assertTrue(isinstance(left, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(sep, mitogen.core.UnicodeType))
+        self.assertTrue(isinstance(right, mitogen.core.UnicodeType))
+        self.assertEquals(left, u'davexdave')
+        self.assertEquals(sep, u'x')
+        self.assertEquals(right, u'dave')
+
+
+if __name__ == '__main__':
+    unittest2.main()
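The helpers exercised by this new test file exist because `str.partition()` and `str.rpartition()` only appeared in Python 2.5, so a Py2.4-compatible codebase needs free functions with the same contract. A sketch of how such polyfills can be written, matching the behaviour the tests above assert; this is an illustration, not `mitogen.core`'s actual implementation:

```python
def str_partition(s, sep):
    # Behave like s.partition(sep): split on the *first* occurrence of sep.
    idx = s.find(sep)
    if idx == -1:
        # No separator: everything goes left; s[0:0] keeps the empties the
        # same type (str vs bytes) as the input.
        return s, s[0:0], s[0:0]
    return s[:idx], sep, s[idx + len(sep):]


def str_rpartition(s, sep):
    # Behave like s.rpartition(sep): split on the *last* occurrence of sep.
    idx = s.rfind(sep)
    if idx == -1:
        # No separator: everything goes right.
        return s[0:0], s[0:0], s
    return s[:idx], sep, s[idx + len(sep):]
```

The same functions work unchanged on bytes, which is why the tests cover both `b('...')` and `u'...'` inputs.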

@@ -32,7 +32,7 @@ class ConstructorTest(testlib.RouterMixin, testlib.TestCase):
 class IterationTest(testlib.RouterMixin, testlib.TestCase):
     def test_dead_stops_iteration(self):
         recv = mitogen.core.Receiver(self.router)
-        fork = self.router.fork()
+        fork = self.router.local()
         ret = fork.call_async(yield_stuff_then_die, recv.to_sender())
         self.assertEquals(list(range(5)), list(m.unpickle() for m in recv))
         self.assertEquals(10, ret.get().unpickle())

@@ -128,6 +128,10 @@ class BrokenModulesTest(testlib.TestCase):
         self.assertEquals(('non_existent_module', None, None, None, ()),
                           msg.unpickle())

+    @unittest2.skipIf(
+        condition=sys.version_info < (2, 6),
+        reason='Ancient Python lacked "from . import foo"',
+    )
     def test_ansible_six_messed_up_path(self):
         # The copy of six.py shipped with Ansible appears in a package whose
         # __path__ subsequently ends up empty, which prevents pkgutil from
@@ -166,12 +170,12 @@ class ForwardTest(testlib.RouterMixin, testlib.TestCase):
     def test_stats(self):
         # Forwarding stats broken because forwarding is broken. See #469.
         c1 = self.router.local()
-        c2 = self.router.fork(via=c1)
+        c2 = self.router.local(via=c1)
         self.assertEquals(256, c2.call(plain_old_module.pow, 2, 8))
-        self.assertEquals(3, self.router.responder.get_module_count)
-        self.assertEquals(3, self.router.responder.good_load_module_count)
-        self.assertLess(23000, self.router.responder.good_load_module_size)
+        self.assertEquals(2, self.router.responder.get_module_count)
+        self.assertEquals(2, self.router.responder.good_load_module_count)
+        self.assertLess(20000, self.router.responder.good_load_module_size)


 class BlacklistTest(testlib.TestCase):

@@ -39,11 +39,11 @@ class SourceVerifyTest(testlib.RouterMixin, testlib.TestCase):
         super(SourceVerifyTest, self).setUp()
         # Create some children, ping them, and store what their messages look
         # like so we can mess with them later.
-        self.child1 = self.router.fork()
+        self.child1 = self.router.local()
         self.child1_msg = self.child1.call_async(ping).get()
         self.child1_stream = self.router._stream_by_id[self.child1.context_id]

-        self.child2 = self.router.fork()
+        self.child2 = self.router.local()
         self.child2_msg = self.child2.call_async(ping).get()
         self.child2_stream = self.router._stream_by_id[self.child2.context_id]
@@ -68,7 +68,7 @@ class SourceVerifyTest(testlib.RouterMixin, testlib.TestCase):
         self.assertTrue(recv.empty())

         # Ensure error was logged.
-        expect = 'bad auth_id: got %d via' % (self.child2_msg.auth_id,)
+        expect = 'bad auth_id: got %r via' % (self.child2_msg.auth_id,)
         self.assertTrue(expect in log.stop())

     def test_bad_src_id(self):
@@ -245,7 +245,7 @@ class MessageSizeTest(testlib.BrokerMixin, testlib.TestCase):
         # Try function call. Receiver should be woken by a dead message sent by
         # router due to message size exceeded.
-        child = router.fork()
+        child = router.local()
         e = self.assertRaises(mitogen.core.ChannelError,
             lambda: child.call(zlib.crc32, ' '*8192))
         self.assertEquals(e.args[0], expect)
@@ -253,23 +253,22 @@ class MessageSizeTest(testlib.BrokerMixin, testlib.TestCase):
         self.assertTrue(expect in logs.stop())

     def test_remote_configured(self):
-        router = self.klass(broker=self.broker, max_message_size=4096)
-        remote = router.fork()
+        router = self.klass(broker=self.broker, max_message_size=64*1024)
+        remote = router.local()
         size = remote.call(return_router_max_message_size)
-        self.assertEquals(size, 4096)
+        self.assertEquals(size, 64*1024)

     def test_remote_exceeded(self):
         # Ensure new contexts receive a router with the same value.
-        router = self.klass(broker=self.broker, max_message_size=4096)
+        router = self.klass(broker=self.broker, max_message_size=64*1024)
         recv = mitogen.core.Receiver(router)

         logs = testlib.LogCapturer()
         logs.start()
-        remote = router.fork()
-        remote.call(send_n_sized_reply, recv.to_sender(), 8192)
-        expect = 'message too large (max 4096 bytes)'
+        remote = router.local()
+        remote.call(send_n_sized_reply, recv.to_sender(), 128*1024)
+        expect = 'message too large (max %d bytes)' % (64*1024,)
         self.assertTrue(expect in logs.stop())
@@ -277,7 +276,7 @@ class NoRouteTest(testlib.RouterMixin, testlib.TestCase):
     def test_invalid_handle_returns_dead(self):
         # Verify sending a message to an invalid handle yields a dead message
         # from the target context.
-        l1 = self.router.fork()
+        l1 = self.router.local()
         recv = l1.send_async(mitogen.core.Message(handle=999))
         msg = recv.get(throw_dead=False)
         self.assertEquals(msg.is_dead, True)
@@ -314,7 +313,7 @@ class NoRouteTest(testlib.RouterMixin, testlib.TestCase):
         )))

     def test_previously_alive_context_returns_dead(self):
-        l1 = self.router.fork()
+        l1 = self.router.local()
         l1.shutdown(wait=True)
         recv = mitogen.core.Receiver(self.router)
         msg = mitogen.core.Message(
@@ -343,8 +342,8 @@ class NoRouteTest(testlib.RouterMixin, testlib.TestCase):
 class UnidirectionalTest(testlib.RouterMixin, testlib.TestCase):
     def test_siblings_cant_talk(self):
         self.router.unidirectional = True
-        l1 = self.router.fork()
-        l2 = self.router.fork()
+        l1 = self.router.local()
+        l2 = self.router.local()
         logs = testlib.LogCapturer()
         logs.start()
         e = self.assertRaises(mitogen.core.CallError,
@@ -361,12 +360,12 @@ class UnidirectionalTest(testlib.RouterMixin, testlib.TestCase):
         self.router.unidirectional = True
         # One stream has auth_id stamped to that of the master, so it should be
         # treated like a parent.
-        l1 = self.router.fork()
+        l1 = self.router.local()
         l1s = self.router.stream_by_id(l1.context_id)
         l1s.auth_id = mitogen.context_id
         l1s.is_privileged = True

-        l2 = self.router.fork()
+        l2 = self.router.local()
         e = self.assertRaises(mitogen.core.CallError,
             lambda: l2.call(ping_context, l1))
@@ -377,7 +376,7 @@ class UnidirectionalTest(testlib.RouterMixin, testlib.TestCase):
 class EgressIdsTest(testlib.RouterMixin, testlib.TestCase):
     def test_egress_ids_populated(self):
         # Ensure Stream.egress_ids is populated on message reception.
-        c1 = self.router.fork()
+        c1 = self.router.local()
         stream = self.router.stream_by_id(c1.context_id)
         self.assertEquals(set(), stream.egress_ids)

@@ -38,7 +38,7 @@ class ContextTest(testlib.RouterMixin, testlib.TestCase):
     # together (e.g. Ansible).

     def test_mitogen_roundtrip(self):
-        c = self.router.fork()
+        c = self.router.local()
         r = mitogen.core.Receiver(self.router)
         r.to_sender().send(c)
         c2 = r.get().unpickle()
@@ -47,7 +47,7 @@ class ContextTest(testlib.RouterMixin, testlib.TestCase):
         self.assertEquals(c.name, c2.name)

     def test_vanilla_roundtrip(self):
-        c = self.router.fork()
+        c = self.router.local()
         c2 = pickle.loads(pickle.dumps(c))
         self.assertEquals(None, c2.router)
         self.assertEquals(c.context_id, c2.context_id)
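The vanilla-roundtrip test above works because a pickled Context drops its unpicklable router reference but keeps its identifying fields, so `c2.router` comes back as `None` while `context_id` and `name` survive. A toy sketch of the same `__reduce__` pattern; `Ctx` here is a hypothetical stand-in, not mitogen's actual `Context` class:

```python
import pickle

class Ctx(object):
    # Hypothetical stand-in for a context object: __reduce__ omits the
    # unpicklable router and keeps only the identifying fields, so a plain
    # pickle.dumps()/loads() roundtrip yields a detached copy.
    def __init__(self, router, context_id, name):
        self.router = router
        self.context_id = context_id
        self.name = name

    def __reduce__(self):
        # Re-construct via Ctx(None, context_id, name) on unpickle.
        return Ctx, (None, self.context_id, self.name)

c = Ctx(object(), 42, 'child')
c2 = pickle.loads(pickle.dumps(c))
```
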

@@ -38,21 +38,21 @@ def call_service_in(context, service_name, method_name):

 class ActivationTest(testlib.RouterMixin, testlib.TestCase):
     def test_parent_can_activate(self):
-        l1 = self.router.fork()
+        l1 = self.router.local()
         counter, id_ = l1.call_service(MyService, 'get_id')
         self.assertEquals(1, counter)
         self.assertTrue(isinstance(id_, int))

     def test_sibling_cannot_activate_framework(self):
-        l1 = self.router.fork()
-        l2 = self.router.fork()
+        l1 = self.router.local()
+        l2 = self.router.local()
         exc = self.assertRaises(mitogen.core.CallError,
             lambda: l2.call(call_service_in, l1, MyService2.name(), 'get_id'))
         self.assertTrue(mitogen.core.Router.refused_msg in exc.args[0])

     def test_sibling_cannot_activate_service(self):
-        l1 = self.router.fork()
-        l2 = self.router.fork()
+        l1 = self.router.local()
+        l2 = self.router.local()
         l1.call_service(MyService, 'get_id')  # force framework activation
         capture = testlib.LogCapturer()
         capture.start()
@@ -65,7 +65,7 @@ class ActivationTest(testlib.RouterMixin, testlib.TestCase):
         self.assertTrue(msg in exc.args[0])

     def test_activates_only_once(self):
-        l1 = self.router.fork()
+        l1 = self.router.local()
         counter, id_ = l1.call_service(MyService, 'get_id')
         counter2, id_2 = l1.call_service(MyService, 'get_id')
         self.assertEquals(1, counter)
@@ -75,16 +75,16 @@ class ActivationTest(testlib.RouterMixin, testlib.TestCase):

 class PermissionTest(testlib.RouterMixin, testlib.TestCase):
     def test_sibling_unprivileged_ok(self):
-        l1 = self.router.fork()
+        l1 = self.router.local()
         l1.call_service(MyService, 'get_id')
-        l2 = self.router.fork()
+        l2 = self.router.local()
         self.assertEquals('unprivileged!',
             l2.call(call_service_in, l1, MyService.name(), 'unprivileged_op'))

     def test_sibling_privileged_bad(self):
-        l1 = self.router.fork()
+        l1 = self.router.local()
         l1.call_service(MyService, 'get_id')
-        l2 = self.router.fork()
+        l2 = self.router.local()
         capture = testlib.LogCapturer()
         capture.start()
         try:

@@ -5,7 +5,7 @@ import testlib
 import mitogen.core


-class Thing():
+class Thing:
     pass

@@ -3,6 +3,7 @@ import logging
 import os
 import random
 import re
+import signal
 import socket
 import subprocess
 import sys
@@ -32,6 +33,11 @@ try:
 except ImportError:
     from io import StringIO

+try:
+    BaseException
+except NameError:
+    BaseException = Exception
+

 LOG = logging.getLogger(__name__)
 DATA_DIR = os.path.join(os.path.dirname(__file__), 'data')
@@ -72,9 +78,17 @@ def subprocess__check_output(*popenargs, **kwargs):
         raise subprocess.CalledProcessError(retcode, cmd)
     return output

+
+def Popen__terminate(proc):
+    os.kill(proc.pid, signal.SIGTERM)
+

 if hasattr(subprocess, 'check_output'):
     subprocess__check_output = subprocess.check_output

+if hasattr(subprocess.Popen, 'terminate'):
+    Popen__terminate = subprocess.Popen.terminate
+

 def wait_for_port(
     host,
@@ -182,45 +196,61 @@ def log_fd_calls():
     l = threading.Lock()
     real_pipe = os.pipe
     def pipe():
-        with l:
+        l.acquire()
+        try:
             rv = real_pipe()
             if mypid == os.getpid():
                 sys.stdout.write('\n%s\n' % (rv,))
                 traceback.print_stack(limit=3)
                 sys.stdout.write('\n')
             return rv
+        finally:
+            l.release()
     os.pipe = pipe

     real_socketpair = socket.socketpair
     def socketpair(*args):
-        with l:
+        l.acquire()
+        try:
             rv = real_socketpair(*args)
             if mypid == os.getpid():
                 sys.stdout.write('\n%s -> %s\n' % (args, rv))
                 traceback.print_stack(limit=3)
                 sys.stdout.write('\n')
             return rv
+        finally:
+            l.release()
     socket.socketpair = socketpair

     real_dup2 = os.dup2
     def dup2(*args):
-        with l:
+        l.acquire()
+        try:
             real_dup2(*args)
             if mypid == os.getpid():
                 sys.stdout.write('\n%s\n' % (args,))
                 traceback.print_stack(limit=3)
                 sys.stdout.write('\n')
+        finally:
+            l.release()
     os.dup2 = dup2

     real_dup = os.dup
     def dup(*args):
-        with l:
+        l.acquire()
+        try:
             rv = real_dup(*args)
             if mypid == os.getpid():
                 sys.stdout.write('\n%s -> %s\n' % (args, rv))
                 traceback.print_stack(limit=3)
                 sys.stdout.write('\n')
             return rv
+        finally:
+            l.release()
     os.dup = dup
@@ -285,9 +315,11 @@ class TestCase(unittest2.TestCase):
     def _teardown_check_threads(self):
         counts = {}
         for thread in threading.enumerate():
-            assert thread.name in self.ALLOWED_THREADS, \
-                'Found thread %r still running after tests.' % (thread.name,)
-            counts[thread.name] = counts.get(thread.name, 0) + 1
+            name = thread.getName()
+            # Python 2.4: enumerate() may return stopped threads.
+            assert (not thread.isAlive()) or name in self.ALLOWED_THREADS, \
+                'Found thread %r still running after tests.' % (name,)
+            counts[name] = counts.get(name, 0) + 1

         for name in counts:
             assert counts[name] == 1, \
@@ -331,16 +363,18 @@ def get_docker_host():

 class DockerizedSshDaemon(object):
-    distro, _, _py3 = (
-        os.environ.get('MITOGEN_TEST_DISTRO', 'debian')
-        .partition('-')
-    )
+    mitogen_test_distro = os.environ.get('MITOGEN_TEST_DISTRO', 'debian')
+    if '-' in mitogen_test_distro:
+        distro, _py3 = mitogen_test_distro.split('-')
+    else:
+        distro = mitogen_test_distro
+        _py3 = None

-    python_path = (
-        '/usr/bin/python3'
-        if _py3 == 'py3'
-        else '/usr/bin/python'
-    )
+    if _py3 == 'py3':
+        python_path = '/usr/bin/python3'
+    else:
+        python_path = '/usr/bin/python'

     image = 'mitogen/%s-test' % (distro,)

     # 22/tcp -> 0.0.0.0:32771
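The `Popen__terminate` change in the testlib hunk above is the standard feature-detection polyfill: `subprocess.Popen.terminate()` only exists from Python 2.6, so older interpreters fall back to signalling the child by hand, and modern ones rebind the name to the real method. A runnable sketch of the same pattern (the `popen_terminate` name is illustrative):

```python
import os
import signal
import subprocess
import sys

def popen_terminate(proc):
    # Fallback for Python < 2.6, where Popen had no terminate() method:
    # deliver SIGTERM to the child process by hand (POSIX only).
    os.kill(proc.pid, signal.SIGTERM)

if hasattr(subprocess.Popen, 'terminate'):
    # Modern interpreters: rebind the polyfill name to the real method.
    popen_terminate = subprocess.Popen.terminate

# Spawn a long-sleeping child, terminate it, and reap it.
proc = subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(30)'])
popen_terminate(proc)
proc.wait()
```

Because `subprocess.Popen.terminate` is a plain function attribute, `popen_terminate(proc)` has the same call shape in both branches.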

@@ -8,11 +8,11 @@ import mitogen.core
 import mitogen.master

 import testlib
+import simple_pkg.ping


-def roundtrip(*args):
-    return args
+# TODO: this is a joke. 2/3 interop is one of the hardest bits to get right.
+# There should be 100 tests in this file.


 class TwoThreeCompatTest(testlib.RouterMixin, testlib.TestCase):
     if mitogen.core.PY3:
@@ -21,10 +21,10 @@ class TwoThreeCompatTest(testlib.RouterMixin, testlib.TestCase):
         python_path = 'python3'

     def test_succeeds(self):
-        spare = self.router.fork()
+        spare = self.router.local()
         target = self.router.local(python_path=self.python_path)
-        spare2, = target.call(roundtrip, spare)
+        spare2, = target.call(simple_pkg.ping.ping, spare)
         self.assertEquals(spare.context_id, spare2.context_id)
         self.assertEquals(spare.name, spare2.name)

@@ -1,4 +1,6 @@
+import sys
+
 try:
     from io import StringIO
     from io import BytesIO
@@ -88,11 +90,18 @@ class KwargsTest(testlib.TestCase):
         self.assertTrue(type(dct) is dict)
         self.assertEquals({}, dct)

-    @unittest2.skipIf(condition=lambda: not mitogen.core.PY3,
+    @unittest2.skipIf(condition=(sys.version_info >= (2, 6)),
+                      reason='py<2.6 only')
+    def test_bytes_conversion(self):
+        kw = self.klass({u'key': 123})
+        self.assertEquals({'key': 123}, kw)
+        self.assertEquals("Kwargs({'key': 123})", repr(kw))
+
+    @unittest2.skipIf(condition=not mitogen.core.PY3,
                       reason='py3 only')
     def test_unicode_conversion(self):
         kw = self.klass({mitogen.core.b('key'): 123})
-        self.assertEquals({mitogen.core.b('key'): 123}, kw)
+        self.assertEquals({u'key': 123}, kw)
         self.assertEquals("Kwargs({'key': 123})", repr(kw))

         klass, (dct,) = kw.__reduce__()
         self.assertTrue(klass is self.klass)

@@ -1,13 +1,13 @@
 import os
 import socket
+import subprocess
 import sys
 import time

 import unittest2

 import mitogen
-import mitogen.fork
 import mitogen.master
 import mitogen.service
 import mitogen.unix
@@ -21,6 +21,12 @@ class MyService(mitogen.service.Service):
         # used to wake up main thread once client has made its request
         self.latch = latch

+    @classmethod
+    def name(cls):
+        # Because this is loaded from both __main__ and whatever unit2 does,
+        # specify a fixed name.
+        return 'unix_test.MyService'
+
     @mitogen.service.expose(policy=mitogen.service.AllowParents())
     def ping(self, msg):
         self.latch.put(None)
@@ -100,12 +106,13 @@ class ClientTest(testlib.TestCase):
         router.broker.join()
         os.unlink(path)

-    def _test_simple_server(self, path):
+    @classmethod
+    def _test_simple_server(cls, path):
         router = mitogen.master.Router()
         latch = mitogen.core.Latch()
         try:
             try:
-                listener = self.klass(path=path, router=router)
+                listener = cls.klass(path=path, router=router)
                 pool = mitogen.service.Pool(router=router, services=[
                     MyService(latch=latch, router=router),
                 ])
@@ -122,12 +129,21 @@ class ClientTest(testlib.TestCase):
     def test_simple(self):
         path = mitogen.unix.make_socket_path()
-        if os.fork():
+        proc = subprocess.Popen(
+            [sys.executable, __file__, 'ClientTest_server', path]
+        )
+        try:
             self._test_simple_client(path)
-        else:
-            mitogen.fork.on_fork()
-            self._test_simple_server(path)
+        finally:
+            # TODO :)
+            mitogen.context_id = 0
+            mitogen.parent_id = None
+            mitogen.parent_ids = []
+            proc.wait()


 if __name__ == '__main__':
-    unittest2.main()
+    if len(sys.argv) == 3 and sys.argv[1] == 'ClientTest_server':
+        ClientTest._test_simple_server(path=sys.argv[2])
+    else:
+        unittest2.main()
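The `test_simple` rewrite above replaces `os.fork()` with re-executing the interpreter on the same script and dispatching on a sentinel `argv` entry, so the server child starts with a clean runtime state instead of inheriting the parent's threads and locks. A minimal self-contained sketch of the re-exec pattern; the `-c` payload and the `'child'` sentinel are illustrative:

```python
import subprocess
import sys

# Parent re-executes the interpreter instead of calling os.fork(); the child
# recognises the sentinel argument and runs the "server" role, while the
# parent plays client and then reaps the child.
child_src = "import sys; sys.stdout.write('pong:' + sys.argv[1])"
proc = subprocess.Popen(
    [sys.executable, '-c', child_src, 'child'],
    stdout=subprocess.PIPE,
)
out = proc.communicate()[0]  # also waits for the child to exit
```

In the real test the script passes its own `__file__` plus a `'ClientTest_server'` sentinel, and the `__main__` guard routes that invocation to the server method rather than to `unittest2.main()`.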

@@ -5,6 +5,7 @@ import unittest2
 import mitogen.core
 import mitogen.master
 import mitogen.utils
+from mitogen.core import b

 import testlib
@@ -86,7 +87,7 @@ class CastTest(testlib.TestCase):
         self.assertEqual(type(mitogen.utils.cast(Unicode())), mitogen.core.UnicodeType)

     def test_bytes(self):
-        self.assertEqual(type(mitogen.utils.cast(b'')), mitogen.core.BytesType)
+        self.assertEqual(type(mitogen.utils.cast(b(''))), mitogen.core.BytesType)
         self.assertEqual(type(mitogen.utils.cast(Bytes())), mitogen.core.BytesType)

     def test_unknown(self):
