Commit 88a0e89f authored by Jeroen Demeyer

Merge remote-tracking branch 'origin/master'

parents e310ccb4 aac55cf6
@@ -15,27 +15,105 @@ Features added

 * Tracing is supported in ``nogil`` functions/sections and module init code.
-* Adding/subtracting constant Python floats and small integers is faster.
+* PEP 448 (Additional Unpacking Generalizations) was implemented.
+* When generators are used in a Cython module and the module imports the
+  modules "inspect" and/or "asyncio", Cython enables interoperability by
+  patching these modules to recognise Cython's internal generator type.
+  This can be disabled by C compiling the module with
+  "-D CYTHON_PATCH_ASYNCIO=0" or "-D CYTHON_PATCH_INSPECT=0".
+* When generators are used in a Cython module, the new ``Generator`` ABC
+  will be patched into the ``collections`` or ``collections.abc``
+  stdlib module if it is not there yet. It allows type checks for
+  ``isinstance(obj, Generator)`` which includes both Python generators
+  and Cython generators. This can be disabled by C compiling the module
+  with "-D CYTHON_PATCH_ABC=0". See https://bugs.python.org/issue24018
+* Adding/subtracting/dividing/modulus and equality comparisons with
+  constant Python floats and small integers are faster.
+* Binary and/or/xor/rshift operations with small constant Python integers
+  are faster.
+* Keyword argument dicts are no longer copied on function entry when they
+  are not being used or only passed through to other function calls (e.g.
+  in wrapper functions).
+* The ``PyTypeObject`` declaration in ``cpython.object`` was extended.
+* ``wraparound()`` and ``boundscheck()`` are available as no-ops in pure
+  Python mode.
+* Const iterators were added to the provided C++ STL declarations.
+* ``NULL`` is allowed as default argument when embedding signatures.
+  This fixes ticket 843.
+* When compiling with ``--embed``, the internal module name is changed to
+  ``__main__`` to allow arbitrary program names, including those that would
+  be invalid for modules. Note that this prevents reuse of the generated
+  C code as an importable module.
Bugs fixed
----------

+* Calling "yield from" from Python on a Cython generator that returned a value
+  triggered a crash in CPython. This is now being worked around.
+  See https://bugs.python.org/issue23996
 * Language level 3 did not enable true division (a.k.a. float division) for
   integer operands.
+* Relative cimports could accidentally fall back to trying an absolute cimport
+  on failure.
+* The result of calling a C struct constructor no longer requires an intermediate
+  assignment when coercing to a Python dict.
+* C++ exception declarations with mapping functions could fail to compile when
+  pre-declared in .pxd files.
+
+
+0.22.1 (2015-05-??)
+===================
+
+Bugs fixed
+----------
+
+* Crash when returning values on generator termination.
+* In some cases, exceptions raised during internal isinstance() checks were
+  not propagated.
 * Runtime reported file paths of source files (e.g. for profiling and tracing)
   are now relative to the build root directory instead of the main source file.
 * Tracing exception handling code could enter the trace function with an active
   exception set.
+* The internal generator function type was not shared across modules.
 * Comparisons of (inferred) ctuples failed to compile.
+* Closures inside of cdef functions returning ``void`` failed to compile.
+* Using ``const`` C++ references in intermediate parts of longer expressions
+  could fail to compile.
 * C++ exception declarations with mapping functions could fail to compile when
   pre-declared in .pxd files.
+* C++ compilation could fail with an ambiguity error in recent MacOS-X Xcode
+  versions.
 * C compilation could fail in pypy3.
+* Fixed a memory leak in the compiler when compiling multiple modules.
+* When compiling multiple modules, external library dependencies could leak
+  into later compiler runs. Fix by Jeroen Demeyer. This fixes ticket 845.

 0.22 (2015-02-11)
 =================
...
@@ -45,7 +45,7 @@ except ImportError:
     from distutils.extension import Extension

 from .. import Utils
-from ..Utils import cached_function, cached_method, path_exists, find_root_package_dir
+from ..Utils import cached_function, cached_method, path_exists, find_root_package_dir, is_package_dir
 from ..Compiler.Main import Context, CompilationOptions, default_options

 join_path = cached_function(os.path.join)
@@ -216,12 +216,13 @@ class DistutilsInfo(object):
                 self.values[key] = value
             elif type is transitive_list:
                 if key in self.values:
-                    all = self.values[key]
+                    # Change a *copy* of the list (Trac #845)
+                    all = self.values[key][:]
                     for v in value:
                         if v not in all:
                             all.append(v)
-                else:
-                    self.values[key] = value
+                    value = all
+                self.values[key] = value
         return self

     def subs(self, aliases):
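The `[:]` copy in the hunk above is the heart of the Trac #845 fix: without it, the list stored in `self.values[key]` may be the very list object shared with another configuration, so appending to it leaks entries into later compiler runs. A standalone sketch of the failure mode and the fix (the `merge_*` helpers are hypothetical, not the real `DistutilsInfo` API):

```python
def merge_shared(store, key, extra):
    # buggy variant: mutates the list object in place,
    # so any other dict sharing that list sees the change
    all = store[key]
    for v in extra:
        if v not in all:
            all.append(v)

def merge_copy(store, key, extra):
    # fixed variant: extend a *copy*, then rebind (cf. Trac #845)
    all = store[key][:]
    for v in extra:
        if v not in all:
            all.append(v)
    store[key] = all

shared = ['m']                       # one list shared by two configs
a = {'libraries': shared}
b = {'libraries': shared}
merge_copy(a, 'libraries', ['z'])    # only 'a' gains the new entry
```

After `merge_copy`, `a['libraries']` is `['m', 'z']` while `b['libraries']` is still `['m']`; the buggy variant would have changed both.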
@@ -250,9 +251,8 @@ class DistutilsInfo(object):
         for key, value in self.values.items():
             type = distutils_settings[key]
             if type in [list, transitive_list]:
-                getattr(extension, key).extend(value)
-            else:
-                setattr(extension, key, value)
+                value = getattr(extension, key) + list(value)
+            setattr(extension, key, value)

 @cython.locals(start=long, q=long, single_q=long, double_q=long, hash_mark=long,
                end=long, k=long, counter=long, quote_len=long)
@@ -383,7 +383,7 @@ def resolve_depend(depend, include_dirs):

 @cached_function
 def package(filename):
     dir = os.path.dirname(os.path.abspath(str(filename)))
-    if dir != filename and path_exists(join_path(dir, '__init__.py')):
+    if dir != filename and is_package_dir(dir):
         return package(dir) + (os.path.basename(dir),)
     else:
         return ()
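`package()` above recurses upwards from a file, collecting directory names for as long as each parent looks like a package; the change swaps the bare `__init__.py` test for `is_package_dir()`, which also recognises Cython package markers such as `__init__.pyx`/`__init__.pxd`. A rough iterative equivalent of the walk (the `package_path` helper and its `is_pkg_dir` callback are illustrative, not the real API):

```python
import os

def package_path(filename, is_pkg_dir):
    """Return the dotted-package tuple for *filename* by walking upwards
    while each parent directory still qualifies as a package."""
    d = os.path.dirname(os.path.abspath(filename))
    parts = []
    # stop at the filesystem root or at the first non-package directory
    while d != os.path.dirname(d) and is_pkg_dir(d):
        parts.append(os.path.basename(d))
        d = os.path.dirname(d)
    return tuple(reversed(parts))
```

With a checker that treats `/src/a` and `/src/a/b` as packages, `package_path('/src/a/b/mod.pyx', ...)` yields `('a', 'b')`, matching the tuple `package()` builds recursively.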
@@ -832,7 +832,15 @@ def cythonize(module_list, exclude=[], nthreads=0, aliases=None, quiet=False, fo
         if not os.path.exists(options.cache):
             os.makedirs(options.cache)
     to_compile.sort()
-    if len(to_compile) <= 1:
+    # Drop "priority" component of "to_compile" entries and add a
+    # simple progress indicator.
+    N = len(to_compile)
+    progress_fmt = "[{0:%d}/{1}] " % len(str(N))
+    for i in range(N):
+        progress = progress_fmt.format(i+1, N)
+        to_compile[i] = to_compile[i][1:] + (progress,)
+
+    if N <= 1:
         nthreads = 0
     if nthreads:
         # Requires multiprocessing (or Python >= 2.6)
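The `progress_fmt` construction above uses `%` once to bake a field width into a `str.format` template: `len(str(N))` is the digit count of the module total, so every `[ i/N ]` prefix lines up regardless of how many modules are compiled. In isolation (hypothetical `make_progress_fmt` helper name):

```python
def make_progress_fmt(n):
    # field width for the counter = number of digits in the total
    return "[{0:%d}/{1}] " % len(str(n))

N = 100
fmt = make_progress_fmt(N)          # "[{0:3}/{1}] "
print(fmt.format(7, N))             # prints "[  7/100] "
print(fmt.format(42, N))            # prints "[ 42/100] "
```

The right-aligned counter keeps subsequent "Cythonizing ..." lines vertically aligned in the build log.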
@@ -862,7 +870,7 @@ def cythonize(module_list, exclude=[], nthreads=0, aliases=None, quiet=False, fo
             pool.join()
     if not nthreads:
         for args in to_compile:
-            cythonize_one(*args[1:])
+            cythonize_one(*args)
     if exclude_failures:
         failed_modules = set()
@@ -927,7 +935,7 @@ else:

 # TODO: Share context? Issue: pyx processing leaks into pxd module
 @record_results
-def cythonize_one(pyx_file, c_file, fingerprint, quiet, options=None, raise_on_failure=True, embedded_metadata=None):
+def cythonize_one(pyx_file, c_file, fingerprint, quiet, options=None, raise_on_failure=True, embedded_metadata=None, progress=""):
     from ..Compiler.Main import compile, default_options
     from ..Compiler.Errors import CompileError, PyrexError
@@ -944,7 +952,7 @@ def cythonize_one(pyx_file, c_file, fingerprint, quiet, options=None, raise_on_f
             options.cache, "%s-%s%s" % (os.path.basename(c_file), fingerprint, gzip_ext))
         if os.path.exists(fingerprint_file):
             if not quiet:
-                print("Found compiled %s in cache" % pyx_file)
+                print("%sFound compiled %s in cache" % (progress, pyx_file))
             os.utime(fingerprint_file, None)
             g = gzip_open(fingerprint_file, 'rb')
             try:
@@ -957,7 +965,7 @@ def cythonize_one(pyx_file, c_file, fingerprint, quiet, options=None, raise_on_f
                 g.close()
             return
     if not quiet:
-        print("Cythonizing %s" % pyx_file)
+        print("%sCythonizing %s" % (progress, pyx_file))
     if options is None:
         options = CompilationOptions(default_options)
     options.output_file = c_file
@@ -1000,7 +1008,7 @@ def cythonize_one(pyx_file, c_file, fingerprint, quiet, options=None, raise_on_f
 def cythonize_one_helper(m):
     import traceback
     try:
-        return cythonize_one(*m[1:])
+        return cythonize_one(*m)
     except Exception:
         traceback.print_exc()
         raise
...
@@ -277,9 +277,9 @@ _parse_code = re.compile(
     ur'(?P<trace>__Pyx_Trace[A-Za-z]+)|'
     ur'(?:'
     ur'(?P<pyx_macro_api>__Pyx_[A-Z][A-Z_]+)|'
-    ur'(?P<pyx_c_api>__Pyx_[A-Z][a-z_][A-Za-z_]+)|'
+    ur'(?P<pyx_c_api>__Pyx_[A-Z][a-z_][A-Za-z_]*)|'
     ur'(?P<py_macro_api>Py[A-Z][a-z]+_[A-Z][A-Z_]+)|'
-    ur'(?P<py_c_api>Py[A-Z][a-z]+_[A-Z][a-z][A-Za-z_]+)'
+    ur'(?P<py_c_api>Py[A-Z][a-z]+_[A-Z][a-z][A-Za-z_]*)'
     ur')(?=\()|'  # look-ahead to exclude subsequent '(' from replacement
     ur'(?P<error_goto>(?:(?<=;) *if .* +)?\{__pyx_filename = .*goto __pyx_L\w+;\})'
 ).sub
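The `+` → `*` change in the two C-API patterns above lets the annotator also match short helper names with only two characters after the capitalised start, which the `+` version rejected. A quick demonstration of the effect (pattern bodies copied from the hunk, `ur` prefix dropped for Python 3; `__Pyx_Go` is a hypothetical name chosen for illustration):

```python
import re

old_pat = re.compile(r'__Pyx_[A-Z][a-z_][A-Za-z_]+')   # before: needs >= 3 trailing chars
new_pat = re.compile(r'__Pyx_[A-Z][a-z_][A-Za-z_]*')   # after: 2 chars suffice

print(old_pat.fullmatch('__Pyx_Go'))        # prints None - rejected by '+'
print(bool(new_pat.fullmatch('__Pyx_Go')))  # prints True - accepted by '*'

# longer names match either way
assert old_pat.fullmatch('__Pyx_GetName')
assert new_pat.fullmatch('__Pyx_GetName')
```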
...
@@ -51,6 +51,8 @@ class EmbedSignature(CythonTransform):
         default_val = arg.default
         if not default_val:
             return None
+        if isinstance(default_val, ExprNodes.NullNode):
+            return 'NULL'
         try:
             denv = self.denv  # XXX
             ctval = default_val.compile_time_value(self.denv)
...
@@ -165,7 +165,26 @@ builtin_function_table = [
                     utility_code = iter_next_utility_code),  # not available in Py2 => implemented here
     #('oct', "", "", ""),
     #('open', "ss", "O", "PyFile_FromString"),  # not in Py3
-    #('ord', "", "", ""),
+] + [
+    BuiltinFunction('ord', None, None, "__Pyx_long_cast",
+                    func_type=PyrexTypes.CFuncType(
+                        PyrexTypes.c_long_type, [PyrexTypes.CFuncTypeArg("c", c_type, None)],
+                        is_strict_signature=True))
+    for c_type in [PyrexTypes.c_py_ucs4_type, PyrexTypes.c_py_unicode_type]
+] + [
+    BuiltinFunction('ord', None, None, "__Pyx_uchar_cast",
+                    func_type=PyrexTypes.CFuncType(
+                        PyrexTypes.c_uchar_type, [PyrexTypes.CFuncTypeArg("c", c_type, None)],
+                        is_strict_signature=True))
+    for c_type in [PyrexTypes.c_char_type, PyrexTypes.c_schar_type, PyrexTypes.c_uchar_type]
+] + [
+    BuiltinFunction('ord', None, None, "__Pyx_PyObject_Ord",
+                    utility_code=UtilityCode.load_cached("object_ord", "Builtins.c"),
+                    func_type=PyrexTypes.CFuncType(
+                        PyrexTypes.c_long_type, [
+                            PyrexTypes.CFuncTypeArg("c", PyrexTypes.py_object_type, None)
+                        ],
+                        exception_value="(long)(Py_UCS4)-1")),
     BuiltinFunction('pow', "OOO", "O", "PyNumber_Power"),
     BuiltinFunction('pow', "OO", "O", "__Pyx_PyNumber_Power2",
                     utility_code = UtilityCode.load("pow2", "Builtins.c")),
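The new `ord()` specialisations above must preserve the Python-level semantics of the builtin: for an argument already typed as a single-character C type the call collapses to a cast, while the generic object fallback (`__Pyx_PyObject_Ord`) matches what plain Python does, sketched here:

```python
# Python-level behaviour the C specialisations mirror
assert ord('A') == 65                 # ASCII character
assert ord('\u20ac') == 0x20ac        # any single code point works

# only length-1 strings are accepted; longer input is a TypeError,
# which the object fallback signals via its exception_value
try:
    ord('ab')
except TypeError as exc:
    print(exc)
```

The `exception_value="(long)(Py_UCS4)-1"` in the hunk is the C-level sentinel used to report exactly this error case from the object path.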
@@ -300,6 +319,8 @@ builtin_types_table = [
                     BuiltinMethod("clear", "T", "r", "PySet_Clear"),
                     # discard() and remove() have a special treatment for unhashable values
                     # BuiltinMethod("discard", "TO", "r", "PySet_Discard"),
+                    BuiltinMethod("update", "TO", "r", "__Pyx_PySet_Update",
+                                  utility_code=UtilityCode.load_cached("PySet_Update", "Builtins.c")),
                     BuiltinMethod("add", "TO", "r", "PySet_Add"),
                     BuiltinMethod("pop", "T", "O", "PySet_Pop")]),
     ("frozenset", "PyFrozenSet_Type", []),
...
@@ -231,12 +231,11 @@ class UtilityCodeBase(object):
                 loader = __loader__
                 archive = loader.archive
                 with closing(zipfile.ZipFile(archive)) as fileobj:
-                    listing = [ os.path.basename(name)
-                                for name in fileobj.namelist()
-                                if os.path.join(archive, name).startswith(utility_dir)]
-                files = [ os.path.join(utility_dir, filename)
-                          for filename in listing
-                          if filename.startswith(prefix) ]
+                    listing = [os.path.basename(name)
+                               for name in fileobj.namelist()
+                               if os.path.join(archive, name).startswith(utility_dir)]
+                files = [filename for filename in listing
+                         if filename.startswith(prefix)]
             if not files:
                 raise ValueError("No match found for utility code " + util_code_name)
             if len(files) > 1:
@@ -397,6 +396,9 @@ class UtilityCode(UtilityCodeBase):

     def inject_string_constants(self, impl, output):
         """Replace 'PYIDENT("xyz")' by a constant Python identifier cname.
         """
+        if 'PYIDENT(' not in impl:
+            return False, impl
+
         replacements = {}
         def externalise(matchobj):
             name = matchobj.group(1)
@@ -407,9 +409,55 @@ class UtilityCode(UtilityCodeBase):
                     StringEncoding.EncodedString(name)).cname
             return cname

-        impl = re.sub('PYIDENT\("([^"]+)"\)', externalise, impl)
+        impl = re.sub(r'PYIDENT\("([^"]+)"\)', externalise, impl)
+        assert 'PYIDENT(' not in impl
         return bool(replacements), impl

+    def inject_unbound_methods(self, impl, output):
+        """Replace 'UNBOUND_METHOD(type, "name")' by a constant Python identifier cname.
+        """
+        if 'CALL_UNBOUND_METHOD(' not in impl:
+            return False, impl
+
+        utility_code = set()
+        def externalise(matchobj):
+            type_cname, method_name, args = matchobj.groups()
+            args = [arg.strip() for arg in args[1:].split(',')]
+            if len(args) == 1:
+                call = '__Pyx_CallUnboundCMethod0'
+                utility_code.add("CallUnboundCMethod0")
+            elif len(args) == 2:
+                call = '__Pyx_CallUnboundCMethod1'
+                utility_code.add("CallUnboundCMethod1")
+            else:
+                assert False, "CALL_UNBOUND_METHOD() requires 1 or 2 call arguments"
+            cname = output.get_cached_unbound_method(type_cname, method_name, len(args))
+            return '%s(&%s, %s)' % (call, cname, ', '.join(args))
+
+        impl = re.sub(r'CALL_UNBOUND_METHOD\(([a-zA-Z_]+),\s*"([^"]+)"((?:,\s*[^),]+)+)\)', externalise, impl)
+        assert 'CALL_UNBOUND_METHOD(' not in impl
+        for helper in sorted(utility_code):
+            output.use_utility_code(UtilityCode.load_cached(helper, "ObjectHandling.c"))
+        return bool(utility_code), impl
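`inject_unbound_methods()` above leans on `re.sub()` with a replacement *function*: the callback rewrites each macro occurrence while also collecting side information (which helper utilities to load) in an enclosing `set`. The same pattern in a self-contained sketch, applied to a simplified macro (the `CALL(...)` grammar and `expand_calls` helper are hypothetical, not Cython's exact syntax):

```python
import re

def expand_calls(src):
    """Rewrite CALL(Type, "name", arg, ...) macros, recording needed helpers."""
    helpers = set()

    def replace(m):
        type_name, method, args = m.group(1), m.group(2), m.group(3)
        # args starts with a comma: ", d, key" -> ['d', 'key']
        arglist = [a.strip() for a in args[1:].split(',')]
        helpers.add('Helper%d' % len(arglist))
        return 'call_%d(%s, "%s", %s)' % (
            len(arglist), type_name, method, ', '.join(arglist))

    out = re.sub(r'CALL\(([A-Za-z_]+),\s*"([^"]+)"((?:,\s*[^),]+)+)\)',
                 replace, src)
    return out, sorted(helpers)

out, helpers = expand_calls('x = CALL(PyDict_Type, "get", d, key);')
print(out)       # prints: x = call_2(PyDict_Type, "get", d, key);
print(helpers)   # prints: ['Helper2']
```

Collecting state through a closure keeps the substitution a single pass while still letting the caller register the right utility code afterwards, exactly as the real method does with `output.use_utility_code()`.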
+    def wrap_c_strings(self, impl):
+        """Replace CSTRING('''xyz''') by a C compatible string
+        """
+        if 'CSTRING(' not in impl:
+            return impl
+
+        def split_string(matchobj):
+            content = matchobj.group(1).replace('"', '\042')
+            return ''.join(
+                '"%s\\n"\n' % line if not line.endswith('\\') or line.endswith('\\\\') else '"%s"\n' % line[:-1]
+                for line in content.splitlines())
+
+        impl = re.sub(r'CSTRING\(\s*"""([^"]+|"[^"])"""\s*\)', split_string, impl)
+        assert 'CSTRING(' not in impl
+        return impl
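`wrap_c_strings()` above turns a triple-quoted block into a column of adjacent C string literals, one per source line, each ending in an escaped `\n` (with a special case for backslash continuation lines). A minimal standalone version of just the core line-splitting step, without the continuation handling (simplified sketch, not the full method):

```python
def to_c_literals(content):
    # one C literal per line; adjacent literals concatenate in C,
    # so the emitted sequence reassembles the original text
    return ''.join('"%s\\n"\n' % line for line in content.splitlines())

print(to_c_literals('static int x;\nreturn x;'))
```

For the two-line input this emits `"static int x;\n"` and `"return x;\n"` on consecutive lines, which a C compiler joins back into one string constant.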

     def put_code(self, output):
         if self.requires:
             for dependency in self.requires:
@@ -419,9 +467,10 @@ class UtilityCode(UtilityCodeBase):
                 self.format_code(self.proto),
                 '%s_proto' % self.name)
         if self.impl:
-            impl = self.format_code(self.impl)
-            is_specialised, impl = self.inject_string_constants(impl, output)
-            if not is_specialised:
+            impl = self.format_code(self.wrap_c_strings(self.impl))
+            is_specialised1, impl = self.inject_string_constants(impl, output)
+            is_specialised2, impl = self.inject_unbound_methods(impl, output)
+            if not (is_specialised1 or is_specialised2):
                 # no module specific adaptations => can be reused
                 output['utility_code_def'].put_or_include(
                     impl, '%s_impl' % self.name)
@@ -642,7 +691,7 @@ class FunctionState(object):

         A C string referring to the variable is returned.
         """
-        if type.is_const:
+        if type.is_const and not type.is_reference:
             type = type.const_base_type
         if not type.is_pyobject and not type.is_memoryviewslice:
             # Make manage_ref canonical, so that manage_ref will always mean
@@ -913,6 +962,7 @@ class GlobalState(object):
         'typeinfo',
         'before_global_var',
         'global_var',
+        'string_decls',
         'decls',
         'all_the_rest',
         'pystring_table',
@@ -946,6 +996,7 @@ class GlobalState(object):
         self.pyunicode_ptr_const_index = {}
         self.num_const_index = {}
         self.py_constants = []
+        self.cached_cmethods = {}

         assert writer.globalstate is None
         writer.globalstate = self
@@ -1000,7 +1051,8 @@ class GlobalState(object):
         #
         # utility_code_def
         #
         code = self.parts['utility_code_def']
-        code.put(UtilityCode.load_as_string("TypeConversions", "TypeConversion.c")[1])
+        util = TempitaUtilityCode.load_cached("TypeConversions", "TypeConversion.c")
+        code.put(util.format_code(util.impl))
         code.putln("")

     def __getitem__(self, key):
@@ -1167,6 +1219,15 @@ class GlobalState(object):
             prefix = Naming.const_prefix
         return "%s%s" % (prefix, name_suffix)

+    def get_cached_unbound_method(self, type_cname, method_name, args_count):
+        key = (type_cname, method_name, args_count)
+        try:
+            cname = self.cached_cmethods[key]
+        except KeyError:
+            cname = self.cached_cmethods[key] = self.new_const_cname(
+                'umethod', '%s_%s' % (type_cname, method_name))
+        return cname
+
     def add_cached_builtin_decl(self, entry):
         if entry.is_builtin and entry.is_const:
             if self.should_declare(entry.cname, entry):
@@ -1198,6 +1259,7 @@ class GlobalState(object):
                 w.error_goto(pos)))

     def generate_const_declarations(self):
+        self.generate_cached_methods_decls()
         self.generate_string_constants()
         self.generate_num_constants()
         self.generate_object_constant_decls()
@@ -1211,13 +1273,34 @@ class GlobalState(object):
             decls_writer.putln(
                 "static %s;" % c.type.declaration_code(cname))

+    def generate_cached_methods_decls(self):
+        if not self.cached_cmethods:
+            return
+
+        decl = self.parts['decls']
+        init = self.parts['init_globals']
+        cnames = []
+        for (type_cname, method_name, _), cname in sorted(self.cached_cmethods.items()):
+            cnames.append(cname)
+            method_name_cname = self.get_interned_identifier(StringEncoding.EncodedString(method_name)).cname
+            decl.putln('static __Pyx_CachedCFunction %s = {0, &%s, 0, 0, 0};' % (
+                cname, method_name_cname))
+            # split type reference storage as it might not be static
+            init.putln('%s.type = (PyObject*)&%s;' % (
+                cname, type_cname))
+
+        if Options.generate_cleanup_code:
+            cleanup = self.parts['cleanup_globals']
+            for cname in cnames:
+                cleanup.putln("Py_CLEAR(%s.method);" % cname)
+
     def generate_string_constants(self):
         c_consts = [ (len(c.cname), c.cname, c)
                      for c in self.string_const_index.values() ]
         c_consts.sort()
         py_strings = []

-        decls_writer = self.parts['decls']
+        decls_writer = self.parts['string_decls']
         for _, cname, c in c_consts:
             conditional = False
             if c.py_versions and (2 not in c.py_versions or 3 not in c.py_versions):
@@ -1798,6 +1881,10 @@ class CCodeWriter(object):
         if entry.type.is_pyobject:
             self.putln("__Pyx_INCREF(%s);" % self.entry_as_pyobject(entry))

+    def put_var_xincref(self, entry):
+        if entry.type.is_pyobject:
+            self.putln("__Pyx_XINCREF(%s);" % self.entry_as_pyobject(entry))
+
     def put_decref_clear(self, cname, type, nanny=True, clear_before_decref=False):
         self._put_decref(cname, type, nanny, null_check=False,
                          clear=True, clear_before_decref=clear_before_decref)
...
This diff is collapsed.
@@ -19,6 +19,7 @@ from . import Errors
 # conditional metaclass. These options are processed by CmdLine called from
 # main() in this file.
 # import Parsing
+from .StringEncoding import EncodedString
 from .Scanning import PyrexScanner, FileSourceDescriptor
 from .Errors import PyrexError, CompileError, error, warning
 from .Symtab import ModuleScope
@@ -76,7 +77,8 @@ class Context(object):
         self.cpp = cpp
         self.options = options

         self.pxds = {}  # full name -> node tree
+        self._interned = {}  # (type(value), value, *key_args) -> interned_value

         standard_include_path = os.path.abspath(os.path.normpath(
             os.path.join(os.path.dirname(__file__), os.path.pardir, 'Includes')))
@@ -93,6 +95,27 @@ class Context(object):
             self.future_directives.update([print_function, unicode_literals, absolute_import, division])
             self.modules['builtins'] = self.modules['__builtin__']

+    def intern_ustring(self, value, encoding=None):
+        key = (EncodedString, value, encoding)
+        try:
+            return self._interned[key]
+        except KeyError:
+            pass
+        value = EncodedString(value)
+        if encoding:
+            value.encoding = encoding
+        self._interned[key] = value
+        return value
+
+    def intern_value(self, value, *key):
+        key = (type(value), value) + key
+        try:
+            return self._interned[key]
+        except KeyError:
+            pass
+        self._interned[key] = value
+        return value
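The two `intern_*` helpers above memoize values in a single `_interned` dict keyed by the value's type plus the value (plus any extra key parts), so equal values of different types, or the same text under different encodings, never collide. The same idea in isolation (the `Interner` class is an illustrative reduction, not the real `Context`):

```python
class Interner:
    def __init__(self):
        # (type(value), value, *key_args) -> interned value
        self._interned = {}

    def intern_value(self, value, *key):
        key = (type(value), value) + key
        try:
            return self._interned[key]      # reuse the first object seen
        except KeyError:
            pass
        self._interned[key] = value
        return value

interner = Interner()
a = interner.intern_value((1, 2))
b = interner.intern_value((1, 2))   # same key -> the *same* object comes back
```

Interning keeps only one object alive per distinct value, so later identity checks and hashing work on shared objects instead of many equal duplicates.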
     # pipeline creation functions can now be found in Pipeline.py

     def process_pxd(self, source_desc, scope, module_name):
@@ -110,7 +133,8 @@ class Context(object):
     def nonfatal_error(self, exc):
         return Errors.report_error(exc)

-    def find_module(self, module_name, relative_to=None, pos=None, need_pxd=1, check_module_name=True):
+    def find_module(self, module_name, relative_to=None, pos=None, need_pxd=1,
+                    absolute_fallback=True):
         # Finds and returns the module scope corresponding to
         # the given relative or absolute module name. If this
         # is the first time the module has been requested, finds
@@ -125,25 +149,39 @@ class Context(object):
         scope = None
         pxd_pathname = None

-        if check_module_name and not module_name_pattern.match(module_name):
-            if pos is None:
-                pos = (module_name, 0, 0)
-            raise CompileError(pos, "'%s' is not a valid module name" % module_name)
+        if relative_to:
+            if module_name:
+                # from .module import ...
+                qualified_name = relative_to.qualify_name(module_name)
+            else:
+                # from . import ...
+                qualified_name = relative_to.qualified_name
+                scope = relative_to
+                relative_to = None
+        else:
+            qualified_name = module_name
+
+        if not module_name_pattern.match(qualified_name):
+            raise CompileError(pos or (module_name, 0, 0),
+                               "'%s' is not a valid module name" % module_name)

         if relative_to:
             if debug_find_module:
                 print("...trying relative import")
             scope = relative_to.lookup_submodule(module_name)
             if not scope:
-                qualified_name = relative_to.qualify_name(module_name)
                 pxd_pathname = self.find_pxd_file(qualified_name, pos)
                 if pxd_pathname:
                     scope = relative_to.find_submodule(module_name)
         if not scope:
             if debug_find_module:
                 print("...trying absolute import")
+            if absolute_fallback:
+                qualified_name = module_name
             scope = self
-            for name in module_name.split("."):
+            for name in qualified_name.split("."):
                 scope = scope.find_submodule(name)

         if debug_find_module:
             print("...scope = %s" % scope)
         if not scope.pxd_file_loaded:
@@ -153,15 +191,15 @@ class Context(object):
             if not pxd_pathname:
                 if debug_find_module:
                     print("...looking for pxd file")
-                pxd_pathname = self.find_pxd_file(module_name, pos)
+                pxd_pathname = self.find_pxd_file(qualified_name, pos)
                 if debug_find_module:
                     print("......found %s" % pxd_pathname)
                 if not pxd_pathname and need_pxd:
-                    package_pathname = self.search_include_directories(module_name, ".py", pos)
+                    package_pathname = self.search_include_directories(qualified_name, ".py", pos)
                     if package_pathname and package_pathname.endswith('__init__.py'):
                         pass
                     else:
-                        error(pos, "'%s.pxd' not found" % module_name.replace('.', os.sep))
+                        error(pos, "'%s.pxd' not found" % qualified_name.replace('.', os.sep))
             if pxd_pathname:
                 try:
                     if debug_find_module:
@@ -170,7 +208,7 @@ class Context(object):
                     if not pxd_pathname.endswith(rel_path):
                         rel_path = pxd_pathname  # safety measure to prevent printing incorrect paths
                     source_desc = FileSourceDescriptor(pxd_pathname, rel_path)
-                    err, result = self.process_pxd(source_desc, scope, module_name)
+                    err, result = self.process_pxd(source_desc, scope, qualified_name)
                     if err:
                         raise err
                     (pxd_codenodes, pxd_scope) = result
...
@@ -571,26 +571,6 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
             code.putln("")
         code.putln("#define PY_SSIZE_T_CLEAN")
-
-        # sizeof(PyLongObject.ob_digit[0]) may have been determined dynamically
-        # at compile time in CPython, in which case we can't know the correct
-        # storage size for an installed system. We can rely on it only if
-        # pyconfig.h defines it statically, i.e. if it was set by "configure".
-        # Once we include "Python.h", it will come up with its own idea about
-        # a suitable value, which may or may not match the real one.
-        code.putln("#ifndef CYTHON_USE_PYLONG_INTERNALS")
-        code.putln("#ifdef PYLONG_BITS_IN_DIGIT")
-        # assume it's an incorrect left-over
-        code.putln("#define CYTHON_USE_PYLONG_INTERNALS 0")
-        code.putln("#else")
-        code.putln('#include "pyconfig.h"')
-        code.putln("#ifdef PYLONG_BITS_IN_DIGIT")
-        code.putln("#define CYTHON_USE_PYLONG_INTERNALS 1")
-        code.putln("#else")
-        code.putln("#define CYTHON_USE_PYLONG_INTERNALS 0")
-        code.putln("#endif")
-        code.putln("#endif")
-        code.putln("#endif")
         for filename in env.python_include_files:
             code.putln('#include "%s"' % filename)
         code.putln("#ifndef Py_PYTHON_H")
@@ -1982,7 +1962,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
             "};")

     def generate_import_star(self, env, code):
-        env.use_utility_code(streq_utility_code)
+        env.use_utility_code(UtilityCode.load_cached("CStringEquals", "StringTools.c"))
        code.putln()
        code.enter_cfunc_scope()  # as we need labels
        code.putln("static int %s(PyObject *o, PyObject* py_name, char *name) {" % Naming.import_star_set)
@@ -2044,8 +2024,9 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
        code.putln("bad:")
        code.putln("return -1;")
        code.putln("}")
-        code.putln(import_star_utility_code)
+        code.putln("")
+        code.putln(UtilityCode.load_cached("ImportStar", "ImportExport.c").impl)
        code.exit_cfunc_scope()  # done with labels
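The `ImportStar` C helper replicates CPython's `from module import *` semantics: honour `__all__` if present, otherwise copy the module's `__dict__` keys while skipping leading-underscore names. A rough Python sketch of that logic (the function name `import_all_from` is hypothetical, chosen to mirror the C helper `__Pyx_import_all_from`):

```python
def import_all_from(locals_dict, module):
    """Emulate 'from module import *' into an explicit namespace dict."""
    try:
        names = module.__all__
        skip_private = False  # __all__ is taken verbatim
    except AttributeError:
        names = list(module.__dict__)
        skip_private = True   # without __all__, underscore names are skipped
    for name in names:
        if skip_private and name.startswith('_'):
            continue
        locals_dict[name] = getattr(module, name)

import math
ns = {}
import_all_from(ns, math)  # math has no __all__, so '_'-names are skipped
```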
     def generate_module_init_func(self, imported_modules, env, code):
         code.enter_cfunc_scope()
@@ -2170,6 +2151,10 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
         code.putln("/*--- Execution code ---*/")
         code.mark_pos(None)

+        code.putln("#ifdef __Pyx_Generator_USED")
+        code.put_error_if_neg(self.pos, "__Pyx_patch_abc()")
+        code.putln("#endif")
+
         if profile or linetrace:
             code.put_trace_call(header3, self.pos, nogil=not code.funcstate.gil_owned)
             code.funcstate.can_trace = True
@@ -2358,12 +2343,13 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
                 wmain = "wmain"
             else:
                 wmain = Options.embed
+        main_method = UtilityCode.load_cached("MainFunction", "Embed.c")
         code.globalstate.use_utility_code(
             main_method.specialize(
-                module_name = env.module_name,
-                module_is_main = module_is_main,
-                main_method = Options.embed,
-                wmain_method = wmain))
+                module_name=env.module_name,
+                module_is_main=module_is_main,
+                main_method=Options.embed,
+                wmain_method=wmain))
     def generate_pymoduledef_struct(self, env, code):
         if env.doc:
@@ -2829,140 +2815,7 @@ def generate_cfunction_declaration(entry, env, code, definition):
 #
 #------------------------------------------------------------------------------------

-streq_utility_code = UtilityCode(
-proto = """
-static CYTHON_INLINE int __Pyx_StrEq(const char *, const char *); /*proto*/
-""",
-impl = """
-static CYTHON_INLINE int __Pyx_StrEq(const char *s1, const char *s2) {
-     while (*s1 != '\\0' && *s1 == *s2) { s1++; s2++; }
-     return *s1 == *s2;
-}
-""")
-
-#------------------------------------------------------------------------------------
-
-import_star_utility_code = """
-
-/* import_all_from is an unexposed function from ceval.c */
-
-static int
-__Pyx_import_all_from(PyObject *locals, PyObject *v)
-{
-    PyObject *all = PyObject_GetAttrString(v, "__all__");
-    PyObject *dict, *name, *value;
-    int skip_leading_underscores = 0;
-    int pos, err;
-
-    if (all == NULL) {
-        if (!PyErr_ExceptionMatches(PyExc_AttributeError))
-            return -1; /* Unexpected error */
-        PyErr_Clear();
-        dict = PyObject_GetAttrString(v, "__dict__");
-        if (dict == NULL) {
-            if (!PyErr_ExceptionMatches(PyExc_AttributeError))
-                return -1;
-            PyErr_SetString(PyExc_ImportError,
-            "from-import-* object has no __dict__ and no __all__");
-            return -1;
-        }
-#if PY_MAJOR_VERSION < 3
-        all = PyObject_CallMethod(dict, (char *)"keys", NULL);
-#else
-        all = PyMapping_Keys(dict);
-#endif
-        Py_DECREF(dict);
-        if (all == NULL)
-            return -1;
-        skip_leading_underscores = 1;
-    }
-
-    for (pos = 0, err = 0; ; pos++) {
-        name = PySequence_GetItem(all, pos);
-        if (name == NULL) {
-            if (!PyErr_ExceptionMatches(PyExc_IndexError))
-                err = -1;
-            else
-                PyErr_Clear();
-            break;
-        }
-        if (skip_leading_underscores &&
-#if PY_MAJOR_VERSION < 3
-            PyString_Check(name) &&
-            PyString_AS_STRING(name)[0] == '_')
-#else
-            PyUnicode_Check(name) &&
-            PyUnicode_AS_UNICODE(name)[0] == '_')
-#endif
-        {
-            Py_DECREF(name);
-            continue;
-        }
-        value = PyObject_GetAttr(v, name);
-        if (value == NULL)
-            err = -1;
-        else if (PyDict_CheckExact(locals))
-            err = PyDict_SetItem(locals, name, value);
-        else
-            err = PyObject_SetItem(locals, name, value);
-        Py_DECREF(name);
-        Py_XDECREF(value);
-        if (err != 0)
-            break;
-    }
-    Py_DECREF(all);
-    return err;
-}
-
-
-static int %(IMPORT_STAR)s(PyObject* m) {
-
-    int i;
-    int ret = -1;
-    char* s;
-    PyObject *locals = 0;
-    PyObject *list = 0;
-#if PY_MAJOR_VERSION >= 3
-    PyObject *utf8_name = 0;
-#endif
-    PyObject *name;
-    PyObject *item;
-
-    locals = PyDict_New();              if (!locals) goto bad;
-    if (__Pyx_import_all_from(locals, m) < 0) goto bad;
-    list = PyDict_Items(locals);        if (!list) goto bad;
-
-    for(i=0; i<PyList_GET_SIZE(list); i++) {
-        name = PyTuple_GET_ITEM(PyList_GET_ITEM(list, i), 0);
-        item = PyTuple_GET_ITEM(PyList_GET_ITEM(list, i), 1);
-#if PY_MAJOR_VERSION >= 3
-        utf8_name = PyUnicode_AsUTF8String(name);
-        if (!utf8_name) goto bad;
-        s = PyBytes_AS_STRING(utf8_name);
-        if (%(IMPORT_STAR_SET)s(item, name, s) < 0) goto bad;
-        Py_DECREF(utf8_name); utf8_name = 0;
-#else
-        s = PyString_AsString(name);
-        if (!s) goto bad;
-        if (%(IMPORT_STAR_SET)s(item, name, s) < 0) goto bad;
-#endif
-    }
-    ret = 0;
-
-bad:
-    Py_XDECREF(locals);
-    Py_XDECREF(list);
-#if PY_MAJOR_VERSION >= 3
-    Py_XDECREF(utf8_name);
-#endif
-    return ret;
-}
-""" % {'IMPORT_STAR'     : Naming.import_star,
-       'IMPORT_STAR_SET' : Naming.import_star_set }
-
-refnanny_utility_code = UtilityCode.load_cached("Refnanny", "ModuleSetupCode.c")
-
-main_method = UtilityCode.load("MainFunction", "Embed.c")
+refnanny_utility_code = UtilityCode.load("Refnanny", "ModuleSetupCode.c")

 packed_struct_utility_code = UtilityCode(proto="""
 #if defined(__GNUC__)
@@ -61,6 +61,7 @@ interned_prefixes = {
     'codeobj':      pyrex_prefix + "codeobj_",
     'slice':        pyrex_prefix + "slice_",
     'ustring':      pyrex_prefix + "ustring_",
+    'umethod':      pyrex_prefix + "umethod_",
 }

 ctuple_type_prefix = pyrex_prefix + "ctuple_"
@@ -48,7 +48,7 @@ cdef p_power(PyrexScanner s)
 cdef p_new_expr(PyrexScanner s)
 cdef p_trailer(PyrexScanner s, node1)
 cdef p_call_parse_args(PyrexScanner s, bint allow_genexp = *)
-cdef p_call_build_packed_args(pos, positional_args, keyword_args, star_arg, starstar_arg)
+cdef p_call_build_packed_args(pos, positional_args, keyword_args)
 cdef p_call(PyrexScanner s, function)
 cdef p_index(PyrexScanner s, base)
 cdef tuple p_subscript_list(PyrexScanner s)
@@ -30,8 +30,7 @@ def parse_stage_factory(context):
         full_module_name = compsrc.full_module_name
         initial_pos = (source_desc, 1, 0)
         saved_cimport_from_pyx, Options.cimport_from_pyx = Options.cimport_from_pyx, False
-        scope = context.find_module(full_module_name, pos = initial_pos, need_pxd = 0,
-                                    check_module_name = not Options.embed)
+        scope = context.find_module(full_module_name, pos = initial_pos, need_pxd = 0)
         Options.cimport_from_pyx = saved_cimport_from_pyx
         tree = context.parse(source_desc, scope, pxd = 0, full_module_name = full_module_name)
         tree.compilation_source = compsrc
@@ -393,7 +393,7 @@ class CTypedefType(BaseType):
                 base_type = self.typedef_base_type
                 if type(base_type) is CIntType:
                     self.to_py_function = "__Pyx_PyInt_From_" + self.specialization_name()
-                    env.use_utility_code(TempitaUtilityCode.load(
+                    env.use_utility_code(TempitaUtilityCode.load_cached(
                         "CIntToPy", "TypeConversion.c",
                         context={"TYPE": self.empty_declaration_code(),
                                  "TO_PY_FUNCTION": self.to_py_function}))
@@ -415,7 +415,7 @@ class CTypedefType(BaseType):
                 base_type = self.typedef_base_type
                 if type(base_type) is CIntType:
                     self.from_py_function = "__Pyx_PyInt_As_" + self.specialization_name()
-                    env.use_utility_code(TempitaUtilityCode.load(
+                    env.use_utility_code(TempitaUtilityCode.load_cached(
                         "CIntFromPy", "TypeConversion.c",
                         context={"TYPE": self.empty_declaration_code(),
                                  "FROM_PY_FUNCTION": self.from_py_function}))
@@ -450,17 +450,17 @@ class CTypedefType(BaseType):
             type = self.empty_declaration_code()
             name = self.specialization_name()
             if binop == "lshift":
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "LeftShift", "Overflow.c",
                     context={'TYPE': type, 'NAME': name, 'SIGNED': self.signed}))
             else:
                 if const_rhs:
                     binop += "_const"
                 _load_overflow_base(env)
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "SizeCheck", "Overflow.c",
                     context={'TYPE': type, 'NAME': name}))
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "Binop", "Overflow.c",
                     context={'TYPE': type, 'NAME': name, 'BINOP': binop}))
             return "__Pyx_%s_%s_checking_overflow" % (binop, name)
@@ -697,10 +697,9 @@ class MemoryViewSliceType(PyrexType):
         # We don't have 'code', so use a LazyUtilityCode with a callback.
         def lazy_utility_callback(code):
-            context['dtype_typeinfo'] = Buffer.get_type_information_cname(
-                code, self.dtype)
+            context['dtype_typeinfo'] = Buffer.get_type_information_cname(code, self.dtype)
             return TempitaUtilityCode.load(
                 "ObjectToMemviewSlice", "MemoryView_C.c", context=context)

         env.use_utility_code(Buffer.acquire_utility_code)
         env.use_utility_code(MemoryView.memviewslice_init_code)
@@ -777,8 +776,8 @@ class MemoryViewSliceType(PyrexType):
             error_condition = error_condition,
         )

-        utility = TempitaUtilityCode.load(
+        utility = TempitaUtilityCode.load_cached(
             utility_name, "MemoryView_C.c", context=context)
         env.use_utility_code(utility)
         return get_function, set_function
@@ -1490,7 +1489,7 @@ class CIntType(CNumericType):
     def create_to_py_utility_code(self, env):
         if type(self).to_py_function is None:
             self.to_py_function = "__Pyx_PyInt_From_" + self.specialization_name()
-            env.use_utility_code(TempitaUtilityCode.load(
+            env.use_utility_code(TempitaUtilityCode.load_cached(
                 "CIntToPy", "TypeConversion.c",
                 context={"TYPE": self.empty_declaration_code(),
                          "TO_PY_FUNCTION": self.to_py_function}))
@@ -1499,7 +1498,7 @@ class CIntType(CNumericType):
     def create_from_py_utility_code(self, env):
         if type(self).from_py_function is None:
             self.from_py_function = "__Pyx_PyInt_As_" + self.specialization_name()
-            env.use_utility_code(TempitaUtilityCode.load(
+            env.use_utility_code(TempitaUtilityCode.load_cached(
                 "CIntFromPy", "TypeConversion.c",
                 context={"TYPE": self.empty_declaration_code(),
                          "FROM_PY_FUNCTION": self.from_py_function}))
@@ -1539,18 +1538,18 @@ class CIntType(CNumericType):
         type = self.empty_declaration_code()
         name = self.specialization_name()
         if binop == "lshift":
-            env.use_utility_code(TempitaUtilityCode.load(
+            env.use_utility_code(TempitaUtilityCode.load_cached(
                 "LeftShift", "Overflow.c",
                 context={'TYPE': type, 'NAME': name, 'SIGNED': self.signed}))
         else:
             if const_rhs:
                 binop += "_const"
             if type in ('int', 'long', 'long long'):
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "BaseCaseSigned", "Overflow.c",
                     context={'INT': type, 'NAME': name}))
             elif type in ('unsigned int', 'unsigned long', 'unsigned long long'):
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "BaseCaseUnsigned", "Overflow.c",
                     context={'UINT': type, 'NAME': name}))
             elif self.rank <= 1:
@@ -1558,22 +1557,23 @@ class CIntType(CNumericType):
                 return "__Pyx_%s_%s_no_overflow" % (binop, name)
             else:
                 _load_overflow_base(env)
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "SizeCheck", "Overflow.c",
                     context={'TYPE': type, 'NAME': name}))
-                env.use_utility_code(TempitaUtilityCode.load(
+                env.use_utility_code(TempitaUtilityCode.load_cached(
                     "Binop", "Overflow.c",
                     context={'TYPE': type, 'NAME': name, 'BINOP': binop}))
         return "__Pyx_%s_%s_checking_overflow" % (binop, name)


 def _load_overflow_base(env):
     env.use_utility_code(UtilityCode.load("Common", "Overflow.c"))
     for type in ('int', 'long', 'long long'):
-        env.use_utility_code(TempitaUtilityCode.load(
+        env.use_utility_code(TempitaUtilityCode.load_cached(
             "BaseCaseSigned", "Overflow.c",
             context={'INT': type, 'NAME': type.replace(' ', '_')}))
     for type in ('unsigned int', 'unsigned long', 'unsigned long long'):
-        env.use_utility_code(TempitaUtilityCode.load(
+        env.use_utility_code(TempitaUtilityCode.load_cached(
             "BaseCaseUnsigned", "Overflow.c",
             context={'UINT': type, 'NAME': type.replace(' ', '_')}))
@@ -2710,8 +2710,9 @@ class CFuncType(CType):
             trailer = " except %s" % self.exception_value
         elif self.exception_check == '+':
             trailer = " except +"
-        else:
-            " except *" # ignored
+        elif self.exception_check and for_display:
+            # not spelled out by default, unless for human eyes
+            trailer = " except *"
         if self.nogil:
             trailer += " nogil"
         if not with_calling_convention:
@@ -3684,13 +3685,11 @@ class CTupleType(CType):
             env.use_utility_code(self._convert_from_py_code)
         return True

-c_tuple_types = {}

 def c_tuple_type(components):
     components = tuple(components)
-    tuple_type = c_tuple_types.get(components)
-    if tuple_type is None:
-        cname = Naming.ctuple_type_prefix + type_list_identifier(components)
-        tuple_type = c_tuple_types[components] = CTupleType(cname, components)
+    cname = Naming.ctuple_type_prefix + type_list_identifier(components)
+    tuple_type = CTupleType(cname, components)
     return tuple_type
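The global `c_tuple_types` cache is removed here; the caller (the module scope) now memoises ctuple types per scope instead, keyed by the component tuple. A minimal sketch of that memoisation pattern, with hypothetical names (`TupleTypeCache`, and a plain tuple standing in for `CTupleType`):

```python
class TupleTypeCache:
    """Per-scope memoisation of constructed tuple types."""
    def __init__(self):
        self._cached_tuple_types = {}

    def declare_tuple_type(self, components):
        components = tuple(components)  # normalise so lists and tuples hit the same key
        try:
            return self._cached_tuple_types[components]
        except KeyError:
            ttype = self._cached_tuple_types[components] = ('ctuple', components)
            return ttype

cache = TupleTypeCache()
a = cache.declare_tuple_type(['int', 'double'])
b = cache.declare_tuple_type(('int', 'double'))
assert a is b  # same components -> the same cached type object
```

Keeping the cache on the scope rather than at module level avoids sharing type objects between independent compiler runs.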
@@ -6,7 +6,7 @@
 from __future__ import absolute_import

 import cython
-cython.declare(EncodedString=object, make_lexicon=object, lexicon=object,
+cython.declare(make_lexicon=object, lexicon=object,
                any_string_prefix=unicode, IDENT=unicode,
                print_function=object, error=object, warning=object,
                os=object, platform=object)
@@ -21,8 +21,6 @@ from .Errors import error, warning
 from .Lexicon import any_string_prefix, make_lexicon, IDENT
 from .Future import print_function

-from .StringEncoding import EncodedString
-
 debug_scanner = 0
 trace_scanner = 0
 scanner_debug_flags = 0
@@ -421,14 +419,11 @@ class PyrexScanner(Scanner):
             if systring in self.keywords:
                 if systring == u'print' and print_function in self.context.future_directives:
                     self.keywords.discard('print')
-                    systring = EncodedString(systring)
                 elif systring == u'exec' and self.context.language_level >= 3:
                     self.keywords.discard('exec')
-                    systring = EncodedString(systring)
                 else:
                     sy = systring
-            else:
-                systring = EncodedString(systring)
+            systring = self.context.intern_ustring(systring)
         self.sy = sy
         self.systring = systring
         if False: # debug_scanner:
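Routing every identifier through `self.context.intern_ustring()` means each distinct name is represented by a single string object per compilation context, so later comparisons can use identity. A minimal sketch of such an intern table (the class name `InternTable` is hypothetical):

```python
class InternTable:
    """Per-context string interning, in the spirit of Context.intern_ustring."""
    def __init__(self):
        self._interned = {}

    def intern_ustring(self, s):
        # Return the canonical instance for equal strings.
        return self._interned.setdefault(s, s)

ctx = InternTable()
a = ctx.intern_ustring(''.join(['key', 'word']))  # a freshly built string
b = ctx.intern_ustring('keyword')
assert a is b  # both resolve to the same canonical object
```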
@@ -610,8 +610,8 @@ class Scope(object):
         self.sue_entries.append(entry)
         return entry

-    def declare_tuple_type(self, pos, type):
-        return self.outer_scope.declare_tuple_type(pos, type)
+    def declare_tuple_type(self, pos, components):
+        return self.outer_scope.declare_tuple_type(pos, components)

     def declare_var(self, name, type, pos,
                     cname = None, visibility = 'private',
@@ -1056,6 +1056,7 @@ class ModuleScope(Scope):
         self.cached_builtins = []
         self.undeclared_cached_builtins = []
         self.namespace_cname = self.module_cname
+        self._cached_tuple_types = {}
         for var_name in ['__builtins__', '__name__', '__file__', '__doc__', '__path__']:
             self.declare_var(EncodedString(var_name), py_object_type, None)
@@ -1075,18 +1076,24 @@ class ModuleScope(Scope):
         return self.outer_scope.lookup(name, language_level=language_level)

-    def declare_tuple_type(self, pos, type):
-        cname = type.cname
+    def declare_tuple_type(self, pos, components):
+        components = tuple(components)
+        try:
+            ttype = self._cached_tuple_types[components]
+        except KeyError:
+            ttype = self._cached_tuple_types[components] = PyrexTypes.c_tuple_type(components)
+        cname = ttype.cname
         entry = self.lookup_here(cname)
         if not entry:
             scope = StructOrUnionScope(cname)
-            for ix, component in enumerate(type.components):
+            for ix, component in enumerate(components):
                 scope.declare_var(name="f%s" % ix, type=component, pos=pos)
-            struct_entry = self.declare_struct_or_union(cname + '_struct', 'struct', scope, typedef_flag=True, pos=pos, cname=cname)
+            struct_entry = self.declare_struct_or_union(
+                cname + '_struct', 'struct', scope, typedef_flag=True, pos=pos, cname=cname)
             self.type_entries.remove(struct_entry)
-            type.struct_entry = struct_entry
-            entry = self.declare_type(cname, type, pos, cname)
-        type.entry = entry
+            ttype.struct_entry = struct_entry
+            entry = self.declare_type(cname, ttype, pos, cname)
+        ttype.entry = entry
         return entry

     def declare_builtin(self, name, pos):
@@ -1127,14 +1134,23 @@ class ModuleScope(Scope):
         # relative imports relative to this module's parent.
         # Finds and parses the module's .pxd file if the module
         # has not been referenced before.
-        module_scope = self.global_scope()
+        relative_to = None
+        absolute_fallback = False
         if relative_level is not None and relative_level > 0:
-            # merge current absolute module name and relative import name into qualified name
-            current_module = module_scope.qualified_name.split('.')
-            base_package = current_module[:-relative_level]
-            module_name = '.'.join(base_package + (module_name.split('.') if module_name else []))
+            # explicit relative cimport
+            # error of going beyond top-level is handled in cimport node
+            relative_to = self
+            while relative_level > 0 and relative_to:
+                relative_to = relative_to.parent_module
+                relative_level -= 1
+        elif relative_level != 0:
+            # -1 or None: try relative cimport first, then absolute
+            relative_to = self.parent_module
+            absolute_fallback = True
+        module_scope = self.global_scope()
         return module_scope.context.find_module(
-            module_name, relative_to=None if relative_level == 0 else self.parent_module, pos=pos)
+            module_name, relative_to=relative_to, pos=pos, absolute_fallback=absolute_fallback)

     def find_submodule(self, name):
         # Find and return scope for a submodule of this module,
@@ -1258,6 +1274,8 @@ class ModuleScope(Scope):
             cname = name
         else:
             cname = self.mangle(Naming.func_prefix, name)
+        if visibility == 'extern' and type.optional_arg_count:
+            error(pos, "Extern functions cannot have default arguments values.")
         entry = self.lookup_here(name)
         if entry and entry.defined_in_pxd:
             if entry.visibility != "private":
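The explicit-relative branch above walks `parent_module` scopes once per dot of the cimport. The effect on the resulting qualified name can be sketched with plain string arithmetic (hypothetical function, assuming dotted module names as in the replaced merge-the-names approach):

```python
def resolve_explicit_relative(current_module, level, name):
    """Name a 'from <dots> cimport <name>' target seen inside current_module.

    Inside pkg.sub.mod, level=1 ('.') -> pkg.sub.<name>,
    level=2 ('..') -> pkg.<name>.
    """
    parts = current_module.split('.')
    if level > len(parts):
        raise ImportError("relative cimport beyond top-level package")
    base = parts[:len(parts) - level]
    return '.'.join(base + ([name] if name else []))

print(resolve_explicit_relative('pkg.sub.mod', 1, 'helper'))  # -> pkg.sub.helper
```

The real code walks actual scope objects instead of strings, which lets `find_module` fall back to an absolute cimport (`absolute_fallback`) only in the implicit `relative_level is None or -1` case.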
@@ -32,7 +32,7 @@ class StringParseContext(Main.Context):
                               create_testscope=False)
         self.module_name = name

-    def find_module(self, module_name, relative_to=None, pos=None, need_pxd=1):
+    def find_module(self, module_name, relative_to=None, pos=None, need_pxd=1, absolute_fallback=True):
         if module_name not in (self.module_name, 'cython'):
             raise AssertionError("Not yet supporting any cimports/includes from string code snippets")
         return ModuleScope(module_name, parent_module=None, context=self)
@@ -12,8 +12,10 @@ from .Visitor import CythonTransform, EnvTransform

 class TypedExprNode(ExprNodes.ExprNode):
     # Used for declaring assignments of a specified type without a known entry.
-    def __init__(self, type):
-        self.type = type
+    subexprs = []
+
+    def __init__(self, type, pos=None):
+        super(TypedExprNode, self).__init__(pos, type=type)

 object_expr = TypedExprNode(py_object_type)
@@ -184,10 +186,10 @@ class MarkParallelAssignments(EnvTransform):
             # use fake expressions with the right result type
             if node.star_arg:
                 self.mark_assignment(
-                    node.star_arg, TypedExprNode(Builtin.tuple_type))
+                    node.star_arg, TypedExprNode(Builtin.tuple_type, node.pos))
             if node.starstar_arg:
                 self.mark_assignment(
-                    node.starstar_arg, TypedExprNode(Builtin.dict_type))
+                    node.starstar_arg, TypedExprNode(Builtin.dict_type, node.pos))
         EnvTransform.visit_FuncDefNode(self, node)
         return node
@@ -34,7 +34,9 @@ class NonManglingModuleScope(Symtab.ModuleScope):
 class CythonUtilityCodeContext(StringParseContext):
     scope = None

-    def find_module(self, module_name, relative_to=None, pos=None, need_pxd=True):
+    def find_module(self, module_name, relative_to=None, pos=None, need_pxd=True, absolute_fallback=True):
+        if relative_to:
+            raise AssertionError("Relative imports not supported in utility code.")
         if module_name != self.module_name:
             if module_name not in self.modules:
                 raise AssertionError("Only the cython cimport is supported.")
@@ -14,6 +14,7 @@ from . import Nodes
 from . import ExprNodes
 from . import Errors
 from . import DebugFlags
+from . import Future

 import cython
@@ -434,7 +435,7 @@ find_special_method_for_binary_operator = {
     '>': '__gt__',
     '+': '__add__',
     '&': '__and__',
-    '/': '__truediv__',
+    '/': '__div__',
     '//': '__floordiv__',
     '<<': '__lshift__',
     '%': '__mod__',
@@ -515,6 +516,9 @@ class MethodDispatcherTransform(EnvTransform):
             operand1, operand2 = node.operand1, node.operand2
             if special_method_name == '__contains__':
                 operand1, operand2 = operand2, operand1
+            elif special_method_name == '__div__':
+                if Future.division in self.current_env().global_scope().context.future_directives:
+                    special_method_name = '__truediv__'
             obj_type = operand1.type
             if obj_type.is_builtin_type:
                 type_name = obj_type.name
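Mapping `/` to `__div__` by default and upgrading it to `__truediv__` only when the `division` future directive is active mirrors Python 2 operator semantics (PEP 238). A small sketch of the dispatch decision, using a trimmed copy of the operator table (hypothetical `dispatch` helper, future directives modelled as a plain set of names):

```python
# Trimmed copy of the '/'-relevant part of the operator table.
special_method_for_binary_operator = {
    '/': '__div__',
    '//': '__floordiv__',
}

def dispatch(op, future_directives):
    """Pick the special method for a binary operator, honouring 'division'."""
    name = special_method_for_binary_operator[op]
    if name == '__div__' and 'division' in future_directives:
        name = '__truediv__'  # 'from __future__ import division' in effect
    return name

print(dispatch('/', set()))          # classic division
print(dispatch('/', {'division'}))   # true division
```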
...@@ -705,12 +709,17 @@ class PrintTree(TreeVisitor): ...@@ -705,12 +709,17 @@ class PrintTree(TreeVisitor):
"""Prints a representation of the tree to standard output. """Prints a representation of the tree to standard output.
Subclass and override repr_of to provide more information Subclass and override repr_of to provide more information
about nodes. """ about nodes. """
def __init__(self): def __init__(self, start=None, end=None):
TreeVisitor.__init__(self) TreeVisitor.__init__(self)
self._indent = "" self._indent = ""
if start is not None or end is not None:
self._line_range = (start or 0, end or 2**30)
else:
self._line_range = None
def indent(self): def indent(self):
self._indent += " " self._indent += " "
def unindent(self): def unindent(self):
self._indent = self._indent[:-2] self._indent = self._indent[:-2]
@@ -724,15 +733,17 @@ class PrintTree(TreeVisitor):
     # under the parent-node, not displaying the list itself in
     # the hierarchy.
     def visit_Node(self, node):
-        if len(self.access_path) == 0:
-            name = "(root)"
-        else:
-            parent, attr, idx = self.access_path[-1]
-            if idx is not None:
-                name = "%s[%d]" % (attr, idx)
+        line = node.pos[1]
+        if self._line_range is None or self._line_range[0] <= line <= self._line_range[1]:
+            if len(self.access_path) == 0:
+                name = "(root)"
             else:
-                name = attr
-        print("%s- %s: %s" % (self._indent, name, self.repr_of(node)))
+                parent, attr, idx = self.access_path[-1]
+                if idx is not None:
+                    name = "%s[%d]" % (attr, idx)
+                else:
+                    name = attr
+            print("%s- %s: %s" % (self._indent, name, self.repr_of(node)))
         self.indent()
         self.visitchildren(node)
         self.unindent()
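The ``start``/``end`` handling above can be sketched in isolation. Note that ``start or 0`` treats ``start=0`` the same as ``None``, which is harmless here since 0 is the lower bound anyway. A hypothetical stand-alone version of the same filter logic:

```python
def make_line_filter(start=None, end=None):
    """Mirror PrintTree's line-range logic: None on both ends means unbounded."""
    if start is not None or end is not None:
        line_range = (start or 0, end or 2**30)
    else:
        line_range = None

    def wanted(line):
        # Same test as in visit_Node above.
        return line_range is None or line_range[0] <= line <= line_range[1]
    return wanted

f = make_line_filter(start=10, end=20)
print([n for n in (5, 10, 15, 20, 25) if f(n)])  # -> [10, 15, 20]
print(make_line_filter()(999999))                # -> True (no range set)
```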
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
     ctypedef struct va_list
......
-from cpython.ref cimport PyObject
 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject
+from cpython.object cimport PyObject

 cdef extern from "Python.h":
     ctypedef struct PyTypeObject:
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject
 cdef extern from "Python.h":

     #####################################################################
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
+from .object cimport PyObject
+
 cdef extern from "Python.h":
-    ctypedef void PyObject

     ############################################################################
     # 7.5.4 Method Objects
     ############################################################################
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
     ctypedef struct _inittab
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject, PyTypeObject
 from libc.stdio cimport FILE
+cimport cpython.type

 cdef extern from "Python.h":
+    ctypedef struct PyObject  # forward declaration
+
+    ctypedef object (*newfunc)(cpython.type.type, object, object)  # (type, args, kwargs)
+
+    ctypedef object (*unaryfunc)(object)
+    ctypedef object (*binaryfunc)(object, object)
+    ctypedef object (*ternaryfunc)(object, object, object)
+    ctypedef int (*inquiry)(object)
+    ctypedef Py_ssize_t (*lenfunc)(object)
+    ctypedef object (*ssizeargfunc)(object, Py_ssize_t)
+    ctypedef object (*ssizessizeargfunc)(object, Py_ssize_t, Py_ssize_t)
+    ctypedef int (*ssizeobjargproc)(object, Py_ssize_t, object)
+    ctypedef int (*ssizessizeobjargproc)(object, Py_ssize_t, Py_ssize_t, object)
+    ctypedef int (*objobjargproc)(object, object, object)
+    ctypedef int (*objobjproc)(object, object)
+
+    ctypedef Py_hash_t (*hashfunc)(object)
+    ctypedef object (*reprfunc)(object)
+
+    ctypedef int (*cmpfunc)(object, object)
+    ctypedef object (*richcmpfunc)(object, object, int)
+
+    # The following functions use 'PyObject*' as first argument instead of 'object' to prevent
+    # accidental reference counting when calling them during a garbage collection run.
+    ctypedef void (*destructor)(PyObject*)
+    ctypedef int (*visitproc)(PyObject*, void *)
+    ctypedef int (*traverseproc)(PyObject*, visitproc, void*)
+
+    ctypedef struct PyTypeObject:
+        const char* tp_name
+        const char* tp_doc
+        Py_ssize_t tp_basicsize
+        Py_ssize_t tp_itemsize
+        Py_ssize_t tp_dictoffset
+        unsigned long tp_flags
+
+        newfunc tp_new
+        destructor tp_dealloc
+        traverseproc tp_traverse
+        inquiry tp_clear
+
+        ternaryfunc tp_call
+        hashfunc tp_hash
+        reprfunc tp_str
+        reprfunc tp_repr
+
+        cmpfunc tp_compare
+        richcmpfunc tp_richcompare
+
+        PyTypeObject* tp_base
+
+    ctypedef struct PyObject:
+        Py_ssize_t ob_refcnt
+        PyTypeObject *ob_type
+
+    cdef PyTypeObject *Py_TYPE(object)
+
+    void* PyObject_Malloc(size_t)
+    void* PyObject_Realloc(void *, size_t)
+    void PyObject_Free(void *)

     #####################################################################
     # 6.1 Object Protocol
     #####################################################################
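At the Python level, the two ``PyObject`` header fields declared above are observable through the standard library: ``ob_refcnt`` via ``sys.getrefcount()`` and ``ob_type`` via ``type()``. A small sketch of the correspondence (``Box`` is a hypothetical example class):

```python
import sys

class Box:
    pass

b = Box()
# ob_type is what type() returns; tp_name backs the type's name.
print(type(b) is Box)           # -> True
print(type(b).__name__)         # -> Box
# ob_refcnt backs sys.getrefcount(), which reports one extra
# reference held by its own argument.
print(sys.getrefcount(b) >= 2)  # -> True
```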
......
-from cpython.ref cimport PyObject

 # available since Python 3.1!
......
 # Thread and interpreter state structures and their interfaces

-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
+from .object cimport PyObject, PyTypeObject, Py_TYPE  # legacy imports for re-export
+
 cdef extern from "Python.h":
-    ctypedef struct PyTypeObject:
-        Py_ssize_t tp_basicsize
-        Py_ssize_t tp_itemsize
-        long tp_flags
-
-    ctypedef struct PyObject:
-        Py_ssize_t ob_refcnt
-        PyTypeObject *ob_type
-
-    cdef PyTypeObject *Py_TYPE(object)

     #####################################################################
     # 3. Reference Counts
     #####################################################################
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
     ctypedef struct va_list
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
-from cpython.ref cimport PyObject
+from .object cimport PyObject

 cdef extern from "Python.h":
......
@@ -39,63 +39,63 @@ cdef extern from "<complex>" namespace "std" nogil:
         void imag(T)

     # Return real part
-    T real(complex[T]&)
+    T real[T](complex[T]&)
     long double real(long double)
     double real(double)
     float real(float)

     # Return imaginary part
-    T imag(complex[T]&)
+    T imag[T](complex[T]&)
     long double imag(long double)
     double imag(double)
     float imag(float)

-    T abs(complex[T]&)
-    T arg(complex[T]&)
+    T abs[T](complex[T]&)
+    T arg[T](complex[T]&)
     long double arg(long double)
     double arg(double)
     float arg(float)

-    T norm(complex[T])
+    T norm[T](complex[T])
     long double norm(long double)
     double norm(double)
     float norm(float)

-    complex[T] conj(complex[T]&)
+    complex[T] conj[T](complex[T]&)
     complex[long double] conj(long double)
     complex[double] conj(double)
     complex[float] conj(float)

-    complex[T] proj(complex[T])
+    complex[T] proj[T](complex[T])
     complex[long double] proj(long double)
     complex[double] proj(double)
     complex[float] proj(float)

-    complex[T] polar(T&, T&)
-    complex[T] polar(T&)
+    complex[T] polar[T](T&, T&)
+    complex[T] polar[T](T&)

-    complex[T] exp(complex[T]&)
-    complex[T] log(complex[T]&)
-    complex[T] log10(complex[T]&)
+    complex[T] exp[T](complex[T]&)
+    complex[T] log[T](complex[T]&)
+    complex[T] log10[T](complex[T]&)

-    complex[T] pow(complex[T]&, complex[T]&)
-    complex[T] pow(complex[T]&, T&)
-    complex[T] pow(T&, complex[T]&)
+    complex[T] pow[T](complex[T]&, complex[T]&)
+    complex[T] pow[T](complex[T]&, T&)
+    complex[T] pow[T](T&, complex[T]&)
     # There are some promotion versions too

-    complex[T] sqrt(complex[T]&)
-    complex[T] sin(complex[T]&)
-    complex[T] cos(complex[T]&)
-    complex[T] tan(complex[T]&)
-    complex[T] asin(complex[T]&)
-    complex[T] acos(complex[T]&)
-    complex[T] atan(complex[T]&)
-    complex[T] sinh(complex[T]&)
-    complex[T] cosh(complex[T]&)
-    complex[T] tanh(complex[T]&)
-    complex[T] asinh(complex[T]&)
-    complex[T] acosh(complex[T]&)
-    complex[T] atanh(complex[T]&)
+    complex[T] sqrt[T](complex[T]&)
+    complex[T] sin[T](complex[T]&)
+    complex[T] cos[T](complex[T]&)
+    complex[T] tan[T](complex[T]&)
+    complex[T] asin[T](complex[T]&)
+    complex[T] acos[T](complex[T]&)
+    complex[T] atan[T](complex[T]&)
+    complex[T] sinh[T](complex[T]&)
+    complex[T] cosh[T](complex[T]&)
+    complex[T] tanh[T](complex[T]&)
+    complex[T] asinh[T](complex[T]&)
+    complex[T] acosh[T](complex[T]&)
+    complex[T] atanh[T](complex[T]&)
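These ``std::complex`` helpers have direct counterparts in Python's ``cmath`` module and complex builtins, which is a convenient way to sanity-check expectations (assumed equivalences, not part of the diff):

```python
import cmath

z = 3 + 4j
print(abs(z))            # -> 5.0  (like std::abs)
print(z.conjugate())     # -> (3-4j)  (like std::conj)
r, phi = cmath.polar(z)  # modulus and argument, like std::abs / std::arg
print(round(r, 10))      # -> 5.0
# std::polar(r, phi) rebuilds the complex number from polar form:
print(cmath.rect(r, phi))  # ~ (3+4j), up to floating-point rounding
```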
@@ -12,8 +12,8 @@ cdef extern from "<deque>" namespace "std" nogil:
             iterator operator--()
             bint operator==(reverse_iterator)
             bint operator!=(reverse_iterator)
-        #cppclass const_iterator(iterator):
-        #    pass
+        cppclass const_iterator(iterator):
+            pass
         #cppclass const_reverse_iterator(reverse_iterator):
         #    pass
         deque() except +
@@ -34,11 +34,11 @@ cdef extern from "<deque>" namespace "std" nogil:
         T& at(size_t)
         T& back()
         iterator begin()
-        #const_iterator begin()
+        const_iterator const_begin "begin"()
         void clear()
         bint empty()
         iterator end()
-        #const_iterator end()
+        const_iterator const_end "end"()
         iterator erase(iterator)
         iterator erase(iterator, iterator)
         T& front()
......
@@ -16,10 +16,10 @@ cdef extern from "<list>" namespace "std" nogil:
             reverse_iterator operator--()
             bint operator==(reverse_iterator)
             bint operator!=(reverse_iterator)
-        #cppclass const_iterator(iterator):
-        #    pass
-        #cppclass const_reverse_iterator(reverse_iterator):
-        #    pass
+        cppclass const_iterator(iterator):
+            pass
+        cppclass const_reverse_iterator(reverse_iterator):
+            pass
         list() except +
         list(list&) except +
         list(size_t, T&) except +
@@ -33,11 +33,11 @@ cdef extern from "<list>" namespace "std" nogil:
         void assign(size_t, T&)
         T& back()
         iterator begin()
-        #const_iterator begin()
+        const_iterator const_begin "begin"()
         void clear()
         bint empty()
         iterator end()
-        #const_iterator end()
+        const_iterator const_end "end"()
         iterator erase(iterator)
         iterator erase(iterator, iterator)
         T& front()
@@ -51,11 +51,11 @@ cdef extern from "<list>" namespace "std" nogil:
         void push_back(T&)
         void push_front(T&)
         reverse_iterator rbegin()
-        #const_reverse_iterator rbegin()
+        const_reverse_iterator const_rbegin "rbegin"()
         void remove(T&)
         #void remove_if(UnPred)
         reverse_iterator rend()
-        #const_reverse_iterator rend()
+        const_reverse_iterator const_rend "rend"()
         void resize(size_t, T&)
         void reverse()
         size_t size()
......
@@ -20,8 +20,8 @@ cdef extern from "<map>" namespace "std" nogil:
             iterator operator--()
             bint operator==(reverse_iterator)
             bint operator!=(reverse_iterator)
-        #cppclass const_reverse_iterator(reverse_iterator):
-        #    pass
+        cppclass const_reverse_iterator(reverse_iterator):
+            pass
         map() except +
         map(map&) except +
         #map(key_compare&)
@@ -53,14 +53,14 @@ cdef extern from "<map>" namespace "std" nogil:
         #void insert(input_iterator, input_iterator)
         #key_compare key_comp()
         iterator lower_bound(const T&)
-        #const_iterator lower_bound(const key_type&)
+        const_iterator const_lower_bound "lower_bound"(const T&)
         size_t max_size()
         reverse_iterator rbegin()
-        #const_reverse_iterator rbegin()
+        const_reverse_iterator const_rbegin "rbegin"()
         reverse_iterator rend()
-        #const_reverse_iterator rend()
+        const_reverse_iterator const_rend "rend"()
         size_t size()
         void swap(map&)
         iterator upper_bound(const T&)
-        #const_iterator upper_bound(const key_type&)
+        const_iterator const_upper_bound "upper_bound"(const T&)
         #value_compare value_comp()
@@ -14,10 +14,10 @@ cdef extern from "<set>" namespace "std" nogil:
            iterator operator--()
             bint operator==(reverse_iterator)
             bint operator!=(reverse_iterator)
-        #cppclass const_iterator(iterator):
-        #    pass
-        #cppclass const_reverse_iterator(reverse_iterator):
-        #    pass
+        cppclass const_iterator(iterator):
+            pass
+        cppclass const_reverse_iterator(reverse_iterator):
+            pass
         set() except +
         set(set&) except +
         #set(key_compare&)
@@ -29,32 +29,32 @@ cdef extern from "<set>" namespace "std" nogil:
         bint operator<=(set&, set&)
         bint operator>=(set&, set&)
         iterator begin()
-        #const_iterator begin()
+        const_iterator const_begin "begin"()
         void clear()
         size_t count(const T&)
         bint empty()
         iterator end()
-        #const_iterator end()
+        const_iterator const_end "end"()
         pair[iterator, iterator] equal_range(const T&)
         #pair[const_iterator, const_iterator] equal_range(T&)
         void erase(iterator)
         void erase(iterator, iterator)
         size_t erase(T&)
         iterator find(T&)
-        #const_iterator find(T&)
+        const_iterator const_find "find"(T&)
         pair[iterator, bint] insert(const T&) except +
         iterator insert(iterator, const T&) except +
         #void insert(input_iterator, input_iterator)
         #key_compare key_comp()
         iterator lower_bound(T&)
-        #const_iterator lower_bound(T&)
+        const_iterator const_lower_bound "lower_bound"(T&)
         size_t max_size()
         reverse_iterator rbegin()
-        #const_reverse_iterator rbegin()
+        const_reverse_iterator const_rbegin "rbegin"()
         reverse_iterator rend()
-        #const_reverse_iterator rend()
+        const_reverse_iterator const_rend "rend"()
         size_t size()
         void swap(set&)
         iterator upper_bound(const T&)
-        #const_iterator upper_bound(const T&)
+        const_iterator const_upper_bound "upper_bound"(const T&)
         #value_compare value_comp()
@@ -14,10 +14,10 @@ cdef extern from "<unordered_map>" namespace "std" nogil:
             iterator operator--()
             bint operator==(reverse_iterator)
             bint operator!=(reverse_iterator)
-        #cppclass const_iterator(iterator):
-        #    pass
-        #cppclass const_reverse_iterator(reverse_iterator):
-        #    pass
+        cppclass const_iterator(iterator):
+            pass
+        cppclass const_reverse_iterator(reverse_iterator):
+            pass
         unordered_map() except +
         unordered_map(unordered_map&) except +
         #unordered_map(key_compare&)
@@ -31,34 +31,34 @@ cdef extern from "<unordered_map>" namespace "std" nogil:
         bint operator>=(unordered_map&, unordered_map&)
         U& at(T&)
         iterator begin()
-        #const_iterator begin()
+        const_iterator const_begin "begin"()
         void clear()
         size_t count(T&)
         bint empty()
         iterator end()
-        #const_iterator end()
+        const_iterator const_end "end"()
         pair[iterator, iterator] equal_range(T&)
         #pair[const_iterator, const_iterator] equal_range(key_type&)
         void erase(iterator)
         void erase(iterator, iterator)
         size_t erase(T&)
         iterator find(T&)
-        #const_iterator find(key_type&)
+        const_iterator const_find "find"(T&)
         pair[iterator, bint] insert(pair[T, U])  # XXX pair[T,U]&
         iterator insert(iterator, pair[T, U])  # XXX pair[T,U]&
         #void insert(input_iterator, input_iterator)
         #key_compare key_comp()
         iterator lower_bound(T&)
-        #const_iterator lower_bound(key_type&)
+        const_iterator const_lower_bound "lower_bound"(T&)
         size_t max_size()
         reverse_iterator rbegin()
-        #const_reverse_iterator rbegin()
+        const_reverse_iterator const_rbegin "rbegin"()
         reverse_iterator rend()
-        #const_reverse_iterator rend()
+        const_reverse_iterator const_rend "rend"()
         size_t size()
         void swap(unordered_map&)
         iterator upper_bound(T&)
-        #const_iterator upper_bound(key_type&)
+        const_iterator const_upper_bound "upper_bound"(T&)
         #value_compare value_comp()
         void max_load_factor(float)
         float max_load_factor()
@@ -14,10 +14,10 @@ cdef extern from "<unordered_set>" namespace "std" nogil:
             iterator operator--()
             bint operator==(reverse_iterator)
             bint operator!=(reverse_iterator)
-        #cppclass const_iterator(iterator):
-        #    pass
-        #cppclass const_reverse_iterator(reverse_iterator):
-        #    pass
+        cppclass const_iterator(iterator):
+            pass
+        cppclass const_reverse_iterator(reverse_iterator):
+            pass
         unordered_set() except +
         unordered_set(unordered_set&) except +
         #unordered_set(key_compare&)
@@ -29,32 +29,32 @@ cdef extern from "<unordered_set>" namespace "std" nogil:
         bint operator<=(unordered_set&, unordered_set&)
         bint operator>=(unordered_set&, unordered_set&)
         iterator begin()
-        #const_iterator begin()
+        const_iterator const_begin "begin"()
         void clear()
         size_t count(T&)
         bint empty()
         iterator end()
-        #const_iterator end()
+        const_iterator const_end "end"()
         pair[iterator, iterator] equal_range(T&)
         #pair[const_iterator, const_iterator] equal_range(T&)
         void erase(iterator)
         void erase(iterator, iterator)
         size_t erase(T&)
         iterator find(T&)
-        #const_iterator find(T&)
+        const_iterator const_find "find"(T&)
         pair[iterator, bint] insert(T&)
         iterator insert(iterator, T&)
         #void insert(input_iterator, input_iterator)
         #key_compare key_comp()
         iterator lower_bound(T&)
-        #const_iterator lower_bound(T&)
+        const_iterator const_lower_bound "lower_bound"(T&)
         size_t max_size()
         reverse_iterator rbegin()
-        #const_reverse_iterator rbegin()
+        const_reverse_iterator const_rbegin "rbegin"()
         reverse_iterator rend()
-        #const_reverse_iterator rend()
+        const_reverse_iterator const_rend "rend"()
         size_t size()
         void swap(unordered_set&)
         iterator upper_bound(T&)
-        #const_iterator upper_bound(T&)
+        const_iterator const_upper_bound "upper_bound"(T&)
         #value_compare value_comp()
@@ -24,10 +24,10 @@ cdef extern from "<vector>" namespace "std" nogil:
             bint operator>(reverse_iterator)
             bint operator<=(reverse_iterator)
             bint operator>=(reverse_iterator)
-        #cppclass const_iterator(iterator):
-        #    pass
-        #cppclass const_reverse_iterator(reverse_iterator):
-        #    pass
+        cppclass const_iterator(iterator):
+            pass
+        cppclass const_reverse_iterator(reverse_iterator):
+            pass
         vector() except +
         vector(vector&) except +
         vector(size_t) except +
@@ -46,12 +46,12 @@ cdef extern from "<vector>" namespace "std" nogil:
         T& at(size_t) except +
         T& back()
         iterator begin()
-        #const_iterator begin()
+        const_iterator const_begin "begin"()
         size_t capacity()
         void clear()
         bint empty()
         iterator end()
-        #const_iterator end()
+        const_iterator const_end "end"()
         iterator erase(iterator)
         iterator erase(iterator, iterator)
         T& front()
@@ -62,15 +62,15 @@ cdef extern from "<vector>" namespace "std" nogil:
         void pop_back()
         void push_back(T&) except +
         reverse_iterator rbegin()
-        #const_reverse_iterator rbegin()
+        const_reverse_iterator const_rbegin "rbegin"()
         reverse_iterator rend()
-        #const_reverse_iterator rend()
+        const_reverse_iterator const_rend "rend"()
         void reserve(size_t)
         void resize(size_t) except +
         void resize(size_t, T&) except +
         size_t size()
         void swap(vector&)

         # C++11 methods
         T* data()
         void shrink_to_fit()
@@ -241,7 +241,6 @@ cdef extern from "numpy/arrayobject.h":
             cdef int t
             cdef char* f = NULL
             cdef dtype descr = self.descr
-            cdef list stack
             cdef int offset

             cdef bint hasfields = PyDataType_HASFIELDS(descr)
@@ -788,8 +787,6 @@ cdef inline char* _util_dtypestring(dtype descr, char* f, char* end, int* offset
     # string. The new location in the format string is returned.

     cdef dtype child
-    cdef int delta_offset
-    cdef tuple i
     cdef int endian_detector = 1
     cdef bint little_endian = ((<char*>&endian_detector)[0] != 0)
     cdef tuple fields
@@ -109,17 +109,29 @@ subscript: test | [test] ':' [test] [sliceop]
 sliceop: ':' [test]
 exprlist: (expr|star_expr) (',' (expr|star_expr))* [',']
 testlist: test (',' test)* [',']
-dictorsetmaker: ( (test ':' test (comp_for | (',' test ':' test)* [','])) |
-                  (test (comp_for | (',' test)* [','])) )
+dictorsetmaker: ( ((test ':' test | '**' expr)
+                   (comp_for | (',' (test ':' test | '**' expr))* [','])) |
+                  ((test | star_expr)
+                   (comp_for | (',' (test | star_expr))* [','])) )

 classdef: 'class' PY_NAME ['(' [arglist] ')'] ':' suite

-arglist: (argument ',')* (argument [',']
-                         |'*' test (',' argument)* [',' '**' test]
-                         |'**' test)
+arglist: argument (',' argument)* [',']

 # The reason that keywords are test nodes instead of NAME is that using NAME
 # results in an ambiguity. ast.c makes sure it's a NAME.
-argument: test [comp_for] | test '=' test  # Really [keyword '='] test
+# "test '=' test" is really "keyword '=' test", but we have no such token.
+# These need to be in a single rule to avoid grammar that is ambiguous
+# to our LL(1) parser. Even though 'test' includes '*expr' in star_expr,
+# we explicitly match '*' here, too, to give it proper precedence.
+# Illegal combinations and orderings are blocked in ast.c:
+# multiple (test comp_for) arguments are blocked; keyword unpackings
+# that precede iterable unpackings are blocked; etc.
+argument: ( test [comp_for] |
+            test '=' test |
+            '**' expr |
+            star_expr )

 comp_iter: comp_for | comp_if
 comp_for: 'for' exprlist ('in' or_test | for_from_clause) [comp_iter]
 comp_if: 'if' test_nocond [comp_iter]
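The grammar changes above implement PEP 448's additional unpacking generalizations: multiple ``*``/``**`` unpackings are now allowed in call argument lists and in dict, set, list, and tuple displays. A quick demonstration (requires Python 3.5+; ``tail`` is a throwaway helper):

```python
def tail(*args, **kwargs):
    return args, kwargs

a, b = [1, 2], [3, 4]
d1, d2 = {'x': 1}, {'y': 2}

# Multiple iterable unpackings in one call / display:
print(tail(*a, *b))          # -> ((1, 2, 3, 4), {})
print([*a, *b])              # -> [1, 2, 3, 4]
# Multiple keyword/dict unpackings; later keys win on collision:
print(tail(**d1, **d2))      # -> ((), {'x': 1, 'y': 2})
print({**d1, **d2, 'z': 3})  # -> {'x': 1, 'y': 2, 'z': 3}
```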
......
 # cython.* namespace for pure mode.
-__version__ = "0.22"
+__version__ = "0.23dev"

 # BEGIN shameless copy from Cython/minivect/minitypes.py
@@ -97,7 +97,7 @@ class _EmptyDecoratorAndManager(object):
 cclass = ccall = cfunc = _EmptyDecoratorAndManager()

-returns = lambda type_arg: _EmptyDecoratorAndManager()
+returns = wraparound = boundscheck = lambda arg: _EmptyDecoratorAndManager()

 final = internal = type_version_tag = no_gc_clear = _empty_decorator
......
@@ -858,16 +858,13 @@ static struct __pyx_typeinfo_string __Pyx_TypeInfoToFormat(__Pyx_TypeInfo *type)
     case 'I':
     case 'U':
       if (size == 1)
-        *buf = 'b';
+        *buf = (type->is_unsigned) ? 'B' : 'b';
       else if (size == 2)
-        *buf = 'h';
+        *buf = (type->is_unsigned) ? 'H' : 'h';
       else if (size == 4)
-        *buf = 'i';
+        *buf = (type->is_unsigned) ? 'I' : 'i';
      else if (size == 8)
-        *buf = 'q';
+        *buf = (type->is_unsigned) ? 'Q' : 'q';
-      if (type->is_unsigned)
-        *buf = toupper(*buf);
       break;
     case 'P':
       *buf = 'P';
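The format characters chosen above follow the buffer-protocol convention shared with Python's ``struct`` module, where the uppercase letter is the unsigned variant of the same item size:

```python
import struct

# (signed, unsigned) format pairs by item size, as selected above.
pairs = {1: ('b', 'B'), 2: ('h', 'H'), 4: ('i', 'I'), 8: ('q', 'Q')}
for size, (signed, unsigned) in pairs.items():
    # '<' disables alignment so calcsize reports the raw item size.
    assert struct.calcsize('<' + signed) == size
    assert struct.calcsize('<' + unsigned) == size

# Signed formats accept negative values, unsigned ones reject them:
print(struct.pack('<b', -1))  # -> b'\xff'
try:
    struct.pack('<B', -1)
except struct.error:
    print('B rejects -1')
```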
......
@@ -29,7 +29,7 @@ static PyObject* __Pyx_Globals(void) {
         goto bad;
     for (i = PyList_GET_SIZE(names)-1; i >= 0; i--) {
 #if CYTHON_COMPILING_IN_PYPY
-        PyObject* name = PySequence_GetItem(names, i);
+        PyObject* name = PySequence_ITEM(names, i);
         if (!name)
             goto bad;
 #else
@@ -239,20 +239,64 @@ static CYTHON_INLINE unsigned PY_LONG_LONG __Pyx_abs_longlong(PY_LONG_LONG x) {
 #endif
 }

 //////////////////// pow2.proto ////////////////////

 #define __Pyx_PyNumber_Power2(a, b)  PyNumber_Power(a, b, Py_None)
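``__Pyx_PyNumber_Power2`` is just ternary ``PyNumber_Power`` with ``Py_None`` as the modulus, i.e. plain ``**``. In Python terms (an assumed equivalence, for illustration):

```python
# pow(a, b, None) takes the same ternary entry point with no modulus,
# which is exactly a ** b.
print(pow(2, 10))        # -> 1024
print(pow(2, 10, None))  # -> 1024
print(2 ** 10)           # -> 1024
# With a real modulus, the third argument kicks in:
print(pow(2, 10, 1000))  # -> 24
```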
+//////////////////// object_ord.proto ////////////////////
+//@requires: TypeConversion.c::UnicodeAsUCS4
+
+#if PY_MAJOR_VERSION >= 3
+#define __Pyx_PyObject_Ord(c) \
+    (likely(PyUnicode_Check(c)) ? (long)__Pyx_PyUnicode_AsPy_UCS4(c) : __Pyx__PyObject_Ord(c))
+#else
+#define __Pyx_PyObject_Ord(c) __Pyx__PyObject_Ord(c)
+#endif
+static long __Pyx__PyObject_Ord(PyObject* c); /*proto*/
+
+//////////////////// object_ord ////////////////////
+
+static long __Pyx__PyObject_Ord(PyObject* c) {
+    Py_ssize_t size;
+    if (PyBytes_Check(c)) {
+        size = PyBytes_GET_SIZE(c);
+        if (likely(size == 1)) {
+            return (unsigned char) PyBytes_AS_STRING(c)[0];
+        }
+#if PY_MAJOR_VERSION < 3
+    } else if (PyUnicode_Check(c)) {
+        return (long)__Pyx_PyUnicode_AsPy_UCS4(c);
+#endif
+#if (!CYTHON_COMPILING_IN_PYPY) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE))
+    } else if (PyByteArray_Check(c)) {
+        size = PyByteArray_GET_SIZE(c);
+        if (likely(size == 1)) {
+            return (unsigned char) PyByteArray_AS_STRING(c)[0];
+        }
+#endif
+    } else {
+        // FIXME: support character buffers - but CPython doesn't support them either
+        PyErr_Format(PyExc_TypeError,
+            "ord() expected string of length 1, but %.200s found", c->ob_type->tp_name);
+        return (long)(Py_UCS4)-1;
+    }
+    PyErr_Format(PyExc_TypeError,
+        "ord() expected a character, but string of length %zd found", size);
+    return (long)(Py_UCS4)-1;
+}
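The helper above mirrors the behaviour of the builtin ``ord()``: a length-1 string, ``bytes``, or ``bytearray`` yields its code point or byte value, and anything else raises ``TypeError`` through one of the two error paths. Pure-Python checks of the same contract:

```python
# ord() accepts any length-1 str, bytes, or bytearray:
print(ord('a'))              # -> 97
print(ord(b'a'))             # -> 97
print(ord(bytearray(b'a')))  # -> 97
print(ord('\u20ac'))         # -> 8364  (euro sign)

# Wrong lengths and wrong types hit the TypeError paths above:
for bad in ('ab', b'', 42):
    try:
        ord(bad)
    except TypeError:
        print(type(bad).__name__, 'rejected')
```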
 //////////////////// py_dict_keys.proto ////////////////////

 static CYTHON_INLINE PyObject* __Pyx_PyDict_Keys(PyObject* d); /*proto*/

 //////////////////// py_dict_keys ////////////////////
-//@requires: ObjectHandling.c::PyObjectCallMethod1

 static CYTHON_INLINE PyObject* __Pyx_PyDict_Keys(PyObject* d) {
     if (PY_MAJOR_VERSION >= 3)
-        return __Pyx_PyObject_CallMethod1((PyObject*)&PyDict_Type, PYIDENT("keys"), d);
+        return CALL_UNBOUND_METHOD(PyDict_Type, "keys", d);
     else
         return PyDict_Keys(d);
 }
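``CALL_UNBOUND_METHOD(PyDict_Type, "keys", d)`` is the C-level spelling of calling the unbound method ``dict.keys`` on ``d``, avoiding a per-call attribute lookup on the instance. The Python-level equivalence:

```python
d = {'a': 1, 'b': 2}
# Calling the unbound method on the type equals the bound call:
print(list(dict.keys(d)))        # -> ['a', 'b']
print(dict.keys(d) == d.keys())  # -> True
# The same pattern covers the values() and items() helpers below:
print(list(dict.values(d)))      # -> [1, 2]
print(list(dict.items(d)))       # -> [('a', 1), ('b', 2)]
```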
@@ -262,11 +306,10 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_Keys(PyObject* d) {
 static CYTHON_INLINE PyObject* __Pyx_PyDict_Values(PyObject* d); /*proto*/

 //////////////////// py_dict_values ////////////////////
-//@requires: ObjectHandling.c::PyObjectCallMethod1

 static CYTHON_INLINE PyObject* __Pyx_PyDict_Values(PyObject* d) {
     if (PY_MAJOR_VERSION >= 3)
-        return __Pyx_PyObject_CallMethod1((PyObject*)&PyDict_Type, PYIDENT("values"), d);
+        return CALL_UNBOUND_METHOD(PyDict_Type, "values", d);
     else
         return PyDict_Values(d);
 }
...@@ -276,11 +319,10 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_Values(PyObject* d) { ...@@ -276,11 +319,10 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_Values(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_Items(PyObject* d); /*proto*/

//////////////////// py_dict_items ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_Items(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "items", d);
    else
        return PyDict_Items(d);
}

@@ -290,10 +332,12 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_Items(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_IterKeys(PyObject* d); /*proto*/

//////////////////// py_dict_iterkeys ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_IterKeys(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "keys", d);
    else
        return CALL_UNBOUND_METHOD(PyDict_Type, "iterkeys", d);
}

//////////////////// py_dict_itervalues.proto ////////////////////

@@ -301,10 +345,12 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_IterKeys(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_IterValues(PyObject* d); /*proto*/

//////////////////// py_dict_itervalues ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_IterValues(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "values", d);
    else
        return CALL_UNBOUND_METHOD(PyDict_Type, "itervalues", d);
}

//////////////////// py_dict_iteritems.proto ////////////////////

@@ -312,10 +358,12 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_IterValues(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_IterItems(PyObject* d); /*proto*/

//////////////////// py_dict_iteritems ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_IterItems(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "items", d);
    else
        return CALL_UNBOUND_METHOD(PyDict_Type, "iteritems", d);
}

//////////////////// py_dict_viewkeys.proto ////////////////////

@@ -326,10 +374,12 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_IterItems(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewKeys(PyObject* d); /*proto*/

//////////////////// py_dict_viewkeys ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewKeys(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "keys", d);
    else
        return CALL_UNBOUND_METHOD(PyDict_Type, "viewkeys", d);
}

//////////////////// py_dict_viewvalues.proto ////////////////////

@@ -340,10 +390,12 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewKeys(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewValues(PyObject* d); /*proto*/

//////////////////// py_dict_viewvalues ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewValues(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "values", d);
    else
        return CALL_UNBOUND_METHOD(PyDict_Type, "viewvalues", d);
}

//////////////////// py_dict_viewitems.proto ////////////////////

@@ -354,10 +406,12 @@ static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewValues(PyObject* d) {
static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewItems(PyObject* d); /*proto*/

//////////////////// py_dict_viewitems ////////////////////

static CYTHON_INLINE PyObject* __Pyx_PyDict_ViewItems(PyObject* d) {
    if (PY_MAJOR_VERSION >= 3)
        return CALL_UNBOUND_METHOD(PyDict_Type, "items", d);
    else
        return CALL_UNBOUND_METHOD(PyDict_Type, "viewitems", d);
}

//////////////////// pyfrozenset_new.proto ////////////////////

@@ -396,3 +450,34 @@ static CYTHON_INLINE PyObject* __Pyx_PyFrozenSet_New(PyObject* it) {
    return PyObject_Call((PyObject*)&PyFrozenSet_Type, $empty_tuple, NULL);
#endif
}
//////////////////// PySet_Update.proto ////////////////////
static CYTHON_INLINE int __Pyx_PySet_Update(PyObject* set, PyObject* it); /*proto*/
//////////////////// PySet_Update ////////////////////
static CYTHON_INLINE int __Pyx_PySet_Update(PyObject* set, PyObject* it) {
PyObject *retval;
#if CYTHON_COMPILING_IN_CPYTHON
if (PyAnySet_Check(it)) {
if (PySet_GET_SIZE(it) == 0)
return 0;
// fast and safe case: CPython will update our result set and return it
retval = PySet_Type.tp_as_number->nb_inplace_or(set, it);
if (likely(retval == set)) {
Py_DECREF(retval);
return 0;
}
if (unlikely(!retval))
return -1;
// unusual result, fall through to set.update() call below
Py_DECREF(retval);
}
#endif
retval = CALL_UNBOUND_METHOD(PySet_Type, "update", set, it);
if (unlikely(!retval)) return -1;
Py_DECREF(retval);
return 0;
}
/////////////// CDivisionWarning.proto ///////////////
static int __Pyx_cdivision_warning(const char *, int); /* proto */
/////////////// CDivisionWarning ///////////////
static int __Pyx_cdivision_warning(const char *filename, int lineno) {
#if CYTHON_COMPILING_IN_PYPY
// avoid compiler warnings
filename++; lineno++;
return PyErr_Warn(PyExc_RuntimeWarning,
"division with oppositely signed operands, C and Python semantics differ");
#else
return PyErr_WarnExplicit(PyExc_RuntimeWarning,
"division with oppositely signed operands, C and Python semantics differ",
filename,
lineno,
__Pyx_MODULE_NAME,
NULL);
#endif
}
/////////////// DivInt.proto ///////////////
static CYTHON_INLINE %(type)s __Pyx_div_%(type_name)s(%(type)s, %(type)s); /* proto */
/////////////// DivInt ///////////////
static CYTHON_INLINE %(type)s __Pyx_div_%(type_name)s(%(type)s a, %(type)s b) {
%(type)s q = a / b;
%(type)s r = a - q*b;
q -= ((r != 0) & ((r ^ b) < 0));
return q;
}
/////////////// ModInt.proto ///////////////
static CYTHON_INLINE %(type)s __Pyx_mod_%(type_name)s(%(type)s, %(type)s); /* proto */
/////////////// ModInt ///////////////
static CYTHON_INLINE %(type)s __Pyx_mod_%(type_name)s(%(type)s a, %(type)s b) {
%(type)s r = a %% b;
r += ((r != 0) & ((r ^ b) < 0)) * b;
return r;
}
/////////////// ModFloat.proto ///////////////
static CYTHON_INLINE %(type)s __Pyx_mod_%(type_name)s(%(type)s, %(type)s); /* proto */
/////////////// ModFloat ///////////////
static CYTHON_INLINE %(type)s __Pyx_mod_%(type_name)s(%(type)s a, %(type)s b) {
%(type)s r = fmod%(math_h_modifier)s(a, b);
r += ((r != 0) & ((r < 0) ^ (b < 0))) * b;
return r;
}
/////////////// IntPow.proto ///////////////
static CYTHON_INLINE %(type)s %(func_name)s(%(type)s, %(type)s); /* proto */
/////////////// IntPow ///////////////
static CYTHON_INLINE %(type)s %(func_name)s(%(type)s b, %(type)s e) {
%(type)s t = b;
switch (e) {
case 3:
t *= b;
case 2:
t *= b;
case 1:
return t;
case 0:
return 1;
}
#if %(signed)s
if (unlikely(e<0)) return 0;
#endif
t = 1;
while (likely(e)) {
t *= (b * (e&1)) | ((~e)&1); /* 1 or b */
b *= b;
e >>= 1;
}
return t;
}
@@ -245,17 +245,27 @@ __Pyx_CyFunction_get_code(__pyx_CyFunctionObject *op)
static int
__Pyx_CyFunction_init_defaults(__pyx_CyFunctionObject *op) {
    int result = 0;
    PyObject *res = op->defaults_getter((PyObject *) op);
    if (unlikely(!res))
        return -1;

    // Cache result
#if CYTHON_COMPILING_IN_CPYTHON
    op->defaults_tuple = PyTuple_GET_ITEM(res, 0);
    Py_INCREF(op->defaults_tuple);
    op->defaults_kwdict = PyTuple_GET_ITEM(res, 1);
    Py_INCREF(op->defaults_kwdict);
#else
    op->defaults_tuple = PySequence_ITEM(res, 0);
    if (unlikely(!op->defaults_tuple)) result = -1;
    else {
        op->defaults_kwdict = PySequence_ITEM(res, 1);
        if (unlikely(!op->defaults_kwdict)) result = -1;
    }
#endif
    Py_DECREF(res);
    return result;
}
static int

@@ -566,21 +576,21 @@ __Pyx_CyFunction_repr(__pyx_CyFunctionObject *op)
// PyPy does not have this function
static PyObject * __Pyx_CyFunction_Call(PyObject *func, PyObject *arg, PyObject *kw) {
    PyCFunctionObject* f = (PyCFunctionObject*)func;
    PyCFunction meth = f->m_ml->ml_meth;
    PyObject *self = f->m_self;
    Py_ssize_t size;

    switch (f->m_ml->ml_flags & (METH_VARARGS | METH_KEYWORDS | METH_NOARGS | METH_O)) {
    case METH_VARARGS:
        if (likely(kw == NULL || PyDict_Size(kw) == 0))
            return (*meth)(self, arg);
        break;
    case METH_VARARGS | METH_KEYWORDS:
        return (*(PyCFunctionWithKeywords)meth)(self, arg, kw);
    case METH_NOARGS:
        if (likely(kw == NULL || PyDict_Size(kw) == 0)) {
            size = PyTuple_GET_SIZE(arg);
            if (likely(size == 0))
                return (*meth)(self, NULL);
            PyErr_Format(PyExc_TypeError,
                "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)",

@@ -589,10 +599,15 @@ static PyObject * __Pyx_CyFunction_Call(PyObject *func, PyObject *arg, PyObject
        }
        break;
    case METH_O:
        if (likely(kw == NULL || PyDict_Size(kw) == 0)) {
            size = PyTuple_GET_SIZE(arg);
            if (likely(size == 1)) {
                PyObject *result, *arg0 = PySequence_ITEM(arg, 0);
                if (unlikely(!arg0)) return NULL;
                result = (*meth)(self, arg0);
                Py_DECREF(arg0);
                return result;
            }
            PyErr_Format(PyExc_TypeError,
                "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)",
                f->m_ml->ml_name, size);
@@ -720,21 +735,30 @@ static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *func, Py
}
//////////////////// CyFunctionClassCell.proto ////////////////////

static CYTHON_INLINE int __Pyx_CyFunction_InitClassCell(PyObject *cyfunctions, PyObject *classobj);

//////////////////// CyFunctionClassCell ////////////////////
//@requires: CythonFunction

static CYTHON_INLINE int __Pyx_CyFunction_InitClassCell(PyObject *cyfunctions, PyObject *classobj) {
    Py_ssize_t i, count = PyList_GET_SIZE(cyfunctions);

    for (i = 0; i < count; i++) {
        __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *)
#if CYTHON_COMPILING_IN_CPYTHON
            PyList_GET_ITEM(cyfunctions, i);
#else
            PySequence_ITEM(cyfunctions, i);
        if (unlikely(!m))
            return -1;
#endif
        Py_INCREF(classobj);
        m->func_classobj = classobj;
#if !CYTHON_COMPILING_IN_CPYTHON
        Py_DECREF((PyObject*)m);
#endif
    }
    return 0;
}
//////////////////// FusedFunction.proto ////////////////////

@@ -886,9 +910,15 @@ __pyx_FusedFunction_getitem(__pyx_FusedFunctionObject *self, PyObject *idx)
        return NULL;

    for (i = 0; i < n; i++) {
#if CYTHON_COMPILING_IN_CPYTHON
        PyObject *item = PyTuple_GET_ITEM(idx, i);
#else
        PyObject *item = PySequence_ITEM(idx, i);
#endif
        string = _obj_to_str(item);
#if !CYTHON_COMPILING_IN_CPYTHON
        Py_DECREF(item);
#endif
        if (!string || PyList_Append(list, string) < 0)
            goto __pyx_err;

@@ -998,12 +1028,19 @@ __pyx_FusedFunction_call(PyObject *func, PyObject *args, PyObject *kw)
        return NULL;

    self = binding_func->self;
#if !CYTHON_COMPILING_IN_CPYTHON
    Py_INCREF(self);
#endif
    Py_INCREF(self);
    PyTuple_SET_ITEM(new_args, 0, self);

    for (i = 0; i < argc; i++) {
#if CYTHON_COMPILING_IN_CPYTHON
        PyObject *item = PyTuple_GET_ITEM(args, i);
        Py_INCREF(item);
#else
        PyObject *item = PySequence_ITEM(args, i);  if (unlikely(!item)) goto bad;
#endif
        PyTuple_SET_ITEM(new_args, i + 1, item);
    }

@@ -1014,30 +1051,42 @@ __pyx_FusedFunction_call(PyObject *func, PyObject *args, PyObject *kw)
            PyErr_SetString(PyExc_TypeError, "Need at least one argument, 0 given.");
            return NULL;
        }
#if CYTHON_COMPILING_IN_CPYTHON
        self = PyTuple_GET_ITEM(args, 0);
#else
        self = PySequence_ITEM(args, 0);  if (unlikely(!self)) return NULL;
#endif
    }

    if (self && !is_classmethod && !is_staticmethod) {
        int is_instance = PyObject_IsInstance(self, binding_func->type);
        if (unlikely(!is_instance)) {
            PyErr_Format(PyExc_TypeError,
                         "First argument should be of type %.200s, got %.200s.",
                         ((PyTypeObject *) binding_func->type)->tp_name,
                         self->ob_type->tp_name);
            goto bad;
        } else if (unlikely(is_instance == -1)) {
            goto bad;
        }
    }
#if !CYTHON_COMPILING_IN_CPYTHON
    Py_XDECREF(self);
    self = NULL;
#endif

    if (binding_func->__signatures__) {
        PyObject *tup = PyTuple_Pack(4, binding_func->__signatures__, args,
                                     kw == NULL ? Py_None : kw,
                                     binding_func->func.defaults_tuple);
        if (!tup)
            goto bad;

        new_func = (__pyx_FusedFunctionObject *) __pyx_FusedFunction_callfunction(func, tup, NULL);
        Py_DECREF(tup);
        if (!new_func)
            goto bad;

        Py_XINCREF(binding_func->func.func_classobj);
        Py_CLEAR(new_func->func.func_classobj);

@@ -1047,7 +1096,10 @@ __pyx_FusedFunction_call(PyObject *func, PyObject *args, PyObject *kw)
    }

    result = __pyx_FusedFunction_callfunction(func, args, kw);

bad:
#if !CYTHON_COMPILING_IN_CPYTHON
    Py_XDECREF(self);
#endif
    Py_XDECREF(new_args);
    Py_XDECREF((PyObject *) new_func);
    return result;
@@ -144,11 +144,15 @@ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject

    if (value && PyExceptionInstance_Check(value)) {
        instance_class = (PyObject*) Py_TYPE(value);
        if (instance_class != type) {
            int is_subclass = PyObject_IsSubclass(instance_class, type);
            if (!is_subclass) {
                instance_class = NULL;
            } else if (unlikely(is_subclass == -1)) {
                // error on subclass test
                goto bad;
            } else {
                // believe the instance
                type = instance_class;
            }
        }
    }
@@ -110,6 +110,17 @@ static void __Pyx_RaiseDoubleKeywordsError(
}

//////////////////// RaiseMappingExpected.proto ////////////////////

static void __Pyx_RaiseMappingExpectedError(PyObject* arg); /*proto*/

//////////////////// RaiseMappingExpected ////////////////////

static void __Pyx_RaiseMappingExpectedError(PyObject* arg) {
    PyErr_Format(PyExc_TypeError, "'%.200s' object is not a mapping", Py_TYPE(arg)->tp_name);
}

//////////////////// KeywordStringCheck.proto ////////////////////

static CYTHON_INLINE int __Pyx_CheckKeywordStrings(PyObject *kwdict, const char* function_name, int kw_allowed); /*proto*/

@@ -290,3 +301,58 @@ invalid_keyword:
bad:
    return -1;
}
//////////////////// MergeKeywords.proto ////////////////////
static int __Pyx_MergeKeywords(PyObject *kwdict, PyObject *source_mapping); /*proto*/
//////////////////// MergeKeywords ////////////////////
//@requires: RaiseDoubleKeywords
//@requires: Optimize.c::dict_iter
static int __Pyx_MergeKeywords(PyObject *kwdict, PyObject *source_mapping) {
PyObject *iter, *key = NULL, *value = NULL;
int source_is_dict, result;
Py_ssize_t orig_length, ppos = 0;
iter = __Pyx_dict_iterator(source_mapping, 0, PYIDENT("items"), &orig_length, &source_is_dict);
if (unlikely(!iter)) {
// slow fallback: try converting to dict, then iterate
PyObject *args;
if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad;
PyErr_Clear();
args = PyTuple_Pack(1, source_mapping);
if (likely(args)) {
PyObject *fallback = PyObject_Call((PyObject*)&PyDict_Type, args, NULL);
Py_DECREF(args);
if (likely(fallback)) {
iter = __Pyx_dict_iterator(fallback, 1, PYIDENT("items"), &orig_length, &source_is_dict);
Py_DECREF(fallback);
}
}
if (unlikely(!iter)) goto bad;
}
while (1) {
result = __Pyx_dict_iter_next(iter, orig_length, &ppos, &key, &value, NULL, source_is_dict);
if (unlikely(result < 0)) goto bad;
if (!result) break;
if (unlikely(PyDict_Contains(kwdict, key))) {
__Pyx_RaiseDoubleKeywordsError("function", key);
result = -1;
} else {
result = PyDict_SetItem(kwdict, key, value);
}
Py_DECREF(key);
Py_DECREF(value);
if (unlikely(result < 0)) goto bad;
}
Py_XDECREF(iter);
return 0;
bad:
Py_XDECREF(iter);
return -1;
}
@@ -40,6 +40,10 @@
#define CYTHON_COMPILING_IN_CPYTHON 1
#endif

#if !defined(CYTHON_USE_PYLONG_INTERNALS) && CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x02070000
#define CYTHON_USE_PYLONG_INTERNALS 1
#endif

#if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag)
#define Py_OptimizeFlag 0
#endif

@@ -59,16 +63,16 @@
  #define __Pyx_DefaultClassType PyType_Type
#endif

#ifndef Py_TPFLAGS_CHECKTYPES
  #define Py_TPFLAGS_CHECKTYPES 0
#endif
#ifndef Py_TPFLAGS_HAVE_INDEX
  #define Py_TPFLAGS_HAVE_INDEX 0
#endif
#ifndef Py_TPFLAGS_HAVE_NEWBUFFER
  #define Py_TPFLAGS_HAVE_NEWBUFFER 0
#endif
#ifndef Py_TPFLAGS_HAVE_FINALIZE
  #define Py_TPFLAGS_HAVE_FINALIZE 0
#endif

@@ -96,13 +100,14 @@
#if CYTHON_COMPILING_IN_PYPY
  #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b)
  #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b)
#else
  #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b)
  #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ? \
      PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b))
#endif

#if CYTHON_COMPILING_IN_PYPY && !defined(PyUnicode_Contains)
  #define PyUnicode_Contains(u, s) PySequence_Contains(u, s)
#endif

#define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None)) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b))

@@ -257,6 +262,14 @@ class __Pyx_FakeReference {
# endif
#endif

#ifndef CYTHON_NCP_UNUSED
# if CYTHON_COMPILING_IN_CPYTHON
#  define CYTHON_NCP_UNUSED
# else
#  define CYTHON_NCP_UNUSED CYTHON_UNUSED
# endif
#endif

typedef struct {PyObject **p; char *s; const Py_ssize_t n; const char* encoding;
                const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; /*proto*/
def pylong_join(count, digits_ptr='digits', join_type='unsigned long'):
"""
Generate an unrolled shift-then-or loop over the first 'count' digits.
Assumes that they fit into 'join_type'.
(((d[2] << n) | d[1]) << n) | d[0]
"""
return ('(' * (count * 2) + "(%s)" % join_type + ' | '.join(
"%s[%d])%s)" % (digits_ptr, _i, " << PyLong_SHIFT" if _i else '')
for _i in range(count-1, -1, -1)))
# although it could potentially make use of data independence,
# this implementation is a bit slower than the simpler one above
def _pylong_join(count, digits_ptr='digits', join_type='unsigned long'):
"""
Generate an or-ed series of shifts for the first 'count' digits.
Assumes that they fit into 'join_type'.
(d[2] << 2*n) | (d[1] << 1*n) | d[0]
"""
def shift(n):
# avoid compiler warnings for overly large shifts that will be discarded anyway
return " << (%d * PyLong_SHIFT < 8 * sizeof(%s) ? %d * PyLong_SHIFT : 0)" % (n, join_type, n) if n else ''
return '(%s)' % ' | '.join(
"(((%s)%s[%d])%s)" % (join_type, digits_ptr, i, shift(i))
for i in range(count-1, -1, -1))