Commit 60676a6c authored by Robert Bradshaw

Merge branch 'master' into grammar

parents aec028fd c52c6a6e
......@@ -22,12 +22,14 @@ before_install:
- sudo apt-get install gdb python$( python -c 'import sys; print("%d.%d" % sys.version_info[:2])' )-dbg || true
- dpkg -l | grep gdb || true
install: CFLAGS="-O2 -ggdb" pip install .
install:
- CFLAGS="-O2 -ggdb -Wall -Wextra $(python -c 'import sys; print("-fno-strict-aliasing" if sys.version_info[0] == 2 else "")')" python setup.py build
script:
- PYTHON_DBG="python$( python -c 'import sys; print("%d.%d" % sys.version_info[:2])' )-dbg"
- if $PYTHON_DBG -V >&2; then CFLAGS="-O0 -ggdb" $PYTHON_DBG runtests.py -vv Debugger --backends=$BACKEND; fi
- CFLAGS="-O0 -ggdb" python runtests.py -vv -x Debugger --backends=$BACKEND
- CFLAGS="-O2 -ggdb -Wall -Wextra" python setup.py build_ext -i
- CFLAGS="-O0 -ggdb -Wall -Wextra" python runtests.py -vv -x Debugger --backends=$BACKEND
matrix:
allow_failures:
......@@ -38,4 +40,3 @@ matrix:
env: BACKEND=cpp
- python: pypy3
env: BACKEND=cpp
fast_finish: true
......@@ -2,9 +2,64 @@
Cython Changelog
================
Latest
======
Features added
--------------
* C functions can coerce to Python functions, which allows passing them
around as callable objects (see the sketch after this list).
* Extern C functions can now be declared as cpdef to export them to
the module's Python namespace. Extern C functions in pxd files export
their values to their own module, iff it exists.
* Missing C-API declarations in ``cpython.unicode`` were added.
* Passing ``language='c++'`` into cythonize() globally enables C++ mode for
all modules that were not passed as Extension objects (i.e. only source
files and file patterns).
* ``Py_hash_t`` is a known type (used in CPython for hash values).
* ``PySlice_*()`` C-API functions are available from the ``cpython.slice``
module.
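A minimal sketch of the C-function-to-Python-callable coercion mentioned in the list above (a hypothetical module, assuming this release):

    # coerce_demo.pyx -- illustrative only
    cdef double square(double x):
        return x * x

    def get_square():
        # Returning the cdef function coerces it into a Python callable
        # object, so callers can pass it around and invoke it later.
        return square

Calling ``get_square()(3.0)`` from Python would then dispatch to the underlying C function.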
Bugs fixed
----------
* Mismatching 'except' declarations on signatures in .pxd and .pyx files failed
to produce a compile error.
* Reference leak for non-simple Python expressions in boolean and/or expressions.
* To fix a name collision and to reflect availability on host platforms,
standard C declarations (clock(), time(), struct tm and the tm* functions)
were moved from posix/time.pxd to a new libc/time.pxd (see the cimport
sketch after this list).
* Rerunning unmodified modules in IPython's cython support failed.
Patch by Matthias Bussonier.
* Casting C++ ``std::string`` to Python byte strings failed when
auto-decoding was enabled.
* Fatal exceptions in global module init code could lead to crashes
if the already created module was used later on (e.g. through a
stale reference in sys.modules or elsewhere).
* Allow arrays of C++ classes.
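For illustration, a hedged sketch of the new cimport location (the helper name is made up):

    from libc.time cimport clock

    def ticks():
        # clock() is now declared in libc/time.pxd rather than posix/time.pxd
        return clock()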
Other changes
-------------
* Compilation no longer fails hard when unknown compilation options are passed.
Instead, it raises a warning and ignores them (as it did silently before 0.21).
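A rough sketch of the new behaviour (the misspelled keyword below is deliberate and hypothetical):

    from Cython.Build import cythonize

    # A typo such as 'anotate' used to abort compilation with a ValueError;
    # it is now reported as a warning and ignored.
    extensions = cythonize("example.pyx", anotate=True)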
0.21 (2014-09-10)
=================
Features added
--------------
......@@ -55,6 +110,9 @@ Features added
* Defines dynamic_cast et al. in ``libcpp.cast`` and C++ heap data
structure operations in ``libcpp.algorithm``.
* Shipped header declarations in ``posix.*`` were extended to cover
more of the POSIX API. Patches by Lars Buitinck and Mark Peek.
Optimizations
-------------
......@@ -80,6 +138,12 @@ Optimizations
Bugs fixed
----------
* Crash when assigning memory views from ternary conditional expressions.
* Nested C++ templates could lead to unseparated ">>" characters being
generated into the C++ declarations, which older C++ compilers could
not parse.
* Sending SIGINT (Ctrl-C) during parallel cythonize() builds could
hang the child processes.
......@@ -122,9 +186,15 @@ Bugs fixed
* Fix infinite recursion when using super with cpdef methods.
* No-args ``dir()`` was not guaranteed to return a sorted list.
Other changes
-------------
* The header line in the generated C files no longer contains the
timestamp but only the Cython version that wrote it. This was
changed to make builds more reproducible.
* Removed support for CPython 2.4, 2.5 and 3.1.
* The licensing implications on the generated code were clarified
......
......@@ -145,6 +145,8 @@ def parse_args(args):
help='set a cythonize option')
parser.add_option('-3', dest='python3_mode', action='store_true',
help='use Python 3 syntax mode by default')
parser.add_option('-a', '--annotate', dest='annotate', action='store_true',
help='generate annotated HTML page for source files')
parser.add_option('-x', '--exclude', metavar='PATTERN', dest='excludes',
action='append', default=[],
......@@ -188,6 +190,9 @@ def main(args=None):
Options.error_on_unknown_names = False
Options.error_on_uninitialized = False
if options.annotate:
Options.annotate = True
for path in paths:
cython_compile(path, options)
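The new ``--annotate`` flag roughly corresponds to the following programmatic setting (a sketch based on the lines above; invoking it as ``cythonize -a src/*.pyx`` assumes the installed script name):

    from Cython.Compiler import Options

    # What -a/--annotate switches on: an annotated HTML page is written
    # next to the generated C file for each compiled source.
    Options.annotate = True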
......
......@@ -251,6 +251,7 @@ def strip_string_literals(code, prefix='__Pyx_L'):
in_quote = False
hash_mark = single_q = double_q = -1
code_len = len(code)
quote_type = quote_len = None
while True:
if hash_mark < q:
......@@ -260,7 +261,8 @@ def strip_string_literals(code, prefix='__Pyx_L'):
if double_q < q:
double_q = code.find('"', q)
q = min(single_q, double_q)
if q == -1: q = max(single_q, double_q)
if q == -1:
q = max(single_q, double_q)
# We're done.
if q == -1 and hash_mark == -1:
......@@ -276,7 +278,8 @@ def strip_string_literals(code, prefix='__Pyx_L'):
if k % 2 == 0:
q += 1
continue
if code[q] == quote_type and (quote_len == 1 or (code_len > q + 2 and quote_type == code[q+1] == code[q+2])):
if code[q] == quote_type and (
quote_len == 1 or (code_len > q + 2 and quote_type == code[q+1] == code[q+2])):
counter += 1
label = "%s%s_" % (prefix, counter)
literals[label] = code[start+quote_len:q]
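For context, a hedged sketch of how this helper is typically called (the input string is arbitrary):

    from Cython.Build.Dependencies import strip_string_literals

    code, literals = strip_string_literals('name = "value"  # trailing comment')
    # 'code' now holds placeholder labels such as __Pyx_L1_ where the string
    # and comment contents used to be; 'literals' maps each label back to the text.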
......@@ -586,7 +589,8 @@ def create_dependency_tree(ctx=None, quiet=False):
# This may be useful for advanced users?
def create_extension_list(patterns, exclude=[], ctx=None, aliases=None, quiet=False, exclude_failures=False):
def create_extension_list(patterns, exclude=[], ctx=None, aliases=None, quiet=False, language=None,
exclude_failures=False):
if not isinstance(patterns, (list, tuple)):
patterns = [patterns]
explicit_modules = set([m.name for m in patterns if isinstance(m, Extension)])
......@@ -606,6 +610,7 @@ def create_extension_list(patterns, exclude=[], ctx=None, aliases=None, quiet=Fa
name = '*'
base = None
exn_type = Extension
ext_language = language
elif isinstance(pattern, Extension):
for filepattern in pattern.sources:
if os.path.splitext(filepattern)[1] in ('.py', '.pyx'):
......@@ -618,6 +623,7 @@ def create_extension_list(patterns, exclude=[], ctx=None, aliases=None, quiet=Fa
name = template.name
base = DistutilsInfo(exn=template)
exn_type = template.__class__
ext_language = None # do not override whatever the Extension says
else:
raise TypeError(pattern)
......@@ -661,6 +667,9 @@ def create_extension_list(patterns, exclude=[], ctx=None, aliases=None, quiet=Fa
depends = list(set(template.depends).union(set(depends)))
kwds['depends'] = depends
if ext_language and 'language' not in kwds:
kwds['language'] = ext_language
module_list.append(exn_type(
name=module_name,
sources=sources,
......@@ -671,7 +680,7 @@ def create_extension_list(patterns, exclude=[], ctx=None, aliases=None, quiet=Fa
# This is the user-exposed entry point.
def cythonize(module_list, exclude=[], nthreads=0, aliases=None, quiet=False, force=False,
def cythonize(module_list, exclude=[], nthreads=0, aliases=None, quiet=False, force=False, language=None,
exclude_failures=False, **options):
"""
Compile a set of source modules into C/C++ files and return a list of distutils
......@@ -684,6 +693,11 @@ def cythonize(module_list, exclude=[], nthreads=0, aliases=None, quiet=False, fo
When using glob patterns, you can exclude certain module names explicitly
by passing them into the 'exclude' option.
To globally enable C++ mode, you can pass language='c++'. Otherwise, this
will be determined at a per-file level based on compiler directives. This
affects only modules found based on file names. Extension instances passed
into cythonize() will not be changed.
For parallel compilation, set the 'nthreads' option to the number of
concurrent builds.
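A minimal setup.py sketch of the ``language`` argument described above (file patterns are placeholders):

    from distutils.core import setup
    from Cython.Build import cythonize

    setup(
        # modules matched by the glob are compiled as C++; explicit
        # Extension() instances keep whatever language they already declare
        ext_modules=cythonize("src/*.pyx", language='c++'),
    )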
......@@ -711,6 +725,7 @@ def cythonize(module_list, exclude=[], nthreads=0, aliases=None, quiet=False, fo
ctx=ctx,
quiet=quiet,
exclude_failures=exclude_failures,
language=language,
aliases=aliases)
deps = create_dependency_tree(ctx, quiet=quiet)
build_dir = getattr(options, 'build_dir', None)
......
......@@ -214,7 +214,7 @@ class CythonMagics(Magics):
code = cell if cell.endswith('\n') else cell+'\n'
lib_dir = os.path.join(get_ipython_cache_dir(), 'cython')
quiet = True
key = code, sys.version_info, sys.executable, cython_version
key = code, line, sys.version_info, sys.executable, cython_version
if not os.path.exists(lib_dir):
os.makedirs(lib_dir)
......
......@@ -12,6 +12,13 @@ try:
except:
__test__ = False
try:
# disable IPython history thread to avoid having to clean it up
from IPython.core.history import HistoryManager
HistoryManager.enabled = False
except ImportError:
pass
from Cython.TestUtils import CythonTest
ip = get_ipython()
......
......@@ -150,6 +150,10 @@ class EmbedSignature(CythonTransform):
self.class_node = oldclass
return node
def visit_LambdaNode(self, node):
# lambda expressions do not have a signature or inner functions
return node
def visit_DefNode(self, node):
if not self.current_directives['embedsignature']:
return node
......
......@@ -46,7 +46,6 @@ non_portable_builtins_map = {
'basestring' : ('PY_MAJOR_VERSION >= 3', 'str'),
'xrange' : ('PY_MAJOR_VERSION >= 3', 'range'),
'raw_input' : ('PY_MAJOR_VERSION >= 3', 'input'),
'BaseException' : ('PY_VERSION_HEX < 0x02050000', 'Exception'),
}
basicsize_builtins_map = {
......@@ -1920,7 +1919,7 @@ class CCodeWriter(object):
if entry.is_special:
method_flags += [method_coexist]
self.putln(
'{__Pyx_NAMESTR("%s"), (PyCFunction)%s, %s, __Pyx_DOCSTR(%s)}%s' % (
'{"%s", (PyCFunction)%s, %s, %s}%s' % (
entry.name,
entry.func_cname,
"|".join(method_flags),
......
......@@ -1384,10 +1384,8 @@ class UnicodeNode(ConstNode):
data_cname,
data_cname,
code.error_goto_if_null(self.result_code, self.pos)))
code.putln("#if CYTHON_PEP393_ENABLED")
code.put_error_if_neg(
self.pos, "PyUnicode_READY(%s)" % self.result_code)
code.putln("#endif")
self.pos, "__Pyx_PyUnicode_READY(%s)" % self.result_code)
else:
self.result_code = code.get_py_string_const(self.value)
else:
......@@ -4254,19 +4252,6 @@ class SliceNode(ExprNode):
if self.is_literal:
code.put_giveref(self.py_result())
def __deepcopy__(self, memo):
"""
There is a copy bug in python 2.4 for slice objects.
"""
return SliceNode(
self.pos,
start=copy.deepcopy(self.start, memo),
stop=copy.deepcopy(self.stop, memo),
step=copy.deepcopy(self.step, memo),
is_temp=self.is_temp,
is_literal=self.is_literal,
constant_result=self.constant_result)
class CallNode(ExprNode):
......@@ -7533,7 +7518,7 @@ class BoundMethodNode(ExprNode):
def generate_result_code(self, code):
code.putln(
"%s = PyMethod_New(%s, %s, (PyObject*)%s->ob_type); %s" % (
"%s = __Pyx_PyMethod_New(%s, %s, (PyObject*)%s->ob_type); %s" % (
self.result(),
self.function.py_result(),
self.self_object.py_result(),
......@@ -7565,7 +7550,7 @@ class UnboundMethodNode(ExprNode):
def generate_result_code(self, code):
class_cname = code.pyclass_stack[-1].classobj.result()
code.putln(
"%s = PyMethod_New(%s, 0, %s); %s" % (
"%s = __Pyx_PyMethod_New(%s, 0, %s); %s" % (
self.result(),
self.function.py_result(),
class_cname,
......@@ -7798,7 +7783,7 @@ class PyCFunctionNode(ExprNode, ModuleNameMixin):
self.get_py_qualified_name(code),
self.self_result_code(),
self.get_py_mod_name(code),
"PyModule_GetDict(%s)" % Naming.module_cname,
Naming.moddict_cname,
code_object_result,
code.error_goto_if_null(self.result(), self.pos)))
......@@ -9923,7 +9908,7 @@ class BoolBinopNode(ExprNode):
operator=self.operator,
operand1=operand1, operand2=operand2)
def generate_bool_evaluation_code(self, code, final_result_temp, and_label, or_label, end_label):
def generate_bool_evaluation_code(self, code, final_result_temp, and_label, or_label, end_label, fall_through):
code.mark_pos(self.pos)
outer_labels = (and_label, or_label)
......@@ -9931,20 +9916,21 @@ class BoolBinopNode(ExprNode):
my_label = and_label = code.new_label('next_and')
else:
my_label = or_label = code.new_label('next_or')
self.operand1.generate_bool_evaluation_code(code, final_result_temp, and_label, or_label, end_label)
self.operand1.generate_bool_evaluation_code(
code, final_result_temp, and_label, or_label, end_label, my_label)
and_label, or_label = outer_labels
code.put_label(my_label)
self.operand2.generate_bool_evaluation_code(code, final_result_temp, and_label, or_label, end_label)
self.operand2.generate_bool_evaluation_code(
code, final_result_temp, and_label, or_label, end_label, fall_through)
def generate_evaluation_code(self, code):
self.allocate_temp_result(code)
or_label = and_label = None
end_label = code.new_label('bool_binop_done')
self.generate_bool_evaluation_code(code, self.result(), and_label, or_label, end_label)
if code.label_used(end_label):
code.put_label(end_label)
self.generate_bool_evaluation_code(code, self.result(), and_label, or_label, end_label, end_label)
code.put_label(end_label)
gil_message = "Truth-testing Python object"
......@@ -10028,7 +10014,7 @@ class BoolBinopResultNode(ExprNode):
test_result = self.arg.result()
return (test_result, self.arg.type.is_pyobject)
def generate_bool_evaluation_code(self, code, final_result_temp, and_label, or_label, end_label):
def generate_bool_evaluation_code(self, code, final_result_temp, and_label, or_label, end_label, fall_through):
code.mark_pos(self.pos)
# x => x
......@@ -10040,31 +10026,43 @@ class BoolBinopResultNode(ExprNode):
self.arg.generate_evaluation_code(code)
if and_label or or_label:
test_result, uses_temp = self.generate_operand_test(code)
if uses_temp and (and_label and or_label):
# cannot become final result => free early
# disposal: uses_temp and (and_label and or_label)
self.arg.generate_disposal_code(code)
sense = '!' if or_label else ''
code.putln("if (%s%s) {" % (sense, test_result))
if uses_temp:
code.funcstate.release_temp(test_result)
self.arg.generate_disposal_code(code)
if not uses_temp or not (and_label and or_label):
# disposal: (not uses_temp) or {not (and_label and or_label) [if]}
self.arg.generate_disposal_code(code)
if or_label:
if or_label and or_label != fall_through:
# value is false => short-circuit to next 'or'
code.put_goto(or_label)
code.putln("} else {")
if and_label:
# value is true => go to next 'and'
code.put_goto(and_label)
if not or_label:
if or_label:
code.putln("} else {")
if not uses_temp:
# disposal: (not uses_temp) and {(and_label and or_label) [else]}
self.arg.generate_disposal_code(code)
if and_label != fall_through:
code.put_goto(and_label)
if not and_label or not or_label:
# if no next 'and' or 'or', we provide the result
if and_label or or_label:
code.putln("} else {")
self.value.generate_evaluation_code(code)
self.value.make_owned_reference(code)
code.putln("%s = %s;" % (final_result_temp, self.value.result()))
self.value.generate_post_assignment_code(code)
# disposal: {not (and_label and or_label) [else]}
self.arg.generate_disposal_code(code)
self.value.free_temps(code)
if and_label or or_label:
if end_label != fall_through:
code.put_goto(end_label)
if and_label or or_label:
......@@ -10156,7 +10154,10 @@ class CondExprNode(ExprNode):
def eval_and_get(self, code, expr):
expr.generate_evaluation_code(code)
expr.make_owned_reference(code)
if self.type.is_memoryviewslice:
expr.make_owned_memoryviewslice(code)
else:
expr.make_owned_reference(code)
code.putln('%s = %s;' % (self.result(), expr.result_as(self.ctype())))
expr.generate_post_assignment_code(code)
expr.free_temps(code)
......@@ -11172,9 +11173,9 @@ class CoerceToPyTypeNode(CoercionNode):
func = arg_type.to_py_function
if arg_type.is_string or arg_type.is_cpp_string:
if self.type in (bytes_type, str_type, unicode_type):
func = func.replace("Object", self.type.name.title())
func = func.replace("Object", self.type.name.title(), 1)
elif self.type is bytearray_type:
func = func.replace("Object", "ByteArray")
func = func.replace("Object", "ByteArray", 1)
funccall = "%s(%s)" % (func, self.arg.result())
code.putln('%s = %s; %s' % (
......
......@@ -634,7 +634,7 @@ def check_definitions(flow, compiler_directives):
for entry in flow.entries:
if (not entry.cf_references
and not entry.is_pyclass_attr):
if entry.name != '_':
if entry.name != '_' and not entry.name.startswith('unused'):
# '_' is often used for unused variables, e.g. in loops
if entry.is_arg:
if warn_unused_arg:
......
......@@ -7,7 +7,7 @@ from __future__ import absolute_import
import os
import re
import sys
import codecs
import io
if sys.version_info[:2] < (2, 6) or (3, 0) <= sys.version_info[:2] < (3, 2):
sys.stderr.write("Sorry, Cython requires Python 2.6+ or 3.2+, found %d.%d\n" % tuple(sys.version_info[:2]))
......@@ -22,10 +22,12 @@ from . import Errors
from .Scanning import PyrexScanner, FileSourceDescriptor
from .Errors import PyrexError, CompileError, error, warning
from .Symtab import ModuleScope
from .. import __version__ as version
from .. import Utils
from . import Options
from . import Version # legacy import needed by old PyTables versions
version = Version.version # legacy attribute - use "Cython.__version__" instead
module_name_pattern = re.compile(r"[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_]*)*$")
verbose = 0
......@@ -364,10 +366,9 @@ class Context(object):
return ".".join(names)
def setup_errors(self, options, result):
Errors.reset() # clear any remaining error state
Errors.reset() # clear any remaining error state
if options.use_listing_file:
result.listing_file = Utils.replace_suffix(source, ".lis")
path = result.listing_file
path = result.listing_file = Utils.replace_suffix(result.main_source_file, ".lis")
else:
path = None
Errors.open_listing_file(path=path,
......@@ -432,9 +433,10 @@ def run_pipeline(source, options, full_module_name=None, context=None):
# By default, decide based on whether an html file already exists.
html_filename = os.path.splitext(result.c_file)[0] + ".html"
if os.path.exists(html_filename):
line = codecs.open(html_filename, "r", encoding="UTF-8").readline()
if line.startswith(u'<!-- Generated by Cython'):
options.annotate = True
with io.open(html_filename, "r", encoding="UTF-8") as html_file:
line = html_file.readline()
if line.startswith(u'<!-- Generated by Cython'):
options.annotate = True
# Get pipeline
if source_ext.lower() == '.py' or not source_ext:
......@@ -504,11 +506,14 @@ class CompilationOptions(object):
# ignore valid options that are not in the defaults
unknown_options.difference_update(['include_path'])
if unknown_options:
raise ValueError("got unexpected compilation option%s: %s" % (
# TODO: make this a hard error in 0.22
message = "got unknown compilation option%s, please remove: %s" % (
's' if len(unknown_options) > 1 else '',
', '.join(unknown_options)))
', '.join(unknown_options))
import warnings
warnings.warn(message)
directives = dict(options['compiler_directives']) # copy mutable field
directives = dict(options['compiler_directives']) # copy mutable field
options['compiler_directives'] = directives
if 'language_level' in directives and 'language_level' not in kw:
options['language_level'] = int(directives['language_level'])
......
......@@ -439,13 +439,16 @@ def get_is_contig_utility(c_contig, ndim):
return utility
def copy_src_to_dst_cname():
return "__pyx_memoryview_copy_contents"
def verify_direct_dimensions(node):
for access, packing in node.type.axes:
if access != 'direct':
error(self.pos, "All dimensions must be direct")
error(node.pos, "All dimensions must be direct")
def copy_broadcast_memview_src_to_dst(src, dst, code):
"""
......@@ -662,7 +665,7 @@ def get_axes_specs(env, axes):
if entry.name in view_constant_to_access_packing:
axes_specs.append(view_constant_to_access_packing[entry.name])
else:
raise CompilerError(axis.step.pos, INVALID_ERR)
raise CompileError(axis.step.pos, INVALID_ERR)
else:
raise CompileError(axis.step.pos, INVALID_ERR)
......
......@@ -1894,7 +1894,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
code.putln(
"PyVarObject_HEAD_INIT(0, 0)")
code.putln(
'__Pyx_NAMESTR("%s.%s"), /*tp_name*/' % (
'"%s.%s", /*tp_name*/' % (
self.full_module_name, scope.class_name))
if type.typedef_flag:
objstruct = type.objstruct_cname
......@@ -1933,7 +1933,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
env.getset_table_cname)
for entry in env.property_entries:
if entry.doc:
doc_code = "__Pyx_DOCSTR(%s)" % code.get_string_const(entry.doc)
doc_code = "%s" % code.get_string_const(entry.doc)
else:
doc_code = "0"
code.putln(
......@@ -2040,8 +2040,10 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
env.use_utility_code(UtilityCode.load("CheckBinaryVersion", "ModuleSetupCode.c"))
code.putln("if ( __Pyx_check_binary_version() < 0) %s" % code.error_goto(self.pos))
code.putln("%s = PyTuple_New(0); %s" % (Naming.empty_tuple, code.error_goto_if_null(Naming.empty_tuple, self.pos)))
code.putln("%s = PyBytes_FromStringAndSize(\"\", 0); %s" % (Naming.empty_bytes, code.error_goto_if_null(Naming.empty_bytes, self.pos)))
code.putln("%s = PyTuple_New(0); %s" % (
Naming.empty_tuple, code.error_goto_if_null(Naming.empty_tuple, self.pos)))
code.putln("%s = PyBytes_FromStringAndSize(\"\", 0); %s" % (
Naming.empty_bytes, code.error_goto_if_null(Naming.empty_bytes, self.pos)))
code.putln("#ifdef __Pyx_CyFunction_USED")
code.putln("if (__Pyx_CyFunction_init() < 0) %s" % code.error_goto(self.pos))
......@@ -2079,7 +2081,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
EncodedString("__main__"), identifier=True)
code.putln("if (%s%s) {" % (Naming.module_is_main, self.full_module_name.replace('.', '__')))
code.putln(
'if (__Pyx_SetAttrString(%s, "__name__", %s) < 0) %s;' % (
'if (PyObject_SetAttrString(%s, "__name__", %s) < 0) %s;' % (
env.module_cname,
__main__name.cname,
code.error_goto(self.pos)))
......@@ -2140,8 +2142,15 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
for cname, type in code.funcstate.all_managed_temps():
code.put_xdecref(cname, type)
code.putln('if (%s) {' % env.module_cname)
code.putln('if (%s) {' % env.module_dict_cname)
code.put_add_traceback("init %s" % env.qualified_name)
env.use_utility_code(Nodes.traceback_utility_code)
code.globalstate.use_utility_code(Nodes.traceback_utility_code)
# Module reference and module dict are in global variables which might still be needed
# for cleanup, atexit code, etc., so leaking is better than crashing.
# At least clearing the module dict here might be a good idea, but could still break
# user code in atexit or other global registries.
##code.put_decref_clear(env.module_dict_cname, py_object_type, nanny=False)
code.putln('}')
code.put_decref_clear(env.module_cname, py_object_type, nanny=False)
code.putln('} else if (!PyErr_Occurred()) {')
code.putln('PyErr_SetString(PyExc_ImportError, "init %s");' % env.qualified_name)
......@@ -2167,7 +2176,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
module_path = self.pos[0].filename
if module_path:
code.putln('if (__Pyx_SetAttrString(%s, "__file__", %s) < 0) %s;' % (
code.putln('if (PyObject_SetAttrString(%s, "__file__", %s) < 0) %s;' % (
env.module_cname,
code.globalstate.get_py_string_const(
EncodedString(decode_filename(module_path))).cname,
......@@ -2184,7 +2193,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
code.error_goto_if_null(temp, self.pos)))
code.put_gotref(temp)
code.putln(
'if (__Pyx_SetAttrString(%s, "__path__", %s) < 0) %s;' % (
'if (PyObject_SetAttrString(%s, "__path__", %s) < 0) %s;' % (
env.module_cname, temp, code.error_goto(self.pos)))
code.put_decref_clear(temp, py_object_type)
code.funcstate.release_temp(temp)
......@@ -2305,7 +2314,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
def generate_pymoduledef_struct(self, env, code):
if env.doc:
doc = "__Pyx_DOCSTR(%s)" % code.get_string_const(env.doc)
doc = "%s" % code.get_string_const(env.doc)
else:
doc = "0"
if Options.generate_cleanup_code:
......@@ -2322,7 +2331,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
code.putln("#else")
code.putln(" PyModuleDef_HEAD_INIT,")
code.putln("#endif")
code.putln(' __Pyx_NAMESTR("%s"),' % env.module_name)
code.putln(' "%s",' % env.module_name)
code.putln(" %s, /* m_doc */" % doc)
code.putln(" -1, /* m_size */")
code.putln(" %s /* m_methods */," % env.method_table_cname)
......@@ -2337,12 +2346,12 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
# Generate code to create the module object and
# install the builtins.
if env.doc:
doc = "__Pyx_DOCSTR(%s)" % code.get_string_const(env.doc)
doc = "%s" % code.get_string_const(env.doc)
else:
doc = "0"
code.putln("#if PY_MAJOR_VERSION < 3")
code.putln(
'%s = Py_InitModule4(__Pyx_NAMESTR("%s"), %s, %s, 0, PYTHON_API_VERSION); Py_XINCREF(%s);' % (
'%s = Py_InitModule4("%s", %s, %s, 0, PYTHON_API_VERSION); Py_XINCREF(%s);' % (
env.module_cname,
env.module_name,
env.method_table_cname,
......@@ -2362,20 +2371,20 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
code.put_incref(env.module_dict_cname, py_object_type, nanny=False)
code.putln(
'%s = PyImport_AddModule(__Pyx_NAMESTR(__Pyx_BUILTIN_MODULE_NAME)); %s' % (
'%s = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); %s' % (
Naming.builtins_cname,
code.error_goto_if_null(Naming.builtins_cname, self.pos)))
code.putln('#if CYTHON_COMPILING_IN_PYPY')
code.putln('Py_INCREF(%s);' % Naming.builtins_cname)
code.putln('#endif')
code.putln(
'if (__Pyx_SetAttrString(%s, "__builtins__", %s) < 0) %s;' % (
'if (PyObject_SetAttrString(%s, "__builtins__", %s) < 0) %s;' % (
env.module_cname,
Naming.builtins_cname,
code.error_goto(self.pos)))
if Options.pre_import is not None:
code.putln(
'%s = PyImport_AddModule(__Pyx_NAMESTR("%s")); %s' % (
'%s = PyImport_AddModule("%s"); %s' % (
Naming.preimport_cname,
Options.pre_import,
code.error_goto_if_null(Naming.preimport_cname, self.pos)))
......@@ -2401,7 +2410,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
entry.cname))
code.putln(code.error_goto_if_null("wrapped", entry.pos))
code.putln(
'if (__Pyx_SetAttrString(%s, "%s", wrapped) < 0) %s;' % (
'if (PyObject_SetAttrString(%s, "%s", wrapped) < 0) %s;' % (
env.module_cname,
name,
code.error_goto(entry.pos)))
......@@ -2639,7 +2648,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
code.putln('#if CYTHON_COMPILING_IN_CPYTHON')
code.putln("{")
code.putln(
'PyObject *wrapper = __Pyx_GetAttrString((PyObject *)&%s, "%s"); %s' % (
'PyObject *wrapper = PyObject_GetAttrString((PyObject *)&%s, "%s"); %s' % (
typeobj_cname,
func.name,
code.error_goto_if_null('wrapper', entry.pos)))
......@@ -2671,7 +2680,7 @@ class ModuleNode(Nodes.Node, Nodes.BlockNode):
# Cython (such as closures), the 'internal'
# directive is set by users
code.putln(
'if (__Pyx_SetAttrString(%s, "%s", (PyObject *)&%s) < 0) %s' % (
'if (PyObject_SetAttrString(%s, "%s", (PyObject *)&%s) < 0) %s' % (
Naming.module_cname,
scope.class_name,
typeobj_cname,
......@@ -2787,7 +2796,7 @@ import_star_utility_code = """
static int
__Pyx_import_all_from(PyObject *locals, PyObject *v)
{
PyObject *all = __Pyx_GetAttrString(v, "__all__");
PyObject *all = PyObject_GetAttrString(v, "__all__");
PyObject *dict, *name, *value;
int skip_leading_underscores = 0;
int pos, err;
......@@ -2796,7 +2805,7 @@ __Pyx_import_all_from(PyObject *locals, PyObject *v)
if (!PyErr_ExceptionMatches(PyExc_AttributeError))
return -1; /* Unexpected error */
PyErr_Clear();
dict = __Pyx_GetAttrString(v, "__dict__");
dict = PyObject_GetAttrString(v, "__dict__");
if (dict == NULL) {
if (!PyErr_ExceptionMatches(PyExc_AttributeError))
return -1;
......
......@@ -531,7 +531,7 @@ class CArrayDeclaratorNode(CDeclaratorNode):
child_attrs = ["base", "dimension"]
def analyse(self, base_type, env, nonempty = 0):
if base_type.is_cpp_class or base_type.is_cfunction:
if (base_type.is_cpp_class and base_type.is_template_type()) or base_type.is_cfunction:
from .ExprNodes import TupleNode
if isinstance(self.dimension, TupleNode):
args = self.dimension.args
......@@ -1090,7 +1090,7 @@ class TemplatedTypeNode(CBaseTypeNode):
base_type = self.base_type_node.analyse(env)
if base_type.is_error: return base_type
if base_type.is_cpp_class:
if base_type.is_cpp_class and base_type.is_template_type():
# Templated class
if self.keyword_args and self.keyword_args.key_value_pairs:
error(self.pos, "c++ templates cannot take keyword arguments")
......@@ -1271,6 +1271,11 @@ class CVarDefNode(StatNode):
"Non-trivial type declarators in shared declaration (e.g. mix of pointers and values). " +
"Each pointer declaration should be on its own line.", 1)
create_extern_wrapper = (self.overridable
and self.visibility == 'extern'
and env.is_module_scope)
if create_extern_wrapper:
declarator.overridable = False
if isinstance(declarator, CFuncDeclaratorNode):
name_declarator, type = declarator.analyse(base_type, env, directive_locals=self.directive_locals)
else:
......@@ -1296,6 +1301,9 @@ class CVarDefNode(StatNode):
self.entry.directive_locals = copy.copy(self.directive_locals)
if 'staticmethod' in env.directives:
type.is_static_method = True
if create_extern_wrapper:
self.entry.type.create_to_py_utility_code(env)
self.entry.create_wrapper = True
else:
if self.directive_locals:
error(self.pos, "Decorators can only be followed by functions")
......@@ -1321,8 +1329,6 @@ class CStructOrUnionDefNode(StatNode):
child_attrs = ["attributes"]
def declare(self, env, scope=None):
if self.visibility == 'extern' and self.packed and not scope:
error(self.pos, "Cannot declare extern struct as 'packed'")
self.entry = env.declare_struct_or_union(
self.name, self.kind, scope, self.typedef_flag, self.pos,
self.cname, visibility = self.visibility, api = self.api,
......@@ -1585,7 +1591,7 @@ class FuncDefNode(StatNode, BlockNode):
if arg.name in directive_locals:
type_node = directive_locals[arg.name]
other_type = type_node.analyse_as_type(env)
elif isinstance(arg, CArgDeclNode) and arg.annotation:
elif isinstance(arg, CArgDeclNode) and arg.annotation and env.directives['annotation_typing']:
type_node = arg.annotation
other_type = arg.inject_type_from_annotations(env)
if other_type is None:
......
......@@ -1775,7 +1775,8 @@ class InlineDefNodeCalls(Visitor.NodeRefCleanupMixin, Visitor.EnvTransform):
return node
class OptimizeBuiltinCalls(Visitor.MethodDispatcherTransform):
class OptimizeBuiltinCalls(Visitor.NodeRefCleanupMixin,
Visitor.MethodDispatcherTransform):
"""Optimize some common methods calls and instantiation patterns
for builtin types *after* the type analysis phase.
......@@ -2061,17 +2062,18 @@ class OptimizeBuiltinCalls(Visitor.MethodDispatcherTransform):
temps.append(arg)
args.append(arg)
result = ExprNodes.SetNode(node.pos, is_temp=1, args=args)
self.replace(node, result)
for temp in temps[::-1]:
result = UtilNodes.EvalWithTempExprNode(temp, result)
return result
else:
# PySet_New(it) is better than a generic Python call to set(it)
return ExprNodes.PythonCapiCallNode(
return self.replace(node, ExprNodes.PythonCapiCallNode(
node.pos, "PySet_New",
self.PySet_New_func_type,
args=pos_args,
is_temp=node.is_temp,
py_name="set")
py_name="set"))
PyFrozenSet_New_func_type = PyrexTypes.CFuncType(
Builtin.frozenset_type, [
......@@ -2195,7 +2197,7 @@ class OptimizeBuiltinCalls(Visitor.MethodDispatcherTransform):
Builtin.tuple_type : "PyTuple_GET_SIZE",
Builtin.dict_type : "PyDict_Size",
Builtin.set_type : "PySet_Size",
Builtin.frozenset_type : "PySet_Size",
Builtin.frozenset_type : "__Pyx_PyFrozenSet_Size",
}.get
_ext_types_with_pysize = set(["cpython.array.array"])
......@@ -2274,11 +2276,14 @@ class OptimizeBuiltinCalls(Visitor.MethodDispatcherTransform):
if len(pos_args) != 2:
return node
arg, types = pos_args
temp = None
temps = []
if isinstance(types, ExprNodes.TupleNode):
types = types.args
if len(types) == 1 and not types[0].type is Builtin.type_type:
return node # nothing to improve here
if arg.is_attribute or not arg.is_simple():
arg = temp = UtilNodes.ResultRefNode(arg)
arg = UtilNodes.ResultRefNode(arg)
temps.append(arg)
elif types.type is Builtin.type_type:
types = [types]
else:
......@@ -2309,13 +2314,17 @@ class OptimizeBuiltinCalls(Visitor.MethodDispatcherTransform):
type_check_function = '__Pyx_TypeCheck'
type_check_args = [arg, test_type_node]
else:
return node
if not test_type_node.is_literal:
test_type_node = UtilNodes.ResultRefNode(test_type_node)
temps.append(test_type_node)
type_check_function = 'PyObject_IsInstance'
type_check_args = [arg, test_type_node]
test_nodes.append(
ExprNodes.PythonCapiCallNode(
test_type_node.pos, type_check_function, self.Py_type_check_func_type,
args = type_check_args,
is_temp = True,
))
args=type_check_args,
is_temp=True,
))
def join_with_or(a, b, make_binop_node=ExprNodes.binop_node):
or_node = make_binop_node(node.pos, 'or', a, b)
......@@ -2324,7 +2333,7 @@ class OptimizeBuiltinCalls(Visitor.MethodDispatcherTransform):
return or_node
test_node = reduce(join_with_or, test_nodes).coerce_to(node.type, env)
if temp is not None:
for temp in temps[::-1]:
test_node = UtilNodes.EvalWithTempExprNode(temp, test_node)
return test_node
......@@ -3764,7 +3773,8 @@ class FinalOptimizePhase(Visitor.CythonTransform, Visitor.NodeRefCleanupMixin):
function.type = function.entry.type
PyTypeObjectPtr = PyrexTypes.CPtrType(cython_scope.lookup('PyTypeObject').type)
node.args[1] = ExprNodes.CastNode(node.args[1], PyTypeObjectPtr)
elif node.is_temp and function.type.is_pyobject:
elif (self.current_directives.get("optimize.unpack_method_calls")
and node.is_temp and function.type.is_pyobject):
# optimise simple Python methods calls
if isinstance(node.arg_tuple, ExprNodes.TupleNode) and not (
node.arg_tuple.mult_factor or (node.arg_tuple.is_literal and node.arg_tuple.args)):
......
......@@ -131,6 +131,7 @@ directive_defaults = {
# optimizations
'optimize.inline_defnode_calls': True,
'optimize.unpack_method_calls': True, # increases code size when True
'optimize.use_switch': True,
# remove unreachable code
......
......@@ -458,7 +458,7 @@ def p_call_parse_args(s, allow_genexp = True):
s.next()
starstar_arg = p_test(s)
if s.sy == ',':
s.next()
s.next() # FIXME: this is actually not valid Python syntax
s.expect(')')
return positional_args, keyword_args, star_arg, starstar_arg
......@@ -1226,7 +1226,7 @@ def p_pass_statement(s, with_newline = 0):
pos = s.position()
s.expect('pass')
if with_newline:
s.expect_newline("Expected a newline")
s.expect_newline("Expected a newline", ignore_semicolon=True)
return Nodes.PassStatNode(pos)
def p_break_statement(s):
......@@ -1793,7 +1793,7 @@ def p_DEF_statement(s):
value = expr.compile_time_value(denv)
#print "p_DEF_statement: %s = %r" % (name, value) ###
denv.declare(name, value)
s.expect_newline()
s.expect_newline("Expected a newline", ignore_semicolon=True)
return Nodes.PassStatNode(pos)
def p_IF_statement(s, ctx):
......@@ -1950,7 +1950,7 @@ def p_suite_with_docstring(s, ctx, with_doc_only=False):
body = p_simple_statement_list(s, ctx)
else:
body = p_pass_statement(s)
s.expect_newline("Syntax error in declarations")
s.expect_newline("Syntax error in declarations", ignore_semicolon=True)
if not with_doc_only:
doc, body = _extract_docstring(body)
return doc, body
......@@ -2285,6 +2285,7 @@ special_basic_c_types = cython.declare(dict, {
# name : (signed, longness)
"Py_UNICODE" : (0, 0),
"Py_UCS4" : (0, 0),
"Py_hash_t" : (2, 0),
"Py_ssize_t" : (2, 0),
"ssize_t" : (2, 0),
"size_t" : (0, 0),
......@@ -2842,7 +2843,7 @@ def p_c_func_or_var_declaration(s, pos, ctx):
assignable = 1, nonempty = 1)
declarators.append(declarator)
doc_line = s.start_line + 1
s.expect_newline("Syntax error in C variable declaration")
s.expect_newline("Syntax error in C variable declaration", ignore_semicolon=True)
if ctx.level in ('c_class', 'c_class_pxd') and s.start_line == doc_line:
doc = p_doc_string(s)
else:
......@@ -2876,7 +2877,7 @@ def p_ctypedef_statement(s, ctx):
else:
base_type = p_c_base_type(s, nonempty = 1)
declarator = p_c_declarator(s, ctx, is_type = 1, nonempty = 1)
s.expect_newline("Syntax error in ctypedef statement")
s.expect_newline("Syntax error in ctypedef statement", ignore_semicolon=True)
return Nodes.CTypeDefNode(
pos, base_type = base_type,
declarator = declarator,
......@@ -3090,8 +3091,7 @@ def p_ignorable_statement(s):
if s.sy == 'BEGIN_STRING':
pos = s.position()
string_node = p_atom(s)
if s.sy != 'EOF':
s.expect_newline("Syntax error in string")
s.expect_newline("Syntax error in string", ignore_semicolon=True)
return Nodes.ExprStatNode(pos, expr=string_node)
return None
......@@ -3100,8 +3100,7 @@ def p_doc_string(s):
if s.sy == 'BEGIN_STRING':
pos = s.position()
kind, bytes_result, unicode_result = p_cat_string_literal(s)
if s.sy != 'EOF':
s.expect_newline("Syntax error in doc string")
s.expect_newline("Syntax error in doc string", ignore_semicolon=True)
if kind in ('u', ''):
return unicode_result
warning(pos, "Python 3 requires docstrings to be unicode strings")
......
......@@ -4,10 +4,14 @@ import cython
from ..Plex.Scanners cimport Scanner
cdef get_lexicon()
cdef initial_compile_time_env()
cdef class Method:
cdef object name
cdef object __name__
@cython.final
cdef class CompileTimeScope:
cdef public dict entries
cdef public CompileTimeScope outer
......@@ -15,6 +19,7 @@ cdef class CompileTimeScope:
cdef lookup_here(self, name)
cpdef lookup(self, name)
@cython.final
cdef class PyrexScanner(Scanner):
cdef public context
cdef public list included_files
......@@ -51,4 +56,4 @@ cdef class PyrexScanner(Scanner):
cdef expected(self, what, message = *)
cdef expect_indent(self)
cdef expect_dedent(self)
cdef expect_newline(self, message = *)
cdef expect_newline(self, message=*, bint ignore_semicolon=*)
......@@ -5,17 +5,19 @@
from __future__ import absolute_import
import cython
cython.declare(EncodedString=object, make_lexicon=object, lexicon=object,
any_string_prefix=unicode, IDENT=unicode,
print_function=object, error=object, warning=object,
os=object, platform=object)
import os
import platform
import cython
cython.declare(EncodedString=object, any_string_prefix=unicode, IDENT=unicode,
print_function=object)
from .. import Utils
from ..Plex.Scanners import Scanner
from ..Plex.Errors import UnrecognizedInput
from .Errors import error
from .Errors import error, warning
from .Lexicon import any_string_prefix, make_lexicon, IDENT
from .Future import print_function
......@@ -28,12 +30,14 @@ scanner_dump_file = None
lexicon = None
def get_lexicon():
global lexicon
if not lexicon:
lexicon = make_lexicon()
return lexicon
#------------------------------------------------------------------
py_reserved_words = [
......@@ -49,15 +53,17 @@ pyx_reserved_words = py_reserved_words + [
"cimport", "DEF", "IF", "ELIF", "ELSE"
]
class Method(object):
def __init__(self, name):
self.name = name
self.__name__ = name # for Plex tracing
self.__name__ = name # for Plex tracing
def __call__(self, stream, text):
return getattr(stream, self.name)(text)
#------------------------------------------------------------------
class CompileTimeScope(object):
......@@ -88,6 +94,7 @@ class CompileTimeScope(object):
else:
raise
def initial_compile_time_env():
benv = CompileTimeScope()
names = ('UNAME_SYSNAME', 'UNAME_NODENAME', 'UNAME_RELEASE',
......@@ -116,6 +123,7 @@ def initial_compile_time_env():
denv = CompileTimeScope(benv)
return denv
#------------------------------------------------------------------
class SourceDescriptor(object):
......@@ -166,6 +174,7 @@ class SourceDescriptor(object):
except AttributeError:
return False
class FileSourceDescriptor(SourceDescriptor):
"""
Represents a code source. A code source is a more generic abstraction
......@@ -210,7 +219,11 @@ class FileSourceDescriptor(SourceDescriptor):
return lines
def get_description(self):
return self.path_description
try:
return os.path.relpath(self.path_description)
except ValueError:
# path not under current directory => use complete file path
return self.path_description
def get_error_description(self):
path = self.filename
......@@ -231,6 +244,7 @@ class FileSourceDescriptor(SourceDescriptor):
def __repr__(self):
return "<FileSourceDescriptor:%s>" % self.filename
class StringSourceDescriptor(SourceDescriptor):
"""
Instances of this class can be used instead of a filenames if the
......@@ -271,6 +285,7 @@ class StringSourceDescriptor(SourceDescriptor):
def __repr__(self):
return "<StringSourceDescriptor:%s>" % self.name
#------------------------------------------------------------------
class PyrexScanner(Scanner):
......@@ -280,8 +295,8 @@ class PyrexScanner(Scanner):
# compile_time_eval boolean In a true conditional compilation context
# compile_time_expr boolean In a compile-time expression context
def __init__(self, file, filename, parent_scanner = None,
scope = None, context = None, source_encoding=None, parse_comments=True, initial_pos=None):
def __init__(self, file, filename, parent_scanner=None,
scope=None, context=None, source_encoding=None, parse_comments=True, initial_pos=None):
Scanner.__init__(self, get_lexicon(), file, filename, initial_pos)
if parent_scanner:
self.context = parent_scanner.context
......@@ -295,8 +310,7 @@ class PyrexScanner(Scanner):
self.compile_time_env = initial_compile_time_env()
self.compile_time_eval = 1
self.compile_time_expr = 0
if hasattr(context.options, 'compile_time_env') and \
context.options.compile_time_env is not None:
if getattr(context.options, 'compile_time_env', None):
self.compile_time_env.update(context.options.compile_time_env)
self.parse_comments = parse_comments
self.source_encoding = source_encoding
......@@ -322,11 +336,11 @@ class PyrexScanner(Scanner):
return self.indentation_stack[-1]
def open_bracket_action(self, text):
self.bracket_nesting_level = self.bracket_nesting_level + 1
self.bracket_nesting_level += 1
return text
def close_bracket_action(self, text):
self.bracket_nesting_level = self.bracket_nesting_level - 1
self.bracket_nesting_level -= 1
return text
def newline_action(self, text):
......@@ -402,6 +416,7 @@ class PyrexScanner(Scanner):
sy, systring = self.read()
except UnrecognizedInput:
self.error("Unrecognized character")
return # just a marker, error() always raises
if sy == IDENT:
if systring in self.keywords:
if systring == u'print' and print_function in self.context.future_directives:
......@@ -441,21 +456,21 @@ class PyrexScanner(Scanner):
# This method should be added to Plex
self.queue.insert(0, (token, value))
def error(self, message, pos = None, fatal = True):
def error(self, message, pos=None, fatal=True):
if pos is None:
pos = self.position()
if self.sy == 'INDENT':
err = error(pos, "Possible inconsistent indentation")
error(pos, "Possible inconsistent indentation")
err = error(pos, message)
if fatal: raise err
def expect(self, what, message = None):
def expect(self, what, message=None):
if self.sy == what:
self.next()
else:
self.expected(what, message)
def expect_keyword(self, what, message = None):
def expect_keyword(self, what, message=None):
if self.sy == IDENT and self.systring == what:
self.next()
else:
......@@ -472,14 +487,18 @@ class PyrexScanner(Scanner):
self.error("Expected '%s', found '%s'" % (what, found))
def expect_indent(self):
self.expect('INDENT',
"Expected an increase in indentation level")
self.expect('INDENT', "Expected an increase in indentation level")
def expect_dedent(self):
self.expect('DEDENT',
"Expected a decrease in indentation level")
self.expect('DEDENT', "Expected a decrease in indentation level")
def expect_newline(self, message = "Expected a newline"):
def expect_newline(self, message="Expected a newline", ignore_semicolon=False):
# Expect either a newline or end of file
useless_trailing_semicolon = None
if ignore_semicolon and self.sy == ';':
useless_trailing_semicolon = self.position()
self.next()
if self.sy != 'EOF':
self.expect('NEWLINE', message)
if useless_trailing_semicolon is not None:
warning(useless_trailing_semicolon, "useless trailing semicolon")
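As an illustration of the effect (an assumption based on this diff, not verified output):

    # example.pyx -- the trailing ';' was previously a fatal
    # "Syntax error in C variable declaration"; with ignore_semicolon it
    # now only produces a "useless trailing semicolon" warning.
    cdef int count = 0;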
......@@ -303,7 +303,7 @@ class Scope(object):
self.name = name
self.outer_scope = outer_scope
self.parent_scope = parent_scope
mangled_name = "%d%s_" % (len(name), name)
mangled_name = "%d%s_" % (len(name), name.replace('.', '_dot_'))
qual_scope = self.qualifying_scope()
if qual_scope:
self.qualified_name = qual_scope.qualify_name(name)
......@@ -1041,15 +1041,13 @@ class ModuleScope(Scope):
def global_scope(self):
return self
def lookup(self, name):
def lookup(self, name, language_level=None):
entry = self.lookup_here(name)
if entry is not None:
return entry
if self.context is not None:
language_level = self.context.language_level
else:
language_level = 3
if language_level is None:
language_level = self.context.language_level if self.context is not None else 3
return self.outer_scope.lookup(name, language_level=language_level)
......
......@@ -23,16 +23,19 @@ from . import UtilNodes
class StringParseContext(Main.Context):
def __init__(self, name, include_directories=None):
if include_directories is None: include_directories = []
Main.Context.__init__(self, include_directories, {},
def __init__(self, name, include_directories=None, compiler_directives=None):
if include_directories is None:
include_directories = []
if compiler_directives is None:
compiler_directives = {}
Main.Context.__init__(self, include_directories, compiler_directives,
create_testscope=False)
self.module_name = name
def find_module(self, module_name, relative_to = None, pos = None, need_pxd = 1):
def find_module(self, module_name, relative_to=None, pos=None, need_pxd=1):
if module_name not in (self.module_name, 'cython'):
raise AssertionError("Not yet supporting any cimports/includes from string code snippets")
return ModuleScope(module_name, parent_module = None, context = self)
return ModuleScope(module_name, parent_module=None, context=self)
def parse_from_strings(name, code, pxds={}, level=None, initial_pos=None,
......@@ -64,7 +67,7 @@ def parse_from_strings(name, code, pxds={}, level=None, initial_pos=None,
initial_pos = (name, 1, 0)
code_source = StringSourceDescriptor(name, code)
scope = context.find_module(module_name, pos = initial_pos, need_pxd = 0)
scope = context.find_module(module_name, pos=initial_pos, need_pxd=False)
buf = StringIO(code)
......@@ -190,20 +193,27 @@ class TemplateTransform(VisitorTransform):
else:
return self.visit_Node(node)
def copy_code_tree(node):
return TreeCopier()(node)
INDENT_RE = re.compile(ur"^ *")
_match_indent = re.compile(ur"^ *").match
def strip_common_indent(lines):
"Strips empty lines and common indentation from the list of strings given in lines"
"""Strips empty lines and common indentation from the list of strings given in lines"""
# TODO: Facilitate textwrap.indent instead
lines = [x for x in lines if x.strip() != u""]
minindent = min([len(INDENT_RE.match(x).group(0)) for x in lines])
minindent = min([len(_match_indent(x).group(0)) for x in lines])
lines = [x[minindent:] for x in lines]
return lines
class TreeFragment(object):
def __init__(self, code, name="(tree fragment)", pxds={}, temps=[], pipeline=[], level=None, initial_pos=None):
def __init__(self, code, name=None, pxds={}, temps=[], pipeline=[], level=None, initial_pos=None):
if not name:
name = "(tree fragment)"
if isinstance(code, unicode):
def fmt(x): return u"\n".join(strip_common_indent(x.split(u"\n")))
......
......@@ -494,24 +494,22 @@ def find_spanning_type(type1, type2):
return PyrexTypes.c_double_type
return result_type
def aggressive_spanning_type(types, might_overflow, pos):
result_type = reduce(find_spanning_type, types)
def simply_type(result_type, pos):
if result_type.is_reference:
result_type = result_type.ref_base_type
if result_type.is_const:
result_type = result_type.const_base_type
if result_type.is_cpp_class:
result_type.check_nullary_constructor(pos)
if result_type.is_array:
result_type = PyrexTypes.c_ptr_type(result_type.base_type)
return result_type
def aggressive_spanning_type(types, might_overflow, pos):
return simply_type(reduce(find_spanning_type, types), pos)
def safe_spanning_type(types, might_overflow, pos):
result_type = reduce(find_spanning_type, types)
if result_type.is_const:
result_type = result_type.const_base_type
if result_type.is_reference:
result_type = result_type.ref_base_type
if result_type.is_cpp_class:
result_type.check_nullary_constructor(pos)
result_type = simply_type(reduce(find_spanning_type, types), pos)
if result_type.is_pyobject:
# In theory, any specific Python type is always safe to
# infer. However, inferring str can cause some existing code
......
......@@ -417,7 +417,7 @@ class DocStringSlot(SlotDescriptor):
doc = scope.doc.utf8encode()
else:
doc = scope.doc.byteencode()
return '__Pyx_DOCSTR("%s")' % StringEncoding.escape_byte_string(doc)
return '"%s"' % StringEncoding.escape_byte_string(doc)
else:
return "0"
......@@ -738,8 +738,8 @@ PyBufferProcs = (
MethodSlot(segcountproc, "bf_getsegcount", "__getsegcount__", py3 = False),
MethodSlot(charbufferproc, "bf_getcharbuffer", "__getcharbuffer__", py3 = False),
MethodSlot(getbufferproc, "bf_getbuffer", "__getbuffer__", ifdef = "PY_VERSION_HEX >= 0x02060000"),
MethodSlot(releasebufferproc, "bf_releasebuffer", "__releasebuffer__", ifdef = "PY_VERSION_HEX >= 0x02060000")
MethodSlot(getbufferproc, "bf_getbuffer", "__getbuffer__"),
MethodSlot(releasebufferproc, "bf_releasebuffer", "__releasebuffer__")
)
#------------------------------------------------------------------------------------------
......@@ -809,7 +809,7 @@ slot_table = (
EmptySlot("tp_subclasses"),
EmptySlot("tp_weaklist"),
EmptySlot("tp_del"),
EmptySlot("tp_version_tag", ifdef="PY_VERSION_HEX >= 0x02060000"),
EmptySlot("tp_version_tag"),
EmptySlot("tp_finalize", ifdef="PY_VERSION_HEX >= 0x030400a1"),
)
......
......@@ -8,6 +8,8 @@ from . import Code
class NonManglingModuleScope(Symtab.ModuleScope):
cpp = False
def __init__(self, prefix, *args, **kw):
self.prefix = prefix
self.cython_scope = None
......@@ -28,12 +30,11 @@ class NonManglingModuleScope(Symtab.ModuleScope):
else:
return Symtab.ModuleScope.mangle(self, prefix)
class CythonUtilityCodeContext(StringParseContext):
scope = None
def find_module(self, module_name, relative_to = None, pos = None,
need_pxd = 1):
def find_module(self, module_name, relative_to=None, pos=None, need_pxd=True):
if module_name != self.module_name:
if module_name not in self.modules:
raise AssertionError("Only the cython cimport is supported.")
......@@ -41,10 +42,8 @@ class CythonUtilityCodeContext(StringParseContext):
return self.modules[module_name]
if self.scope is None:
self.scope = NonManglingModuleScope(self.prefix,
module_name,
parent_module=None,
context=self)
self.scope = NonManglingModuleScope(
self.prefix, module_name, parent_module=None, context=self)
return self.scope
......@@ -69,7 +68,8 @@ class CythonUtilityCode(Code.UtilityCodeBase):
is_cython_utility = True
def __init__(self, impl, name="__pyxutil", prefix="", requires=None,
file=None, from_scope=None, context=None):
file=None, from_scope=None, context=None, compiler_directives=None,
outer_module_scope=None):
# 1) We need to delay the parsing/processing, so that all modules can be
# imported without import loops
# 2) The same utility code object can be used for multiple source files;
......@@ -84,6 +84,20 @@ class CythonUtilityCode(Code.UtilityCodeBase):
self.prefix = prefix
self.requires = requires or []
self.from_scope = from_scope
self.outer_module_scope = outer_module_scope
self.compiler_directives = compiler_directives
def __eq__(self, other):
if isinstance(other, CythonUtilityCode):
return self._equality_params() == other._equality_params()
else:
return False
def _equality_params(self):
return self.impl, self.outer_module_scope, self.compiler_directives
def __hash__(self):
return hash(self.impl)
def get_tree(self, entries_only=False, cython_scope=None):
from .AnalysedTreeTransforms import AutoTestDictTransform
......@@ -93,12 +107,13 @@ class CythonUtilityCode(Code.UtilityCodeBase):
excludes = [AutoTestDictTransform]
from . import Pipeline, ParseTreeTransforms
context = CythonUtilityCodeContext(self.name)
context = CythonUtilityCodeContext(
self.name, compiler_directives=self.compiler_directives)
context.prefix = self.prefix
context.cython_scope = cython_scope
#context = StringParseContext(self.name)
tree = parse_from_strings(self.name, self.impl, context=context,
allow_struct_enum_decorator=True)
tree = parse_from_strings(
self.name, self.impl, context=context, allow_struct_enum_decorator=True)
pipeline = Pipeline.create_pipeline(context, 'pyx', exclude_classes=excludes)
if entries_only:
......@@ -126,6 +141,16 @@ class CythonUtilityCode(Code.UtilityCodeBase):
pipeline = Pipeline.insert_into_pipeline(pipeline, scope_transform,
before=transform)
if self.outer_module_scope:
# inject outer module between utility code module and builtin module
def scope_transform(module_node):
module_node.scope.outer_scope = self.outer_module_scope
return module_node
transform = ParseTreeTransforms.AnalyseDeclarationsTransform
pipeline = Pipeline.insert_into_pipeline(pipeline, scope_transform,
before=transform)
(err, tree) = Pipeline.run_pipeline(pipeline, tree, printtree=False)
assert not err, err
return tree
......
......@@ -4,7 +4,6 @@ from __future__ import absolute_import
from .. import __version__ as version
# For generated by string.
# For 'generated by' header line in C files.
import time
watermark = "%s on %s" % (version, time.asctime())
watermark = str(version)
......@@ -2,7 +2,7 @@
GDB extension that adds Cython support.
"""
from __future__ import with_statement
from __future__ import print_function
import sys
import textwrap
......@@ -54,6 +54,7 @@ PythonObject = 'PythonObject'
_data_types = dict(CObject=CObject, PythonObject=PythonObject)
_filesystemencoding = sys.getfilesystemencoding() or 'UTF-8'
# decorators
def dont_suppress_errors(function):
......@@ -68,6 +69,7 @@ def dont_suppress_errors(function):
return wrapper
def default_selected_gdb_frame(err=True):
def decorator(function):
@functools.wraps(function)
......@@ -84,6 +86,7 @@ def default_selected_gdb_frame(err=True):
return wrapper
return decorator
def require_cython_frame(function):
@functools.wraps(function)
@require_running_program
......@@ -95,6 +98,7 @@ def require_cython_frame(function):
return function(self, *args, **kwargs)
return wrapper
def dispatch_on_frame(c_command, python_command=None):
def decorator(function):
@functools.wraps(function)
......@@ -115,6 +119,7 @@ def dispatch_on_frame(c_command, python_command=None):
return wrapper
return decorator
def require_running_program(function):
@functools.wraps(function)
def wrapper(*args, **kwargs):
......@@ -152,6 +157,7 @@ class CythonModule(object):
self.lineno_c2cy = {}
self.functions = {}
class CythonVariable(object):
def __init__(self, name, cname, qualified_name, type, lineno):
......@@ -161,6 +167,7 @@ class CythonVariable(object):
self.type = type
self.lineno = int(lineno)
class CythonFunction(CythonVariable):
def __init__(self,
module,
......@@ -297,7 +304,7 @@ class CythonBase(object):
try:
source_desc, lineno = self.get_source_desc(frame)
except NoFunctionNameInFrameError:
print '#%-2d Unknown Frame (compile with -g)' % index
print('#%-2d Unknown Frame (compile with -g)' % index)
return
if not is_c and self.is_python_function(frame):
......@@ -381,10 +388,9 @@ class CythonBase(object):
typename = '(%s) ' % (value.type,)
if max_name_length is None:
print '%s%s = %s%s' % (prefix, name, typename, value)
print('%s%s = %s%s' % (prefix, name, typename, value))
else:
print '%s%-*s = %s%s' % (prefix, max_name_length, name, typename,
value)
print('%s%-*s = %s%s' % (prefix, max_name_length, name, typename, value))
def is_initialized(self, cython_func, local_name):
cyvar = cython_func.locals[local_name]
......@@ -473,6 +479,7 @@ class CyGDBError(gdb.GdbError):
args = args or (self.msg,)
super(CyGDBError, self).__init__(*args)
class NoCythonFunctionInFrameError(CyGDBError):
"""
raised when the user requests the current cython function, which is
......@@ -480,6 +487,7 @@ class NoCythonFunctionInFrameError(CyGDBError):
"""
msg = "Current function is a function cygdb doesn't know about"
class NoFunctionNameInFrameError(NoCythonFunctionInFrameError):
"""
raised when the name of the C function could not be determined
......@@ -509,21 +517,25 @@ class CythonParameter(gdb.Parameter):
__nonzero__ = __bool__ # Python 2
class CompleteUnqualifiedFunctionNames(CythonParameter):
"""
Have 'cy break' complete unqualified function or method names.
"""
class ColorizeSourceCode(CythonParameter):
"""
Tell cygdb whether to colorize source code.
"""
class TerminalBackground(CythonParameter):
"""
Tell cygdb about the user's terminal background (light or dark).
"""
class CythonParameters(object):
"""
Simple container class that might get more functionality in the distant
......@@ -636,7 +648,7 @@ class CyCy(CythonCommand):
cy_eval = CyEval('cy_eval'),
)
for command_name, command in commands.iteritems():
for command_name, command in commands.items():
command.cy = self
setattr(self, command_name, command)
......@@ -672,9 +684,8 @@ class CyImport(CythonCommand):
for arg in string_to_argv(args):
try:
f = open(arg)
except OSError, e:
raise gdb.GdbError('Unable to open file %r: %s' %
(args, e.args[1]))
except OSError as e:
raise gdb.GdbError('Unable to open file %r: %s' % (args, e.args[1]))
t = etree.parse(f)
......@@ -782,9 +793,9 @@ class CyBreak(CythonCommand):
if len(funcs) > 1:
# multiple functions, let the user pick one
print 'There are multiple such functions:'
print('There are multiple such functions:')
for idx, func in enumerate(funcs):
print '%3d) %s' % (idx, func.qualified_name)
print('%3d) %s' % (idx, func.qualified_name))
while True:
try:
......@@ -800,11 +811,11 @@ class CyBreak(CythonCommand):
break_funcs = funcs
break
elif (result.isdigit() and
0 <= int(result) < len(funcs)):
0 <= int(result) < len(funcs)):
break_funcs = [funcs[int(result)]]
break
else:
print 'Not understood...'
print('Not understood...')
else:
break_funcs = [funcs[0]]
......@@ -977,7 +988,7 @@ class CyUp(CythonCommand):
gdb.execute(self._command, to_string=True)
while not self.is_relevant_function(gdb.selected_frame()):
gdb.execute(self._command, to_string=True)
except RuntimeError, e:
except RuntimeError as e:
raise gdb.GdbError(*e.args)
frame = gdb.selected_frame()
......@@ -1020,7 +1031,7 @@ class CySelect(CythonCommand):
try:
gdb.execute('select %d' % (stackdepth - stackno - 1,))
except RuntimeError, e:
except RuntimeError as e:
raise gdb.GdbError(*e.args)
......@@ -1070,7 +1081,7 @@ class CyList(CythonCommand):
sd, lineno = self.get_source_desc()
source = sd.get_source(lineno - 5, lineno + 5, mark_line=lineno,
lex_entire=True)
print source
print(source)
class CyPrint(CythonCommand):
......@@ -1104,7 +1115,8 @@ class CyPrint(CythonCommand):
return []
sortkey = lambda (name, value): name.lower()
sortkey = lambda item: item[0].lower()
class CyLocals(CythonCommand):
"""
......@@ -1157,13 +1169,13 @@ class CyGlobals(CyLocals):
max_name_length = max(max_globals_len, max_globals_dict_len)
seen = set()
print 'Python globals:'
print('Python globals:')
for k, v in sorted(global_python_dict.iteritems(), key=sortkey):
v = v.get_truncated_repr(libpython.MAX_OUTPUT_LEN)
seen.add(k)
print ' %-*s = %s' % (max_name_length, k, v)
print(' %-*s = %s' % (max_name_length, k, v))
print 'C globals:'
print('C globals:')
for name, cyvar in sorted(module_globals.iteritems(), key=sortkey):
if name not in seen:
try:
......@@ -1176,7 +1188,6 @@ class CyGlobals(CyLocals):
max_name_length, ' ')
class EvaluateOrExecuteCodeMixin(object):
"""
Evaluate or execute Python code in a Cython or Python frame. The 'evalcode'
......@@ -1228,7 +1239,6 @@ class EvaluateOrExecuteCodeMixin(object):
raise gdb.GdbError("There is no Cython or Python frame on the stack.")
def _evalcode_cython(self, executor, code, input_type):
with libpython.FetchAndRestoreError():
# get the dict of Cython globals and construct a dict in the
......@@ -1384,6 +1394,7 @@ cython_info = CythonInfo()
cy = CyCy.register()
cython_info.cy = cy
def register_defines():
libpython.source_gdb_script(textwrap.dedent("""\
define cy step
......
This diff is collapsed.
......@@ -246,6 +246,6 @@ cdef extern from "Python.h":
# and the value is clipped to PY_SSIZE_T_MIN for a negative
# integer or PY_SSIZE_T_MAX for a positive integer.
bint PyIndex_Check "__Pyx_PyIndex_Check" (object)
bint PyIndex_Check(object)
# Returns True if o is an index integer (has the nb_index slot of
# the tp_as_number structure filled in).
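For illustration only (not part of this commit): a minimal Cython sketch using the simplified PyIndex_Check declaration above. It assumes the declaration is cimported from cpython.number; the helper name accepts_index is made up.

    from cpython.number cimport PyIndex_Check

    def accepts_index(obj):
        # True for ints and for any object that implements __index__
        return bool(PyIndex_Check(obj))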
cdef extern from "Python.h":
# PyTypeObject PySlice_Type
#
# The type object for slice objects. This is the same as slice and types.SliceType
bint PySlice_Check(object ob)
#
# Return true if ob is a slice object; ob must not be NULL.
slice PySlice_New(object start, object stop, object step)
#
# Return a new slice object with the given values. The start, stop, and step
# parameters are used as the values of the slice object attributes of the same
# names. Any of the values may be NULL, in which case None will be used
# for the corresponding attribute. Return NULL if the new object could not be
# allocated.
int PySlice_GetIndices(object slice, Py_ssize_t length,
Py_ssize_t *start, Py_ssize_t *stop, Py_ssize_t *step) except? -1
#
# Retrieve the start, stop and step indices from the slice object slice,
# assuming a sequence of length length. Treats indices greater than length
# as errors.
#
# Returns 0 on success and -1 on error with no exception set (unless one
# of the indices was not None and failed to be converted to an integer,
# in which case -1 is returned with an exception set).
#
# You probably do not want to use this function.
#
# Changed in version 3.2: The parameter type for the slice parameter was
# PySliceObject* before.
int PySlice_GetIndicesEx(object slice, Py_ssize_t length,
Py_ssize_t *start, Py_ssize_t *stop, Py_ssize_t *step,
Py_ssize_t *slicelength) except -1
#
# Usable replacement for PySlice_GetIndices(). Retrieve the start, stop, and step
# indices from the slice object slice assuming a sequence of length length, and
# store the length of the slice in slicelength. Out of bounds indices are clipped
# in a manner consistent with the handling of normal slices.
#
# Returns 0 on success and -1 on error with exception set.
#
# Changed in version 3.2: The parameter type for the slice parameter was
# PySliceObject* before.
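A hedged usage sketch (not part of this commit) of the new cpython.slice declarations above; the helper name normalize_slice is made up, and errors from PySlice_GetIndicesEx propagate automatically thanks to the 'except -1' declaration.

    from cpython.slice cimport PySlice_Check, PySlice_GetIndicesEx

    def normalize_slice(s, Py_ssize_t length):
        # Resolve a slice object against a sequence of the given length.
        cdef Py_ssize_t start, stop, step, slicelength
        if not PySlice_Check(s):
            raise TypeError("expected a slice object")
        PySlice_GetIndicesEx(s, length, &start, &stop, &step, &slicelength)
        return start, stop, step, slicelength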
This diff is collapsed.
......@@ -4,8 +4,8 @@
#
# if PY_MAJOR_VERSION >= 3:
# do_stuff_in_Py3_0_and_later()
# if PY_VERSION_HEX >= 0x02050000:
# do_stuff_in_Py2_5_and_later()
# if PY_VERSION_HEX >= 0x02070000:
# do_stuff_in_Py2_7_and_later()
#
# than using the IF/DEF statements, which are evaluated at Cython
# compile time. This will keep your C code portable.
......
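A small sketch (not part of this commit) of the runtime-check style that the comment above recommends, assuming the constants are cimported from cpython.version:

    from cpython.version cimport PY_MAJOR_VERSION, PY_VERSION_HEX

    def describe_runtime():
        # Evaluated when the generated C code runs, so it stays portable.
        if PY_MAJOR_VERSION >= 3:
            return "Python 3.0 or later"
        elif PY_VERSION_HEX >= 0x02070000:
            return "Python 2.7 or later"
        return "older Python 2"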
# http://en.wikipedia.org/wiki/C_date_and_time_functions
from libc.stddef cimport wchar_t
cdef extern from "time.h" nogil:
ctypedef long clock_t
ctypedef long time_t
enum: CLOCKS_PER_SEC
clock_t clock() # CPU time
time_t time(time_t *) # wall clock time since Unix epoch
cdef struct tm:
int tm_sec
int tm_min
int tm_hour
int tm_mday
int tm_mon
int tm_year
int tm_wday
int tm_yday
int tm_isdst
char *tm_zone
long tm_gmtoff
int daylight # global state
long timezone
char *tzname[2]
void tzset()
char *asctime(const tm *)
char *asctime_r(const tm *, char *)
char *ctime(const time_t *)
char *ctime_r(const time_t *, char *)
double difftime(time_t, time_t)
tm *getdate(const char *)
tm *gmtime(const time_t *)
tm *gmtime_r(const time_t *, tm *)
tm *localtime(const time_t *)
tm *localtime_r(const time_t *, tm *)
time_t mktime(tm *)
size_t strftime(char *, size_t, const char *, const tm *)
size_t wcsftime(wchar_t *str, size_t cnt, const wchar_t *fmt, tm *time)
# POSIX not stdC
char *strptime(const char *, const char *, tm *)
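A hedged sketch (not part of this commit) using the relocated libc.time declarations above; the buffer size and format string are arbitrary choices, and localtime() is the non-reentrant variant for brevity.

    from libc.time cimport time, time_t, tm, localtime, strftime

    def current_timestamp():
        cdef time_t now = time(NULL)
        cdef tm *info = localtime(&now)
        cdef char buf[64]
        cdef size_t n = strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M:%S", info)
        return buf[:n].decode('ascii')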
# http://pubs.opengroup.org/onlinepubs/009695399/basedefs/sys/mman.h.html
from posix.types cimport off_t, mode_t
cdef extern from "sys/mman.h" nogil:
enum: PROT_EXEC # protection bits for mmap/mprotect
enum: PROT_READ
enum: PROT_WRITE
enum: PROT_NONE
enum: MAP_PRIVATE # flag bits for mmap
enum: MAP_SHARED
enum: MAP_FIXED
enum: MAP_ANON # These three are not in POSIX, but are
enum: MAP_ANONYMOUS # fairly common in spelling/semantics
enum: MAP_STACK
enum: MAP_LOCKED # Typically available only on Linux
enum: MAP_HUGETLB
enum: MAP_POPULATE
enum: MAP_NORESERVE
enum: MAP_GROWSDOWN
enum: MAP_NOCORE # Typically available only on BSD
enum: MAP_NOSYNC
void *mmap(void *addr, size_t Len, int prot, int flags, int fd, off_t off)
int munmap(void *addr, size_t Len)
int mprotect(void *addr, size_t Len, int prot)
enum: MS_ASYNC
enum: MS_SYNC
enum: MS_INVALIDATE
int msync(void *addr, size_t Len, int flags)
enum: POSIX_MADV_NORMAL # POSIX advice flags
enum: POSIX_MADV_SEQUENTIAL
enum: POSIX_MADV_RANDOM
enum: POSIX_MADV_WILLNEED
enum: POSIX_MADV_DONTNEED
int posix_madvise(void *addr, size_t Len, int advice)
enum: MCL_CURRENT
enum: MCL_FUTURE
int mlock(const void *addr, size_t Len)
int munlock(const void *addr, size_t Len)
int mlockall(int flags)
int munlockall()
int shm_open(const char *name, int oflag, mode_t mode)
int shm_unlink(const char *name)
# often available
enum: MADV_REMOVE # pre-POSIX advice flags; often available
enum: MADV_DONTFORK
enum: MADV_DOFORK
enum: MADV_HWPOISON
enum: MADV_MERGEABLE
enum: MADV_UNMERGEABLE
int madvise(void *addr, size_t Len, int advice)
# sometimes available
int mincore(void *addr, size_t Len, unsigned char *vec)
# These two are Linux specific but sometimes very efficient
void *mremap(void *old_addr, size_t old_len, size_t new_len, int flags, ...)
int remap_file_pages(void *addr, size_t Len, int prot,
size_t pgoff, int flags)
# The rare but standardized typed memory option
enum: POSIX_TYPED_MEM_ALLOCATE
enum: POSIX_TYPED_MEM_ALLOCATE_CONTIG
enum: POSIX_TYPED_MEM_MAP_ALLOCATABLE
int posix_typed_mem_open(const char *name, int oflag, int tflag)
int posix_mem_offset(const void *addr, size_t Len, off_t *off,
size_t *contig_len, int *fildes)
cdef struct posix_typed_mem_info:
size_t posix_tmi_length
int posix_typed_mem_get_info(int fildes, posix_typed_mem_info *info)
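A hedged sketch (not part of this commit) exercising the posix.mman declarations above with an anonymous private mapping; MAP_FAILED is not declared here, so the sketch compares against (void *)-1 directly, and error handling is minimal.

    from posix.mman cimport mmap, munmap, PROT_READ, PROT_WRITE, MAP_PRIVATE, MAP_ANONYMOUS

    def anonymous_mapping(size_t length):
        cdef void *addr = mmap(NULL, length, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0)
        if addr == <void *> -1:       # MAP_FAILED
            raise MemoryError("mmap failed")
        (<char *> addr)[0] = 42       # touch the first byte
        value = (<char *> addr)[0]
        munmap(addr, length)
        return value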
# http://pubs.opengroup.org/onlinepubs/009695399/basedefs/sys/resource.h.html
from posix.time cimport timeval
from posix.types cimport id_t
cdef extern from "sys/resource.h" nogil:
......
# http://pubs.opengroup.org/onlinepubs/009695399/basedefs/sys/time.h.html
from posix.types cimport suseconds_t, time_t, clockid_t, timer_t
from posix.signal cimport sigevent
from posix.types cimport clock_t, clockid_t, suseconds_t, time_t, timer_t
cdef extern from "sys/time.h" nogil:
enum: CLOCKS_PER_SEC
enum: CLOCK_PROCESS_CPUTIME_ID
enum: CLOCK_THREAD_CPUTIME_ID
......@@ -37,70 +35,40 @@ cdef extern from "sys/time.h" nogil:
enum: ITIMER_VIRTUAL
enum: ITIMER_PROF
cdef struct timeval:
time_t tv_sec
suseconds_t tv_usec
cdef struct itimerval:
timeval it_interval
timeval it_value
cdef struct timezone:
int tz_minuteswest
int dsttime
cdef struct timeval:
time_t tv_sec
suseconds_t tv_usec
cdef struct timespec:
time_t tv_sec
long tv_nsec
cdef struct itimerval:
timeval it_interval
timeval it_value
cdef struct itimerspec:
timespec it_interval
timespec it_value
cdef struct tm:
int tm_sec
int tm_min
int tm_hour
int tm_mday
int tm_mon
int tm_year
int tm_wday
int tm_yday
int tm_isdst
char *tm_zone
long tm_gmtoff
int nanosleep(const timespec *, timespec *)
int getitimer(int, itimerval *)
int gettimeofday(timeval *tp, timezone *tzp)
int setitimer(int, const itimerval *, itimerval *)
char *asctime(const tm *)
char *asctime_r(const tm *, char *)
clock_t clock()
int clock_getcpuclockid(pid_t, clockid_t *)
int clock_getres(clockid_t, timespec *)
int clock_gettime(clockid_t, timespec *)
int clock_nanosleep(clockid_t, int, const timespec *, timespec *)
int clock_settime(clockid_t, const timespec *)
char *ctime(const time_t *)
char *ctime_r(const time_t *, char *)
double difftime(time_t, time_t)
tm *getdate(const char *)
int getitimer(int, itimerval *)
int gettimeofday(timeval *tp, timezone *tzp)
tm *gmtime(const time_t *)
tm *gmtime_r(const time_t *, tm *)
tm *localtime(const time_t *)
tm *localtime_r(const time_t *, tm *)
time_t mktime(tm *)
int nanosleep(const timespec *, timespec *)
int setitimer(int, const itimerval *, itimerval *)
size_t strftime(char *, size_t, const char *, const tm *)
char *strptime(const char *, const char *, tm *)
time_t time(time_t *)
int timer_create(clockid_t, sigevent *, timer_t *)
int timer_delete(timer_t)
int timer_gettime(timer_t, itimerspec *)
int timer_getoverrun(timer_t)
int timer_settime(timer_t, int, const itimerspec *, itimerspec *)
void tzset()
int clock_getcpuclockid(pid_t, clockid_t *)
int clock_getres(clockid_t, timespec *)
int clock_gettime(clockid_t, timespec *)
int clock_nanosleep(clockid_t, int, const timespec *, timespec *)
int clock_settime(clockid_t, const timespec *)
int daylight
long timezone
char *tzname[2]
int timer_create(clockid_t, sigevent *, timer_t *)
int timer_delete(timer_t)
int timer_gettime(timer_t, itimerspec *)
int timer_getoverrun(timer_t)
int timer_settime(timer_t, int, const itimerspec *, itimerspec *)
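A hedged sketch (not part of this commit) of the clock functions above; it assumes CLOCK_MONOTONIC is among the clock id enums declared earlier in posix/time.pxd (the hunk above shows only a few of them).

    from posix.time cimport clock_gettime, timespec, CLOCK_MONOTONIC

    def monotonic_seconds():
        cdef timespec ts
        clock_gettime(CLOCK_MONOTONIC, &ts)
        return ts.tv_sec + ts.tv_nsec * 1e-9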
cdef extern from "sys/types.h":
ctypedef long blkcnt_t
ctypedef long blksize_t
ctypedef long clock_t
ctypedef long clockid_t
ctypedef long dev_t
ctypedef long gid_t
......
# http://pubs.opengroup.org/onlinepubs/009695399/basedefs/sys/wait.h.html
from posix.types cimport pid_t, id_t
from posix.signal cimport siginfo_t
from posix.resource cimport rusage
cdef extern from "sys/wait.h" nogil:
enum: WNOHANG
enum: WUNTRACED
enum: WCONTINUED
enum: WEXITED
enum: WSTOPPED
enum: WNOWAIT
int WEXITSTATUS(int status)
int WIFCONTINUED(int status)
int WIFEXITED(int status)
int WIFSIGNALED(int status)
int WIFSTOPPED(int status)
int WSTOPSIG(int status)
int WTERMSIG(int status)
ctypedef int idtype_t
enum: P_ALL # idtype_t values
enum: P_PID
enum: P_PGID
pid_t wait(int *stat_loc)
pid_t waitpid(pid_t pid, int *status, int options)
int waitid(idtype_t idtype, id_t id, siginfo_t *infop, int options)
# wait3 was in POSIX until 2008 while wait4 was never standardized.
# Even so, these calls are in almost every Unix, always in sys/wait.h.
# Hence, posix.wait is the least surprising place to declare them for Cython.
# libc may require _XXX_SOURCE to be defined at C-compile time to provide them.
pid_t wait3(int *status, int options, rusage *rusage)
pid_t wait4(pid_t pid, int *status, int options, rusage *rusage)
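A hedged sketch (not part of this commit) combining the posix.wait declarations above with fork()/_exit() as cimported from posix.unistd (an assumption about a header not shown in this diff); the exit code 7 is arbitrary.

    from posix.types cimport pid_t
    from posix.unistd cimport fork, _exit
    from posix.wait cimport waitpid, WIFEXITED, WEXITSTATUS

    def spawn_and_wait():
        cdef int status = 0
        cdef pid_t pid = fork()
        if pid == 0:
            _exit(7)                    # child: exit immediately
        waitpid(pid, &status, 0)
        if WIFEXITED(status):
            return WEXITSTATUS(status)  # 7 in this sketch
        return -1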
......@@ -7,98 +7,101 @@
#=======================================================================
class Action(object):
def perform(self, token_stream, text):
pass # abstract
def perform(self, token_stream, text):
pass # abstract
def same_as(self, other):
return self is other
def same_as(self, other):
return self is other
class Return(Action):
"""
Internal Plex action which causes |value| to
be returned as the value of the associated token
"""
"""
Internal Plex action which causes |value| to
be returned as the value of the associated token
"""
def __init__(self, value):
self.value = value
def __init__(self, value):
self.value = value
def perform(self, token_stream, text):
return self.value
def perform(self, token_stream, text):
return self.value
def same_as(self, other):
return isinstance(other, Return) and self.value == other.value
def same_as(self, other):
return isinstance(other, Return) and self.value == other.value
def __repr__(self):
return "Return(%s)" % repr(self.value)
def __repr__(self):
return "Return(%s)" % repr(self.value)
class Call(Action):
"""
Internal Plex action which causes a function to be called.
"""
"""
Internal Plex action which causes a function to be called.
"""
def __init__(self, function):
self.function = function
def __init__(self, function):
self.function = function
def perform(self, token_stream, text):
return self.function(token_stream, text)
def perform(self, token_stream, text):
return self.function(token_stream, text)
def __repr__(self):
return "Call(%s)" % self.function.__name__
def __repr__(self):
return "Call(%s)" % self.function.__name__
def same_as(self, other):
return isinstance(other, Call) and self.function is other.function
def same_as(self, other):
return isinstance(other, Call) and self.function is other.function
class Begin(Action):
"""
Begin(state_name) is a Plex action which causes the Scanner to
enter the state |state_name|. See the docstring of Plex.Lexicon
for more information.
"""
"""
Begin(state_name) is a Plex action which causes the Scanner to
enter the state |state_name|. See the docstring of Plex.Lexicon
for more information.
"""
def __init__(self, state_name):
self.state_name = state_name
def __init__(self, state_name):
self.state_name = state_name
def perform(self, token_stream, text):
token_stream.begin(self.state_name)
def perform(self, token_stream, text):
token_stream.begin(self.state_name)
def __repr__(self):
return "Begin(%s)" % self.state_name
def __repr__(self):
return "Begin(%s)" % self.state_name
def same_as(self, other):
return isinstance(other, Begin) and self.state_name == other.state_name
def same_as(self, other):
return isinstance(other, Begin) and self.state_name == other.state_name
class Ignore(Action):
"""
IGNORE is a Plex action which causes its associated token
to be ignored. See the docstring of Plex.Lexicon for more
information.
"""
def perform(self, token_stream, text):
return None
"""
IGNORE is a Plex action which causes its associated token
to be ignored. See the docstring of Plex.Lexicon for more
information.
"""
def perform(self, token_stream, text):
return None
def __repr__(self):
return "IGNORE"
def __repr__(self):
return "IGNORE"
IGNORE = Ignore()
#IGNORE.__doc__ = Ignore.__doc__
class Text(Action):
"""
TEXT is a Plex action which causes the text of a token to
be returned as the value of the token. See the docstring of
Plex.Lexicon for more information.
"""
"""
TEXT is a Plex action which causes the text of a token to
be returned as the value of the token. See the docstring of
Plex.Lexicon for more information.
"""
def perform(self, token_stream, text):
return text
def perform(self, token_stream, text):
return text
def __repr__(self):
return "TEXT"
def __repr__(self):
return "TEXT"
TEXT = Text()
#TEXT.__doc__ = Text.__doc__
......
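A hedged sketch (not part of this commit) of how the actions above are typically paired with patterns in a Plex lexicon; the token names are illustrative, and a real tokenizer would feed this lexicon to Plex.Scanners.Scanner.

    from Cython.Plex.Actions import IGNORE, TEXT, Return
    from Cython.Plex.Regexps import Any, Rep1, Str
    from Cython.Plex.Lexicons import Lexicon

    lexicon = Lexicon([
        (Rep1(Any("0123456789")), Return('number')),      # emit a 'number' token
        (Rep1(Any("abcdefghijklmnopqrstuvwxyz")), TEXT),   # return the matched text
        (Str(" ", "\t", "\n"), IGNORE),                    # drop whitespace
    ])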
This diff is collapsed.
......@@ -6,45 +6,49 @@
#
#=======================================================================
class PlexError(Exception):
message = ""
message = ""
class PlexTypeError(PlexError, TypeError):
pass
pass
class PlexValueError(PlexError, ValueError):
pass
pass
class InvalidRegex(PlexError):
pass
pass
class InvalidToken(PlexError):
def __init__(self, token_number, message):
PlexError.__init__(self, "Token number %d: %s" % (token_number, message))
def __init__(self, token_number, message):
PlexError.__init__(self, "Token number %d: %s" % (token_number, message))
class InvalidScanner(PlexError):
pass
class AmbiguousAction(PlexError):
message = "Two tokens with different actions can match the same string"
def __init__(self):
pass
class UnrecognizedInput(PlexError):
scanner = None
position = None
state_name = None
def __init__(self, scanner, state_name):
self.scanner = scanner
self.position = scanner.get_position()
self.state_name = state_name
def __str__(self):
return ("'%s', line %d, char %d: Token not recognised in state %s"
% (self.position + (repr(self.state_name),)))
class AmbiguousAction(PlexError):
message = "Two tokens with different actions can match the same string"
def __init__(self):
pass
class UnrecognizedInput(PlexError):
scanner = None
position = None
state_name = None
def __init__(self, scanner, state_name):
self.scanner = scanner
self.position = scanner.get_position()
self.state_name = state_name
def __str__(self):
return ("'%s', line %d, char %d: Token not recognised in state %r" % (
self.position + (self.state_name,)))
This diff is collapsed.
......@@ -42,14 +42,15 @@ def chars_to_ranges(s):
while i < n:
code1 = ord(char_list[i])
code2 = code1 + 1
i = i + 1
i += 1
while i < n and code2 >= ord(char_list[i]):
code2 = code2 + 1
i = i + 1
code2 += 1
i += 1
result.append(code1)
result.append(code2)
return result
def uppercase_range(code1, code2):
"""
If the range of characters from code1 to code2-1 includes any
......@@ -63,6 +64,7 @@ def uppercase_range(code1, code2):
else:
return None
def lowercase_range(code1, code2):
"""
If the range of characters from code1 to code2-1 includes any
......@@ -76,6 +78,7 @@ def lowercase_range(code1, code2):
else:
return None
def CodeRanges(code_list):
"""
Given a list of codes as returned by chars_to_ranges, return
......@@ -86,6 +89,7 @@ def CodeRanges(code_list):
re_list.append(CodeRange(code_list[i], code_list[i + 1]))
return Alt(*re_list)
def CodeRange(code1, code2):
"""
CodeRange(code1, code2) is an RE which matches any character
......@@ -93,11 +97,12 @@ def CodeRange(code1, code2):
"""
if code1 <= nl_code < code2:
return Alt(RawCodeRange(code1, nl_code),
RawNewline,
RawCodeRange(nl_code + 1, code2))
RawNewline,
RawCodeRange(nl_code + 1, code2))
else:
return RawCodeRange(code1, code2)
#
# Abstract classes
#
......@@ -110,12 +115,12 @@ class RE(object):
re1 | re2 is an RE which matches either |re1| or |re2|
"""
nullable = 1 # True if this RE can match 0 input symbols
match_nl = 1 # True if this RE can match a string ending with '\n'
str = None # Set to a string to override the class's __str__ result
nullable = 1 # True if this RE can match 0 input symbols
match_nl = 1 # True if this RE can match a string ending with '\n'
str = None # Set to a string to override the class's __str__ result
def build_machine(self, machine, initial_state, final_state,
match_bol, nocase):
match_bol, nocase):
"""
This method should add states to |machine| to implement this
RE, starting at |initial_state| and ending at |final_state|.
......@@ -124,7 +129,7 @@ class RE(object):
letters should be treated as equivalent.
"""
raise NotImplementedError("%s.build_machine not implemented" %
self.__class__.__name__)
self.__class__.__name__)
def build_opt(self, m, initial_state, c):
"""
......@@ -160,18 +165,18 @@ class RE(object):
self.check_string(num, value)
if len(value) != 1:
raise Errors.PlexValueError("Invalid value for argument %d of Plex.%s."
"Expected a string of length 1, got: %s" % (
num, self.__class__.__name__, repr(value)))
"Expected a string of length 1, got: %s" % (
num, self.__class__.__name__, repr(value)))
def wrong_type(self, num, value, expected):
if type(value) == types.InstanceType:
got = "%s.%s instance" % (
value.__class__.__module__, value.__class__.__name__)
got = "%s.%s instance" % (
value.__class__.__module__, value.__class__.__name__)
else:
got = type(value).__name__
raise Errors.PlexTypeError("Invalid type for argument %d of Plex.%s "
"(expected %s, got %s" % (
num, self.__class__.__name__, expected, got))
"(expected %s, got %s" % (
num, self.__class__.__name__, expected, got))
#
# Primitive RE constructors
......@@ -211,6 +216,7 @@ class RE(object):
## def calc_str(self):
## return "Char(%s)" % repr(self.char)
def Char(c):
"""
Char(c) is an RE which matches the character |c|.
......@@ -222,6 +228,7 @@ def Char(c):
result.str = "Char(%s)" % repr(c)
return result
class RawCodeRange(RE):
"""
RawCodeRange(code1, code2) is a low-level RE which matches any character
......@@ -230,9 +237,9 @@ class RawCodeRange(RE):
"""
nullable = 0
match_nl = 0
range = None # (code, code)
uppercase_range = None # (code, code) or None
lowercase_range = None # (code, code) or None
range = None # (code, code)
uppercase_range = None # (code, code) or None
lowercase_range = None # (code, code) or None
def __init__(self, code1, code2):
self.range = (code1, code2)
......@@ -252,6 +259,7 @@ class RawCodeRange(RE):
def calc_str(self):
return "CodeRange(%d,%d)" % (self.code1, self.code2)
class _RawNewline(RE):
"""
RawNewline is a low-level RE which matches a newline character.
......@@ -266,6 +274,7 @@ class _RawNewline(RE):
s = self.build_opt(m, initial_state, EOL)
s.add_transition((nl_code, nl_code + 1), final_state)
RawNewline = _RawNewline()
......@@ -304,7 +313,7 @@ class Seq(RE):
i = len(re_list)
match_nl = 0
while i:
i = i - 1
i -= 1
re = re_list[i]
if re.match_nl:
match_nl = 1
......@@ -354,7 +363,7 @@ class Alt(RE):
non_nullable_res.append(re)
if re.match_nl:
match_nl = 1
i = i + 1
i += 1
self.nullable_res = nullable_res
self.non_nullable_res = non_nullable_res
self.nullable = nullable
......@@ -411,7 +420,7 @@ class SwitchCase(RE):
def build_machine(self, m, initial_state, final_state, match_bol, nocase):
self.re.build_machine(m, initial_state, final_state, match_bol,
self.nocase)
self.nocase)
def calc_str(self):
if self.nocase:
......@@ -434,6 +443,7 @@ Empty.__doc__ = \
"""
Empty.str = "Empty"
def Str1(s):
"""
Str1(s) is an RE which matches the literal string |s|.
......@@ -442,6 +452,7 @@ def Str1(s):
result.str = "Str(%s)" % repr(s)
return result
def Str(*strs):
"""
Str(s) is an RE which matches the literal string |s|.
......@@ -454,6 +465,7 @@ def Str(*strs):
result.str = "Str(%s)" % ','.join(map(repr, strs))
return result
def Any(s):
"""
Any(s) is an RE which matches any character in the string |s|.
......@@ -463,6 +475,7 @@ def Any(s):
result.str = "Any(%s)" % repr(s)
return result
def AnyBut(s):
"""
AnyBut(s) is an RE which matches any character (including
......@@ -475,6 +488,7 @@ def AnyBut(s):
result.str = "AnyBut(%s)" % repr(s)
return result
AnyChar = AnyBut("")
AnyChar.__doc__ = \
"""
......@@ -482,7 +496,8 @@ AnyChar.__doc__ = \
"""
AnyChar.str = "AnyChar"
def Range(s1, s2 = None):
def Range(s1, s2=None):
"""
Range(c1, c2) is an RE which matches any single character in the range
|c1| to |c2| inclusive.
......@@ -495,11 +510,12 @@ def Range(s1, s2 = None):
else:
ranges = []
for i in range(0, len(s1), 2):
ranges.append(CodeRange(ord(s1[i]), ord(s1[i+1]) + 1))
ranges.append(CodeRange(ord(s1[i]), ord(s1[i + 1]) + 1))
result = Alt(*ranges)
result.str = "Range(%s)" % repr(s1)
return result
def Opt(re):
"""
Opt(re) is an RE which matches either |re| or the empty string.
......@@ -508,6 +524,7 @@ def Opt(re):
result.str = "Opt(%s)" % re
return result
def Rep(re):
"""
Rep(re) is an RE which matches zero or more repetitions of |re|.
......@@ -516,12 +533,14 @@ def Rep(re):
result.str = "Rep(%s)" % re
return result
def NoCase(re):
"""
NoCase(re) is an RE which matches the same strings as RE, but treating
upper and lower case letters as equivalent.
"""
return SwitchCase(re, nocase = 1)
return SwitchCase(re, nocase=1)
def Case(re):
"""
......@@ -529,7 +548,7 @@ def Case(re):
upper and lower case letters as distinct, i.e. it cancels the effect
of any enclosing NoCase().
"""
return SwitchCase(re, nocase = 0)
return SwitchCase(re, nocase=0)
#
# RE Constants
......
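A hedged sketch (not part of this commit) composing the RE constructors documented above; the names sign, digits, integer and keyword are made up.

    from Cython.Plex.Regexps import Any, Opt, Range, Rep1, Seq, Str

    sign = Opt(Any("+-"))
    digits = Rep1(Range("09"))               # characters '0' through '9'
    integer = Seq(sign, digits)              # e.g. "-42"
    keyword = Str("cdef", "cpdef", "def")    # any of the literal strings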
......@@ -31,7 +31,7 @@ cdef class Scanner:
@cython.locals(input_state=long)
cdef next_char(self)
@cython.locals(action=Action)
cdef tuple read(self)
cpdef tuple read(self)
cdef tuple scan_a_token(self)
cdef tuple position(self)
......
This diff is collapsed.
......@@ -13,147 +13,146 @@ from .Errors import PlexError
class RegexpSyntaxError(PlexError):
pass
pass
def re(s):
"""
Convert traditional string representation of regular expression |s|
into Plex representation.
"""
return REParser(s).parse_re()
"""
Convert traditional string representation of regular expression |s|
into Plex representation.
"""
return REParser(s).parse_re()
class REParser(object):
def __init__(self, s):
self.s = s
self.i = -1
self.end = 0
self.next()
def parse_re(self):
re = self.parse_alt()
if not self.end:
self.error("Unexpected %s" % repr(self.c))
return re
def parse_alt(self):
"""Parse a set of alternative regexps."""
re = self.parse_seq()
if self.c == '|':
re_list = [re]
while self.c == '|':
def __init__(self, s):
self.s = s
self.i = -1
self.end = 0
self.next()
re_list.append(self.parse_seq())
re = Alt(*re_list)
return re
def parse_seq(self):
"""Parse a sequence of regexps."""
re_list = []
while not self.end and not self.c in "|)":
re_list.append(self.parse_mod())
return Seq(*re_list)
def parse_mod(self):
"""Parse a primitive regexp followed by *, +, ? modifiers."""
re = self.parse_prim()
while not self.end and self.c in "*+?":
if self.c == '*':
re = Rep(re)
elif self.c == '+':
re = Rep1(re)
else: # self.c == '?'
re = Opt(re)
self.next()
return re
def parse_prim(self):
"""Parse a primitive regexp."""
c = self.get()
if c == '.':
re = AnyBut("\n")
elif c == '^':
re = Bol
elif c == '$':
re = Eol
elif c == '(':
re = self.parse_alt()
self.expect(')')
elif c == '[':
re = self.parse_charset()
self.expect(']')
else:
if c == '\\':
def parse_re(self):
re = self.parse_alt()
if not self.end:
self.error("Unexpected %s" % repr(self.c))
return re
def parse_alt(self):
"""Parse a set of alternative regexps."""
re = self.parse_seq()
if self.c == '|':
re_list = [re]
while self.c == '|':
self.next()
re_list.append(self.parse_seq())
re = Alt(*re_list)
return re
def parse_seq(self):
"""Parse a sequence of regexps."""
re_list = []
while not self.end and not self.c in "|)":
re_list.append(self.parse_mod())
return Seq(*re_list)
def parse_mod(self):
"""Parse a primitive regexp followed by *, +, ? modifiers."""
re = self.parse_prim()
while not self.end and self.c in "*+?":
if self.c == '*':
re = Rep(re)
elif self.c == '+':
re = Rep1(re)
else: # self.c == '?'
re = Opt(re)
self.next()
return re
def parse_prim(self):
"""Parse a primitive regexp."""
c = self.get()
re = Char(c)
return re
def parse_charset(self):
"""Parse a charset. Does not include the surrounding []."""
char_list = []
invert = 0
if self.c == '^':
invert = 1
self.next()
if self.c == ']':
char_list.append(']')
self.next()
while not self.end and self.c != ']':
c1 = self.get()
if self.c == '-' and self.lookahead(1) != ']':
if c == '.':
re = AnyBut("\n")
elif c == '^':
re = Bol
elif c == '$':
re = Eol
elif c == '(':
re = self.parse_alt()
self.expect(')')
elif c == '[':
re = self.parse_charset()
self.expect(']')
else:
if c == '\\':
c = self.get()
re = Char(c)
return re
def parse_charset(self):
"""Parse a charset. Does not include the surrounding []."""
char_list = []
invert = 0
if self.c == '^':
invert = 1
self.next()
if self.c == ']':
char_list.append(']')
self.next()
while not self.end and self.c != ']':
c1 = self.get()
if self.c == '-' and self.lookahead(1) != ']':
self.next()
c2 = self.get()
for a in xrange(ord(c1), ord(c2) + 1):
char_list.append(chr(a))
else:
char_list.append(c1)
chars = ''.join(char_list)
if invert:
return AnyBut(chars)
else:
return Any(chars)
def next(self):
"""Advance to the next char."""
s = self.s
i = self.i = self.i + 1
if i < len(s):
self.c = s[i]
else:
self.c = ''
self.end = 1
def get(self):
if self.end:
self.error("Premature end of string")
c = self.c
self.next()
c2 = self.get()
for a in xrange(ord(c1), ord(c2) + 1):
char_list.append(chr(a))
else:
char_list.append(c1)
chars = ''.join(char_list)
if invert:
return AnyBut(chars)
else:
return Any(chars)
def next(self):
"""Advance to the next char."""
s = self.s
i = self.i = self.i + 1
if i < len(s):
self.c = s[i]
else:
self.c = ''
self.end = 1
def get(self):
if self.end:
self.error("Premature end of string")
c = self.c
self.next()
return c
def lookahead(self, n):
"""Look ahead n chars."""
j = self.i + n
if j < len(self.s):
return self.s[j]
else:
return ''
def expect(self, c):
"""
Expect to find character |c| at current position.
Raises an exception otherwise.
"""
if self.c == c:
self.next()
else:
self.error("Missing %s" % repr(c))
def error(self, mess):
"""Raise exception to signal syntax error in regexp."""
raise RegexpSyntaxError("Syntax error in regexp %s at position %d: %s" % (
repr(self.s), self.i, mess))
return c
def lookahead(self, n):
"""Look ahead n chars."""
j = self.i + n
if j < len(self.s):
return self.s[j]
else:
return ''
def expect(self, c):
"""
Expect to find character |c| at current position.
Raises an exception otherwise.
"""
if self.c == c:
self.next()
else:
self.error("Missing %s" % repr(c))
def error(self, mess):
"""Raise exception to signal syntax error in regexp."""
raise RegexpSyntaxError("Syntax error in regexp %s at position %d: %s" % (
repr(self.s), self.i, mess))
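A hedged sketch (not part of this commit) of the re() helper above converting a traditional regex string into Plex RE objects:

    from Cython.Plex.Traditional import re as plex_re

    pattern = plex_re("a(b|c)*d?")   # builds Seq/Rep/Alt/Opt nodes from the string
    print(pattern)                   # shows the composed RE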
This diff is collapsed.
# cython.* namespace for pure mode.
__version__ = "0.21b1"
__version__ = "0.21"
# BEGIN shameless copy from Cython/minivect/minitypes.py
......
This diff is collapsed.
"""
Inject Cython type declarations into a .py file using the Jedi static analysis tool.
"""
from __future__ import absolute_import
......
This diff is collapsed.