Xavier Thompson / slapos.buildout / Commits

Commit 3b16f5af authored May 07, 2024 by Xavier Thompson

zc.buildout 3.0.1+slapos001: Rebase on zc.buildout 3.0.1

See merge request nexedi/slapos.buildout!30

Parents: ac3f5e4c, d8f72f75

Showing 29 changed files with 2516 additions and 832 deletions
doc/getting-started.rst                               +34    -3
doc/reference.rst                                     +13    -0
setup.py                                               +2    -2
src/zc/buildout/__init__.py                           +13    -0
src/zc/buildout/buildout.py                          +745  -373
src/zc/buildout/configparser.py                       +12    -3
src/zc/buildout/download.py                          +154   -72
src/zc/buildout/easy_install.py                      +321  -255
src/zc/buildout/rmtree.py                             +79   -13
src/zc/buildout/testing.py                            +23    -0
src/zc/buildout/tests/__init__.py                     +39    -0
src/zc/buildout/tests/allow-unknown-extras.txt         +1    -0
src/zc/buildout/tests/allowhosts.txt                   +4    -0
src/zc/buildout/tests/buildout.txt                   +117    -6
src/zc/buildout/tests/dependencylinks.txt              +2    -0
src/zc/buildout/tests/download.txt                    +76   -15
src/zc/buildout/tests/downloadcache.txt                +9    -6
src/zc/buildout/tests/easy_install.txt                +22   -16
src/zc/buildout/tests/extends-cache.txt.disabled       +2    -2
src/zc/buildout/tests/repeatable.txt                   +1    -0
src/zc/buildout/tests/test_all.py                    +533   -21
zc.recipe.egg_/setup.py                                +1    -1
zc.recipe.egg_/src/zc/recipe/egg/README.rst           +13    -0
zc.recipe.egg_/src/zc/recipe/egg/api.rst               +2    -2
zc.recipe.egg_/src/zc/recipe/egg/custom.py            +89   -39
zc.recipe.egg_/src/zc/recipe/egg/custom.rst           +26    -1
zc.recipe.egg_/src/zc/recipe/egg/egg.py               +39    -2
zc.recipe.egg_/src/zc/recipe/egg/patches.rst         +124    -0
zc.recipe.egg_/src/zc/recipe/egg/tests.py             +20    -0
doc/getting-started.rst

@@ -28,11 +28,15 @@ The recommended way to install Buildout is to use pip within a virtual environment

 .. code-block:: console

-   virtualenv mybuildout
-   cd mybuildout
-   bin/pip install zc.buildout
+   python3 -m venv myenv
+   source myenv/bin/activate
+   pip install zc.buildout

 Or for the code from master branch:

 .. code-block:: console

    pip install https://lab.nexedi.com/nexedi/slapos.buildout/-/archive/master/slapos.buildout-master.tar.gz

 To use Buildout, you need to provide a Buildout configuration. Here is a
 minimal configuration:
@@ -98,6 +102,33 @@ specified using *parts*. The parts to be built are listed in the
 name that specifies the software to build the part and provides
 parameters to control how the part is built.

+Bootstrapping an isolated environment
+=====================================
+
+Sometimes it is useful to install ``zc.buildout`` and its dependencies
+directly in the ``eggs`` directory and to generate a ``buildout`` script
+in the ``bin`` directory that uses the version in the ``eggs`` directory,
+instead of relying on the package available in the environment.
+
+One way to achieve this uses the ``extra-paths`` option of the
+``buildout`` section: by setting it to an empty value, packages outside
+of the ``eggs`` or ``develop-eggs`` directories will not be considered
+when looking for already installed eggs. Then the ``bootstrap`` command
+will install ``zc.buildout`` and its dependencies from scratch in
+``eggs``.
+
+.. code-block:: console
+
+   buildout buildout:extra-paths= bootstrap
+
+After this, the generated ``bin/buildout`` script will use the packages
+installed in the ``eggs`` directory instead of those in the environment
+and preserve the isolation from the environment, even without setting
+``extra-paths``. That is because the default value for ``extra-paths``
+only considers the paths where ``zc.buildout`` and its dependencies are
+found, and in this case that is only the ``eggs`` directory.
+
 Installing software
 ===================
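The isolation described in the added section can be sketched with a small hypothetical helper (``candidate_paths`` is illustrative, not buildout's actual code): an empty ``extra-paths`` value splits to an empty list, leaving only the buildout-managed directories as egg search candidates.

```python
# Hypothetical sketch of the isolation effect of ``extra-paths=``:
# an empty option value contributes no extra search paths, so only the
# buildout-managed directories remain.
def candidate_paths(extra_paths_option, eggs_dir, develop_eggs_dir):
    extra = extra_paths_option.split()  # '' -> []
    return [develop_eggs_dir, eggs_dir] + extra

# Isolated: nothing outside eggs/develop-eggs is scanned.
assert candidate_paths('', 'eggs', 'develop-eggs') == ['develop-eggs', 'eggs']
# Explicit paths are appended after the managed directories.
assert candidate_paths('/opt/site-packages', 'eggs', 'develop-eggs') == [
    'develop-eggs', 'eggs', '/opt/site-packages']
```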
doc/reference.rst

@@ -358,6 +358,19 @@ extends-cache
    substitutions, and the result is a relative path, then it will be
    interpreted relative to the buildout directory.)

+.. _extra-paths-buildout-option:
+
+extra-paths, default: 'zc.buildout'
+   Extra paths to scan for already installed distributions.
+   Setting this to an empty value enables isolation of buildout.
+   Setting this to 'legacy' enables the legacy behavior of
+   scanning the paths of the distributions of zc.buildout itself
+   and its dependencies, which may contain site-packages or not.
+   Setting this to 'zc.buildout' also scans the paths of the
+   current zc.buildout and dependencies, but respects the order
+   they appear in sys.path, avoiding unexpected results.
+
 .. _find-links-option:

 find-links, default: ''
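A sketch of how the three documented values could map to scanned paths (illustrative helper, not the actual implementation; ``buildout_locations`` stands in for the locations of the running zc.buildout distribution and its dependencies):

```python
def resolve_extra_paths(option, buildout_locations, search_path):
    # '' -> isolation: scan nothing beyond the buildout directories.
    if option == '':
        return []
    # 'legacy' -> sorted locations; order may differ from sys.path.
    if option == 'legacy':
        return sorted(buildout_locations)
    # 'zc.buildout' -> same locations, but in sys.path order.
    if option == 'zc.buildout':
        return [p for p in search_path if p in buildout_locations]
    # anything else -> an explicit whitespace-separated list of paths.
    return option.split()

path = ['/eggs/zc.buildout', '/eggs/pip', '/venv/site-packages']
locs = {'/eggs/pip', '/eggs/zc.buildout'}
assert resolve_extra_paths('', locs, path) == []
assert resolve_extra_paths('legacy', locs, path) == [
    '/eggs/pip', '/eggs/zc.buildout']
assert resolve_extra_paths('zc.buildout', locs, path) == [
    '/eggs/zc.buildout', '/eggs/pip']
```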
setup.py

@@ -12,7 +12,7 @@
 ##############################################################################
 name = "zc.buildout"
-version = '3.0.1'
+version = '3.0.1+slapos001'

 import os
 from setuptools import setup

@@ -47,7 +47,7 @@ setup(
     python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',
     namespace_packages=['zc'],
     install_requires=[
-        'setuptools>=8.0',
+        'setuptools>=38.2.3',
         'pip',
         'wheel',
     ],
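The new version string carries a PEP 440 local version segment (``+slapos001``). A minimal stdlib sketch of how the public and local parts separate (real tools use ``packaging.version``); this matters later in ``buildout.py``, where the ``>=`` pin must not carry the local part:

```python
def split_local(version):
    # PEP 440: the local segment follows '+'; a '>=' pin must not carry it,
    # since ">=x.y.z+abc" is not a valid specifier.
    public, _, local = version.partition('+')
    return public, local or None

assert split_local('3.0.1+slapos001') == ('3.0.1', 'slapos001')
assert split_local('3.0.1') == ('3.0.1', None)
```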
src/zc/buildout/__init__.py

@@ -49,3 +49,16 @@ class UserError(Exception):
     def __str__(self):
         return " ".join(map(str, self.args))
+
+# Used for Python 2-3 compatibility
+if str is bytes:  # BBB Py2
+    bytes2str = str2bytes = lambda s: s
+    def unicode2str(s):
+        return s.encode('utf-8')
+else:
+    def bytes2str(s):
+        return s.decode()
+    def str2bytes(s):
+        return s.encode()
+    def unicode2str(s):
+        return s
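A quick sketch exercising the Python-3 branch of the compatibility helpers added above:

```python
# Python 3 versions of the helpers (the ``str is bytes`` test is False).
def bytes2str(s):
    return s.decode()

def str2bytes(s):
    return s.encode()

def unicode2str(s):
    return s

assert bytes2str(b'zc.buildout') == 'zc.buildout'
assert str2bytes('zc.buildout') == b'zc.buildout'
assert unicode2str('buildout') == 'buildout'
assert bytes2str(str2bytes('eggs')) == 'eggs'
```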
src/zc/buildout/buildout.py
@@ -29,10 +29,21 @@ try:
 except ImportError:
     from UserDict import DictMixin

+try:
+    from cStringIO import StringIO
+except ImportError:
+    from io import StringIO
+
 try:
     from urllib.parse import urljoin
 except ImportError:
     # BBB Py2
     from urlparse import urljoin

 import zc.buildout.configparser
 import copy
 import datetime
 import distutils.errors
+import errno
 import glob
 import importlib
 import inspect
@@ -45,8 +56,10 @@ import shutil
 import subprocess
 import sys
 import tempfile
+import pprint

 import zc.buildout
 import zc.buildout.download
+from functools import partial

 PY3 = sys.version_info[0] == 3
 if PY3:
@@ -82,6 +95,66 @@ def print_(*args, **kw):
         file = sys.stdout
     file.write(sep.join(map(str, args)) + end)

 _MARKER = []

+class BuildoutSerialiser(object):
+    # XXX: I would like to access pprint._safe_repr, but it's not
+    # officially available. PrettyPrinter class has a functionally-speaking
+    # static method "format" which just calls _safe_repr, but it is not
+    # declared as static... So I must create an instance of it.
+    _format = pprint.PrettyPrinter().format
+    _dollar = '\\x%02x' % ord('$')
+    _semicolon = '\\x%02x' % ord(';')
+    _safe_globals = {'__builtins__': {
+        # Types which are represented as calls to their constructor.
+        'bytearray': bytearray,
+        'complex': complex,
+        'frozenset': frozenset,
+        'set': set,
+        # Those builtins are available through keywords, which allow creating
+        # instances which in turn give back access to classes. So no point in
+        # hiding them.
+        'dict': dict,
+        'list': list,
+        'str': str,
+        'tuple': tuple,
+        'False': False,
+        'True': True,
+        'None': None,
+    }}
+
+    def loads(self, value):
+        return eval(value, self._safe_globals)
+
+    def dumps(self, value):
+        value, isreadable, _ = self._format(value, {}, 0, 0)
+        if not isreadable:
+            raise ValueError('Value cannot be serialised: %s' % (value,))
+        return value.replace('$', self._dollar).replace(';', self._semicolon)
+
+SERIALISED_VALUE_MAGIC = '!py'
+SERIALISED = re.compile(SERIALISED_VALUE_MAGIC + '([^!]*)!(.*)')
+SERIALISER_REGISTRY = {
+    '': BuildoutSerialiser(),
+}
+SERIALISER_VERSION = ''
+SERIALISER = SERIALISER_REGISTRY[SERIALISER_VERSION]
+# Used only to compose data
+SERIALISER_PREFIX = SERIALISED_VALUE_MAGIC + SERIALISER_VERSION + '!'
+assert SERIALISED.match(SERIALISER_PREFIX).groups() == (
+    SERIALISER_VERSION, ''), SERIALISED.match(SERIALISER_PREFIX).groups()
+
+def dumps(value):
+    orig_value = value
+    value = SERIALISER.dumps(value)
+    assert SERIALISER.loads(value) == orig_value, (repr(value), orig_value)
+    return SERIALISER_PREFIX + value
+
+def loads(value):
+    assert value.startswith(SERIALISED_VALUE_MAGIC), repr(value)
+    version, data = SERIALISED.match(value).groups()
+    return SERIALISER_REGISTRY[version].loads(data)
+
 realpath = zc.buildout.easy_install.realpath

 _isurl = re.compile('([a-zA-Z0-9+.-]+)://').match
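The round-trip behaviour of the serialiser added above can be sketched independently (a condensed re-implementation for illustration, not the module itself):

```python
import pprint
import re

# '!py' + version ('') + '!' prefixes every serialised value.
PREFIX = '!py!'
SERIALISED = re.compile('!py([^!]*)!(.*)')
_format = pprint.PrettyPrinter().format  # returns (text, isreadable, recursive)

def dumps(value):
    text, isreadable, _ = _format(value, {}, 0, 0)
    if not isreadable:
        raise ValueError('Value cannot be serialised: %s' % (text,))
    # '$' and ';' would clash with buildout substitution/option syntax,
    # so they are escaped as string literals.
    return PREFIX + text.replace('$', '\\x24').replace(';', '\\x3b')

def loads(serialised):
    version, data = SERIALISED.match(serialised).groups()
    # literals only: no builtins are exposed to eval
    return eval(data, {'__builtins__': {}})

value = [1, 2, {'a': (3,)}, 'x;y$z']
assert dumps(value).startswith('!py!')
assert '$' not in dumps(value) and ';' not in dumps(value)
assert loads(dumps(value)) == value
```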
@@ -259,13 +332,22 @@ def _print_annotate(data, verbose, chosen_sections, basedir):
             sectionkey = data[section][key]
             sectionkey.printAll(key, basedir, verbose)

+def _remove_ignore_missing(path):
+    try:
+        os.remove(path)
+    except OSError as e:
+        if e.errno != errno.ENOENT:
+            raise
+
 def _unannotate_section(section):
-    for key in section:
-        section[key] = section[key].value
-    return section
+    return {key: entry.value for key, entry in section.items()}

 def _unannotate(data):
-    for key in data:
-        _unannotate_section(data[key])
-    return data
+    return {key: _unannotate_section(section)
+            for key, section in data.items()}

 def _format_picked_versions(picked_versions, required_by):
@@ -292,6 +374,7 @@ _buildout_default_options = _annotate_section({
     'develop-eggs-directory': 'develop-eggs',
     'eggs-directory': 'eggs',
     'executable': sys.executable,
+    'extra-paths': 'zc.buildout',
     'find-links': '',
     'install-from-cache': 'false',
     'installed': '.installed.cfg',
@@ -299,6 +382,7 @@ _buildout_default_options = _annotate_section({
     'log-level': 'INFO',
     'newest': 'true',
     'offline': 'false',
+    'dry-run': 'false',
     'parts-directory': 'parts',
     'prefer-final': 'true',
     'python': 'buildout',
@@ -316,11 +400,16 @@ def _get_user_config():
     return os.path.join(buildout_home, 'default.cfg')

+networkcache_client = None
+
+@commands
 class Buildout(DictMixin):

+    COMMANDS = set()
+
+    installed_part_options = None

     def __init__(self, config_file, cloptions,
                  use_user_defaults=True, command=None, args=()):
@@ -355,54 +444,24 @@ class Buildout(DictMixin):
         data['buildout']['directory'] = SectionKey(
             os.path.dirname(config_file), 'COMPUTED_VALUE')

-        cloptions = dict(
-            (section, dict((option, SectionKey(value, 'COMMAND_LINE_VALUE'))
-                           for (_, option, value) in v))
-            for (section, v)
-            in itertools.groupby(sorted(cloptions), lambda v: v[0])
-            )
-        override = copy.deepcopy(cloptions.get('buildout', {}))
+        result = {}
+        for section, option, value in cloptions:
+            result.setdefault(section, {})[option] = value
+        options = result.setdefault('buildout', {})

         # load user defaults, which override defaults
-        extends = []
         user_config = _get_user_config()
         if use_user_defaults and os.path.exists(user_config):
             download_options = data['buildout']
             user_defaults, _ = _open(
                 os.path.dirname(user_config), user_config, [],
                 download_options, override, set(), {})
             for_download_options = _update(data, user_defaults)
         else:
             user_defaults = {}
             for_download_options = copy.deepcopy(data)

         # load configuration files
-        extends.append(user_config)
         if config_file:
             download_options = for_download_options['buildout']
             cfg_data, _ = _open(
                 os.path.dirname(config_file), config_file, [],
                 download_options, override, set(), user_defaults)
             data = _update(data, cfg_data)

-        # extends from command-line
-        if 'buildout' in cloptions:
-            cl_extends = cloptions['buildout'].pop('extends', None)
-            if cl_extends:
-                for extends in cl_extends.value.split():
-                    download_options = for_download_options['buildout']
-                    cfg_data, _ = _open(
-                        os.path.dirname(extends), os.path.basename(extends),
-                        [], download_options, override, set(), user_defaults)
-                    data = _update(data, cfg_data)
-        extends.append(config_file)
-        clextends = options.get('extends')
-        if clextends:
-            extends.append(clextends)
-        options['extends'] = '\n'.join(extends)

         # apply command-line options
-        data = _update(data, cloptions)
+        data = _extends(data, result, os.getcwd(), 'COMMAND_LINE_VALUE')

         # Set up versions section, if necessary
         if 'versions' not in data['buildout']:
@@ -418,15 +477,17 @@ class Buildout(DictMixin):
         else:
             versions = {}
         versions.update(
-            dict((k, SectionKey(v, 'DEFAULT_VALUE'))
+            dict((k, SectionKey(v(), 'DEFAULT_VALUE'))
+                # Use lambdas to compute values only if needed
                 for (k, v) in (
                     # Prevent downgrading due to prefer-final:
-                    ('zc.buildout', '>=' + pkg_resources.working_set.find(
+                    ('zc.buildout',
+                     lambda: '>=' + pkg_resources.working_set.find(
                         pkg_resources.Requirement.parse('zc.buildout')
-                        ).version),
+                        # Skip local part because ">=x.y.z+abc" is invalid
+                        ).parsed_version.public),
                     # Use 2, even though not final
-                    ('zc.recipe.egg', '>=2.0.6'),
+                    ('zc.recipe.egg', lambda: '>=2.0.6'),
                     )
                 if k not in versions))
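The lambda trick above can be sketched in isolation (names here are illustrative): deferring the value behind a callable means the working-set lookup only runs for keys that are not already pinned.

```python
pins = {'zc.buildout': '==3.0.1+slapos001'}  # hypothetical existing pin

def expensive_default():
    # stands in for the pkg_resources.working_set.find(...) lookup
    raise AssertionError('must not run: zc.buildout is already pinned')

defaults = (
    ('zc.buildout', expensive_default),
    ('zc.recipe.egg', lambda: '>=2.0.6'),
)
# Call v() only for keys that are not pinned yet.
pins.update((k, v()) for k, v in defaults if k not in pins)
assert pins == {'zc.buildout': '==3.0.1+slapos001',
                'zc.recipe.egg': '>=2.0.6'}
```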
@@ -440,6 +501,9 @@ class Buildout(DictMixin):
             sectionkey = data['buildout'][name]
             origdir = sectionkey.value
             src = sectionkey.source
+            if not origdir:
+                del data['buildout'][name]
+                continue
             if '${' in origdir:
                 continue
             if not os.path.isabs(origdir):
@@ -468,6 +532,9 @@ class Buildout(DictMixin):
         self._raw = _unannotate(data)
         self._data = {}
         self._parts = []
         self._initializing = []
+        self._signature_cache = {}
+        self._default_requirement = None

         # provide some defaults before options are parsed
         # because while parsing options those attributes might be
@@ -502,6 +569,7 @@ class Buildout(DictMixin):
         self.newest = ((not self.offline) and
                        bool_option(buildout_section, 'newest')
                        )
+        self.dry_run = (buildout_section['dry-run'] == 'true')

         ##################################################################
         ## WARNING!!!
@@ -533,6 +601,42 @@ class Buildout(DictMixin):
         options['installed'] = os.path.join(options['directory'],
                                             options['installed'])

+        # Extra paths to scan for already installed distributions.
+        extra_paths = options['extra-paths']
+        if extra_paths == 'sys.path':
+            # special case: sys.path
+            extra_paths = sys.path
+            options['extra-paths'] = ' '.join(extra_paths)
+        elif extra_paths == 'legacy':
+            # special case: legacy behavior
+            # this case is why this is done before setting easy_install
+            # versions and other options, to get the legacy behavior.
+            # XXX: These 'sorted' calls correspond to the original behavior,
+            # but they are quite problematic, as other distributions for
+            # zc.buildout, pip, wheel and setuptools may take precedence
+            # over the ones currently running.
+            old_extra_paths = zc.buildout.easy_install.extra_paths(
+                sorted({d.location for d in pkg_resources.working_set}))
+            try:
+                buildout_and_setuptools_dists = list(
+                    zc.buildout.easy_install.install(
+                        ['zc.buildout'], None, check_picked=False))
+            finally:
+                zc.buildout.easy_install.extra_paths(old_extra_paths)
+            extra_paths = sorted(
+                {d.location for d in buildout_and_setuptools_dists})
+            options['extra-paths'] = ' '.join(extra_paths)
+        elif extra_paths == 'zc.buildout':
+            # special case: only zc.buildout and its dependencies
+            # but in the order they appear in sys.path, unlike legacy
+            buildout_dists = pkg_resources.require('zc.buildout')
+            buildout_paths = {d.location for d in buildout_dists}
+            extra_paths = [p for p in sys.path if p in buildout_paths]
+            options['extra-paths'] = ' '.join(extra_paths)
+        else:
+            extra_paths = extra_paths.split()
+        zc.buildout.easy_install.extra_paths(extra_paths)
+
         self._setup_logging()
         self._setup_socket_timeout()
@@ -598,6 +702,19 @@ class Buildout(DictMixin):
         os.chdir(options['directory'])

+        networkcache_section_name = options.get('networkcache-section')
+        if networkcache_section_name:
+            networkcache_section = self[networkcache_section_name]
+            try:
+                from slapos.libnetworkcache import NetworkcacheClient
+                global networkcache_client
+                networkcache_client = NetworkcacheClient(networkcache_section)
+            except ImportError:
+                pass
+            except Exception:
+                self._logger.exception(
+                    "Failed to setup Networkcache. Continue without.")
+
     def _buildout_path(self, name):
         if '${' in name:
             return name
@@ -607,44 +724,86 @@ class Buildout(DictMixin):
     def bootstrap(self, args):
         __doing__ = 'Bootstrapping.'
         if os.path.exists(self['buildout']['develop-eggs-directory']):
             if os.path.isdir(self['buildout']['develop-eggs-directory']):
                 rmtree(self['buildout']['develop-eggs-directory'])
             self._logger.debug("Removed existing develop-eggs directory")
         self._setup_directories()

-        # Now copy buildout and setuptools eggs, and record destination eggs:
-        entries = []
-        for dist in zc.buildout.easy_install.buildout_and_setuptools_dists:
+        # Hack: propagate libnetworkcache soft dependency
+        specs = ['zc.buildout']
+        try:
+            import slapos.libnetworkcache
+            specs.append('slapos.libnetworkcache')
+        except ImportError:
+            pass
+
+        # Install buildout and dependent eggs following pinned versions.
+        dest = self['buildout']['eggs-directory']
+        path = [self['buildout']['develop-eggs-directory']]
+        if self.offline:
+            # Cannot install: just check requirements are already met
+            path.append(dest)
+            dest = None
+        ws = zc.buildout.easy_install.install(
+            specs, dest,
+            links=self._links,
+            index=self['buildout'].get('index'),
+            path=path,
+            newest=self.newest,
+            allow_hosts=self._allow_hosts,
+            )
+
+        # If versions aren't pinned or if current modules match,
+        # nothing will be installed, but then we'll copy them to
+        # the local eggs or develop-eggs folder just after this.
+        # XXX Note: except if the current modules are not eggs, in which case
+        # we'll create .egg-link to them. This applies to packages installed
+        # in site-packages by pip (.dist-info, not .egg), which in turn would
+        # cause site-packages to be in the sys.path of the generated script.
+
+        # Sort the working set to keep entries with single dists first.
+        options = self['buildout']
+        buildout_dir = options['directory']
+        eggs_dir = options['eggs-directory']
+        develop_eggs_dir = options['develop-eggs-directory']
+        ws = zc.buildout.easy_install.sort_working_set(ws,
+            buildout_dir=buildout_dir,
+            eggs_dir=eggs_dir,
+            develop_eggs_dir=develop_eggs_dir)
+
+        # Now copy buildout and setuptools eggs, and record destination eggs.
+        # XXX Note: dists using .dist-info format - e.g. packages installed by
+        # pip in site-packages - will be seen as develop dists and not copied.
+        egg_entries = []
+        link_dists = []
+        for dist in ws:
             if dist.precedence == pkg_resources.DEVELOP_DIST:
                 dest = os.path.join(self['buildout']['develop-eggs-directory'],
                                     dist.key + '.egg-link')
                 with open(dest, 'w') as fh:
                     fh.write(dist.location)
-                entries.append(dist.location)
+                link_dists.append(dist)
             else:
                 dest = os.path.join(self['buildout']['eggs-directory'],
                                     os.path.basename(dist.location))
-                entries.append(dest)
+                egg_entries.append(dest)
                 if not os.path.exists(dest):
                     if os.path.isdir(dist.location):
                         shutil.copytree(dist.location, dest)
                     else:
                         shutil.copy2(dist.location, dest)

-        # Create buildout script
-        ws = pkg_resources.WorkingSet(entries)
+        # Recreate a working set with the potentially-new paths after copying.
+        # We keep the eggs dists first since we know their locations contain a
+        # single dist. We add the other dists manually to avoid activating any
+        # unneeded dists at the same location, and we can because these are
+        # the same dists as before as they were not copied.
+        ws = pkg_resources.WorkingSet(egg_entries)
+        for dist in link_dists:
+            ws.add(dist)
         ws.require('zc.buildout')
-        options = self['buildout']
-        eggs_dir = options['eggs-directory']
-        develop_eggs_dir = options['develop-eggs-directory']
-        ws = zc.buildout.easy_install.sort_working_set(ws,
-            eggs_dir=eggs_dir,
-            develop_eggs_dir=develop_eggs_dir)
+        # Create buildout script
         zc.buildout.easy_install.scripts(
             ['zc.buildout'], ws, sys.executable,
             options['bin-directory'],
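The ``.egg-link`` handling above boils down to writing the dist's location into a one-line file; a self-contained sketch (``write_egg_link`` is a hypothetical helper name):

```python
import os
import tempfile

def write_egg_link(develop_eggs_dir, key, location):
    # A .egg-link is just a text file whose first line is the dist location.
    dest = os.path.join(develop_eggs_dir, key + '.egg-link')
    with open(dest, 'w') as fh:
        fh.write(location)
    return dest

develop_eggs = tempfile.mkdtemp()
link = write_egg_link(develop_eggs, 'zc.buildout', '/src/slapos.buildout')
assert os.path.basename(link) == 'zc.buildout.egg-link'
with open(link) as fh:
    assert fh.read() == '/src/slapos.buildout'
```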
@@ -692,6 +851,19 @@ class Buildout(DictMixin):
     @command
     def install(self, install_args):
+        try:
+            self._install_parts(install_args)
+        finally:
+            if self.installed_part_options is not None:
+                try:
+                    self._save_installed_options()
+                finally:
+                    del self.installed_part_options
+            if self.show_picked_versions or self.update_versions_file:
+                self._print_picked_versions()
+            self._unload_extensions()
+
+    def _install_parts(self, install_args):
         __doing__ = 'Installing.'
         self._load_extensions()
@@ -705,8 +877,8 @@ class Buildout(DictMixin):
         self._maybe_upgrade()

         # load installed data
-        (installed_part_options, installed_exists
-         ) = self._read_installed_part_options()
+        installed_part_options = self._read_installed_part_options()
         installed_parts = installed_part_options['buildout']['parts'].split()

         # Remove old develop eggs
         self._uninstall(
@@ -719,21 +891,15 @@ class Buildout(DictMixin):
         installed_part_options['buildout']['installed_develop_eggs'
                                            ] = installed_develop_eggs
-        if installed_exists:
-            self._update_installed(
-                installed_develop_eggs=installed_develop_eggs)
-
-        # get configured and installed part lists
-        conf_parts = self['buildout']['parts']
-        conf_parts = conf_parts and conf_parts.split() or []
-        installed_parts = installed_part_options['buildout']['parts']
-        installed_parts = installed_parts and installed_parts.split() or []
+        # From now, the caller will update the .installed.cfg at return.
+        self.installed_part_options = installed_part_options

+        install_parts = self['buildout']['parts']
         if install_args:
             install_parts = install_args
             uninstall_missing = False
         else:
-            install_parts = conf_parts
+            install_parts = install_parts.split()
             uninstall_missing = True

         # load and initialize recipes
@@ -750,68 +916,79 @@ class Buildout(DictMixin):
                 _save_options(section, self[section], sys.stdout)
                 print_()

         # compute new part recipe signatures
         self._compute_part_signatures(install_parts)
+        del self._signature_cache

         # uninstall parts that are no-longer used or whose configs
         # have changed
+        if self._logger.getEffectiveLevel() < logging.DEBUG:
+            reinstall_reason_score = -1
+        elif int(os.getenv('BUILDOUT_INFO_REINSTALL_REASON') or 1):
+            # We rely on the fact that install_parts is sorted according to
+            # dependencies (unless install_args). This is not the case of
+            # installed_parts.
+            reinstall_reason_score = len(installed_parts)
+        else:
+            # Provide a way to disable in tests
+            # or we'd have to update all recipe eggs.
+            reinstall_reason_score = 0
+        reinstall_reason = None
         for part in reversed(installed_parts):
-            if part in install_parts:
+            try:
+                part_index = install_parts.index(part)
+            except ValueError:
+                if not uninstall_missing:
+                    continue
+            else:
                 old_options = installed_part_options[part].copy()
                 installed_files = old_options.pop('__buildout_installed__')
-                new_options = self.get(part)
+                new_options = self.get(part).copy()
                 if old_options == new_options:
                     # The options are the same, but are all of the
                     # installed files still there?  If not, we should
                     # reinstall.
                     if not installed_files:
                         continue
-                    for f in installed_files.split('\n'):
-                        if not os.path.exists(self._buildout_path(f)):
+                    for installed_path in installed_files.split('\n'):
+                        if not os.path.exists(
+                                self._buildout_path(installed_path)):
                             break
                     else:
                         continue
+                else:
+                    installed_path = None

-                # output debugging info
-                if self._logger.getEffectiveLevel() < logging.DEBUG:
-                    for k in old_options:
-                        if k not in new_options:
-                            self._logger.debug("Part %s, dropped option %s.",
-                                               part, k)
-                        elif old_options[k] != new_options[k]:
-                            self._logger.debug(
-                                "Part %s, option %s changed:\n%r != %r",
-                                part, k, new_options[k], old_options[k],
-                                )
-                    for k in new_options:
-                        if k not in old_options:
-                            self._logger.debug("Part %s, new option %s.",
-                                               part, k)
-            elif not uninstall_missing:
-                continue
+                if part_index < reinstall_reason_score:
+                    reinstall_reason_score = part_index
+                    reinstall_reason = (
+                        part, old_options, new_options, installed_path)
+                elif reinstall_reason_score < 0:
+                    self._log_reinstall_reason(
+                        logging.DEBUG, part,
+                        old_options, new_options, installed_path)

             self._uninstall_part(part, installed_part_options)
             installed_parts = [p for p in installed_parts if p != part]
-            if installed_exists:
-                self._update_installed(parts=' '.join(installed_parts))
+            installed_part_options['buildout']['parts'] = (
+                ' '.join(installed_parts))

+        if reinstall_reason:
+            self._log_reinstall_reason(logging.INFO, *reinstall_reason)

         # Check for unused buildout options:
         _check_for_unused_options_in_section(self, 'buildout')

         # install new parts
+        all_installed_paths = {}
         for part in install_parts:
+            signature = self[part].pop('__buildout_signature__')
             saved_options = self[part].copy()
             recipe = self[part].recipe
             if part in installed_parts: # update
-                need_to_save_installed = False
                 __doing__ = 'Updating %s.', part
                 self._logger.info(*__doing__)
+                if self.dry_run:
+                    continue
                 old_options = installed_part_options[part]
-                installed_files = old_options['__buildout_installed__']
+                old_installed_files = old_options['__buildout_installed__']

                 try:
                     update = recipe.update
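The for/else scan above (find the first installed path that no longer exists, else conclude everything is still in place) can be sketched as:

```python
import os
import tempfile

def first_missing_path(installed_files, base_dir):
    # Mirrors the for/else above: return the first missing path,
    # or None when every recorded file still exists.
    for installed_path in installed_files.split('\n'):
        if not os.path.exists(os.path.join(base_dir, installed_path)):
            return installed_path
    return None

base = tempfile.mkdtemp()
open(os.path.join(base, 'a.cfg'), 'w').close()
assert first_missing_path('a.cfg', base) is None          # all present
assert first_missing_path('a.cfg\nb.cfg', base) == 'b.cfg'  # b.cfg gone
```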
@@ -823,88 +1000,84 @@ class Buildout(DictMixin):
                         part)

                 try:
-                    installed_files = self[part]._call(update)
-                except:
+                    updated_files = self[part]._call(update)
+                except Exception:
                     installed_parts.remove(part)
-                    self._uninstall(installed_files)
-                    if installed_exists:
-                        self._update_installed(
-                            parts=' '.join(installed_parts))
+                    self._uninstall(old_installed_files)
+                    installed_part_options['buildout']['parts'] = (
+                        ' '.join(installed_parts))
                     raise

-                old_installed_files = old_installed_files.split('\n')
-                if installed_files is None:
-                    installed_files = old_installed_files
-                else:
-                    if isinstance(installed_files, str):
-                        installed_files = [installed_files]
-                    else:
-                        installed_files = list(installed_files)
-                    need_to_save_installed = [
-                        p for p in installed_files
-                        if p not in old_installed_files]
-                    if need_to_save_installed:
-                        installed_files = (old_installed_files
-                                           + need_to_save_installed)
+                installed_files = set(installed_files.split('\n')) \
+                    if installed_files else set()
+                if updated_files:
+                    (installed_files.add if isinstance(updated_files, str)
+                     else installed_files.update)(updated_files)
             else: # install
-                need_to_save_installed = True
                 __doing__ = 'Installing %s.', part
                 self._logger.info(*__doing__)
+                if self.dry_run:
+                    continue
                 installed_files = self[part]._call(recipe.install)
                 if installed_files is None:
                     self._logger.warning(
                         "The %s install returned None.  A path or "
                         "iterable of paths should be returned.",
                         part)
                     installed_files = ()
-                elif isinstance(installed_files, str):
-                    installed_files = [installed_files]
-                else:
-                    installed_files = list(installed_files)
+                elif installed_files:
+                    installed_files = ({installed_files}
+                                       if isinstance(installed_files, str)
+                                       else set(installed_files))

+            if installed_files:
+                conflicts = installed_files.intersection(all_installed_paths)
+                if conflicts:
+                    self._error(
+                        "The following paths are already"
+                        " installed by other sections: %r",
+                        {x: all_installed_paths[x] for x in conflicts})
+                all_installed_paths.update(
+                    dict.fromkeys(installed_files, part))
+                installed_files = '\n'.join(sorted(installed_files))
+            else:
+                installed_files = ''

-            installed_part_options[part] = saved_options
-            saved_options['__buildout_installed__'] = '\n'.join(
-                installed_files)
+            saved_options['__buildout_installed__'] = installed_files
+            saved_options['__buildout_signature__'] = signature
+            installed_part_options[part] = saved_options
+            installed_parts = [p for p in installed_parts if p != part]
+            installed_parts.append(part)
+            _check_for_unused_options_in_section(self, part)

-            if need_to_save_installed:
-                if part not in installed_parts:
-                    installed_parts.append(part)
-                installed_part_options['buildout']['parts'] = (
-                    ' '.join(installed_parts))
-                self._save_installed_options(installed_part_options)
-                installed_exists = True
-            else:
-                assert installed_exists
-                self._update_installed(parts=' '.join(installed_parts))
-
-        if installed_develop_eggs:
-            if not installed_exists:
-                self._save_installed_options(installed_part_options)
-        elif (not installed_parts) and installed_exists:
-            os.remove(self['buildout']['installed'])
-
-            _check_for_unused_options_in_section(self, part)
-
-        if self.show_picked_versions or self.update_versions_file:
-            self._print_picked_versions()
-        self._unload_extensions()
+        if self._log_level < logging.INFO:
+            self._save_installed_options()

-    def _update_installed(self, **buildout_options):
-        installed = self['buildout']['installed']
-        f = open(installed, 'a')
-        f.write('\n[buildout]\n')
-        for option, value in list(buildout_options.items()):
-            _save_option(option, value, f)
-        f.close()
+    def _log_reinstall_reason(self, level, part,
+                              old_options, new_options, missing):
+        log = self._logger.log
+        if missing:
+            log(level, "Part %s, missing path: %s", part, missing)
+            return
+        for k in old_options:
+            if k not in new_options:
+                log(level, "Part %s, dropped option %s.", part, k)
+            elif old_options[k] != new_options[k]:
+                log(level, "Part %s, option %s changed: %r != %r",
+                    part, k, new_options[k], old_options[k])
+        for k in new_options:
+            if k not in old_options:
+                log(level, "Part %s, new option %s.", part, k)

     def _uninstall_part(self, part, installed_part_options):
         # uninstall part
         __doing__ = 'Uninstalling %s.', part
         self._logger.info(*__doing__)
+        if self.dry_run:
+            return

         # run uninstall recipe
         recipe, entry = _recipe(installed_part_options[part])
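The option comparison performed by ``_log_reinstall_reason`` can be sketched without the logger:

```python
def option_diff(old_options, new_options):
    # Dropped, changed and new keys, in the same order the method logs them.
    messages = []
    for k in old_options:
        if k not in new_options:
            messages.append('dropped option %s' % k)
        elif old_options[k] != new_options[k]:
            messages.append('option %s changed: %r != %r'
                            % (k, new_options[k], old_options[k]))
    for k in new_options:
        if k not in old_options:
            messages.append('new option %s' % k)
    return messages

old = {'recipe': 'zc.recipe.egg', 'eggs': 'foo'}
new = {'recipe': 'zc.recipe.egg', 'eggs': 'foo bar', 'scripts': 'foo'}
assert option_diff(old, new) == [
    "option eggs changed: 'foo bar' != 'foo'",
    'new option scripts',
]
```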
@@ -990,17 +1163,6 @@ class Buildout(DictMixin):
                 self._logger.warning(
                     "Unexpected entry, %r, in develop-eggs directory.", f)

-    def _compute_part_signatures(self, parts):
-        # Compute recipe signature and add to options
-        for part in parts:
-            options = self.get(part)
-            if options is None:
-                options = self[part] = {}
-            recipe, entry = _recipe(options)
-            req = pkg_resources.Requirement.parse(recipe)
-            sig = _dists_sig(pkg_resources.working_set.resolve([req]))
-            options['__buildout_signature__'] = ' '.join(sig)
-
     def _read_installed_part_options(self):
         old = self['buildout']['installed']
         if old and os.path.isfile(old):
@@ -1016,11 +1178,9 @@ class Buildout(DictMixin):
                    options[option] = value
                result[section] = self.Options(self, section, options)
            return result, True
            return result
        else:
            return ({'buildout': self.Options(self, 'buildout', {'parts': ''})},
                    False,
                    )
        return {'buildout': self.Options(self, 'buildout', {'parts': ''})}

    def _uninstall(self, installed):
        for f in installed.split('\n'):
...
...
@@ -1061,16 +1221,41 @@ class Buildout(DictMixin):
        return ' '.join(installed)

    def _save_installed_options(self, installed_options):
        installed = self['buildout']['installed']
        if not installed:
    def _save_installed_options(self):
        if self.dry_run:
            return
        f = open(installed, 'w')
        _save_options('buildout', installed_options['buildout'], f)
        for part in installed_options['buildout']['parts'].split():
            print_(file=f)
            _save_options(part, installed_options[part], f)
        f.close()
        installed_path = self['buildout']['installed']
        if not installed_path:
            return
        installed_part_options = self.installed_part_options
        buildout = installed_part_options['buildout']
        installed_parts = buildout['parts']
        if installed_parts or buildout['installed_develop_eggs']:
            new = StringIO()
            _save_options('buildout', buildout, new)
            for part in installed_parts.split():
                new.write('\n')
                _save_options(part, installed_part_options[part], new)
            new = new.getvalue()
            try:
                with open(installed_path) as f:
                    save = f.read(1 + len(new)) != new
            except IOError as e:
                if e.errno != errno.ENOENT:
                    raise
                save = True
            if save:
                installed_tmp = installed_path + ".tmp"
                try:
                    with open(installed_tmp, "w") as f:
                        f.write(new)
                        f.flush()
                        os.fsync(f.fileno())
                    os.rename(installed_tmp, installed_path)
                finally:
                    _remove_ignore_missing(installed_tmp)
        else:
            _remove_ignore_missing(installed_path)

    def _error(self, message, *args):
        raise zc.buildout.UserError(message % args)
...
...
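The `_save_installed_options` rewrite in the hunk above skips rewriting `.installed.cfg` when its content is unchanged, and otherwise writes through a temporary file that is flushed, fsync'ed and renamed into place. A minimal standalone sketch of that atomic-write pattern (the `atomic_write` helper name is ours, not buildout's):

```python
import os
import tempfile

def atomic_write(path, data):
    # Write to a sibling temporary file first, so a failed or partial
    # write never corrupts the existing file, then rename into place.
    tmp = path + ".tmp"
    try:
        with open(tmp, "w") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # make sure data is on disk before the rename
        os.rename(tmp, path)  # atomic on POSIX for same-filesystem paths
    finally:
        try:
            os.remove(tmp)  # clean up the temp file if the rename did not happen
        except OSError:
            pass

path = os.path.join(tempfile.mkdtemp(), ".installed.cfg")
atomic_write(path, "[buildout]\nparts =\n")
with open(path) as f:
    print(f.read().startswith("[buildout]"))
```

On POSIX filesystems a reader sees either the old or the new file, never a half-written one, which is exactly what the installed-options file needs across interrupted buildout runs.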
@@ -1135,8 +1320,17 @@ class Buildout(DictMixin):
        if not self.newest:
            return

        # Hack: propagate libnetworkcache soft dependency
        # XXX just zc.buildout should suffice, then iter over projects in ws
        specs = ['zc.buildout', 'setuptools', 'pip', 'wheel']
        try:
            import slapos.libnetworkcache
            specs.append('slapos.libnetworkcache')
        except ImportError:
            pass
        ws = zc.buildout.easy_install.install(
            ('zc.buildout', 'setuptools', 'pip', 'wheel'),
            specs,
            self['buildout']['eggs-directory'],
            links=self['buildout'].get('find-links', '').split(),
            index=self['buildout'].get('index'),
...
...
@@ -1146,7 +1340,7 @@ class Buildout(DictMixin):
        upgraded = []
        for project in 'zc.buildout', 'setuptools', 'pip', 'wheel':
        for project in specs:
            req = pkg_resources.Requirement.parse(project)
            dist = ws.find(req)
            importlib.import_module(project)
...
...
@@ -1184,10 +1378,12 @@ class Buildout(DictMixin):
        # the new dist is different, so we've upgraded.
        # Update the scripts and return True
        options = self['buildout']
        buildout_dir = options['directory']
        eggs_dir = options['eggs-directory']
        develop_eggs_dir = options['develop-eggs-directory']
        ws = zc.buildout.easy_install.sort_working_set(ws,
            buildout_dir=buildout_dir,
            eggs_dir=eggs_dir,
            develop_eggs_dir=develop_eggs_dir
        )
...
...
@@ -1297,6 +1493,7 @@ class Buildout(DictMixin):
        os.write(fd, (zc.buildout.easy_install.runsetup_template % dict(
            setupdir=os.path.dirname(setup),
            setup=setup,
            path_list=[],
            __file__=setup,
        )).encode())
        args = [sys.executable, tsetup] + args
...
...
@@ -1356,21 +1553,46 @@ class Buildout(DictMixin):
                v = v.replace(os.getcwd(), base_path)
            print_("%s =%s" % (k, v))

    def __getitem__(self, section):
        __doing__ = 'Getting section %s.', section
    def initialize(self, options, reqs, entry):
        recipe_class = _install_and_load(reqs, 'zc.buildout', entry, self)
        try:
            return self._data[section]
            sig = self._signature_cache[reqs]
        except KeyError:
            pass
            req = pkg_resources.Requirement.parse(reqs)
            sig = self._signature_cache[reqs] = sorted(set(_dists_sig(
                pkg_resources.working_set.resolve([req]))))
        self._initializing.append((options, sig))
        try:
            recipe = recipe_class(self, options.name, options)
            options['__buildout_signature__']
        finally:
            del self._initializing[-1]
        return recipe

    def __getitem__(self, section):
        __doing__ = 'Getting section %s.', section
        try:
            data = self._raw[section]
            options = self._data[section]
        except KeyError:
            raise MissingSection(section)
        options = self.Options(self, section, data)
        self._data[section] = options
        options._initialize()
        try:
            data = self._raw[section]
        except KeyError:
            raise MissingSection(section)
        e = data.get('__unsupported_conditional_expression__')
        if e:
            raise e
        options = self.Options(self, section, data)
        self._data[section] = options
        options._initialize()
        if self._initializing:
            caller = self._initializing[-1][0]
            if 'buildout' != section and not (
                    section in caller.depends or
                    # Do not only check the caller,
                    # because of circular dependencies during substitutions.
                    section in (x[0].name for x in self._initializing)):
                caller.depends.add(section)
        return options

    def __setitem__(self, name, data):
...
...
@@ -1380,10 +1602,6 @@ class Buildout(DictMixin):
        self[name]  # Add to parts

    def parse(self, data):
        try:
            from cStringIO import StringIO
        except ImportError:
            from io import StringIO
        import textwrap

        sections = zc.buildout.configparser.parse(
...
...
@@ -1409,9 +1627,16 @@ class Buildout(DictMixin):
    def __len__(self):
        return len(self._raw)

_install_and_load_cache = {}

def _install_and_load(spec, group, entry, buildout):
    __doing__ = 'Loading recipe %r.', spec
    key = spec, group, entry
    try:
        return _install_and_load_cache[key]
    except KeyError:
        pass
    try:
        req = pkg_resources.Requirement.parse(spec)
...
...
@@ -1438,8 +1663,9 @@ def _install_and_load(spec, group, entry, buildout):
            )

        __doing__ = 'Loading %s recipe entry %s:%s.', group, spec, entry
        return pkg_resources.load_entry_point(
        result = _install_and_load_cache[key] = pkg_resources.load_entry_point(
            req.project_name, group, entry)
        return result

    except Exception:
        v = sys.exc_info()[1]
...
...
@@ -1457,6 +1683,7 @@ class Options(DictMixin):
        self._raw = data
        self._cooked = {}
        self._data = {}
        self.depends = set()

    def _initialize(self):
        name = self.name
...
...
@@ -1465,6 +1692,8 @@ class Options(DictMixin):
        if '<' in self._raw:
            self._raw = self._do_extend_raw(name, self._raw, [])

        default = self.buildout._default_requirement

        # force substitutions
        for k, v in sorted(self._raw.items()):
            if '${' in v:
...
...
@@ -1478,16 +1707,24 @@ class Options(DictMixin):
                self.buildout[dname]

        if self.get('recipe'):
            self.initialize()
            if default:
                self.depends.add(default)
            self.recipe = self.buildout.initialize(
                self, *_recipe(self._data))
            self.buildout._parts.append(name)

    def initialize(self):
        reqs, entry = _recipe(self._data)
        buildout = self.buildout
        recipe_class = _install_and_load(reqs, 'zc.buildout', entry, buildout)
        name = self.name
        self.recipe = recipe_class(buildout, name, self)
        m = md5()
        # _profile_base_location_ is ignored in signatures, so that two sections
        # at different URLs can have same signature
        _profile_base_location_ = self.get('_profile_base_location_')
        # access values through .get() instead of .items() to detect unused keys
        for key in sorted(self.keys()):
            if key == '_profile_base_location_':
                continue
            value = self._data.get(key,
                                   self._cooked.get(key, self._raw.get(key)))
            if _profile_base_location_:
                value = value.replace(_profile_base_location_,
                                      '${:_profile_base_location_}')
            m.update(('%r\0%r\0' % (key, value)).encode())
        self.items_signature = '%s:%s' % (name, m.hexdigest())

    def _do_extend_raw(self, name, data, doing):
        if name == 'buildout':
...
...
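The `Options.initialize` code above derives `items_signature` by hashing the sorted option items with MD5, using `\0` separators so that different key/value splits cannot produce the same byte stream. A simplified standalone sketch (plain dict instead of an `Options` instance; the `items_signature` function name mirrors the attribute, not a real buildout API):

```python
from hashlib import md5

def items_signature(name, options):
    # Hash the sorted key/value pairs, mirroring how Options.initialize
    # builds items_signature; the '\0' separators avoid ambiguity
    # between e.g. ('ab', 'c') and ('a', 'bc').
    m = md5()
    for key, value in sorted(options.items()):
        m.update(('%r\0%r\0' % (key, value)).encode())
    return '%s:%s' % (name, m.hexdigest())

a = items_signature('part1', {'recipe': 'zc.recipe.egg', 'eggs': 'foo'})
b = items_signature('part1', {'eggs': 'foo', 'recipe': 'zc.recipe.egg'})
print(a == b)  # insertion order does not matter, only the sorted items
```

Because the items are sorted before hashing, two sections with the same options always get the same signature, which is what lets buildout decide whether a part must be reinstalled.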
@@ -1511,10 +1748,10 @@ class Options(DictMixin):
                    raise zc.buildout.UserError("No section named %r" % iname)
                result.update(self._do_extend_raw(iname, raw, doing))
            result = _annotate_section(result, "")
            _annotate_section(result, "")
            data = _annotate_section(copy.deepcopy(data), "")
            result = _update_section(result, data)
            result = _unannotate_section(result)
            _update_section(result, data)
            _unannotate_section(result)
            result.pop('<', None)
            return result
        finally:
...
...
@@ -1523,12 +1760,26 @@ class Options(DictMixin):
    def _dosub(self, option, v):
        __doing__ = 'Getting option %s:%s.', self.name, option
        seen = [(self.name, option)]
        v = '$$'.join([self._sub(s, seen) for s in v.split('$$')])
        v = '$$'.join([self._sub(s, seen, last=False)
                       for s in v.split('$$')])
        self._cooked[option] = v

    def get(self, option, default=None, seen=None):
    def get(self, *args, **kw):
        v = self._get(*args, **kw)
        if hasattr(v, 'startswith') and v.startswith(SERIALISED_VALUE_MAGIC):
            v = loads(v)
        return v

    def _get(self, option, default=None, seen=None, last=True):
        # TODO: raise instead of handling a default parameter,
        # so that get() never tries to deserialize a default value
        # (and then: move deserialization to __getitem__
        # and make get() use __getitem__)
        try:
            return self._data[option]
            if last:
                return self._data[option].replace('$${', '${')
            else:
                return self._data[option]
        except KeyError:
            pass
...
...
@@ -1536,6 +1787,16 @@ class Options(DictMixin):
        if v is None:
            v = self._raw.get(option)
            if v is None:
                if option == '__buildout_signature__':
                    buildout = self.buildout
                    options, sig = buildout._initializing[-1]
                    if options is self:
                        self.depends = frozenset(self.depends)
                        v = self._data[option] = ' '.join(sig + [
                            buildout[dependency].items_signature
                            for dependency in sorted(self.depends)])
                        return v
                    raise zc.buildout.UserError(
                        "premature access to " + option)
                return default

        __doing__ = 'Getting option %s:%s.', self.name, option
...
...
@@ -1550,16 +1811,20 @@ class Options(DictMixin):
                    )
            else:
                seen.append(key)
                v = '$$'.join([self._sub(s, seen) for s in v.split('$$')])
                v = '$$'.join([self._sub(s, seen, last=False)
                               for s in v.split('$$')])
                seen.pop()

        self._data[option] = v
        return v
        if last:
            return v.replace('$${', '${')
        else:
            return v

    _template_split = re.compile('([$]{[^}]*})').split
    _simple = re.compile('[-a-zA-Z0-9 ._]+$').match
    _valid = re.compile(r'\${[-a-zA-Z0-9 ._]*:[-a-zA-Z0-9 ._]+}$').match

    def _sub(self, template, seen):
    def _sub(self, template, seen, last=True):
        value = self._template_split(template)
        subs = []
        for ref in value[1::2]:
...
...
@@ -1587,7 +1852,14 @@ class Options(DictMixin):
            section, option = s
            if not section:
                section = self.name
            v = self.buildout[section].get(option, None, seen)
                options = self
            else:
                self.buildout._initializing.append((self,))
                try:
                    options = self.buildout[section]
                finally:
                    del self.buildout._initializing[-1]
            v = options._get(option, None, seen, last=last)
            if v is None:
                if option == '_buildout_section_name_':
                    v = self.name
...
...
@@ -1600,20 +1872,17 @@ class Options(DictMixin):
        return ''.join([''.join(v) for v in zip(value[::2], subs)])

    def __getitem__(self, key):
        try:
            return self._data[key]
        except KeyError:
            pass
        v = self.get(key)
        if v is None:
        v = self.get(key, _MARKER)
        if v is _MARKER:
            raise MissingOption("Missing option: %s:%s" % (self.name, key))
        return v

    def __setitem__(self, option, value):
        if not re.match(zc.buildout.configparser.option_name_re + '$', option):
            raise zc.buildout.UserError("Invalid option name %r" % (option,))
        if not isinstance(value, str):
            raise TypeError('Option values must be strings', value)
        self._data[option] = value
            value = dumps(value)
        self._data[option] = value.replace('${', '$${')

    def __delitem__(self, key):
        if key in self._raw:
...
...
@@ -1641,6 +1910,9 @@ class Options(DictMixin):
        result = copy.deepcopy(self._raw)
        result.update(self._cooked)
        result.update(self._data)
        for key, value in result.items():
            if value.startswith(SERIALISED_VALUE_MAGIC):
                result[key] = loads(value)
        return result

    def _call(self, f):
...
...
@@ -1672,9 +1944,28 @@ class Options(DictMixin):
                self.name)
        return self._created

    def barrier(self):
        """Set self as a default requirement for not-yet processed parts

        This method must be called if this part may alter the processing
        of other parts in any way, like modifying environment variables.
        In other words, it sets an implicit dependency for these parts.
        """
        buildout = self.buildout
        if not buildout._initializing:
            raise zc.buildout.UserError(
                "Options.barrier() shall only be used during initialization")
        buildout._default_requirement = self.name

    def __repr__(self):
        return repr(dict(self))

    def __eq__(self, other):
        try:
            return sorted(self.items()) == sorted(other.items())
        except Exception:
            return super(Options, self).__eq__(other)

Buildout.Options = Options

_spacey_nl = re.compile('[ \t\r\f\v]*\n[ \t\r\f\v\n]*'
...
...
@@ -1707,6 +1998,8 @@ def _quote_spacey_nl(match):
    return result

def _save_option(option, value, f):
    if not isinstance(value, str):
        value = dumps(value)
    value = _spacey_nl.sub(_quote_spacey_nl, value)
    if value.startswith('\n\t'):
        value = '%(__buildout_space_n__)s' + value[2:]
...
...
@@ -1716,10 +2009,12 @@ def _save_option(option, value, f):
def _save_options(section, options, f):
    print_('[%s]' % section, file=f)
    items = list(options.items())
    items.sort()
    for option, value in items:
        _save_option(option, value, f)
    try:
        get_option = partial(options._get, last=False)
    except AttributeError:
        get_option = options.get
    for option in sorted(options):
        _save_option(option, get_option(option), f)

def _default_globals():
    """Return a mapping of default and precomputed expressions.
...
...
@@ -1801,103 +2096,172 @@ def _default_globals():
    return globals_defs

variable_template_split = re.compile('([$]{[^}]*})').split

def _open(base, filename, seen, download_options, override, downloaded,
          user_defaults):
    """Open a configuration file and return the result as a dictionary,
class _default_globals(dict):
    """
    Make sure parser context is computed at most once,
    even if several files are parsed.
    And compute some values only if accessed.
    Recursively open other files based on buildout options found.
    If __getitem__ raises, _doing() calls .get('__doing__'),
    but that's not the only reason to subclass dict:
    CPython requests it (for performance reasons?). PyPy does not care.
    """
    download_options = _update_section(download_options, override)
    raw_download_options = _unannotate_section(download_options)
    newest = bool_option(raw_download_options, 'newest', 'false')
    fallback = newest and not (filename in downloaded)
    extends_cache = raw_download_options.get('extends-cache')
    if extends_cache and variable_template_split(extends_cache)[1::2]:
        raise ValueError(
            "extends-cache '%s' may not contain ${section:variable} to expand."
            % extends_cache)
    download = zc.buildout.download.Download(
        raw_download_options, cache=extends_cache,
        fallback=fallback, hash_name=True)
    is_temp = False
    downloaded_filename = None
    if _isurl(filename):
        downloaded_filename, is_temp = download(filename)
        fp = open(downloaded_filename)
        base = filename[:filename.rfind('/')]
    elif _isurl(base):
        if os.path.isabs(filename):
            fp = open(filename)
            base = os.path.dirname(filename)
        else:
            filename = base + '/' + filename
            downloaded_filename, is_temp = download(filename)
            fp = open(downloaded_filename)
            base = filename[:filename.rfind('/')]
    else:
        filename = os.path.join(base, filename)
        fp = open(filename)
        base = os.path.dirname(filename)
    downloaded.add(filename)

    # XXX: The following line is only to keep access to the overridden global.
    # If pushed upstream, proper naming would avoid such hack.
    # Meanwhile, the patch consists only in this drop-in class
    # and that's easier to maintain.
    _default_globals = staticmethod(_default_globals)

    if filename in seen:
        if is_temp:
            fp.close()
            os.remove(downloaded_filename)
        raise zc.buildout.UserError("Recursive file include", seen, filename)

    root_config_file = not seen
    seen.append(filename)

    filename_for_logging = filename
    if downloaded_filename:
        filename_for_logging = '%s (downloaded as %s)' % (
            filename, downloaded_filename)
    result = zc.buildout.configparser.parse(
        fp, filename_for_logging, _default_globals)
    fp.close()
    if is_temp:
        os.remove(downloaded_filename)

    options = result.get('buildout', {})
    extends = options.pop('extends', None)
    if 'extended-by' in options:
        raise zc.buildout.UserError(
            'No-longer supported "extended-by" option found in %s.' %
            filename)

    def __getitem__(self, key):
        cls = self.__class__
        try:
            context = self.context
        except AttributeError:
            context = self.context = cls._default_globals()
            context['sys'] = _sysproxy(self)
        try:
            return context[key]
        except KeyError as e:
            try:
                value = getattr(self, key)
            except AttributeError:
                pass
            else:
                value = context[key] = value()
                return value
            raise e
            # BBB: On Python 3, a bare 'raise' is enough.

    result = _annotate(result, filename)

    def multiarch(self):
        args = os.getenv('CC') or 'gcc', '-dumpmachine'
        self['__doing__'] = '%r', args
        m = subprocess.check_output(args, universal_newlines=True).rstrip()
        del self['__doing__']
        return m

    if root_config_file and 'buildout' in result:
        download_options = _update_section(
            download_options, result['buildout'])

class _sysproxy(object):
    # BBB: alternate/temporary way to get multiarch value

    if extends:
        extends = extends.split()
        eresult, user_defaults = _open(
            base, extends.pop(0), seen, download_options, override,
            downloaded, user_defaults)
        for fname in extends:
            next_extend, user_defaults = _open(
                base, fname, seen, download_options, override,
                downloaded, user_defaults)
            eresult = _update(eresult, next_extend)
        result = _update(eresult, result)
    else:
        if user_defaults:
            result = _update(user_defaults, result)
            user_defaults = {}

    seen.pop()
    return result, user_defaults

    def __init__(self, default_globals):
        self.__default_globals = default_globals

    def __getattr__(self, name):
        if name == '_multiarch':
            default_globals = self.__default_globals
            setattr(sys, name, getattr(default_globals, name[1:])())
            default_globals.context['sys'] = sys
        return getattr(sys, name)

variable_template_split = re.compile('([$]{[^}]*})').split

class _extends(object):

    def __new__(cls, defaults, *args):
        self = super(_extends, cls).__new__(cls)
        self.seen = set()
        self.processing = []
        self.extends = [defaults]
        self._download_options = []
        self.collect(*args)
        return self.merge()

    def merge(self):
        result = {}
        for d in self.extends:
            _update(result, d)
        return result

    def __getattr__(self, attr):
        if attr == 'download_options':
            # Compute processed options
            result_so_far = self.merge()
            self.extends[:] = [result_so_far]
            value = copy.deepcopy(result_so_far.get('buildout')) or {}
            # Update with currently-being-processed options
            for options in reversed(self._download_options):
                _update_section(value, options)
            value = _unannotate_section(value)
            setattr(self, attr, value)
            return value
        return self.__getattribute__(attr)

    def collect(self, result, base, filename):
        options = result.get('buildout', {})
        extends = options.pop('extends', '')
        # Sanitize buildout options
        if 'extended-by' in options:
            raise zc.buildout.UserError(
                'No-longer supported "extended-by" option found in %s.' %
                filename)
        # Find and expose _profile_base_location_
        for section in result.values():
            for value in section.values():
                if '${:_profile_base_location_}' in value:
                    section['_profile_base_location_'] = base
                    break
        _annotate(result, filename)
        # Collect extends and unprocessed download options
        self.processing.append(filename)
        self._download_options.append(options)
        for fextends in extends.split():
            self.open(base, fextends)
        self.extends.append(result)
        del self.processing[-1], self._download_options[-1]

    def open(self, base, filename):
        # Determine file location
        if _isurl(filename):
            download = True
        elif _isurl(base):
            download = True
            filename = urljoin(base + '/', filename)
        else:
            download = False
            filename = os.path.realpath(
                os.path.join(base, os.path.expanduser(filename)))
        # Detect repetitions and loops
        if filename in self.seen:
            if filename in self.processing:
                raise zc.buildout.UserError(
                    "circular extends: %s" % filename)
            return
        self.seen.add(filename)
        # Fetch file
        is_temp = False
        try:
            if download:
                download_options = self.download_options
                extends_cache = download_options.get('extends-cache')
                if extends_cache and variable_template_split(
                        extends_cache)[1::2]:
                    raise ValueError(
                        "extends-cache '%s' may not contain ${section:variable} to expand."
                        % extends_cache)
                downloaded_filename, is_temp = zc.buildout.download.Download(
                    download_options, cache=extends_cache,
                    fallback=bool_option(download_options, 'newest'),
                    hash_name=True)(filename)
                filename_for_logging = '%s (downloaded as %s)' % (
                    filename, downloaded_filename)
                base = filename[:filename.rfind('/')]
            else:
                downloaded_filename = filename_for_logging = filename
                base = os.path.dirname(filename)
            with open(downloaded_filename) as fp:
                result = zc.buildout.configparser.parse(
                    fp, filename_for_logging, _default_globals)
        finally:
            if is_temp:
                os.remove(downloaded_filename)
        return self.collect(result, base, filename)

ignore_directories = '.svn', 'CVS', '__pycache__', '.git'
...
...
@@ -1946,58 +2310,57 @@ def _dists_sig(dists):
            continue
        seen.add(dist)
        location = dist.location
        if dist.precedence == pkg_resources.DEVELOP_DIST:
        if (dist.precedence == pkg_resources.DEVELOP_DIST
            and location != zc.buildout.easy_install.python_lib
            and not isinstance(dist, pkg_resources.DistInfoDistribution)):
            result.append(dist.project_name + '-' + _dir_hash(location))
        else:
            result.append(os.path.basename(location))
            result.append(dist.project_name + '-' + dist.version)
    return result

def _update_section(in1, s2):
    s1 = copy.deepcopy(in1)
    # Base section 2 on section 1; section 1 is copied, with key-value pairs
    # in section 2 overriding those in section 1. If there are += or -=
    # operators in section 2, process these to add or subtract items (delimited
    # by newlines) from the preexisting values.
    s2 = copy.deepcopy(s2)  # avoid mutating the second argument, which is unexpected
    # Sort on key, then on the addition or subtraction operator (+ comes first)
    for k, v in sorted(s2.items(),
                       key=lambda x: (x[0].rstrip(' +'), x[0][-1])):
def _update_section(s1, s2):
    # Base section 2 on section 1; key-value pairs in s2 override those in s1.
    # If there are += or -= operators in s2, process these to add or subtract
    # items (delimited by newlines) from the preexisting values.
    # Sort on key, then on + and - operators, so that KEY < KEY + < KEY -, to
    # process them in this order if several are defined in the same section.
    # Section s1 is modified in place.
    keysort = lambda x: (x[0].rstrip(' +'), x[0].endswith('+'))
    for k, v in sorted(s2.items(), key=keysort):
        if k.endswith('+'):
            key = k.rstrip(' +')
            implicit_value = SectionKey("", "IMPLICIT_VALUE")
            # Find v1 in s2 first; it may have been defined locally too.
            section_key = s2.get(key, s1.get(key, implicit_value))
            section_key = copy.deepcopy(section_key)
            section_key.addToValue(v.value, v.source)
            s2[key] = section_key
            del s2[k]
            value = s1.get(key, SectionKey("", "IMPLICIT_VALUE"))
            value.addToValue(v.value, v.source)
            s1[key] = value
        elif k.endswith('-'):
            key = k.rstrip(' -')
            implicit_value = SectionKey("", "IMPLICIT_VALUE")
            # Find v1 in s2 first; it may have been set by a += operation first
            section_key = s2.get(key, s1.get(key, implicit_value))
            section_key = copy.deepcopy(section_key)
            section_key.removeFromValue(v.value, v.source)
            s2[key] = section_key
            del s2[k]
    _update_verbose(s1, s2)
    return s1

def _update_verbose(s1, s2):
    for key, v2 in s2.items():
        if key in s1:
            v1 = s1[key]
            v1.overrideValue(v2)
            value = s1.get(key, SectionKey("", "IMPLICIT_VALUE"))
            value.removeFromValue(v.value, v.source)
            s1[key] = value
        else:
            s1[key] = copy.deepcopy(v2)
        if k in s1:
            v1 = s1[k]
            v1.overrideValue(v)
        else:
            s1[k] = v
    return s1

def _update(in1, d2):
    d1 = copy.deepcopy(in1)
def _update(d1, d2):
    for section in d2:
        if section in d1:
            d1[section] = _update_section(d1[section], d2[section])
            _update_section(d1[section], d2[section])
        else:
            d1[section] = copy.deepcopy(d2[section])
            # XXX: In order to process += (and -=) correctly when
            # <key> = <value> and <key> += <value> are in the same section
            # _update_section should be called even when section is not in d1
            # Hack: When <= is used in the section, _update_section will be
            # called later anyway, so we can avoid calling it now which will
            # enable brittle and partial support for += (or -=) with keys that
            # come from the <= sections.
            # TODO: Either implement += and -= support with <= fully or call
            # _update_section systematically and give up <= compatibility.
            s2 = d2[section]
            d1[section] = s2 if '<' in s2 else _update_section({}, s2)
    return d1

def _recipe(options):
...
...
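The `_update_section` logic above merges a derived section over a base one, applying plain assignments before `+=` additions and `-=` removals. A much-simplified model with plain strings instead of annotated `SectionKey` objects (the `merge_section` and `rank` names are our own simplification; the real code also tracks annotation sources):

```python
def merge_section(s1, s2):
    # Apply plain assignments first, then '+=' additions, then '-=' removals,
    # mirroring the KEY < KEY + < KEY - processing order described above.
    rank = lambda k: 1 if k.endswith('+') else 2 if k.endswith('-') else 0
    for k, v in sorted(s2.items(),
                       key=lambda item: (item[0].rstrip(' +-'), rank(item[0]))):
        if k.endswith('+'):
            key = k.rstrip(' +')
            old = s1.get(key, '')
            # '+=' appends newline-delimited items to the preexisting value
            s1[key] = (old + '\n' + v) if old else v
        elif k.endswith('-'):
            key = k.rstrip(' -')
            removed = set(v.split('\n'))
            # '-=' filters newline-delimited items out of the value
            s1[key] = '\n'.join(
                line for line in s1.get(key, '').split('\n')
                if line not in removed)
        else:
            s1[k] = v
    return s1

print(merge_section({'eggs': 'foo\nbar'}, {'eggs +': 'baz'})['eggs'])
print(merge_section({'eggs': 'foo\nbar'}, {'eggs -': 'foo'})['eggs'])
```

This is what makes a profile that declares `eggs += baz` extend, rather than replace, the `eggs` list inherited through `extends`.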
@@ -2095,6 +2458,12 @@ Options:

  Print buildout version number and exit.

  --dry-run

     Dry-run mode. With this setting, buildout will display what will
     be uninstalled and what will be installed without doing anything
     in reality.

Assignments are of the form: section:option=value and are used to
provide configuration options that override those given in the
configuration file. For example, to run the buildout in offline mode,
...
...
@@ -2214,13 +2583,16 @@ def main(args=None):
                        _error("No timeout value specified for option",
                               orig_op)
                    except ValueError:
                        _error("Timeout value must be numeric", orig_op)
                elif orig_op == '--dry-run':
                    options.append(('buildout', 'dry-run', 'true'))
                elif op:
                    if orig_op == '--help':
                        _help()
                    elif orig_op == '--version':
                        _version()
                    else:
                        _error("Invalid option", '-' + op[0])
                    _error("Invalid option", orig_op)
            elif '=' in args[0]:
                option, value = args.pop(0).split('=', 1)
                option = option.split(':')
...
...
src/zc/buildout/configparser.py
...
...
@@ -113,10 +113,12 @@ section_header = re.compile(
    r'([#;].*)?$)').match

option_name_re = r'[^\s{}[\]=:]+'
option_start = re.compile(
    r'(?P<name>[^\s{}[\]=:]+\s*[-+]?)'
    r'(?P<name>%s\s*[-+]?)'
    r'='
    r'(?P<value>.*)$').match
    r'(?P<value>.*)$'
    % option_name_re).match

leading_blank_lines = re.compile(r"^(\s*\n)+")
...
...
@@ -201,7 +203,12 @@ def parse(fp, fpname, exp_globals=dict):
if not context:
context = exp_globals()
# evaluated expression is in list: get first element
section_condition = eval(expr, context)[0]
try:
section_condition = eval(expr, context)[0]
except NameError as x:
sections.setdefault(sectname, {})[
'__unsupported_conditional_expression__'] = x
continue
# finally, ignore section when an expression
# evaluates to false
if not section_condition:
...
...
@@ -255,6 +262,8 @@ def parse(fp, fpname, exp_globals=dict):
section = sections[sectname]
for name in section:
value = section[name]
if isinstance(value, NameError):
continue
if value[:1].isspace():
section[name] = leading_blank_lines.sub(
'', textwrap.dedent(value.rstrip()))
...
...
src/zc/buildout/download.py
...
...
@@ -20,44 +20,47 @@ except ImportError:
try:
    # Python 3
    from urllib.request import urlretrieve
    from urllib.parse import urlparse
    from urllib.error import HTTPError
    from urllib.request import Request, urlopen
    from urllib.parse import urlparse, urlunparse
except ImportError:
    # Python 2
    import base64
    from urlparse import urlparse
    from urlparse import urlunparse
    import urllib2

    def urlretrieve(url, tmp_path):
        """Work around Python issue 24599 including basic auth support
        """
        scheme, netloc, path, params, query, frag = urlparse(url)
        auth, host = urllib2.splituser(netloc)
        if auth:
            url = urlunparse((scheme, host, path, params, query, frag))
            req = urllib2.Request(url)
            base64string = base64.encodestring(auth)[:-1]
            basic = "Basic " + base64string
            req.add_header("Authorization", basic)
        else:
            req = urllib2.Request(url)
        url_obj = urllib2.urlopen(req)
        with open(tmp_path, 'wb') as fp:
            fp.write(url_obj.read())
        return tmp_path, url_obj.info()
    from urllib2 import HTTPError, Request, urlopen

from zc.buildout.easy_install import realpath
from base64 import b64encode
from contextlib import closing
import errno
import logging
import netrc
import os
import os.path
import re
import shutil
import sys
import tempfile
import zc.buildout
from . import bytes2str, str2bytes
from .rmtree import rmtree

class netrc(netrc.netrc):

    def __init__(*args):
        pass

    def authenticators(self, host):
        self.__class__, = self.__class__.__bases__
        try:
            self.__init__()
        except IOError as e:
            if e.errno != errno.ENOENT:
                raise
            self.__init__(os.devnull)
        return self.authenticators(host)

netrc = netrc()

class ChecksumError(zc.buildout.UserError):
    pass
...
...
@@ -74,7 +77,8 @@ class Download(object):
cache: path to the download cache (excluding namespaces)
namespace: namespace directory to use inside the cache
offline: whether to operate in offline mode
fallback: whether to use the cache as a fallback (try downloading first)
fallback: whether to use the cache as a fallback (try downloading first),
when an MD5 checksum is not given
hash_name: whether to use a hash of the URL as cache file name
logger: an optional logger to receive download-related log messages
...
...
@@ -107,7 +111,8 @@ class Download(object):
        if self.download_cache is not None:
            return os.path.join(self.download_cache, self.namespace or '')

    def __call__(self, url, md5sum=None, path=None):
    @property
    def __call__(self):
        """Download a file according to the utility's configuration.

        url: URL to download
...
...
@@ -117,19 +122,14 @@ class Download(object):
         Returns the path to the downloaded file.
         """
-        if self.cache:
-            local_path, is_temp = self.download_cached(url, md5sum)
-        else:
-            local_path, is_temp = self.download(url, md5sum, path)
-
-        return locate_at(local_path, path), is_temp
+        return self.download_cached if self.cache else self.download
-    def download_cached(self, url, md5sum=None):
+    def download_cached(self, url, md5sum=None, path=None, alternate_url=None):
         """Download a file from a URL using the cache.

-        This method assumes that the cache has been configured. Optionally, it
-        raises a ChecksumError if a cached copy of a file has an MD5 mismatch,
-        but will not remove the copy in that case.
+        This method assumes that the cache has been configured.
+        If a cached copy of a file has an MD5 mismatch, download
+        and update the cache on success.
         """
         if not os.path.exists(self.download_cache):
...
...
@@ -146,28 +146,45 @@ class Download(object):
         self.logger.debug('Searching cache at %s' % cache_dir)
         if os.path.exists(cached_path):
-            is_temp = False
-            if self.fallback:
-                try:
-                    _, is_temp = self.download(url, md5sum, cached_path)
-                except ChecksumError:
-                    raise
-                except Exception:
-                    pass
-
-            if not check_md5sum(cached_path, md5sum):
-                raise ChecksumError(
-                    'MD5 checksum mismatch for cached download '
-                    'from %r at %r' % (url, cached_path))
-            self.logger.debug('Using cache file %s' % cached_path)
+            if check_md5sum(cached_path, md5sum):
+                if md5sum or not self.fallback:
+                    self.logger.debug('Using cache file %s', cached_path)
+                    return locate_at(cached_path, path), False
+            else:
+                self.logger.warning(
+                    'MD5 checksum mismatch for cached download from %r at %r',
+                    url, cached_path)
+            # Don't download directly to cached_path to minimize
+            # the probability to alter old data if download fails.
+            try:
+                path, is_temp = self.download(url, md5sum, path, alternate_url)
+            except ChecksumError:
+                raise
+            except Exception:
+                if md5sum:
+                    raise
+                self.logger.debug("Fallback to cache using %s",
+                                  cached_path, exception=1)
+            else:
+                samefile = getattr(os.path, 'samefile', None)
+                if not (samefile and samefile(path, cached_path)):
+                    # update cache
+                    try:
+                        os.remove(cached_path)
+                    except OSError as e:
+                        if e.errno != errno.EISDIR:
+                            raise
+                        rmtree(cached_path)
+                    locate_at(path, cached_path)
+                return path, is_temp
         else:
             self.logger.debug('Cache miss; will cache %s as %s' %
                               (url, cached_path))
-            _, is_temp = self.download(url, md5sum, cached_path)
-
-        return cached_path, is_temp
+            self.download(url, md5sum, cached_path, alternate_url)
+        return locate_at(cached_path, path), False
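The cache-hit branch above reduces to one small predicate: serve the cached copy immediately when its checksum matches, unless fallback mode is on and no md5sum was pinned (then a fresh download is attempted first). A minimal sketch of that decision, with a hypothetical helper name:

```python
def use_cache_directly(md5sum_matches, md5sum, fallback):
    # Sketch of the "if check_md5sum(...): if md5sum or not self.fallback:"
    # test in download_cached above. md5sum is a hex string or None.
    return bool(md5sum_matches and (md5sum or not fallback))
```

Note that the original tests the truthiness of `md5sum`, so an empty string behaves like "no checksum given".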
-    def download(self, url, md5sum=None, path=None):
+    def download(self, url, md5sum=None, path=None, alternate_url=None):
         """Download a file from a URL to a given or temporary path.

         An online resource is always downloaded to a temporary file and moved
...
...
@@ -196,27 +213,38 @@ class Download(object):
                 "Couldn't download %r in offline mode." % url)

         self.logger.info('Downloading %s' % url)
-        handle, tmp_path = tempfile.mkstemp(prefix='buildout-')
-        os.close(handle)
+        download_url = url
+        tmp_path = path
+        cleanup = True
         try:
-            tmp_path, headers = urlretrieve(url, tmp_path)
-            if not check_md5sum(tmp_path, md5sum):
-                raise ChecksumError(
-                    'MD5 checksum mismatch downloading %r' % url)
-        except IOError:
-            e = sys.exc_info()[1]
-            os.remove(tmp_path)
-            raise zc.buildout.UserError("Error downloading extends for URL "
-                                        "%s: %s" % (url, e))
-        except Exception:
-            os.remove(tmp_path)
-            raise
-
-        if path:
-            shutil.move(tmp_path, path)
-            return path, False
-        else:
-            return tmp_path, True
+            if not path:
+                handle, tmp_path = tempfile.mkstemp(prefix='buildout-')
+                os.close(handle)
+            self._download(url, tmp_path, md5sum, alternate_url)
+            cleanup = False
+        finally:
+            if cleanup and tmp_path:
+                remove(tmp_path)
+        return tmp_path, not path
+
+    def _download(self, url, path, md5sum=None, alternate_url=None):
+        download_url = url
+        try:
+            try:
+                self.urlretrieve(url, path)
+            except HTTPError:
+                if not alternate_url:
+                    raise
+                self.logger.info('using alternate URL: %s', alternate_url)
+                download_url = alternate_url
+                self.urlretrieve(alternate_url, path)
+            if not check_md5sum(path, md5sum):
+                raise ChecksumError(
+                    'MD5 checksum mismatch downloading %r' % download_url)
+        except IOError as e:
+            raise zc.buildout.UserError(
+                "Error downloading %s: %s" % (download_url, e))
    def filename(self, url):
        """Determine a file name from a URL according to the configuration.
...
...
@@ -245,6 +273,60 @@ class Download(object):
         url_host, url_port = parsed[-2:]
         return '%s:%s' % (url_host, url_port)

+    def _auth(self, url):
+        parsed_url = urlparse(url)
+        if parsed_url.scheme in ('http', 'https'):
+            auth_host = parsed_url.netloc.rsplit('@', 1)
+            if len(auth_host) > 1:
+                return (auth_host[0],
+                        parsed_url._replace(netloc=auth_host[1]).geturl())
+            auth = netrc.authenticators(parsed_url.hostname)
+            if auth:
+                return '{0}:{2}'.format(*auth), url
+
+    def urlretrieve(self, url, tmp_path):
+        auth = self._auth(url)
+        if auth:
+            req = Request(auth[1])
+            req.add_header(
+                "Authorization",
+                "Basic " + bytes2str(b64encode(str2bytes(auth[0]))))
+        else:
+            req = url
+        with closing(urlopen(req)) as src:
+            with open(tmp_path, 'wb') as dst:
+                shutil.copyfileobj(src, dst)
+        return tmp_path, src.info()
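The `urlretrieve` method above builds an HTTP Basic Authorization header from a `user:password` string (note that `'{0}:{2}'.format(*auth)` picks the login and password out of the 3-tuple returned by `netrc.authenticators`, skipping the account field). A standalone sketch of that header construction, with a hypothetical function name:

```python
from base64 import b64encode

def basic_auth_header(userpass):
    # "user:password" -> the value sent in the Authorization header,
    # as done with b64encode in Download.urlretrieve above.
    return "Basic " + b64encode(userpass.encode()).decode()
```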
+class Download(Download):
+
+    def _download(self, url, path, md5sum=None, alternate_url=None):
+        from .buildout import networkcache_client as nc
+        while nc:  # not a loop
+            if self._auth(url):
+                # do not cache restricted data
+                nc = None
+                break
+            key = 'file-urlmd5:' + md5(url.encode()).hexdigest()
+            if not nc.tryDownload(key):
+                break
+            with nc:
+                entry = next(nc.select(key, {'url': url}), None)
+                if entry is None:
+                    err = 'no matching entry'
+                else:
+                    with closing(nc.download(entry['sha512'])) as src, \
+                         open(path, 'wb') as dst:
+                        shutil.copyfileobj(src, dst)
+                    if check_md5sum(path, md5sum):
+                        return
+                    err = 'MD5 checksum mismatch'
+            self.logger.info('Cannot download from network cache: %s', err)
+            break
+        super(Download, self)._download(url, path, md5sum, alternate_url)
+        if nc and nc.tryUpload(key):
+            with nc, open(path, 'rb') as f:
+                nc.upload(f, key, url=url)
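The `while nc:  # not a loop` construct above is a forward-goto idiom: the body runs at most once, and each `break` jumps straight to the code after the "loop" (here, the fallback to a plain download). A minimal illustration with a hypothetical dict-backed cache:

```python
def fetch(cache, key):
    # "while ...: ... break" used as a one-shot forward goto, as in
    # the networkcache _download above; cache is a plain dict or None.
    while cache is not None:  # not a loop
        if key not in cache:
            break  # jump past the cache attempt
        return cache[key]
    return 'fallback'
```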
def check_md5sum(path, md5sum):
    """Tell whether the MD5 checksum of the file at path matches.
...
...
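The body of `check_md5sum` is collapsed in this view. A plausible reading of its contract, sketched independently (the chunked read size is an assumption): a missing checksum accepts any content, otherwise the hex digests must match.

```python
import hashlib

def check_md5sum(path, md5sum):
    # Sketch of the helper above: md5sum=None means "accept anything".
    if md5sum is None:
        return True
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1 << 16), b''):
            h.update(chunk)
    return h.hexdigest() == md5sum
```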
src/zc/buildout/easy_install.py
...
...
@@ -18,8 +18,10 @@ It doesn't install scripts. It uses setuptools and requires it to be
 installed.
 """

+import atexit
 import copy
 import distutils.errors
+import distutils.sysconfig
 import errno
 import glob
 import logging
...
...
@@ -33,6 +35,7 @@ import setuptools.command.easy_install
 import setuptools.command.setopt
 import setuptools.package_index
 import shutil
+import stat
 import subprocess
 import sys
 import tempfile
...
...
@@ -41,6 +44,8 @@ import zc.buildout.rmtree
 from zc.buildout import WINDOWS
 from zc.buildout import PY3
 import warnings
+from contextlib import closing
+from setuptools.package_index import distros_for_location, URL_SCHEME
 import csv

 try:
...
...
@@ -56,6 +61,8 @@ except ImportError:
 BIN_SCRIPTS = 'Scripts' if WINDOWS else 'bin'

+WHL_DIST = pkg_resources.EGG_DIST + 1
+warnings.filterwarnings(
+    'ignore', '.+is being parsed as a legacy, non PEP 440, version')
...
...
@@ -77,6 +84,9 @@ is_source_encoding_line = re.compile(r'coding[:=]\s*([-\w.]+)').search
 is_win32 = sys.platform == 'win32'
 is_jython = sys.platform.startswith('java')

+PATCH_MARKER = 'SlapOSPatched'
+orig_versions_re = re.compile(r'[+\-]%s\d+' % PATCH_MARKER)
+
 if is_jython:
     import java.lang.System
     jython_os_name = (java.lang.System.getProperties()['os.name']).lower()
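`orig_versions_re` above strips the `+SlapOSPatchedNNN` local-version tag so that a patched requirement can be resolved against upstream releases. A quick demonstration of the regex as defined in the hunk:

```python
import re

PATCH_MARKER = 'SlapOSPatched'
orig_versions_re = re.compile(r'[+\-]%s\d+' % PATCH_MARKER)

def unpatched_version(version):
    # hypothetical helper: recover the upstream version string
    return orig_versions_re.sub('', version)
```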
...
...
@@ -91,17 +101,24 @@ if has_distribute and not has_setuptools:
     sys.exit("zc.buildout 3 needs setuptools, not distribute."
              "Did you properly install with pip in a virtualenv ?")

-# Include buildout and setuptools eggs in paths. We get this
-# initially from the entire working set. Later, we'll use the install
-# function to narrow to just the buildout and setuptools paths.
-buildout_and_setuptools_path = sorted({d.location
-                                       for d in pkg_resources.working_set})
-setuptools_path = buildout_and_setuptools_path
-pip_path = buildout_and_setuptools_path
-logger.debug('before restricting versions: pip_path %r', pip_path)
+# XXX Take care to respect the sys.path order, as otherwise other
+# distributions for pip, wheel and setuptools may take precedence
+# over the ones currently running.
+pip_path = setuptools_path = [
+    dist.location for dist in pkg_resources.working_set
+    if dist.project_name in ('pip', 'wheel', 'setuptools')
+]
 pip_pythonpath = setuptools_pythonpath = os.pathsep.join(pip_path)
+python_lib = distutils.sysconfig.get_python_lib()

 FILE_SCHEME = re.compile('file://', re.I).match
 DUNDER_FILE_PATTERN = re.compile(r"__file__ = '(?P<filename>.+)'$")
+networkcache_key = 'pypi:{}={}'.format


 class _Monkey(object):
     def __init__(self, module, **kw):
         mdict = self._mdict = module.__dict__
...
...
@@ -140,7 +157,7 @@ class AllowHostsPackageIndex(setuptools.package_index.PackageIndex):
 _indexes = {}
 def _get_index(index_url, find_links, allow_hosts=('*',)):
-    key = index_url, tuple(find_links)
+    key = index_url, tuple(find_links), allow_hosts
     index = _indexes.get(key)
     if index is not None:
         return index
...
...
@@ -165,7 +182,12 @@ if is_win32:
     def _safe_arg(arg):
         return '"%s"' % arg
 else:
-    _safe_arg = str
+    def _safe_arg(arg):
+        if len(arg) < 126:
+            return arg
+        else:
+            # Workaround for the shebang line length limitation.
+            return '/bin/sh\n"exec" "%s" "$0" "$@"' % arg

 def call_subprocess(args, **kw):
     if subprocess.call(args, **kw) != 0:
...
...
@@ -229,6 +251,10 @@ def dist_needs_pkg_resources(dist):
     )

+_doing_list = type('', (), {'__mod__': staticmethod(lambda x: '\n'.join(*x))})()
+
 class Installer(object):

     _versions = {}
...
...
@@ -241,6 +267,7 @@ class Installer(object):
     _allow_picked_versions = True
     _store_required_by = False
     _allow_unknown_extras = False
+    _extra_paths = []

     def __init__(self, dest=None,
...
...
@@ -275,7 +302,7 @@ class Installer(object):
             links.insert(0, self._download_cache)
         self._index_url = index
-        path = (path and path[:] or []) + buildout_and_setuptools_path
+        path = (path and path[:] or []) + self._extra_paths
         self._path = path
         if self._dest is None:
             newest = False
...
...
@@ -289,36 +316,15 @@ class Installer(object):
         self._versions = normalize_versions(versions)

     def _make_env(self):
-        full_path = self._get_dest_dist_paths() + self._path
-        env = pkg_resources.Environment(full_path)
-        # this needs to be called whenever self._env is modified (or we could
-        # make an Environment subclass):
-        self._eggify_env_dest_dists(env, self._dest)
-        return env
+        dest = self._dest
+        full_path = [] if dest is None else [dest]
+        full_path.extend(self._path)
+        return pkg_resources.Environment(full_path)

     def _env_rescan_dest(self):
-        self._env.scan(self._get_dest_dist_paths())
-        self._eggify_env_dest_dists(self._env, self._dest)
-
-    def _get_dest_dist_paths(self):
-        dest = self._dest
-        if dest is None:
-            return []
-        eggs = glob.glob(os.path.join(dest, '*.egg'))
-        dists = [os.path.dirname(dist_info) for dist_info in
-                 glob.glob(os.path.join(dest, '*', '*.dist-info'))]
-        return list(set(eggs + dists))
-
-    @staticmethod
-    def _eggify_env_dest_dists(env, dest):
-        """
-        Make sure everything found under `dest` is seen as an egg, even if it's
-        some other kind of dist.
-        """
-        for project_name in env:
-            for dist in env[project_name]:
-                if os.path.dirname(dist.location) == dest:
-                    dist.precedence = pkg_resources.EGG_DIST
+        dest = self._dest
+        if dest is not None:
+            self._env.scan([dest])

     def _version_conflict_information(self, name):
         """Return textual requirements/constraint information for debug purposes
...
...
@@ -423,11 +429,11 @@ class Installer(object):
                 str(req))
         return best_we_have, None

-    def _call_pip_install(self, spec, dest, dist):
+    def _call_pip_wheel(self, spec, dest, dist):
         tmp = tempfile.mkdtemp(dir=dest)
         try:
-            paths = call_pip_install(spec, tmp)
+            paths = call_pip_wheel(spec, tmp, self)

             dists = []
             env = pkg_resources.Environment(paths)
...
...
@@ -459,30 +465,78 @@ class Installer(object):
             result = []
             for d in dists:
-                result.append(_move_to_eggs_dir_and_compile(d, dest))
+                result.append(_move_to_eggs_dir_and_compile(d, dest, self))
             return result
         finally:
             zc.buildout.rmtree.rmtree(tmp)

-    def _obtain(self, requirement, source=None):
-        # initialize out index for this project:
+    def _obtain(self, requirement, source=None, networkcache_failed=False):
+        # get the non-patched version
+        req = str(requirement)
+        if PATCH_MARKER in req:
+            requirement = pkg_resources.Requirement.parse(
+                re.sub(orig_versions_re, '', req))
+
+        wheel = getattr(requirement, 'wheel', False)
+        def filter_precedence(dist):
+            return (dist.precedence == WHL_DIST) == wheel and (
+                dist.precedence == pkg_resources.SOURCE_DIST if source else
+                not (dist.precedence == pkg_resources.DEVELOP_DIST
+                     and {'setup.py', 'pyproject.toml'}.isdisjoint(
+                         os.listdir(dist.location)))
+            )
+        index = self._index
+        if not networkcache_failed:
+            try:
+                (operator, version,), = requirement.specs
+            except ValueError:
+                pass
+            else:
+                # Network cache is not expected to contain all versions so it
+                # couldn't tell whether a found version is the best existing
+                # one. Therefore, it's only accessed when we have a
+                # specification for a single version, which is anyway enough
+                # for our usage (picked versions not allowed).
+                if operator == '==':
+                    # But first, avoid any network access by checking local
+                    # urls. PackageIndex.add_find_links scans them immediately.
+                    dists = [dist for dist in index[requirement.project_name]
+                             if dist in requirement and filter_precedence(dist)
+                             and (FILE_SCHEME(dist.location) or
+                                  not URL_SCHEME(dist.location))]
+                    if dists:
+                        return max(dists)
+                    from .buildout import networkcache_client as nc
+                    if nc:
+                        key = networkcache_key(requirement.key, version)
+                        if nc.tryDownload(key):
+                            with nc:
+                                for entry in nc.select(key):
+                                    basename = entry['basename']
+                                    for dist in distros_for_location(
+                                            entry['sha512'], basename):
+                                        # The version comparison is to keep
+                                        # the one that's correctly parsed by
+                                        # distros_for_location.
+                                        if (dist.version == version
+                                                and self._env.can_add(dist)
+                                                and filter_precedence(dist)):
+                                            dist.networkcache = (
+                                                basename, requirement, source)
+                                            dists.append(dist)
+                            if dists:
+                                return max(dists)
+
+        # initialize out index for this project:
         if index.obtain(requirement) is None:
             # Nothing is available.
             return None

         # Filter the available dists for the requirement and source flag
         dists = [dist for dist in index[requirement.project_name]
-                 if ((dist in requirement)
-                     and
-                     ((not source) or
-                      (dist.precedence == pkg_resources.SOURCE_DIST)
-                      )
-                     )
-                 ]
+                 if dist in requirement and filter_precedence(dist)]

         # If we prefer final dists, filter for final and use the
         # result if it is non empty.
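Two small building blocks of `_obtain` above can be isolated: the cache key layout, and the "exactly one `==` pin" check done by destructuring `requirement.specs`. A sketch with a hypothetical helper name:

```python
networkcache_key = 'pypi:{}={}'.format  # key layout used above

def single_pinned_version(specs):
    # Mirrors the "(operator, version,), = requirement.specs" unpacking:
    # the network cache is consulted only for a single '==' constraint.
    try:
        (operator, version), = specs
    except ValueError:
        return None
    return version if operator == '==' else None
```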
...
...
@@ -519,18 +573,41 @@ class Installer(object):
                 ):
                 return dist
-        best.sort()
-        return best[-1]
+        return max(best)

     def _fetch(self, dist, tmp, download_cache):
-        if (download_cache
-            and (realpath(os.path.dirname(dist.location)) == download_cache)
-            ):
-            logger.debug("Download cache has %s at: %s", dist, dist.location)
-            return dist
+        from .buildout import networkcache_client as nc
+        while hasattr(dist, 'networkcache'):
+            basename, requirement, source = dist.networkcache
+            new_location = os.path.join(tmp, basename)
+            with nc, closing(nc.download(dist.location)) as src, \
+                 open(new_location, 'wb') as dst:
+                shutil.copyfileobj(src, dst)
+            break
+            # Downloading content from network cache failed: let's resume index
+            # lookup to get a fallback url. This will respect _satisfied()
+            # decision because the specification is for a single version.
+            dist = self._obtain(requirement, source, networkcache_failed=True)
+            if dist is None:
+                raise zc.buildout.UserError(
+                    "Couldn't find a distribution for %r." % str(requirement))
+        else:
+            if (download_cache
+                and (realpath(os.path.dirname(dist.location)) == download_cache)
+                ):
+                logger.debug("Download cache has %s at: %s",
+                             dist, dist.location)
+                return dist
+            logger.debug("Fetching %s from: %s", dist, dist.location)
+            new_location = self._index.download(dist.location, tmp)
+            if nc:
+                key = networkcache_key(dist.key, dist.version)
+                if nc.tryUpload(key):
+                    with nc, open(new_location, 'rb') as f:
+                        nc.upload(f, key,
+                                  basename=os.path.basename(new_location))

-        logger.debug("Fetching %s from: %s", dist, dist.location)
-        new_location = self._index.download(dist.location, tmp)
         if (download_cache
             and (realpath(new_location) == realpath(dist.location))
             and os.path.isfile(new_location)
...
@@ -578,7 +655,7 @@ class Installer(object):
             raise zc.buildout.UserError(
                 "Couldn't download distribution %s." % avail)

-        dists = [_move_to_eggs_dir_and_compile(dist, self._dest)]
+        dists = [_move_to_eggs_dir_and_compile(dist, self._dest, self)]
         for _d in dists:
             if _d not in ws:
                 ws.add(_d, replace=True)
)
...
...
@@ -660,6 +737,9 @@ class Installer(object):
         """Return requirement with optional [versions] constraint added."""
         constraint = self._versions.get(requirement.project_name.lower())
         if constraint:
+            wheel = constraint.endswith(':whl')
+            if wheel:
+                constraint = constraint[:-4]
             try:
                 requirement = _constrained_requirement(constraint,
                                                        requirement)
...
...
@@ -667,12 +747,15 @@ class Installer(object):
                 logger.info(self._version_conflict_information(
                     requirement.project_name.lower()))
                 raise
+            if wheel:
+                requirement.wheel = True
         return requirement

-    def install(self, specs, working_set=None):
+    def install(self, specs, working_set=None, patch_dict=None):

         logger.debug('Installing %s.', repr(specs)[1:-1])
+        __doing__ = _doing_list, self._requirements_and_constraints
         self._requirements_and_constraints.append(
             "Base installation request: %s" % repr(specs)[1:-1])
...
...
@@ -693,6 +776,9 @@ class Installer(object):
             ws = working_set

         for requirement in requirements:
+            if patch_dict and requirement.project_name in patch_dict:
+                self._env.scan(
+                    self.build(str(requirement), {}, patch_dict=patch_dict))
             for dist in self._get_dist(requirement, ws):
                 self._maybe_add_setuptools(ws, dist)
...
...
@@ -706,10 +792,6 @@ class Installer(object):
         requirements.reverse()  # Set up the stack.
         processed = {}  # This is a set of processed requirements.
         best = {}  # This is a mapping of package name -> dist.
-        # Note that we don't use the existing environment, because we want
-        # to look for new eggs unless what we have is the best that
-        # matches the requirement.
-        env = pkg_resources.Environment(ws.entries)

         while requirements:
             # Process dependencies breadth-first.
...
...
@@ -721,7 +803,15 @@ class Installer(object):
             dist = best.get(req.key)
             if dist is None:
                 try:
-                    dist = env.best_match(req, ws)
+                    # Note that we first attempt to find an already active dist
+                    # in the working set. This will detect version conflicts.
+                    # XXX We expressly avoid activating dists in the entries of
+                    # the current working set: they might not reflect the order
+                    # of the environment. This is not so bad when the versions
+                    # are pinned, but when calling install(['zc.buildout']), it
+                    # can come up with completely different dists than the ones
+                    # currently running.
+                    dist = ws.find(req)
                 except pkg_resources.VersionConflict as err:
                     logger.debug(
                         "Version conflict while processing requirement %s "
...
@@ -741,6 +831,9 @@ class Installer(object):
                 else:
                     logger.debug('Adding required %r', str(req))
                     self._log_requirement(ws, req)
+                    if patch_dict and req.project_name in patch_dict:
+                        self._env.scan(
+                            self.build(str(req), {}, patch_dict=patch_dict))
                     for dist in self._get_dist(req, ws):
                         self._maybe_add_setuptools(ws, dist)
                         if dist not in req:
...
@@ -787,7 +880,7 @@ class Installer(object):
             processed[req] = True
         return ws

-    def build(self, spec, build_ext):
+    def build(self, spec, build_ext, patch_dict=None):

         requirement = self._constrain(pkg_resources.Requirement.parse(spec))
...
...
@@ -838,14 +931,33 @@ class Installer(object):
             )
             base = os.path.dirname(setups[0])
+            setup_cfg_dict = {'build_ext': build_ext}
+            patch_dict = (patch_dict or {}).get(re.sub('[<>=].*', '', spec))
+            if patch_dict:
+                setup_cfg_dict.update(
+                    {'egg_info': {'tag_build': '+%s%03d' % (
+                        PATCH_MARKER, patch_dict['patch_revision'])}})
+                for i, patch in enumerate(patch_dict['patches']):
+                    url, md5sum = (patch.strip().split('#', 1) + [''])[:2]
+                    download = zc.buildout.download.Download()
+                    path, is_temp = download(
+                        url, md5sum=md5sum or None,
+                        path=os.path.join(tmp, 'patch.%s' % i))
+                    args = ([patch_dict['patch_binary']]
+                            + patch_dict['patch_options'])
+                    kwargs = {'cwd': base, 'stdin': open(path)}
+                    popen = subprocess.Popen(args, **kwargs)
+                    popen.communicate()
+                    if popen.returncode != 0:
+                        raise subprocess.CalledProcessError(
+                            popen.returncode, ' '.join(args))
             setup_cfg = os.path.join(base, 'setup.cfg')
             if not os.path.exists(setup_cfg):
                 f = open(setup_cfg, 'w')
                 f.close()
             setuptools.command.setopt.edit_config(
-                setup_cfg, dict(build_ext=build_ext))
-            dists = self._call_pip_install(base, self._dest, dist)
+                setup_cfg, setup_cfg_dict)
+            dists = self._call_pip_wheel(base, self._dest, dist)
             return [dist.location for dist in dists]
         finally:
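Each entry in `patch_dict['patches']` above is a URL with an optional `#md5sum` suffix, split with the `(patch.strip().split('#', 1) + [''])[:2]` idiom. A standalone sketch, with a hypothetical helper name:

```python
def split_patch_spec(patch):
    # "url#md5sum" -> (url, md5sum); the checksum defaults to ''
    # when no '#' is present, exactly as in build() above.
    url, md5sum = (patch.strip().split('#', 1) + [''])[:2]
    return url, md5sum
```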
...
...
@@ -946,6 +1058,12 @@ def get_picked_versions():
     required_by = Installer._required_by
     return (picked_versions, required_by)

+def extra_paths(setting=None):
+    old = Installer._extra_paths
+    if setting is not None:
+        Installer._extra_paths = setting
+    return old
+
 def install(specs, dest,
             links=(), index=None,
...
...
@@ -957,6 +1075,7 @@ def install(specs, dest,
             allowed_eggs_from_site_packages=None,
             check_picked=True,
             allow_unknown_extras=False,
+            patch_dict=None,
             ):
     assert executable == sys.executable, (executable, sys.executable)
     assert include_site_packages is None
...
...
@@ -968,31 +1087,19 @@ def install(specs, dest,
                          allow_hosts=allow_hosts,
                          check_picked=check_picked,
                          allow_unknown_extras=allow_unknown_extras)
-    return installer.install(specs, working_set)
-
-
-buildout_and_setuptools_dists = list(install(['zc.buildout'], None,
-                                             check_picked=False))
-buildout_and_setuptools_path = sorted({d.location
-                                       for d in buildout_and_setuptools_dists})
-pip_dists = [d for d in buildout_and_setuptools_dists
-             if d.project_name != 'zc.buildout']
-pip_path = sorted({d.location for d in pip_dists})
-logger.debug('after restricting versions: pip_path %r', pip_path)
-pip_pythonpath = os.pathsep.join(pip_path)
-setuptools_path = pip_path
-setuptools_pythonpath = pip_pythonpath
+    return installer.install(specs, working_set, patch_dict=patch_dict)


 def build(spec, dest, build_ext,
           links=(), index=None,
           executable=sys.executable,
-          path=None, newest=True, versions=None, allow_hosts=('*',)):
+          path=None, newest=True, versions=None, allow_hosts=('*',),
+          patch_dict=None):
     assert executable == sys.executable, (executable, sys.executable)
     installer = Installer(dest, links, index, executable,
                           True, path, newest,
                           versions, allow_hosts=allow_hosts)
-    return installer.build(spec, build_ext)
+    return installer.build(spec, build_ext, patch_dict=patch_dict)
def _rm(*paths):
...
...
@@ -1097,9 +1204,15 @@ def develop(setup, dest,
     undo.append(lambda: os.remove(tsetup))
     undo.append(lambda: os.close(fd))

+    extra_path = os.environ.get('PYTHONEXTRAPATH')
+    extra_path_list = []
+    if extra_path:
+        extra_path_list = extra_path.split(os.pathsep)
+
     os.write(fd, (runsetup_template % dict(
         setupdir=directory,
         setup=setup,
+        path_list=extra_path_list,
         __file__=setup,
         )).encode())
...
...
@@ -1159,6 +1272,10 @@ def scripts(reqs, working_set, executable, dest=None,
         if p not in unique_path:
             unique_path.append(p)
     path = [realpath(p) for p in unique_path]
+    try:
+        path.remove(python_lib)
+    except ValueError:
+        pass

     generated = []
...
...
@@ -1176,14 +1293,16 @@ def scripts(reqs, working_set, executable, dest=None,
         req = pkg_resources.Requirement.parse(req)
+        if req.marker and not req.marker.evaluate():
+            continue
+        has_extras = set(req.extras).issuperset
         dist = working_set.find(req)

         # regular console_scripts entry points
         for name in pkg_resources.get_entry_map(dist, 'console_scripts'):
             entry_point = dist.get_entry_info('console_scripts', name)
-            entry_points.append(
-                (name, entry_point.module_name,
-                 '.'.join(entry_point.attrs))
-                )
+            if has_extras(entry_point.extras):
+                entry_points.append(
+                    (name, entry_point.module_name,
+                     '.'.join(entry_point.attrs))
+                    )

         # The metadata on "old-style" distutils scripts is not retained by
         # distutils/setuptools, except by placing the original scripts in
         # /EGG-INFO/scripts/.
...
...
@@ -1329,6 +1448,12 @@ join = os.path.join
 base = os.path.dirname(os.path.abspath(os.path.realpath(__file__)))
 """

+def _initialization(path, initialization):
+    return """sys.path[0:0] = [
+  %s,
+  ]
+""" % path + initialization if path else initialization
+
 def _script(module_name, attrs, path, dest, arguments, initialization, rsetup):
     if is_win32:
         dest += '-script.py'
...
...
@@ -1337,11 +1462,10 @@ def _script(module_name, attrs, path, dest, arguments, initialization, rsetup):
     contents = script_template % dict(
         python=python,
-        path=path,
         module_name=module_name,
         attrs=attrs,
         arguments=arguments,
-        initialization=initialization,
+        initialization=_initialization(path, initialization),
         relative_paths_setup=rsetup,
         )
     return _create_script(contents, dest)
...
...
@@ -1374,8 +1498,7 @@ def _distutils_script(path, dest, script_content, initialization, rsetup):
     contents = distutils_script_template % dict(
         python=python,
-        path=path,
-        initialization=initialization,
+        initialization=_initialization(path, initialization),
         relative_paths_setup=rsetup,
         before=before,
         after=after
...
...
@@ -1443,9 +1566,6 @@ script_template = script_header + '''\
 %(relative_paths_setup)s
 import sys
-sys.path[0:0] = [
-  %(path)s,
-  ]
 %(initialization)s
 import %(module_name)s
...
...
@@ -1457,9 +1577,6 @@ distutils_script_template = script_header + '''
 %(before)s
 %(relative_paths_setup)s
 import sys
-sys.path[0:0] = [
-  %(path)s,
-  ]
 %(initialization)s
 %(after)s'''
...
...
@@ -1472,14 +1589,12 @@ def _pyscript(path, dest, rsetup, initialization=''):
         dest += '-script.py'
     python = _safe_arg(sys.executable)
-    if path:
-        path += ','  # Courtesy comma at the end of the list.

     contents = py_script_template % dict(
         python=python,
-        path=path,
         relative_paths_setup=rsetup,
-        initialization=initialization,
+        initialization=_initialization(path, initialization),
         )
     changed = _file_changed(dest, contents)
...
...
@@ -1514,9 +1629,6 @@ py_script_template = script_header + '''\
 %%(relative_paths_setup)s
 import sys
-sys.path[0:0] = [
-  %%(path)s
-  ]
 %%(initialization)s

 _interactive = True
...
...
@@ -1551,8 +1663,14 @@ import sys
 sys.path.insert(0, %%(setupdir)r)
 sys.path[0:0] = %r
+for extra_path in %%(path_list)r:
+    sys.path.insert(0, extra_path)

 import os, setuptools
+os.environ['PYTHONPATH'] = (os.pathsep).join(sys.path[:])
 __file__ = %%(__file__)r
 os.chdir(%%(setupdir)r)
...
...
@@ -1595,25 +1713,49 @@ class MissingDistribution(zc.buildout.UserError):
         req, ws = self.data
         return "Couldn't find a distribution for %r." % str(req)

+def chmod(path):
+    mode = os.lstat(path).st_mode
+    if stat.S_ISLNK(mode):
+        return
+    # give the same permission but write as owner to group and other.
+    mode = stat.S_IMODE(mode)
+    urx = (mode >> 6) & 5
+    new_mode = mode & ~0o77 | urx << 3 | urx
+    if new_mode != mode:
+        os.chmod(path, new_mode)
+
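The bit arithmetic in `chmod` above copies the owner's read and execute bits (`urx = (mode >> 6) & 5`) to the group and other fields while clearing their write bits. The computation, isolated into a pure function with a hypothetical name:

```python
import stat

def propagate_owner_rx(mode):
    # Same arithmetic as chmod() above: owner's r-x bits are mirrored
    # to group and other; group/other write is always dropped.
    mode = stat.S_IMODE(mode)
    urx = (mode >> 6) & 5
    return mode & ~0o77 | urx << 3 | urx
```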
 def redo_pyc(egg):
     if not os.path.isdir(egg):
         return
     for dirpath, dirnames, filenames in os.walk(egg):
+        chmod(dirpath)
         for filename in filenames:
+            filepath = os.path.join(dirpath, filename)
+            try:
+                chmod(filepath)
+            except OSError as e:
+                if e.errno != errno.ENOENT:
+                    raise
+                continue
             if not filename.endswith('.py'):
                 continue
-            filepath = os.path.join(dirpath, filename)
-            if not (os.path.exists(filepath+'c')
-                    or os.path.exists(filepath+'o')):
+            old = []
+            pycache = os.path.join(
+                dirpath, '__pycache__', filename[:-3] + '.*.py')
+            for suffix in 'co':
+                if os.path.exists(filepath + suffix):
+                    old.append(filepath + suffix)
+                old += glob.glob(pycache + suffix)
+            if not old:
                 # If it wasn't compiled, it may not be compilable
                 continue

             # OK, it looks like we should try to compile.

             # Remove old files.
-            for suffix in 'co':
-                if os.path.exists(filepath+suffix):
-                    os.remove(filepath+suffix)
+            for old in old:
+                os.remove(old)

             # Compile under current optimization
             try:
...
...
@@ -1657,19 +1799,34 @@ class IncompatibleConstraintError(zc.buildout.UserError):
 IncompatibleVersionError = IncompatibleConstraintError  # Backward compatibility

-def call_pip_install(spec, dest):
+# Temporary HOME with .pydistutils.cfg to disable setup_requires
+pip_pydistutils_home = tempfile.mkdtemp('pip-pydistutils-home')
+with open(os.path.join(pip_pydistutils_home, '.pydistutils.cfg'), 'w') as f:
+    f.write("[easy_install]\n"
+            "index-url = file:///dev/null")
+atexit.register(zc.buildout.rmtree.rmtree, pip_pydistutils_home)
+
+def call_pip_wheel(spec, dest, options):
     """
-    Call `pip install` from a subprocess to install a
+    Call `pip wheel` from a subprocess to install a
     distribution specified by `spec` into `dest`.
     Returns all the paths inside `dest` created by the above.
     """
-    args = [sys.executable, '-m', 'pip', 'install', '--no-deps', '-t', dest]
+    args = [sys.executable, '-m', 'pip', 'wheel', '--no-deps', '-w', dest]
     level = logger.getEffectiveLevel()
     if level >= logging.INFO:
         args.append('-q')
     else:
         args.append('-v')

+    # Prevent pip from installing build dependencies on the fly
+    # without respecting pinned versions. This only works for
+    # PEP 517 specifications using pyproject.toml and not for
+    # dependencies in setup_requires option in legacy setup.py
+    if not options._allow_picked_versions:
+        args.append('--no-index')
+        args.append('--no-build-isolation')
+
     args.append(spec)

     try:
...
...
@@ -1678,14 +1835,19 @@ def call_pip_install(spec, dest):
     except ImportError:
         HAS_WARNING_OPTION = False
     if HAS_WARNING_OPTION:
-        if not hasattr(call_pip_install, 'displayed'):
-            call_pip_install.displayed = True
+        if not hasattr(call_pip_wheel, 'displayed'):
+            call_pip_wheel.displayed = True
         else:
             args.append('--no-python-version-warning')

-    env = copy.copy(os.environ)
-    python_path = copy.copy(pip_path)
-    python_path.append(env.get('PYTHONPATH', ''))
+    env = os.environ.copy()
+    python_path = pip_path[:]
+    env_paths = env.get('PYTHONPATH')
+    if env_paths:
+        python_path.append(env_paths)
+    extra_env_path = env.get('PYTHONEXTRAPATH')
+    if extra_env_path:
+        python_path.append(extra_env_path)
     env['PYTHONPATH'] = os.pathsep.join(python_path)

     if level <= logging.DEBUG:
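The PYTHONPATH assembly above starts from the pip/wheel/setuptools locations and appends the caller's `PYTHONPATH` and the `PYTHONEXTRAPATH` extension point introduced by this commit, skipping empty values. A sketch of that assembly as a pure function with a hypothetical name:

```python
import os

def build_pythonpath(pip_path, environ):
    # Mirrors the env['PYTHONPATH'] construction in call_pip_wheel above.
    python_path = pip_path[:]
    for var in ('PYTHONPATH', 'PYTHONEXTRAPATH'):
        value = environ.get(var)
        if value:  # unset or empty values are not appended
            python_path.append(value)
    return os.pathsep.join(python_path)
```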
...
...
@@ -1694,138 +1856,33 @@ def call_pip_install(spec, dest):
    sys.stdout.flush() # We want any pending output first
-    exit_code = subprocess.call(list(args), env=env)
+    # Prevent setuptools from downloading and thus installing
+    # build dependencies specified in setup_requires option of
+    # legacy setup.py by providing a crafted .pydistutils.cfg.
+    # This is used in complement to --no-build-isolation.
+    if not options._allow_picked_versions:
+        env['HOME'] = pip_pydistutils_home
-    if exit_code:
-        logger.error(
-            "An error occurred when trying to install %s. "
-            "Look above this message for any errors that "
-            "were output by pip install.",
-            spec)
-        sys.exit(1)
+    subprocess.check_call(args, env=env)
-    split_entries = [os.path.splitext(entry) for entry in os.listdir(dest)]
+    entries = os.listdir(dest)
    try:
-        distinfo_dir = [base + ext for base, ext in split_entries
-                        if ext == ".dist-info"][0]
-    except IndexError:
+        assert len(entries) == 1, "Got multiple entries after pip wheel"
+        wheel = entries[0]
+        assert os.path.splitext(wheel)[1] == '.whl', "Expected a .whl"
+    except AssertionError:
        logger.error(
-            "No .dist-info directory after successful pip install of %s",
+            "No .whl after successful pip wheel of %s",
            spec)
        raise
-    return make_egg_after_pip_install(dest, distinfo_dir)
+    return make_egg_after_pip_wheel(dest, wheel)
-def make_egg_after_pip_install(dest, distinfo_dir):
-    """build properly named egg directory"""
-    # `pip install` does not build the namespace aware __init__.py files
-    # but they are needed in egg directories.
-    # Add them before moving files setup by pip
-    namespace_packages_file = os.path.join(
-        dest, distinfo_dir, 'namespace_packages.txt')
-    if os.path.isfile(namespace_packages_file):
-        with open(namespace_packages_file) as f:
-            namespace_packages = [
-                line.strip().replace('.', os.path.sep)
-                for line in f.readlines()
-            ]
-        for namespace_package in namespace_packages:
-            namespace_package_dir = os.path.join(dest, namespace_package)
-            if os.path.isdir(namespace_package_dir):
-                init_py_file = os.path.join(
-                    namespace_package_dir, '__init__.py')
-                with open(init_py_file, 'w') as f:
-                    f.write("__import__('pkg_resources')."
-                            "declare_namespace(__name__)")
-    # Remove `bin` directory if needed
-    # as there is no way to avoid script installation
-    # when running `pip install`
-    entry_points_file = os.path.join(dest, distinfo_dir, 'entry_points.txt')
-    if os.path.isfile(entry_points_file):
-        with open(entry_points_file) as f:
-            content = f.read()
-        if "console_scripts" in content or "gui_scripts" in content:
-            bin_dir = os.path.join(dest, BIN_SCRIPTS)
-            if os.path.exists(bin_dir):
-                shutil.rmtree(bin_dir)
-    # Make properly named new egg dir
-    distro = list(pkg_resources.find_distributions(dest))[0]
-    base = "{}-{}".format(
-        distro.egg_name(), pkg_resources.get_supported_platform())
-    egg_name = base + '.egg'
-    new_distinfo_dir = base + '.dist-info'
-    egg_dir = os.path.join(dest, egg_name)
-    os.mkdir(egg_dir)
-    # Move ".dist-info" dir into new egg dir
-    os.rename(
-        os.path.join(dest, distinfo_dir),
-        os.path.join(egg_dir, new_distinfo_dir))
-    top_level_file = os.path.join(egg_dir, new_distinfo_dir, 'top_level.txt')
-    if os.path.isfile(top_level_file):
-        with open(top_level_file) as f:
-            top_levels = filter(
-                (lambda x: len(x) != 0),
-                [line.strip() for line in f.readlines()])
-    else:
-        top_levels = ()
-    # Move all top_level modules or packages
-    for top_level in top_levels:
-        # as package
-        top_level_dir = os.path.join(dest, top_level)
-        if os.path.exists(top_level_dir):
-            shutil.move(top_level_dir, egg_dir)
-            continue
-        # as module
-        top_level_py = top_level_dir + '.py'
-        if os.path.exists(top_level_py):
-            shutil.move(top_level_py, egg_dir)
-            top_level_pyc = top_level_dir + '.pyc'
-            if os.path.exists(top_level_pyc):
-                shutil.move(top_level_pyc, egg_dir)
-            continue
-    record_file = os.path.join(egg_dir, new_distinfo_dir, 'RECORD')
-    if os.path.isfile(record_file):
-        if PY3:
-            with open(record_file, newline='') as f:
-                all_files = [row[0] for row in csv.reader(f)]
-        else:
-            with open(record_file, 'rb') as f:
-                all_files = [row[0] for row in csv.reader(f)]
-        # There might be some c extensions left over
-        for entry in all_files:
-            if entry.endswith(('.pyc', '.pyo')):
-                continue
-            dest_entry = os.path.join(dest, entry)
-            # work around pip install -t bug that leaves entries in RECORD
-            # that starts with '../../'
-            if not os.path.abspath(dest_entry).startswith(dest):
-                continue
-            egg_entry = os.path.join(egg_dir, entry)
-            if os.path.exists(dest_entry) and not os.path.exists(egg_entry):
-                egg_entry_dir = os.path.dirname(egg_entry)
-                if not os.path.exists(egg_entry_dir):
-                    os.makedirs(egg_entry_dir)
-                os.rename(dest_entry, egg_entry)
-    return [egg_dir]

+def make_egg_after_pip_wheel(dest, wheel):
+    unpack_wheel(os.path.join(dest, wheel), dest)
+    assert len(os.listdir(dest)) == 2
+    return glob.glob(os.path.join(dest, '*.egg'))
def unpack_egg(location, dest):
...
...
@@ -1872,7 +1929,7 @@ def _get_matching_dist_in_location(dist, location):
    if dist_infos == [(dist.project_name.lower(), dist.parsed_version)]:
        return dists.pop()

-def _move_to_eggs_dir_and_compile(dist, dest):
+def _move_to_eggs_dir_and_compile(dist, dest, options):
    """Move distribution to the eggs destination directory.

    And compile the py files, if we have actually moved the dist.
...
...
@@ -1913,7 +1970,7 @@ def _move_to_eggs_dir_and_compile(dist, dest):
            unpacker(dist.location, tmp_dest)
            [tmp_loc] = glob.glob(os.path.join(tmp_dest, '*'))
        else:
-            [tmp_loc] = call_pip_install(dist.location, tmp_dest)
+            [tmp_loc] = call_pip_wheel(dist.location, tmp_dest, options)
            installed_with_pip = True

    # We have installed the dist. Now try to rename/move it.
...
...
@@ -1954,7 +2011,7 @@ def _move_to_eggs_dir_and_compile(dist, dest):
    return newdist

-def sort_working_set(ws, eggs_dir, develop_eggs_dir):
+def get_develop_paths(develop_eggs_dir):
    develop_paths = set()
    pattern = os.path.join(develop_eggs_dir, '*.egg-link')
    for egg_link in glob.glob(pattern):
...
...
@@ -1962,21 +2019,30 @@ def sort_working_set(ws, eggs_dir, develop_eggs_dir):
            path = f.readline().strip()
            if path:
                develop_paths.add(path)
+    return develop_paths

-    sorted_paths = []
-    egg_paths = []
-    other_paths = []
+def sort_working_set(ws, buildout_dir, eggs_dir, develop_eggs_dir):
+    develop_paths = get_develop_paths(develop_eggs_dir)
+    dists_priorities = tuple([] for i in range(5))
    for dist in ws:
        path = dist.location
-        if path in develop_paths:
-            sorted_paths.append(path)
-        elif os.path.commonprefix([path, eggs_dir]) == eggs_dir:
-            egg_paths.append(path)
+        if os.path.commonprefix([path, eggs_dir]) == eggs_dir:
+            # Dists from eggs first because we know they contain a single dist.
+            priority = 0
+        elif os.path.commonprefix([path, buildout_dir]) == buildout_dir:
+            # We assume internal locations contain a single dist too.
+            priority = 1 + 2 * (path not in develop_paths) # 1 or 3
        else:
-            other_paths.append(path)
-    sorted_paths.extend(egg_paths)
-    sorted_paths.extend(other_paths)
-    return pkg_resources.WorkingSet(sorted_paths)
+            priority = 2 + 2 * (path not in develop_paths) # 2 or 4
+        dists_priorities[priority].append(dist)
+    # We add dists to an empty working set manually instead of adding the paths
+    # to avoid activating other dists at the same locations.
+    ws = pkg_resources.WorkingSet([])
+    for dists in dists_priorities:
+        for dist in dists:
+            ws.add(dist)
+    return ws
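The five-bucket priority scheme introduced in this hunk can be sketched in isolation. The helper name `priority` and the example paths below are hypothetical; the sketch only reproduces the bucket computation:

```python
import os

def priority(path, eggs_dir, buildout_dir, develop_paths):
    # Sketch of the five-bucket ordering used by sort_working_set above;
    # all names here are illustrative.
    if os.path.commonprefix([path, eggs_dir]) == eggs_dir:
        return 0  # shared eggs directory first
    if os.path.commonprefix([path, buildout_dir]) == buildout_dir:
        return 1 + 2 * (path not in develop_paths)  # develop: 1, other: 3
    return 2 + 2 * (path not in develop_paths)      # develop: 2, other: 4

develop = {'/b/src/pkg'}
print(priority('/b/eggs/foo.egg', '/b/eggs', '/b', develop))  # 0
print(priority('/b/src/pkg', '/b/eggs', '/b', develop))       # 1
```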
NOT_PICKED_AND_NOT_ALLOWED = """\
...
...
src/zc/buildout/rmtree.py
View file @
3b16f5af
...
...
@@ -16,7 +16,8 @@
import shutil
import os
import doctest
-import time
+import errno
+import sys

def rmtree(path):
    """
...
...
@@ -26,6 +27,10 @@ def rmtree (path):
    process (e.g. antivirus scanner). This tries to chmod the
    file to writeable and retries 10 times before giving up.

+    Also it tries to remove the symlink itself if a symlink was passed
+    as the path argument.
+
+    Finally, it tries to make the parent directory writable.
>>> from tempfile import mkdtemp
Let's make a directory ...
...
...
@@ -41,13 +46,51 @@ def rmtree (path):
>>> foo = os.path.join (d, 'foo')
>>> with open (foo, 'w') as f: _ = f.write ('huhu')
>>> bar = os.path.join (d, 'bar')
>>> os.symlink(bar, bar)
and make it unwriteable
>>> os.chmod (foo, 256) # 0400
>>> os.chmod (foo, 0o400)
and make parent dir unwritable
>>> os.chmod (d, 0o400)
rmtree should be able to remove it:
>>> rmtree (d)
and now the directory is gone
>>> os.path.isdir (d)
0
Let's make a directory ...
>>> d = mkdtemp()
and make sure it is actually there
>>> os.path.isdir (d)
1
Now create a broken symlink ...
>>> foo = os.path.join (d, 'foo')
>>> os.symlink(foo + '.not_exist', foo)
rmtree should be able to remove it:
>>> rmtree (foo)
and now the directory is gone
>>> os.path.isdir (foo)
0
cleanup directory
>>> rmtree (d)
and now the directory is gone
...
...
@@ -55,20 +98,43 @@ def rmtree (path):
>>> os.path.isdir (d)
0
"""
-    def retry_writeable(func, path, exc):
-        os.chmod(path, 384) # 0600
-        for i in range(10):
-            try:
-                func(path)
-                break
-            except OSError:
-                time.sleep(0.1)
-        else:
-            if func is os.path.islink:
-                os.unlink(path)
-            elif func is os.lstat or func is os.open:
-                if not os.path.islink(path):
-                    raise
-                os.unlink(path)
-            else:
-                # tried 10 times without success, thus
-                # finally rethrow the last exception
-                raise
-    shutil.rmtree(path, onerror=retry_writeable)
+    def chmod_retry(func, failed_path, exc_info):
+        """Make sure the directories are executable and writable.
+        """
+        # Depending on the Python version, the following items differ.
+        if sys.version_info >= (3, ):
+            expected_error_type = PermissionError
+            expected_func_tuple = (os.lstat, os.open)
+        else:
+            expected_error_type = OSError
+            expected_func_tuple = (os.listdir, )
+        e = exc_info[1]
+        if isinstance(e, expected_error_type):
+            if e.errno == errno.ENOENT:
+                # because we are calling again rmtree on listdir errors, this
+                # path might have been already deleted by the recursive call
+                # to rmtree.
+                return
+            if e.errno == errno.EACCES:
+                if func in expected_func_tuple:
+                    os.chmod(failed_path, 0o700)
+                    # corner case to handle errors in listing directories.
+                    # https://bugs.python.org/issue8523
+                    return shutil.rmtree(failed_path, onerror=chmod_retry)
+                # If parent directory is not writable, we still cannot delete
+                # the file. But make sure not to change the parent of the
+                # folder we are deleting.
+                if failed_path != path:
+                    os.chmod(os.path.dirname(failed_path), 0o700)
+                return func(failed_path)
+        raise
+    shutil.rmtree(path, onerror=chmod_retry)
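The general `onerror` pattern used here can be sketched minimally: make the offending entry and its parent writable again, then retry the failed operation. The helper name `force_rmtree` is illustrative; the real `chmod_retry` above is more careful about error types, recursion, and symlinks:

```python
import os
import shutil
import stat
import tempfile

def force_rmtree(path):
    # Minimal sketch (illustrative name) of shutil.rmtree's onerror hook:
    # restore write permission on the failing entry and its parent,
    # then retry the failed operation once.
    def onerror(func, failed_path, exc_info):
        os.chmod(os.path.dirname(failed_path), stat.S_IRWXU)
        os.chmod(failed_path, stat.S_IRWXU)
        func(failed_path)
    shutil.rmtree(path, onerror=onerror)

d = tempfile.mkdtemp()
sub = os.path.join(d, 'sub')
os.mkdir(sub)
open(os.path.join(sub, 'f'), 'w').close()
os.chmod(sub, stat.S_IRUSR | stat.S_IXUSR)  # no write bit: plain rmtree fails
force_rmtree(d)
print(os.path.exists(d))
```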
def test_suite():
    return doctest.DocTestSuite()
...
...
src/zc/buildout/testing.py
View file @
3b16f5af
...
...
@@ -23,6 +23,7 @@ except ImportError:
    from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler
    from urllib2 import urlopen

+import base64
import errno
import logging
import multiprocessing
...
...
@@ -222,6 +223,9 @@ class Buildout(zc.buildout.buildout.Buildout):

    Options = TestOptions

+    def initialize(self, *args):
+        pass

def buildoutSetUp(test):
    test.globs['__tear_downs'] = __tear_downs = []
...
...
@@ -412,6 +416,23 @@ class Handler(BaseHTTPRequestHandler):
                self.__server.__log = False
            return k()
+        if self.path.startswith('/private/'):
+            auth = self.headers.get('Authorization')
+            if auth and auth.startswith('Basic ') and \
+               self.path[9:].encode() == base64.b64decode(
+                   self.headers.get('Authorization')[6:]):
+                return k()
+            # But not returning 401+WWW-Authenticate, we check that the client
+            # skips auth challenge, which is not free (in terms of performance)
+            # and useless for what we support.
+            self.send_response(403, 'Forbidden')
+            out = '<html><body>Forbidden</body></html>'.encode()
+            self.send_header('Content-Length', str(len(out)))
+            self.send_header('Content-Type', 'text/html')
+            self.end_headers()
+            self.wfile.write(out)
+            return
        path = os.path.abspath(os.path.join(self.tree, *self.path.split('/')))
        if not (((path == self.tree) or path.startswith(self.tree + os.path.sep))
...
...
@@ -622,6 +643,8 @@ ignore_not_upgrading = (
    'Not upgrading because not running a local buildout command.\n'
    ), '')

+os.environ['BUILDOUT_INFO_REINSTALL_REASON'] = '0'

def run_buildout(command):
    # Make sure we don't get .buildout
    os.environ['HOME'] = os.path.join(os.getcwd(), 'home')
...
...
src/zc/buildout/tests/__init__.py
View file @
3b16f5af
...
...
@@ -123,6 +123,45 @@ def create_sample_eggs(test, executable=sys.executable):
            )
        zc.buildout.testing.bdist_egg(tmp, sys.executable, dest)

+        write(tmp, 'builddep.py', '')
+        write(
+            tmp, 'setup.py',
+            "from setuptools import setup\n"
+            "setup(name='builddep', "
+            "      py_modules=['builddep'], "
+            "      zip_safe=True, version='0.1')\n"
+            )
+        zc.buildout.testing.sdist(tmp, dest)
+
+        write(tmp, 'withsetuprequires.py', '')
+        write(
+            tmp, 'setup.py',
+            "from setuptools import setup\n"
+            "setup(name='withsetuprequires', "
+            "      setup_requires = 'builddep', "
+            "      py_modules=['withsetuprequires'], "
+            "      zip_safe=True, version='0.1')\n"
+            "import builddep"
+            )
+        zc.buildout.testing.sdist(tmp, dest)
+
+        write(tmp, 'withbuildsystemrequires.py', '')
+        write(
+            tmp, 'pyproject.toml',
+            '[build-system]\n'
+            'requires = ["builddep"]'
+            )
+        write(
+            tmp, 'setup.py',
+            "from setuptools import setup\n"
+            "setup(name='withbuildsystemrequires', "
+            "      setup_requires = 'builddep', "
+            "      py_modules=['withbuildsystemrequires'], "
+            "      package_data={'withbuildsystemrequires': ['pyproject.toml']}, "
+            "      zip_safe=True, version='0.1')\n"
+            "import builddep"
+            )
+        zc.buildout.testing.sdist(tmp, dest)
    finally:
        shutil.rmtree(tmp)
...
...
src/zc/buildout/tests/allow-unknown-extras.txt
View file @
3b16f5af
...
...
@@ -40,6 +40,7 @@ Now we can run the buildout and see that it fails:
...
While:
Installing eggs.
Base installation request: 'allowdemo[bad_extra]'
Error: Couldn't find the required extra...
If we flip the option on, the buildout succeeds
...
...
src/zc/buildout/tests/allowhosts.txt
View file @
3b16f5af
...
...
@@ -61,6 +61,8 @@ Now we can run the buildout and make sure all attempts to dist.plone.org fails::
...
While:
Installing eggs.
Base installation request: 'allowdemo'
Requirement of allowdemo: kss.core
Getting distribution for 'kss.core'.
Error: Couldn't find a distribution for 'kss.core'.
...
...
@@ -92,6 +94,8 @@ Now we can run the buildout and make sure all attempts to dist.plone.org fails::
...
While:
Installing eggs.
Base installation request: 'allowdemo'
Requirement of allowdemo: kss.core
Getting distribution for 'kss.core'.
Error: Couldn't find a distribution for 'kss.core'.
...
...
src/zc/buildout/tests/buildout.txt
View file @
3b16f5af
...
...
@@ -337,6 +337,10 @@ we'll see that the directory gets removed and recreated::
... path = mydata
... """)
>>> print_(system(buildout+' --dry-run'), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
Installing data-dir.
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
...
...
@@ -357,6 +361,10 @@ If any of the files or directories created by a recipe are removed,
the part will be reinstalled::
>>> rmdir(sample_buildout, 'mydata')
>>> print_(system(buildout+' --dry-run'), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
Installing data-dir.
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
...
...
@@ -816,6 +824,8 @@ the origin of the value (file name or ``COMPUTED_VALUE``, ``DEFAULT_VALUE``,
DEFAULT_VALUE
directory= /sample-buildout
COMPUTED_VALUE
dry-run= false
DEFAULT_VALUE
eggs-directory= /sample-buildout/eggs
DEFAULT_VALUE
executable= ...
...
...
@@ -911,6 +921,11 @@ You get more information about the way values are computed::
AS COMPUTED_VALUE
SET VALUE = /sample-buildout
<BLANKLINE>
dry-run= false
<BLANKLINE>
AS DEFAULT_VALUE
SET VALUE = false
<BLANKLINE>
eggs-directory= /sample-buildout/eggs
<BLANKLINE>
AS DEFAULT_VALUE
...
...
@@ -1269,6 +1284,102 @@ the current section. We can also use the special option,
my_name debug
recipe recipes:debug
It is possible to access the base URL of the profile from a section by
using ${:_profile_base_location_}:
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir debug
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
... profile_base_location = ${:_profile_base_location_}
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Updating data-dir.
Installing debug.
_profile_base_location_ /sample-buildout
profile_base_location /sample-buildout
recipe recipes:debug
Keep in mind that in case of sections spanning multiple profiles,
the topmost value will be presented:
>>> write(sample_buildout, 'extended.cfg',
... """
... [debug]
... profile_base_location = ${:_profile_base_location_}
... """)
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... extends = extended.cfg
... develop = recipes
... parts = data-dir debug
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
... profile_base_location = ${:_profile_base_location_}
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating data-dir.
Updating debug.
_profile_base_location_ /sample-buildout
profile_base_location /sample-buildout
recipe recipes:debug
But of course, if the access happens in the extended profile's section,
that profile's location will be exposed:
>>> write(sample_buildout, 'extended.cfg',
... """
... [debug]
... profile_base_location = ${:_profile_base_location_}
... """)
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... extends = extended.cfg
... develop = recipes
... parts = data-dir debug
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating data-dir.
Updating debug.
_profile_base_location_ /sample-buildout
profile_base_location /sample-buildout
recipe recipes:debug
>>> remove(sample_buildout, 'extended.cfg')
Automatic part selection and ordering
-------------------------------------
...
...
@@ -2700,7 +2811,7 @@ were created.
The ``.installed.cfg`` is only updated for the recipes that ran::
>>> cat(sample_buildout, '.installed.cfg')
-    ... # doctest: +NORMALIZE_WHITESPACE
+    ... # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
parts = debug d1 d2 d3 d4
...
...
@@ -2730,7 +2841,7 @@ The ``.installed.cfg`` is only updated for the recipes that ran::
<BLANKLINE>
[d4]
__buildout_installed__ = /sample-buildout/data2-extra
-    __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
+    __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== d2:...
path = /sample-buildout/data2-extra
recipe = recipes:mkdir
...
...
@@ -2804,10 +2915,10 @@ provide alternate locations, and even names for these directories::
Creating directory '/sample-alt/work'.
Creating directory '/sample-alt/developbasket'.
Develop: '/sample-buildout/recipes'
-    Uninstalling d4.
-    Uninstalling d3.
    Uninstalling d2.
    Uninstalling debug.
+    Uninstalling d4.
+    Uninstalling d3.
>>> ls(alt)
d basket
...
...
@@ -2915,8 +3026,10 @@ database is shown::
bin-directory = /sample-buildout/bin
develop-eggs-directory = /sample-buildout/develop-eggs
directory = /sample-buildout
dry-run = false
eggs-directory = /sample-buildout/eggs
executable = python
extra-paths = ...
find-links =
install-from-cache = false
installed = /sample-buildout/.installed.cfg
...
...
@@ -3234,7 +3347,6 @@ or paths to use::
>>> remove('setup.cfg')
>>> print_(system(buildout + ' -csetup.cfg init demo other ./src'), end='')
Creating '/sample-bootstrapped/setup.cfg'.
Creating directory '/sample-bootstrapped/develop-eggs'.
Getting distribution for 'zc.recipe.egg>=2.0.6'.
Got zc.recipe.egg
Installing py.
...
...
@@ -3293,7 +3405,6 @@ for us::
>>> remove('setup.cfg')
>>> print_(system(buildout + ' -csetup.cfg init demo other ./src'), end='')
Creating '/sample-bootstrapped/setup.cfg'.
Creating directory '/sample-bootstrapped/develop-eggs'.
Installing py.
Generated script '/sample-bootstrapped/bin/demo'.
Generated script '/sample-bootstrapped/bin/distutilsscript'.
...
...
src/zc/buildout/tests/dependencylinks.txt
View file @
3b16f5af
...
...
@@ -87,6 +87,8 @@ buildout to see where the egg comes from this time.
...
While:
Updating eggs.
Base installation request: 'depdemo'
Requirement of depdemo: demoneeded
Getting distribution for 'demoneeded'.
Error: Couldn't find a distribution for 'demoneeded'.
...
...
src/zc/buildout/tests/download.txt
View file @
3b16f5af
...
...
@@ -63,6 +63,32 @@ When trying to access a file that doesn't exist, we'll get an exception:
... else: print_('woops')
download error
An alternate URL can be used when the main one fails with an HTTPError.
This is useful when a version of a resource can only be downloaded from a
temporary URL while it is the latest version, and is then moved to a
permanent place when a newer version is released. In such a case, when using
a cache (in particular networkcache), it is important that the main URL
(`url`) is always used as the cache key, and `alternate_url` should be the
temporary URL.
>>> path, is_temp = download(server_url+'not-there',
... alternate_url=server_url+'foo.txt')
>>> cat(path)
This is a foo text.
>>> is_temp
True
>>> remove(path)
The main URL is tried first:
>>> write(server_data, 'other.txt', 'This is some other text.')
>>> path, is_temp = download(server_url+'other.txt',
... alternate_url=server_url+'foo.txt')
>>> cat(path)
This is some other text.
>>> is_temp
True
>>> remove(path)
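The fallback behaviour described above amounts to the following sketch. The helper `fetch_with_fallback` and the injected `fetch` callable are hypothetical, used only to keep the example network-free; the real `Download` utility additionally handles caching keyed on the main URL:

```python
from urllib.error import HTTPError

def fetch_with_fallback(url, alternate_url, fetch):
    # Try the main URL first; on an HTTP error retry the alternate one.
    # Any caching layer should still use `url` as the cache key.
    try:
        return fetch(url)
    except HTTPError:
        if alternate_url:
            return fetch(alternate_url)
        raise

site = {'http://x/foo.txt': 'This is a foo text.'}

def fake_fetch(url):
    # Stand-in for urlopen(): 404 for unknown URLs.
    if url not in site:
        raise HTTPError(url, 404, 'Not Found', None, None)
    return site[url]

print(fetch_with_fallback('http://x/not-there', 'http://x/foo.txt', fake_fetch))
```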
Downloading a local file doesn't produce a temporary file but simply returns
the local file itself:
...
...
@@ -126,6 +152,37 @@ This is a foo text.
>>> remove(path)
HTTP basic authentication:
>>> download = Download()
>>> user_url = server_url.replace('/localhost:', '/%s@localhost:') + 'private/'
>>> path, is_temp = download(user_url % 'foo:' + 'foo:')
>>> is_temp; remove(path)
True
>>> path, is_temp = download(user_url % 'foo:bar' + 'foo:bar')
>>> is_temp; remove(path)
True
>>> download(user_url % 'bar:' + 'foo:')
Traceback (most recent call last):
UserError: Error downloading ...: HTTP Error 403: Forbidden
... with netrc:
>>> url = server_url + 'private/foo:bar'
>>> download(url)
Traceback (most recent call last):
UserError: Error downloading ...: HTTP Error 403: Forbidden
>>> import os, zc.buildout.download
>>> old_home = os.environ['HOME']
>>> home = os.environ['HOME'] = tmpdir('test-netrc')
>>> netrc = join(home, '.netrc')
>>> write(netrc, 'machine localhost login foo password bar')
>>> os.chmod(netrc, 0o600)
>>> zc.buildout.download.netrc.__init__()
>>> path, is_temp = download(url)
>>> is_temp; remove(path)
True
>>> os.environ['HOME'] = old_home
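The `.netrc` lookup exercised above relies on the standard library's `netrc` module. This self-contained sketch (the temporary paths are illustrative) shows the credentials such a file yields:

```python
import netrc
import os
import tempfile

home = tempfile.mkdtemp()
path = os.path.join(home, '.netrc')
with open(path, 'w') as f:
    f.write('machine localhost login foo password bar\n')
os.chmod(path, 0o600)  # keep the credentials file private

# authenticators() returns a (login, account, password) tuple.
creds = netrc.netrc(path).authenticators('localhost')
print(creds)
```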
Downloading using the download cache
------------------------------------
...
...
@@ -165,14 +222,6 @@ the file on the server to see this:
>>> cat(path)
This is a foo text.
If we specify an MD5 checksum for a file that is already in the cache, the
cached copy's checksum will be verified:
>>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
Traceback (most recent call last):
ChecksumError: MD5 checksum mismatch for cached download
from 'http://localhost/foo.txt' at '/download-cache/foo.txt'
Trying to access another file at a different URL which has the same base name
will result in the cached copy being used:
...
...
@@ -184,6 +233,14 @@ will result in the cached copy being used:
>>> cat(path)
This is a foo text.
If we specify an MD5 checksum for a file that is already in the cache, the
cached copy's checksum will be verified and the cache will be refreshed:
>>> path, is_temp = download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
>>> is_temp
True
>>> remove(path)
Given a target path for the download, the utility will provide a copy of the
file at that location both when first downloading the file and when using a
cached copy:
...
...
@@ -259,7 +316,7 @@ If the file is completely missing it should notify the user of the error:
>>> download(server_url+'bar.txt') # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
Traceback (most recent call last):
...
-    UserError: Error downloading extends for URL http://localhost/bar.txt:
+    UserError: Error downloading http://localhost/bar.txt:
...404...
>>> ls(cache)
...
...
@@ -442,18 +499,22 @@ However, when downloading the file normally with the cache being used in
fall-back mode, the file will be downloaded from the net and the cached copy
will be replaced with the new content:
>>> cat(download(server_url+'foo.txt')[0])
>>> path, is_temp = download(server_url+'foo.txt')
>>> cat(path)
The wrong text.
>>> cat(cache, 'foo.txt')
The wrong text.
>>> is_temp
True
>>> remove(path)
-When trying to download a resource whose checksum does not match, the cached
-copy will neither be used nor overwritten:
+Fall-back mode is meaningless if md5sum is given. If the checksum of the
+cached copy matches, the resource is not downloaded:
>>> write(server_data, 'foo.txt', 'This is a foo text.')
-    >>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
-    Traceback (most recent call last):
-    ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt'
+    >>> path, is_temp = download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
+    >>> print_(path)
+    /download-cache/foo.txt
>>> cat(cache, 'foo.txt')
The wrong text.
...
...
src/zc/buildout/tests/downloadcache.txt
View file @
3b16f5af
...
...
@@ -33,11 +33,12 @@ download:
>>> print_(get(link_server), end='')
<html><body>
<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
<a href="demo-0.4rc1-py2.4.egg">demo-0.4rc1-py2.4.egg</a><br>
<a href="bigdemo-0.1-pyN.N.egg">bigdemo-0.1-pyN.N.egg</a><br>
<a href="builddep-0.1.zip">builddep-0.1.zip</a><br>
<a href="demo-0.1-pyN.N.egg">demo-0.1-pyN.N.egg</a><br>
<a href="demo-0.2-pyN.N.egg">demo-0.2-pyN.N.egg</a><br>
<a href="demo-0.3-pyN.N.egg">demo-0.3-pyN.N.egg</a><br>
<a href="demo-0.4rc1-pyN.N.egg">demo-0.4rc1-pyN.N.egg</a><br>
<a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
<a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
<a href="demoneeded-1.2rc1.zip">demoneeded-1.2rc1.zip</a><br>
...
...
@@ -45,7 +46,9 @@ download:
<a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
<a href="index/">index/</a><br>
<a href="mixedcase-0.5.zip">mixedcase-0.5.zip</a><br>
<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
<a href="other-1.0-pyN.N.egg">other-1.0-pyN.N.egg</a><br>
<a href="withbuildsystemrequires-0.1.zip">withbuildsystemrequires-0.1.zip</a><br>
<a href="withsetuprequires-0.1.zip">withsetuprequires-0.1.zip</a><br>
</body></html>
...
...
src/zc/buildout/tests/easy_install.txt
View file @
3b16f5af
...
...
@@ -97,11 +97,12 @@ We have a link server that has a number of eggs:
>>> print_(get(link_server), end='')
<html><body>
<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
<a href="demo-0.4rc1-py2.4.egg">demo-0.4rc1-py2.4.egg</a><br>
<a href="bigdemo-0.1-pyN.N.egg">bigdemo-0.1-pyN.N.egg</a><br>
<a href="builddep-0.1.zip">builddep-0.1.zip</a><br>
<a href="demo-0.1-pyN.N.egg">demo-0.1-pyN.N.egg</a><br>
<a href="demo-0.2-pyN.N.egg">demo-0.2-pyN.N.egg</a><br>
<a href="demo-0.3-pyN.N.egg">demo-0.3-pyN.N.egg</a><br>
<a href="demo-0.4rc1-pyN.N.egg">demo-0.4rc1-pyN.N.egg</a><br>
<a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
<a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
<a href="demoneeded-1.2rc1.zip">demoneeded-1.2rc1.zip</a><br>
...
...
@@ -109,7 +110,9 @@ We have a link server that has a number of eggs:
<a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
<a href="index/">index/</a><br>
<a href="mixedcase-0.5.zip">mixedcase-0.5.zip</a><br>
<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
<a href="other-1.0-pyN.N.egg">other-1.0-pyN.N.egg</a><br>
<a href="withbuildsystemrequires-0.1.zip">withbuildsystemrequires-0.1.zip</a><br>
<a href="withsetuprequires-0.1.zip">withsetuprequires-0.1.zip</a><br>
</body></html>
Let's make a directory and install the demo egg to it, using the demo:
...
...
@@ -765,9 +768,9 @@ An interpreter can also be generated without other eggs:
<BLANKLINE>
import sys
<BLANKLINE>
sys.path[0:0] = [
<BLANKLINE>
]
<BLANKLINE>
_interactive = True
...
An additional argument can be passed to define which scripts to install
...
...
@@ -1233,11 +1236,12 @@ Let's update our link server with a new version of extdemo:
>>> update_extdemo()
>>> print_(get(link_server), end='')
<html><body>
<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
<a href="demo-0.4rc1-py2.4.egg">demo-0.4rc1-py2.4.egg</a><br>
<a href="bigdemo-0.1-pyN.N.egg">bigdemo-0.1-pyN.N.egg</a><br>
<a href="builddep-0.1.zip">builddep-0.1.zip</a><br>
<a href="demo-0.1-pyN.N.egg">demo-0.1-pyN.N.egg</a><br>
<a href="demo-0.2-pyN.N.egg">demo-0.2-pyN.N.egg</a><br>
<a href="demo-0.3-pyN.N.egg">demo-0.3-pyN.N.egg</a><br>
<a href="demo-0.4rc1-pyN.N.egg">demo-0.4rc1-pyN.N.egg</a><br>
<a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
<a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
<a href="demoneeded-1.2rc1.zip">demoneeded-1.2rc1.zip</a><br>
...
...
@@ -1246,7 +1250,9 @@ Let's update our link server with a new version of extdemo:
<a href="extdemo-1.5.zip">extdemo-1.5.zip</a><br>
<a href="index/">index/</a><br>
<a href="mixedcase-0.5.zip">mixedcase-0.5.zip</a><br>
<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
<a href="other-1.0-pyN.N.egg">other-1.0-pyN.N.egg</a><br>
<a href="withbuildsystemrequires-0.1.zip">withbuildsystemrequires-0.1.zip</a><br>
<a href="withsetuprequires-0.1.zip">withsetuprequires-0.1.zip</a><br>
</body></html>
The easy_install caches information about servers to reduce network
...
...
@@ -1445,9 +1451,8 @@ Now when we install the distributions:
... ['demo==0.2'], dest,
... links=[link_server], index=link_server+'index/')
GET 200 /
GET 404 /index/demo/
GET 200 /index/
GET 404 /index/demoneeded/
GET 200 /index/
>>> zc.buildout.easy_install.build(
... 'extdemo', dest,
...
...
@@ -1469,6 +1474,7 @@ from the link server:
>>> ws = zc.buildout.easy_install.install(
... ['demo'], dest,
... links=[link_server], index=link_server+'index/')
GET 404 /index/demo/
GET 200 /demo-0.3-py2.4.egg
Normally, the download cache is the preferred source of downloads, but
...
...
src/zc/buildout/tests/extends-cache.txt → src/zc/buildout/tests/extends-cache.txt.disabled
View file @ 3b16f5af
...
...
@@ -492,9 +492,9 @@ a better solution would re-use the logging already done by the utility.)
>>> import zc.buildout
>>> old_download = zc.buildout.download.Download.download
-    >>> def wrapper_download(self, url, md5sum=None, path=None):
+    >>> def wrapper_download(self, url, *args, **kw):
    ...     print_("The URL %s was downloaded." % url)
-    ...     return old_download(url, md5sum, path)
+    ...     return old_download(url, *args, **kw)
>>> zc.buildout.download.Download.download = wrapper_download
>>> zc.buildout.buildout.main([])
...
...
src/zc/buildout/tests/repeatable.txt
View file @
3b16f5af
...
...
@@ -207,6 +207,7 @@ versions:
Getting section foo.
Initializing section foo.
Installing recipe spam.
Base installation request: 'spam'
Getting distribution for 'spam'.
Error: Picked: spam = 2
...
...
...
src/zc/buildout/tests/test_all.py
View file @
3b16f5af
...
...
@@ -143,7 +143,8 @@ class TestEasyInstall(unittest.TestCase):
result = zc.buildout.easy_install._move_to_eggs_dir_and_compile(
    dist, dest
    dist, dest, None,  # ok because we don't fallback to pip
)
self.assertIsNotNone(result)
...
...
@@ -433,6 +434,9 @@ Now, let's create a buildout that requires y and z:
Requirement of sampley: demoneeded==1.0
While:
Installing eggs.
Base installation request: 'sampley', 'samplez'
Requirement of samplez: demoneeded==1.1
Requirement of sampley: demoneeded==1.0
Error: There is a version conflict.
We already have: demoneeded 1.1
but sampley 1 requires 'demoneeded==1.0'.
...
...
@@ -483,6 +487,12 @@ If we use the verbose switch, we can see where requirements are coming from:
Requirement of sampley: demoneeded==1.0
While:
Installing eggs.
Base installation request: 'samplea', 'samplez'
Requirement of samplez: demoneeded==1.1
Requirement of samplea: sampleb
Requirement of sampleb: samplea
Requirement of sampleb: sampley
Requirement of sampley: demoneeded==1.0
Error: There is a version conflict.
We already have: demoneeded 1.1
but sampley 1 requires 'demoneeded==1.0'.
...
...
@@ -551,6 +561,11 @@ that we can't find. when run in verbose mode
...
While:
Installing eggs.
Base installation request: 'samplea'
Requirement of samplea: sampleb
Requirement of sampleb: samplea
Requirement of sampleb: sampley
Requirement of sampley: demoneeded
Getting distribution for 'demoneeded'.
Error: Couldn't find a distribution for 'demoneeded'.
"""
...
...
@@ -1181,6 +1196,8 @@ Uninstall recipes need to be called when a part is removed too:
uninstalling
Installing demo.
installing
Section `demo` contains unused option(s): 'x'.
This may be an indication for either a typo in the option's name or a bug in the used recipe.
>>> write('buildout.cfg', '''
...
...
@@ -1679,7 +1696,7 @@ some evil recipes that exit uncleanly:
>>> mkdir('recipes')
>>> write('recipes', 'recipes.py',
... '''
... import os
... import sys
...
... class Clean:
... def __init__(*_): pass
...
...
@@ -1687,10 +1704,10 @@ some evil recipes that exit uncleanly:
... def update(_): pass
...
... class EvilInstall(Clean):
...     def install(_): os._exit(1)
...     def install(_): sys.exit(1)
...
... class EvilUpdate(Clean):
...     def update(_): os._exit(1)
...     def update(_): sys.exit(1)
... ''')
>>> write('recipes', 'setup.py',
...
...
@@ -1784,10 +1801,10 @@ Now let's look at 3 cases:
>>> print_(system(buildout+' buildout:parts='), end='')
Develop: '/sample-buildout/recipes'
Uninstalling p2.
Uninstalling p1.
Uninstalling p4.
Uninstalling p3.
Uninstalling p2.
Uninstalling p1.
3. We exit while installing or updating after uninstalling:
...
...
@@ -2214,6 +2231,28 @@ def dealing_with_extremely_insane_dependencies():
...
While:
Installing pack1.
Base installation request: 'pack0'
Requirement of pack0: pack4
Requirement of pack0: pack3
Requirement of pack0: pack2
Requirement of pack0: pack1
Requirement of pack4: pack5
Requirement of pack4: pack3
Requirement of pack4: pack2
Requirement of pack4: pack1
Requirement of pack4: pack0
Requirement of pack3: pack4
Requirement of pack3: pack2
Requirement of pack3: pack1
Requirement of pack3: pack0
Requirement of pack2: pack4
Requirement of pack2: pack3
Requirement of pack2: pack1
Requirement of pack2: pack0
Requirement of pack1: pack4
Requirement of pack1: pack3
Requirement of pack1: pack2
Requirement of pack1: pack0
Getting distribution for 'pack5'.
Error: Couldn't find a distribution for 'pack5'.
...
...
@@ -2255,10 +2294,209 @@ def dealing_with_extremely_insane_dependencies():
...
While:
Installing pack1.
Base installation request: 'pack0'
Requirement of pack0: pack4
Requirement of pack0: pack3
Requirement of pack0: pack2
Requirement of pack0: pack1
Requirement of pack4: pack5
Requirement of pack4: pack3
Requirement of pack4: pack2
Requirement of pack4: pack1
Requirement of pack4: pack0
Requirement of pack3: pack4
Requirement of pack3: pack2
Requirement of pack3: pack1
Requirement of pack3: pack0
Requirement of pack2: pack4
Requirement of pack2: pack3
Requirement of pack2: pack1
Requirement of pack2: pack0
Requirement of pack1: pack4
Requirement of pack1: pack3
Requirement of pack1: pack2
Requirement of pack1: pack0
Getting distribution for 'pack5'.
Error: Couldn't find a distribution for 'pack5'.
"""
def test_part_pulled_by_recipe():
"""
>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'test.py',
... '''
... class Recipe:
...
... def __init__(self, buildout, name, options):
... self.x = buildout[options['x']][name]
...
... def install(self):
... print(self.x)
... return ()
...
... update = install
... ''')
>>> write(sample_buildout, 'recipes', 'setup.py',
... '''
... from setuptools import setup
... setup(
... name = "recipes",
... entry_points = {'zc.buildout': ['test = test:Recipe']},
... )
... ''')
>>> write(sample_buildout, 'buildout.cfg',
... '''
... [buildout]
... develop = recipes
... parts = a
... [a]
... recipe = recipes:test
... x = b
... [b]
... <= a
... a = A
... b = B
... c = ${c:x}
... [c]
... x = c
... ''')
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Installing b.
B
Section `b` contains unused option(s): 'c'.
This may be an indication for either a typo in the option's name or a bug in the used recipe.
Installing a.
A
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating b.
B
Updating a.
A
>>> cat('.installed.cfg') # doctest: +ELLIPSIS
[buildout]
...
[b]
__buildout_installed__ =
__buildout_signature__ = recipes-... c:...
...
[a]
__buildout_installed__ =
__buildout_signature__ = recipes-... b:...
...
"""
def test_recipe_options_are_escaped():
"""
>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'test.py',
... '''
... class Recipe:
...
... def __init__(self, buildout, name, options):
... options['option'] = '${buildout_syntax_should_be_escaped}'
... print ("Option value: %s" % options['option'])
...
... def install(self):
... return ()
...
... update = install
... ''')
>>> write(sample_buildout, 'recipes', 'setup.py',
... '''
... from setuptools import setup
... setup(
... name = "recipes",
... entry_points = {'zc.buildout': ['test = test:Recipe']},
... )
... ''')
>>> write(sample_buildout, 'buildout.cfg',
... '''
... [buildout]
... develop = recipes
... parts = a
... [a]
... recipe = recipes:test
... ''')
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Option value: ${buildout_syntax_should_be_escaped}
Installing a.
>>> cat('.installed.cfg') # doctest: +ELLIPSIS
[buildout]
...
[a]
__buildout_installed__ =
__buildout_signature__ = recipes-...
option = $${buildout_syntax_should_be_escaped}
recipe = recipes:test
"""
def test_recipe_invalid_options_are_rejected():
r"""
>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'test.py',
... '''
... class Recipe:
...
... def __init__(self, buildout, name, options):
... options['[section]\\noption'] = 'invalid'
...
... def install(self):
... return ()
...
... update = install
... ''')
>>> write(sample_buildout, 'recipes', 'setup.py',
... '''
... from setuptools import setup
... setup(
... name = "recipes",
... entry_points = {'zc.buildout': ['test = test:Recipe']},
... )
... ''')
>>> write(sample_buildout, 'buildout.cfg',
... '''
... [buildout]
... develop = recipes
... parts = a
... [a]
... recipe = recipes:test
... ''')
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
While:
Installing.
Getting section a.
Initializing section a.
Error: Invalid option name '[section]\noption'
"""
def read_find_links_to_load_extensions():
r"""
We'll create a wacky buildout extension that just announces itself when used:
...
...
@@ -2572,7 +2810,7 @@ def wont_downgrade_due_to_prefer_final():
If we install a non-final buildout version, we don't want to
downgrade just because we prefer-final. If a buildout version
isn't specified using a versions entry, then buildout's version
requirement gets set to >=CURRENT_VERSION.
requirement gets set to >=PUBLIC_PART_OF_CURRENT_VERSION.
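The "public part" of a version, in PEP 440 terms, is everything before the local segment introduced by `+`; this is why a locally-patched buildout such as `3.0.1+slapos001` generates a `>=3.0.1` requirement that upstream releases can satisfy. A tiny sketch of that split (hypothetical helper, not buildout's implementation):

```python
def public_part(version):
    # PEP 440: the local version segment is everything after the first '+'.
    return version.split('+', 1)[0]

print(public_part('3.0.1+slapos001'))  # -> 3.0.1
print(public_part('3.0.1'))            # -> 3.0.1
```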
>>> write('buildout.cfg',
... '''
...
...
@@ -2585,7 +2823,7 @@ def wont_downgrade_due_to_prefer_final():
... if l.startswith('zc.buildout = >=')]
>>> v == pkg_resources.working_set.find(
... pkg_resources.Requirement.parse('zc.buildout')
... ).version
... ).parsed_version.public
True
>>> write('buildout.cfg',
...
...
@@ -2677,7 +2915,8 @@ honoring our version specification.
... eggs = foo
... ''' % ('\n'.join(
... '%s = %s' % (d.key, d.version)
... for d in zc.buildout.easy_install.buildout_and_setuptools_dists)))
... for d in pkg_resources.working_set.resolve(
... pkg_resources.parse_requirements('zc.buildout')))))
>>> print_(system(buildout), end='')
Installing foo.
...
...
@@ -2923,6 +3162,73 @@ def increment_on_command_line():
recipe='zc.buildout:debug'
"""
def bug_664539_simple_buildout():
r"""
>>> write('buildout.cfg', '''
... [buildout]
... parts = escape
...
... [escape]
... recipe = zc.buildout:debug
... foo = $${nonexistent:option}
... ''')
>>> print_(system(buildout), end='')
Installing escape.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
"""
def bug_664539_reference():
r"""
>>> write('buildout.cfg', '''
... [buildout]
... parts = escape
...
... [escape]
... recipe = zc.buildout:debug
... foo = ${:bar}
... bar = $${nonexistent:option}
... ''')
>>> print_(system(buildout), end='')
Installing escape.
bar='${nonexistent:option}'
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
"""
def bug_664539_complex_buildout():
r"""
>>> write('buildout.cfg', '''
... [buildout]
... parts = escape
...
... [escape]
... recipe = zc.buildout:debug
... foo = ${level1:foo}
...
... [level1]
... recipe = zc.buildout:debug
... foo = ${level2:foo}
...
... [level2]
... recipe = zc.buildout:debug
... foo = $${nonexistent:option}
... ''')
>>> print_(system(buildout), end='')
Installing level2.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
Installing level1.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
Installing escape.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
"""
def test_constrained_requirement():
"""
zc.buildout.easy_install._constrained_requirement(constraint, requirement)
...
...
@@ -3054,6 +3360,7 @@ def want_new_zcrecipeegg():
Getting section egg.
Initializing section egg.
Installing recipe zc.recipe.egg <2dev.
Base installation request: 'zc.recipe.egg <2dev'
Getting distribution for 'zc.recipe.egg<2dev,>=2.0.6'.
Error: Couldn't find a distribution for 'zc.recipe.egg<2dev,>=2.0.6'.
"""
...
...
@@ -3262,6 +3569,209 @@ def test_buildout_doesnt_keep_adding_itself_to_versions():
True
"""
def test_missing_setup_requires_fails():
r"""
When not allow_picked_versions, ensure setup_requires dependencies
are not installed implicitly without respecting pinned versions.
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['withsetuprequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withsetuprequires='0.1')) # doctest: +ELLIPSIS
Traceback (most recent call last):
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_available_setup_requires_succeeds():
r"""
When not allow_picked_versions, ensure setup_requires dependencies
can be installed first and passed explicitly.
>>> import subprocess
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['builddep'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(builddep='0.1'))
>>> import os
>>> builddep_egg = [
... f for f in os.listdir('dest')
... if f.endswith('.egg')
... and f.startswith('builddep')
... ][0]
>>> builddep_path = os.path.join(os.getcwd(), 'dest', builddep_egg)
>>> os.environ['PYTHONEXTRAPATH'] = builddep_path
>>> _ = zc.buildout.easy_install.install(
... ['withsetuprequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withsetuprequires='0.1'))
>>> del os.environ['PYTHONEXTRAPATH']
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_missing_build_system_requires_fails():
r"""
When not allow_picked_versions, ensure build-system.requires dependencies
are not installed implicitly without respecting pinned versions.
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['withbuildsystemrequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withbuildsystemrequires='0.1'))
... # doctest: +ELLIPSIS
Traceback (most recent call last):
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_available_build_system_requires_succeeds():
r"""
When not allow_picked_versions, ensure build-system.requires
dependencies can be installed first and passed explicitly.
>>> import subprocess
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['builddep'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(builddep='0.1'))
>>> import os
>>> builddep_egg = [
... f for f in os.listdir('dest')
... if f.endswith('.egg')
... and f.startswith('builddep')
... ][0]
>>> builddep_path = os.path.join(os.getcwd(), 'dest', builddep_egg)
>>> os.environ['PYTHONEXTRAPATH'] = builddep_path
>>> _ = zc.buildout.easy_install.install(
... ['withbuildsystemrequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withbuildsystemrequires='0.1'))
>>> del os.environ['PYTHONEXTRAPATH']
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_pin_setup_requires_without_setup_eggs():
r"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withsetuprequires
... [withsetuprequires]
... recipe = zc.recipe.egg
... egg = withsetuprequires
... [versions]
... withsetuprequires = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout'))) # doctest: +ELLIPSIS
Installing withsetuprequires.
Getting distribution for 'withsetuprequires==0.1'.
error: subprocess-exited-with-error
<BLANKLINE>
× python setup.py egg_info did not run successfully.
│ exit code: 1
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
<BLANKLINE>
"""
def test_pin_setup_requires_with_setup_eggs():
"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withsetuprequires
... [withsetuprequires]
... recipe = zc.recipe.egg:custom
... egg = withsetuprequires
... setup-eggs = builddep
... [versions]
... withsetuprequires = 0.1
... builddep = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout')))
Installing withsetuprequires.
Getting distribution for 'builddep==0.1'.
Got builddep 0.1.
<BLANKLINE>
"""
def test_pin_build_system_requires_without_setup_eggs():
r"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withbuildsystemrequires
... [withbuildsystemrequires]
... recipe = zc.recipe.egg
... egg = withbuildsystemrequires
... [versions]
... withbuildsystemrequires = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout'))) # doctest: +ELLIPSIS
Installing withbuildsystemrequires.
Getting distribution for 'withbuildsystemrequires==0.1'.
error: subprocess-exited-with-error
<BLANKLINE>
× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
<BLANKLINE>
"""
def test_pin_build_system_requires_with_setup_eggs():
"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withbuildsystemrequires
... [withbuildsystemrequires]
... recipe = zc.recipe.egg:custom
... egg = withbuildsystemrequires
... setup-eggs = builddep
... [versions]
... withbuildsystemrequires = 0.1
... builddep = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout')))
Installing withbuildsystemrequires.
Getting distribution for 'builddep==0.1'.
Got builddep 0.1.
<BLANKLINE>
"""
if sys.platform == 'win32':
    del buildout_honors_umask  # umask on dos is academic
...
...
@@ -3291,23 +3801,25 @@ def buildout_txt_setup(test):
        os.path.join(eggs, 'zc.recipe.egg'),
        )

egg_parse = re.compile(
    r'([0-9a-zA-Z_.]+)-([0-9a-zA-Z_.]+)-py(\d[.]\d+)$'
egg_parse = re.compile(
    r'([0-9a-zA-Z_.]+)-([0-9a-zA-Z_.+]+)-py(\d[.]\d+)$'
    ).match

def makeNewRelease(project, ws, dest, version='99.99'):
    dist = ws.find(pkg_resources.Requirement.parse(project))
    eggname, oldver, pyver = egg_parse(dist.egg_name()).groups()
    dest = os.path.join(dest, "%s-%s-py%s.egg" % (eggname, version, pyver))
    if os.path.isfile(dist.location):
        shutil.copy(dist.location, dest)
        zip = zipfile.ZipFile(dest, 'a')
        zip.writestr(
            'EGG-INFO/PKG-INFO',
            ((zip.read('EGG-INFO/PKG-INFO').decode('ISO-8859-1')
              ).replace("Version: %s" % oldver,
                        "Version: %s" % version)
             ).encode('ISO-8859-1')
            )
        zip.close()
        with zipfile.ZipFile(dist.location, 'r') as old_zip:
            with zipfile.ZipFile(dest, 'w') as new_zip:
                for item in old_zip.infolist():
                    data = old_zip.read(item.filename)
                    if item.filename == 'EGG-INFO/PKG-INFO':
                        data = data.decode('ISO-8859-1').replace(
                            "Version: %s" % oldver,
                            "Version: %s" % version).encode('ISO-8859-1')
                    new_zip.writestr(item, data)
    elif dist.location.endswith('site-packages'):
        os.mkdir(dest)
        shutil.copytree(
...
@@ -3603,7 +4115,7 @@ def test_suite():
),
doctest.DocFileSuite(
'download.txt',
'extends-cache.txt',
'download.txt',
setUp=easy_install_SetUp,
tearDown=zc.buildout.testing.buildoutTearDown,
optionflags=doctest.NORMALIZE_WHITESPACE | doctest.ELLIPSIS,
...
...
zc.recipe.egg_/setup.py
View file @
3b16f5af
...
...
@@ -14,7 +14,7 @@
"""Setup for zc.recipe.egg package
"""
version = '2.0.8.dev0'
version = '2.0.8.dev0+slapos001'

import os

from setuptools import setup, find_packages
...
...
zc.recipe.egg_/src/zc/recipe/egg/README.rst
View file @
3b16f5af
...
...
@@ -9,6 +9,19 @@ eggs
requirement strings. Each string must be given on a separate
line.
patch-binary
The path to the patch executable.
EGGNAME-patches
A new-line separated list of patches to apply when building.
EGGNAME-patch-options
Options to give to the patch program when applying patches.
EGGNAME-patch-revision
An integer to specify the revision (default is the number of
patches).
find-links
A list of URLs, files, or directories to search for distributions.
...
...
zc.recipe.egg_/src/zc/recipe/egg/api.rst
View file @
3b16f5af
...
...
@@ -97,14 +97,14 @@ of extra requirements to be included in the working set.
We can see that the options were augmented with additional data
computed by the egg recipe by looking at .installed.cfg:
>>> cat(sample_buildout, '.installed.cfg')
>>> cat(sample_buildout, '.installed.cfg')
# doctest: +ELLIPSIS
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/sample.egg-link
parts = sample-part
<BLANKLINE>
[sample-part]
__buildout_installed__ =
__buildout_signature__ = ...
__buildout_signature__ =
sample-... setuptools-... zc.buildout-... zc.recipe.egg-
...
_b = /sample-buildout/bin
_d = /sample-buildout/develop-eggs
_e = /sample-buildout/eggs
...
...
zc.recipe.egg_/src/zc/recipe/egg/custom.py
View file @
3b16f5af
...
...
@@ -15,6 +15,7 @@
"""
import logging
import os
import re
import sys

import zc.buildout.easy_install
...
...
@@ -28,17 +29,19 @@ class Base:
        self.name, self.options = name, options
        options['_d'] = buildout['buildout']['develop-eggs-directory']
        self.build_ext = build_ext(buildout, options)

    def update(self):
        return self.install()
        options['_e'] = buildout['buildout']['eggs-directory']

class Custom(Base):
        environment_section = options.get('environment')
        if environment_section:
            self.environment = buildout[environment_section]
        else:
            self.environment = {}
        environment_data = list(self.environment.items())
        environment_data.sort()
        options['_environment-data'] = repr(environment_data)

    def __init__(self, buildout, name, options):
        Base.__init__(self, buildout, name, options)
        self.build_ext = build_ext(buildout, options)
        links = options.get('find-links',
                            buildout['buildout'].get('find-links'))
...
...
@@ -54,45 +57,20 @@ class Custom(Base):
            options['index'] = index
        self.index = index
        environment_section = options.get('environment')
        if environment_section:
            self.environment = buildout[environment_section]
        else:
            self.environment = {}
        environment_data = list(self.environment.items())
        environment_data.sort()
        options['_environment-data'] = repr(environment_data)
        options['_e'] = buildout['buildout']['eggs-directory']
        if buildout['buildout'].get('offline') == 'true':
            self.install = lambda: ()
        self.newest = buildout['buildout'].get('newest') == 'true'

    def install(self):
        options = self.options
        distribution = options.get('egg')
        if distribution is None:
            distribution = options.get('eggs')
            if distribution is None:
                distribution = self.name
            else:
                logger.warn("The eggs option is deprecated. Use egg instead")
        distribution = options.get('egg', options.get('eggs', self.name)).strip()

    def install(self):
        self._set_environment()
        try:
            return zc.buildout.easy_install.build(
                distribution, options['_d'], self.build_ext,
                self.links, self.index, sys.executable,
                [options['_e']], newest=self.newest,
            )
            self._install_setup_eggs()
            return self._install()
        finally:
            self._restore_environment()

    def update(self):
        return self.install()

    def _set_environment(self):
        self._saved_environment = {}
...
...
@@ -114,6 +92,78 @@ class Custom(Base):
        except KeyError:
            pass

    def _install_setup_eggs(self):
        options = self.options
        setup_eggs = [
            r.strip()
            for r in options.get('setup-eggs', '').split('\n')
            if r.strip()]
        if setup_eggs:
            ws = zc.buildout.easy_install.install(
                setup_eggs, options['_e'],
                links=self.links,
                index=self.index,
                executable=sys.executable,
                path=[options['_d'], options['_e']],
                newest=self.newest,
            )
            extra_path = os.pathsep.join(ws.entries)
            self.environment['PYTHONEXTRAPATH'] = \
                os.environ['PYTHONEXTRAPATH'] = extra_path

    def _get_patch_dict(self, options, distribution):
        patch_dict = {}
        global_patch_binary = options.get('patch-binary', 'patch')
        def get_option(egg, key, default):
            return options.get('%s-%s' % (egg, key), options.get(key, default))
        egg = re.sub('[<>=].*', '', distribution)
        patches = filter(lambda x: x,
                         map(lambda x: x.strip(),
                             get_option(egg, 'patches', '').splitlines()))
        patches = list(patches)
        if not patches:
            return patch_dict
        patch_options = get_option(egg, 'patch-options', '-p0').split()
        patch_binary = get_option(egg, 'patch-binary', global_patch_binary)
        patch_revision = int(get_option(egg, 'patch-revision', len(patches)))
        patch_dict[egg] = {
            'patches': patches,
            'patch_options': patch_options,
            'patch_binary': patch_binary,
            'patch_revision': patch_revision,
        }
        return patch_dict

class Custom(Base):

    def __init__(self, buildout, name, options):
        Base.__init__(self, buildout, name, options)
        if buildout['buildout'].get('offline') == 'true':
            self._install = lambda: ()

    def _install(self):
        options = self.options
        distribution = options.get('egg')
        if distribution is None:
            distribution = options.get('eggs')
            if distribution is None:
                distribution = self.name
            else:
                logger.warn("The eggs option is deprecated. Use egg instead")
        distribution = options.get('egg', options.get('eggs', self.name)).strip()
        patch_dict = self._get_patch_dict(options, distribution)
        return zc.buildout.easy_install.build(
            distribution, options['_d'], self.build_ext,
            self.links, self.index, sys.executable,
            [options['_e']], newest=self.newest,
            patch_dict=patch_dict,
        )

class Develop(Base):
...
...
@@ -122,7 +172,7 @@ class Develop(Base):
        options['setup'] = os.path.join(
            buildout['buildout']['directory'],
            options['setup'])

    def install(self):
    def _install(self):
        options = self.options
        return zc.buildout.easy_install.develop(
            options['setup'], options['_d'], self.build_ext)
...
...
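The `_install_setup_eggs` method in the custom.py diff above installs the `setup-eggs` working set and exposes its entries through a `PYTHONEXTRAPATH` environment variable, joined with `os.pathsep`. A small sketch of that round trip (the entry paths here are made up; `PYTHONEXTRAPATH` is the variable name used by the diff, and the consumer side is assumed):

```python
import os

# Hypothetical working-set entries (stand-ins for ws.entries above).
entries = [
    '/sample-buildout/eggs/builddep-0.1-py3.9.egg',
    '/sample-buildout/develop-eggs/recipes',
]

# Join with os.pathsep, as _install_setup_eggs does for PYTHONEXTRAPATH.
extra_path = os.pathsep.join(entries)
os.environ['PYTHONEXTRAPATH'] = extra_path

# A consumer (e.g. a site hook in the spawned build process) would
# split it back into individual sys.path entries:
assert os.environ['PYTHONEXTRAPATH'].split(os.pathsep) == entries
```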
zc.recipe.egg_/src/zc/recipe/egg/custom.rst
View file @
3b16f5af
...
...
@@ -20,6 +20,23 @@ rpath
A new-line separated list of directories to search for dynamic libraries
at run time.
setup-eggs
A new-line separated list of eggs that need to be installed
beforehand. It is useful to meet the `setup_requires` requirement.
patch-binary
The path to the patch executable.
patches
A new-line separated list of patches to apply when building.
patch-options
Options to give to the patch program when applying patches.
patch-revision
An integer to specify the revision (default is the number of
patches).
define
A comma-separated list of names of C preprocessor variables to
define.
...
...
@@ -434,8 +451,8 @@ Create a clean buildout.cfg w/o the checkenv recipe, and delete the recipe:
... """ % dict(server=link_server))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Develop: '/sample-buildout/recipes'
Uninstalling checkenv.
Uninstalling extdemo.
Uninstalling checkenv.
Installing extdemo...
>>> rmdir(sample_buildout, 'recipes')
...
...
@@ -463,6 +480,10 @@ rpath
A new-line separated list of directories to search for dynamic libraries
at run time.
setup-eggs
A new-line separated list of eggs that need to be installed
beforehand. It is useful to meet the `setup_requires` requirement.
define
A comma-separated list of names of C preprocessor variables to
define.
...
...
@@ -499,6 +520,10 @@ swig-cpp
swig-opts
List of SWIG command line options
environment
The name of a section with additional environment variables. The
environment variables are set before the egg is built.
To illustrate this, we'll use a directory containing the extdemo example from the earlier section:
...
...
zc.recipe.egg_/src/zc/recipe/egg/egg.py
View file @
3b16f5af
...
...
@@ -51,11 +51,44 @@ class Eggs(object):
                if host.strip() != ''])
        self.allow_hosts = allow_hosts
        self.buildout_dir = b_options['directory']
        options['eggs-directory'] = b_options['eggs-directory']
        options['_e'] = options['eggs-directory']  # backward compat.
        options['develop-eggs-directory'] = b_options['develop-eggs-directory']
        options['_d'] = options['develop-eggs-directory']  # backward compat.

    def _get_patch_dict(self, options, egg=None):
        patch_dict = {}
        global_patch_binary = options.get('patch-binary', 'patch')
        if egg:
            egg = re.sub('[<>=].*', '', egg)
            egg_list = [egg]
        else:
            egg_list = [x[:-8] for x in options.keys() if x.endswith('-patches')]
        def get_option(egg, key, default):
            if len(egg_list) == 1:
                return options.get('%s-%s' % (egg, key),
                                   options.get(key, default))
            else:
                return options.get('%s-%s' % (egg, key), default)
        for egg in egg_list:
            patches = filter(lambda x: x,
                             map(lambda x: x.strip(),
                                 get_option(egg, 'patches', '').splitlines()))
            patches = list(patches)
            if not patches:
                continue
            patch_options = get_option(egg, 'patch-options', '-p0').split()
            patch_binary = get_option(egg, 'patch-binary', global_patch_binary)
            patch_revision = int(
                get_option(egg, 'patch-revision', len(patches)))
            patch_dict[egg] = {
                'patches': patches,
                'patch_options': patch_options,
                'patch_binary': patch_binary,
                'patch_revision': patch_revision,
            }
        return patch_dict

    def working_set(self, extra=()):
"""Separate method to just get the working set
...
...
@@ -77,6 +110,7 @@ class Eggs(object):
            distributions=orig_distributions + list(extra),
            develop_eggs_dir=options['develop-eggs-directory'],
            eggs_dir=options['eggs-directory'],
            buildout_dir=self.buildout_dir,
            offline=(buildout_section.get('offline') == 'true'),
            newest=(buildout_section.get('newest') == 'true'),
            links=self.links,
...
...
@@ -98,6 +132,7 @@ class Eggs(object):
        distributions,
        eggs_dir,
        develop_eggs_dir,
        buildout_dir,
        offline=False,
        newest=True,
        links=(),
...
...
@@ -131,6 +166,7 @@ class Eggs(object):
            [develop_eggs_dir, eggs_dir])
        else:
            patch_dict = self._get_patch_dict(self.options)
            ws = zc.buildout.easy_install.install(
                distributions, eggs_dir,
                links=links,
...
...
@@ -138,9 +174,10 @@ class Eggs(object):
                path=[develop_eggs_dir],
                newest=newest,
                allow_hosts=allow_hosts,
                allow_unknown_extras=allow_unknown_extras)
                allow_unknown_extras=allow_unknown_extras,
                patch_dict=patch_dict)
        ws = zc.buildout.easy_install.sort_working_set(
            ws, eggs_dir, develop_eggs_dir
            ws, buildout_dir, eggs_dir, develop_eggs_dir
        )
        cache_storage[cache_key] = ws
...
...
zc.recipe.egg_/src/zc/recipe/egg/patches.rst
0 → 100644
View file @
3b16f5af
Patching eggs before installation
---------------------------------
The SlapOS extensions of ``zc.recipe.egg`` support applying patches before installing eggs.
The syntax is to use a version with the magic string ``SlapOSPatched`` plus the number of
patches to apply.
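A sketch of how such a version pin could be decoded, assuming only the convention described above (the helper is hypothetical, not part of the recipe; note that the installed egg normalizes the marker to lowercase ``slapospatched``):

```python
import re

def parse_slapos_patched(version):
    """Split '1.1+SlapOSPatched001' into ('1.1', 1).

    Returns the base version and the patch count, or (version, None)
    when the magic marker is absent. Matching is case-insensitive.
    """
    m = re.match(r'(?i)(.*)\+slapospatched(\d+)$', version)
    if m:
        return m.group(1), int(m.group(2))
    return version, None

print(parse_slapos_patched('1.1+SlapOSPatched001'))  # -> ('1.1', 1)
print(parse_slapos_patched('1.1'))                   # -> ('1.1', None)
```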
Let's use a patch for demoneeded egg:
>>> write(sample_buildout, 'demoneeded.patch',
... r"""diff -ru before/demoneeded-1.1/eggrecipedemoneeded.py after/demoneeded-1.1/eggrecipedemoneeded.py
... --- before/demoneeded-1.1/eggrecipedemoneeded.py 2020-09-08 09:27:36.000000000 +0200
... +++ after/demoneeded-1.1/eggrecipedemoneeded.py 2020-09-08 09:46:16.482243822 +0200
... @@ -1,3 +1,3 @@
... -y=1
... +y="patched demoneeded"
... def f():
... pass
... \ No newline at end of file
... """)
First, we install demoneeded directly:
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... parts = demoneeded
...
... [demoneeded]
... recipe = zc.recipe.egg:eggs
... eggs = demoneeded
... find-links = %(server)s
... index = %(server)s/index
... demoneeded-patches =
... ./demoneeded.patch#4b8ad56711dd0d898a2b7957e9604079
... demoneeded-patch-options = -p2
...
... [versions]
... demoneeded = 1.1+SlapOSPatched001
... """ % dict(server=link_server))
When running buildout, we have a warning that a different version is installed, but that's not fatal.
>>> print_(system(buildout), end='')
Installing demoneeded.
patching file eggrecipedemoneeded.py
Installing demoneeded 1.1
Caused installation of a distribution:
demoneeded 1.1+slapospatched001
with a different version.
The installed egg has the slapospatched001 marker
>>> ls(sample_buildout, 'eggs')
d demoneeded-1.1+slapospatched001-pyN.N.egg
- setuptools-0.7-py2.3.egg
d zc.buildout-1.0-py2.3.egg
The code of the egg has been patched:
>>> import glob
>>> import os.path
>>> cat(glob.glob(os.path.join(sample_buildout, 'eggs', 'demoneeded-1.1+slapospatched001*', 'eggrecipedemoneeded.py'))[0])
y="patched demoneeded"
def f():
pass
Reset the buildout state and remove the installed egg:
>>> remove('.installed.cfg')
>>> rmdir(glob.glob(os.path.join(sample_buildout, 'eggs', 'demoneeded-1.1+slapospatched001*'))[0])
In the previous example we applied patches to an egg installed directly, but
the same technique can be used to patch eggs installed as dependencies.
In this example we install demo and apply a patch to demoneeded, which is a dependency of demo.
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... parts = demo
...
... [demo]
... recipe = zc.recipe.egg
... eggs = demo
... find-links = %(server)s
... index = %(server)s/index
... demoneeded-patches =
... ./demoneeded.patch#4b8ad56711dd0d898a2b7957e9604079
... demoneeded-patch-options = -p2
...
... [versions]
... demoneeded = 1.1+SlapOSPatched001
... """ % dict(server=link_server))
When running buildout, we again get the warning that a different version was installed.
>>> print_(system(buildout), end='')
Installing demo.
Getting distribution for 'demo'.
Got demo 0.3.
patching file eggrecipedemoneeded.py
Installing demoneeded 1.1
Caused installation of a distribution:
demoneeded 1.1+slapospatched001
with a different version.
Generated script '/sample-buildout/bin/demo'.
The installed egg carries the ``slapospatched001`` marker:
>>> ls(sample_buildout, 'eggs')
d demo-0.3-pyN.N.egg
d demoneeded-1.1+slapospatched001-pyN.N.egg
- setuptools-0.7-py2.3.egg
d zc.buildout-1.0-py2.3.egg
If we run the demo script we see the patch was applied:
>>> print_(system(join(sample_buildout, 'bin', 'demo')), end='')
3 patched demoneeded
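Conceptually, applying each configured patch amounts to invoking the ``patch`` utility with the options from ``demoneeded-patch-options`` inside the unpacked source tree. A hypothetical command builder (illustrative only; the recipe's internals may differ) could look like:

```python
def patch_command(patch_file, options):
    """Build the argv for applying one patch file with the configured
    options (e.g. '-p2'), reading the patch via 'patch -i <file>'."""
    return ['patch'] + options.split() + ['-i', patch_file]

cmd = patch_command('demoneeded.patch', '-p2')
print(cmd)  # ['patch', '-p2', '-i', 'demoneeded.patch']
```

The ``-p2`` strip level matches the patch above, whose paths have two leading components (``before/demoneeded-1.1/...``).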
zc.recipe.egg_/src/zc/recipe/egg/tests.py
...
...
@@ -100,6 +100,26 @@ def test_suite():
zc.buildout.testing.not_found,
])
),
        doctest.DocFileSuite(
            'patches.rst',
setUp=setUp, tearDown=zc.buildout.testing.buildoutTearDown,
optionflags=doctest.NORMALIZE_WHITESPACE | doctest.ELLIPSIS,
checker=renormalizing.RENormalizing([
zc.buildout.testing.normalize_path,
zc.buildout.testing.normalize_endings,
zc.buildout.testing.normalize_script,
zc.buildout.testing.normalize_egg_py,
zc.buildout.tests.normalize_bang,
zc.buildout.tests.normalize_S,
zc.buildout.testing.not_found,
zc.buildout.testing.easy_install_deprecated,
                (re.compile(r'[d-] zc.buildout(-\S+)?[.]egg(-link)?'),
                 'zc.buildout.egg'),
                (re.compile(r'[d-] setuptools-[^-]+-'), 'setuptools-X-'),
                (re.compile(r'eggs\\\\demo'), 'eggs/demo'),
                (re.compile(r'[a-zA-Z]:\\\\foo\\\\bar'), '/foo/bar'),
])
),
]
if not WINDOWS:
suites.append(
...
...