Commit cca3e01b authored by Jason R. Coombs, committed by GitHub

Merge pull request #2397 from pypa/feature/2093-docs-revamp

Apply docs revamp
parents 662816b6 8cca69b8
doc: simplify index and group deprecated files
doc overhaul step 2: break main doc into multiple sections
doc overhaul step 3: update userguide
"Eggsecutable" Scripts
----------------------
.. deprecated:: 45.3.0
Occasionally, there are situations where it's desirable to make an ``.egg``
file directly executable. You can do this by including an entry point such
as the following::
    setup(
        # other arguments here...
        entry_points={
            "setuptools.installation": [
                "eggsecutable = my_package.some_module:main_func",
            ]
        }
    )
Any eggs built from the above setup script will include a short executable
prelude that imports and calls ``main_func()`` from ``my_package.some_module``.
The prelude can be run on Unix-like platforms (including Mac and Linux) by
invoking the egg with ``/bin/sh``, or by enabling execute permissions on the
``.egg`` file. For the executable prelude to run, the appropriate version of
Python must be available via the ``PATH`` environment variable, under its
"long" name. That is, if the egg is built for Python 2.3, there must be a
``python2.3`` executable present in a directory on ``PATH``.
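For completeness, here is a sketch of the module that the entry point above
points at; the function body is purely illustrative::

    # my_package/some_module.py
    def main_func():
        print("Hello from an eggsecutable egg")

    if __name__ == "__main__":
        main_func()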
IMPORTANT NOTE: Eggs with an "eggsecutable" header cannot be renamed, or
invoked via symlinks. They *must* be invoked using their original filename, in
order to ensure that, once running, ``pkg_resources`` will know what project
and version is in use. The header script will check this and exit with an
error if the ``.egg`` file has been renamed or is invoked via a symlink that
changes its base name.
======================================================
Guides on backward compatibility & deprecated practice
======================================================
``Setuptools`` has undergone tremendous changes since its first debut. As its
development continues to roll forward, many of the practices and mechanisms it
established are now considered deprecated. But they still remain relevant, as a
plethora of libraries continue to depend on them. Many people also find it
necessary to equip themselves with the knowledge to better support backward
compatibility. This guide aims to provide the essential information for such
objectives.
.. toctree::
:maxdepth: 1
python3
python_eggs
easy_install
distutils-legacy
@@ -7,13 +7,8 @@ designed to facilitate packaging Python projects.
Documentation content:
.. toctree::
:maxdepth: 2
:maxdepth: 1
setuptools
pkg_resources
python3
development
roadmap
Deprecated: Easy Install <easy_install>
distutils-legacy
history
User guide <userguide/index>
Development guide <development>
Backward compatibility & deprecated practice <deprecated/index>
-----------------
Command Reference
-----------------
.. _alias:
``alias`` - Define shortcuts for commonly used commands
=======================================================
Sometimes, you need to use the same commands over and over, but you can't
necessarily set them as defaults. For example, if you produce both development
snapshot releases and "stable" releases of a project, you may want to put
the distributions in different places, or use different ``egg_info`` tagging
options, etc. In these cases, it doesn't make sense to set the options in
a distutils configuration file, because the values of the options change based
on what you're trying to do.
Setuptools therefore allows you to define "aliases" - shortcut names for
an arbitrary string of commands and options, using ``setup.py alias aliasname
expansion``, where aliasname is the name of the new alias, and the remainder of
the command line supplies its expansion. For example, this command defines
a sitewide alias called "daily", that sets various ``egg_info`` tagging
options::
setup.py alias --global-config daily egg_info --tag-build=development
Once the alias is defined, it can then be used with other setup commands,
e.g.::
setup.py daily bdist_egg # generate a daily-build .egg file
setup.py daily sdist # generate a daily-build source distro
setup.py daily sdist bdist_egg # generate both
The above commands are interpreted as if the word ``daily`` were replaced with
``egg_info --tag-build=development``.
Note that setuptools will expand each alias *at most once* in a given command
line. This serves two purposes. First, if you accidentally create an alias
loop, it will have no effect; you'll instead get an error message about an
unknown command. Second, it allows you to define an alias for a command that
uses that command. For example, this (project-local) alias::
setup.py alias bdist_egg bdist_egg rotate -k1 -m.egg
redefines the ``bdist_egg`` command so that it always runs the ``rotate``
command afterwards to delete all but the newest egg file. It doesn't loop
indefinitely on ``bdist_egg`` because the alias is only expanded once when
used.
You can remove a defined alias with the ``--remove`` (or ``-r``) option, e.g.::
setup.py alias --global-config --remove daily
would delete the "daily" alias we defined above.
Aliases can be defined on a project-specific, per-user, or sitewide basis. The
default is to define or remove a project-specific alias, but you can use any of
the `configuration file options`_ (listed under the `saveopts`_ command, below)
to determine which distutils configuration file the alias will be added to
(or removed from).
Note that if you omit the "expansion" argument to the ``alias`` command,
you'll get output showing that alias' current definition (and what
configuration file it's defined in). If you omit the alias name as well,
you'll get a listing of all current aliases along with their configuration
file locations.
``bdist_egg`` - Create a Python Egg for the project
===================================================
.. warning::
**eggs** are deprecated in favor of wheels, and not supported by pip.
This command generates a Python Egg (``.egg`` file) for the project. Python
Eggs are the preferred binary distribution format for EasyInstall, because they
are cross-platform (for "pure" packages), directly importable, and contain
project metadata including scripts and information about the project's
dependencies. They can be simply downloaded and added to ``sys.path``
directly, or they can be placed in a directory on ``sys.path`` and then
automatically discovered by the egg runtime system.
This command runs the `egg_info`_ command (if it hasn't already run) to update
the project's metadata (``.egg-info``) directory. If you have added any extra
metadata files to the ``.egg-info`` directory, those files will be included in
the new egg file's metadata directory, for use by the egg runtime system or by
any applications or frameworks that use that metadata.
You won't usually need to specify any special options for this command; just
use ``bdist_egg`` and you're done. But there are a few options that may
be occasionally useful:
``--dist-dir=DIR, -d DIR``
Set the directory where the ``.egg`` file will be placed. If you don't
supply this, then the ``--dist-dir`` setting of the ``bdist`` command
will be used, which is usually a directory named ``dist`` in the project
directory.
``--plat-name=PLATFORM, -p PLATFORM``
Set the platform name string that will be embedded in the egg's filename
(assuming the egg contains C extensions). This can be used to override
the distutils default platform name with something more meaningful. Keep
in mind, however, that the egg runtime system expects to see eggs with
distutils platform names, so it may ignore or reject eggs with non-standard
platform names. Similarly, the EasyInstall program may ignore them when
searching web pages for download links. However, if you are
cross-compiling or doing some other unusual things, you might find a use
for this option.
``--exclude-source-files``
Don't include any modules' ``.py`` files in the egg, just compiled Python,
C, and data files. (Note that this doesn't affect any ``.py`` files in the
EGG-INFO directory or its subdirectories, since for example there may be
scripts with a ``.py`` extension which must still be retained.) We don't
recommend that you use this option except for packages that are being
bundled for proprietary end-user applications, or for "embedded" scenarios
where space is at an absolute premium. On the other hand, if your package
is going to be installed and used in compressed form, you might as well
exclude the source because Python's ``traceback`` module doesn't currently
understand how to display zipped source code anyway, or how to deal with
files that are in a different place from where their code was compiled.
There are also some options you will probably never need, but which are there
because they were copied from similar ``bdist`` commands used as an example for
creating this one. They may be useful for testing and debugging, however,
which is why we kept them:
``--keep-temp, -k``
Keep the contents of the ``--bdist-dir`` tree around after creating the
``.egg`` file.
``--bdist-dir=DIR, -b DIR``
Set the temporary directory for creating the distribution. The entire
contents of this directory are zipped to create the ``.egg`` file, after
running various installation commands to copy the package's modules, data,
and extensions here.
``--skip-build``
Skip doing any "build" commands; just go straight to the
install-and-compress phases.
.. _develop:
``develop`` - Deploy the project source in "Development Mode"
=============================================================
This command allows you to deploy your project's source for use in one or more
"staging areas" where it will be available for importing. This deployment is
done in such a way that changes to the project source are immediately available
in the staging area(s), without needing to run a build or install step after
each change.
The ``develop`` command works by creating an ``.egg-link`` file (named for the
project) in the given staging area. If the staging area is Python's
``site-packages`` directory, it also updates an ``easy-install.pth`` file so
that the project is on ``sys.path`` by default for all programs run using that
Python installation.
The ``develop`` command also installs wrapper scripts in the staging area (or
a separate directory, as specified) that will ensure the project's dependencies
are available on ``sys.path`` before running the project's source scripts.
And, it ensures that any missing project dependencies are available in the
staging area, by downloading and installing them if necessary.
Last, but not least, the ``develop`` command invokes the ``build_ext -i``
command to ensure any C extensions in the project have been built and are
up-to-date, and the ``egg_info`` command to ensure the project's metadata is
updated (so that the runtime and wrappers know what the project's dependencies
are). If you make any changes to the project's setup script or C extensions,
you should rerun the ``develop`` command against all relevant staging areas to
keep the project's scripts, metadata and extensions up-to-date. Most other
kinds of changes to your project should not require any build operations or
rerunning ``develop``, but keep in mind that even minor changes to the setup
script (e.g. changing an entry point definition) require you to re-run the
``develop`` or ``test`` commands to keep the distribution updated.
Here are some of the options that the ``develop`` command accepts. Note that
they affect the project's dependencies as well as the project itself, so if you
have dependencies that need to be installed and you use ``--exclude-scripts``
(for example), the dependencies' scripts will not be installed either! For
this reason, you may want to use pip to install the project's dependencies
before using the ``develop`` command, if you need finer control over the
installation options for dependencies.
``--uninstall, -u``
Un-deploy the current project. You may use the ``--install-dir`` or ``-d``
option to designate the staging area. The created ``.egg-link`` file will
be removed, if present and it is still pointing to the project directory.
The project directory will be removed from ``easy-install.pth`` if the
staging area is Python's ``site-packages`` directory.
Note that this option currently does *not* uninstall script wrappers! You
must uninstall them yourself, or overwrite them by using pip to install a
different version of the package. You can also avoid installing script
wrappers in the first place, if you use the ``--exclude-scripts`` (aka
``-x``) option when you run ``develop`` to deploy the project.
``--multi-version, -m``
"Multi-version" mode. Specifying this option prevents ``develop`` from
adding an ``easy-install.pth`` entry for the project(s) being deployed, and
if an entry for any version of a project already exists, the entry will be
removed upon successful deployment. In multi-version mode, no specific
version of the package is available for importing, unless you use
``pkg_resources.require()`` to put it on ``sys.path``, or you are running
a wrapper script generated by ``setuptools``. (In which case the wrapper
script calls ``require()`` for you.)
Note that if you install to a directory other than ``site-packages``,
this option is automatically in effect, because ``.pth`` files can only be
used in ``site-packages`` (at least in Python 2.3 and 2.4). So, if you use
the ``--install-dir`` or ``-d`` option (or they are set via configuration
file(s)) your project and its dependencies will be deployed in multi-
version mode.
``--install-dir=DIR, -d DIR``
Set the installation directory (staging area). If this option is not
directly specified on the command line or in a distutils configuration
file, the distutils default installation location is used. Normally, this
will be the ``site-packages`` directory, but if you are using distutils
configuration files, setting things like ``prefix`` or ``install_lib``,
then those settings are taken into account when computing the default
staging area.
``--script-dir=DIR, -s DIR``
Set the script installation directory. If you don't supply this option
(via the command line or a configuration file), but you *have* supplied
an ``--install-dir`` (via command line or config file), then this option
defaults to the same directory, so that the scripts will be able to find
their associated package installation. Otherwise, this setting defaults
to the location where the distutils would normally install scripts, taking
any distutils configuration file settings into account.
``--exclude-scripts, -x``
Don't deploy script wrappers. This is useful if you don't want to disturb
existing versions of the scripts in the staging area.
``--always-copy, -a``
Copy all needed distributions to the staging area, even if they
are already present in another directory on ``sys.path``. By default, if
a requirement can be met using a distribution that is already available in
a directory on ``sys.path``, it will not be copied to the staging area.
``--egg-path=DIR``
Force the generated ``.egg-link`` file to use a specified relative path
to the source directory. This can be useful in circumstances where your
installation directory is being shared by code running under multiple
platforms (e.g. Mac and Windows) which have different absolute locations
for the code under development, but the same *relative* locations with
respect to the installation directory. If you use this option when
installing, you must supply the same relative path when uninstalling.
In addition to the above options, the ``develop`` command also accepts all of
the same options accepted by ``easy_install``. If you've configured any
``easy_install`` settings in your ``setup.cfg`` (or other distutils config
files), the ``develop`` command will use them as defaults, unless you override
them in a ``[develop]`` section or on the command line.
.. _egg_info:
``egg_info`` - Create egg metadata and set build tags
=====================================================
This command performs two operations: it updates a project's ``.egg-info``
metadata directory (used by the ``bdist_egg``, ``develop``, and ``test``
commands), and it allows you to temporarily change a project's version string,
to support "daily builds" or "snapshot" releases. It is run automatically by
the ``sdist``, ``bdist_egg``, ``develop``, and ``test`` commands in order to
update the project's metadata, but you can also specify it explicitly in order
to temporarily change the project's version string while executing other
commands. (It also generates the ``.egg-info/SOURCES.txt`` manifest file, which
is used when you are building source distributions.)
In addition to writing the core egg metadata defined by ``setuptools`` and
required by ``pkg_resources``, this command can be extended to write other
metadata files as well, by defining entry points in the ``egg_info.writers``
group. See the section on `Adding new EGG-INFO Files`_ below for more details.
Note that using additional metadata writers may require you to include a
``setup_requires`` argument to ``setup()`` in order to ensure that the desired
writers are available on ``sys.path``.
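For illustration, here is a minimal sketch of such a writer; the module name,
entry point name, and file contents are hypothetical, not part of setuptools::

    # my_plugin.py -- hypothetical metadata writer
    #
    # The plugin's own setup() would register it with an entry point such as:
    #
    #     entry_points={
    #         "egg_info.writers": ["build_info.txt = my_plugin:write_build_info"],
    #     }

    def write_build_info(cmd, basename, filename):
        # ``cmd`` is the running egg_info command instance; its write_file()
        # helper honors --dry-run and logs the file being written.
        cmd.write_file("build info", filename, "built-by: example\n")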
Release Tagging Options
-----------------------
The following options can be used to modify the project's version string for
all remaining commands on the setup command line. The options are processed
in the order shown, so if you use more than one, the requested tags will be
added in the following order:
``--tag-build=NAME, -b NAME``
Append NAME to the project's version string. Due to the way setuptools
processes "pre-release" version suffixes beginning with the letters "a"
through "e" (like "alpha", "beta", and "candidate"), you will usually want
to use a tag like ".build" or ".dev", as this will cause the version number
to be considered *lower* than the project's default version. (If you
want to make the version number *higher* than the default version, you can
always leave off --tag-build and then use one or both of the following
options.)
If you have a default build tag set in your ``setup.cfg``, you can suppress
it on the command line using ``-b ""`` or ``--tag-build=""`` as an argument
to the ``egg_info`` command.
``--tag-date, -d``
Add a date stamp of the form "-YYYYMMDD" (e.g. "-20050528") to the
project's version number.
``--no-date, -D``
Don't include a date stamp in the version number. This option is included
so you can override a default setting in ``setup.cfg``.
(Note: Because these options modify the version number used for source and
binary distributions of your project, you should first make sure that you know
how the resulting version numbers will be interpreted by automated tools
like pip. See the section above on `Specifying Your Project's Version`_ for an
explanation of pre- and post-release tags, as well as tips on how to choose and
verify a versioning scheme for your project.)
For advanced uses, there is one other option that can be set, to change the
location of the project's ``.egg-info`` directory. Commands that need to find
the project's source directory or metadata should get it from this setting:
Other ``egg_info`` Options
--------------------------
``--egg-base=SOURCEDIR, -e SOURCEDIR``
Specify the directory that should contain the .egg-info directory. This
should normally be the root of your project's source tree (which is not
necessarily the same as your project directory; some projects use a ``src``
or ``lib`` subdirectory as the source root). You should not normally need
to specify this directory, as it is normally determined from the
``package_dir`` argument to the ``setup()`` function, if any. If there is
no ``package_dir`` set, this option defaults to the current directory.
``egg_info`` Examples
---------------------
Creating a dated "nightly build" snapshot egg::
setup.py egg_info --tag-date --tag-build=DEV bdist_egg
Creating a release with no version tags, even if some default tags are
specified in ``setup.cfg``::
setup.py egg_info -RDb "" sdist bdist_egg
(Notice that ``egg_info`` must always appear on the command line *before* any
commands that you want the version changes to apply to.)
.. _rotate:
``rotate`` - Delete outdated distribution files
===============================================
As you develop new versions of your project, your distribution (``dist``)
directory will gradually fill up with older source and/or binary distribution
files. The ``rotate`` command lets you automatically clean these up, keeping
only the N most-recently modified files matching a given pattern.
``--match=PATTERNLIST, -m PATTERNLIST``
Comma-separated list of glob patterns to match. This option is *required*.
The project name and ``-*`` are prepended to the supplied patterns, in order
to match only distributions belonging to the current project (in case you
have a shared distribution directory for multiple projects). Typically,
you will use a glob pattern like ``.zip`` or ``.egg`` to match files of
the specified type. Note that each supplied pattern is treated as a
distinct group of files for purposes of selecting files to delete.
``--keep=COUNT, -k COUNT``
Number of matching distributions to keep. For each group of files
identified by a pattern specified with the ``--match`` option, delete all
but the COUNT most-recently-modified files in that group. This option is
*required*.
``--dist-dir=DIR, -d DIR``
Directory where the distributions are. This defaults to the value of the
``bdist`` command's ``--dist-dir`` option, which will usually be the
project's ``dist`` subdirectory.
**Example 1**: Delete all .tar.gz files from the distribution directory, except
for the 3 most recently modified ones::
setup.py rotate --match=.tar.gz --keep=3
**Example 2**: Delete all Python 2.3 or Python 2.4 eggs from the distribution
directory, except the most recently modified one for each Python version::
setup.py rotate --match=-py2.3*.egg,-py2.4*.egg --keep=1
.. _saveopts:
``saveopts`` - Save used options to a configuration file
========================================================
Finding and editing ``distutils`` configuration files can be a pain, especially
since you also have to translate the configuration options from command-line
form to the proper configuration file format. You can avoid these hassles by
using the ``saveopts`` command. Just add it to the command line to save the
options you used. For example, this command builds the project using
the ``mingw32`` C compiler, then saves the --compiler setting as the default
for future builds (even those run implicitly by the ``install`` command)::
setup.py build --compiler=mingw32 saveopts
The ``saveopts`` command saves all options for every command specified on the
command line to the project's local ``setup.cfg`` file, unless you use one of
the `configuration file options`_ to change where the options are saved. For
example, this command does the same as above, but saves the compiler setting
to the site-wide (global) distutils configuration::
setup.py build --compiler=mingw32 saveopts -g
Note that it doesn't matter where you place the ``saveopts`` command on the
command line; it will still save all the options specified for all commands.
For example, this is another valid way to spell the last example::
setup.py saveopts -g build --compiler=mingw32
Note, however, that all of the commands specified are always run, regardless of
where ``saveopts`` is placed on the command line.
Configuration File Options
--------------------------
Normally, settings such as options and aliases are saved to the project's
local ``setup.cfg`` file. But you can override this and save them to the
global or per-user configuration files, or to a manually-specified filename.
``--global-config, -g``
Save settings to the global ``distutils.cfg`` file inside the ``distutils``
package directory. You must have write access to that directory to use
this option. You also can't combine this option with ``-u`` or ``-f``.
``--user-config, -u``
Save settings to the current user's ``~/.pydistutils.cfg`` (POSIX) or
``$HOME/pydistutils.cfg`` (Windows) file. You can't combine this option
with ``-g`` or ``-f``.
``--filename=FILENAME, -f FILENAME``
Save settings to the specified configuration file. You can't
combine this option with ``-g`` or ``-u``. Note that if you specify a
non-standard filename, the ``distutils`` and ``setuptools`` will not
use the file's contents. This option is mainly included for use in
testing.
These options are used by other ``setuptools`` commands that modify
configuration files, such as the `alias`_ and `setopt`_ commands.
.. _setopt:
``setopt`` - Set a distutils or setuptools option in a config file
==================================================================
This command is mainly for use by scripts, but it can also be used as a quick
and dirty way to change a distutils configuration option without having to
remember what file the options are in and then open an editor.
**Example 1**. Set the default C compiler to ``mingw32`` (using long option
names)::
setup.py setopt --command=build --option=compiler --set-value=mingw32
**Example 2**. Remove any setting for the distutils default package
installation directory (short option names)::
setup.py setopt -c install -o install_lib -r
Options for the ``setopt`` command:
``--command=COMMAND, -c COMMAND``
Command to set the option for. This option is required.
``--option=OPTION, -o OPTION``
The name of the option to set. This option is required.
``--set-value=VALUE, -s VALUE``
The value to set the option to. Not needed if ``-r`` or ``--remove`` is
set.
``--remove, -r``
Remove (unset) the option, instead of setting it.
In addition to the above options, you may use any of the `configuration file
options`_ (listed under the `saveopts`_ command, above) to determine which
distutils configuration file the option will be added to (or removed from).
.. _test:
``test`` - Build package and run a unittest suite
=================================================
.. warning::
``test`` is deprecated and will be removed in a future version. Users
looking for a generic test entry point independent of test runner are
encouraged to use `tox <https://tox.readthedocs.io>`_.
When doing test-driven development, or running automated builds that need
testing before they are deployed for downloading or use, it's often useful
to be able to run a project's unit tests without actually deploying the project
anywhere, even using the ``develop`` command. The ``test`` command runs a
project's unit tests without actually deploying it, by temporarily putting the
project's source on ``sys.path``, after first running ``build_ext -i`` and
``egg_info`` to ensure that any C extensions and project metadata are
up-to-date.
To use this command, your project's tests must be wrapped in a ``unittest``
test suite by either a function, a ``TestCase`` class or method, or a module
or package containing ``TestCase`` classes. If the named suite is a module,
and the module has an ``additional_tests()`` function, it is called and the
result (which must be a ``unittest.TestSuite``) is added to the tests to be
run. If the named suite is a package, any submodules and subpackages are
recursively added to the overall test suite. (Note: if your project specifies
a ``test_loader``, the rules for processing the chosen ``test_suite`` may
differ; see the :ref:`test_loader <test_loader>` documentation for more details.)
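As an illustration, here is a sketch of a module that could be named as the
``test_suite``, including the optional ``additional_tests()`` hook; the module
name and the doctest target are assumptions for the example::

    # my_package/tests/test_all.py -- hypothetical test module
    import doctest
    import unittest

    class TestBasics(unittest.TestCase):
        def test_truth(self):
            self.assertTrue(True)

    def additional_tests():
        # Called when the named test_suite is this module; the returned
        # unittest.TestSuite is added to the tests to be run.  Assumes
        # my_package.some_module contains doctests.
        suite = unittest.TestSuite()
        suite.addTests(doctest.DocTestSuite("my_package.some_module"))
        return suite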
Note that many test systems including ``doctest`` support wrapping their
non-``unittest`` tests in ``TestSuite`` objects. So, if you are using a test
package that does not support this, we suggest you encourage its developers to
implement test suite support, as this is a convenient and standard way to
aggregate a collection of tests to be run under a common test harness.
By default, tests will be run in the "verbose" mode of the ``unittest``
package's text test runner, but you can get the "quiet" mode (just dots) if
you supply the ``-q`` or ``--quiet`` option, either as a global option to
the setup script (e.g. ``setup.py -q test``) or as an option for the ``test``
command itself (e.g. ``setup.py test -q``). There is one other option
available:
``--test-suite=NAME, -s NAME``
Specify the test suite (or module, class, or method) to be run
(e.g. ``some_module.test_suite``). The default for this option can be
set by giving a ``test_suite`` argument to the ``setup()`` function, e.g.::
    setup(
        # ...
        test_suite="my_package.tests.test_all"
    )
If you did not set a ``test_suite`` in your ``setup()`` call, and do not
provide a ``--test-suite`` option, an error will occur.
New in 41.5.0: Deprecated the test command.
.. _upload:
``upload`` - Upload source and/or egg distributions to PyPI
===========================================================
The ``upload`` command was deprecated in version 40.0 and removed in version
42.0. Use `twine <https://pypi.org/p/twine>`_ instead.
For more information on the current best practices in uploading your packages
to PyPI, see the Python Packaging User Guide's "Packaging Python Projects"
tutorial specifically the section on `uploading the distribution archives
<https://packaging.python.org/tutorials/packaging-projects/#uploading-the-distribution-archives>`_.
====================
Data Files Support
====================
The distutils have traditionally allowed installation of "data files", which
are placed in a platform-specific location. However, the most common use case
for data files distributed with a package is for use *by* the package, usually
by including the data files in the package directory.
Setuptools offers three ways to specify data files to be included in your
packages. First, you can simply use the ``include_package_data`` keyword,
e.g.::
    from setuptools import setup, find_packages
    setup(
        ...
        include_package_data=True
    )
This tells setuptools to install any data files it finds in your packages.
The data files must be specified via the distutils' ``MANIFEST.in`` file.
(They can also be tracked by a revision control system, using an appropriate
plugin. See the section below on `Adding Support for Revision Control
Systems`_ for information on how to write such plugins.)
If you want finer-grained control over what files are included (for example,
if you have documentation files in your package directories and want to exclude
them from installation), then you can also use the ``package_data`` keyword,
e.g.::
    from setuptools import setup, find_packages
    setup(
        ...
        package_data={
            # If any package contains *.txt or *.rst files, include them:
            "": ["*.txt", "*.rst"],
            # And include any *.msg files found in the "hello" package, too:
            "hello": ["*.msg"],
        }
    )
The ``package_data`` argument is a dictionary that maps from package names to
lists of glob patterns. The globs may include subdirectory names, if the data
files are contained in a subdirectory of the package. For example, if the
package tree looks like this::
    setup.py
    src/
        mypkg/
            __init__.py
            mypkg.txt
            data/
                somefile.dat
                otherdata.dat
The setuptools setup file might look like this::
    from setuptools import setup, find_packages
    setup(
        ...
        packages=find_packages("src"),  # include all packages under src
        package_dir={"": "src"},        # tell distutils packages are under src

        package_data={
            # If any package contains *.txt files, include them:
            "": ["*.txt"],
            # And include any *.dat files found in the "data" subdirectory
            # of the "mypkg" package, also:
            "mypkg": ["data/*.dat"],
        }
    )
Notice that if you list patterns in ``package_data`` under the empty string,
these patterns are used to find files in every package, even ones that also
have their own patterns listed. Thus, in the above example, the ``mypkg.txt``
file gets included even though it's not listed in the patterns for ``mypkg``.
Also notice that if you use paths, you *must* use a forward slash (``/``) as
the path separator, even if you are on Windows. Setuptools automatically
converts slashes to appropriate platform-specific separators at build time.
If datafiles are contained in a subdirectory of a package that isn't a package
itself (no ``__init__.py``), then the subdirectory names (or ``*``) are required
in the ``package_data`` argument (as shown above with ``"data/*.dat"``).
When building an ``sdist``, the datafiles are also drawn from the
``package_name.egg-info/SOURCES.txt`` file, so make sure that this is removed if
the ``setup.py`` ``package_data`` list is updated before calling ``setup.py``.
(Note: although the ``package_data`` argument was previously only available in
``setuptools``, it was also added to the Python ``distutils`` package as of
Python 2.4; there is `some documentation for the feature`__ available on the
python.org website. If using the setuptools-specific ``include_package_data``
argument, files specified by ``package_data`` will *not* be automatically
added to the manifest unless they are listed in the MANIFEST.in file.)
__ https://docs.python.org/3/distutils/setupscript.html#installing-package-data
Sometimes, the ``include_package_data`` or ``package_data`` options alone
aren't sufficient to precisely define what files you want included. For
example, you may want to include package README files in your revision control
system and source distributions, but exclude them from being installed. So,
setuptools offers an ``exclude_package_data`` option as well, that allows you
to do things like this::
    from setuptools import setup, find_packages
    setup(
        ...
        packages=find_packages("src"),  # include all packages under src
        package_dir={"": "src"},        # tell distutils packages are under src

        include_package_data=True,      # include everything in source control

        # ...but exclude README.txt from all packages
        exclude_package_data={"": ["README.txt"]},
    )
The ``exclude_package_data`` option is a dictionary mapping package names to
lists of wildcard patterns, just like the ``package_data`` option. And, just
as with that option, a key of ``""`` will apply the given pattern(s) to all
packages. However, any files that match these patterns will be *excluded*
from installation, even if they were listed in ``package_data`` or were
included as a result of using ``include_package_data``.
In summary, the three options allow you to:
``include_package_data``
Accept all data files and directories matched by ``MANIFEST.in``.
``package_data``
Specify additional patterns to match files that may or may
not be matched by ``MANIFEST.in`` or found in source control.
``exclude_package_data``
Specify patterns for data files and directories that should *not* be
included when a package is installed, even if they would otherwise have
been included due to the use of the preceding options.
NOTE: Due to the way the distutils build process works, a data file that you
include in your project and then stop including may be "orphaned" in your
project's build directories, requiring you to run ``setup.py clean --all`` to
fully remove them. This may also be important for your users and contributors
if they track intermediate revisions of your project using Subversion; be sure
to let them know when you make changes that remove files from inclusion so they
can run ``setup.py clean --all``.
Accessing Data Files at Runtime
-------------------------------
Typically, existing programs manipulate a package's ``__file__`` attribute in
order to find the location of data files. However, this manipulation isn't
compatible with PEP 302-based import hooks, including importing from zip files
and Python Eggs. It is strongly recommended that, if you are using data files,
you should use the :ref:`ResourceManager API` of ``pkg_resources`` to access
them. The ``pkg_resources`` module is distributed as part of setuptools, so if
you're using setuptools to distribute your package, there is no reason not to
use its resource management API. See also `Importlib Resources`_ for
a quick example of converting code that uses ``__file__`` to use
``pkg_resources`` instead.
.. _Importlib Resources: https://docs.python.org/3/library/importlib.html#module-importlib.resources
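For example, reusing the ``mypkg`` layout from the ``package_data`` example
above, a sketch of reading a bundled data file through ``pkg_resources``::

    import pkg_resources

    # Works whether mypkg is installed as plain files or inside a zipped egg.
    data = pkg_resources.resource_string("mypkg", "data/somefile.dat")

    # Or obtain a real filesystem path (extracted to a cache if necessary):
    path = pkg_resources.resource_filename("mypkg", "data/somefile.dat")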
Non-Package Data Files
----------------------
Historically, ``setuptools`` by way of ``easy_install`` would encapsulate data
files from the distribution into the egg (see `the old docs
<https://github.com/pypa/setuptools/blob/52aacd5b276fedd6849c3a648a0014f5da563e93/docs/setuptools.txt#L970-L1001>`_). As eggs are deprecated and pip-based installs
fall back to the platform-specific location for installing data files, there is
no supported facility to reliably retrieve these resources.
Instead, the PyPA recommends that any data files you wish to be accessible at
run time be included in the package.
-----------------------------------------
Configuring setup() using setup.cfg files
-----------------------------------------
.. note:: New in 30.3.0 (8 Dec 2016).
.. important::
If compatibility with legacy builds (i.e. those not using the :pep:`517`
build API) is desired, a ``setup.py`` file containing a ``setup()`` function
call is still required even if your configuration resides in ``setup.cfg``.
``Setuptools`` allows using configuration files (usually :file:`setup.cfg`)
to define a package’s metadata and other options that are normally supplied
to the ``setup()`` function (declarative config).
This approach not only allows automation scenarios but also reduces
boilerplate code in some cases.
.. note::
This implementation has limited compatibility with the distutils2-like
``setup.cfg`` sections used by the ``pbr`` and ``d2to1`` packages.
Namely: only metadata-related keys from ``metadata`` section are supported
(except for ``description-file``); keys from ``files``, ``entry_points``
and ``backwards_compat`` are not supported.
.. code-block:: ini

    [metadata]
    name = my_package
    version = attr: src.VERSION
    description = My package description
    long_description = file: README.rst, CHANGELOG.rst, LICENSE.rst
    keywords = one, two
    license = BSD 3-Clause License
    classifiers =
        Framework :: Django
        License :: OSI Approved :: BSD License
        Programming Language :: Python :: 3
        Programming Language :: Python :: 3.5

    [options]
    zip_safe = False
    include_package_data = True
    packages = find:
    scripts =
        bin/first.py
        bin/second.py
    install_requires =
        requests
        importlib; python_version == "2.6"

    [options.package_data]
    * = *.txt, *.rst
    hello = *.msg

    [options.extras_require]
    pdf = ReportLab>=1.2; RXP
    rest = docutils>=0.3; pack ==1.1, ==1.3

    [options.packages.find]
    exclude =
        src.subpackage1
        src.subpackage2

    [options.data_files]
    /etc/my_package =
        site.d/00_default.conf
        host.d/00_default.conf
    data = data/img/logo.png, data/svg/icon.svg
Metadata and options are set in the config sections of the same name.
* Keys are the same as the keyword arguments one provides to the ``setup()``
function.
* Complex values can be written comma-separated or placed one per line
in *dangling* config values. The following are equivalent:
.. code-block:: ini

    [metadata]
    keywords = one, two

    [metadata]
    keywords =
        one
        two
* In some cases, complex values can be provided in dedicated subsections for
clarity.
* Some keys allow ``file:``, ``attr:``, ``find:``, and ``find_namespace:`` directives in
order to cover common usecases.
* Unknown keys are ignored.
Using a ``src/`` layout
=======================
One commonly used package configuration has all the module source code in a
subdirectory (often called the ``src/`` layout), like this::
    ├── src
    │   └── mypackage
    │       ├── __init__.py
    │       └── mod1.py
    ├── setup.py
    └── setup.cfg
You can set up your ``setup.cfg`` to automatically find all your packages in
the subdirectory like this:
.. code-block:: ini

    # This example contains just the necessary options for a src-layout, set up
    # the rest of the file as described above.

    [options]
    package_dir=
        =src
    packages=find:

    [options.packages.find]
    where=src
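For comparison, a sketch of the equivalent configuration expressed directly in
``setup.py``, assuming the same layout as above:

.. code-block:: python

    from setuptools import setup, find_packages

    setup(
        # ...
        package_dir={"": "src"},
        packages=find_packages(where="src"),
    )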
Specifying values
=================
Some values are treated as simple strings, some allow more logic.
Type names used below:
* ``str`` - simple string
* ``list-comma`` - dangling list or string of comma-separated values
* ``list-semi`` - dangling list or string of semicolon-separated values
* ``bool`` - ``True`` is 1, yes, true
* ``dict`` - list-comma where keys are separated from values by ``=``
* ``section`` - values are read from a dedicated (sub)section
Special directives:
* ``attr:`` - Value is read from a module attribute. ``attr:`` supports
callables and iterables; unsupported types are cast using ``str()``.
In order to support the common case of a literal value assigned to a variable
in a module containing (directly or indirectly) third-party imports,
``attr:`` first tries to read the value from the module by examining the
module's AST. If that fails, ``attr:`` falls back to importing the module
(a short sketch follows this list).
* ``file:`` - Value is read from a list of files and then concatenated
.. note::
The ``file:`` directive is sandboxed and won't reach anything outside
the directory containing ``setup.py``.
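To illustrate the ``attr:`` case mentioned in the list above, the module read
by the ``version = attr: src.VERSION`` line from the earlier example could look
like this (the layout and version value are assumptions):

.. code-block:: python

    # src/__init__.py
    #
    # Even if this module imported heavy third-party packages at the top,
    # setuptools would still try to find VERSION by statically reading the
    # AST first, and only import the module as a fallback:
    # import some_heavy_dependency

    VERSION = "1.2.3"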
Metadata
--------
.. note::
The aliases given below are supported for compatibility reasons,
but their use is not advised.
============================== ================= ================= =============== =====
Key                            Aliases           Type              Minimum Version Notes
============================== ================= ================= =============== =====
name                                             str
version                                          attr:, file:, str 39.2.0          (1)
url                            home-page         str
download_url                   download-url      str
project_urls                                     dict              38.3.0
author                                           str
author_email                   author-email      str
maintainer                                       str
maintainer_email               maintainer-email  str
classifiers                    classifier        file:, list-comma
license                                          str
license_file                                     str
license_files                                    list-comma
description                    summary           file:, str
long_description               long-description  file:, str
long_description_content_type                   str               38.6.0
keywords                                         list-comma
platforms                      platform          list-comma
provides                                         list-comma
requires                                         list-comma
obsoletes                                        list-comma
============================== ================= ================= =============== =====
.. note::
A version loaded using the ``file:`` directive must comply with PEP 440.
It is easy to accidentally put something other than a valid version
string in such a file, so validation is stricter in this case.
Notes:
1. The `version` file attribute has only been supported since 39.2.0.
Options
-------
======================= =================================== =============== =====
Key                     Type                                Minimum Version Notes
======================= =================================== =============== =====
zip_safe                bool
setup_requires          list-semi
install_requires        list-semi
extras_require          section
python_requires         str
entry_points            file:, section
use_2to3                bool
use_2to3_fixers         list-comma
use_2to3_exclude_fixers list-comma
convert_2to3_doctests   list-comma
scripts                 list-comma
eager_resources         list-comma
dependency_links        list-comma
tests_require           list-semi
include_package_data    bool
packages                find:, find_namespace:, list-comma
package_dir             dict
package_data            section                                             (1)
exclude_package_data    section
namespace_packages      list-comma
py_modules              list-comma
data_files              dict                                40.6.0
======================= =================================== =============== =====
.. note::
**packages** - The ``find:`` and ``find_namespace:`` directives can be further configured
in a dedicated subsection ``options.packages.find``. This subsection
accepts the same keys as the ``setuptools.find_packages`` and
``setuptools.find_namespace_packages`` functions:
``where``, ``include``, and ``exclude``.
**find_namespace directive** - The ``find_namespace:`` directive is supported since Python >=3.3.
Notes:
1. In the `package_data` section, a key named with a single asterisk (`*`)
refers to all packages, in lieu of the empty string used in `setup.py`.
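To relate the ``find:`` and ``find_namespace:`` subsection back to the
programmatic API, here is a sketch of the corresponding call; the values are
illustrative only:

.. code-block:: python

    from setuptools import find_packages

    # The [options.packages.find] keys map onto these keyword arguments.
    packages = find_packages(
        where="src",
        include=["mypackage*"],
        exclude=["mypackage.tests*"],
    )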
=====================================
Dependencies Management in Setuptools
=====================================
There are three types of dependency styles offered by setuptools:
1) build system requirement, 2) required dependency and 3) optional
dependency.
.. Note::
Packages added as dependencies can optionally have their versions specified,
following `PEP 440 <https://www.python.org/dev/peps/pep-0440/>`_
.. contents::
Build system requirement
========================
Package requirement
-------------------
After organizing all the scripts and files and getting ready for packaging,
there needs to be a way to tell Python which programs it needs to actually
do the packaging (in our case, ``setuptools`` of course). Usually,
you also need the ``wheel`` package as well, since it is recommended that you
upload a ``.whl`` file to PyPI alongside your ``.tar.gz`` file. Unlike the
other two types of dependency keywords, this one is specified in your
``pyproject.toml`` file (if you have forgotten what this is, go to
:ref:`quickstart` or (WIP)):
.. code-block:: ini

    [build-system]
    requires = ["setuptools", "wheel"]
    #...
.. note::
This used to be accomplished with the ``setup_requires`` keyword but is
now considered deprecated in favor of the PEP 517 style described above.
To peek into how this legacy keyword is used, consult our :ref:`guide on
deprecated practice (WIP)`
Declaring required dependency
=============================
This is where a package declares its core dependencies, without which it won't
be able to run. ``setuptools`` supports automatically downloading and installing
these dependencies when the package is installed. Although there is more
finesse to it, let's start with a simple example.
.. code-block:: ini

    [options]
    #...
    install_requires =
        docutils
        BazSpam ==1.1
.. code-block:: python

    setup(
        #...,
        install_requires=[
            'docutils',
            'BazSpam ==1.1',
        ],
    )
When your project is installed (e.g., using pip), all of the dependencies not
already installed will be located (via PyPI), downloaded, built (if necessary),
and installed. Any scripts in your project will also be installed with wrappers
that verify the availability of the specified dependencies at runtime.
Platform specific dependencies
------------------------------
Setuptools offers the capability to evaluate certain conditions before blindly
installing everything listed in ``install_requires``. This is great for platform
specific dependencies. For example, the ``enum`` module was added in Python
3.4, therefore, a package that depends on it can elect to install the ``enum34``
backport only when the Python version is older than 3.4. To accomplish this:
.. code-block:: ini

    [options]
    #...
    install_requires =
        enum34;python_version<'3.4'
.. code-block:: python

    setup(
        #...
        install_requires=[
            "enum34;python_version<'3.4'",
        ],
    )
Similarly, if you also wish to declare ``pywin32`` with a minimal version of 1.0
and only install it if the user is using a Windows operating system:
.. code-block:: ini

    [options]
    #...
    install_requires =
        enum34;python_version<'3.4'
        pywin32 >= 1.0;platform_system=='Windows'
.. code-block:: python

    setup(
        #...
        install_requires=[
            "enum34;python_version<'3.4'",
            "pywin32 >= 1.0;platform_system=='Windows'",
        ],
    )
The environment markers that may be used for testing platform types are
detailed in `PEP 508 <https://www.python.org/dev/peps/pep-0508/>`_.
Dependencies that aren't in PyPI
--------------------------------
.. warning::
Dependency links support has been dropped by pip starting with version
19.0 (released 2019-01-22).
If your project depends on packages that don't exist on PyPI, you may still be
able to depend on them, as long as they are available for download as:
- an egg, in the standard distutils ``sdist`` format,
- a single ``.py`` file, or
- a VCS repository (Subversion, Mercurial, or Git).
You just need to add some URLs to the ``dependency_links`` argument to
``setup()``.
The URLs must be either:
1. direct download URLs,
2. the URLs of web pages that contain direct download links, or
3. the repository's URL
In general, it's better to link to web pages, because it is usually less
complex to update a web page than to release a new version of your project.
You can also use a SourceForge ``showfiles.php`` link in the case where a
package you depend on is distributed via SourceForge.
If you depend on a package that's distributed as a single ``.py`` file, you
must include an ``"#egg=project-version"`` suffix to the URL, to give a project
name and version number. (Be sure to escape any dashes in the name or version
by replacing them with underscores.) EasyInstall will recognize this suffix
and automatically create a trivial ``setup.py`` to wrap the single ``.py`` file
as an egg.
In the case of a VCS checkout, you should also append ``#egg=project-version``
in order to identify for what package that checkout should be used. You can
append ``@REV`` to the URL's path (before the fragment) to specify a revision.
Additionally, you can also force the VCS being used by prepending the URL with
a certain prefix. Currently available are:
- ``svn+URL`` for Subversion,
- ``git+URL`` for Git, and
- ``hg+URL`` for Mercurial
A more complete example would be:
``vcs+proto://host/path@revision#egg=project-version``
Be careful with the version. It should match the one inside the project files.
If you want to disregard the version, you have to omit it both in the
``requires`` and in the URL's fragment.
This will do a checkout (or a clone, in Git and Mercurial parlance) to a
temporary folder and run ``setup.py bdist_egg``.
The ``dependency_links`` option takes the form of a list of URL strings. For
example, this will cause a search of the specified page for eggs or source
distributions, if the package's dependencies aren't already installed:
.. code-block:: ini

    [options]
    #...
    dependency_links = http://peak.telecommunity.com/snapshots/
.. code-block:: python

    setup(
        #...
        dependency_links=[
            "http://peak.telecommunity.com/snapshots/"
        ],
    )
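Similarly, a hedged sketch pairing a requirement with a matching VCS link; the
repository URL and version are made up for illustration:

.. code-block:: python

    setup(
        #...
        install_requires=["my_dependency==1.2"],
        dependency_links=[
            "git+https://example.com/org/my_dependency.git@v1.2#egg=my_dependency-1.2",
        ],
    )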
Optional dependencies
=====================
Setuptools allows you to declare dependencies that only get installed under
specific circumstances. These dependencies are specified with the
``extras_require`` keyword and are only installed if another package depends on
them (either directly or indirectly). This makes it convenient to declare
dependencies for ancillary functions such as "tests" and "docs".
.. note::
``tests_require`` is now deprecated
For example, Project-A offers optional PDF support and requires two other
dependencies for it to work:
.. code-block:: ini

    [metadata]
    name = Project-A

    [options.extras_require]
    PDF = ReportLab>=1.2; RXP
.. code-block:: python

    setup(
        name="Project-A",
        #...
        extras_require={
            "PDF": ["ReportLab>=1.2", "RXP"],
        },
    )
The name ``PDF`` is an arbitrary identifier for such a list of dependencies, to
which other components can refer and have them installed. There are two common
use cases.
The first is the ``console_scripts`` entry point:
.. code-block:: ini

    [metadata]
    name = Project-A
    #...

    [options]
    #...
    entry_points =
        [console_scripts]
        rst2pdf = project_a.tools.pdfgen [PDF]
        rst2html = project_a.tools.htmlgen
.. code-block:: python

    setup(
        name="Project-A",
        #...
        entry_points={
            "console_scripts": [
                "rst2pdf = project_a.tools.pdfgen [PDF]",
                "rst2html = project_a.tools.htmlgen",
            ],
        },
    )
When the script ``rst2pdf`` is run, it will trigger the installation of
the two dependencies ``PDF`` maps to.
The second use case is that other packages can use this "extra" for their
own dependencies. For example, if "Project-B" needs "Project-A" with PDF support
installed, it might declare the dependency like this:
.. code-block:: ini

    [metadata]
    name = Project-B
    #...

    [options]
    #...
    install_requires =
        Project-A[PDF]
.. code-block:: python

    setup(
        name="Project-B",
        install_requires=["Project-A[PDF]"],
        ...
    )
This will cause ReportLab to be installed along with project A, if project B is
installed -- even if project A was already installed. In this way, a project
can encapsulate groups of optional "downstream dependencies" under a feature
name, so that packages that depend on it don't have to know what the downstream
dependencies are. If a later version of Project A builds in PDF support and
no longer needs ReportLab, or if it ends up needing other dependencies besides
ReportLab in order to provide PDF support, Project B's setup information does
not need to change, but the right packages will still be installed if needed.
.. note::
Best practice: if a project ends up not needing any other packages to
support a feature, it should keep an empty requirements list for that feature
in its ``extras_require`` argument, so that packages depending on that feature
don't break (due to an invalid feature name).
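Continuing the Project-A example, a sketch of what that looks like once the
feature no longer needs any extra packages:

.. code-block:: python

    setup(
        name="Project-A",
        #...
        extras_require={
            # PDF support is now built in; keep the (empty) extra so that
            # requirements like "Project-A[PDF]" keep working.
            "PDF": [],
        },
    )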
Python requirement
==================
In some cases, you might need to specify the minimum required python version.
This is handled with the ``python_requires`` keyword supplied to ``setup.cfg``
or ``setup.py``.
Example WIP
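As a placeholder, a minimal sketch; the version bound here is illustrative
only:

.. code-block:: python

    setup(
        #...
        python_requires=">=3.6",
    )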
"Development Mode"
==================
Under normal circumstances, the ``distutils`` assume that you are going to
build a distribution of your project, not use it in its "raw" or "unbuilt"
form. If you were to use the ``distutils`` that way, you would have to rebuild
and reinstall your project every time you made a change to it during
development.
Another problem that sometimes comes up with the ``distutils`` is that you may
need to do development on two related projects at the same time. You may need
to put both projects' packages in the same directory to run them, but need to
keep them separate for revision control purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common directory or
staging area, but without copying any files. Thus, you can edit each project's
code in its checkout directory, and only need to run build commands when you
change a project's C extensions or similarly compiled files. You can even
deploy a project into another project's checkout directory, if that's your
preferred way of working (as opposed to using a common independent staging area
or the site-packages directory).
To do this, use the ``setup.py develop`` command. It works very similarly to
``setup.py install``, except that it doesn't actually install anything.
Instead, it creates a special ``.egg-link`` file in the deployment directory,
that links to your project's source code. And, if your deployment directory is
Python's ``site-packages`` directory, it will also update the
``easy-install.pth`` file to include your project's source code, thereby making
it available on ``sys.path`` for all programs using that Python installation.
If you have enabled the ``use_2to3`` flag, then of course the ``.egg-link``
will not link directly to your source code when run under Python 3, since
that source code would be made for Python 2 and not work under Python 3.
Instead the ``setup.py develop`` will build Python 3 code under the ``build``
directory, and link there. This means that after doing code changes you will
have to run ``setup.py build`` before these changes are picked up by your
Python 3 installation.
In addition, the ``develop`` command creates wrapper scripts in the target
script directory that will run your in-development scripts after ensuring that
all your ``install_requires`` packages are available on ``sys.path``.
You can deploy the same project to multiple staging areas, e.g. if you have
multiple projects on the same machine that are sharing the same project you're
doing development work on.
When you're done with a given development task, you can remove the project
source from a staging area using ``setup.py develop --uninstall``, specifying
the desired staging area if it's not the default.
There are several options to control the precise behavior of the ``develop``
command; see the section on the `develop`_ command below for more details.
Note that you can also apply setuptools commands to non-setuptools projects,
using commands like this::
    python -c "import setuptools; exec(compile(open('setup.py').read(), 'setup.py', 'exec'))" develop
That is, you can simply list the normal setup commands and options following
the quoted part.
Tagging and "Daily Build" or "Snapshot" Releases
------------------------------------------------
When a set of related projects are under development, it may be important to
track finer-grained version increments than you would normally use for e.g.
"stable" releases. While stable releases might be measured in dotted numbers
with alpha/beta/etc. status codes, development versions of a project often
need to be tracked by revision or build number or even build date. This is
especially true when projects in development need to refer to one another, and
therefore may literally need an up-to-the-minute version of something!
To support these scenarios, ``setuptools`` allows you to "tag" your source and
egg distributions by adding one or more of the following to the project's
"official" version identifier:
* A manually-specified pre-release tag, such as "build" or "dev", or a
manually-specified post-release tag, such as a build or revision number
(``--tag-build=STRING, -bSTRING``)
* An 8-character representation of the build date (``--tag-date, -d``), as
a postrelease tag
You can add these tags by adding ``egg_info`` and the desired options to
the command line ahead of the ``sdist`` or ``bdist`` commands that you want
to generate a daily build or snapshot for. See the section below on the
`egg_info`_ command for more details.
(Also, before you release your project, be sure to see the section above on
`Specifying Your Project's Version`_ for more information about how pre- and
post-release tags affect how version numbers are interpreted. This is
important in order to make sure that dependency processing tools will know
which versions of your project are newer than others.)
Finally, if you are creating builds frequently, and either building them in a
downloadable location or are copying them to a distribution server, you should
probably also check out the `rotate`_ command, which lets you automatically
delete all but the N most-recently-modified distributions matching a glob
pattern. So, you can use a command line like::
setup.py egg_info -rbDEV bdist_egg rotate -m.egg -k3
to build an egg whose version info includes "DEV-rNNNN" (where NNNN is the
most recent Subversion revision that affected the source tree), and then
delete any egg files from the distribution directory except for the three
that were built most recently.
If you have to manage automated builds for multiple packages, each with
different tagging and rotation policies, you may also want to check out the
`alias`_ command, which would let each package define an alias like ``daily``
that would perform the necessary tag, build, and rotate commands. Then, a
simpler script or cron job could just run ``setup.py daily`` in each project
directory. (And, you could also define sitewide or per-user default versions
of the ``daily`` alias, so that projects that didn't define their own would
use the appropriate defaults.)
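For example, each project might define its ``daily`` alias roughly like this
(a sketch; pick the tag and rotation options that match your own policy)::

    setup.py alias daily egg_info --tag-build=DEV --tag-date bdist_egg rotate -m.egg -k3

After that, running ``setup.py daily`` in the project directory performs the
whole tag, build, and rotate sequence in one step.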
Generating Source Distributions
-------------------------------
``setuptools`` enhances the distutils' default algorithm for source file
selection with pluggable endpoints for looking up files to include. If you are
using a revision control system, and your source distributions only need to
include files that you're tracking in revision control, use a corresponding
plugin instead of writing a ``MANIFEST.in`` file. See the section below on
`Adding Support for Revision Control Systems`_ for information on plugins.
If you need to include automatically generated files, or files that are kept in
an unsupported revision control system, you'll need to create a ``MANIFEST.in``
file to specify any files that the default file location algorithm doesn't
catch. See the distutils documentation for more information on the format of
the ``MANIFEST.in`` file.
But, be sure to ignore any part of the distutils documentation that deals with
``MANIFEST`` or how it's generated from ``MANIFEST.in``; setuptools shields you
from these issues and doesn't work the same way in any case. Unlike the
distutils, setuptools regenerates the source distribution manifest file
every time you build a source distribution, and it builds it inside the
project's ``.egg-info`` directory, out of the way of your main project
directory. You therefore need not worry about whether it is up-to-date or not.
Indeed, because setuptools' approach to determining the contents of a source
distribution is so much simpler, its ``sdist`` command omits nearly all of
the options that the distutils' more complex ``sdist`` process requires. For
all practical purposes, you'll probably use only the ``--formats`` option, if
you use any option at all.
Making "Official" (Non-Snapshot) Releases
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When you make an official release, creating source or binary distributions,
you will need to override the tag settings from ``setup.cfg``, so that you
don't end up registering versions like ``foobar-0.7a1.dev-r34832``. This is
easy to do if you are developing on the trunk and using tags or branches for
your releases - just make the change to ``setup.cfg`` after branching or
tagging the release, so the trunk will still produce development snapshots.
Alternately, if you are not branching for releases, you can override the
default version options on the command line, using something like::
setup.py egg_info -Db "" sdist bdist_egg
The first part of this command (``egg_info -Db ""``) will override the
configured tag information, before creating source and binary eggs. Thus, these
commands will use the plain version from your ``setup.py``, without adding the
build designation string.
Of course, if you will be doing this a lot, you may wish to create a personal
alias for this operation, e.g.::
setup.py alias -u release egg_info -Db ""
You can then use it like this::
setup.py release sdist bdist_egg
Or of course you can create more elaborate aliases that do all of the above.
See the sections below on the `egg_info`_ and `alias`_ commands for more ideas.
Distributing Extensions compiled with Cython
--------------------------------------------
``setuptools`` will detect at build time whether Cython is installed or not.
If Cython is not found, ``setuptools`` will ignore ``.pyx`` files.

To ensure Cython is available, include Cython in the ``requires`` list of the
``[build-system]`` table in your ``pyproject.toml``::

    [build-system]
    requires = [..., "cython"]

When building with pip 10 or later, that declaration is sufficient to make
Cython available during the build. For broader compatibility, also declare the
dependency in the ``setup_requires`` option of your ``setup.cfg``::

    [options]
    setup_requires =
        ...
        cython
As long as Cython is present in the build environment, ``setuptools`` includes
transparent support for building Cython extensions, as
long as extensions are defined using ``setuptools.Extension``.
If you follow these rules, you can safely list ``.pyx`` files as the sources
of your ``Extension`` objects in the setup script; if Cython is available at
build time, ``setuptools`` will use it to compile them.
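For example, here is a sketch of a setup script that lists a ``.pyx`` file as
an extension source (the package and file names are hypothetical)::

    from setuptools import Extension, setup

    setup(
        # other arguments here...
        ext_modules=[
            Extension("mypackage.fast", sources=["mypackage/fast.pyx"]),
        ],
    )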
Of course, for this to work, your source distributions must include the C
code generated by Cython, as well as your original ``.pyx`` files. This means
that you will probably want to include current ``.c`` files in your revision
control system, rebuilding them whenever you check changes in for the ``.pyx``
source files. This will ensure that people tracking your project in a revision
control system will be able to build it even if they don't have Cython
installed, and that your source releases will be similarly usable with or
without Cython.
Specifying Your Project's Version
---------------------------------
Setuptools can work well with most versioning schemes; there are, however, a
few special things to watch out for, in order to ensure that setuptools and
other tools can always tell what version of your package is newer than another
version. Knowing these things will also help you correctly specify what
versions of other projects your project depends on.
A version consists of an alternating series of release numbers and pre-release
or post-release tags. A release number is a series of digits punctuated by
dots, such as ``2.4`` or ``0.5``. Each series of digits is treated
numerically, so releases ``2.1`` and ``2.1.0`` are different ways to spell the
same release number, denoting the first subrelease of release 2. But ``2.10``
is the *tenth* subrelease of release 2, and so is a different and newer release
from ``2.1`` or ``2.1.0``. Leading zeros within a series of digits are also
ignored, so ``2.01`` is the same as ``2.1``, and different from ``2.0.1``.
Following a release number, you can have either a pre-release or post-release
tag. Pre-release tags make a version be considered *older* than the version
they are appended to. So, revision ``2.4`` is *newer* than revision ``2.4c1``,
which in turn is newer than ``2.4b1`` or ``2.4a1``. Postrelease tags make
a version be considered *newer* than the version they are appended to. So,
revisions like ``2.4-1`` and ``2.4pl3`` are newer than ``2.4``, but are *older*
than ``2.4.1`` (which has a higher release number).
A pre-release tag is a series of letters that are alphabetically before
"final". Some examples of prerelease tags would include ``alpha``, ``beta``,
``a``, ``c``, ``dev``, and so on. You do not have to place a dot or dash
before the prerelease tag if it's immediately after a number, but it's okay to
do so if you prefer. Thus, ``2.4c1`` and ``2.4.c1`` and ``2.4-c1`` all
represent release candidate 1 of version ``2.4``, and are treated as identical
by setuptools.
In addition, there are three special prerelease tags that are treated as if
they were the letter ``c``: ``pre``, ``preview``, and ``rc``. So, version
``2.4rc1``, ``2.4pre1`` and ``2.4preview1`` are all the exact same version as
``2.4c1``, and are treated as identical by setuptools.
A post-release tag is either a series of letters that are alphabetically
greater than or equal to "final", or a dash (``-``). Post-release tags are
generally used to separate patch numbers, port numbers, build numbers, revision
numbers, or date stamps from the release number. For example, the version
``2.4-r1263`` might denote Subversion revision 1263 of a post-release patch of
version ``2.4``. Or you might use ``2.4-20051127`` to denote a date-stamped
post-release.
Notice that after each pre or post-release tag, you are free to place another
release number, followed again by more pre- or post-release tags. For example,
``0.6a9.dev-r41475`` could denote Subversion revision 41475 of the in-
development version of the ninth alpha of release 0.6. Notice that ``dev`` is
a pre-release tag, so this version is a *lower* version number than ``0.6a9``,
which would be the actual ninth alpha of release 0.6. But the ``-r41475`` is
a post-release tag, so this version is *newer* than ``0.6a9.dev``.
For the most part, setuptools' interpretation of version numbers is intuitive,
but here are a few tips that will keep you out of trouble in the corner cases:
* Don't stick adjoining pre-release tags together without a dot or number
between them. Version ``1.9adev`` is the ``adev`` prerelease of ``1.9``,
*not* a development pre-release of ``1.9a``. Use ``.dev`` instead, as in
``1.9a.dev``, or separate the prerelease tags with a number, as in
``1.9a0dev``. ``1.9a.dev``, ``1.9a0dev``, and even ``1.9.a.dev`` are
identical versions from setuptools' point of view, so you can use whatever
scheme you prefer.
* If you want to be certain that your chosen numbering scheme works the way
you think it will, you can use the ``pkg_resources.parse_version()`` function
to compare different version numbers::
>>> from pkg_resources import parse_version
>>> parse_version("1.9.a.dev") == parse_version("1.9a0dev")
True
>>> parse_version("2.1-rc2") < parse_version("2.1")
True
>>> parse_version("0.6a9dev-r41475") < parse_version("0.6a9")
True
Once you've decided on a version numbering scheme for your project, you can
have setuptools automatically tag your in-development releases with various
pre- or post-release tags. See the following sections for more details:
* `Tagging and "Daily Build" or "Snapshot" Releases`_
* The `egg_info`_ command
.. _`entry_points`:
============
Entry Points
============
Packages may provide commands to be run at the console (console scripts),
such as the ``pip`` command. These commands are defined for a package
as a specific kind of entry point in the ``setup.cfg`` or
``setup.py``.
Console Scripts
===============
First consider an example without entry points. Imagine a package
defined thus:

.. code-block:: bash

    timmins/
    timmins/__init__.py
    timmins/__main__.py
    setup.cfg # or setup.py
    #other necessary files
with ``__init__.py`` as:

.. code-block:: python

    def hello_world():
        print("Hello world")

and ``__main__.py`` providing a hook:

.. code-block:: python

    from . import hello_world

    if __name__ == '__main__':
        hello_world()
After installing the package, the function may be invoked through the
`runpy <https://docs.python.org/3/library/runpy.html>`_ module:

.. code-block:: bash

    python -m timmins
Adding a console script entry point allows the package to define a
user-friendly name for installers of the package to execute. Installers
like pip will create wrapper scripts to execute a function. In the
above example, to create a command ``hello-world`` that invokes
``timmins.hello_world``, add a console script entry point to
``setup.cfg``:

.. code-block:: ini

    [options.entry_points]
    console_scripts =
        hello-world = timmins:hello_world
After installing the package, a user may invoke that function by simply calling
``hello-world`` on the command line.
The syntax for entry points is specified as follows:
.. code-block::

    <name> = [<package>.[<subpackage>.]]<module>[:<object>.<object>]
where ``name`` is the name for the script you want to create, the left hand
side of ``:`` is the module that contains your function and the right hand
side is the object you want to invoke (e.g. a function).
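If you are using a ``setup.py`` script rather than declarative config, the
same entry point from the example above can be declared with the
``entry_points`` keyword:

.. code-block:: python

    from setuptools import setup

    setup(
        # other arguments here...
        entry_points={
            "console_scripts": [
                "hello-world = timmins:hello_world",
            ]
        },
    )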
In addition to ``console_scripts``, Setuptools supports ``gui_scripts``, which
will launch a GUI application without running in a terminal window.
Advertising Behavior
====================
Console scripts are one use of the more general concept of entry points. Entry
points more generally allow a packager to advertise behavior for discovery by
other libraries and applications. This feature enables "plug-in"-like
functionality, where one library solicits entry points and any number of other
libraries provide those entry points.
A good example of this plug-in behavior can be seen in
`pytest plugins <https://docs.pytest.org/en/latest/writing_plugins.html>`_,
where pytest is a test framework that allows other libraries to extend
or modify its functionality through the ``pytest11`` entry point.
The console scripts work similarly, where libraries advertise their commands
and tools like ``pip`` create wrapper scripts that invoke those commands.
For a project wishing to solicit entry points, Setuptools recommends the
`importlib.metadata <https://docs.python.org/3/library/importlib.metadata.html>`_
module (part of stdlib since Python 3.8) or its backport,
`importlib_metadata <https://pypi.org/project/importlib_metadata>`_.
For example, to find the console script entry points from the example above:

.. code-block:: python

    >>> from importlib import metadata
    >>> eps = metadata.entry_points()['console_scripts']
``eps`` is now a list of ``EntryPoint`` objects, one of which corresponds
to the ``hello-world = timmins:hello_world`` defined above. Each ``EntryPoint``
contains the ``name``, ``group``, and ``value``. It also supplies a ``.load()``
method to import and load that entry point (module or object).
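For instance, continuing the ``hello-world`` example (and assuming the
``timmins`` package above is installed), the entry point can be resolved and
invoked directly:

.. code-block:: python

    >>> from importlib import metadata
    >>> eps = metadata.entry_points()['console_scripts']
    >>> (hello_ep,) = [ep for ep in eps if ep.name == 'hello-world']
    >>> hello = hello_ep.load()
    >>> hello()
    Hello world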
Entry points are not limited to console scripts. A package may advertise
behavior in any group it chooses, for example:

.. code-block:: ini

    [options.entry_points]
    my.plugins =
        hello-world = timmins:hello_world
Then, a different project wishing to load 'my.plugins' plugins could run
the following routine to load (and invoke) such plugins:

.. code-block:: python

    >>> from importlib import metadata
    >>> eps = metadata.entry_points()['my.plugins']
    >>> for ep in eps:
    ...     plugin = ep.load()
    ...     plugin()
The project soliciting the entry points need not have any dependency on, or
prior knowledge of, the libraries implementing the entry points, and
downstream users are able to compose functionality by pulling together
libraries implementing the entry points.
Dependency Management
=====================
Some entry points may require additional dependencies to properly function.
For such an entry point, declare in square brackets any number of dependency
``extras`` following the entry point definition. Such entry points will only
be viable if their extras were declared and installed. See the
:ref:`guide on dependencies management <dependency_management>` for
more information on defining extra requirements. Consider from the
above example:

.. code-block:: ini

    [options.entry_points]
    console_scripts =
        hello-world = timmins:hello_world [pretty-printer]
In this case, the ``hello-world`` script is only viable if the ``pretty-printer``
extra is indicated, and so a plugin host might exclude that entry point
(i.e. not install a console script) if the relevant extra dependencies are not
installed.
Creating ``distutils`` Extensions
=================================
It can be hard to add new commands or setup arguments to the distutils. But
the ``setuptools`` package makes it a bit easier, by allowing you to distribute
a distutils extension as a separate project, and then have projects that need
the extension just refer to it in their ``setup_requires`` argument.
With ``setuptools``, your distutils extension projects can hook in new
commands and ``setup()`` arguments just by defining "entry points". These
are mappings from command or argument names to a specification of where to
import a handler from. (See the section on `Dynamic Discovery of Services and
Plugins`_ above for some more background on entry points.)
Adding Commands
---------------
You can add new ``setup`` commands by defining entry points in the
``distutils.commands`` group. For example, if you wanted to add a ``foo``
command, you might add something like this to your distutils extension
project's setup script::
    setup(
        # ...
        entry_points={
            "distutils.commands": [
                "foo = mypackage.some_module:foo",
            ],
        },
    )
(Assuming, of course, that the ``foo`` class in ``mypackage.some_module`` is
a ``setuptools.Command`` subclass.)
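For reference, a minimal sketch of what such a ``foo`` command class might
look like (its behavior here is purely illustrative)::

    from setuptools import Command

    class foo(Command):
        """Example command that just prints a message."""

        description = "run the foo command"
        user_options = []  # no custom options for this example

        def initialize_options(self):
            pass

        def finalize_options(self):
            pass

        def run(self):
            print("Hello from the foo command!")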
Once a project containing such entry points has been activated on ``sys.path``,
(e.g. by running "install" or "develop" with a site-packages installation
directory) the command(s) will be available to any ``setuptools``-based setup
scripts. It is not necessary to use the ``--command-packages`` option or
to monkeypatch the ``distutils.command`` package to install your commands;
``setuptools`` automatically adds a wrapper to the distutils to search for
entry points in the active distributions on ``sys.path``. In fact, this is
how setuptools' own commands are installed: the setuptools project's setup
script defines entry points for them!
Adding ``setup()`` Arguments
----------------------------
.. warning:: Adding arguments to setup is discouraged as such arguments
are only supported through imperative execution and not supported through
declarative config.
Sometimes, your commands may need additional arguments to the ``setup()``
call. You can enable this by defining entry points in the
``distutils.setup_keywords`` group. For example, if you wanted a ``setup()``
argument called ``bar_baz``, you might add something like this to your
distutils extension project's setup script::
    setup(
        # ...
        entry_points={
            "distutils.commands": [
                "foo = mypackage.some_module:foo",
            ],
            "distutils.setup_keywords": [
                "bar_baz = mypackage.some_module:validate_bar_baz",
            ],
        },
    )
The idea here is that the entry point defines a function that will be called
to validate the ``setup()`` argument, if it's supplied. The ``Distribution``
object will have the initial value of the attribute set to ``None``, and the
validation function will only be called if the ``setup()`` call sets it to
a non-None value. Here's an example validation function::
    from distutils.errors import DistutilsSetupError

    def assert_bool(dist, attr, value):
        """Verify that value is True, False, 0, or 1"""
        if bool(value) != value:
            raise DistutilsSetupError(
                "%r must be a boolean value (got %r)" % (attr, value)
            )
Your function should accept three arguments: the ``Distribution`` object,
the attribute name, and the attribute value. It should raise a
``DistutilsSetupError`` (from the ``distutils.errors`` module) if the argument
is invalid. Remember, your function will only be called with non-None values,
and the default value of arguments defined this way is always None. So, your
commands should always be prepared for the possibility that the attribute will
be ``None`` when they access it later.
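For example, a command that consumes the ``bar_baz`` argument might read it
defensively like this (a sketch)::

    def run(self):
        # bar_baz defaults to None when the setup script doesn't supply it
        bar_baz = getattr(self.distribution, "bar_baz", None)
        if bar_baz is None:
            bar_baz = False  # fall back to a sensible default
        if bar_baz:
            print("bar_baz is enabled")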
If more than one active distribution defines an entry point for the same
``setup()`` argument, *all* of them will be called. This allows multiple
distutils extensions to define a common argument, as long as they agree on
what values of that argument are valid.
Also note that as with commands, it is not necessary to subclass or monkeypatch
the distutils ``Distribution`` class in order to add your arguments; it is
sufficient to define the entry points in your extension, as long as any setup
script using your extension lists your project in its ``setup_requires``
argument.
Customizing Distribution Options
--------------------------------
Plugins may wish to extend or alter the options on a Distribution object to
suit the purposes of that project. For example, a tool that infers the
``Distribution.version`` from SCM-metadata may need to hook into the
option finalization. To enable this feature, Setuptools offers an entry
point "setuptools.finalize_distribution_options". That entry point must
be a callable taking one argument (the Distribution instance).
If the callable has an ``.order`` property, that value will be used to
determine the order in which the hook is called. Lower numbers are called
first and the default is zero (0).
Plugins may read, alter, and set properties on the distribution, but each
plugin is encouraged to load the configuration/settings for their behavior
independently.
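For illustration, here is a sketch of such a plugin (the project and module
names are hypothetical)::

    def infer_version(dist):
        # e.g. derive the version from SCM metadata when none was declared
        if dist.metadata.version is None:
            dist.metadata.version = "0.0.1.dev0"

    # optional: hooks with lower ``order`` values run earlier
    infer_version.order = 10

and the corresponding registration in the plugin's own setup script::

    setup(
        # ...
        entry_points={
            "setuptools.finalize_distribution_options": [
                "infer_version = my_scm_plugin:infer_version",
            ],
        },
    )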
Adding new EGG-INFO Files
-------------------------
Some extensible applications or frameworks may want to allow third parties to
develop plugins with application or framework-specific metadata included in
the plugins' EGG-INFO directory, for easy access via the ``pkg_resources``
metadata API. The easiest way to allow this is to create a distutils extension
to be used from the plugin projects' setup scripts (via ``setup_requires``)
that defines a new setup keyword, and then uses that data to write an EGG-INFO
file when the ``egg_info`` command is run.
The ``egg_info`` command looks for extension points in an ``egg_info.writers``
group, and calls them to write the files. Here's a simple example of a
distutils extension defining a setup argument ``foo_bar``, which is a list of
lines that will be written to ``foo_bar.txt`` in the EGG-INFO directory of any
project that uses the argument::
    setup(
        # ...
        entry_points={
            "distutils.setup_keywords": [
                "foo_bar = setuptools.dist:assert_string_list",
            ],
            "egg_info.writers": [
                "foo_bar.txt = setuptools.command.egg_info:write_arg",
            ],
        },
    )
This simple example makes use of two utility functions defined by setuptools
for its own use: a routine to validate that a setup keyword is a sequence of
strings, and another one that looks up a setup argument and writes it to
a file. Here's what the writer utility looks like::
    def write_arg(cmd, basename, filename):
        argname = os.path.splitext(basename)[0]
        value = getattr(cmd.distribution, argname, None)
        if value is not None:
            value = "\n".join(value) + "\n"
        cmd.write_or_delete_file(argname, filename, value)
As you can see, ``egg_info.writers`` entry points must be functions taking
three arguments: an ``egg_info`` command instance, the basename of the file to
write (e.g. ``foo_bar.txt``), and the actual full filename that should be
written to.
In general, writer functions should honor the command object's ``dry_run``
setting when writing files, and use the ``distutils.log`` object to do any
console output. The easiest way to conform to this requirement is to use
the ``cmd`` object's ``write_file()``, ``delete_file()``, and
``write_or_delete_file()`` methods exclusively for your file operations. See
those methods' docstrings for more details.
Adding Support for Revision Control Systems
-------------------------------------------------
If the files you want to include in the source distribution are tracked using
Git, Mercurial or SVN, you can use the following packages to achieve that:
- Git and Mercurial: `setuptools_scm <https://pypi.org/project/setuptools_scm/>`_
- SVN: `setuptools_svn <https://pypi.org/project/setuptools_svn/>`_
If you would like to create a plugin for ``setuptools`` to find files tracked
by another revision control system, you can do so by adding an entry point to
the ``setuptools.file_finders`` group. The entry point should be a function
accepting a single directory name, and should yield all the filenames within
that directory (and any subdirectories thereof) that are under revision
control.
For example, if you were going to create a plugin for a revision control system
called "foobar", you would write a function something like this:
.. code-block:: python

    def find_files_for_foobar(dirname):
        # loop to yield paths that start with `dirname`
        ...
And you would register it in a setup script using something like this::
    entry_points={
        "setuptools.file_finders": [
            "foobar = my_foobar_module:find_files_for_foobar",
        ]
    }
Then, anyone who wants to use your plugin can simply install it, and their
local setuptools installation will be able to find the necessary files.
It is not necessary to distribute source control plugins with projects that
simply use the other source control system, or to specify the plugins in
``setup_requires``. When you create a source distribution with the ``sdist``
command, setuptools automatically records what files were found in the
``SOURCES.txt`` file. That way, recipients of source distributions don't need
to have revision control at all. However, if someone is working on a package
by checking out with that system, they will need the same plugin(s) that the
original author is using.
A few important points for writing revision control file finders (a sketch
illustrating them follows this list):
* Your finder function MUST return relative paths, created by appending to the
passed-in directory name. Absolute paths are NOT allowed, nor are relative
paths that reference a parent directory of the passed-in directory.
* Your finder function MUST accept an empty string as the directory name,
meaning the current directory. You MUST NOT convert this to a dot; just
yield relative paths. So, yielding a subdirectory named ``some/dir`` under
the current directory should NOT be rendered as ``./some/dir`` or
``/somewhere/some/dir``, but *always* as simply ``some/dir``
* Your finder function SHOULD NOT raise any errors, and SHOULD deal gracefully
with the absence of needed programs (i.e., ones belonging to the revision
control system itself). It *may*, however, use ``distutils.log.warn()`` to
inform the user of the missing program(s).
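For illustration, here is a rough sketch of a finder for the hypothetical
"foobar" system following the rules above (the ``foobar ls-files`` command is
made up)::

    import os
    import subprocess
    from distutils import log

    def find_files_for_foobar(dirname):
        # an empty dirname means the current directory; never convert it to "."
        try:
            output = subprocess.check_output(
                ["foobar", "ls-files"], cwd=dirname or None, universal_newlines=True
            )
        except (OSError, subprocess.CalledProcessError):
            log.warn("foobar is not available; contributing no files")
            return
        for name in output.splitlines():
            # build relative paths by appending to the passed-in directory name
            yield os.path.join(dirname, name)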
========================================================
Using setuptools to package and distribute your project
========================================================
``setuptools`` offers a variety of functionalities that make it easy to
build and distribute your Python package. Here we provide an overview of
the commonly used ones.
==================================================
Building and Distributing Packages with Setuptools
==================================================
``Setuptools`` is a collection of enhancements to the Python ``distutils``
that allow developers to more easily build and
distribute Python packages, especially ones that have dependencies on other
packages.
Packages built and distributed using ``setuptools`` look to the user like
ordinary Python packages based on the ``distutils``.
.. toctree::
:maxdepth: 1
quickstart
package_discovery
entry_point
dependency_management
datafiles
development_mode
distribution
extension
declarative_config
keywords
commands
New and Changed ``setup()`` Keywords
====================================
The following keyword arguments to ``setup()`` are added or changed by
``setuptools``. All of them are optional; you do not have to supply them
unless you need the associated ``setuptools`` feature.
``include_package_data``
If set to ``True``, this tells ``setuptools`` to automatically include any
data files it finds inside your package directories that are specified by
your ``MANIFEST.in`` file. For more information, see the section below on
`Including Data Files`_.
``exclude_package_data``
A dictionary mapping package names to lists of glob patterns that should
be *excluded* from your package directories. You can use this to trim back
any excess files included by ``include_package_data``. For a complete
description and examples, see the section below on `Including Data Files`_.
``package_data``
A dictionary mapping package names to lists of glob patterns. For a
complete description and examples, see the section below on `Including
Data Files`_. You do not need to use this option if you are using
``include_package_data``, unless you need to add e.g. files that are
generated by your setup script and build process. (And are therefore not
in source control or are files that you don't want to include in your
source distribution.)
``zip_safe``
A boolean (True or False) flag specifying whether the project can be
safely installed and run from a zip file. If this argument is not
supplied, the ``bdist_egg`` command will have to analyze all of your
project's contents for possible problems each time it builds an egg.
``install_requires``
A string or list of strings specifying what other distributions need to
be installed when this one is. See the section below on `Declaring
Dependencies`_ for details and examples of the format of this argument.
``entry_points``
A dictionary mapping entry point group names to strings or lists of strings
defining the entry points. Entry points are used to support dynamic
discovery of services or plugins provided by a project. See `Dynamic
Discovery of Services and Plugins`_ for details and examples of the format
of this argument. In addition, this keyword is used to support `Automatic
Script Creation`_.
``extras_require``
A dictionary mapping names of "extras" (optional features of your project)
to strings or lists of strings specifying what other distributions must be
installed to support those features. See the section below on `Declaring
Dependencies`_ for details and examples of the format of this argument.
``python_requires``
A string corresponding to a version specifier (as defined in PEP 440) for
the Python version, used to specify the Requires-Python defined in PEP 345.
``setup_requires``
A string or list of strings specifying what other distributions need to
be present in order for the *setup script* to run. ``setuptools`` will
attempt to obtain these (using pip if available) before processing the
rest of the setup script or commands. This argument is needed if you
are using distutils extensions as part of your build process; for
example, extensions that process setup() arguments and turn them into
EGG-INFO metadata files.
(Note: projects listed in ``setup_requires`` will NOT be automatically
installed on the system where the setup script is being run. They are
simply downloaded to the ./.eggs directory if they're not locally available
already. If you want them to be installed, as well as being available
when the setup script is run, you should add them to ``install_requires``
**and** ``setup_requires``.)
``dependency_links``
A list of strings naming URLs to be searched when satisfying dependencies.
These links will be used if needed to install packages specified by
``setup_requires`` or ``tests_require``. They will also be written into
the egg's metadata for use during install by tools that support them.
``namespace_packages``
A list of strings naming the project's "namespace packages". A namespace
package is a package that may be split across multiple project
distributions. For example, Zope 3's ``zope`` package is a namespace
package, because subpackages like ``zope.interface`` and ``zope.publisher``
may be distributed separately. The egg runtime system can automatically
merge such subpackages into a single parent package at runtime, as long
as you declare them in each project that contains any subpackages of the
namespace package, and as long as the namespace package's ``__init__.py``
does not contain any code other than a namespace declaration. See the
section below on `Namespace Packages`_ for more information.
``test_suite``
A string naming a ``unittest.TestCase`` subclass (or a package or module
containing one or more of them, or a method of such a subclass), or naming
a function that can be called with no arguments and returns a
``unittest.TestSuite``. If the named suite is a module, and the module
has an ``additional_tests()`` function, it is called and the results are
added to the tests to be run. If the named suite is a package, any
submodules and subpackages are recursively added to the overall test suite.
Specifying this argument enables use of the `test`_ command to run the
specified test suite, e.g. via ``setup.py test``. See the section on the
`test`_ command below for more details.
New in 41.5.0: Deprecated the test command.
``tests_require``
If your project's tests need one or more additional packages besides those
needed to install it, you can use this option to specify them. It should
be a string or list of strings specifying what other distributions need to
be present for the package's tests to run. When you run the ``test``
command, ``setuptools`` will attempt to obtain these (using pip if
available). Note that these required projects will *not* be installed on
the system where the tests are run, but only downloaded to the project's setup
directory if they're not already installed locally.
New in 41.5.0: Deprecated the test command.
.. _test_loader:
``test_loader``
If you would like to use a different way of finding tests to run than what
setuptools normally uses, you can specify a module name and class name in
this argument. The named class must be instantiable with no arguments, and
its instances must support the ``loadTestsFromNames()`` method as defined
in the Python ``unittest`` module's ``TestLoader`` class. Setuptools will
pass only one test "name" in the `names` argument: the value supplied for
the ``test_suite`` argument. The loader you specify may interpret this
string in any way it likes, as there are no restrictions on what may be
contained in a ``test_suite`` string.
The module name and class name must be separated by a ``:``. The default
value of this argument is ``"setuptools.command.test:ScanningLoader"``. If
you want to use the default ``unittest`` behavior, you can specify
``"unittest:TestLoader"`` as your ``test_loader`` argument instead. This
will prevent automatic scanning of submodules and subpackages.
The module and class you specify here may be contained in another package,
as long as you use the ``tests_require`` option to ensure that the package
containing the loader class is available when the ``test`` command is run.
New in 41.5.0: Deprecated the test command.
``eager_resources``
A list of strings naming resources that should be extracted together, if
any of them is needed, or if any C extensions included in the project are
imported. This argument is only useful if the project will be installed as
a zipfile, and there is a need to have all of the listed resources be
extracted to the filesystem *as a unit*. Resources listed here
should be "/"-separated paths, relative to the source root, so to list a
resource ``foo.png`` in package ``bar.baz``, you would include the string
``bar/baz/foo.png`` in this argument.
If you only need to obtain resources one at a time, or you don't have any C
extensions that access other files in the project (such as data files or
shared libraries), you probably do NOT need this argument and shouldn't
mess with it. For more details on how this argument works, see the section
below on `Automatic Resource Extraction`_.
``use_2to3``
Convert the source code from Python 2 to Python 3 with 2to3 during the
build process. See :doc:`python3` for more details.
``convert_2to3_doctests``
List of doctest source files that need to be converted with 2to3.
See :doc:`python3` for more details.
``use_2to3_fixers``
A list of modules to search for additional fixers to be used during
the 2to3 conversion. See :doc:`python3` for more details.
``project_urls``
An arbitrary map of URL names to hyperlinks, allowing more extensible
documentation of where various resources can be found than the simple
``url`` and ``download_url`` options provide.
Automatic Resource Extraction
-----------------------------
If you are using tools that expect your resources to be "real" files, or your
project includes non-extension native libraries or other files that your C
extensions expect to be able to access, you may need to list those files in
the ``eager_resources`` argument to ``setup()``, so that the files will be
extracted together, whenever a C extension in the project is imported.
This is especially important if your project includes shared libraries *other*
than distutils-built C extensions, and those shared libraries use file
extensions other than ``.dll``, ``.so``, or ``.dylib``, which are the
extensions that setuptools 0.6a8 and higher automatically detects as shared
libraries and adds to the ``native_libs.txt`` file for you. Any shared
libraries whose names do not end with one of those extensions should be listed
as ``eager_resources``, because they need to be present in the filesystem when
the C extensions that link to them are used.
The ``pkg_resources`` runtime for compressed packages will automatically
extract *all* C extensions and ``eager_resources`` at the same time, whenever
*any* C extension or eager resource is requested via the ``resource_filename()``
API. (C extensions are imported using ``resource_filename()`` internally.)
This ensures that C extensions will see all of the "real" files that they
expect to see.
Note also that you can list directory resource names in ``eager_resources`` as
well, in which case the directory's contents (including subdirectories) will be
extracted whenever any C extension or eager resource is requested.
Please note that if you're not sure whether you need to use this argument, you
don't! It's really intended to support projects with lots of non-Python
dependencies and as a last resort for crufty projects that can't otherwise
handle being compressed. If your package is pure Python, Python plus data
files, or Python plus C, you really don't need this. You've got to be using
either C or an external program that needs "real" files in your project before
there's any possibility of ``eager_resources`` being relevant to your project.
Defining Additional Metadata
----------------------------
Some extensible applications and frameworks may need to define their own kinds
of metadata to include in eggs, which they can then access using the
``pkg_resources`` metadata APIs. Ordinarily, this is done by having plugin
developers include additional files in their ``ProjectName.egg-info``
directory. However, since it can be tedious to create such files by hand, you
may want to create a distutils extension that will create the necessary files
from arguments to ``setup()``, in much the same way that ``setuptools`` does
for many of the ``setup()`` arguments it adds. See the section below on
`Creating distutils Extensions`_ for more details, especially the subsection on
`Adding new EGG-INFO Files`_.
Setting the ``zip_safe`` flag
-----------------------------
For some use cases (such as bundling as part of a larger application), Python
packages may be run directly from a zip file.
Not all packages, however, are capable of running in compressed form, because
they may expect to be able to access either source code or data files as
normal operating system files. So, ``setuptools`` can install your project
as a zipfile or a directory, and its default choice is determined by the
project's ``zip_safe`` flag.
You can pass a True or False value for the ``zip_safe`` argument to the
``setup()`` function, or you can omit it. If you omit it, the ``bdist_egg``
command will analyze your project's contents to see if it can detect any
conditions that would prevent it from working in a zipfile. It will output
notices to the console about any such conditions that it finds.
Currently, this analysis is extremely conservative: it will consider the
project unsafe if it contains any C extensions or datafiles whatsoever. This
does *not* mean that the project can't or won't work as a zipfile! It just
means that the ``bdist_egg`` authors aren't yet comfortable asserting that
the project *will* work. If the project contains no C or data files, and does
no ``__file__`` or ``__path__`` introspection or source code manipulation, then
there is an extremely solid chance the project will work when installed as a
zipfile. (And if the project uses ``pkg_resources`` for all its data file
access, then C extensions and other data files shouldn't be a problem at all.
See the `Accessing Data Files at Runtime`_ section above for more information.)
However, if ``bdist_egg`` can't be *sure* that your package will work, but
you've checked over all the warnings it issued, and you are either satisfied it
*will* work (or if you want to try it for yourself), then you should set
``zip_safe`` to ``True`` in your ``setup()`` call. If it turns out that it
doesn't work, you can always change it to ``False``, which will force
``setuptools`` to install your project as a directory rather than as a zipfile.
In the future, as we gain more experience with different packages and become
more satisfied with the robustness of the ``pkg_resources`` runtime, the
"zip safety" analysis may become less conservative. However, we strongly
recommend that you determine for yourself whether your project functions
correctly when installed as a zipfile, correct any problems if you can, and
then make an explicit declaration of ``True`` or ``False`` for the ``zip_safe``
flag, so that it will not be necessary for ``bdist_egg`` to try to guess
whether your project can work as a zipfile.
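For example, once you have verified the behavior yourself, a sketch of an
explicit declaration looks like this::

    setup(
        # other arguments here...
        zip_safe=False,  # this project reads data files via __file__, keep it unzipped
    )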
.. _`package_discovery`:
========================================
Package Discovery and Namespace Package
========================================
.. note::
    A full specification of the keywords supplied to ``setup.cfg`` or
    ``setup.py`` can be found at the :ref:`keywords reference <keywords_ref>`.
.. note::
    The examples provided here only demonstrate the functionality introduced.
    More metadata and options need to be supplied if you want to replicate
    them on your system. If you are completely new to setuptools, the
    :ref:`quickstart section <quickstart>` is a good place to start.
``Setuptools`` provides powerful tools to handle package discovery, including
support for namespace packages. Normally, you would specify the packages to be
included manually in the following manner:
.. code-block:: ini

    [options]
    #...
    packages =
        mypkg1
        mypkg2

.. code-block:: python

    setup(
        #...
        packages = ['mypkg1', 'mypkg2']
    )
This can get tiresome really quickly. To speed things up, we introduce two
functions provided by setuptools:
.. code-block:: ini
[options]
packages = find:
#or
packages = find_namespace:
.. code-block:: python
from setuptools import find_packages
#or
from setuptools import find_namespace_packages
Using ``find:`` or ``find_packages``
====================================
Let's start with the first tool. ``find:`` (``find_packages``) takes a source
directory and two lists of package name patterns to exclude and include, and
then returns a list of ``str`` representing the packages it could find. To use
it, consider the following directory:
.. code-block:: bash
mypkg/
src/
pkg1/__init__.py
pkg2/__init__.py
additional/__init__.py
setup.cfg #or setup.py
To have your ``setup.cfg`` or ``setup.py`` automatically include packages found
in ``src`` that start with the name ``pkg`` and exclude ``additional``:
.. code-block:: ini

    [options]
    packages = find:
    package_dir =
        =src

    [options.packages.find]
    where = src
    include = pkg*
    exclude = additional
.. code-block:: python

    setup(
        #...
        packages = find_packages(
            where = 'src',
            include = ['pkg*',],
            exclude = ['additional',],
        ),
        package_dir = {"": "src"},
        #...
    )
Using ``find_namespace:`` or ``find_namespace_packages``
========================================================
``setuptools`` provides ``find_namespace:`` (``find_namespace_packages``),
which behaves similarly to ``find:`` but works with namespace packages. Before
diving in, it is important to have a good understanding of what namespace
packages are. Here is a quick recap:
Suppose you have two packages named as follows:
.. code-block:: bash
/Users/Desktop/timmins/foo/__init__.py
/Library/timmins/bar/__init__.py
If both ``Desktop`` and ``Library`` are on your ``PYTHONPATH``, then a
namespace package called ``timmins`` will be created automatically for you when
you invoke the import mechanism, allowing you to accomplish the following
.. code-block:: python
>>> import timmins.foo
>>> import timmins.bar
as if there is only one ``timmins`` on your system. The two packages can then
be distributed separately and installed individually without affecting the
other one. Suppose you are packaging the ``foo`` part:
.. code-block:: bash
foo/
src/
timmins/foo/__init__.py
setup.cfg # or setup.py
and you want ``foo`` to be automatically included. ``find:`` won't work
because ``timmins`` doesn't contain an ``__init__.py`` directly; instead, you
have to use ``find_namespace:``:
.. code-block:: ini

    [options]
    package_dir =
        =src
    packages = find_namespace:

    [options.packages.find]  # always "find", even when using find_namespace:
    where = src
When you install the zipped distribution, ``timmins.foo`` would become
available to your interpreter.
You can think of ``find_namespace:`` as identical to ``find:`` except it
would count a directory as a package even if it doesn't contain an
``__init__.py`` file directly. As a result, this creates an interesting side
effect. If you organize your package like this:
.. code-block:: bash
foo/
timmins/
foo/__init__.py
setup.cfg # or setup.py
tests/
test_foo/__init__.py
a naive ``find_namespace:`` would include ``tests`` as part of your package to
be installed. A simple way to fix it is to adopt the aforementioned
``src`` layout.
Legacy Namespace Packages
=========================
The fact that you can create namespace packages so effortlessly above is
credited to `PEP 420 <https://www.python.org/dev/peps/pep-0420/>`_. It used to
be more cumbersome to accomplish the same result. Historically, there were two
methods to create namespace packages. One is the ``pkg_resources`` style
supported by ``setuptools`` and the other is the ``pkgutil`` style offered by
the ``pkgutil`` module in Python. Both are now considered deprecated despite
the fact that they still linger in many existing packages. These two differ in
many subtle yet significant aspects and you can find out more in the `Python
packaging user guide
<https://packaging.python.org/guides/packaging-namespace-packages/>`_.
``pkg_resources`` style namespace package
------------------------------------------
This is the method ``setuptools`` directly supports. Starting with the same
layout, there are two pieces you need to add to it. First, an ``__init__.py``
file directly under your namespace package directory that contains the
following:
.. code-block:: python
__import__("pkg_resources").declare_namespace(__name__)
And the ``namespace_packages`` keyword in your ``setup.cfg`` or ``setup.py``:
.. code-block:: ini
[options]
namespace_packages = timmins
.. code-block:: python

    setup(
        # ...
        namespace_packages = ['timmins']
    )
And your directory should look like this
.. code-block:: bash
/foo/
src/
timmins/
__init__.py
foo/__init__.py
setup.cfg #or setup.py
Repeat the same for other packages and you can achieve the same result as
the previous section.
``pkgutil`` style namespace package
-----------------------------------
This method is almost identical to the ``pkg_resources`` style except that the
``namespace_packages`` declaration is omitted and the ``__init__.py``
file contains the following:
.. code-block:: python
__path__ = __import__('pkgutil').extend_path(__path__, __name__)
The project layout remains the same and ``setup.cfg`` remains the same.
==========================
``setuptools`` Quickstart
==========================
.. contents::
Installation
============
To install the latest version of setuptools, use::
pip install --upgrade setuptools
Python packaging at a glance
============================
The landscape of Python packaging is shifting and ``Setuptools`` has evolved to
only provide backend support, no longer being the de-facto packaging tool in
the market. Every Python package must provide a ``pyproject.toml`` and specify
the backend (build system) it wants to use. The distribution can then
be generated with whatever tool provides ``build sdist``-like
functionality. While this may appear cumbersome, given the added pieces,
it in fact tremendously enhances the portability of your package. The
change is driven under
`PEP 517 <https://www.python.org/dev/peps/pep-0517/#build-requirements>`_.
To learn more about Python packaging in general, see the
`Resources on Python packaging`_ section at the bottom of this page.
Basic Use
=========
For basic use of setuptools, you will need a ``pyproject.toml`` with the
exact following info, which declares you want to use ``setuptools`` to
package your project:
.. code-block:: toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
Then, you will need a ``setup.cfg`` to specify your package information,
such as metadata, contents, dependencies, etc. Here we demonstrate the minimum:

.. code-block:: ini

    [metadata]
    name = mypackage
    version = 0.0.1

    [options]
    packages = mypackage
    install_requires =
        requests
        importlib; python_version == "2.6"
This is what your project would look like::
~/mypackage/
pyproject.toml
setup.cfg
mypackage/__init__.py
Then, you need a build tool, such as `pep517 <https://pypi.org/project/pep517/>`_,
which you can obtain via ``pip install pep517``. After installing it, invoke
the build::
python -m pep517.build
You now have your distribution ready (e.g. a ``tar.gz`` file and a ``.whl``
file in the ``dist`` directory), which you can upload to PyPI!
Of course, before you release your project to PyPI, you'll want to add a bit
more information to your setup script to help people find or learn about your
project. And maybe your project will have grown by then to include a few
dependencies, and perhaps some data files and scripts. In the next few sections,
we will walk through the additional but essential information you need
to specify to properly package your project.
Automatic package discovery
===========================
For simple projects, it's usually easy enough to manually add packages to
the ``packages`` keyword in ``setup.cfg``. However, for very large projects,
it can be a big burden to keep the package list updated. ``setuptools``
therefore provides two convenient tools to ease the burden: ``find:`` and
``find_namespace:``. To use them in your project:
.. code-block:: ini

    [options]
    packages = find:

    [options.packages.find]  # optional
    include = pkg1, pkg2
    exclude = pkg3, pkg4
When you pass the above information, alongside other necessary ones,
``setuptools`` walks through the directory specified in ``where`` (omitted
here as the packages reside in the current directory), filters the packages
it can find following the ``include`` patterns (which default to including
everything), then removes those that match the ``exclude`` patterns and
returns a list of Python packages. Note that each entry in the
``[options.packages.find]`` section is optional. The above
setup also allows you to adopt a ``src/`` layout. For more details and advanced
use, go to :ref:`package_discovery`.
Entry points and automatic script creation
===========================================
Setuptools supports automatic creation of scripts upon installation that run
code within your package if you specify them with the ``entry_points`` keyword.
This is what allows you to run commands like ``pip install`` instead of having
to type ``python -m pip install``. To accomplish this, add the ``entry_points``
keyword in your ``setup.cfg``:
.. code-block:: ini

    [options.entry_points]
    console_scripts =
        main = mypkg:some_func
When this project is installed, a ``main`` script will be installed and will
invoke ``some_func`` from the package's ``__init__.py`` when called by the user.
For detailed usage, including managing the additional or optional dependencies,
go to :ref:`entry_point`.
Dependency management
=====================
``setuptools`` supports automatically installing dependencies when a package is
installed. The simplest way to include requirement specifiers is to use the
``install_requires`` argument to ``setup.cfg``. It takes a string or list of
strings containing requirement specifiers (A version specifier is one of the
operators <, >, <=, >=, == or !=, followed by a version identifier):
.. code-block:: ini

    [options]
    install_requires =
        docutils >= 0.3
        requests <= 0.4
When your project is installed, all of the dependencies not already installed
will be located (via PyPI), downloaded, built (if necessary), and installed.
This, of course, is a simplified scenario. ``setuptools`` also provides
additional keywords such as ``setup_requires``, which allows you to install
dependencies before running the script, and ``extras_require``, which takes
care of optional dependencies (for example, those needed only by automatically
generated scripts). It also provides
mechanisms to handle dependencies that are not in PyPI. For more advanced use,
see :ref:`dependency_management`.
Including Data Files
====================
The distutils have traditionally allowed installation of "data files", which
are placed in a platform-specific location. Setuptools offers three ways to
specify data files to be included in your packages. For the simplest use, you
can simply use the ``include_package_data`` keyword:
.. code-block:: ini
[options]
include_package_data = True
This tells setuptools to install any data files it finds in your packages.
The data files must be specified via the distutils' ``MANIFEST.in`` file.
For more details, see :ref:`datafiles`
Development mode
================
``setuptools`` allows you to install a package without copying any files
to your interpreter directory (e.g. the ``site-packages`` directory). This
allows you to modify your source code and have the changes take effect without
you having to rebuild and reinstall. This is currently incompatible with
PEP 517 and therefore it requires a ``setup.py`` script with the following
content::
import setuptools
setuptools.setup()
Then::
pip install --editable .
This creates a link file in your interpreter's site-packages directory that
points to your source code. For more information, see: (WIP)
Uploading your package to PyPI
==============================
After generating the distribution files, the next step would be to upload your
distribution so others can use it. This functionality is provided by
`twine <https://pypi.org/project/twine/>`_ and we will only demonstrate the
basic use here.
Transitioning from ``setup.py`` to ``setup.cfg``
==================================================
To avoid executing arbitrary scripts and boilerplate code, we are transitioning
to a full-fledged ``setup.cfg`` to declare your package information instead
of running ``setup()``. This inevitably brings challenges due to a different
syntax. Here we provide a quick guide to understanding how ``setup.cfg`` is
parsed by ``setuptools`` to ease the pain of transition.
Resources on Python packaging
=============================
Packaging in Python is hard. Here we provide a list of links for those that
want to learn more.