Beginning with conda version 4.3 and conda-build 2.1, two new types of noarch packages are supported. Noarch Python packages cut down on the overhead of building multiple different pure Python packages on different architectures and Python versions by sorting out platform and Python version-specific differences at install time. Noarch generic packages allow users to distribute docs, datasets, and source code in conda packages.
It’s true that conda-build has had a noarch_python option for a while, but the user experience has been suboptimal. The deprecated noarch_python flag adds a pre-link script to the package to handle the various install-time, platform-dependent differences. Our noarch Python implementation instead teaches conda itself about noarch packages of the Python type. The Python-specific install logic is moved out of the package and into conda, where any extra capabilities and bugs can be addressed directly without the need to rebuild the package.
How to Build Noarch Packages
To build a noarch Python package, specify noarch in your meta.yaml:
build:
  noarch: python
Similarly, to build a noarch generic package, specify noarch in your meta.yaml:
build:
  noarch: generic
While there are currently only two supported flavors of noarch package, generic and python, we’ll likely extend the concept in future releases.
The Anatomy of a Python Noarch Package
Similar to a regular conda package, a noarch Python package contains a site-packages and an info directory that define the package. In addition to the standard files in the info directory, there is a link.json file that defines the type of noarch package and any entry points or scripts. Entry points, defined in the setup.py-style entry_points['console_scripts'], will be created by conda when the package is installed. These packages also do not contain any .pyc files, since these differ among Python versions; instead, generation of .pyc files is handled by conda at install time. All other scripts associated with the package, for example those found in bin or Scripts, will be included in the python-scripts directory.
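The install-time bytecode generation described above can be illustrated with the standard library’s py_compile module, which does for a single file what conda does for the whole package at install time (the module name and paths here are made up for the example):

```python
import os
import py_compile
import tempfile

# Write a small pure-Python module, as it would ship inside a noarch
# package (no .pyc files are included in the package itself).
src_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "example.py")
with open(src, "w") as f:
    f.write("def main():\n    return 'hello'\n")

# At install time, conda compiles bytecode for the interpreter in the
# target environment; py_compile does the equivalent for one file.
pyc_path = py_compile.compile(src, cfile=src + "c")
print(os.path.exists(pyc_path))
```

Because compilation happens against the interpreter in the destination environment, the same noarch package yields correct bytecode on every Python version it supports.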
The package structure will look something like:
package
- info/
  - files
  - about.json
  - index.json
  - link.json
  - recipe/
    - ...
- site-packages/
- python-scripts/
Here, link.json will have a noarch section that looks something like:
{
"noarch": {
"type": "python",
"entry_points": [
"pkg = pkg.foo:main"
]
}
}
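Conda reads this metadata at install time; a short illustrative sketch (not conda’s actual implementation) shows how the entry point specifications in the link.json above break apart:

```python
import json

# The noarch metadata from the example link.json above.
link_json = """
{
  "noarch": {
    "type": "python",
    "entry_points": [
      "pkg = pkg.foo:main"
    ]
  }
}
"""

noarch = json.loads(link_json)["noarch"]
print(noarch["type"])  # python

# Each entry point has the form "<command> = <module>:<function>".
for spec in noarch.get("entry_points", []):
    command, _, target = spec.partition(" = ")
    module, _, function = target.partition(":")
    print(command, module, function)  # pkg pkg.foo main
```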
The Noarch Python Package Build Process
By defining noarch: python in meta.yaml, conda-build will create a noarch Python package as defined above, without any .pyc files or __pycache__ directories. It will also create an info/link.json file with information about the type of noarch package and its entry points.
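As a rough sketch of what excluding .pyc files and __pycache__ directories means for the packaged file list (the paths are invented for illustration; this is not conda-build’s code):

```python
# Hypothetical file listing for a pure-Python package before packaging.
built_files = [
    "site-packages/flask/__init__.py",
    "site-packages/flask/app.py",
    "site-packages/flask/__pycache__/app.cpython-36.pyc",
    "python-scripts/flask",
]

# A noarch package ships only version-independent files, so compiled
# bytecode and __pycache__ directories are left out of the archive.
noarch_files = [
    f for f in built_files
    if not f.endswith(".pyc") and "__pycache__" not in f.split("/")
]
print(noarch_files)
```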
For example, consider the flask package in the anaconda-recipes repo. This is a pure Python package that can easily be turned into a noarch package with a slight modification to its meta.yaml file:
build:
  noarch: python
  entry_points:
    - flask = flask.cli:main
Then, build the package as you would any normal conda package: conda build .
The resulting package is a noarch flask package and installable on any architecture and Python version (that the package itself supports).
The Noarch Python Package Install Process
To install these packages, conda will map the site-packages and python-scripts directories to the correct corresponding locations within the install prefix. It will then generate the entry points for the package, if applicable; on Windows systems, this includes the shim script required for entry points to work. Finally, conda will compile the .pyc files. From the user’s perspective, installing a noarch package is the same as installing any other, and is as simple as conda install <package>.
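On Unix-like systems, the console script conda generates for an entry point is essentially a tiny Python wrapper. A hedged sketch of rendering such a wrapper for the spec "pkg = pkg.foo:main" follows; the exact template conda uses, and the interpreter path shown, may differ:

```python
def entry_point_script(spec, python="/opt/conda/envs/myenv/bin/python"):
    """Render a console-script wrapper for an entry point spec like
    'pkg = pkg.foo:main'. The shebang path is illustrative only."""
    name, _, target = spec.partition(" = ")
    module, _, function = target.partition(":")
    body = (
        f"#!{python}\n"
        "import sys\n"
        f"from {module} import {function}\n"
        f"sys.exit({function}())\n"
    )
    return name.strip(), body

name, script = entry_point_script("pkg = pkg.foo:main")
print(name)
print(script)
```

Because the wrapper is generated at install time, its shebang always points at the Python interpreter of the environment the package lands in, which is exactly the detail a build-time script could not know.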
Uninstalling noarch packages works the same way as uninstalling regular conda packages: conda uninstall <package> will remove the package from the environment.
Looking to the Future
This new way of treating noarch packages aims to provide users with a flexible way of creating conda packages that are platform and Python version agnostic.