31 changes: 17 additions & 14 deletions .github/workflows/build-and-test.yml
@@ -23,6 +23,8 @@ env:
LATEST_SUPPORTED_PYTHON_VERSION: "3.14"
# gsutil never seems to support the latest python
GSUTIL_PYTHON_VERSION: "3.13"
GDAL_VERSION_FOR_WHEEL_BUILD: "3.10.*"


jobs:
check-syntax-errors:
@@ -102,6 +104,8 @@ jobs:
${{ env.CONDA_DEFAULT_DEPENDENCIES }}
${{ matrix.platform-specific-dependencies }}
python=${{ matrix.python-version }}
gdal==${{ env.GDAL_VERSION_FOR_WHEEL_BUILD }}
wheel

- name: Download previous conda environment.yml
continue-on-error: true
@@ -130,21 +134,20 @@ jobs:
NATCAP_INVEST_GDAL_LIB_PATH="$CONDA_PREFIX/Library" python -m build --wheel
ls -la dist

# This produces a wheel that should work on any distro with glibc>=2.39.
# This is a very recent version. If we want to support older versions, I
# suspect we would need to build GDAL from source on an appropriate
# system (such as a manylinux docker container) to ensure compatibility.
# Symbols used in libgdal are the cause of the high minimum version,
# possibly because of installing with conda.
- name: Audit and repair wheel for manylinux
if: matrix.os == 'ubuntu-latest'
# Modify the wheel metadata to require the same minor version of gdal that we built against.
# The C++ extensions will be compiled and linked against the libgdal version
# that's in the build environment, and so that same libgdal version has to be
# available in the target environment. Each libgdal version corresponds to a
# minor version of GDAL itself. Adding this requirement to the package metadata
# should prevent it from being installed in an env without the necessary libgdal.
# Users who need a different GDAL version can use the conda-forge package or
# build their own wheel from source.
- name: Modify wheel to add gdal constraint
run: |
ldd --version
pip install auditwheel
WHEEL=$(find dist -name "natcap[._-]invest*.whl")
auditwheel show $WHEEL
auditwheel repair $WHEEL --plat manylinux_2_39_x86_64 -w dist
rm $WHEEL # remove the original wheel
wheel unpack dist/*.whl --dest tmp
sed -i.bak 's/Requires-Dist: gdal.*/Requires-Dist: gdal==${{ env.GDAL_VERSION_FOR_WHEEL_BUILD }}/I' tmp/*/*.dist-info/METADATA
rm tmp/*/*.dist-info/*.bak
wheel pack tmp/* --dest-dir dist
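The `sed` rewrite above can be sketched in Python to show the intended metadata transformation. This is an illustrative sketch only: the sample METADATA content and the pinned version string below are made up, not taken from an actual wheel.

```python
import re

# Illustrative METADATA content; the real file comes from the unpacked wheel.
metadata = (
    "Metadata-Version: 2.1\n"
    "Name: natcap.invest\n"
    "Requires-Dist: gdal>=3.4.2\n"
)

# Case-insensitive, line-anchored replacement, mirroring the sed expression:
# any existing gdal requirement line is replaced with an exact minor-version pin.
pinned = re.sub(
    r'(?im)^Requires-Dist: gdal.*$',
    'Requires-Dist: gdal==3.10.*',
    metadata,
)
print(pinned)
```

After repacking, `pip install` of the wheel will refuse to resolve unless a matching `gdal` distribution is available, which is the behavior the comment above describes.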

- name: Install wheel and run model tests
run: |
9 changes: 9 additions & 0 deletions HISTORY.rst
@@ -72,6 +72,15 @@ General
``deploy_data``, ``deploy_userguide``, ``deploy_workbench``) and updated
these targets to fail on missing artifacts instead of silently ignoring
errors. (`#831 <https://github.com/natcap/invest/issues/813>`_)
* Pre-built wheels are now constrained to require one specific minor version of
GDAL in an attempt to ensure compatibility of the compiled extensions with
the version of ``libgdal`` available. Users who need a different GDAL version
may install ``natcap.invest`` from conda-forge or build their own wheel from
source. (`#2206 <https://github.com/natcap/invest/issues/2206>`_)
* The ``manylinux_2_39`` wheels have been replaced with ``linux`` wheels built
on Ubuntu. This reduces the size of the wheels and avoids licensing concerns
around redistributing libraries.
(`#2483 <https://github.com/natcap/invest/issues/2483>`_)
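The practical difference between the old and new artifacts shows up in the wheel's platform tag. A minimal sketch using the third-party `packaging` library (the filename below is a hypothetical example of a plain `linux` wheel, not an actual release artifact):

```python
from packaging.utils import parse_wheel_filename

# Hypothetical wheel filename, for illustration only.
name, version, build, tags = parse_wheel_filename(
    "natcap_invest-3.16.0-cp313-cp313-linux_x86_64.whl")

# A plain linux_x86_64 tag makes no glibc-compatibility promise, unlike
# a manylinux_2_39_x86_64 tag produced by `auditwheel repair`.
print(name, [str(t) for t in tags])
```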


Workbench
8 changes: 5 additions & 3 deletions pyproject.toml
@@ -27,9 +27,11 @@ classifiers = [
"Topic :: Scientific/Engineering :: GIS"
]
# the version is provided dynamically by setuptools_scm
# `dependencies` and `optional-dependencies` are provided by setuptools
# using the corresponding setup args `install_requires` and `extras_require`
dynamic = ["version", "dependencies", "optional-dependencies"]
Member Author: We're not using optional-dependencies (though maybe we should!), so I removed it for now.

# dependencies are provided by setuptools reading from the requirements file
dynamic = ["version", "dependencies"]

[tool.setuptools.dynamic]
dependencies = {file = ["requirements.txt"]}

[project.urls]
homepage = "http://github.com/natcap/invest"
8 changes: 0 additions & 8 deletions setup.py
@@ -8,13 +8,6 @@
from setuptools.command.build_py import build_py as _build_py
from setuptools.extension import Extension

# Read in requirements.txt and populate the python readme with the
# non-comment, non-environment-specifier contents.
_REQUIREMENTS = [req.split(';')[0].split('#')[0].strip() for req in
                 open('requirements.txt').readlines()
                 if (not req.startswith(('#', 'hg+', 'git+'))
                     and len(req.strip()) > 0)]

Member Author: This is not a necessary change for this PR, but we might as well take advantage of the support for reading a requirements file directly in pyproject.toml.

Member: Yep, makes sense to consolidate towards pyproject.toml wherever possible!
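For reference, the behavior of the removed comprehension can be demonstrated on sample input (the lines below are illustrative, not the project's actual requirements.txt):

```python
# Sample requirements lines, as readlines() would return them (illustrative).
lines = [
    "GDAL>=3.4.2\n",
    "numpy>=1.11.0  # pip-only note\n",
    "psutil>=5.6.6; sys_platform != 'darwin'\n",
    "# a full-line comment\n",
    "git+https://github.com/example/pkg@main\n",
]

# Same logic as the removed setup.py code: drop full-line comments and VCS
# URLs, then strip trailing comments and environment markers from each line.
requirements = [
    req.split(';')[0].split('#')[0].strip() for req in lines
    if (not req.startswith(('#', 'hg+', 'git+'))
        and len(req.strip()) > 0)
]
print(requirements)  # → ['GDAL>=3.4.2', 'numpy>=1.11.0', 'psutil>=5.6.6']
```

The `[tool.setuptools.dynamic]` table in the pyproject.toml diff replaces this hand-rolled parsing with setuptools' own requirements-file reader.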

include_dirs = [numpy.get_include(), 'src/natcap/invest/managed_raster']
if platform.system() == 'Windows':
compiler_args = ['/std:c++20']
@@ -60,7 +53,6 @@ def run(self):


setup(
install_requires=_REQUIREMENTS,
ext_modules=cythonize([
Extension(
name=f'natcap.invest.{package}.{module}',