
Commit 4c6ea0f

Merge remote-tracking branch 'origin/develop' into rl4sem

Conflicts:
- src/sensai/data_transformation/dft.py
- src/sensai/torch/torch_base.py

2 parents: 4c45838 + 62bd604

35 files changed: +898 −586 lines

.bumpversion.cfg

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 1.3.0
+current_version = 1.4.0
 commit = False
 tag = False
 allow_dirty = False
```

.github/workflows/ci.yaml

Lines changed: 136 additions & 0 deletions (new file)

```yaml
name: CI

on:
  push:
    branches: [develop, master]
  pull_request:
    branches: [develop]

jobs:
  # Define matrix build for multiple test scenarios
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # py_backwardscompat is not included because there is also an issue
        # with numpy compatibility on Python 3.10
        env_name: [py_pinned_dependencies, py_latest_dependencies]
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3
        with:
          lfs: true

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"

      - name: Install Dependencies
        run: |
          python -m pip install --upgrade pip
          if [[ "${{ matrix.env_name }}" == "py_pinned_dependencies" ]]; then
            pip install -v \
              -r requirements.txt \
              -r requirements_torch.txt \
              -r requirements_lightgbm.txt \
              -r requirements_geoanalytics.txt \
              -r requirements_xgboost.txt \
              pytest \
              pytest-cov \
              pytest-xdist \
              pytorch-lightning~=1.1.0 \
              coverage \
              coverage-badge
          elif [[ "${{ matrix.env_name }}" == "py_latest_dependencies" ]]; then
            pip install \
              pytest \
              jupyter==1.0.0 \
              nbconvert==6.5.0 \
              clearml==0.17.1 \
              "pytorch-lightning>=1.1" \
              ".[full]"
          elif [[ "${{ matrix.env_name }}" == "py_backwardscompat" ]]; then
            pip install \
              pytest \
              "scikit-learn==1.0.2" \
              "numpy<1.21" \
              ".[torch]"
          fi
          pip install --no-deps -e .

      - name: Run Tests
        run: |
          if [[ "${{ matrix.env_name }}" == "py_pinned_dependencies" ]]; then
            coverage erase
            pytest -n 4 --cov --cov-append --cov-report=term-missing tests
            coverage html
            coverage-badge -o badges/coverage.svg -f
          elif [[ "${{ matrix.env_name }}" == "py_latest_dependencies" ]]; then
            pytest
          elif [[ "${{ matrix.env_name }}" == "py_backwardscompat" ]]; then
            pytest tests/backwardscompat
          fi

      - name: Run Notebook Tests
        run: |
          if [[ "${{ matrix.env_name }}" == "py_latest_dependencies" ]]; then
            pytest notebooks
          fi

      - name: Upload Notebook Artifacts
        if: matrix.env_name == 'py_latest_dependencies'
        uses: actions/upload-artifact@v4
        with:
          name: docs_outputs
          path: docs/

  # Documentation build job
  docs:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v3

      - name: Download Notebook Artifacts
        uses: actions/download-artifact@v4
        with:
          name: docs_outputs
          path: docs/

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"

      - name: Install Documentation Dependencies
        run: |
          python -m pip install --upgrade pip
          pip install \
            sphinx==5.0.2 \
            sphinxcontrib-websupport==1.2.4 \
            sphinx-toolbox==3.7.0 \
            sphinx_rtd_theme \
            nbsphinx \
            ipython \
            ipywidgets \
            jupyter-book==0.15.1

      - name: Build Documentation
        run: sh build-docs.sh

      - name: Prepare Pages
        if: github.ref == 'refs/heads/develop'
        run: |
          mv docs/build/* public/docs

      - name: Deploy Pages
        if: github.ref == 'refs/heads/develop'
        uses: JamesIves/github-pages-deploy-action@3.7.1
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BRANCH: gh-pages
          FOLDER: public
          TARGET_FOLDER: .
          CLEAN: true
          SINGLE_COMMIT: true
```
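The matrix comment in the workflow notes that `py_backwardscompat` is excluded because of a numpy compatibility issue on Python 3.10 (the environment pins `numpy<1.21`, which predates Python 3.10 support). One way the environment could be re-enabled is by also varying the interpreter version in the matrix. This is a hypothetical sketch, not part of this commit; `python_version` is an invented matrix key here:

```yaml
strategy:
  matrix:
    env_name: [py_pinned_dependencies, py_latest_dependencies]
    python_version: ["3.10"]
    include:
      # Hypothetical: run the backwards-compatibility environment on an
      # older interpreter that the pinned numpy/scikit-learn still support
      - env_name: py_backwardscompat
        python_version: "3.9"
```

The "Setup Python" step would then need `python-version: ${{ matrix.python_version }}` instead of the hard-coded "3.10".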

.github/workflows/release.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -32,7 +32,7 @@ jobs:
       - name: Set up Python for PyPI Release
         uses: actions/setup-python@v1
         with:
-          python-version: '3.7'
+          python-version: '3.10'
       - name: Install dependencies for PyPI Release
         run: |
           python -m pip install --upgrade pip
```

.github/workflows/tox.yaml

Lines changed: 0 additions & 73 deletions
This file was deleted.

CHANGELOG.md

Lines changed: 41 additions & 1 deletion

```diff
@@ -1,8 +1,48 @@
 # Changelog
 
-
 ## Unreleased
 
+### Breaking Changes
+
+* Dropped support for Python versions below 3.10
+* Dropped support for TensorFlow (removing `sensai.tensorflow`)
+
+### Improvements/Changes
+
+* `util`:
+  * `util.cache`:
+    * `cache_mysql.MySQLPersistentKeyValueCache`:
+      * Switch from MySQLdb to pymysql
+      * Add support for additional connection arguments
+      * Use autocommit and remove the option of using deferred commits, as autocommit is the only way to guarantee
+        that no stale data is read due to transactions staying open too long
+      * Handle duplicate keys upon insertion (due to a race condition) by raising a more informative exception
+  * `util.logging`:
+    * `configure`: Allow the output stream to be configured
+  * `util.git`:
+    * `git_status`: Add option `log_error`
+
```
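The duplicate-key handling mentioned for `cache_mysql.MySQLPersistentKeyValueCache` in the Unreleased section can be illustrated with a minimal sketch. The pattern below uses the standard-library sqlite3 as a stand-in for pymysql; the class name, error message, and schema are hypothetical and are not sensAI's actual implementation:

```python
import sqlite3


class DuplicateKeyError(Exception):
    """Hypothetical, more informative error for concurrent inserts (sketch only)."""


def insert_key_value(conn: sqlite3.Connection, key: str, value: str) -> None:
    # With autocommit semantics, two writers may race to insert the same key;
    # the loser gets an integrity error, which we re-raise with context.
    try:
        conn.execute("INSERT INTO kv (key, value) VALUES (?, ?)", (key, value))
        conn.commit()
    except sqlite3.IntegrityError as e:
        raise DuplicateKeyError(
            f"Key '{key}' was inserted concurrently by another writer; "
            f"the previously stored value is retained") from e


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (key TEXT PRIMARY KEY, value TEXT)")
insert_key_value(conn, "a", "1")
try:
    insert_key_value(conn, "a", "2")  # second insert for the same key fails
except DuplicateKeyError as e:
    print(e)
```

The point of the informative exception is that a caller can distinguish a lost insert race from other database errors.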
```diff
+## 1.4.0 (2025-01-21)
+
+### Improvements/Changes
+
+* `evaluation`: For cases where the model projects the input data to a subset of rows,
+  the evaluator now projects the ground truth data accordingly.
+* `util`:
+  * `util.deprecation`: Annotations of class init functions now report the respective class as being deprecated
+  * `util.plot`: Add method `Plot.show` for convenience
+  * `util.string`:
+    * `pretty_string_repr` (and `ToStringMixin.pprint` and `.pprints`):
+      Handle content in curly braces, i.e. dictionaries, treating them like an object with content indented if it is too long
+    * `to_string` (and `ToStringMixin`):
+      For objects of type `str`, return a quoted string (as returned by `repr`) to avoid strings with line breaks distracting the output
+  * `util.helper`: Add `flatten_dict` to flatten a dictionary into a single-level dictionary with keys as dot-separated paths
+  * `util.tensorboard`: New module with utilities for loading/comparing series data from tensorboard logs
+  * `util.git`: New module with function `git_status` for retrieving the status of a git repository (which can be useful for logging)
+
+## v1.3.0 (2024-11-29)
 
 ### Improvements/Changes
 
 * `vector_model`:
```
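The behavior of `flatten_dict` described in the 1.4.0 changelog entry can be sketched as follows. This is an illustrative re-implementation of the documented behavior (a single-level dict with dot-separated key paths), not sensAI's actual code; the exact signature in `util.helper` may differ:

```python
def flatten_dict(d: dict, prefix: str = "") -> dict:
    """Flatten a nested dict into a single-level dict with dot-separated key paths."""
    result = {}
    for key, value in d.items():
        path = f"{prefix}.{key}" if prefix else str(key)
        if isinstance(value, dict):
            # recurse into nested dicts, extending the key path
            result.update(flatten_dict(value, prefix=path))
        else:
            result[path] = value
    return result


nested = {"model": {"lr": 0.01, "layers": {"hidden": 64}}, "seed": 42}
print(flatten_dict(nested))
# {'model.lr': 0.01, 'model.layers.hidden': 64, 'seed': 42}
```

Flattened keys like these are convenient for logging hyperparameters or writing nested configurations to tabular stores.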

README-dev.md

Lines changed: 19 additions & 10 deletions

```diff
@@ -18,30 +18,39 @@ Use conda to set up the Python environment:
 
 Solving the environment may take several minutes (but should ultimately work).
 
-NOTE: versions are mostly unpinned in the environment specification, because this facilitates conda dependency resolution. Also, sensAI is intended to be compatible with *all* (newer) versions of the dependencies. If it isn't, we need to specify an upper version bound in `setup.py` (where it matters the most) as well as in `environment.yml`. Compatibility with old (pinned) versions and the latest versions is tested in the tox build (see below).
+NOTE: versions are mostly unpinned in the environment specification, because this facilitates conda dependency resolution.
+Also, sensAI is intended to be compatible with *all* (newer) versions of the dependencies.
+If it isn't, we need to specify an upper version bound in `setup.py` (where it matters the most) as well as in `environment.yml`.
+Compatibility with old (pinned) versions and the latest versions is tested in the GitHub build (see below).
 
 # Build and Test Pipeline
 
-The tests and docs build are executed via **tox** in several environments:
-* `py`: the "regular" test environment, where we test against the pinned dependencies (by explicitly including `requirements.txt` with the pinned versions; this is also the environment in which we test the execution of notebooks
-* `py_latest_dependencies`: the environment where we use the latest versions of all dependencies (except where we have identified an incompatibility; see `setup.py` definitions `DEPS_VERSION_LOWER_BOUND` and `DEPS_VERSION_UPPER_BOUND_EXCLUSIVE`); by not including `requirements.txt`, we depend on the latest admissible versions according to `setup.py`
-* `docs`: the environment in which docs are built via sphinx
+The tests and docs build are executed in several environments:
+* `py_pinned_dependencies`: the "regular" test environment, where we test against the pinned dependencies
+  (by explicitly including `requirements.txt` with the pinned versions); this is also the environment in which we test the
+  execution of notebooks
+* `py_latest_dependencies`: the environment where we use the latest versions of all dependencies (except where we have
+  identified an incompatibility; see `setup.py` definitions `DEPS_VERSION_LOWER_BOUND` and `DEPS_VERSION_UPPER_BOUND_EXCLUSIVE`);
+  by not including `requirements.txt`, we depend on the latest admissible versions according to `setup.py`
+* `py_backwardscompat`: a special environment with old versions of some critical dependencies, in which we can test backwards
+  compatibility with persisted models of very old sensAI versions (that used older versions of the dependencies, e.g. sklearn)
 
 ## Automated Tests
 
-The tests can be locally run without tox via
+The tests can be run locally via
 
     sh run_pytest_tests.sh
 
 ## Docs Build
 
-Docs are automatically created during the GitHub build via tox.
+Docs are automatically created during the GitHub build.
 
 All .rst files are auto-generated (by `build_scripts/update_docs.py`), with the exception of the root index file `index.rst`.
 
-### Declaring Optional Dependencies
+### Declaring Mock Imports for Dependencies
 
-**Attention**: Make sure that any optional sensAI dependencies (which are not included in the `docs` tox environment) are added to `docs/conf.py` under `autodoc_mock_imports`. Otherwise the tox build will fail.
+**Attention**: Make sure that any sensAI dependencies are added to `docs/_config.yml` under `autodoc_mock_imports`.
+Otherwise, the docs build will fail.
 
 ### Notebooks
 
@@ -55,7 +64,7 @@ For changes in notebooks to be reflected in the docs build, the test needs to be
 
 ### Manually Running the Docs Build
 
-The docs build can be run without tox via
+The docs build can be run via
 
     sh build-docs.sh
 
```
README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,7 +7,7 @@
 <div align="center" style="text-align:center">
   <a href="https://pypi.org/project/sensai/" style="text-decoration:none"><img src="https://img.shields.io/pypi/v/sensai.svg" alt="PyPI"></a>
   <a href="https://raw.githubusercontent.com/jambit/sensAI/master/LICENSE" style="text-decoration:none"><img alt="License" src="https://img.shields.io/pypi/l/sensai"></a>
-  <a href="https://github.com/jambit/sensAI/actions/workflows/tox.yaml" style="text-decoration:none"><img src="https://github.com/jambit/sensAI/actions/workflows/tox.yaml/badge.svg" alt="Build status"></a>
+  <a href="https://github.com/jambit/sensAI/actions/workflows/ci.yaml" style="text-decoration:none"><img src="https://github.com/jambit/sensAI/actions/workflows/ci.yaml/badge.svg" alt="Build status"></a>
 </div>
 </p>
```
