
Commit 5b35e5c

doc update module, functional, geometry types

1 parent a6e94af commit 5b35e5c

13 files changed: +378 −98 lines

.github/workflows/docs.yml

Lines changed: 6 additions & 9 deletions

````diff
@@ -29,16 +29,13 @@ jobs:
       - name: Install dependencies
         run: |
-          pip install mkdocs
-          pip install mkdocs-rtd-dropdown
-          pip install mkdocstrings
-          pip install mkdocstrings-python
-          pip install pymdown-extensions
-          pip install torch
-          pip install warpconvnet==0.3.4
+          python -m pip install --upgrade pip
+          pip install -r docs/requirements.txt
+          pip install torch torchvision --index-url https://download.pytorch.org/whl/cpu
+          echo "PYTHONPATH=$PWD" >> $GITHUB_ENV
 
       - name: Build documentation
-        run: mkdocs build -f docs/mkdocs-readthedocs.yml
+        run: mkdocs build
 
       - name: Setup Pages
         uses: actions/configure-pages@v4
@@ -58,4 +55,4 @@ jobs:
     steps:
       - name: Deploy to GitHub Pages
         id: deployment
-        uses: actions/deploy-pages@v4
+        uses: actions/deploy-pages@v4
````

README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -180,10 +180,10 @@ pytest tests/ --benchmark-only
 uv pip install -r docs/requirements.txt
 
 # Build docs
-mkdocs build -f mkdocs-readthedocs.yml
+mkdocs build
 
 # Serve locally
-mkdocs serve -f mkdocs-readthedocs.yml
+mkdocs serve
 ```
 
 📖 **Documentation**: [https://nvlabs.github.io/WarpConvNet/](https://nvlabs.github.io/WarpConvNet/)
````

docs/api/geometry.md

Lines changed: 53 additions & 0 deletions

````diff
@@ -1,3 +1,56 @@
 # Geometry
 
 ::: warpconvnet.geometry
+
+## Geometry containers
+
+Every geometry type in WarpConvNet wraps a `Coords` instance and a `Features`
+instance that share the same ragged-batch metadata.
+
+- `warpconvnet.geometry.base.coords.Coords` stores concatenated coordinates plus an
+  `offsets` vector marking where each example begins.
+- `warpconvnet.geometry.base.features.Features` (and the `CatFeatures` and `PadFeatures`
+  specializations) stores feature tensors that obey the same offsets so coordinates and
+  features always stay aligned.
+- `warpconvnet.geometry.base.geometry.Geometry` wires the pair together, validates their
+  shapes, and exposes device/dtype utilities with AMP-aware accessors.
+
+This shared contract lets subclasses freely switch between point clouds, voxels, or grids
+without duplicating batching logic. See [Batched coordinate layout](./geometry_batched.md)
+for a deeper explanation of how concatenated tensors and offsets interact.
+
+## Types
+
+WarpConvNet ships several geometry containers that unify coordinate systems
+with their associated features. Use these types as the canonical interfaces for
+points, voxels, dense grids, and FIGConvNet factor grids.
+
+### Points
+
+Flexible point-cloud geometry supporting ragged batches, feature paddings, and
+neighbor search utilities for sparse convolution modules.
+
+::: warpconvnet.geometry.types.points.Points
+
+### Voxels
+
+Sparse voxel geometry that accepts integer coordinates with tensor strides and
+offers helpers to move between dense tensors and CSR-style batched features.
+
+::: warpconvnet.geometry.types.voxels.Voxels
+
+### Grid
+
+Regular dense grid representation that keeps `GridCoords` and `GridFeatures`
+in sync, providing utilities for shape validation, format conversions, and
+batch-aware initialization.
+
+::: warpconvnet.geometry.types.grid.Grid
+
+### Factor grid
+
+Container that bundles multiple `Grid` instances with distinct factorized
+memory formats so that FIGConvNet layers can operate on complementary spatial
+perspectives.
+
+::: warpconvnet.geometry.types.factor_grid.FactorGrid
````
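The shared-offsets contract this page documents can be sketched independently of the library. The classes and the `make_geometry` helper below are illustrative stand-ins, not WarpConvNet's actual API; they only show the invariant the real `Geometry` base class enforces, namely that coordinates and features carry one and the same offsets vector:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Coords:
    """Concatenated coordinates: rows of all samples stacked back-to-back."""
    rows: List[List[int]]   # shape [N, D]
    offsets: List[int]      # shape [B + 1]; offsets[b] is the first row of sample b


@dataclass
class Features:
    """Per-row feature vectors that must share the coordinates' offsets."""
    rows: List[List[float]]  # shape [N, C]
    offsets: List[int]


def make_geometry(coords: Coords, feats: Features) -> Tuple[Coords, Features]:
    """Validate the pairing the way a geometry base class would."""
    if coords.offsets != feats.offsets:
        raise ValueError("coordinates and features must share one offsets vector")
    if len(coords.rows) != len(feats.rows):
        raise ValueError("coordinate and feature row counts must match")
    return coords, feats


# Two ragged samples of lengths 3 and 2, as in the page's torch example.
coords = Coords(
    rows=[[0, 0, 0], [0, 1, 0], [1, 0, 0], [5, 5, 5], [6, 5, 5]],
    offsets=[0, 3, 5],
)
feats = Features(rows=[[1.0]] * 5, offsets=[0, 3, 5])
geometry = make_geometry(coords, feats)  # passes validation
```

Because every subclass inherits this check, downstream code can slice any sample with `rows[offsets[b]:offsets[b + 1]]` for coordinates and features alike.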

docs/api/geometry_batched.md

Lines changed: 61 additions & 0 deletions

````diff
@@ -0,0 +1,61 @@
+# Batched coordinate layout
+
+::: warpconvnet.geometry.base.geometry.Geometry
+
+Geometry containers in WarpConvNet treat batched data as a pair of concatenated tensors
+plus an `offsets` vector. This “batched” layout stores every sample back-to-back inside a
+single tensor while recording where each sample begins so ragged batches stay addressable.
+
+## Concatenated coordinates
+
+`warpconvnet.geometry.base.coords.Coords` keeps two tensors:
+
+- `batched_tensor`: shape `[N, D]` with all coordinates concatenated in batch order.
+- `offsets`: shape `[B + 1]` with `offsets[b]` marking the starting row for batch `b`.
+
+The difference `offsets[b + 1] - offsets[b]` gives the number of coordinates in sample
+`b`. Because the data lives in one tensor, it is cheap to move, sort, or feed to CUDA
+kernels that expect CSR-style layouts.
+
+## Features that share offsets
+
+`warpconvnet.geometry.base.features.Features` mirrors the coordinate batching strategy.
+Both `CatFeatures` (concatenated) and `PadFeatures` (padded) expose the same offsets so
+`Geometry` subclasses can hop between dense and ragged views without recomputing metadata.
+
+- Concatenated features: `[N, C]` storage with the same CSR-style offsets as the
+  coordinates.
+- Padded features: `[B, L_max, C]` storage when kernels need dense memory, but each row
+  still references the shared offsets to keep track of valid entries.
+
+## Putting it together
+
+The base `Geometry` class validates that coordinates and features have identical offsets,
+then exposes helpers such as `geometry.batch_indexed_coordinates` and AMP-aware feature
+accessors. Subclasses (e.g., `Points`, `Voxels`, `Grid`) inherit these guarantees and add
+type-specific methods on top.
+
+```python
+import torch
+from warpconvnet.geometry.types.points import Points
+
+coords = torch.tensor(
+    [
+        [0, 0, 0],
+        [0, 1, 0],
+        [1, 0, 0],
+        [5, 5, 5],
+        [6, 5, 5],
+    ],
+    dtype=torch.int32,
+)
+offsets = torch.tensor([0, 3, 5], dtype=torch.int32)
+features = torch.randn(5, 32)
+
+points = Points(coords, features, offsets=offsets)
+# points.batched_coordinates.offsets == points.batched_features.offsets
+```
+
+In this example two ragged samples (lengths 3 and 2) share one coordinate tensor and one
+feature tensor. The shared offsets allow the geometry module to slice, move devices, or
+convert between concatenated and padded features without copying metadata.
````
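The concatenated↔padded round trip described in the new page can be sketched without the library at all. The two helpers below are illustrative, not WarpConvNet functions; they show in pure Python how the `[N, C]` concatenated form and the `[B, L_max, C]` padded form convert losslessly as long as the offsets (and the per-sample lengths derived from them) are kept:

```python
from typing import List, Tuple


def cat_to_pad(rows: List[List[float]], offsets: List[int],
               pad: float = 0.0) -> Tuple[List[List[List[float]]], List[int]]:
    """Convert concatenated [N, C] rows to padded [B, L_max, C] form."""
    lengths = [offsets[b + 1] - offsets[b] for b in range(len(offsets) - 1)]
    l_max = max(lengths)
    c = len(rows[0])
    padded = []
    for b, n in enumerate(lengths):
        sample = rows[offsets[b]:offsets[b + 1]]        # valid rows of sample b
        padded.append(sample + [[pad] * c] * (l_max - n))  # fill up to L_max
    return padded, lengths


def pad_to_cat(padded: List[List[List[float]]],
               lengths: List[int]) -> List[List[float]]:
    """Drop the padding again using the recorded valid lengths."""
    return [row for sample, n in zip(padded, lengths) for row in sample[:n]]


# Two ragged samples (lengths 3 and 2), matching the offsets in the example above.
rows = [[1.0], [2.0], [3.0], [4.0], [5.0]]
offsets = [0, 3, 5]
padded, lengths = cat_to_pad(rows, offsets)
assert pad_to_cat(padded, lengths) == rows  # lossless round trip
```

The padded tensor is what dense kernels consume; the `lengths` (equivalently, the offsets) mark which rows are real, which is exactly the bookkeeping `PadFeatures` keeps alongside its storage.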

docs/api/nn.md

Lines changed: 62 additions & 0 deletions

````diff
@@ -1,3 +1,65 @@
 # Neural Networks
 
 ::: warpconvnet.nn
+
+## Modules
+
+### Activations
+
+::: warpconvnet.nn.modules.activations
+
+### Attention
+
+::: warpconvnet.nn.modules.attention
+
+### Base module
+
+::: warpconvnet.nn.modules.base_module
+
+### Factor grid
+
+::: warpconvnet.nn.modules.factor_grid
+
+### Grid convolution
+
+::: warpconvnet.nn.modules.grid_conv
+
+### MLP
+
+::: warpconvnet.nn.modules.mlp
+
+### Normalizations
+
+::: warpconvnet.nn.modules.normalizations
+
+### Point convolution
+
+::: warpconvnet.nn.modules.point_conv
+
+### Point pooling
+
+::: warpconvnet.nn.modules.point_pool
+
+### Prune
+
+::: warpconvnet.nn.modules.prune
+
+### Sequential
+
+::: warpconvnet.nn.modules.sequential
+
+### Sparse convolution
+
+::: warpconvnet.nn.modules.sparse_conv
+
+### Sparse depthwise convolution
+
+::: warpconvnet.nn.modules.sparse_conv_depth
+
+### Sparse pooling
+
+::: warpconvnet.nn.modules.sparse_pool
+
+### Transforms
+
+::: warpconvnet.nn.modules.transforms
````

docs/api/nn_functional.md

Lines changed: 61 additions & 0 deletions

````diff
@@ -0,0 +1,61 @@
+# Neural Network Functionals
+
+::: warpconvnet.nn.functional
+
+## Functionals
+
+### Sparse convolution
+
+::: warpconvnet.nn.functional.sparse_conv
+
+### Sparse depthwise convolution
+
+::: warpconvnet.nn.functional.sparse_conv_depth
+
+### Sparse pooling
+
+::: warpconvnet.nn.functional.sparse_pool
+
+### Sparse ops helpers
+
+::: warpconvnet.nn.functional.sparse_ops
+
+### Point pooling
+
+::: warpconvnet.nn.functional.point_pool
+
+### Point unpooling
+
+::: warpconvnet.nn.functional.point_unpool
+
+### Grid convolution
+
+::: warpconvnet.nn.functional.grid_conv
+
+### Factor grid
+
+::: warpconvnet.nn.functional.factor_grid
+
+### Global pooling
+
+::: warpconvnet.nn.functional.global_pool
+
+### Feature transforms
+
+::: warpconvnet.nn.functional.transforms
+
+### Encodings
+
+::: warpconvnet.nn.functional.encodings
+
+### Normalizations
+
+::: warpconvnet.nn.functional.normalizations
+
+### Segmented arithmetic
+
+::: warpconvnet.nn.functional.segmented_arithmetics
+
+### Batched matrix multiplication
+
+::: warpconvnet.nn.functional.bmm
````
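Several of the functionals listed above (pooling, global pooling, segmented arithmetic) reduce per-row features within each sample of a ragged batch, using the same `[B + 1]` offsets vector the geometry pages describe. A minimal segmented sum can be sketched as follows; this is an illustrative pure-Python sketch, not the signature of `warpconvnet.nn.functional.segmented_arithmetics`:

```python
from typing import List


def segmented_sum(values: List[float], offsets: List[int]) -> List[float]:
    """Sum values within each segment [offsets[b], offsets[b + 1])."""
    return [
        sum(values[offsets[b]:offsets[b + 1]])
        for b in range(len(offsets) - 1)
    ]


# Two segments of lengths 3 and 2:
print(segmented_sum([1.0, 2.0, 3.0, 4.0, 5.0], [0, 3, 5]))  # [6.0, 9.0]
```

Replacing `sum` with `max` or a mean gives the other common segmented reductions; the GPU versions do the same bookkeeping with CSR-style kernels instead of Python slices.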

docs/deployment.md

Lines changed: 12 additions & 6 deletions

````diff
@@ -9,17 +9,20 @@ The documentation is automatically built and deployed using GitHub Actions whene
 ### How it Works
 
 1. **GitHub Actions Workflow**: `.github/workflows/docs.yml`
+
    - Triggers on pushes to main/master branch
    - Installs Python dependencies
    - Builds documentation using MkDocs
    - Deploys to GitHub Pages
 
-2. **Configuration**: `mkdocs-readthedocs.yml`
+2. **Configuration**: `mkdocs.yml`
+
    - Uses ReadTheDocs theme
    - Configures site metadata
    - Sets up navigation structure
 
 3. **Dependencies**: `docs/requirements.txt`
+
    - Lists all required Python packages
    - Ensures consistent builds
 
@@ -32,7 +35,7 @@ If you need to deploy manually:
 uv pip install -r docs/requirements.txt
 
 # Build documentation
-mkdocs build -f mkdocs-readthedocs.yml
+mkdocs build
 
 # The site/ directory contains the built documentation
 ```
@@ -43,7 +46,7 @@ For local development and testing:
 
 ```bash
 # Serve documentation locally
-mkdocs serve -f mkdocs-readthedocs.yml
+mkdocs serve
 
 # This will start a local server at http://127.0.0.1:8000
 ```
@@ -60,16 +63,19 @@ To enable GitHub Pages:
 ## Troubleshooting
 
 ### Build Failures
+
 - Check GitHub Actions logs for error details
 - Ensure all dependencies are listed in `docs/requirements.txt`
 - Verify MkDocs configuration syntax
 
 ### Missing Pages
+
 - Check that all referenced markdown files exist
-- Verify navigation structure in `mkdocs-readthedocs.yml`
+- Verify navigation structure in `mkdocs.yml`
 - Ensure files are committed to the repository
 
 ### Theme Issues
+
 - ReadTheDocs theme is automatically installed by the workflow
-- Check theme configuration in `mkdocs-readthedocs.yml`
-- Verify JavaScript and CSS assets are loading correctly
+- Check theme configuration in `mkdocs.yml`
+- Verify JavaScript and CSS assets are loading correctly
````
