Levelset method for 4D change detection #342

Open · wants to merge 23 commits into base: main
4 changes: 3 additions & 1 deletion .github/workflows/ci.yml
@@ -67,7 +67,7 @@ jobs:

- name: Run Python tests
run: |
python -m pytest --nbval
python -m pytest --nbval --ignore=jupyter/levelset-analysis.ipynb

coverage-test:
name: Coverage Testing
@@ -195,6 +195,8 @@ jobs:
echo "leak:PyArrayMethod_FromSpec_int" >> supp.txt
echo "leak:PyDict_Copy" >> supp.txt
echo "leak:PyUnicode_New" >> supp.txt
echo "leak:PyUFunc_FromFuncAndDataAndSignatureAndIdentity" >> supp.txt
echo "leak:PyArrayIdentityHash_New" >> supp.txt
echo "leak:pyo3::types::function::PyCFunction::internal_new_from_pointers" >> supp.txt
echo "leak:pyo3::types::function::PyCFunction::internal_new::" >> supp.txt
# hack to prevent external libs from dlclosing libraries,
3 changes: 3 additions & 0 deletions .gitignore
@@ -177,3 +177,6 @@ dmypy.json

# Cython debug symbols
cython_debug/
jupyter/new_data_24_168_1/

jupyter/synthetic_k0_10_10/
2 changes: 1 addition & 1 deletion ext/pybind11
Submodule pybind11 updated 140 files
20 changes: 19 additions & 1 deletion jupyter/4dobc-analysis.ipynb
@@ -143,6 +143,24 @@
"We can plot single objects interactively:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"test = objects[1]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"test"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -216,7 +234,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
"version": "3.11.9"
}
},
"nbformat": 4,
231 changes: 231 additions & 0 deletions jupyter/levelset-analysis.ipynb
@@ -0,0 +1,231 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import py4dgeo"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"analysis = py4dgeo.SpatiotemporalAnalysis(\"synthetic.zip\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"analysis.invalidate_results(seeds=False, objects=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"Options for the data setup:\n",
"- `first_timestep` (int): The first timestep to process. Default is 0.\n",
"- `last_timestep` (int): The last timestep to process. If set to -1, all timesteps until the end are processed. Default is -1.\n",
"- `timestep_interval` (int): The interval between timesteps to process. Default is 1.\n",
"\n",
"Example of the interval pairings:\n",
"timestep_interval: 24 (each timestep is compared to the timestep + 24)\n",
"first_timestep: 0 (start with index 0)\n",
"last_timestep: 75 (the last first-epoch index is 75, so the highest index accessed is 99)\n",
"step0: f1:0 f2:24\n",
"step1: f1:1 f2:25\n",
"step2: f1:2 f2:26\n",
"step3: f1:3 f2:27\n",
"...\n",
"stepN: f1:75 f2:99\n",
"\n",
"Options for level set algorithm:\n",
"- `reuse_intermediate` (bool): Re-use intermediate calculations\n",
"(neighbors, normals, tangents). Default is True.\n",
"- `active_contour_model` (str): Active contours model, either\n",
"'chan_vese' or 'lmv' (Local Mean and Variance). Default is 'chan_vese'.\n",
"- `num_cycles` (int): Number of cycles, each runs a number of steps\n",
"then stores a result. Default is 12.\n",
"- `num_steps` (int): Number of steps per cycle. Default is 50.\n",
"- `num_smooth` (int): Number of smoothing passes for zeta.\n",
"Default is 1.\n",
"- `stepsize` (int): Explicit Euler step size. Default is 1000.\n",
"- `nu` (float): Controls regularization. Default is 0.0001.\n",
"- `mu` (float): Controls curvature. Default is 0.0025.\n",
"- `lambda1` (float): Controls zeta-in term. Default is 1.0.\n",
"- `lambda2` (float): Controls zeta-out term. Default is 1.0.\n",
"- `epsilon` (float): Heaviside/delta approximation \"width\",\n",
"is scaled with `h`. Default is 1.0.\n",
"- `h` (float): Approximate neighborhood radius\n",
"(all k neighbors should be within). Default is 2.5.\n",
"- `k` (int): Number of kNN neighbors. Default is 7.\n",
"- `tolerance` (float): Termination tolerance. Default is 5e-5.\n",
"- `cue_clip_pc` (float): Robust cues, clip at X%. Default is 99.9.\n",
"- `vox_size` (int): Initialization voxel size. Default is 10.\n",
"- `init_pc` (int): Initialization cue percentage. Default is 50.\n",
"- `init_method` (str): Initialization method, either 'voxel' or 'cue'.\n",
"Default is 'voxel'.\n",
"- `extraction_threshold` (int): Neighbor threshold for points\n",
"to be extracted\n",
"(must have >= salient neighbors to be extracted).\n",
"Calculated as `k // 2`.\n",
"- `center_data` (bool): Recenter cues by subtracting cue median.\n",
"Default is False.\n",
"\n",
"\n",
"Options for the shape analysis:\n",
"- `_filter` (str): Choose between the positive and negative data files.\n",
"Default is 'positive'.\n",
"- `distance_threshold` (float): Maximum distance between points for\n",
"them to be considered part of the same object. Default is 1.\n",
"- `change_threshold` (float): How high the change value needs to be to\n",
"be considered a valid entry. Default is 0.5.\n",
"- `alpha` (float): Alpha parameter for the alpha shape identification;\n",
"lower values give a smoother but less exact shape. Default is 1.\n",
"- `area_threshold` (int): Area threshold for filtering small polygons.\n",
"Default is 100.\n",
"- `iou_threshold` (float): Intersection over Union (IoU) threshold for\n",
"assigning object IDs across time steps. Default is 0.5.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"algo = py4dgeo.LevelSetAlgorithm(\n",
" first_timestep=0,\n",
" last_timestep=4,\n",
" timestep_interval=10,\n",
" alpha=0.1,\n",
" iou_threshold=0.5,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"algo.run(analysis)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"objects = analysis.objects\n",
"print(objects)\n",
"test_obj = objects[0]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"test_obj.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"test_obj.change_histogram(nbins_x=10)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Overview of the most important object properties\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"test_obj.gdf"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Filter for the positive or negative files used in the object generation\n",
"print(\"Filter:\", test_obj.filter)\n",
"print(\"Timestep interval:\", test_obj.interval)\n",
"print(\"IoU threshold:\", test_obj.iou_thresholds)\n",
"\n",
"\n",
"# The .timesteps attribute is a list of all the timesteps available for this object.\n",
"# These indices correspond to the first epoch of the calculation\n",
"# These indices are used for the .indices, .coordinates and .distances attributes\n",
"\n",
"print(\"Timesteps:\", test_obj.timesteps)\n",
"print(\"Indices keys:\", test_obj.indices.keys())\n",
"print(\"Coordinates keys:\", test_obj.coordinates.keys())\n",
"print(\"distances keys:\", test_obj.distances.keys())\n",
"\n",
"# print(test_obj.indices[0])\n",
"# print(test_obj.coordinates[0])\n",
"# print(test_obj.distances[0])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
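The interval pairing described in the notebook's options cell can be sketched in plain Python. This is an illustration of the documented behavior, not part of the py4dgeo API; the function name `interval_pairs` is hypothetical:

```python
def interval_pairs(first_timestep=0, last_timestep=75, timestep_interval=24):
    """Enumerate the (f1, f2) epoch pairs compared at each step.

    Each step compares epoch f1 with epoch f1 + timestep_interval,
    with f1 running from first_timestep up to and including last_timestep.
    """
    return [
        (f1, f1 + timestep_interval)
        for f1 in range(first_timestep, last_timestep + 1)
    ]


pairs = interval_pairs()
print(pairs[0])   # (0, 24)
print(pairs[3])   # (3, 27)
print(pairs[-1])  # (75, 99)
```

With `last_timestep=75` and `timestep_interval=24`, the highest epoch index accessed is 99, matching the example in the markdown cell.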
10 changes: 10 additions & 0 deletions pyproject.toml
@@ -39,6 +39,16 @@ dependencies = [
"scikit-learn",
"vedo",
"xdg",
"geopandas",
"shapely",
"pandas",
"scipy",
"plotly",
"alphashape",
"networkx",
"tqdm",
]

# Command line scripts installed as part of the installation
2 changes: 2 additions & 0 deletions src/py4dgeo/__init__.py
@@ -30,3 +30,5 @@
)

from py4dgeo.pbm3c2 import *

from py4dgeo.levelset import LevelSetAlgorithm
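The IoU-based object ID assignment mentioned in the shape-analysis options can be illustrated with a self-contained sketch for axis-aligned bounding boxes. This is a simplification — the PR presumably computes IoU on the extracted alpha-shape polygons via shapely/geopandas — and `box_iou` and `same_object` are hypothetical helper names, not py4dgeo functions:

```python
def box_iou(a, b):
    """Intersection over Union of two axis-aligned boxes (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # overlap width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # overlap height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def same_object(a, b, iou_threshold=0.5):
    """Assign the same object ID across time steps if the shapes overlap enough."""
    return box_iou(a, b) >= iou_threshold


print(box_iou((0, 0, 2, 2), (1, 1, 3, 3)))          # 1/7 ≈ 0.142857
print(same_object((0, 0, 2, 2), (0.5, 0, 2.5, 2)))  # True (IoU = 3/5 = 0.6)
```

With the default `iou_threshold=0.5`, two detections in consecutive time steps keep the same object ID only when their shapes overlap by at least half of their combined footprint.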