Alternative to time-origin tof computation #577

Merged 57 commits on Dec 11, 2024
Commits
aa5185b
update disk_chopper and essbeamline fake from other PR. Start adding …
nvaytet Nov 9, 2024
bfd785e
complete provider setup for alternate tof workflow
nvaytet Nov 11, 2024
4e682fd
fix implementation to handle multi-pixels
nvaytet Nov 11, 2024
23c896f
start purging old workflow
nvaytet Nov 12, 2024
83a02be
use FramePeriod and start looking at the tests
nvaytet Nov 15, 2024
de66260
Merge branch 'main' into new-wfm
nvaytet Nov 18, 2024
65c6370
fix wfm tests
nvaytet Nov 18, 2024
d3f3d0a
update some tests
nvaytet Nov 18, 2024
fb203ef
add providers for the case where there are no choppers
nvaytet Nov 19, 2024
19097f4
fix for histogrammed input data
nvaytet Nov 19, 2024
e1801c5
order of dimension in operation
nvaytet Nov 22, 2024
cf1741d
add re-histogramming provider
nvaytet Nov 22, 2024
22d7b4e
dimension fixes in rehistogram and fix test
nvaytet Nov 22, 2024
bb14f32
use ess fake in pulse skipping tests
nvaytet Nov 23, 2024
c518155
fix pulse skipping tests
nvaytet Nov 25, 2024
75869db
fix dream wfm test and remove old to_time_of_flight test covered by t…
nvaytet Nov 25, 2024
e710488
cleanup unwrap code
nvaytet Nov 25, 2024
ecd6988
cleanup
nvaytet Nov 25, 2024
cc1fbd3
update notebooks in docs
nvaytet Nov 25, 2024
ad490f0
use polygon least-squares method which yields better results
nvaytet Nov 26, 2024
9849337
use half pulse offset when we have no choppers
nvaytet Nov 26, 2024
140a30a
add event data generator to docs module. This should probably just go…
nvaytet Nov 26, 2024
ff3b8e7
update frame unwrapping notebook
nvaytet Nov 27, 2024
15618f7
update tof version
nvaytet Nov 27, 2024
a8f9df5
Apply automatic formatting
pre-commit-ci-lite[bot] Nov 27, 2024
094e9bc
formatting
nvaytet Nov 27, 2024
e785665
Merge branch 'main' into new-wfm
nvaytet Nov 27, 2024
e791db2
update tof version in buildconfig
nvaytet Nov 27, 2024
bc157aa
fix sections in doc notebook
nvaytet Nov 27, 2024
bf0c7bd
maybe the no_choppers providers are not needed?
nvaytet Nov 29, 2024
5932fab
remove no choppers providers
nvaytet Nov 29, 2024
10fbd6c
check frame extent for every pixel
nvaytet Nov 29, 2024
d246b6d
add comment about time zero origin
nvaytet Nov 29, 2024
c399113
rename lookup types that were actually not yet lookups at that stage,…
nvaytet Nov 29, 2024
40e3253
fix docs notebook and add comment to describe x and y
nvaytet Dec 2, 2024
c36e02a
Merge branch 'main' into new-wfm
nvaytet Dec 2, 2024
97e0336
use nanmedian for new bin width to remove outliers, and add basic han…
nvaytet Dec 4, 2024
d0a76c6
move polygon approximation code into a separate function
nvaytet Dec 4, 2024
6602bcf
add comments
nvaytet Dec 4, 2024
85f9384
use uniform distribution instead of normal to generate events in bins
nvaytet Dec 4, 2024
e288a5c
fix frame at detector in the case where pulse skipping chopper has 18…
nvaytet Dec 9, 2024
8a1c0fb
add test with 180deg offset with pulse skipping
nvaytet Dec 9, 2024
e4925c8
Apply automatic formatting
pre-commit-ci-lite[bot] Dec 9, 2024
1ca97b1
start adding PulseStrideOffset so we can set which pulse number in th…
nvaytet Dec 9, 2024
7a801f8
add default workflow params and start adding test where first half of…
nvaytet Dec 10, 2024
8e91f95
Apply automatic formatting
pre-commit-ci-lite[bot] Dec 10, 2024
4b51d09
fix test with half frame missing
nvaytet Dec 10, 2024
5de689b
Merge branch 'new-wfm' of github.com:scipp/scippneutron into new-wfm
nvaytet Dec 10, 2024
accc51e
add to_events function and use that in re-histogram function
nvaytet Dec 10, 2024
6a48eab
fix tof tests
nvaytet Dec 10, 2024
202a542
update notebook on frame unwrapping
nvaytet Dec 10, 2024
24ca971
Merge branch 'main' into new-wfm
nvaytet Dec 10, 2024
b9bed49
update wfm notebooks
nvaytet Dec 10, 2024
e29b9e7
Apply automatic formatting
pre-commit-ci-lite[bot] Dec 10, 2024
9956fcd
fix dream notebook
nvaytet Dec 10, 2024
4af881e
Merge branch 'new-wfm' of github.com:scipp/scippneutron into new-wfm
nvaytet Dec 10, 2024
c3823a6
add support for masks in to_events
nvaytet Dec 11, 2024
2 changes: 1 addition & 1 deletion .buildconfig/ci-linux.yml
@@ -39,7 +39,7 @@ dependencies:
- sphinx-copybutton==0.5.2
- sphinx-design==0.6.1
- sphinxcontrib-bibtex==2.6.3
- tof==24.10.0
- tof==24.12.0

# docs and tests
- sciline==24.10.0
746 changes: 676 additions & 70 deletions docs/user-guide/chopper/frame-unwrapping.ipynb

Large diffs are not rendered by default.

100 changes: 0 additions & 100 deletions docs/user-guide/chopper/frameunwrapping/__init__.py

This file was deleted.

59 changes: 11 additions & 48 deletions docs/user-guide/wfm/dream-wfm.ipynb
@@ -261,7 +261,7 @@
"metadata": {},
"outputs": [],
"source": [
"raw_data = ess_beamline.get_monitor(\"detector\")\n",
"raw_data = ess_beamline.get_monitor(\"detector\")[0]\n",
"\n",
"# Visualize\n",
"raw_data.hist(event_time_offset=300).sum(\"pulse\").plot()"
@@ -385,14 +385,9 @@
"metadata": {},
"outputs": [],
"source": [
"workflow = sl.Pipeline(\n",
" unwrap.unwrap_providers()\n",
" + unwrap.time_of_flight_providers()\n",
" + unwrap.time_of_flight_origin_from_choppers_providers(wfm=True)\n",
")\n",
"workflow = sl.Pipeline(unwrap.providers(), params=unwrap.params())\n",
"\n",
"workflow[unwrap.PulsePeriod] = sc.reciprocal(ess_beamline.source.frequency)\n",
"workflow[unwrap.PulseStride | None] = None\n",
"workflow[unwrap.SourceTimeRange] = pulse_tmin, pulse_tmax\n",
"workflow[unwrap.SourceWavelengthRange] = pulse_wmin, pulse_wmax\n",
"workflow[unwrap.Choppers] = choppers\n",
@@ -420,7 +415,7 @@
"metadata": {},
"outputs": [],
"source": [
"bounds = workflow.compute(unwrap.CleanFrameAtDetector).subbounds()\n",
"bounds = workflow.compute(unwrap.FrameAtDetector).subbounds()\n",
"bounds"
]
},
@@ -597,7 +592,7 @@
"outputs": [],
"source": [
"raw_data = sc.concat(\n",
" [ess_beamline.get_monitor(key) for key in monitors.keys()],\n",
" [ess_beamline.get_monitor(key)[0] for key in monitors.keys()],\n",
" dim='detector_number',\n",
" )\n",
"\n",
@@ -743,44 +738,8 @@
"id": "48",
"metadata": {},
"source": [
"We now update our workflow with the new choppers and raw data,\n",
"compute the frame bounds and overlay them on the figure."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "49",
"metadata": {},
"outputs": [],
"source": [
"workflow[unwrap.Choppers] = choppers\n",
"workflow[unwrap.Ltotal] = Ltotal\n",
"workflow[unwrap.RawData] = ess_beamline.get_monitor(\"detector\")\n",
"\n",
"\n",
"bounds = workflow.compute(unwrap.CleanFrameAtDetector).subbounds()\n",
"\n",
"for b in sc.collapse(bounds[\"time\"], keep=\"bound\").values():\n",
" cascade_ax.axvspan(\n",
" b[0].to(unit=\"ms\").value,\n",
" b[1].to(unit=\"ms\").value,\n",
" color=\"gray\",\n",
" alpha=0.2,\n",
" zorder=-5,\n",
" )\n",
"\n",
"cascade_fig"
]
},
{
"cell_type": "markdown",
"id": "50",
"metadata": {},
"source": [
"We see that instead of having overlapping bounds, the region in the middle has been excluded.\n",
"\n",
"This means that the neutrons in the overlapping region will be discarded in the time-of-flight calculation\n",
"To avoid the overlap in time, the region in the middle will be excluded,\n",
"discarding the neutrons from the time-of-flight calculation\n",
"(in practice, they are given a NaN value as a time-of-flight).\n",
"\n",
"This is visible when comparing to the true neutron wavelengths,\n",
@@ -790,10 +749,14 @@
{
"cell_type": "code",
"execution_count": null,
"id": "51",
"id": "49",
"metadata": {},
"outputs": [],
"source": [
"workflow[unwrap.Choppers] = choppers\n",
"workflow[unwrap.Ltotal] = Ltotal\n",
"workflow[unwrap.RawData] = ess_beamline.get_monitor(\"detector\")[0]\n",
"\n",
"# Compute time-of-flight\n",
"tofs = workflow.compute(unwrap.TofData)\n",
"# Compute wavelength\n",
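
The dream-wfm.ipynb changes above collapse the old three-part provider setup (unwrap_providers, time_of_flight_providers, time_of_flight_origin_from_choppers_providers) and the explicit PulseStride parameter into a single providers()/params() pair, and fold the separate workflow-update cell into the time-of-flight computation cell. A condensed sketch of the resulting setup, assuming the notebook's earlier objects (ess_beamline, choppers, Ltotal, pulse_tmin/pulse_tmax, pulse_wmin/pulse_wmax) are already defined and that unwrap is scippneutron's unwrap module (import path assumed):

# Condensed from the updated notebook cells shown in the diff above.
import scipp as sc
import sciline as sl
from scippneutron.tof import unwrap  # assumed import path

workflow = sl.Pipeline(unwrap.providers(), params=unwrap.params())

workflow[unwrap.PulsePeriod] = sc.reciprocal(ess_beamline.source.frequency)
workflow[unwrap.SourceTimeRange] = pulse_tmin, pulse_tmax
workflow[unwrap.SourceWavelengthRange] = pulse_wmin, pulse_wmax
workflow[unwrap.Choppers] = choppers
workflow[unwrap.Ltotal] = Ltotal
workflow[unwrap.RawData] = ess_beamline.get_monitor("detector")[0]

# Events falling in the excluded overlap region get NaN time-of-flight,
# as described in the notebook text above.
tofs = workflow.compute(unwrap.TofData)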
13 changes: 4 additions & 9 deletions docs/user-guide/wfm/wfm-time-of-flight.ipynb
@@ -282,7 +282,7 @@
"metadata": {},
"outputs": [],
"source": [
"raw_data = ess_beamline.get_monitor(\"detector\")\n",
"raw_data = ess_beamline.get_monitor(\"detector\")[0]\n",
"\n",
"# Visualize\n",
"raw_data.hist(event_time_offset=300).sum(\"pulse\").plot()"
@@ -368,10 +368,10 @@
"outputs": [],
"source": [
"# Chop the frames\n",
"frames = frames.chop(choppers.values())\n",
"chopped = frames.chop(choppers.values())\n",
"\n",
"# Propagate the neutrons to the detector\n",
"at_sample = frames.propagate_to(Ltotal)\n",
"at_sample = chopped.propagate_to(Ltotal)\n",
"\n",
"# Visualize the results\n",
"cascade_fig, cascade_ax = at_sample.draw()"
@@ -406,14 +406,9 @@
"metadata": {},
"outputs": [],
"source": [
"workflow = sl.Pipeline(\n",
" unwrap.unwrap_providers()\n",
" + unwrap.time_of_flight_providers()\n",
" + unwrap.time_of_flight_origin_from_choppers_providers(wfm=True)\n",
")\n",
"workflow = sl.Pipeline(unwrap.providers(), params=unwrap.params())\n",
"\n",
"workflow[unwrap.PulsePeriod] = sc.reciprocal(ess_beamline.source.frequency)\n",
"workflow[unwrap.PulseStride | None] = None\n",
"workflow[unwrap.SourceTimeRange] = time_min, time_max\n",
"workflow[unwrap.SourceWavelengthRange] = wavs_min, wavs_max\n",
"workflow[unwrap.Choppers] = choppers\n",
4 changes: 2 additions & 2 deletions requirements/base.txt
@@ -5,11 +5,11 @@
#
# pip-compile-multi
#
contourpy==1.3.0
contourpy==1.3.1
# via matplotlib
cycler==0.12.1
# via matplotlib
fonttools==4.54.1
fonttools==4.55.0
# via matplotlib
h5py==3.12.1
# via
13 changes: 10 additions & 3 deletions requirements/basetest.txt
@@ -15,21 +15,26 @@ charset-normalizer==3.4.0
# via requests
comm==0.2.2
# via ipywidgets
contourpy==1.3.0
contourpy==1.3.1
# via matplotlib
cyclebane==24.10.0
# via sciline
cycler==0.12.1
# via matplotlib
decorator==5.1.1
# via ipython
exceptiongroup==1.2.2
# via
# hypothesis
# ipython
# pytest
execnet==2.1.1
# via pytest-xdist
executing==2.1.0
# via stack-data
fonttools==4.54.1
fonttools==4.55.0
# via matplotlib
hypothesis==6.118.8
hypothesis==6.119.4
# via -r basetest.in
idna==3.10
# via requests
@@ -122,6 +127,8 @@ sortedcontainers==2.4.0
# via hypothesis
stack-data==0.6.3
# via ipython
tomli==2.1.0
# via pytest
traitlets==5.14.3
# via
# comm
8 changes: 7 additions & 1 deletion requirements/ci.txt
@@ -44,9 +44,15 @@ requests==2.32.3
# via -r ci.in
smmap==5.0.1
# via gitdb
tomli==2.1.0
# via
# pyproject-api
# tox
tox==4.23.2
# via -r ci.in
typing-extensions==4.12.2
# via tox
urllib3==2.2.3
# via requests
virtualenv==20.27.1
virtualenv==20.28.0
# via tox
14 changes: 7 additions & 7 deletions requirements/dev.txt
@@ -34,23 +34,23 @@ click==8.1.7
# pip-tools
copier==9.4.1
# via -r dev.in
dunamai==1.22.0
dunamai==1.23.0
# via copier
fqdn==1.5.1
# via jsonschema
funcy==2.0
# via copier
h11==0.14.0
# via httpcore
httpcore==1.0.6
httpcore==1.0.7
# via httpx
httpx==0.27.2
# via jupyterlab
isoduration==20.11.0
# via jsonschema
jinja2-ansible-filters==1.3.2
# via copier
json5==0.9.28
json5==0.10.0
# via jupyterlab-server
jsonpointer==3.0.0
# via jsonschema
@@ -71,7 +71,7 @@ jupyter-server==2.14.2
# notebook-shim
jupyter-server-terminals==0.5.3
# via jupyter-server
jupyterlab==4.3.0
jupyterlab==4.3.1
# via -r dev.in
jupyterlab-server==2.27.3
# via jupyterlab
@@ -91,9 +91,9 @@ prometheus-client==0.21.0
# via jupyter-server
pycparser==2.22
# via cffi
pydantic==2.9.2
pydantic==2.10.2
# via copier
pydantic-core==2.23.4
pydantic-core==2.27.1
# via pydantic
python-json-logger==2.0.7
# via jupyter-events
@@ -131,7 +131,7 @@ webcolors==24.11.1
# via jsonschema
websocket-client==1.8.0
# via jupyter-server
wheel==0.45.0
wheel==0.45.1
# via pip-tools

# The following packages are considered to be unsafe in a requirements file: