Jp-3207 ifu outlier detection #7590
Conversation
@hbushouse @tapastro I am stuck. I have the IFU outlier rejection code flagging outliers, but the way outlier_detection_step is set up, I cannot figure out how to update the input models with the flagged DQs. The flagged DQ arrays and the corresponding SCI arrays (which are NaN) are not returned, so the output files that get written are not flagged. Given how outlier_detection is set up, I cannot work out how to return the updated arrays. I have looked in detail at how ...
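For context only, a minimal sketch of the in-place approach being discussed here, not the PR's actual implementation: it assumes the outlier masks have already been computed, `input_models` is a ModelContainer, the helper name is made up, and the flag names come from jwst.datamodels.dqflags.

```python
import numpy as np
from jwst.datamodels import dqflags

def flag_outliers_in_place(input_models, outlier_masks):
    """Sketch: push already-computed outlier masks back into the input models.

    input_models: ModelContainer; outlier_masks: matching list of boolean arrays.
    """
    for model, mask in zip(input_models, outlier_masks):
        # Mark the rejected pixels in the DQ array ...
        model.dq[mask] |= dqflags.pixel['DO_NOT_USE'] | dqflags.pixel['OUTLIER']
        # ... and blank the corresponding science pixels.
        model.data[mask] = np.nan
    return input_models
```

Whether modifying the container in place is enough, or whether the step machinery needs the container returned explicitly, is exactly the open question in the comment above.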
Codecov Report
Patch coverage:
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #7590      +/-   ##
==========================================
- Coverage   77.27%   77.01%   -0.26%
==========================================
  Files         456      456
  Lines       36629    36625       -4
==========================================
- Hits        28304    28208      -96
- Misses       8325     8417      +92
Force-pushed from 0a529c7 to 38e8449
@hbushouse I did not mean to put requirements-max.txt into this PR. I think I removed it, but could you check that I did that correctly before we merge?
@hbushouse So how do we record changes made to datamodels in stdatamodels? I put a note under datamodels and a note in the stdatamodels CHANGES log.
You can include a link to the stdatamodels PR using the following syntax:
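(The snippet itself did not survive the export; presumably it is the standard GitHub cross-repository reference, e.g. `spacetelescope/stdatamodels#164`, which GitHub renders as a link to that PR.)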
Looks good to me; I've tested it on a few example data sets and it seems to be working as expected.
pars files are ready for both outlier_detection and an updated spec3. The spec3 file also contains updated extract1d calls for the MRS extract1d changes PR.
Force-pushed from f8276df to 36b4829
@jemorrison I believe [...] Also, have you run the code with at least two different detectors? I'm trying to figure out the logic.
While looking at the open-files issues in stdatamodels and jwst, I noticed that this PR should also include an update to jwst/jwst/datamodels/tests/test_api.py (line 30, as of 21c7b51): the addition of a new submodule and model in spacetelescope/stdatamodels#164 will cause that test to fail.
@nden I have run this code on NIRSpec and MIRI data, using an association for each instrument that contains 2 detectors. The algorithm pulls the data from one detector and is basically trying to find bad pixels that are not in the bad-pixel mask, because the bad pixels vary with time. So it looks for outliers, with adjacent pixels that are also outliers, on all the input data for that detector.
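For illustration only, and not the algorithm @drlaw1558 developed for this PR: a hedged sketch of the "outlier in every exposure of a detector" idea described above, with made-up helper and threshold names.

```python
import numpy as np
from scipy.ndimage import median_filter

def persistent_outlier_mask(sci_images, nsigma=5.0):
    """Sketch: flag pixels that deviate from their local neighborhood in every exposure.

    sci_images: list of 2-D SCI arrays from one detector; returns a boolean mask.
    """
    combined = None
    for sci in sci_images:
        local = median_filter(sci, size=3)             # local background estimate
        resid = sci - local
        deviant = np.abs(resid) > nsigma * np.nanstd(resid)  # deviant w.r.t. neighbors
        # Keep only pixels that are deviant in *all* exposures so far.
        combined = deviant if combined is None else combined & deviant
    return combined
```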
@nden I needed requirements-max.txt to be changed locally so that I could use the newer version I created to update the outlierParsModel, and I added that change to the PR by mistake. I guess I did not mean to remove requirements-max.txt completely, just my update to it. Is that what you are saying I did?
@nden Made the changes you requested. I removed a few things that were not needed. I am going to run the regression tests now, mostly just to make sure nothing breaks; the regression results for IFU outlier detection will be completely different.
Force-pushed from c25307c to 72b366b
Force-pushed from b04c0f3 to 0d02f2a
@nden @hbushouse
Another (hopefully final?) regtest run started at https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/743/
Not sure I understand why the CI tests are still failing with some kind of error about not finding jwst.datamodels.outlierifuoutput.py, even though that module is now included in this PR, and I think it also has the appropriate change to jwst/setup.cfg to use stdatamodels/master. Everything seems to run fine when using this PR branch in regression tests, so maybe it's just something unique to what the CI tests are trying to test. @zacharyburnett @braingram any ideas?
I believe the CI is throwing errors because the file [...] The 2 errors are because pytest is inspecting [...] I believe the regression tests aren't showing the same error because they don't search the [...] Let me know if it's helpful to document any of this. There is an open issue about improving the documentation on adding a new datamodel, which is likely related: #4436
@braingram OK, I'll try removing the outlierifuoutput.py module from this PR and see what happens.
Yay! CI tests now succeed. Thanks @braingram
Looks like the doc module https://raw.githubusercontent.com/spacetelescope/jwst/master/docs/jwst/outlier_detection/outlier_detection.rst also has some references to IFU mode, which will need updating. If you don't mind, I'll take a stab at updating that and pushing to this PR branch.
Force-pushed from 99ed8c3 to a887537
Woot! CI success. As soon as the latest regtest run at https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/751/console finishes, I'll be merging this.
Co-authored-by: Howard Bushouse <[email protected]>
Resolves JP-3207
Closes #
This PR changes the way outlier detection is done on IFU data, implementing a completely new algorithm developed by @drlaw1558. A change to the outlierpars data model was needed (spacetelescope/stdatamodels#164). In addition, a new data model, outlierifuoutput, was added to stdatamodels to support analysis of the new algorithm. If save_intermediate_results = True, an output file following outlierifuoutput is created.
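A hedged usage example of requesting that diagnostic product (the association file name is hypothetical; save_intermediate_results is the step parameter named in this PR):

```python
from jwst.outlier_detection import OutlierDetectionStep

# Hypothetical input association; save_intermediate_results=True asks the step
# to write the new outlierifuoutput diagnostic file for IFU data.
result = OutlierDetectionStep.call(
    "jw01234-o001_spec3_asn.json",
    save_intermediate_results=True,
)
```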
Order of work flow:
Checklist for maintainers
- added entry in CHANGES.rst within the relevant release section
How to run regression tests on a PR