PDA Updater #913
Conversation
… which works best
```python
posterior_mean += kalman_gain @ sum_of_innovations
posterior_covariance += \
    kalman_gain @ (sum_of_weighted_cov - sum_of_innovations @ sum_of_innovations.T) \
    @ kalman_gain.T
```
Do these lines work if the check on line 162 is False? I can't work out where `posterior_mean`, `posterior_covariance` or `kalman_gain` would come from in that case.
Ignore me, I think I've got it now: the first hypothesis is always a `MissedDetection` hypothesis, right?
> the first hypothesis is always a `MissedDetection` hypothesis, right?

It seems to be, but that's a matter for the `associator.associate()` method. I think this should work whatever the order; it exploits the fact that `not hypothesis` returns `True` for a missed detection (not sure why that is), calculates the required mean and covariance, and attaches a zero innovation.
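For anyone else reading the thread, here is a minimal sketch of the branch structure being discussed. It is illustrative only, not the verbatim Stone Soup code: the attribute names are taken from the diff quoted below, and seeding the posterior directly from the prediction is a simplification of what the PR actually computes in that branch.

```python
import numpy as np

def pda_branches(hypotheses):
    # Sketch of the loop under discussion. A MissedDetection hypothesis
    # evaluates as falsy, so whichever position it occupies in `hypotheses`,
    # the `not hypothesis` branch runs exactly once for it.
    for hypothesis in hypotheses:
        if not hypothesis:
            # Missed detection: this is where the PR calculates the required
            # posterior mean/covariance and Kalman gain; here we just seed
            # from the prediction for illustration, and attach a zero
            # innovation so this hypothesis adds nothing to the sums.
            posterior_mean = hypothesis.prediction.state_vector
            posterior_covariance = hypothesis.prediction.covar
            innovation = np.zeros_like(posterior_mean)
        else:
            # Detection hypothesis: only the innovation is needed here.
            innovation = (hypothesis.measurement.state_vector
                          - hypothesis.measurement_prediction.state_vector)
```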
```python
# probably exists a less clunky way of doing this using exists() or overwritten +=
# All these floats should be redundant if/when the bug in Probability.__mul__() is
# fixed.
if n == 0:
    sum_of_innovations = float(hypothesis.probability) * innovation
    sum_of_weighted_cov = float(hypothesis.probability) * (innovation @ innovation.T)
```
Set these two sum variables to 0 outside the loop?
Wasn't too sure whether the proposed change would work, but a simple test suggests it does; NumPy broadcasts the scalar 0 against the array:

```python
>>> import numpy as np
>>> a = 0
>>> a += np.array([[-5], [2]])
>>> a
array([[-5],
       [ 2]])
```
So,

```python
sum_of_innovations = 0
sum_of_innovations += hypothesis.probability * innovation
```

?
> So,
> ```python
> sum_of_innovations = 0
> sum_of_innovations += hypothesis.probability * innovation
> ```
> ?

But it won't matter if :class:`Probability` is corrected.
```diff
diff --git a/stonesoup/updater/probability.py b/stonesoup/updater/probability.py
index c471e514..c6495111 100644
--- a/stonesoup/updater/probability.py
+++ b/stonesoup/updater/probability.py
@@ -156,6 +156,8 @@ class PDAUpdater(ExtendedKalmanUpdater):
             The covariance of the reduced/single Gaussian
         """
+        sum_of_innovations = 0
+        sum_of_weighted_cov = 0
         for n, hypothesis in enumerate(hypotheses):
             # Check for the existence of an associated measurement. Because of the way the
             # hypothesis is constructed, you can do this:
@@ -174,15 +176,10 @@ class PDAUpdater(ExtendedKalmanUpdater):
                 innovation = hypothesis.measurement.state_vector - \
                     hypothesis.measurement_prediction.state_vector

-            # probably exists a less clunky way of doing this using exists() or overwritten +=
             # All these floats should be redundant if/when the bug in Probability.__mul__() is
             # fixed.
-            if n == 0:
-                sum_of_innovations = float(hypothesis.probability) * innovation
-                sum_of_weighted_cov = float(hypothesis.probability) * (innovation @ innovation.T)
-            else:
-                sum_of_innovations += float(hypothesis.probability) * innovation
-                sum_of_weighted_cov += float(hypothesis.probability) * (innovation @ innovation.T)
+            sum_of_innovations += float(hypothesis.probability) * innovation
+            sum_of_weighted_cov += float(hypothesis.probability) * (innovation @ innovation.T)

         posterior_mean += kalman_gain @ sum_of_innovations
         posterior_covariance += \
```
Looking at this made me wonder whether we could make this more modular by using a …

The above would pave the way for a …
Something similar was done for the DIEKF in #891: https://github.com/dstl/Stone-Soup/blob/e27fc3639d791d2441b6c4b8572c57932651deb3/stonesoup/updater/iterated.py
Per @sglvladi's comment, I think it would be good to change the structure to something more flexible, but I'll merge this for now.
Provides an updater based on probabilistic data association. See the documentation in the Updaters section, or have a look at this gist: https://gist.github.com/jmbarr/92dc83e28c04026136d4f8706a1159c1
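For anyone who wants to see the computation end to end, below is a minimal NumPy sketch of the PDA moment-matching update discussed in this thread. It is illustrative only, not the Stone Soup implementation: the weights `betas` (with `betas[0]` the missed-detection weight, all summing to 1), the per-hypothesis `innovations` (zero for the missed detection), the gain `kalman_gain` and the measurement-updated covariance `P_post` are all assumed to be supplied by the caller.

```python
import numpy as np

def pda_update_sketch(x_pred, P_pred, P_post, kalman_gain, betas, innovations):
    """Illustrative PDA moment-matching update (not Stone Soup's API).

    betas[0] is the missed-detection weight; innovations are column vectors,
    with a zero innovation for the missed detection.
    """
    # Accumulate the weighted innovation and weighted outer-product sums,
    # starting from scalar 0 (NumPy broadcasting makes this safe, as the
    # test earlier in the thread shows).
    sum_of_innovations = 0
    sum_of_weighted_cov = 0
    for beta, nu in zip(betas, innovations):
        sum_of_innovations += beta * nu
        sum_of_weighted_cov += beta * (nu @ nu.T)

    # Combined mean: prediction plus gain times the combined innovation.
    posterior_mean = x_pred + kalman_gain @ sum_of_innovations

    # Combined covariance: weighted prior/updated covariances plus the
    # spread-of-innovations term.
    posterior_covariance = (
        betas[0] * P_pred
        + (1 - betas[0]) * P_post
        + kalman_gain
        @ (sum_of_weighted_cov - sum_of_innovations @ sum_of_innovations.T)
        @ kalman_gain.T
    )
    return posterior_mean, posterior_covariance


# Toy usage: one missed-detection and two measurement hypotheses, with the
# measurement space equal to the state space for simplicity.
x = np.array([[0.0], [1.0]])
P = np.eye(2)
K = 0.5 * np.eye(2)
betas = [0.2, 0.5, 0.3]
nus = [np.zeros((2, 1)), np.array([[0.4], [-0.1]]), np.array([[-0.2], [0.3]])]
mean, cov = pda_update_sketch(x, P, 0.5 * P, K, betas, nus)
```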