[ENH] Adding Pipeline support to Development #376
@jbogaardt from my understanding, it looks like just adding self.w_ to the product here would work? I'm probably not thinking through all the other side effects.
I can't think of any harm from that. Assuming …
do you think there's value in standardizing all the n_period and drop functions? right now some are passing parameters, some are not, etc. what if they all retrieve parameters from self and directly modify self.w_?
That makes sense. Passing arguments around that are already bound to …
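The idea discussed above, where each drop helper reads its parameters from self and mutates self.w_ directly, could look roughly like this minimal sketch (class and method names here are hypothetical, not the chainladder implementation):

```python
# Minimal sketch, NOT the chainladder implementation: each drop helper
# reads its own parameter from self and mutates self.w_ in place, so
# callers never pass drop parameters around.
class DevelopmentBase:
    def __init__(self, n_periods=-1, drop_low=0):
        self.n_periods = n_periods
        self.drop_low = drop_low

    def fit(self, factors):
        self.factors_ = list(factors)
        self.w_ = [1] * len(factors)   # 1 = keep the factor, 0 = drop it
        self._apply_n_periods()
        self._apply_drop_low()
        return self

    def _apply_n_periods(self):
        # keep only the most recent n_periods observations
        if self.n_periods > 0:
            for i in range(len(self.w_) - self.n_periods):
                self.w_[i] = 0

    def _apply_drop_low(self):
        # zero out the drop_low smallest factors among those still kept
        kept = [i for i, w in enumerate(self.w_) if w]
        for i in sorted(kept, key=lambda j: self.factors_[j])[: self.drop_low]:
            self.w_[i] = 0
```

Because the parameters live on self, every helper has the same zero-argument call signature, which is what makes the "some are passing parameters, some are not" inconsistency go away.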
after some more tinkering, as well as reflecting on how n_period and drops are being used elsewhere, i take back what i said about w_. here is where my head's at now. It might make sense to add a function set_weights(factor, secondary_rank, previous_weight, n_periods, drop_high...) under development base, intentionally skipping the leading _ and passing in self. Then all development classes can use this function to drop undesirable factors (the current development.fit approach can bug; see snippet below), regardless of what kind of factor (Munich CL can drop paid-to-incurred or residuals; Outstanding can drop paid-to-reserve or reserve-to-reserve factors).

```python
genins = cl.load_sample("genins")
genins_dev = cl.Development().fit_transform(genins)
genins_cl = cl.Chainladder().fit(genins_dev)
genins_incr = cl.IncrementalAdditive(
    average="volume", n_periods=6, drop_high=1, drop_low=1
).fit_transform(genins, sample_weight=genins_cl.ultimate_ * 5)
print(genins_incr.tri_zeta)
print(genins_incr.fit_zeta_)
```

i'm gonna think this through some more.
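The proposed shared helper might be sketched roughly as follows. The signature follows the comment above, but the exact parameter handling is an assumption, secondary_rank is omitted for brevity, and the real chainladder logic would operate on triangles rather than flat arrays:

```python
import numpy as np

# Rough sketch of the proposed module-level helper: no leading
# underscore, estimator passed in explicitly, drop parameters read
# from self, previous weights taken as an input. Illustrative only.
def set_weights(self, factor, previous_weight):
    """Apply n_periods, drop_low, and drop_high to a weight vector."""
    w = previous_weight.astype(float).copy()
    if getattr(self, "n_periods", -1) > 0:
        w[: max(len(w) - self.n_periods, 0)] = 0.0   # keep recent periods only
    kept = np.where(w > 0)[0]
    order = kept[np.argsort(factor[kept])]           # kept indices, low to high
    for i in order[: getattr(self, "drop_low", 0)]:
        w[i] = 0.0                                   # drop the lowest factors
    for i in order[len(order) - getattr(self, "drop_high", 0):]:
        w[i] = 0.0                                   # drop the highest factors
    return w
```

Because the previous weights are an input and the updated weights are returned, any development class could call the same helper on whatever kind of factor it works with, which is the generalization described above.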
created #385 to move this forward. i wrote a new set of drop functions for better access and uniformity. i couldn't think of a situation where the drop parameters would vary for any of the development classes during fit, so these new functions still retrieve drop parameters from self. however, the ldf triangle is an input, in order to generalize across all development methods. these new functions fix some existing bugs and will streamline pipeline support. but it's a fairly major change, so lots of testing is needed.
Thank you @henrydingliu. I agree we should be liberal with testing. The more varied test cases we can add to test_development.py, the better. Test coverage is not a perfect measure, but it is a decent proxy for code quality. This PR makes our code coverage drop by about 2%. BTW, you should be able to create branches directly on casact/chainladder-python now rather than your own fork.
i'll add some of my test code into test_development. will also figure out branches.
my plan for next steps: …
created #393 to beta both _set_weight_func and pipeline. couldn't figure out how to directly commit from my dev environment to a branch here. in the future, i will commit to my fork, pull to a branch, wait for tests, then ask to merge to main.
You can remove your local fork of this repo and then clone (not fork) directly from casact/chainladder-python. From there you should have privileges to make new branches and commit to any branches (other than master). |
I tried that but got a 403 access denied when I pushed to a new branch here. Will try again after I get this latest batch of code merged. |
created #397 to move this forward |
Maybe @kennethshsu has an idea since you and he share the same access levels on this repo. |
I usually just do …

Does this not work? I don't remember having to do any additional setup.
i switched from a beta token to a classic token and it's working now. so i think it was permissioning with the beta token. |
Is your feature request related to a problem? Please describe.
within the default development class, ata selections (drop, n_period, etc.) are performed in a fixed order. a custom order is not possible since Development currently does not use the previous ata selections (e.g. Development(n_period=4).fit_transform(Development(drop_low=1).fit_transform(tri)) is not equal to Development(n_period=4, drop_low=1).fit_transform(tri))
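A toy illustration (plain NumPy, not the actual chainladder internals) of why a fixed application order is not equivalent to chaining the selections:

```python
import numpy as np

# Toy weight-selection steps; 1 = factor kept, 0 = dropped.
def drop_low(w, factor, k=1):
    out = w.copy()
    kept = np.where(out > 0)[0]
    out[kept[np.argsort(factor[kept])][:k]] = 0   # drop k lowest kept factors
    return out

def n_periods(w, k=4):
    out = w.copy()
    out[: len(out) - k] = 0                       # keep only the last k periods
    return out

factor = np.array([0.5, 2.0, 1.1, 1.2, 1.3, 0.9])
w = np.ones(6)

# drop_low first removes the 0.5 factor, then n_periods trims the tail:
a = n_periods(drop_low(w, factor))
# n_periods first already excludes 0.5, so drop_low removes 0.9 instead:
b = drop_low(n_periods(w), factor)
print(a)  # [0. 0. 1. 1. 1. 1.]
print(b)  # [0. 0. 1. 1. 1. 0.]
```

The two orderings disagree on which factor gets dropped, which is exactly why persisting the previous selections (rather than re-applying everything in one fixed order) matters for pipeline support.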
Is your feature request at odds with the scope of the package?
this feature is strictly for reserving and would offer greater alignment with the sklearn API
Describe the solution you'd like
let the previous ata selections persist when further transforming
Describe alternatives you've considered
parameterize the order of selection to eliminate the need for chaining/pipelining development transformers. messy
Additional context
probably clean up ata selection logic in the meantime