Exponential time smoothing #39

Closed
Darel13712 opened this issue Nov 15, 2021 · 0 comments · Fixed by #50
@Darel13712
Contributor

We can add some time-awareness to models by applying time-dependent weights to relevance values. This weighting can happen at three places in the recommendation process:

  1. Before model training
  2. At prediction time before get_top_k method
  3. After prediction, as a way to rerank final recommendations

Regardless of the option we choose (or support all of them), we should have functions that calculate these weights.

Arguments should include

  • decay — the "half-life" of a weight: the number of days after which the weight is reduced by 50%. Probably a float.
  • limit — the minimal value a weight can reach, to avoid zeroing out very old interactions.
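A minimal sketch of such a weight function, assuming each weight is halved every `decay` days of age relative to the most recent timestamp and floored at `limit` (the function name and signature are illustrative, not a final API):

```python
import numpy as np
import pandas as pd

def time_weights(timestamps, decay=30.0, limit=0.1):
    """Exponential half-life weights.

    The most recent timestamp gets weight 1.0; a weight halves every
    `decay` days of age and never drops below `limit`.
    """
    ts = pd.to_datetime(pd.Series(timestamps))
    # age of each interaction in days, relative to the newest one
    age_days = (ts.max() - ts).dt.total_seconds() / 86400.0
    weights = np.power(0.5, age_days / decay)
    return np.maximum(weights, limit)
```

For example, with `decay=30`, an interaction exactly 30 days older than the newest one gets weight 0.5, and anything old enough to fall below `limit` is clamped to `limit`.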

There are two options for calculating weights: per interaction and per item.

Both take a log with timestamp values as input, but their return values differ:

Item weights return a new DataFrame mapping item_id to weight.
Interaction weights modify the log's relevance values in place.
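The two variants could look roughly like this, assuming a pandas log with `item_idx`, `timestamp`, and `relevance` columns (the column names, function names, and signatures are assumptions for illustration, not a final API):

```python
import numpy as np
import pandas as pd

def item_weights(log, decay=30.0, limit=0.1):
    """Per-item variant: return a new DataFrame mapping item_idx to a
    weight based on the item's most recent interaction in the log."""
    last_seen = log.groupby("item_idx", as_index=False)["timestamp"].max()
    age_days = (log["timestamp"].max() - last_seen["timestamp"]).dt.days
    last_seen["weight"] = np.maximum(np.power(0.5, age_days / decay), limit)
    return last_seen[["item_idx", "weight"]]

def smooth_relevance(log, decay=30.0, limit=0.1):
    """Per-interaction variant: scale the relevance column in place."""
    age_days = (log["timestamp"].max() - log["timestamp"]).dt.days
    log["relevance"] *= np.maximum(np.power(0.5, age_days / decay), limit)
    return log
```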

@Darel13712 Darel13712 added the enhancement New feature or request label Nov 15, 2021
@Darel13712 Darel13712 self-assigned this Nov 15, 2021
@Darel13712 Darel13712 added this to the 0.8.0 milestone Nov 17, 2021
@Darel13712 Darel13712 linked a pull request Dec 2, 2021 that will close this issue