How do I use HSTU's cache? From the code, it seems the cache is tied to delta_x_offsets, but I'm not sure what delta_x_offsets is supposed to look like.
From hstu.py, the relevant line is flattened_offsets = delta_x_offsets[1] + torch.arange(start=0, end=B * n, step=n, device=delta_x_offsets[1].device, dtype=delta_x_offsets[1].dtype).
Does this mean that each row in the batch has only one element that needs to be updated? In other words, whenever a user has a new item id, is that single item sent to HSTU for an incremental update? My reading is sketched below.
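To check that reading, here is a minimal sketch of what the quoted line appears to compute. The names per_row_pos, cached_states, and new_item_embeddings are my own placeholders, and the assumption that delta_x_offsets[1] holds one position per batch row is inferred only from the arange stride of n, not from the repo's documentation:

```python
import torch

# Assumption (not the repo's documented API): each of the B users in a batch has
# exactly one new item to process incrementally, and delta_x_offsets[1] stores,
# for every row, the position of that new item within the row's padded length n.
B, n, D = 4, 10, 8                        # batch size, padded sequence length, embedding dim
per_row_pos = torch.tensor([3, 7, 0, 5])  # hypothetical new-item position per user, shape [B]

# Same arithmetic as the quoted line from hstu.py: convert per-row positions into
# indices over the flattened [B * n] layout.
flattened_offsets = per_row_pos + torch.arange(
    start=0, end=B * n, step=n, device=per_row_pos.device, dtype=per_row_pos.dtype
)
print(flattened_offsets)  # tensor([ 3, 17, 20, 35]) -> one slot per user in the flat buffer

# Under that assumption, a cached flattened state of shape [B * n, D] could be
# updated in place with just the B new embeddings instead of re-running the
# full sequences through HSTU.
cached_states = torch.zeros(B * n, D)
new_item_embeddings = torch.randn(B, D)   # hypothetical fresh embeddings, one per user
cached_states[flattened_offsets] = new_item_embeddings
```

If that assumption holds, the cache path would indeed correspond to "one new item id per user per call"; I'd appreciate confirmation of what delta_x_offsets[0] is expected to contain in that case.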