
[Transform] Ensure bulk requests are not over memory limit - handle errors due to limits gracefully #60391

Open
hendrikmuhs opened this issue Jul 29, 2020 · 1 comment

Comments

@hendrikmuhs

#58885 introduced write limits; this might affect transform bulk indexing.

At the moment indexing is retried 10 times before the transform finally goes into the FAILED state. There is no differentiation between indexing failing due to the memory limit and failing for some other reason; right now we assume any bulk indexing error is intermittent.

A possible solution would be to handle this the same way as the search side: inspect the failure and, e.g., decrease the page size in the case of a CircuitBreakingException (CBE). A rough sketch of that idea is below.
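For illustration only, here is a minimal sketch of what such handling could look like. The class, field names, and the `MIN_PAGE_SIZE` bound are hypothetical, not the actual transform indexer code; only the Elasticsearch bulk response and exception APIs are real:

```java
import org.elasticsearch.ExceptionsHelper;
import org.elasticsearch.action.bulk.BulkItemResponse;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.common.breaker.CircuitBreakingException;

// Hypothetical handler: distinguishes circuit breaker failures from other
// bulk indexing errors and shrinks the page size instead of burning a retry.
class BulkFailureHandler {
    private static final int MIN_PAGE_SIZE = 10; // hypothetical lower bound

    private int pageSize = 500; // transform default page size

    void handleBulkResponse(BulkResponse response) {
        if (response.hasFailures() == false) {
            return;
        }
        boolean memoryLimitHit = false;
        for (BulkItemResponse item : response.getItems()) {
            if (item.isFailed()
                && ExceptionsHelper.unwrapCause(item.getFailure().getCause()) instanceof CircuitBreakingException) {
                memoryLimitHit = true;
                break;
            }
        }
        if (memoryLimitHit && pageSize > MIN_PAGE_SIZE) {
            // Mirror the search-side behavior: reduce the page size and retry
            // without treating this as an ordinary intermittent failure.
            pageSize = Math.max(MIN_PAGE_SIZE, pageSize / 2);
        }
        // Otherwise fall through to the existing retry/FAILED logic.
    }
}
```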

Severity: As long as the user does not change the default page size (500), it seems unlikely that we hit this error; however, the user can raise the page size to 10k (in the future 65k). Since the user can lower the page size again, there is a workaround for this problem.

@elasticmachine
Collaborator

Pinging @elastic/ml-core (:ml/Transform)
