Add unit conversion ingest node processor #31737
Comments
Pinging @elastic/es-core-infra
Relates to #31244.
Would you expect all of these to be numeric-to-numeric conversions?
That sounds a lot like a general-purpose math processor. Would something like this satisfy your use case(s)?
It could be expanded (if needed) with user-defined functions. Related: I just submitted a PR for converting human-readable byte strings to their byte value (e.g. 1kb -> 1024).
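Purely as an illustration (not an existing Elasticsearch API), a general-purpose math processor might be configured along these lines; the processor name `math`, its `field`, `target_field`, and `formula` options, and the field names are all assumptions:

```json
{
  "math": {
    "field": "uptime_sec",
    "target_field": "uptime_ms",
    "formula": "value * 1000"
  }
}
```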
Can you show what a script doing the conversion looks like, for comparison against the other examples suggested here?
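For comparison, the equivalent conversion with the existing `script` ingest processor and Painless might look roughly like this (a sketch only; the field names are made up):

```json
{
  "script": {
    "lang": "painless",
    "source": "if (ctx.uptime_sec != null) { ctx.uptime_ms = ctx.uptime_sec * 1000 }"
  }
}
```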
Linking #31244 here because, if we add such a processor, it should work well together with a potential duration type like the one in that example.
This has been open for quite a while, and we haven't made much progress on it due to focus on other areas. For now I'm going to close this as something we aren't planning on implementing. We can re-open it later if needed.
Describe the feature:
Working with Filebeat modules, I have come across several instances where I need to convert seconds to milliseconds and kilobytes/megabytes/gigabytes to bytes. While the script processor can perform such unit conversions, a dedicated processor for this task would greatly facilitate module development by eliminating the need to write and rewrite these conversions in Painless. It might look something like this:
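As a rough sketch of the idea (the processor name `unit_convert` and its options are assumptions, not an existing API; field names are made up), converting a value in place might look like:

```json
{
  "unit_convert": {
    "field": "uptime_sec",
    "from": "seconds",
    "to": "milliseconds"
  }
}
```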
Then use the `rename` processor to adjust the field name accordingly (a brief sketch of that step follows below). Even better might be a processor that accepts a conversion factor, or possibly a formula, so it could be applied to any unit of measure; e.g., temperature, speed, energy, leagues, etc.
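For the rename step mentioned above, the existing `rename` processor could be used along these lines (field names are made up):

```json
{
  "rename": {
    "field": "uptime_sec",
    "target_field": "uptime_ms"
  }
}
```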