[OpPerf] Implement remaining random sampling ops #17502
Conversation
@mxnet-label-bot add [pr-awaiting-review]
"args": DEFAULT_ARGS}
"args": DEFAULT_ARGS,
"grid": DEFAULT_GRID,
"data_bilinearsampler": DEFAULT_DATA_BILINEAR,
These seem very operator-specific. What if these arguments change in MXNet 2.0? Does that break this opperf utility? Is there a way to generalize these op-specific params, e.g. by passing them in as a general `args` param?
If these arguments change (in terms of data type or arg name), the opperf utility will fail, as is the case for all of the more general args used in this file. I could use a general `args` param to pass in the op-specific arguments and establish a mapping between keywords and values on the fly, which might be more resilient to potential changes in arg structure (such as arg order or name changes). @ChaiBapchya any thoughts?
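To illustrate the idea being discussed, a general mapping from op names to their custom argument overrides might look like the sketch below. This is a hypothetical shape, not code from opperf; `CUSTOM_OP_ARGS`, `build_args`, and the shapes/values shown are all illustrative assumptions.

```python
# Hypothetical sketch: map op names to op-specific argument overrides
# instead of hard-coding op-specific keys into the shared defaults dict.
CUSTOM_OP_ARGS = {
    "BilinearSampler": {"grid": (32, 2, 256, 256)},   # illustrative shape
    "GridGenerator": {"transform_type": "affine"},    # illustrative value
}

def build_args(op_name, default_args):
    """Merge shared defaults with any op-specific overrides."""
    args = dict(default_args)
    args.update(CUSTOM_OP_ARGS.get(op_name, {}))
    return args
```

An op with no entry in the mapping simply gets the shared defaults, so adding or renaming an op-specific argument only touches one table.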
As discussed offline @apeforest this should be covered as part of #17515 where changes made to ops shouldn't break OpPerf.
That's an incorrect way to rebase @connorgoggins. First, get your fork's master branch updated. Once your fork's master is synced with the remote master, rebase your branch on master. Let me know if this works!
Force-pushed from 0df02c2 to 6e1a0f6
if op_name in optimizer_ops and op_name not in unique_ops:
optimizer_mx_operators[op_name] = mx_operators[op_name]
for op_name, op_params in mx_operators.items():
if op_name in optimizer_ops:
Wait, doesn't this give an indentation error?
Didn't throw an error, but fixed regardless.
LGTM!
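For context, the fixed filtering loop discussed above presumably ends up shaped like this. This is a sketch only; `filter_optimizer_ops` is a hypothetical wrapper name, not the actual function in `op_registry_utils.py`.

```python
def filter_optimizer_ops(mx_operators, optimizer_ops):
    """Collect only the operators that belong to the optimizer category."""
    optimizer_mx_operators = {}
    for op_name, op_params in mx_operators.items():
        if op_name in optimizer_ops:
            # The assignment is indented under the `if`, which is the
            # indentation concern raised in the review comment above.
            optimizer_mx_operators[op_name] = op_params
    return optimizer_mx_operators
```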
…r sample_multinomial as it does work with random data
Force-pushed from a195f95 to bbf6f8e
* Added support for remaining random sampling ops, removed exception for sample_multinomial as it does work with random data
* Dropped unused unique_ops variable
* Added affine transform, dropped parentheses, changed 19 to 18
* Dropped unique_ops condition - no longer in use
* Fixed indentation
* Dropped unique_ops
Description
This PR serves to implement the remaining operators from the Random Sampling category (`BilinearSampler`, `GridGenerator`, and `sample_multinomial`) in opperf. To achieve this, I added a list to track ops requiring `custom_data` in `op_registry_utils.py` (same strategy @ChaiBapchya used) and added the new random sampling ops to the list, which allowed me to add custom data dimensions for each new op in `default_params.py`. I then added these new ops to the comments in `random_sampling_operators.py`.

Furthermore, I investigated the `sample_multinomial` op, a Random Sampling op which had previously been added to the `unique_ops` tuple in `op_registry_utils.py`. This inclusion effectively removed the op from opperf. While the documentation claimed that this op only worked when all values on the last axis summed to 1, I tested the op extensively and found that it did in fact work on random data. With this in mind, I added the op to opperf and assigned its data parameter lower-dimensional data than the standard data shape to facilitate quick testing (on the standard data shape, the op took several minutes to complete a single forward pass).

Checklist
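The custom-data registration strategy described in the PR can be sketched as follows. The names `CUSTOM_DATA_OPS`, `DEFAULTS`, and `get_custom_inputs`, and all shapes shown, are illustrative assumptions, not the actual identifiers or values in `op_registry_utils.py` / `default_params.py`.

```python
# Ops whose inputs don't fit the standard data shape used by opperf
CUSTOM_DATA_OPS = ["BilinearSampler", "GridGenerator", "sample_multinomial"]

# Op-specific default inputs; sample_multinomial gets a smaller shape so a
# single forward pass finishes quickly (shapes here are placeholders)
DEFAULTS = {
    "sample_multinomial": {"data": (32, 32)},
    "BilinearSampler": {"data": (32, 2, 256, 256), "grid": (32, 2, 256, 256)},
    "GridGenerator": {"data": (32, 6), "transform_type": "affine"},
}

def get_custom_inputs(op_name):
    """Return op-specific inputs if registered, else None (standard shape)."""
    if op_name in CUSTOM_DATA_OPS:
        return DEFAULTS.get(op_name)
    return None
```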
Essentials
Please feel free to remove inapplicable items for your PR.
Changes
Comments
Tested on p2.16xl w/ Ubuntu 16.04 and Mac OS with:
- `run_mx_random_sampling_operators_benchmarks` - runs all Random Sampling ops on relevant data
- `opperf.py` (full run of all ops)

Performance Results
Group of operator test - all Random Sampling ops (CPU)
Full OpPerf test (CPU)
Group of operator test - all Random Sampling ops (GPU)
Full OpPerf test (GPU)
@apeforest @access2rohit @ChaiBapchya