
[Relax][Frontend][Onnx] fix params name bug in onnx frontend #17350

Merged (2 commits) on Sep 11, 2024

Conversation

@HongHongHongL (Contributor) commented Sep 9, 2024

  1. In onnx_frontend.py, parameters whose names start with "onnx::" have that prefix stripped. In self._params, they should be stored under the new var_name, not the original tensor name.
if self._keep_params_in_input:
    # Pytorch sometimes inserts silly weight prefix. Remove it.
    var_name = init_tensor.name.strip("onnx::")
    init_var = self._new_var(var_name, shape=array.shape, dtype=array.dtype)
    self._nodes[init_tensor.name] = init_var
    # We need to keep track of both the real value and variable for this variable.
    # Bug fixed by this PR: the entry should be keyed by var_name, not init_tensor.name.
    self._params[init_tensor.name] = (init_var, array)
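The first fix can be sketched as a standalone helper (a hedged sketch; `register_param`, `nodes`, and `params` are illustrative names, not the frontend's actual API):

```python
def register_param(nodes, params, tensor_name, init_var, array):
    """Sketch of the described fix: key the params dict by the stripped var_name."""
    # Remove the "onnx::" prefix that PyTorch sometimes inserts.
    prefix = "onnx::"
    var_name = tensor_name[len(prefix):] if tensor_name.startswith(prefix) else tensor_name
    # Graph nodes are still looked up by the original tensor name.
    nodes[tensor_name] = init_var
    # The fix: store the (var, value) pair under the new var_name.
    params[var_name] = (init_var, array)
    return var_name
```

With this, a later lookup by the stripped name (e.g. `params["weight"]` for an initializer named `"onnx::weight"`) succeeds.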
  2. In ONNX models, a param can be used many times. As a result, we should not use pop in get_constant.
# Params is actually both the graph nodes and param dictionary, unpack them.
graph_nodes, params = params
# Convert if possible
if isinstance(var, relax.Var) and var.name_hint in params:
    # When converting a parameter to a constant, update references to it as well.
    # Bug fixed by this PR: pop removes the entry, so later uses of the same param fail.
    _, value = params.pop(var.name_hint)
    const_value = relax.const(value)
    graph_nodes[var.name_hint] = const_value
    return const_value
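The second fix amounts to replacing the destructive `pop` with a plain lookup (a minimal sketch with illustrative names; `make_const` stands in for `relax.const`, and the real method does more than this):

```python
def get_constant_value(graph_nodes, params, name, make_const=lambda v: v):
    """Sketch of the fix: read the param without removing it, so a
    parameter referenced many times still resolves on later calls."""
    _, value = params[name]          # plain lookup instead of params.pop(name)
    const_value = make_const(value)  # stands in for relax.const(value)
    graph_nodes[name] = const_value
    return const_value
```

Because the entry stays in `params`, a second node referencing the same parameter converts just as the first one did.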

@Hzfengsy (Member) commented:

Please add a regression test, thanks!

@Hzfengsy Hzfengsy merged commit f52143e into apache:main Sep 11, 2024
18 of 19 checks passed