
[Bug]: Why is the int64 input automatically converted to int32 when the model is loaded? #24072

Closed
3 tasks done
LinGeLin opened this issue Apr 17, 2024 · 6 comments

@LinGeLin

OpenVINO Version

2024.0.0

Operating System

Ubuntu 20.04 (LTS)

Device used for inference

CPU

Framework

ONNX

Model used

TFRA

Issue description

I am working on an extension whose input must be int64, but while debugging I found that the input gets converted to int32. Is there a special mechanism that controls this? I have also observed that the validate_and_infer_types function is called many times. Where in OpenVINO is this function of the extension called at runtime? From the log I printed, the input is still int64 the first few times, but later it becomes int32, which is very confusing.
[screenshot attached]
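For reference, here is a minimal sketch of a custom op similar to the one described above (the op name "LookupKeys" and all other details are hypothetical, not the reporter's actual TFRA extension); logging inside validate_and_infer_types makes it easy to see the input type changing across the plugin's transformation passes:

```cpp
// Hypothetical minimal custom op for illustration only; "LookupKeys" and the
// surrounding details are placeholders, not the reporter's actual TFRA extension.
#include <iostream>
#include <memory>
#include <openvino/op/op.hpp>

class LookupKeys : public ov::op::Op {
public:
    OPENVINO_OP("LookupKeys");

    LookupKeys() = default;
    explicit LookupKeys(const ov::Output<ov::Node>& keys) : ov::op::Op({keys}) {
        constructor_validate_and_infer_types();
    }

    void validate_and_infer_types() override {
        // The plugin's transformation pipeline calls this several times; logging
        // here shows the input element type changing between calls (i64 -> i32).
        std::cout << "validate_and_infer_types: input type = "
                  << get_input_element_type(0) << std::endl;
        // Propagate whatever type the plugin has decided on; the CPU plugin may
        // still wrap the node with Convert nodes when it legalizes precisions.
        set_output_type(0, get_input_element_type(0), get_input_partial_shape(0));
    }

    std::shared_ptr<ov::Node> clone_with_new_inputs(const ov::OutputVector& new_args) const override {
        return std::make_shared<LookupKeys>(new_args.at(0));
    }
};
```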

Step-by-step reproduction

No response

Relevant log output

No response

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
@LinGeLin LinGeLin added bug Something isn't working support_request labels Apr 17, 2024
@ilya-lavrenov ilya-lavrenov added the category: CPU OpenVINO CPU plugin label Apr 17, 2024
@dmitry-gorokhov dmitry-gorokhov assigned nshchego and unassigned mg-intel Apr 18, 2024
@nshchego
Contributor

Hello,
Please check which data types the CPU plugin supports here. In particular, i64 is not supported. You can execute the extension node in i64, but its inputs/outputs will be converted anyway. There is no mechanism to force it.
Thanks
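A quick way to observe what the plugin does with the declared precisions is to compare the original model's input type with the compiled model's input type. A minimal sketch follows; "model.onnx" is a placeholder path, the model is assumed to have a single input, and whether the exposed type actually changes or the conversion stays purely internal may depend on the OpenVINO version:

```cpp
// Sketch: compare the declared input type with the type exposed after
// compiling for CPU. "model.onnx" is a placeholder; assumes one input.
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    auto model = core.read_model("model.onnx");
    // Type declared by the original model (i64 in the reporter's case).
    std::cout << "model input type:    " << model->input().get_element_type() << std::endl;

    auto compiled = core.compile_model(model, "CPU");
    // Type reported after the CPU plugin has applied its precision conversions.
    std::cout << "compiled input type: " << compiled.input().get_element_type() << std::endl;
    return 0;
}
```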

@LinGeLin
Author

> Hello, Please check which data types the CPU plugin supports here. In particular, i64 is not supported. You can execute the extension node in i64, but its inputs/outputs will be converted anyway. There is no mechanism to force it. Thanks

What happens if I delete the convert precision for i64->i32?

@avitial avitial removed the bug Something isn't working label Apr 23, 2024
@nshchego
Contributor

> > Hello, Please check which data types the CPU plugin supports here. In particular, i64 is not supported. You can execute the extension node in i64, but its inputs/outputs will be converted anyway. There is no mechanism to force it. Thanks

> What happens if I delete the convert precision for i64->i32?

Various issues may arise during compilation and execution of the graph, depending on the model. Even if the graph is successfully compiled and inferred, the CPU nodes will still perform an internal i64->i32 conversion.

@LinGeLin
Author

LinGeLin commented Apr 28, 2024

> > > Hello, Please check which data types the CPU plugin supports here. In particular, i64 is not supported. You can execute the extension node in i64, but its inputs/outputs will be converted anyway. There is no mechanism to force it. Thanks

> > What happens if I delete the convert precision for i64->i32?

> Various issues may arise during compilation and execution of the graph, depending on the model. Even if the graph is successfully compiled and inferred, the CPU nodes will still perform an internal i64->i32 conversion.

I notice that you have some development branches that support i64, such as cpu/i64/proto. Do any of these branches meet my needs? If so, how should I use them? For example, how should the parameters be configured when starting tritonserver?

@nshchego
Contributor

Yes, you can use this branch for i64. Just set enableNativeI64 to True here. Also take into account that its base is quite old (more than one year).

@avitial
Contributor

avitial commented Jun 21, 2024

Closing this, I hope previous responses were sufficient to help you proceed. Feel free to reopen and ask additional questions related to this topic.

@avitial avitial closed this as completed Jun 21, 2024