
Creating TRT Cache much slower on Linux than on Windows #23380

Open
BengtGustafsson opened this issue Jan 15, 2025 · 2 comments
Assignees
Labels
ep:TensorRT (issues related to TensorRT execution provider), platform:windows (issues related to the Windows platform)

Comments

@BengtGustafsson (Contributor)

Describe the issue

When we create .engine files on Windows, we see a run time of 15–30 s that is largely independent of the input sizes we use.

On Linux the same networks on the same hardware can take up to 10 minutes to optimize. The hardware includes a T1000 GPU and a recent Intel CPU.

Any ideas on this? Could it be because we run Linux via Docker?

To reproduce

Set up engine-file creation on both Windows and Linux and compare the time it takes.

Urgency

No response

Platform

Linux

OS Version

Ubuntu 20.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.20

ONNX Runtime API

C++

Architecture

X64

Execution Provider

TensorRT

Execution Provider Library Version

CUDA 11.6, TensorRT 10.4.0.26

@github-actions github-actions bot added ep:TensorRT issues related to TensorRT execution provider platform:windows issues related to the Windows platform labels Jan 15, 2025
@jywu-msft jywu-msft assigned yf711 and chilo-ms and unassigned chilo-ms Jan 15, 2025
jywu-msft (Member) commented Jan 15, 2025

@yf711 can you confirm if this behavior repros for you?

yf711 (Contributor) commented Jan 15, 2025

Is your Linux env a Docker environment under your Windows host, or under another Linux host?

Something similar happened to me when I ran Linux in Docker on a Windows system and the model/engine paths were projected from the Windows file system.
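One quick way to check this hypothesis is to see which filesystem backs the model/engine-cache directory inside the container. This is a rough diagnostic sketch: MODEL_DIR is a placeholder for your actual cache path (it defaults to the current directory here so the snippet runs as-is).

```shell
# Rough diagnostic sketch: report which filesystem backs the
# model/engine-cache directory. MODEL_DIR is a placeholder; it
# defaults to the current directory so the snippet runs as-is.
MODEL_DIR="${MODEL_DIR:-.}"
df -T "$MODEL_DIR"
# On Docker Desktop or WSL2, a Type column of 9p, drvfs, cifs, or
# grpcfuse indicates the path is projected from the Windows host;
# TensorRT engine building reads/writes there far more slowly than
# on a native ext4/overlayfs volume inside the container.
```

If the path turns out to be Windows-projected, copying the model into the container (or using a named Docker volume) before building the engine is worth trying.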
