
[Mobile] Bug in android implementation for loading large onnx models #19599

Open
l3utterfly opened this issue Feb 22, 2024 · 3 comments
Labels
platform:mobile — issues related to ONNX Runtime mobile; typically submitted using template
stale — issues that have not been addressed in a while; categorized by a bot

Comments


l3utterfly commented Feb 22, 2024

Describe the issue

The current code loads the entire model file onto the JVM heap before passing it to the native code, so large ONNX files fail to load on Android due to the limited JVM heap size.

This can be fixed by commenting out the load-from-assets section in OnnxruntimeModule.java:

```java
// load model via model path string uri
// InputStream modelStream =
//     reactContext.getApplicationContext().getContentResolver().openInputStream(Uri.parse(uri));
// Reader reader = new BufferedReader(new InputStreamReader(modelStream));
// byte[] modelArray = new byte[modelStream.available()];
// modelStream.read(modelArray);
// modelStream.close();

// directly pass the uri to createSession; we don't support loading from assets
ortSession = ortEnvironment.createSession(uri, sessionOptions);
```

This disables loading from assets. Perhaps consider adding logic here that bypasses the JVM heap when the uri points to a file on disk.
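One possible shape for that logic, sketched in plain Java (the class and method names here are hypothetical, not from OnnxruntimeModule.java, and `java.net.URI` stands in for Android's `Uri` so the snippet is self-contained): only `file://` URIs are handed to native code as a path, while everything else keeps the stream-copy fallback.

```java
import java.net.URI;

// Hypothetical helper: decide whether a model URI can be passed to the
// native createSession(path, options) overload directly, avoiding a
// full copy of the model bytes on the JVM heap.
public final class ModelPathCheck {
  /**
   * Returns the filesystem path for a file:// URI, or null if the
   * caller must fall back to streaming (e.g. content:// or assets).
   */
  public static String asDirectPath(String uriString) {
    URI uri = URI.create(uriString);
    if ("file".equals(uri.getScheme())) {
      return uri.getPath(); // safe to hand straight to native code
    }
    return null;
  }
}
```

With a check like this, only non-file sources would pay the JVM-heap cost of the commented-out stream-copy path above.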

To reproduce

Load any ONNX model file larger than 500 MB on Android.

Urgency

No response

Platform

Android

OS Version

Android 14

ONNX Runtime Installation

Built from Source

Compiler Version (if 'Built from Source')

No response

Package Name (if 'Released Package')

None

ONNX Runtime Version or Commit ID

1.16

ONNX Runtime API

Java

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

l3utterfly added the platform:mobile label Feb 22, 2024
@skottmckay
Contributor

@Craigacp any thoughts?

Feels like duplicating the model bytes in memory is coming up a few times lately with models getting much bigger. I'm definitely not a Java expert so I don't know what the best approach is to avoid that happening.

If ORT can read from a path on Android with no additional logic that might be the best general solution to recommend/implement.

IIRC passing raw bytes in is ineffective in reducing memory, as we have to call

```cpp
const bool result = model_proto_.ParseFromArray(model_data, model_data_len);
```

anyway because the bytes are protobuf encoded.

@Craigacp
Contributor

Craigacp commented Feb 23, 2024

I can add a Java session constructor which accepts a ByteBuffer. Users could then memory-map the file on disk with FileChannel.map (https://docs.oracle.com/javase/8/docs/api/java/nio/channels/FileChannel.html#map-java.nio.channels.FileChannel.MapMode-long-long-), pass the ByteBuffer through to JNI, where I can pull the address out of it and hand that buffer to the C API session constructor. However, I've not done much work with memory mapping in Java, so I don't know if there are additional considerations or whether it's troublesome on Android. I also don't know whether that would help the React Native wrapper, as I don't know anything about React Native.
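The memory-mapping half of that idea can be sketched in plain Java (this covers only the FileChannel.map step; the ByteBuffer-accepting session constructor is still a proposal at this point, and the class name here is hypothetical):

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Map a model file into a read-only MappedByteBuffer. The mapping is
// backed by the file itself, outside the JVM heap, so no byte[] copy
// of the model is ever allocated on the Java side.
public final class ModelMapper {
  public static MappedByteBuffer map(Path modelPath) throws IOException {
    try (FileChannel channel = FileChannel.open(modelPath, StandardOpenOption.READ)) {
      // Per the FileChannel.map javadoc, the mapping remains valid
      // after the channel is closed.
      return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
    }
  }
}
```

A buffer obtained this way is what the proposed constructor could consume: JNI can read the mapping's address and length directly, without a heap-sized copy.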

I agree that if the native code can read from a filesystem path on Android then we should expose that too, but it's already exposed in Java & Android so maybe that's just a react native problem?

github-actions bot commented

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Mar 25, 2024
skottmckay pushed a commit that referenced this issue Sep 15, 2024
(#20062)

### Description
Adds support for constructing an `OrtSession` from a
`java.nio.ByteBuffer`. These buffers can be memory mapped from files
which means there doesn't need to be copies of the model protobuf held
in Java, reducing peak memory usage during session construction.

### Motivation and Context
Reduces memory usage on model construction by not requiring as many
copies on the Java side. Should help with #19599.