feat(ai.triton.server): Add Model Encryption support for Triton Server Service #3986
try {
    FileUtils.cleanDirectory(new File(modelRootPath));
} catch (IOException e) {
    logger.warn("Cannot clean directory at path {}", modelRootPath);
}
Please also log the exception stack trace on failure.
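One way to address this suggestion, as a minimal self-contained sketch (the class and method names are hypothetical; only the body mirrors the snippet under review): with SLF4J, passing the exception as the last argument makes the logger print its stack trace.

```java
import java.io.File;
import java.io.IOException;

import org.apache.commons.io.FileUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ModelRepositoryCleaner {

    private static final Logger logger = LoggerFactory.getLogger(ModelRepositoryCleaner.class);

    // Hypothetical helper mirroring the snippet under review
    static void cleanModelRepository(String modelRootPath) {
        try {
            FileUtils.cleanDirectory(new File(modelRootPath));
        } catch (IOException e) {
            // Passing the Throwable as the last argument makes SLF4J log its stack trace
            logger.warn("Cannot clean directory at path {}", modelRootPath, e);
        }
    }
}
```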
This PR adds the org.bouncycastle/bcpg-jdk15on/1.68 dependency.
Add Model Encryption support for Triton Server Service
Description of the solution adopted:
If a password is specified in the configuration, all models in the path are assumed to be encrypted with the same algorithm and to be decryptable with that password.
Therefore Kura, upon applying the TritonServerImpl configuration, will perform the following:
Create a new folder in /tmp with restrictive permissions, so that it is accessible only to the users that actually need it (i.e. Triton and Kura) and not to unprivileged users. For this example we'll call this folder /tmp/decrypted_models (a sketch of this step follows the list).
Run the Inference Server using this new folder as its Model Repository
For each model specified in the "Model" property:
Look for it in the "Model Repository Path" passed in the configuration
Decrypt it and store it in the /tmp/decrypted_models folder (for example /tmp/decrypted_models/autoencoder_fp32)
Tell Triton to load it from the /tmp/decrypted_models folder
Wait for Triton to signal correct loading of the decrypted model
Wipe the decrypted model folder /tmp/decrypted_models/autoencoder_fp32 from the filesystem as soon as Triton signals model load completion (see "Model Ready" in the Triton documentation)
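A minimal sketch of the first step, assuming POSIX file permissions and the example folder name used above (the actual implementation in this PR may differ):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class DecryptionFolderSketch {

    public static void main(String[] args) throws IOException {
        // Example folder from the description above
        Path decryptionFolder = Paths.get("/tmp/decrypted_models");

        // rwx for the owner only: unprivileged users cannot list or read the decrypted models
        Set<PosixFilePermission> ownerOnly = PosixFilePermissions.fromString("rwx------");
        FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(ownerOnly);

        // Note: the permissions are applied only if the directory is created by this call
        Files.createDirectories(decryptionFolder, attr);

        System.out.println("Created " + decryptionFolder + " with permissions " + ownerOnly);
    }
}
```

Restricting the folder to its owner means the decrypted models are readable only by the user that Kura and Triton run as.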
Encryption procedure
Given a trained model inside the folder tf_autoencoder_fp32
We'll need to archive it as a zip file and then encrypt the archive with ASCII-armored, password-based OpenPGP encryption (a sketch of the two commands is shown at the end of this section).
The resulting archive tf_autoencoder_fp32.zip.asc can be decrypted by this method.
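As a sketch, assuming a plain zip archive and symmetric, ASCII-armored OpenPGP encryption (the exact flags may differ from those used originally), the two steps could look like:

```shell
# Archive the trained model folder (assumed: a plain recursive zip)
zip -r tf_autoencoder_fp32.zip tf_autoencoder_fp32

# Encrypt the archive with a passphrase; --armor yields the ASCII-armored .asc file
gpg --symmetric --armor tf_autoencoder_fp32.zip
```

The passphrase entered here is the same password set in the TritonServerImpl configuration, which the service then uses to decrypt the models.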