
NativeLibraryConfig.WithLogs() overload to set log level #529

Merged
merged 1 commit into SciSharp:master on Feb 21, 2024

Conversation

swharden
Contributor

Adds a NativeLibraryConfig.WithLogs() overload to let the user indicate the log level (with "info" as the default)

@swharden
Contributor Author

Rather than having a bool to enable/disable logging (with a separate overload), do you think we should add a None member to LLamaLogLevel?

@martindevans
Copy link
Member

Right now LLamaLogLevel is a direct translation of an enum inside of llama.cpp, I think it's this one: https://github.com/ggerganov/llama.cpp/blob/c14f72db9c62d71d35eb1c141745c0bd0cb27b49/ggml.h#L510

Adding a new level directly to that is a bit iffy, since we'd be passing a technically invalid value down to llama.cpp. I think the best thing to do would be to observe when None is set and have different behaviour on that path (e.g. pass in a log callback that does nothing).
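
A minimal sketch of that idea (the enum values and helper names below are illustrative assumptions, not LLamaSharp's actual internals):

using System;

// Hypothetical C#-only sentinel: "None" never crosses into llama.cpp.
enum LLamaLogLevel { None, Error, Warning, Info, Debug }

static class LoggingSketch
{
    // Stand-in for whatever mechanism registers the native log callback.
    private static Action<LLamaLogLevel, string> _callback = static (_, _) => { };
    private static void SetNativeLogCallback(Action<LLamaLogLevel, string> cb) => _callback = cb;

    public static void ApplyLogConfig(LLamaLogLevel level)
    {
        if (level == LLamaLogLevel.None)
        {
            // Swallow every message; the invalid "None" value is never passed down to llama.cpp.
            SetNativeLogCallback(static (_, _) => { });
        }
        else
        {
            // Otherwise forward native log messages to the console as usual.
            SetNativeLogCallback(static (_, message) => Console.Write(message));
        }
    }
}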

Another issue is that LLamaLogLevel appears to be wrong! In C# DEBUG = 1 but in C++ DEBUG = 5.
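
To illustrate the mismatch (only Debug = 1 on the C# side and DEBUG = 5 on the C++ side are stated here; the remaining native values are assumptions drawn from the linked ggml.h and should be checked against it):

// C# LLamaLogLevel as it stands today (other members omitted):
enum ManagedLogLevel { Debug = 1 /*, ... */ }

// llama.cpp's ggml_log_level (assumed values, per the linked ggml.h):
enum NativeLogLevel { Error = 2, Warn = 3, Info = 4, Debug = 5 }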

@swharden
Contributor Author

swharden commented Feb 21, 2024

Adding a new level directly to that is a bit iffy

I'm fine keeping it like it is in this PR:

NativeLibraryConfig.Instance.WithLogs(false); // disables logging
NativeLibraryConfig.Instance.WithLogs(true); // enables logging
NativeLibraryConfig.Instance.WithLogs(); // enables logging
NativeLibraryConfig.Instance.WithLogs(LLamaLogLevel.Warning); // enables logging with the given level
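
For reference, a rough sketch of how the two overloads behind those calls could be shaped — only the bool/level split and the "info" default come from this PR; the class name, field names, and bodies are assumptions, and LLamaSharp's LLamaLogLevel enum is assumed to be in scope:

public sealed class NativeLibraryConfigSketch
{
    private bool _logging;
    private LLamaLogLevel _logLevel = LLamaLogLevel.Info;

    // Enable or disable logging with a bool.
    public NativeLibraryConfigSketch WithLogs(bool enabled)
    {
        _logging = enabled;
        return this;
    }

    // Enable logging at the given level; Info is the default per the PR description.
    public NativeLibraryConfigSketch WithLogs(LLamaLogLevel logLevel = LLamaLogLevel.Info)
    {
        _logging = true;
        _logLevel = logLevel;
        return this;
    }
}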

Another issue is that LLamaLogLevel appears to be wrong! In C# DEBUG = 1 but in C++ DEBUG = 5

Is that something you want fixed in this PR, or are things good as they are and maybe we can open an issue and loop back to correct that later? The current behavior in this PR is working as expected for the example app.

@martindevans
Member

Is that something you want fixed in this PR, or are things good as they are and maybe we can open an issue and loop back to correct that later?

LLamaSharp is always using a slightly outdated version of llama.cpp, so it's possible our enum is correct for the current version. For now just open an issue to make sure I pay attention to it next time I update the binaries.

As for the WithLogs calls, I think that's fine as is. It's quite clear to me what each individual call there does; there are no surprises.

@martindevans
Member

Everything else looks good, thanks again!

@martindevans martindevans merged commit a639400 into SciSharp:master Feb 21, 2024
2 of 3 checks passed