LLama.Examples: improve model path prompt #526
Conversation
llama_empty_call() no longer shows configuration information on startup; it is instead displayed automatically the first time a model is loaded.
The last used model is stored in a config file and is re-used when a blank path is provided.
Note that I also deleted:

/// <summary>
/// Basic logger printing to console
/// </summary>
internal static class ConsoleLogger
Not really a specific request, just thinking out loud: do you think something like this should be implemented in LLamaSharp itself?
Perhaps, but it would be nice to demonstrate how to do custom logging in an example.
I'll create a new PR shortly (#529) involving the logging system and I'll learn a little more then...
EDIT: The logging system is a bit more complicated than I first thought 😅 I'll probably leave this where it is for a little while, but it's a good idea for a possible addition in the future
EDIT2: There's useful information on https://scisharp.github.io/LLamaSharp/0.5/More/log/ which is probably good for now
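For illustration, a basic console logger for an example like this could be a minimal ILogger implementation along the lines of the sketch below. This is an assumption of what such a demonstration might look like (assuming a recent Microsoft.Extensions.Logging.Abstractions), not the exact class this PR removed.

using System;
using Microsoft.Extensions.Logging;

// Minimal console logger sketch; the class name and formatting are illustrative only.
internal sealed class SimpleConsoleLogger : ILogger
{
    public IDisposable? BeginScope<TState>(TState state) where TState : notnull => null;

    public bool IsEnabled(LogLevel logLevel) => logLevel >= LogLevel.Information;

    public void Log<TState>(LogLevel logLevel, EventId eventId, TState state,
                            Exception? exception, Func<TState, Exception?, string> formatter)
    {
        if (!IsEnabled(logLevel))
            return;

        // Write the formatted message, then the exception (if any) on its own line.
        Console.WriteLine($"[{logLevel}] {formatter(state, exception)}");
        if (exception is not null)
            Console.WriteLine(exception);
    }
}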
Looks great, definitely a big improvement to usability. I'll certainly appreciate the saved model path when testing stuff! Just a couple of small nits :)
I've wanted to move to all top-level namespaces for a while. Once this one is merged I think I'll put in a PR to at least move the entire example project over to top-level namespaces. I'm less confident about making such a "noisy" change on the main project. The best approach is probably to do it all in one big PR, but that would touch nearly every line of every file :/
Thanks for the feedback! I think this is ready to merge now.
Note that I goofed the language... the new C# features are top-level statements and file-scoped namespaces. I think this can be automated project-wide by creating a code-style rule for it; then, in Visual Studio, hovering over the squiggly underline will let you fix it project-wide.
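The snippet the comment refers to isn't preserved above; presumably it is an .editorconfig entry along these lines (the exact severity here is an assumption; IDE0161 is the analyzer rule that flags block-scoped namespaces):

# .editorconfig at the repo root
csharp_style_namespace_declarations = file_scoped:warning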
-namespace LLama.Examples.Examples;
-
-public class Runner
+public class ExampleRunner
The namespace appears to have disappeared from this file altogether.
Thanks again for all your work on this!
This PR adds a user settings class which can store/retrieve the last used model path. When users are prompted for a model, they can now leave the field blank and press enter to use the last used model. If an invalid path is provided, the user can attempt to re-enter a valid path (previously an exception was thrown).
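As a rough sketch of how such a settings class and prompt loop could fit together (the type, file, and member names below are assumptions for illustration, not necessarily the ones used in this PR):

using System;
using System.IO;
using System.Text.Json;

// Illustrative sketch: persist the last used model path in a small JSON file.
internal static class UserSettings
{
    private const string SettingsPath = "usersettings.json";

    private record Settings(string LastModelPath);

    public static string? ReadLastModelPath()
    {
        if (!File.Exists(SettingsPath))
            return null;

        var json = File.ReadAllText(SettingsPath);
        return JsonSerializer.Deserialize<Settings>(json)?.LastModelPath;
    }

    public static void SaveLastModelPath(string path)
        => File.WriteAllText(SettingsPath, JsonSerializer.Serialize(new Settings(path)));

    // Prompt until a readable path is given; blank input re-uses the stored path.
    public static string GetModelPath()
    {
        while (true)
        {
            Console.Write("Model path (blank to re-use the last model): ");
            var input = Console.ReadLine()?.Trim();

            var path = string.IsNullOrEmpty(input) ? ReadLastModelPath() : input;
            if (path is not null && File.Exists(path))
            {
                SaveLastModelPath(path);
                return path;
            }

            Console.WriteLine("Invalid model path, please try again.");
        }
    }
}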
The coding assistant example previously offered to download the model for you, but I removed that functionality and instead added a check and warning message when the selected model is not a Code Llama model. The new configuration class could be extended to add automatic model downloading to all examples if that's desired.
This PR also makes a few refinements to code styling and documentation. Since these example files serve as documentation, I'd also like to use top-level namespaces, but I didn't do that in this PR because it would show up as a large number of changed lines and be hard to review. Let me know if you'd like me to do that (in this PR or another one).