
LLama.Examples: improve model path prompt #526

Merged (8 commits) into SciSharp:master on Feb 21, 2024

Conversation

swharden (Contributor)

This PR adds a user settings class which can store/retrieve the last used model path. When users are prompted for a model, they can now leave the field blank and press enter to use the last used model. If an invalid path is provided, the user can attempt to re-enter a valid path (previously an exception was thrown).
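The behavior described above could be sketched roughly like this. This is an editor's illustrative sketch, not the PR's actual code: the class name, file name, and prompt wording are all assumptions.

```csharp
using System;
using System.IO;
using System.Text.Json;

// Hypothetical sketch: persist the last used model path to a JSON file,
// offer it as the default, and retry on invalid input instead of throwing.
internal class UserSettings
{
    private const string SettingsFilePath = "user-settings.json"; // assumed location

    public string? LastModelPath { get; set; }

    public static UserSettings Load() =>
        File.Exists(SettingsFilePath)
            ? JsonSerializer.Deserialize<UserSettings>(File.ReadAllText(SettingsFilePath)) ?? new()
            : new();

    public void Save() =>
        File.WriteAllText(SettingsFilePath, JsonSerializer.Serialize(this));

    public static string PromptForModelPath()
    {
        UserSettings settings = Load();
        while (true)
        {
            Console.Write($"Model path (blank to reuse '{settings.LastModelPath}'): ");
            string? input = Console.ReadLine();

            // Blank input falls back to the previously saved path
            string path = string.IsNullOrWhiteSpace(input)
                ? settings.LastModelPath ?? ""
                : input;

            if (File.Exists(path))
            {
                settings.LastModelPath = path;
                settings.Save();
                return path;
            }

            Console.WriteLine("Invalid path, please try again."); // retry, don't throw
        }
    }
}
```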

The coding assistant example previously offered to download the model for you, but I removed that functionality and added a check and warning message if the model used is not a code llama model. The new configuration class could be extended to add automatic model downloading functionality to all examples if that functionality is desired.
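The "check and warning" mentioned above might look something like the following. The heuristic (matching "codellama" in the file name) and the message text are assumptions for illustration, not necessarily what the PR does.

```csharp
using System;
using System.IO;

internal static class ModelChecks
{
    // Assumed heuristic: warn if the file name does not mention "codellama".
    public static void WarnIfNotCodeLlama(string modelPath)
    {
        string fileName = Path.GetFileName(modelPath);
        if (fileName.Contains("codellama", StringComparison.OrdinalIgnoreCase))
            return;

        Console.ForegroundColor = ConsoleColor.Yellow;
        Console.WriteLine($"WARNING: '{fileName}' does not appear to be a Code Llama model; " +
                          "this example may produce poor results.");
        Console.ResetColor();
    }
}
```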

This PR also makes a few refinements to code styling and documentation. Since these example files serve as documentation I'd like to also use top-level namespaces, but I didn't implement that in this PR because it would appear as many lines of code changed and be hard to review. Let me know if you'd like me to do that (in this PR or another one).

[screenshots of the new model path prompt omitted]

- llama_empty_call() no longer shows configuration information on startup, but it will display it automatically the first time a model is engaged
- The last used model is stored in a config file and is re-used when a blank path is provided
swharden (Contributor, Author)

Note that I also deleted ConsoleLogger in this PR because it wasn't used by anything... If it's intended to be there for documentation purposes, let me know and I can add it back and demonstrate its use in one of the examples.

/// <summary>
/// Basic logger printing to console
/// </summary>
internal static class ConsoleLogger
martindevans (Member)

Not really a specific request, just thinking out loud: do you think something like this should be implemented in LLamaSharp itself?

swharden (Contributor, Author) commented Feb 21, 2024

Perhaps, but it would be nice to demonstrate how to do custom logging in an example.

I'll create a new PR shortly (#529) involving the logging system and I'll learn a little more then...

EDIT: The logging system is a bit more complicated than I first thought 😅 I'll probably leave this where it is for a little while, but it's a good idea for a possible addition in the future

EDIT2: There's useful information on https://scisharp.github.io/LLamaSharp/0.5/More/log/ which is probably good for now

martindevans (Member) commented Feb 21, 2024

Looks great, definitely a big improvement to usability. I'll certainly appreciate the saved model path when testing stuff!

Just a couple of small nits :)

martindevans (Member)

top-level namespaces

I've wanted to move to all top level namespaces for a while. Once this one is merged I think I'll put in a PR to at least move the entire example project over to top level namespaces.

I'm less confident about making such a "noisy" change on the main project. The best approach is probably to do it all in one big PR, but that would touch nearly every line of every file :/

swharden (Contributor, Author) commented Feb 21, 2024

Thanks for the feedback! I think this is ready to merge now.

top level namespaces

Note that I goofed the language... the new C# features are top level statements and file-scoped namespaces

I think this can be automated project wide by creating .editorconfig containing:

[*.cs]
csharp_style_namespace_declarations=file_scoped:suggestion

Then in Visual Studio, hovering over the squiggly underline offers a quick fix that can be applied project-wide.
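For anyone unfamiliar with the feature being discussed: a file-scoped namespace (C# 10+) replaces the braced `namespace X { ... }` block with a single declaration line, removing one level of indentation from the whole file. A minimal illustration (the namespace and class names here are just placeholders):

```csharp
// File-scoped form: equivalent to wrapping the file's contents in
// "namespace LLama.Examples { ... }", but with no extra indentation.
namespace LLama.Examples;

public class Example
{
    public static string Greet() => "hello";
}
```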

namespace LLama.Examples.Examples;

-public class Runner
+public class ExampleRunner
martindevans (Member)

The namespace appears to have disappeared from this file altogether

martindevans (Member)

Thanks again for all your work on this!

@martindevans martindevans merged commit 06ffe3a into SciSharp:master Feb 21, 2024
3 checks passed