Releases: justyns/silverbullet-ai
0.0.8
0.0.7
A few internal changes to make support for different provider APIs easier to manage, and Gemini added as the first non-OpenAI-compatible API to test with!
What's Changed
- Added Perplexity AI API info by @zefhemel in #6
- Add Custom Instructions for chat by @justyns in #8
- Interfaces refactor by @justyns in #10
- Add experimental Google Gemini support for #3 by @justyns in #11
New Contributors
- @zefhemel made their first contribution in #6
Full Changelog: 0.0.6...0.0.7
0.0.6
Change Highlights:
This release introduces the concept of Templated Prompts. My goal is to get rid of hardcoded prompts and let users create their own prompt templates, similar to template pages. Examples are in the README now, but here's a rough idea of what one could look like:
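As a rough sketch (frontmatter keys like `aiprompt`, `description`, and `insertAt` follow the README examples and may change between releases), a prompt template is just an ordinary template page:

```markdown
---
tags: template
aiprompt:
  description: "Write a haiku about note-taking"
  insertAt: cursor
---
Write a haiku about taking notes in SilverBullet.
```

Running the new template-prompt command lets you pick one of these pages, renders it, sends the result to the configured LLM, and inserts the response at the `insertAt` position (page-start, page-end, or cursor).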
I'll most likely start a small library of example template prompts, so feel free to contribute some if you have any ideas! After there are examples for each of the hardcoded prompts (like summarization), the hardcoded ones will likely get removed.
Other Changes:
- Add a new command to prompt for a template to execute and render as a prompt
- Add insertAt option for prompt templates (page-start, page-end, cursor)
- Make the cursor behave nicer in interactive chats, fixes #1
- Remove the 'Contacting LLM' notification and replace it with a loading placeholder for now (#1)
- Move some of the flashNotifications to console.log instead
- Dall-e: Use finalFileName instead of the prompt to prevent long prompts from breaking the markdown
- Add queryOpenAI function to use in templates later
- Update Readme for templated prompts, build potential release version
Full Changelog: 0.0.5...0.0.6
0.0.5
Release highlights:
- Rename test stream command
- Add better error handling and notifications
- Misc refactoring to make the codebase easier to work on
- Automatically reload config from the SETTINGS and SECRETS pages
- Update readme for ollama/mistral.ai examples
- Use editor.insertAtPos instead of insertAtCursor to make streaming text more sane
- Add requireAuth variable to fix a CORS issue in Chrome with Ollama (see the config sketch below)
- Remove redundant commands, use streaming for others
- Let chat on page work on any page. Add keyboard shortcut for it
- Move cursor to follow streaming response
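As a rough sketch of how these options fit together (key names like `openAIBaseUrl` and `defaultTextModel` are taken from the README and may differ between releases), pointing the plug at a local Ollama server looks something like this in SETTINGS:

```yaml
ai:
  # Any OpenAI-compatible endpoint can be used; this assumes a local Ollama server.
  openAIBaseUrl: "http://localhost:11434/v1"
  defaultTextModel: "mistral"
  # Ollama needs no API key; skipping the Authorization header avoids the
  # CORS preflight problem seen in Chrome.
  requireAuth: false
```

With the reload change above, edits to SETTINGS or SECRETS are picked up automatically.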
Full Changelog: 0.0.4...0.0.5
0.0.4
- Add command for 'Chat on current page' to have an interactive chat on a note page (see the sketch below)
- Use relative image path name for dall-e generated images
- First attempt at supporting streaming responses from openai directly into the editor
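A minimal sketch of what such a chat note might look like (assuming the **user**/**assistant** markers described in the README; the exact convention may have changed since this release):

```markdown
# Plug-in brainstorm

**user**: Suggest three names for a SilverBullet plug that generates diagrams.

**assistant**: How about silverbullet-diagram, sketchbullet, or markviz?

**user**: Make the second one sound less like a tool for shooting sketches.
```

Running the chat command sends the conversation so far to the LLM and streams the reply into the page below the last message.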
Full Changelog: 0.0.3...0.0.4
0.0.3
- Add a new command to call openai using a user note or selection as the prompt, ignoring built-in prompts
- Add support for changing the openai-compatible api url and using a local LLM like Ollama
- Update jsdoc descriptions for each command and add to readme
- Save dall-e generated image locally
- Add script to update readme automatically
- Save and display the revised prompt from dall-e-3
Full Changelog: 0.0.2...0.0.3
0.0.2
Full Changelog: 0.0.1...0.0.2