Add "Intelligent Dynamic Break Messages" and "Dynamic Break Messages" Plugins #10

Open · wants to merge 2 commits into `master`
27 changes: 27 additions & 0 deletions dynamic_break_messages/Readme.md
@@ -0,0 +1,27 @@
# Dynamic Break Messages

This Safe Eyes plugin displays a dynamic break message during breaks. It reads the message from a configured file and shows its contents on the break screen.

## Installation

1. Install the plugin into Safe Eyes by placing it in `~/.config/safeeyes/plugins/dynamic_break_messages`.
2. Configure the plugin through the Safe Eyes settings, or add the following entry to the Safe Eyes config at `~/.config/safeeyes/safeeyes.json`:

```
"plugins": [

...

{
"enabled": true,
"id": "dynamic_break_messages",
"settings": {
"message_file_path": "/tmp/safeeyes_message.txt"
},
"version": "0.1"
},
]
```

## Configuration

- `message_file_path`: Path of the file to read the break message from.
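
For example, another process can change the message between breaks simply by rewriting this file. A minimal sketch in Python, assuming the default `message_file_path`:

```
# Rewrite the message file; the plugin re-reads it at the start of each break.
# The path must match the plugin's message_file_path setting.
with open('/tmp/safeeyes_message.txt', 'w') as f:
    f.write('Stand up and stretch!')
```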
23 changes: 23 additions & 0 deletions dynamic_break_messages/config.json
@@ -0,0 +1,23 @@
{
    "meta": {
        "name": "Dynamic Break Messages",
        "description": "Display dynamic messages during breaks",
        "version": "0.1"
    },
    "dependencies": {
        "python_modules": [],
        "shell_commands": [],
        "operating_systems": [],
        "desktop_environments": [],
        "resources": []
    },
    "settings": [
        {
            "id": "message_file_path",
            "label": "Path to the message file",
            "type": "TEXT",
            "default": "/tmp/safeeyes_message.txt"
        }
    ],
    "break_override_allowed": true
}
35 changes: 35 additions & 0 deletions dynamic_break_messages/plugin.py
@@ -0,0 +1,35 @@
"""
Display dynamic messages during breaks.
"""

import logging

message_file_path = None
break_message = None

def init(ctx, safeeyes_config, plugin_config):
"""
Initialize the plugin.
"""
logging.debug('Initialize Dynamic Break Messages plugin')
global message_file_path
message_file_path = plugin_config['message_file_path']

def get_widget_title(break_obj):
"""
Return the widget title. This could be a fixed title or based on the message content.
"""
return 'Message:'

def get_widget_content(break_obj):
"""
Return the dynamic break message.
"""
global break_message
try:
with open(message_file_path, 'r') as file:
break_message = file.read().strip()
except Exception as e:
logging.error(f"Error reading break message: {e}")
break_message = None
return break_message
45 changes: 45 additions & 0 deletions intelligent_dynamic_break_messages/Readme.md
@@ -0,0 +1,45 @@
# Intelligent Dynamic Break Messages

This Safe Eyes plugin displays dynamic break messages during breaks. It is a more capable version of the Dynamic Break Messages plugin: it can generate messages on the fly using [Ollama](http://ollama.ai/) (a local LLM runner) or read them from a specified file.

## Installation

1. Ensure Ollama is installed and a model is downloaded (e.g., install tinyllama with `ollama run tinyllama:latest`, or find other models such as mistral in the [Ollama library](https://ollama.com/library)).
2. Install the plugin into Safe Eyes by placing it in `~/.config/safeeyes/plugins/intelligent_dynamic_break_messages`.
3. Configure the plugin through the Safe Eyes settings, or add the following entry to the Safe Eyes config at `~/.config/safeeyes/safeeyes.json`:

```
"plugins": [

...

{
"enabled": true,
"id": "intelligent_dynamic_break_messages",
"settings": {
"message_file_path": "/tmp/safeeyes_message.txt",
"ollama_model": "dolphin-mistral:v2.1",
"ollama_prompt": "Tell me a very short interesting fact.",
"ollama_system_prompt": "You are a helpful assistant.",
"use_ollama": "true"
},
"version": "0.1"
}
]
```

## Configuration

- `use_ollama`: Set to `true` to generate messages with Ollama; otherwise messages are read from the specified file.
- `ollama_model`: The Ollama model to use (default: **tinyllama:latest**).
- `ollama_prompt`: The prompt sent to the LLM.
- `ollama_system_prompt`: The system prompt sent to the LLM.
- `message_file_path`: Path of the file to read break messages from when not using Ollama.

If any of the above settings is misconfigured, the plugin simply falls back to showing "Enjoy your break!".
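
Before enabling `use_ollama`, it can help to verify that the local Ollama server answers on the endpoint the plugin calls. A minimal sketch, assuming the default port 11434 and the tinyllama model from step 1:

```
# Connectivity check against Ollama's OpenAI-compatible chat endpoint,
# the same URL the plugin uses internally.
import requests

resp = requests.post(
    'http://localhost:11434/v1/chat/completions',
    json={
        'model': 'tinyllama:latest',
        'messages': [{'role': 'user', 'content': 'Tell me a very short interesting fact.'}],
    },
    timeout=60,
)
print(resp.json()['choices'][0]['message']['content'])
```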

## Dependencies

- The `ollama` command with a running local Ollama server (when `use_ollama` is `true`).
- The `requests` Python module.

Enjoy more personalized and intelligent break messages with Safe Eyes!
47 changes: 47 additions & 0 deletions intelligent_dynamic_break_messages/config.json
@@ -0,0 +1,47 @@
{
    "meta": {
        "name": "Intelligent Dynamic Break Messages",
        "description": "Display dynamic messages during breaks, optionally using Ollama",
        "version": "0.1"
    },
    "dependencies": {
        "python_modules": ["requests"],
        "shell_commands": ["ollama"],
        "operating_systems": [],
        "desktop_environments": [],
        "resources": []
    },
    "settings": [
        {
            "id": "use_ollama",
            "label": "Generate messages dynamically with Ollama",
            "type": "TEXT",
            "default": "false"
        },
        {
            "id": "message_file_path",
            "label": "Path to the message file",
            "type": "TEXT",
            "default": "/tmp/safeeyes_message.txt"
        },
        {
            "id": "ollama_model",
            "label": "Ollama model to use",
            "type": "TEXT",
            "default": "tinyllama:latest"
        },
        {
            "id": "ollama_prompt",
            "label": "Prompt for Ollama",
            "type": "TEXT",
            "default": "Tell me a very short interesting fact."
        },
        {
            "id": "ollama_system_prompt",
            "label": "System prompt for Ollama",
            "type": "TEXT",
            "default": "You are a helpful assistant."
        }
    ],
    "break_override_allowed": true
}
99 changes: 99 additions & 0 deletions intelligent_dynamic_break_messages/plugin.py
@@ -0,0 +1,99 @@
"""
Intelligent Dynamic Break Messages for Safe Eyes.
"""

import logging
import requests
import json

# Configuration parameters
use_ollama = False
message_file_path = None

def init(ctx, safeeyes_config, plugin_config):
"""
Initialize the plugin.
"""

logging.debug('Initialize Intelligent Dynamic Break Messages plugin')
global use_ollama, message_file_path, ollama_model, ollama_prompt, ollama_system_prompt
use_ollama = bool(plugin_config['use_ollama'])
message_file_path = plugin_config['message_file_path']

ollama_model = plugin_config['ollama_model']
ollama_prompt = plugin_config['ollama_prompt']
ollama_system_prompt = plugin_config['ollama_system_prompt']

def get_ollama_response(model="tinyllama:latest",
prompt="Tell me a very short interesting fact.",
system_prompt= "You are a helpful assistant.",
output_parser=None,
max_attempts=10,
max_msg_length=400,
min_msg_length=20):
"""
Generate a message using Ollama.
"""
for attempt in range(max_attempts):
data = {
"model": model,
"messages": [
{"role": "system", "content": system_prompt},
{"role": "user", "content": prompt}
]
}
response = requests.post('http://localhost:11434/v1/chat/completions', headers={'Content-Type': 'application/json'}, json=data)
response_data = response.json()
if 'choices' in response_data and len(response_data['choices']) > 0:
message = response_data['choices'][0]['message']['content']

if output_parser != None:
processed_message = output_parser(message)
# Check if the processed message is under the desired character limit
if len(processed_message) <= max_msg_length and len(processed_message) > min_msg_length:
return processed_message
else:
max_msg_length += max_msg_length/20
else:
return message
else:
return None

# if 'choices' in response_data and len(response_data['choices']) > 0:
# return response_data['choices'][0]['message']['content']
# return None

def get_widget_title(break_obj):
"""
Return the widget title.
"""
return 'Fun Fact!'

def parse_tinyllama_output(message):
# Split the message by colon and new lines and extract the main fact
message = message.strip()
parts = message.split('\n\n', 1)
if len(parts) >= 1:
fact_part = parts[0].split(':',1)[1] if ':' in parts[0] else parts[0]
return fact_part.split('\n')[0].strip()
return message

def get_widget_content(break_obj):
"""
Generate or read the break message based on configuration.
"""

if use_ollama:
if ollama_model == "tinyllama:latest": # Implement custom parser for tinyllama:latest
output_parser = parse_tinyllama_output
else:
output_parser = None
output = get_ollama_response(model=ollama_model, prompt=ollama_prompt, system_prompt=ollama_system_prompt, output_parser=output_parser)
return output or "Enjoy your break!"
else:
try:
with open(message_file_path, 'r') as file:
return file.read().strip()
except Exception as e:
logging.error(f"Error reading break message: {e}")
return "Enjoy your break!"