Cleanup of package structure for comprehensibility. (#281)
* Cleanup and refactor package structure

* docs: Add comprehensive documentation for Tool struct

* Some renames

* feat: Improve README with comprehensive plugin overview and setup instructions

* docs: Improve README with configuration, development, and release sections

* Improve comment

* Fix merge issue

* Improve readme

* README fixes

* docs: Update system requirements and prerequisites versions

* Some cleanup

* Merge fixes.
crspeller authored Jan 21, 2025
1 parent 4f8a453 commit 07c7319
Showing 57 changed files with 1,008 additions and 1,024 deletions.
107 changes: 51 additions & 56 deletions README.md
````diff
@@ -1,84 +1,79 @@
-# Mattermost Copilot Plugin [![Download Latest Master Build](https://img.shields.io/badge/Download-Latest%20Master%20Build-blue)](https://github.com/mattermost/mattermost-plugin-ai/releases/tag/latest-master)
-
-> Mattermost plugin for local and third-party LLMs
-
-![The Mattermost Copilot AI Plugin is an extension for mattermost that provides functionality for local and third-party LLMs](https://github.com/mattermost/mattermost-plugin-ai/assets/2040554/6a787ff6-013d-4492-90ce-54aa7a292a4a)
-
-<!-- omit from toc -->
-## Table of Contents
-
-- [Background](#background)
-- [Contributing](#contributing)
-- [License](#license)
-
-## Background
-
-The Mattermost Copilot Plugin adds functionality for local (self-hosted) and third-party (vendor-hosted) LLMs within Mattermost v9.6 and above. This plugin is currently experimental.
-
-Contributions and suggestions are welcome. See the [Contributing](#contributing) section for more details!
-
-Join the discussion in the [~AI-Exchange channel](https://community.mattermost.com/core/channels/ai-exchange) and explore the [Discourse forum](https://forum.mattermost.com/c/openops-ai/40). 💬
-
-## Install
-
-We recommend using Mattermost Server v9.6 or later for the best experience. Compatible Mattermost server versions include:
-
-- v9.6 or later
-- v9.5.2+ ([ESR](https://docs.mattermost.com/deploy/mattermost-changelog.html#release-v9-5-extended-support-release))
-- v9.4.4+
-- v9.3.3+
-- v8.1.11+ ([ESR](https://docs.mattermost.com/deploy/mattermost-changelog.html))
-
-See the [Mattermost Product Documentation](https://docs.mattermost.com/configure/enable-copilot.html) for details on installing, configuring, enabling, and using this Mattermost integration.
-
-**Note**: Installation instructions assume you already have [Mattermost Server](https://mattermost.com/download/) installed and configured with [PostgreSQL](https://www.postgresql.org/).
-
-## How to Release
-
-To trigger a release, follow these steps:
-
-1. **For Patch Release:** Run the following command:
-   ```
-   make patch
-   ```
-   This will release a patch change.
-
-2. **For Minor Release:** Run the following command:
-   ```
-   make minor
-   ```
-   This will release a minor change.
-
-3. **For Major Release:** Run the following command:
-   ```
-   make major
-   ```
-   This will release a major change.
-
-4. **For Patch Release Candidate (RC):** Run the following command:
-   ```
-   make patch-rc
-   ```
-   This will release a patch release candidate.
-
-5. **For Minor Release Candidate (RC):** Run the following command:
-   ```
-   make minor-rc
-   ```
-   This will release a minor release candidate.
-
-6. **For Major Release Candidate (RC):** Run the following command:
-   ```
-   make major-rc
-   ```
-   This will release a major release candidate.
-
+<div align="center">
+
+# Mattermost Copilot Plugin [![Download Latest Master Build](https://img.shields.io/badge/Download-Latest%20Master%20Build-blue)](https://github.com/mattermost/mattermost-plugin-ai/releases/tag/latest-master)
+
+The Mattermost Copilot Plugin integrates AI capabilities directly into your [Mattermost](https://github.com/mattermost/mattermost) workspace, supporting both self-hosted and vendor-hosted Large Language Models (LLMs).
+
+</div>
+
+![The Mattermost Copilot AI Plugin is an extension for mattermost that provides functionality for self-hosted and vendor-hosted LLMs](img/mattermost-ai-llm-access.webp)
+
+## Installation
+
+1. Download the latest release from the [releases page](https://github.com/mattermost/mattermost-plugin-ai/releases). You can also download the **experimental** [latest master](https://github.com/mattermost/mattermost-plugin-ai/releases/tag/latest-master)
+2. Upload and enable the plugin through the Mattermost System Console
+3. Configure your desired LLM provider settings
+
+More details on the [Mattermost documentation site](https://docs.mattermost.com/configure/enable-copilot.html)
+
+### System Requirements
+
+- Mattermost Server versions:
+  - v10.0 or later recommended
+  - v9.11+ (ESR)
+- PostgreSQL database
+- Network access to your chosen LLM provider
+
+## Configuration
+
+After installation, you'll need to configure the plugin through the System Console:
+
+1. Navigate to **System Console > Plugins > Copilot**
+2. Create a bot
+3. Select and setup an upstream provider
+4. Check it's working in the copilot RHS
+
+For detailed configuration instructions, see the [Mattermost Product Documentation](https://docs.mattermost.com/configure/enable-copilot.html#mattermost-configuration).
+
+## Development
+
+### Prerequisites
+
+- Go 1.22+
+- Node.js 20.11+
+- Access to an LLM provider (OpenAI, Anthropic, etc.)
+
+### Local Setup
+
+1. Setup your Mattermost development environment by following the [Mattermost developer setup guide](https://developers.mattermost.com/contribute/server/developer-setup/). If you have a remote mattermost server you want to develop to you can skip this step.
+
+2. Setup your Mattermost plugin development environment by following the [Plugin Developer setup guide](https://developers.mattermost.com/integrate/plugins/developer-setup/).
+
+3. Clone the repository:
+   ```bash
+   git clone https://github.com/mattermost/mattermost-plugin-ai.git
+   cd mattermost-plugin-ai
+   ```
+
+4. **Optional**. If you are developing to a remote server, setup environment variables to deploy:
+   ```bash
+   MM_SERVICESETTINGS_SITEURL=http://localhost:8065
+   MM_ADMIN_USERNAME=<YOUR_USERNAME>
+   MM_ADMIN_PASSWORD=<YOUR_PASSWORD>
+   ```
+
+5. Run deploy to build the plugin
+   ```bash
+   make deploy
+   ```
+
+### Other make commands
+
+- Run `make help` for a list of all make commands
+- Run `make check-style` to verify code style
+- Run `make test` to run the test suite
+- Run `make e2e` to run the e2e tests
+
 ## Contributing
 
 Interested in contributing to our open source project? Start by reviewing the [contributor guidelines](./.github/CONTRIBUTING.md) for this repository. See the [Developer Setup Guide](docs/developer-setup-guide.md) for details on setting up a Mattermost instance for development.
 
 ## License
````
20 changes: 0 additions & 20 deletions docs/developer-setup-guide.md

This file was deleted.

Binary file added img/mattermost-ai-llm-access.webp
36 changes: 18 additions & 18 deletions server/ai/anthropic/anthropic.go → server/anthropic/anthropic.go
```diff
@@ -11,7 +11,7 @@ import (
 	anthropicSDK "github.com/anthropics/anthropic-sdk-go"
 	"github.com/anthropics/anthropic-sdk-go/option"
 	"github.com/invopop/jsonschema"
-	"github.com/mattermost/mattermost-plugin-ai/server/ai"
+	"github.com/mattermost/mattermost-plugin-ai/server/llm"
 	"github.com/mattermost/mattermost-plugin-ai/server/metrics"
 )
@@ -26,10 +26,10 @@ type messageState struct {
 	output   chan<- string
 	errChan  chan<- error
 	depth    int
-	config   ai.LLMConfig
-	tools    []ai.Tool
-	resolver func(name string, argsGetter ai.ToolArgumentGetter, context ai.ConversationContext) (string, error)
-	context  ai.ConversationContext
+	config   llm.LanguageModelConfig
+	tools    []llm.Tool
+	resolver func(name string, argsGetter llm.ToolArgumentGetter, context llm.ConversationContext) (string, error)
+	context  llm.ConversationContext
 }
 
 type Anthropic struct {
```
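The `resolver` field renamed above is the hook through which tool calls are dispatched. Below is a rough sketch of a resolver conforming to that signature; the `lookupUser` tool, its argument struct, and the assumption that `llm.ToolArgumentGetter` unmarshals the model-supplied JSON arguments into a destination struct are illustrative, not part of this diff.

```go
package example

import (
	"fmt"

	"github.com/mattermost/mattermost-plugin-ai/server/llm"
)

// exampleResolver dispatches a tool call by name, decodes the
// model-supplied arguments, and returns the tool's textual result.
func exampleResolver(name string, argsGetter llm.ToolArgumentGetter, context llm.ConversationContext) (string, error) {
	switch name {
	case "lookupUser": // hypothetical tool name
		var args struct {
			Username string `json:"username"` // hypothetical argument
		}
		// Assumption: the getter unmarshals the tool-call JSON into args.
		if err := argsGetter(&args); err != nil {
			return "", fmt.Errorf("failed to decode arguments for %q: %w", name, err)
		}
		return "looked up user " + args.Username, nil
	default:
		return "", fmt.Errorf("unknown tool %q", name)
	}
}
```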
```diff
@@ -40,7 +40,7 @@ type Anthropic struct {
 	outputTokenLimit int
 }
 
-func New(llmService ai.ServiceConfig, httpClient *http.Client, metricsService metrics.LLMetrics) *Anthropic {
+func New(llmService llm.ServiceConfig, httpClient *http.Client, metricsService metrics.LLMetrics) *Anthropic {
 	client := anthropicSDK.NewClient(
 		option.WithAPIKey(llmService.APIKey),
 		option.WithHTTPClient(httpClient),
@@ -67,7 +67,7 @@ func isValidImageType(mimeType string) bool {
 }
 
 // conversationToMessages creates a system prompt and a slice of input messages from conversation posts.
-func conversationToMessages(posts []ai.Post) (string, []anthropicSDK.MessageParam) {
+func conversationToMessages(posts []llm.Post) (string, []anthropicSDK.MessageParam) {
 	systemMessage := ""
 	messages := make([]anthropicSDK.MessageParam, 0, len(posts))
 
@@ -86,15 +86,15 @@ func conversationToMessages(posts []ai.Post) (string, []anthropicSDK.MessageParam) {
 
 	for _, post := range posts {
 		switch post.Role {
-		case ai.PostRoleSystem:
+		case llm.PostRoleSystem:
 			systemMessage += post.Message
 			continue
-		case ai.PostRoleBot:
+		case llm.PostRoleBot:
 			if currentRole != "assistant" {
 				flushCurrentMessage()
 				currentRole = "assistant"
 			}
-		case ai.PostRoleUser:
+		case llm.PostRoleUser:
 			if currentRole != "user" {
 				flushCurrentMessage()
 				currentRole = "user"
@@ -147,8 +147,8 @@ func conversationToMessages(posts []ai.Post) (string, []anthropicSDK.MessageParam) {
 	return systemMessage, messages
 }
 
-func (a *Anthropic) GetDefaultConfig() ai.LLMConfig {
-	config := ai.LLMConfig{
+func (a *Anthropic) GetDefaultConfig() llm.LanguageModelConfig {
+	config := llm.LanguageModelConfig{
 		Model: a.defaultModel,
 	}
 	if a.outputTokenLimit == 0 {
@@ -159,7 +159,7 @@ func (a *Anthropic) GetDefaultConfig() ai.LLMConfig {
 	return config
 }
 
-func (a *Anthropic) createConfig(opts []ai.LanguageModelOption) ai.LLMConfig {
+func (a *Anthropic) createConfig(opts []llm.LanguageModelOption) llm.LanguageModelConfig {
 	cfg := a.GetDefaultConfig()
 	for _, opt := range opts {
 		opt(&cfg)
@@ -253,7 +253,7 @@ func (a *Anthropic) streamChatWithTools(state messageState) error {
 	return nil
 }
 
-func (a *Anthropic) ChatCompletion(conversation ai.BotConversation, opts ...ai.LanguageModelOption) (*ai.TextStreamResult, error) {
+func (a *Anthropic) ChatCompletion(conversation llm.BotConversation, opts ...llm.LanguageModelOption) (*llm.TextStreamResult, error) {
 	a.metricsService.IncrementLLMRequests()
 
 	output := make(chan string)
@@ -284,10 +284,10 @@
 			}
 		}
 	}()
 
-	return &ai.TextStreamResult{Stream: output, Err: errChan}, nil
+	return &llm.TextStreamResult{Stream: output, Err: errChan}, nil
 }
 
-func (a *Anthropic) ChatCompletionNoStream(conversation ai.BotConversation, opts ...ai.LanguageModelOption) (string, error) {
+func (a *Anthropic) ChatCompletionNoStream(conversation llm.BotConversation, opts ...llm.LanguageModelOption) (string, error) {
 	// This could perform better if we didn't use the streaming API here, but the complexity is not worth it.
 	result, err := a.ChatCompletion(conversation, opts...)
 	if err != nil {
```
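For orientation, here is a minimal sketch of a caller consuming the streaming API after the rename, based only on the signatures visible in this diff. The client and conversation setup are elided, and the exact channel semantics of `llm.TextStreamResult` (receive direction, closure on completion) are assumptions.

```go
package example

import (
	"fmt"

	"github.com/mattermost/mattermost-plugin-ai/server/anthropic"
	"github.com/mattermost/mattermost-plugin-ai/server/llm"
)

// printCompletion streams a completion to stdout, returning the first
// error the provider reports.
func printCompletion(client *anthropic.Anthropic, conversation llm.BotConversation) error {
	result, err := client.ChatCompletion(conversation)
	if err != nil {
		return err
	}
	for {
		select {
		case chunk, ok := <-result.Stream:
			if !ok {
				return nil // channel closed: the response is complete
			}
			fmt.Print(chunk)
		case streamErr := <-result.Err:
			return streamErr
		}
	}
}
```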
```diff
@@ -300,8 +300,8 @@ func (a *Anthropic) CountTokens(text string) int {
 	return 0
 }
 
-// convertTools converts from ai.Tool to anthropicSDK.Tool format
-func convertTools(tools []ai.Tool) []anthropicSDK.ToolParam {
+// convertTools converts from llm.Tool to anthropicSDK.Tool format
+func convertTools(tools []llm.Tool) []anthropicSDK.ToolParam {
 	converted := make([]anthropicSDK.ToolParam, len(tools))
 	for i, tool := range tools {
 		reflector := jsonschema.Reflector{
```
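`convertTools` builds a `jsonschema.Reflector` per tool to turn a Go argument struct into the JSON Schema the Anthropic SDK expects. Here is a standalone sketch of that reflection step; the argument struct and the reflector option shown are illustrative assumptions, since the diff only shows that a reflector is constructed.

```go
package example

import (
	"github.com/invopop/jsonschema"
)

// searchArgs is a hypothetical tool-argument struct; struct tags become
// schema metadata during reflection.
type searchArgs struct {
	Query string `json:"query" jsonschema:"description=Text to search for"`
}

// argumentSchema reflects the struct into a JSON Schema document.
func argumentSchema() *jsonschema.Schema {
	reflector := jsonschema.Reflector{
		DoNotReference: true, // inline subschemas instead of emitting $ref
	}
	return reflector.Reflect(&searchArgs{})
}
```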