Please check out OpenOps, the new framework for open source AI-enhanced collaboration with Mattermost.
The Mattermost AI framework offers an open source, self-managed solution for organizations with strict security requirements to explore generative AI enhancements while maintaining full data control and avoiding lock-in to vendor platforms. Benefits include:
- Fully-featured chat-based collaboration, including 1:1 and group messaging across web, desktop, and mobile, with file and media sharing, search, integrations, custom emojis and emoji reactions, syntax highlighting, and custom rendering.
- Conversational AI bot that can be added to channels, where it can be engaged like a human user to respond to questions and requests. It is powered by different LLMs that can be downloaded and run as part of the framework, including models from the Hugging Face AI community.
- Discussion summarization that produces a concise summary of threaded discussions using LLMs, without any data leaving the self-hosted system.
- Scalable AI model framework that can scale up to deploy on a private cloud or data center using large, powerful open source LLMs for group work, or scale down to run on a commodity laptop for individual developers to prototype and explore LLM capabilities, without the specialized hardware that typical AI models require.
- Conforming security and compliance platform that can accommodate a broad range of custom security and compliance requirements. With a fully open source and transparent stack, enterprises can scan and evaluate any portion of the platform, monitor and secure all incoming and outgoing network traffic, and deploy to restricted networks.
Example: Watch a minute-long demo of discussion summarization using a fully open source, self-hosted AI/LLM platform:
This project is a framework for a self-hosted AI app in a multi-user chat environment that can be fully private and off-grid (air-gapped). Check out the demo from May 15, 2023.
This framework uses a locally-deployed Mattermost app to interface with a variety of LLM AIs. It currently supports local LLMs hosted via Serge, a wrapper around llama.cpp that can run LLMs without a GPU.
This framework consists of three local components:
- Mattermost
- Serge
- `ai-bot`, a Mattermost app inside the `./ai-bot` folder

`ai-bot` routes communication between the Mattermost and Serge servers via a REST API.
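To make the relay role of `ai-bot` concrete, here is a minimal Python sketch of the pattern: take a question from Mattermost, forward it to the Serge REST API, and return the model's reply. The endpoint path, payload shape, and in-network hostname are illustrative assumptions, not Serge's actual API; the real `ai-bot` code in `./ai-bot` is authoritative.

```python
# Sketch of the relay pattern: Mattermost question -> Serge -> answer.
# Endpoint paths and payload fields below are illustrative assumptions.
import json
import urllib.request

SERGE_URL = "http://serge:8008"  # hypothetical in-network address


def build_prompt_payload(question: str, model: str = "GPT4All") -> dict:
    """Shape a request body for a (hypothetical) Serge chat endpoint."""
    return {"model": model, "prompt": question}


def extract_answer(response_body: dict) -> str:
    """Pull the generated text out of a (hypothetical) Serge response."""
    return response_body.get("answer", "").strip()


def ask(question: str) -> str:
    """Round-trip a question to Serge and return the model's answer."""
    payload = json.dumps(build_prompt_payload(question)).encode()
    req = urllib.request.Request(
        f"{SERGE_URL}/api/chat",  # illustrative path, not Serge's real route
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_answer(json.load(resp))
```

The pure helpers (`build_prompt_payload`, `extract_answer`) carry the request/response shaping, so the HTTP round-trip in `ask` stays a thin wrapper.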
You will need Docker installed with Docker Compose. This repository should work on a 16 GB M1 MacBook.
- Clone and enter this repository:
git clone https://github.com/mattermost/mattermost-ai-framework && cd mattermost-ai-framework
- Start the services:
docker compose up -d
- Download a Serge model (e.g., GPT4All):
  - Open Serge at `http://localhost:8008`
  - Select Download Models
  - Download GPT4All and wait for it to finish
- Access Mattermost:
  - Open Mattermost at `http://localhost:8065`
  - Select View in Browser
  - Create your local account and team
- Install the `ai-bot` Mattermost app:
  - In any Mattermost channel, use this slash command: `/apps install http http://ai-bot:9000/manifest.json`
  - Accept the permissions in the modal
  - Select Submit
- If unable to complete the above steps, try restarting the app service first: `docker restart ai-bot`
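The `docker compose up -d` step above brings up all three components together. As a rough sketch of how such a compose file might wire them (service names, images, and ports here are assumptions; the repository's actual `docker-compose.yml` is authoritative):

```yaml
services:
  mattermost:
    image: mattermost/mattermost-team-edition   # assumed image name
    ports:
      - "8065:8065"
  serge:
    image: ghcr.io/serge-chat/serge             # assumed image name
    ports:
      - "8008:8008"
  ai-bot:
    build: ./ai-bot                             # the app in this repository
    ports:
      - "9000:9000"
```

Because the services share a compose network, `ai-bot` can reach the others by service name, which is why the install command references `http://ai-bot:9000`.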
- Select the above badge to start your Gitpod workspace
- The workspace will configure itself automatically. Wait for the services to start and for your `root` login for Mattermost to be generated in the terminal
- Download a Serge model (e.g., GPT4All):
  - Check for blocked pop-ups, or open Serge on the Ports tab
  - Select Download Models
  - Download GPT4All and wait for it to finish
- Access Mattermost and log in with the generated `root` credentials
- Install the `ai-bot` Mattermost app:
  - In any Mattermost channel, use this slash command: `/apps install http http://ai-bot:9000/manifest.json`
  - Accept the permissions in the modal
  - Select Submit
- If unable to complete the above steps, try restarting the app service first: `docker restart ai-bot`
You're now ready to use the example `ai-bot`! 🎉

In any channel, you can now ask `ai-bot` questions with the `/ai ask` slash command. For example:

`/ai ask "Write a haiku about perseverance"`
`/ai ask "Why is open source important?"`
`/ai ask "When were pterodactyls alive?"`
*(Screenshot: an `/ai ask` slash command and the bot's response.)*
To summarize threads, first grant the bot account access to public channels:
- Open the top left Mattermost menu button (9 squares) and select Integrations
- Select Bot Accounts, then Edit for `ai-bot`
- Check the box for post:channels (the bot will have access to post to all Mattermost public channels)
Now, open the message app menu button (4 squares) on any post in a public channel and select Summarize (AI). You can watch a brief demo of this functionality here.
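Under the hood, summarization amounts to flattening a thread's posts into a single prompt for the local LLM. A minimal sketch of that step, assuming a simple transcript format and prompt wording (both illustrative; `ai-bot`'s actual prompt construction may differ):

```python
# Sketch: flatten a Mattermost thread into one summarization prompt.
# The transcript format and prompt wording are illustrative assumptions.
def build_summary_prompt(thread: list[dict]) -> str:
    """thread: list of {"user": ..., "message": ...} dicts, oldest first."""
    transcript = "\n".join(f'{p["user"]}: {p["message"]}' for p in thread)
    return (
        "Summarize the following discussion in a few concise sentences:\n\n"
        + transcript
    )
```

Because the prompt is built and answered entirely inside the self-hosted stack, the thread contents never leave the system, which is the property the summarization feature above depends on.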
*(Screenshot: the message app menu button and the resulting AI summary.)*
- OpenOps General Discussion on Mattermost Forum
- OpenOps Troubleshooting Discussion on Mattermost Forum
- OpenOps Q&A on Mattermost Forum
- OpenOps "AI Exchange" channel on Mattermost Community server (for Mattermost community interested in AI)
- OpenOps Discord Server (for AI community interested in Mattermost)
- Mattermost Troubleshooting Discussion on Mattermost Forum
- Mattermost "Peer-to-peer Help" channel on Mattermost Community server
Thank you for your interest in contributing! We’re glad you’re here! ❤️ We recommend reading Mattermost's contributor guide to learn more about our community!
This repository is licensed under the Apache License 2.0.