Additions to the Local AI Package #1
Comments
These are all great suggestions, thank you so much @mayphilc! Love the title for your issue too haha
Adding this to a new project board I'm about to create for the local AI package 👍
You can look into using Deployarr to launch the initial proxy and forward auth with Traefik. It saves a lot of headache and setup, and since it uses includes, you can just add to what it sets up.
So I've been talking to the gentleman who created Deployarr (previously known as auto-traefik), and he's going to add about 20 AI-related apps (I gave him the list) to the next release, Deployarr v5.7.
Just finished adding Langfuse; going to work on Supabase next. @coleam00, we should talk. I'd be more than happy to show you my stack so you can work on yours.
Nice work @mayphilc! Yes, I'd love to chat and see everything you have put together! Feel free to email me at [email protected] :)
Hey bro, I'm so glad you added Supabase and kept Qdrant. If you don't mind, I'd like to make some suggestions:
Traefik for reverse proxy - manages local self-signed SSL and subdomains (n8n.yourlocaldomain.you)
Authentik - Auth and SSO for the stack
Lissy Dashy - central hub UI portal for the other services
Dokploy - self hosted app launch platform (your own personal local "github" for all your software projects)
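As a rough illustration of the Traefik suggestion above, here is a minimal compose sketch routing n8n to a local subdomain. The domain, router name, and port are illustrative, and a real setup would still need the self-signed certificate wired into Traefik's TLS configuration:

```yaml
# Sketch only: Traefik routing n8n to n8n.yourlocaldomain.you.
# Names and the domain are illustrative, not a definitive setup.
services:
  traefik:
    image: traefik:v3.0
    command:
      - --providers.docker=true                 # discover services via labels
      - --entrypoints.websecure.address=:443
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  n8n:
    image: n8nio/n8n
    labels:
      - traefik.enable=true
      - traefik.http.routers.n8n.rule=Host(`n8n.yourlocaldomain.you`)
      - traefik.http.routers.n8n.entrypoints=websecure
      - traefik.http.routers.n8n.tls=true       # serve the local cert here
```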
Set Flowise and n8n to queue mode:
n8n-main
n8n-worker
n8n-webhook-processor
flowise-main
flowise-worker
Redis for process cache and process queue management
RedisInsight to visualize process queues
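A minimal sketch of what the n8n side of this queue-mode split could look like in compose. `EXECUTIONS_MODE=queue` and the `QUEUE_BULL_REDIS_*` variables are n8n's queue-mode settings; service names, ports, and the Redis image tag are illustrative:

```yaml
# Sketch only: n8n main + worker + webhook processor backed by Redis.
services:
  redis:
    image: redis:7-alpine

  n8n-main:
    image: n8nio/n8n
    environment:
      - EXECUTIONS_MODE=queue          # hand executions to the Redis queue
      - QUEUE_BULL_REDIS_HOST=redis
      - QUEUE_BULL_REDIS_PORT=6379
    ports:
      - "5678:5678"
    depends_on: [redis]

  n8n-worker:
    image: n8nio/n8n
    command: worker                    # pulls executions from the queue
    environment:
      - EXECUTIONS_MODE=queue
      - QUEUE_BULL_REDIS_HOST=redis
    depends_on: [redis]

  n8n-webhook:
    image: n8nio/n8n
    command: webhook                   # handles incoming webhook traffic only
    environment:
      - EXECUTIONS_MODE=queue
      - QUEUE_BULL_REDIS_HOST=redis
    depends_on: [redis]
```

Scaling is then a matter of adding worker replicas rather than scaling the main instance.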
Browser-use webui
Crawl4ai
Perplexica + SearXNG
Bolt.diy
Openhands
ComfyUI
Lastly, you can set a variable in the .env file that tells Compose whether to use Ollama on CPU or GPU, instead of launching the stack with a -gpu flag.
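One way this could work is with Compose profiles selected from .env via `COMPOSE_PROFILES` (a standard Compose variable). The profile and service names below are illustrative:

```yaml
# Sketch only: set COMPOSE_PROFILES=cpu or COMPOSE_PROFILES=gpu-nvidia
# in .env, then `docker compose up` starts only the matching variant.
services:
  ollama-cpu:
    image: ollama/ollama
    profiles: ["cpu"]

  ollama-gpu:
    image: ollama/ollama
    profiles: ["gpu-nvidia"]
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```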
I have so many more in my plan, but I think this outlines a perfect baseline self-hosted AI stack when added to your project.
FYI: the only downside to Supabase is that the community edition is only capable of single-tenant, single-thread operations. While it can queue serial processes, it cannot execute parallel operations. Also, you are limited to a single "project" per Supabase stack, so even though you can create new tables and schemas for various services in the same Supabase, you can't have multiple separate databases isolated from each other in the same Supabase stack. The solution is to set up sub-stacks on isolated Docker networks bridged to the main network, one for each separate database you need.
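The sub-stack pattern described above might be sketched like this, with each database on its own isolated network and only the consuming service also attached to the shared main network. Network, service, and image names are illustrative placeholders:

```yaml
# Sketch only: one isolated network per database sub-stack,
# bridged to the main network through the consuming app.
networks:
  main:
    external: true            # shared network created by the main stack
  substack_a:
    internal: true            # isolated: no direct external access

services:
  db-a:
    image: supabase/postgres  # one database per sub-stack
    networks: [substack_a]

  app-a:
    image: my-app             # placeholder for whatever consumes db-a
    networks: [substack_a, main]
```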
Like you, I also use Docker Desktop on Windows with WSL2 integration. For this I got my free 3-node Portainer BE license and set it up in WSL, which gives me fine-grained stack and Docker network controls.