Twinny is a free AI extension for Visual Studio Code, offering powerful AI-assisted coding features. It supports the following API providers:
- Localhost OpenAI/Ollama-compatible API (default)
- Anthropic
- OpenAI
- Mistral AI
- Perplexity
- Groq
- OpenRouter
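Because Twinny targets the OpenAI API standard, the providers listed above all accept the same chat request shape. A minimal sketch of that payload follows; the model name, streaming flag, and endpoint mentioned in the comments are illustrative assumptions, not Twinny's exact defaults:

```python
import json

def build_chat_request(model: str, prompt: str) -> str:
    """Serialize a minimal OpenAI-style chat completion request body."""
    payload = {
        "model": model,  # e.g. a local Ollama model tag (assumption)
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # stream tokens back as they are generated
    }
    return json.dumps(payload)

# An OpenAI-compatible server (for example, Ollama listening at
# http://localhost:11434/v1/chat/completions) would accept this body as-is.
body = build_chat_request("codellama", "Explain this function")
```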
The chat functionality now uses the token.js library. If you experience any issues:
- Reset and reconfigure your provider
- If problems persist, please open a GitHub issue
For a quick start guide, visit our documentation.
Twinny provides AI-powered real-time code suggestions to enhance your coding experience.
Use the sidebar to discuss your code with AI, getting explanations, tests, refactoring suggestions, and more. Key features include:
- Online and offline operation
- Customizable API endpoints
- Preserved chat conversations
- OpenAI API standard compliance
- Single and multiline fill-in-the-middle completions
- Customizable prompt templates
- Git commit message generation
- Easy installation via VS Code marketplace
- Configurable settings (API provider, model, port, path)
- Direct code solution acceptance
- New document creation from code blocks
- Side-by-side diff view
- Full-screen chat mode
- Code solution block copying
- Workspace embeddings for context-aware assistance
- Symmetry network integration for P2P AI inference
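The configurable settings mentioned above (API provider, model, port, path) live in VS Code's settings.json. A sketch of what such a configuration might look like follows; the key names and values are assumptions for illustration, so check the extension's Settings UI for the exact identifiers:

```json
{
  "twinny.apiProvider": "ollama",
  "twinny.apiHostname": "localhost",
  "twinny.apiPort": 11434,
  "twinny.apiPath": "/v1/chat/completions",
  "twinny.modelName": "codellama"
}
```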
Twinny uses workspace embeddings to provide context-aware AI assistance, improving the relevance of suggestions.
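One way such context retrieval typically works (an illustrative sketch, not Twinny's actual implementation): workspace files are embedded as vectors, and the files whose embeddings are most similar to the query embedding are surfaced as context for the model.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_by_similarity(query_vec, doc_vecs):
    """Return document indices ordered from most to least similar to the query."""
    scores = [(cosine_similarity(query_vec, v), i) for i, v in enumerate(doc_vecs)]
    return [i for _, i in sorted(scores, reverse=True)]

# Toy 2-D vectors standing in for real embeddings of workspace files.
docs = [[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]]
ranking = rank_by_similarity([1.0, 0.1], docs)  # most relevant file first
```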
Symmetry is a decentralized P2P network for sharing AI inference resources, extending Twinny's capabilities.
For troubleshooting and known issues, please check our GitHub issues page.
We welcome contributions! Reach out to us on Twitter, describe your proposed changes in a GitHub issue, and then submit a pull request. Twinny is MIT licensed.
Twinny is free and open-source. If you'd like to support the project, donations are appreciated:
Bitcoin: 1PVavNkMmBmUz8nRYdnVXiTgXrAyaxfehj
For updates, follow us on Twitter: https://x.com/twinnydotdev
Twinny is actively developed and provided "as is". Functionality may vary between updates.