
lumen


A command-line tool that uses AI to streamline your git workflow - from generating commit messages to explaining complex changes, all without requiring an API key.


Features 🔅

  • Smart Commit Messages: Generate conventional commit messages for your staged changes
  • Git History Insights: Understand what changed in any commit, branch, or your current work
  • Interactive Search: Find and explore commits using fuzzy search
  • Change Analysis: Ask questions about specific changes and their impact
  • Zero Config: Works instantly without an API key, using Phind by default
  • Flexible: Works with any git workflow and supports multiple AI providers
  • Rich Output: Markdown support for readable explanations and diffs (requires: mdcat)

Getting Started 🔅

Prerequisites

Before you begin, ensure you have:

  1. git installed on your system
  2. fzf (optional) - Required for the lumen list command
  3. mdcat (optional) - Required for pretty output formatting
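
To verify they are available on your PATH, each tool responds to the standard version flag:

git --version
fzf --version       # optional - needed for lumen list
mdcat --version     # optional - needed for pretty output formatting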

Installation

Using Homebrew (macOS and Linux)

brew install jnsahaj/lumen/lumen

Using Cargo

Important

Cargo is the package manager for Rust and is installed automatically when you install Rust. See the installation guide.
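
If Rust is not installed yet, the official rustup installer sets up both the Rust toolchain and cargo:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh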

cargo install lumen

Usage 🔅

Generate Commit Messages

Create meaningful commit messages for your staged changes:

# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"

# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"

Explain Changes

Understand what changed and why:

# Explain current changes in your working directory
lumen explain --diff                  # All changes
lumen explain --diff --staged         # Only staged changes

# Explain specific commits
lumen explain HEAD                    # Latest commit
lumen explain abc123f                 # Specific commit
lumen explain HEAD~3..HEAD            # Last 3 commits
lumen explain main..feature/A         # Branch comparison
lumen explain main...feature/A        # Branch comparison (merge base)

# Ask specific questions about changes
lumen explain --diff --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"

Interactive Mode

# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen list

Tips & Tricks

# Copy commit message to clipboard
lumen draft | pbcopy                  # macOS
lumen draft | xclip -selection c      # Linux

# View the commit message and copy it
lumen draft | tee >(pbcopy)

# Open in your favorite editor
lumen draft | code -      

# Directly commit using the generated message
lumen draft | git commit -F -           
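
If you commit this way often, a git alias keeps it to a single short command (the alias name "ai" here is just an example):

# One-time setup: the leading "!" lets the alias run a shell pipeline
git config --global alias.ai '!lumen draft | git commit -F -'

# Afterwards, commit staged changes with a generated message
git ai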

If you are using lazygit, you can add this to your user config:

customCommands:
  - key: '<c-l>'
    context: 'files'
    command: 'lumen draft | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
  - key: '<c-k>'
    context: 'files'
    command: 'lumen draft -c {{.Form.Context | quote}} | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
    prompts:
      - type: 'input'
        title: 'Context'
        key: 'Context'

AI Providers 🔅

Configure your preferred AI provider:

# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-4o" draft

# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-4o"

Supported Providers

| Provider | API Key Required | Models |
| --- | --- | --- |
| Phind (phind, default) | No | Phind-70B |
| Groq (groq) | Yes (free) | llama2-70b-4096, mixtral-8x7b-32768 (default: mixtral-8x7b-32768) |
| OpenAI (openai) | Yes | gpt-4o, gpt-4o-mini, gpt-4, gpt-3.5-turbo (default: gpt-4o-mini) |
| Claude (claude) | Yes | see list (default: claude-3-5-sonnet-20241022) |
| Ollama (ollama) | No (local) | see list (required) |
| OpenRouter (openrouter) | Yes | see list (default: anthropic/claude-3.5-sonnet) |
| DeepSeek (deepseek) | Yes | deepseek-chat, deepseek-reasoner (default: deepseek-reasoner) |
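
For example, combining the flags shown above with a free Groq key might look like this (the key value is a placeholder):

lumen -p groq -k "your-groq-api-key" draft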

Advanced Configuration 🔅

Configuration File

Create a lumen.config.json at your project root or specify a custom path with --config:

{
  "provider": "openai",
  "model": "gpt-4o",
  "api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "draft": {
    "commit_types": {
      "docs": "Documentation only changes",
      "style": "Changes that do not affect the meaning of the code",
      "refactor": "A code change that neither fixes a bug nor adds a feature",
      "perf": "A code change that improves performance",
      "test": "Adding missing tests or correcting existing tests",
      "build": "Changes that affect the build system or external dependencies",
      "ci": "Changes to our CI configuration files and scripts",
      "chore": "Other changes that don't modify src or test files",
      "revert": "Reverts a previous commit",
      "feat": "A new feature",
      "fix": "A bug fix"
    }
  }
}
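
A lumen.config.json at the project root is picked up automatically; a file stored elsewhere can be passed explicitly (the path below is only an example):

lumen --config ./config/lumen.config.json draft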

Configuration Precedence

Options are applied in the following order (highest to lowest priority):

  1. CLI Flags
  2. Configuration File
  3. Environment Variables
  4. Default options

Example: Using different providers for different projects:

# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-4o"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"

# Override per project using config file
{
  "provider": "ollama",
  "model": "llama3.2"
}

# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft