인연
Inyeon
Your Agentic AI Git Companion
“You are Daniel Craig but life owes you a Vesper Lynd?”
Split commits, resolve conflicts, review code, and generate PRs & changelogs — all in one command.
How it works
From diff to commit
Inyeon analyzes your changes and generates meaningful commit messages with AI-powered code review.
@@ -15,8 +15,21 @@ class AuthService:
     def __init__(self, db: Database):
         self.db = db
-        self.secret = "hardcoded"
+        self.secret = os.environ.get("JWT_SECRET")
+        self.session_timeout = 3600
+        self.active_sessions: dict[str, Session] = {}

-    def login(self, user, pwd):
-        if self.db.check(user, pwd):
-            return True
+    def login(self, username: str, password: str) -> str | None:
+        """Authenticate user and create session."""
+        user = self.db.verify_credentials(username, password)
+        if user:
+            session_id = self._create_session(user.id)
+            return session_id
+        return None
+
+    def _create_session(self, user_id: int) -> str:
+        session_id = secrets.token_urlsafe(32)
+        self.active_sessions[session_id] = Session(
+            user_id=user_id,
+            expires_at=time.time() + self.session_timeout
+        )
+        return session_id

feat(auth): implement secure session management

- Replace hardcoded secret with environment variable
- Add configurable session timeout (default: 1 hour)
- Implement session tracking with active_sessions dict
- Add type hints for better code clarity
- Create dedicated session creation method
- Return session ID on successful login

BREAKING CHANGE: login() now returns a session_id string
instead of a boolean

## Security Analysis
✓ Removed hardcoded secret - now uses environment variable
✓ Session tokens use cryptographically secure generation
✓ Session expiration implemented

## Code Quality
✓ Added type hints throughout
✓ Improved method naming (check → verify_credentials)
✓ Added docstring for public method

## Suggestions
Consider adding:
• Rate limiting for login attempts
• Session invalidation on password change
• Logging for authentication events

inyeon commit --staged · Generate a commit from staged changes
inyeon split --staged · Split into atomic commits
inyeon review --staged · Get an AI code review
git diff | inyeon analyze · Analyze any diff
Atomic commit splitting
Intelligently group your staged changes into smaller, logical commits using multi-agent AI and clustering strategies.

Clustering Strategies
DIR · directory · Group by folder structure
SEM · semantic · Group by code similarity
CNV · conventional · Group by commit type
HYB · hybrid · Combine all strategies
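As an illustration of the simplest strategy, here is a minimal Python sketch of directory-based grouping. The function name and shape are hypothetical, not Inyeon's actual implementation:

```python
from collections import defaultdict
from pathlib import PurePosixPath

def group_by_directory(changed_files):
    """Group changed file paths by top-level directory (hypothetical
    sketch of the 'directory' strategy; root-level files go to '.')."""
    groups = defaultdict(list)
    for path in changed_files:
        parts = PurePosixPath(path).parts
        key = parts[0] if len(parts) > 1 else "."
        groups[key].append(path)
    return dict(groups)

print(group_by_directory(
    ["src/auth.py", "src/db.py", "tests/test_auth.py", "README.md"]
))
# → {'src': ['src/auth.py', 'src/db.py'], 'tests': ['tests/test_auth.py'], '.': ['README.md']}
```

The semantic and hybrid strategies would add embedding-based similarity on top of path heuristics like this one.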
inyeon split --staged --preview · Preview how changes will be split
inyeon split --staged --interactive · Approve each commit individually
inyeon split --staged --execute · Auto-commit all groups
inyeon split --staged --strategy semantic · Use a specific clustering strategy
Full workflow automation
One command runs the entire pipeline — split, commit, review, and generate a PR description automatically.
Split · Changes grouped into atomic commits by AI clustering
Commit · Conventional commit messages generated and applied
Review · Security, quality, and pattern insights surfaced
PR · Pull request description drafted and ready to paste
Cost-optimized short-circuits
Skips split for single-file changes · Skips review for small diffs (<500 chars) · As few as 2 LLM calls for simple changes
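The short-circuit rules above can be sketched in a few lines of Python. Function and stage names are illustrative, not Inyeon's internals:

```python
def plan_stages(num_files, diff_text, review_threshold=500):
    """Decide which pipeline stages to run. Mirrors the short-circuits
    described above: single-file changes skip splitting, and diffs
    under the threshold skip review (illustrative sketch)."""
    stages = []
    if num_files > 1:           # single-file change: nothing to split
        stages.append("split")
    stages.append("commit")
    if len(diff_text) >= review_threshold:  # small diff: skip review
        stages.append("review")
    stages.append("pr")
    return stages

print(plan_stages(1, "tiny diff"))
# → ['commit', 'pr']  — the "as few as 2 LLM calls" fast path
```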
inyeon auto --staged · Full pipeline in one command
inyeon auto --all --dry-run · Preview the pipeline without committing
inyeon auto --staged --no-review · Skip the code review step
inyeon auto --staged --no-pr · Skip the PR generation step
Multi-LLM provider support
Switch between OpenAI, Gemini, and Ollama per-command or set a default — use the best model for every task.
Supported Providers
OAI · OpenAI · GPT-4.1 · Cloud-hosted, high accuracy
GEM · Gemini · 2.5 Flash · Cloud-hosted, cost-efficient
OLL · Ollama · Local models · Self-hosted, fully private
inyeon providers · List available providers on the backend
inyeon commit --staged -p openai · Use OpenAI for this command
inyeon review --all --provider gemini · Use Gemini for this command
export INYEON_LLM_PROVIDER=openai · Set a default provider via env var
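The precedence these commands imply (per-command flag first, then the env var, then a default) can be sketched as follows. The resolution order and fallback value are assumptions, not confirmed internals:

```python
import os

def resolve_provider(flag=None, fallback="openai"):
    """Resolve which LLM provider to use: an explicit per-command flag
    wins, then the INYEON_LLM_PROVIDER env var, then a fallback
    (sketch of the implied precedence; the fallback is assumed)."""
    return flag or os.environ.get("INYEON_LLM_PROVIDER") or fallback

os.environ["INYEON_LLM_PROVIDER"] = "gemini"
print(resolve_provider())               # → gemini
print(resolve_provider(flag="ollama"))  # → ollama, flag overrides env
```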
Real-time streaming
Watch agents think step by step. SSE-powered live progress replaces the spinner — every node completion and reasoning step streams as it happens.
Server-Sent Events
All 6 agent operations stream via /api/v1/agent/stream/{operation} — commit, review, PR, split, resolve, and changelog. --stream is on by default.
inyeon commit --staged · Live progress: node completions, reasoning steps
inyeon commit --staged --no-stream · Classic mode: spinner until done
inyeon auto --staged · Full pipeline streamed in real time
inyeon review --all · Watch the review agent think step by step
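To show what's on the wire, here is a minimal parser for the SSE format the /api/v1/agent/stream/{operation} endpoint emits. The event names in the sample stream are invented for illustration:

```python
def parse_sse(raw):
    """Parse a Server-Sent Events stream into per-event dicts.
    Minimal sketch: handles only 'event:' and 'data:' fields, with a
    blank line terminating each event."""
    events, current = [], {}
    for line in raw.splitlines():
        if not line.strip():            # blank line ends an event
            if current:
                events.append(current)
                current = {}
        elif line.startswith("event:"):
            current["event"] = line[6:].strip()
        elif line.startswith("data:"):
            current["data"] = current.get("data", "") + line[5:].strip()
    if current:                          # flush a trailing event
        events.append(current)
    return events

stream = "event: node_complete\ndata: analyzing diff\n\nevent: done\ndata: ok\n\n"
print(parse_sse(stream))
# → [{'event': 'node_complete', 'data': 'analyzing diff'}, {'event': 'done', 'data': 'ok'}]
```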
Offline mode
Run agents directly in the CLI process — no backend server needed. Use Ollama for fully private, air-gapped operation, or any cloud provider without a middleman.
Execution Engines
LOCAL · LocalEngine · In-process agents, no server needed
REMOTE · HttpEngine · Remote backend via SSE
"Local" means no backend server
--local --provider gemini still calls Gemini's API — the agents just run in your CLI process instead of on a remote server. For fully offline operation, use Ollama.
inyeon commit --staged --local · Run the commit agent in-process with Ollama
inyeon auto --staged --local · Full pipeline without a backend server
inyeon commit --staged --local -p gemini · Local mode with the Gemini API (no backend, still a cloud LLM)
inyeon review --all --local -p openai · Local mode with the OpenAI API
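The two engines can be pictured as interchangeable implementations behind one interface. This sketch borrows the LocalEngine and HttpEngine names from above, but the methods and behavior are invented for illustration:

```python
from abc import ABC, abstractmethod

class Engine(ABC):
    """Common interface both execution engines satisfy (sketch)."""
    @abstractmethod
    def describe(self):
        ...

class LocalEngine(Engine):
    """Runs agents inside the CLI process; no backend server involved."""
    def describe(self):
        return "in-process"

class HttpEngine(Engine):
    """Talks to a remote backend and consumes its SSE stream."""
    def describe(self):
        return "remote via SSE"

def select_engine(local: bool) -> Engine:
    """Pick an engine the way a --local flag might (illustrative)."""
    return LocalEngine() if local else HttpEngine()

print(select_engine(True).describe())   # → in-process
```

Because both engines present the same interface, every command works identically whether agents run in-process or against a remote backend.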
Features
Everything you need for smarter git
Real-Time Streaming
SSE-powered live progress for all agent operations — watch reasoning steps and node completions as they happen
Offline Mode
Run agents directly in the CLI process with --local — no backend server needed, works with Ollama, Gemini, or OpenAI
Full Workflow Automation
Split, commit, review, and generate PRs in one command with inyeon auto
Atomic Commit Splitting
Intelligently group changes into smaller, logical commits with 4 clustering strategies
AI Conflict Resolution
Understands both sides of a merge conflict and produces clean, merged code
Changelog Generation
Groups commits by type with narrative summaries — write to file or stdout
RAG-Powered Context
ChromaDB indexes your codebase for semantic understanding and smart retrieval
Smart Code Review
AI-powered insights on security, code quality, and patterns — runs inside auto pipeline
Multi-LLM Providers
Switch between OpenAI, Gemini, and Ollama per-command or set a default via environment variable
Built with
AI Agents
Execution Engines
Tests Passing
Open Source
Transform your git workflow
Open source. Runs locally. Ready to deploy.
Have ideas or want to contribute?