Give your AI agent
eyes on your codebase.
Mikoshi is a local-first CLI + MCP server. It indexes your repo, builds a dependency graph, and serves structured code context to AI agents — GitHub Copilot, Claude, Cursor, Codex — through the Model Context Protocol. No cloud. No telemetry. Your code never leaves your machine.
An MCP server that understands your code — not just your files.
Most AI agents get raw file dumps when they ask about code. Mikoshi gives them structured, ranked, enriched context: function signatures, call graphs, usage examples, related tests — all compressed to fit a token budget. The agent asks a natural language question; Mikoshi returns the exact code context it needs to answer correctly.
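To make that concrete, here is a purely illustrative sketch of what structured, ranked context could look like. The field names, tool shape, and values below are hypothetical, not Mikoshi's actual response format:

```json
{
  "query": "where do we validate JWT tokens?",
  "results": [
    {
      "symbol": "verifyToken",
      "signature": "function verifyToken(token: string): Claims",
      "file": "src/auth/jwt.ts",
      "callers": ["requireAuth", "refreshSession"],
      "relatedTests": ["test/auth/jwt.test.ts"],
      "score": 0.91
    }
  ],
  "tokenBudget": 4000
}
```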
The ONNX embedding model is stored at ~/.mikoshi/models/ and runs entirely on your machine. No API keys are required for semantic search.
7-stage retrieval pipeline
Every query runs through a full pipeline before results reach your agent.
10 tools exposed to your agent
Once connected, your AI agent can call these tools directly — no prompt engineering needed.
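Under the hood, any MCP client discovers a server's tools with the protocol's standard `tools/list` request. The request shape below follows the MCP JSON-RPC specification; the tool names returned depend on the server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

The server replies with a `tools` array describing each tool's name, description, and input schema, which is how the agent knows what it can call without any prompt engineering.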
Get running in 60 seconds
Connect to GitHub Copilot
Add this to .vscode/mcp.json in your workspace. Works with Copilot, Claude, Cursor, and any MCP-compatible client.
{
  "servers": {
    "Mikoshi-mcp": {
      "type": "stdio",
      "command": "mikoshi-mcp",
      "cwd": "${workspaceFolder}"
    }
  }
}
The MCP server warms up the embeddings model in the background on startup, so the first query is fast.
Database intelligence for AI agents
Connect Mikoshi to your Supabase project and your AI agent can introspect your schema, audit RLS policies, list roles, and run SQL — all with safety gates built in.
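Auditing RLS policies on Postgres typically comes down to querying the built-in `pg_policies` catalog view. As an illustration, a SQL tool call from the agent might carry a query like the one below (the tool name `run_sql` is hypothetical; `pg_policies` and its columns are standard PostgreSQL):

```json
{
  "name": "run_sql",
  "arguments": {
    "query": "select schemaname, tablename, policyname, cmd from pg_policies;"
  }
}
```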
Your codebase. Your machine. Your agent.
Install once. Index any repo. Connect any MCP client.