
llm-context.py

cyberchitta/llm-context.py
🔗 Latest commit: 16033bf
🕒 Updated: Sep 9, 2025, 01:06 PM
Python
AI Tools

Share code with LLMs via Model Context Protocol or clipboard. Rule-based customization enables easy switching between different tasks (like code review and documentation). Includes smart code outlining.

MCP Trust Score
Based on our comprehensive evaluation criteria
🤖 Evaluated by gemini-2.5-flash
Trust Score: 28/100
GitHub Metrics
Repository statistics and activity
⭐ GitHub Stars: 266
👥 Contributors: 2
📋 Total Issues: 4
📦 Has Releases: No
🔧 Has CI/CD Pipeline: No
Configuration
Configuration example extracted from README.md for Claude Desktop and other clients.
🤖 Evaluated by gemini-2.5-flash
{
  "llm-context": {
    "command": "uvx",
    "args": [
      "--from",
      "llm-context",
      "lc-mcp"
    ]
  }
}
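For clients such as Claude Desktop, this entry is nested under the mcpServers key of the client's configuration, as the README below shows:

{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}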
MCP Protocol Support
Implemented MCP protocol features
🤖 Evaluated by gemini-2.5-flash
Tools:
Prompts:
Resources:
Sampling:
Roots:
Logging:
STDIO Transport:
HTTP Transport:
OAuth2 Auth:
Dependencies
8 dependencies
Libraries and frameworks used by this MCP server
🤖 Evaluated by gemini-2.5-flash
Add Quality Badge
Show your MCP trust score in your README
Trust Score Badge
[![Trust Score](https://archestra.ai/mcp-catalog/api/badge/quality/cyberchitta/llm-context.py)](https://archestra.ai/mcp-catalog/cyberchitta__llm-context.py)
README.md

LLM Context

Badges: License · PyPI version · Downloads

Reduce friction when providing context to LLMs. Share relevant project files instantly through smart selection and rule-based filtering.

The Problem

Getting project context into LLM chats is tedious:

  • Manually copying/pasting files takes forever
  • Hard to identify which files are relevant
  • Including too much hits context limits, too little misses important details
  • AI requests for additional files require manual fetching
  • Repeating this process for every conversation

The Solution

lc-sel-files    # Smart file selection
lc-context      # Instant formatted context
# Paste and work - AI can access additional files seamlessly

Result: From "I need to share my project" to productive AI collaboration in seconds.

Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7 and 4.0), as well as Groks (3 and 4), using LLM Context itself to share code during development. All code in the repository is heavily human-curated (by me 😇, @restlessronin).

Installation

uv tool install "llm-context>=0.4.0"
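If you prefer not to use uv, installing from PyPI with pipx should work the same way (an assumption; the README only documents uv). Existing uv installs can be upgraded with uv's tool upgrade command:

pipx install "llm-context>=0.4.0"   # assumed pipx alternative to uv tool install
uv tool upgrade llm-context         # upgrade an existing uv tool installation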

Quick Start

Basic Usage

# One-time setup
cd your-project
lc-init

# Daily usage
lc-sel-files
lc-context

MCP Integration (Recommended)

{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}

With MCP, AI can access additional files directly during conversations.
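For Claude Desktop, this JSON belongs in the client's claude_desktop_config.json; the macOS path below is the usual default, and other clients or platforms will differ (the path is an assumption, not from this README):

# Typical macOS location of Claude Desktop's MCP configuration (assumed)
~/Library/Application Support/Claude/claude_desktop_config.json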

Project Customization

# Create project-specific filters
cat > .llm-context/rules/flt-repo-base.md << 'EOF'
---
compose:
  filters: [lc/flt-base]
gitignores:
  full-files: ["*.md", "/tests", "/node_modules"]
---
EOF

# Customize main development rule
cat > .llm-context/rules/prm-code.md << 'EOF'
---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
---
EOF
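With these rules in place, switching to the customized development rule and regenerating context might look like this (assuming rules are referenced by file name without the .md extension, as with the bundled lc/ rules):

lc-set-rule prm-code   # activate the customized development rule
lc-sel-files           # re-select files under the new filters
lc-context             # regenerate and copy the context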

Core Commands

Command              Purpose
lc-init              Initialize project configuration
lc-sel-files         Select files based on current rule
lc-context           Generate and copy context
lc-context -nt       Generate context for non-MCP environments
lc-set-rule <name>   Switch between rules
lc-clip-files        Handle file requests (non-MCP)
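In non-MCP chats, a rough sketch of serving an AI file request with lc-clip-files (assuming it reads the request from the clipboard and copies the file contents back, which is inferred from the command's description rather than documented here):

# 1. Copy the AI's file request from the chat
# 2. Run:
lc-clip-files
# 3. Paste the resulting file contents back into the chat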

Rule System

Rules use a systematic four-category structure:

  • Prompt Rules (prm-): Generate project contexts (e.g., lc/prm-developer, lc/prm-rule-create)
  • Filter Rules (flt-): Control file inclusion (e.g., lc/flt-base, lc/flt-no-files)
  • Instruction Rules (ins-): Provide guidelines (e.g., lc/ins-developer, lc/ins-rule-framework)
  • Style Rules (sty-): Enforce coding standards (e.g., lc/sty-python, lc/sty-code)

Example Rule

---
description: "Debug API authentication issues"
compose:
  filters: [lc/flt-no-files]
also-include:
  full-files: ["/src/auth/**", "/tests/auth/**"]
---
Focus on authentication system and related tests.
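A one-off rule like this could be saved under the tmp-prm- prefix mentioned below and then activated (file and rule names are illustrative):

# Save the rule above as .llm-context/rules/tmp-prm-auth-debug.md, then:
lc-set-rule tmp-prm-auth-debug
lc-sel-files
lc-context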

Workflow Patterns

Daily Development

lc-set-rule lc/prm-developer
lc-sel-files
lc-context
# AI can review changes, access additional files as needed

Focused Tasks

# Let AI help create minimal context
lc-set-rule lc/prm-rule-create
lc-context -nt
# Work with AI to create task-specific rule using tmp-prm- prefix

MCP Benefits

  • Code review: AI examines your changes for completeness/correctness
  • Additional files: AI accesses initially excluded files when needed
  • Change tracking: See what's been modified during conversations
  • Zero friction: No manual file operations during development discussions

Key Features

Smart File Selection: Rules automatically include/exclude appropriate files
Instant Context Generation: Formatted context copied to clipboard in seconds
MCP Integration: AI can access additional files without manual intervention
Systematic Rule Organization: Four-category system for clear rule composition
AI-Assisted Rule Creation: Let AI help create minimal context for specific tasks

Learn More

License

Apache License, Version 2.0. See LICENSE for details.
