
feat: Add cgc prompt command for custom LLM prompts (#583) #613

Open

DhineshPonnarasan wants to merge 1 commit into CodeGraphContext:main from DhineshPonnarasan:feature/cgc-prompt-command

Conversation

Contributor

DhineshPonnarasan commented Feb 8, 2026

Summary

Implements #583 - Adds a new cgc prompt CLI command that allows users to register custom LLM prompt files that are injected into the system prompt at runtime.

Features

  • New CLI command group cgc prompt with subcommands:
      • cgc prompt add <path> - Register a custom prompt file
      • cgc prompt list - List all registered prompts with status
      • cgc prompt remove <path> - Unregister a prompt file
  • Project-level configuration stored in .cgc/config.json
  • Paths stored as relative paths (when within project)
  • File validation and duplicate prevention
  • Custom prompts prepended to base system prompt at MCP server initialization
  • Missing files at runtime are skipped with warnings (no crashes)
  • Full backward compatibility - works exactly as before when no prompts configured
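
Given the loader's default value of {"prompts": []} shown in the review snippets below, the resulting .cgc/config.json presumably looks something like this (the file names are illustrative, not from the PR):

```json
{
  "prompts": [
    "prompts/coding-style.md",
    "prompts/project-context.md"
  ]
}
```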

Implementation Details

  • Created project_config.py module for managing .cgc/config.json
  • Added build_system_prompt() function to dynamically load and combine prompts
  • Integrated with MCP server initialization to inject custom prompts
  • Uses proper logging patterns for runtime warnings
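
A rough sketch of how build_system_prompt() might fit together. The real implementation lives in src/codegraphcontext/prompts.py; here get_prompt_file_contents is a stub standing in for the project_config loader, and LLM_SYSTEM_PROMPT is a placeholder for the real base prompt:

```python
from pathlib import Path
from typing import List, Optional

LLM_SYSTEM_PROMPT = "...base system prompt..."  # stand-in for the real base prompt

def get_prompt_file_contents(project_root: Optional[Path]) -> List[str]:
    """Stub for the loader in cli/project_config.py; the real one returns
    the text of each registered prompt file, skipping missing files."""
    return []

def build_system_prompt(project_root: Optional[Path] = None) -> str:
    """Prepend custom prompt contents to the base system prompt.
    With no prompts configured, the base prompt is returned unchanged,
    which preserves backward compatibility."""
    try:
        custom_prompts = get_prompt_file_contents(project_root)
    except Exception:
        custom_prompts = []  # never let prompt loading crash server init
    if not custom_prompts:
        return LLM_SYSTEM_PROMPT
    return "\n\n".join(custom_prompts + [LLM_SYSTEM_PROMPT])
```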

Testing

  • All CLI commands tested and working
  • Config persistence verified
  • File validation working correctly
  • Multiple prompts load in correct order
  • Missing file handling works without crashes
  • Backward compatibility verified (no breaking changes)
  • MCP server initialization tested with custom prompts

Screenshots:

[Five screenshots of the CLI commands in action, omitted here.]

…GraphContext#583)

- Add new 'cgc prompt' CLI command group with add/list/remove subcommands
- Create project_config.py module for managing .cgc/config.json
- Store custom prompt paths in project-level config (relative paths)
- Implement build_system_prompt() to prepend custom prompts at runtime
- Integrate with MCP server initialization to inject custom prompts
- Add file validation, duplicate prevention, and missing file handling
- Maintain full backward compatibility when no prompts are configured
- Use proper logging for runtime warnings

Changes:
- src/codegraphcontext/cli/project_config.py (new)
- src/codegraphcontext/cli/main.py (add prompt command group)
- src/codegraphcontext/prompts.py (add build_system_prompt function)
- src/codegraphcontext/server.py (use build_system_prompt in initialize)

Resolves CodeGraphContext#583
Copilot AI review requested due to automatic review settings February 8, 2026 08:53

vercel bot commented Feb 8, 2026

@DhineshPonnarasan is attempting to deploy a commit to shashankss1205's projects Team on Vercel.

A member of the Team first needs to authorize it.


Copilot AI left a comment


Pull request overview

Adds support for project-specific “custom system prompts” by introducing a new cgc prompt CLI command group, persisting prompt file registrations in a per-project config, and injecting those prompt files into the MCP server’s system prompt during initialization.

Changes:

  • Add cgc prompt add|list|remove commands to manage registered prompt files.
  • Introduce .cgc/config.json project config handling for storing prompt paths and loading file contents.
  • Update MCP server initialization to use a dynamically-built system prompt that prepends custom prompts.

Reviewed changes

Copilot reviewed 4 out of 5 changed files in this pull request and generated 6 comments.

Summary per file:

  • website/package-lock.json - Lockfile updates (adds peer: true metadata on several entries).
  • src/codegraphcontext/server.py - Switches MCP initialize response to use build_system_prompt() output.
  • src/codegraphcontext/prompts.py - Adds build_system_prompt() to prepend custom prompt file contents to LLM_SYSTEM_PROMPT.
  • src/codegraphcontext/cli/project_config.py - New module for .cgc/config.json management and prompt file registration/loading.
  • src/codegraphcontext/cli/main.py - Adds cgc prompt Typer subcommands wired to project_config.

Files not reviewed (1):
  • website/package-lock.json: Language not supported


Comment on lines +10 to +13
from rich.console import Console

console = Console()
logger = logging.getLogger(__name__)

Copilot AI Feb 8, 2026


project_config is imported at MCP server init via build_system_prompt(). Using Console() (default stdout) risks writing to stdout (e.g., warnings/errors) and breaking the JSON-RPC protocol, since the server uses stdout for responses. Prefer removing console output from this module (use logger.*) or at minimum configure Console(stderr=True) and ensure all non-JSON output goes to stderr.

Comment on lines +61 to +66
except json.JSONDecodeError as e:
    console.print(f"[yellow]Warning: Invalid JSON in {config_file}: {e}[/yellow]")
    return {"prompts": []}
except Exception as e:
    console.print(f"[yellow]Warning: Could not load config: {e}[/yellow]")
    return {"prompts": []}

Copilot AI Feb 8, 2026


These console.print(...) warnings in config loading can end up on stdout when called from the MCP server, corrupting JSON-RPC output. Switch these to logger.warning(...) (optionally with exc_info=True) or print explicitly to stderr.

Comment on lines +119 to +125
# Load current config
config = load_project_config(project_root)

# Check for duplicates
if prompt_path_str in config["prompts"]:
    console.print(f"[yellow]⚠ Prompt file already registered: {prompt_path_str}[/yellow]")
    return False

Copilot AI Feb 8, 2026


Duplicate prevention is currently string-based (prompt_path_str in config["prompts"]), so paths like skills.md vs ./skills.md (or differing case on Windows) can be registered multiple times. Normalize to a canonical form (e.g., resolve() then relative_to(project_root) when possible, and consistent slash/case normalization) before storing and comparing.

Comment on lines +359 to +365
# ============================================================================
# PROMPT COMMAND GROUP - Custom LLM Prompts
# ============================================================================

prompt_app = typer.Typer(help="Manage custom LLM prompt files")
app.add_typer(prompt_app, name="prompt")


Copilot AI Feb 8, 2026


New cgc prompt commands introduce user-facing behavior (config persistence, relative path handling, duplicate detection, missing-file status). Add CLI tests (e.g., Typer CliRunner) to cover prompt add/list/remove and verify .cgc/config.json contents and exit codes.

Comment on lines +134 to +143
def build_system_prompt(project_root: Optional[Path] = None) -> str:
    """
    Build the complete system prompt by prepending custom prompt files.

    Args:
        project_root: Root directory of the project. If None, uses current directory.

    Returns:
        Complete system prompt with custom prompts prepended to the base prompt.
    """

Copilot AI Feb 8, 2026


build_system_prompt() now has non-trivial logic (loading config, reading multiple files, ordering/formatting, and fallback behavior on missing/invalid configs). Add unit tests for: no config, config with multiple prompts, missing prompt files (skipped), invalid JSON config (does not crash / no stdout), and preservation of base prompt.

Comment on lines +144 to +150
# Import here to avoid circular dependencies
try:
    from codegraphcontext.cli.project_config import get_prompt_file_contents

    # Get custom prompt contents
    custom_prompts = get_prompt_file_contents(project_root)

Copilot AI Feb 8, 2026


build_system_prompt() pulls get_prompt_file_contents from codegraphcontext.cli.project_config, which couples prompt construction (used by the MCP server) to the CLI layer and its Rich console side effects. Consider moving the project config reader into a non-CLI module (or make it side-effect-free) so the server can import it without bringing in CLI concerns or risking stdout writes.

@Shashankss1205
Collaborator

I don't understand why this is so complex.

We just needed to add a prompts.py file in the config and retrieve it from there. Perhaps there's some confusion?

@DhineshPonnarasan
Contributor Author

Thank you for the feedback! I want to make sure I understand the requirement correctly.

What I implemented:

  • CLI commands (cgc prompt add/list/remove) to manage custom prompt files
  • Store paths in .cgc/config.json
  • Load and inject custom prompts into the LLM system prompt

Your feedback suggests a simpler approach. Could you clarify:

  1. What was the original requirement in issue #583 ("New tool/cli cmd")? (I want to see the exact issue text to ensure I understand correctly.)

  2. What approach did you have in mind? For example:

      • Option A: Store a single path in config → {"custom_prompt_path": "path/to/file.md"}
      • Option B: Store the prompt text directly in config → {"custom_prompt": "text here..."}
      • Option C: Something else?

  3. Should there be CLI commands at all, or should users just manually edit the config file?

I'm happy to simplify the implementation once I understand the expected approach.

