feat: Add cgc prompt command for custom LLM prompts (#583) #613
DhineshPonnarasan wants to merge 1 commit into CodeGraphContext:main from
Conversation
…GraphContext#583)
- Add new 'cgc prompt' CLI command group with add/list/remove subcommands
- Create project_config.py module for managing .cgc/config.json
- Store custom prompt paths in project-level config (relative paths)
- Implement build_system_prompt() to prepend custom prompts at runtime
- Integrate with MCP server initialization to inject custom prompts
- Add file validation, duplicate prevention, and missing file handling
- Maintain full backward compatibility when no prompts are configured
- Use proper logging for runtime warnings

Changes:
- src/codegraphcontext/cli/project_config.py (new)
- src/codegraphcontext/cli/main.py (add prompt command group)
- src/codegraphcontext/prompts.py (add build_system_prompt function)
- src/codegraphcontext/server.py (use build_system_prompt in initialize)

Resolves CodeGraphContext#583
@DhineshPonnarasan is attempting to deploy a commit to the shashankss1205's projects Team on Vercel. A member of the Team first needs to authorize it.
Pull request overview
Adds support for project-specific “custom system prompts” by introducing a new cgc prompt CLI command group, persisting prompt file registrations in a per-project config, and injecting those prompt files into the MCP server’s system prompt during initialization.
Changes:
- Add cgc prompt add|list|remove commands to manage registered prompt files.
- Introduce .cgc/config.json project config handling for storing prompt paths and loading file contents.
- Update MCP server initialization to use a dynamically-built system prompt that prepends custom prompts.
Reviewed changes
Copilot reviewed 4 out of 5 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| website/package-lock.json | Lockfile updates (adds peer: true metadata on several entries). |
| src/codegraphcontext/server.py | Switches MCP initialize response to use build_system_prompt() output. |
| src/codegraphcontext/prompts.py | Adds build_system_prompt() to prepend custom prompt file contents to LLM_SYSTEM_PROMPT. |
| src/codegraphcontext/cli/project_config.py | New module for .cgc/config.json management and prompt file registration/loading. |
| src/codegraphcontext/cli/main.py | Adds cgc prompt Typer subcommands wired to project_config. |
Files not reviewed (1)
- website/package-lock.json: Language not supported
```python
from rich.console import Console

console = Console()
logger = logging.getLogger(__name__)
```
project_config is imported at MCP server init via build_system_prompt(). Using Console() (default stdout) risks writing to stdout (e.g., warnings/errors) and breaking the JSON-RPC protocol, since the server uses stdout for responses. Prefer removing console output from this module (use logger.*) or at minimum configure Console(stderr=True) and ensure all non-JSON output goes to stderr.
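A minimal sketch of the stderr-only setup this comment suggests, using just the standard library (logger and handler names here are illustrative; the Rich equivalent would be `Console(stderr=True)`):

```python
import logging
import sys

# Route all diagnostics to stderr so the MCP server's JSON-RPC
# responses on stdout stay clean.
logger = logging.getLogger("codegraphcontext.cli.project_config")
handler = logging.StreamHandler(sys.stderr)  # explicitly stderr, never stdout
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
logger.addHandler(handler)

logger.warning("Invalid JSON in %s", ".cgc/config.json")
```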
```python
except json.JSONDecodeError as e:
    console.print(f"[yellow]Warning: Invalid JSON in {config_file}: {e}[/yellow]")
    return {"prompts": []}
except Exception as e:
    console.print(f"[yellow]Warning: Could not load config: {e}[/yellow]")
    return {"prompts": []}
```
These console.print(...) warnings in config loading can end up on stdout when called from the MCP server, corrupting JSON-RPC output. Switch these to logger.warning(...) (optionally with exc_info=True) or print explicitly to stderr.
```python
# Load current config
config = load_project_config(project_root)

# Check for duplicates
if prompt_path_str in config["prompts"]:
    console.print(f"[yellow]⚠ Prompt file already registered: {prompt_path_str}[/yellow]")
    return False
```
Duplicate prevention is currently string-based (prompt_path_str in config["prompts"]), so paths like skills.md vs ./skills.md (or differing case on Windows) can be registered multiple times. Normalize to a canonical form (e.g., resolve() then relative_to(project_root) when possible, and consistent slash/case normalization) before storing and comparing.
```python
# ============================================================================
# PROMPT COMMAND GROUP - Custom LLM Prompts
# ============================================================================

prompt_app = typer.Typer(help="Manage custom LLM prompt files")
app.add_typer(prompt_app, name="prompt")
```
New cgc prompt commands introduce user-facing behavior (config persistence, relative path handling, duplicate detection, missing-file status). Add CLI tests (e.g., Typer CliRunner) to cover prompt add/list/remove and verify .cgc/config.json contents and exit codes.
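A possible shape for such a test, assuming the Typer app is exported as `app` from `codegraphcontext.cli.main` and that a successful add exits 0 — both assumptions about code not shown in this diff:

```python
import json
from pathlib import Path

from typer.testing import CliRunner

from codegraphcontext.cli.main import app  # assumed export

runner = CliRunner()

def test_prompt_add_registers_file(tmp_path: Path, monkeypatch):
    monkeypatch.chdir(tmp_path)
    (tmp_path / "skills.md").write_text("Always answer in haiku.")

    result = runner.invoke(app, ["prompt", "add", "skills.md"])
    assert result.exit_code == 0

    config = json.loads((tmp_path / ".cgc" / "config.json").read_text())
    assert "skills.md" in config["prompts"]

    # Re-adding the same file must not create a second entry.
    runner.invoke(app, ["prompt", "add", "skills.md"])
    config = json.loads((tmp_path / ".cgc" / "config.json").read_text())
    assert config["prompts"].count("skills.md") == 1
```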
```python
def build_system_prompt(project_root: Optional[Path] = None) -> str:
    """
    Build the complete system prompt by prepending custom prompt files.

    Args:
        project_root: Root directory of the project. If None, uses current directory.

    Returns:
        Complete system prompt with custom prompts prepended to the base prompt.
    """
```
build_system_prompt() now has non-trivial logic (loading config, reading multiple files, ordering/formatting, and fallback behavior on missing/invalid configs). Add unit tests for: no config, config with multiple prompts, missing prompt files (skipped), invalid JSON config (does not crash / no stdout), and preservation of base prompt.
```python
# Import here to avoid circular dependencies
try:
    from codegraphcontext.cli.project_config import get_prompt_file_contents

    # Get custom prompt contents
    custom_prompts = get_prompt_file_contents(project_root)
```
build_system_prompt() pulls get_prompt_file_contents from codegraphcontext.cli.project_config, which couples prompt construction (used by the MCP server) to the CLI layer and its Rich console side effects. Consider moving the project config reader into a non-CLI module (or make it side-effect-free) so the server can import it without bringing in CLI concerns or risking stdout writes.
I don't understand why this is so complex. We just needed to add a prompts.py file in the config and retrieve it from there. Perhaps there's a confusion?
Thank you for the feedback! I want to make sure I understand the requirement correctly. What I implemented:

Your feedback suggests a simpler approach. Could you clarify:

I'm happy to simplify the implementation once I understand the expected approach.
Summary

Implements #583 - adds a new `cgc prompt` CLI command that allows users to register custom LLM prompt files that are injected into the system prompt at runtime.

Features

- `cgc prompt` with subcommands:
  - `cgc prompt add <path>` - Register a custom prompt file
  - `cgc prompt list` - List all registered prompts with status
  - `cgc prompt remove <path>` - Unregister a prompt file
- Prompt paths are persisted in `.cgc/config.json`

Implementation Details

- `project_config.py` module for managing `.cgc/config.json`
- `build_system_prompt()` function to dynamically load and combine prompts

Testing

Screenshots: