Pierre Bot
Pierre Bot is an intelligent, AI-powered code review assistant designed for Bitbucket Server/Data Center. It automates the initial pass of code review by analyzing Pull Request diffs and identifying potential bugs, logic errors, and style issues using modern LLMs (Google Gemini 2.0 Flash or Ollama).
Project Overview
- Type: Go CLI Application
- Core Function: Fetches PR diffs from Bitbucket -> Sends to LLM -> Prints structured review comments.
- Key Technologies:
  - Language: Go (1.25+)
  - AI SDKs: google/generative-ai-go, ollama/ollama
  - CLI Framework: alecthomas/kong
Architecture
The project follows a standard Go project layout:
- cmd/pierre/: Contains the main.go entry point. It handles configuration parsing (flags, env vars, file), initializes adapters, and orchestrates the application flow.
- internal/pierre/: Contains the core business logic.
  - judge.go: Defines the JudgePR function, which prepares the system prompt and context for the LLM.
- internal/chatter/: Abstraction layer for LLM providers.
  - gemini.go: Implements the ChatAdapter interface for Google Gemini. Notably includes dynamic JSON schema generation via reflection (schemaFromType) to enforce structured output from the model.
  - ollama.go: Implements ChatAdapter for Ollama (local models).
- internal/gitadapters/: Abstraction for Version Control Systems.
  - bitbucket.go: Client for fetching PR diffs from Bitbucket Server.
Building and Running
Prerequisites
- Go 1.25 or later
- Access to a Bitbucket Server instance
- API Key for Google Gemini (or a running Ollama instance)
Build
go build -o pierre ./cmd/pierre/main.go
Configuration
Configuration is handled via kong and supports a hierarchy: Flags > Env Vars > Config File.
1. Environment Variables:
- BITBUCKET_URL: Base URL of the Bitbucket instance.
- BITBUCKET_TOKEN: Personal Access Token (HTTP) for Bitbucket.
- LLM_PROVIDER: gemini or ollama.
- LLM_API_KEY: API Key for Gemini.
- LLM_MODEL: Model name (e.g., gemini-2.0-flash).
2. Configuration File (config.yaml):
See config.example.yaml for a template. Place the file in the current directory, or save it as ~/.pierre.yaml.
Usage
Run the bot against a specific Pull Request:
# Syntax: ./pierre [flags] <PROJECT_KEY> <REPO_SLUG> <PR_ID>
./pierre --llm-provider=gemini --llm-model=gemini-2.0-flash MYPROJ my-repo 123
Development Conventions
- Structured Output: The bot relies on the LLM returning valid JSON matching the Comment struct. This is enforced in internal/chatter/gemini.go by converting the Go struct definition into a genai.Schema.
- Dependency Injection: Adapters (gitadapters, chatter) are initialized in main and passed to the core logic, making testing easier.
- Error Handling: Strict error checks are preferred; the bot will exit if it cannot fetch the diff or initialize the AI.
Key Files
- cmd/pierre/main.go: Application entry point and config wiring.
- internal/pierre/judge.go: The "brain" that constructs the prompt for the AI.
- internal/chatter/gemini.go: Gemini integration logic, including the reflection-based schema generator.
- config.example.yaml: Reference configuration file.