The condition for posting comments is inverted; it adds comments when disableComments is true. Change if s.disableComments { to if !s.disableComments {. (Reason: The code adds VCS comments only when s.disableComments is true, which contradicts the flag's purpose; flipping the condition fixes the logic.) (Generated by: OpenAI (openai/gpt-oss-120b))
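A minimal sketch of the corrected logic (the service type and field names here are assumptions based on the review; only the flag name comes from the source):

```go
package main

import "fmt"

// service stands in for the real review service (an assumption).
type service struct{ disableComments bool }

// shouldPostComments reports whether VCS comments should be created.
func (s *service) shouldPostComments() bool {
	// Fixed: the original `if s.disableComments` was inverted —
	// comments must be posted only when the flag is NOT set.
	return !s.disableComments
}

func main() {
	fmt.Println((&service{disableComments: false}).shouldPostComments()) // true
	fmt.Println((&service{disableComments: true}).shouldPostComments())  // false
}
```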
Update the comment above the block to state that comments are posted only when not in dry‑run mode. (Reason: The existing comment is misleading: as the code stands, it posts comments only when disableComments (dry‑run) is true, so once the condition is fixed the comment should say posting happens only outside dry‑run mode.) (Generated by: OpenAI (openai/gpt-oss-120b))
cfg.Review.MaxChunkChars is an int that may be zero if the user omits the flag. The service correctly falls back to the default (60000) inside judgePR, but you could also enforce the default earlier (e.g. in the config struct default tag or after flag parsing) to avoid passing a zero value downstream. (Generated by: OpenAI (openai/gpt-oss-120b))
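One way to normalize the value right after flag/config parsing, so a zero never reaches the service — a sketch assuming a hypothetical reviewConfig type (the default value 60000 is from the review):

```go
package main

import "fmt"

const defaultMaxChunkChars = 60000

// reviewConfig is a stand-in for the real config struct (an assumption).
type reviewConfig struct{ MaxChunkChars int }

// applyDefaults fills in the built-in default when the flag was omitted,
// so downstream code never sees a non-positive chunk size.
func (c *reviewConfig) applyDefaults() {
	if c.MaxChunkChars <= 0 {
		c.MaxChunkChars = defaultMaxChunkChars
	}
}

func main() {
	cfg := reviewConfig{} // user omitted the flag
	cfg.applyDefaults()
	fmt.Println(cfg.MaxChunkChars) // 60000
}
```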
The code that builds baseSystem concatenates the dynamically created guidelinesText directly inside a raw string literal. This leaves a leading tab on every line of the final prompt, which the LLM will treat as part of the instruction. Consider building the prompt from unindented plain strings, or run strings.TrimSpace on the final prompt. (Generated by: OpenAI (openai/gpt-oss-120b))
New now takes the extra parameters maxChunkSize and guidelines. Ensure all call‑sites (currently only cmd/pierre/main.go) are updated accordingly. Consider adding a comment that documents the meaning of maxChunkSize (bytes) and that a non‑positive value triggers the built‑in default. (Generated by: OpenAI (openai/gpt-oss-120b))
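A sketch of how the widened constructor and its doc comment might look (the Service struct and field names are assumptions; the parameter names and the non-positive-means-default behavior come from the review):

```go
package main

import "fmt"

const defaultMaxChunkSize = 60000

// Service is a stand-in for the real review service (an assumption).
type Service struct {
	maxChunkSize int
	guidelines   string
}

// New builds a review service.
//
// maxChunkSize is the maximum size of a single diff chunk in bytes;
// a non-positive value selects the built-in default.
// guidelines is extra review guidance folded into the system prompt.
func New(maxChunkSize int, guidelines string) *Service {
	if maxChunkSize <= 0 {
		maxChunkSize = defaultMaxChunkSize
	}
	return &Service{maxChunkSize: maxChunkSize, guidelines: guidelines}
}

func main() {
	fmt.Println(New(0, "").maxChunkSize) // falls back to the default
}
```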
splitDiffIntoChunks may split a diff in the middle of a line (e.g. when a single line exceeds maxSize). Breaking a diff line arbitrarily can corrupt the unified‑diff syntax and confuse the model. Prefer to split only on line boundaries – e.g. split the input with strings.SplitAfter(diff, "\n") and then build chunks while staying under the size limit. (Generated by: OpenAI (openai/gpt-oss-120b))
The error message now correctly uses response.StatusCode. No further changes needed. (Generated by: OpenAI (openai/gpt-oss-120b))
The test TestSplitDiffIntoChunks_LargeSingleFile relies on exact string equality after re‑joining the chunks. Because of the newline‑loss bug in splitDiffIntoChunks (see the comment on that function), the test will currently fail. Once the split function preserves delimiters the test will pass; otherwise, compare the recombined diff with the original via cmp.Diff (as you already do) rather than adding a separate strict equality check. (Generated by: OpenAI (openai/gpt-oss-120b))
You added github.com/google/go-cmp v0.7.0 as a direct dependency – good for the new tests. Remember to run go mod tidy so that the go.sum is updated accordingly. (Generated by: OpenAI (openai/gpt-oss-120b))
The splitDiffIntoChunks implementation drops the newline that separates two files when it splits on "\ndiff --git ". After the split the code re‑adds the prefix with "diff --git " + part, which loses the leading line‑break. When the chunks are concatenated the original diff is no longer identical, causing the TestSplitDiffIntoChunks_LargeSingleFile test to fail. Fix by adding the missing newline, e.g. seg = "\n" + "diff --git " + part (or keep the delimiter when splitting with strings.SplitAfter). (Generated by: OpenAI (openai/gpt-oss-120b))