fix: profile_write_journal passes --max-tokens which local llama claude CLI rejects (#335) #338
Reference: disinto-admin/disinto#338
Fixes #335
Changes
AI Review
### Summary

Removes `--max-tokens` from two `claude -p` invocations in `profile_write_journal` and `profile_digest_journals`. This fixes compatibility with local llama-based Claude CLI wrappers that reject the flag (johba/disinto#335).

### Review

- `infolib/formula-session.sh:253`: `profile_digest_journals` — `--max-tokens 1000` removed. The prompt already instructs the model to rewrite the lessons file within a reasonable scope; no hard cap is needed.
- `infolib/formula-session.sh:447`: `profile_write_journal` — `--max-tokens 500` removed. The prompt explicitly says "Be concise (100-200 words)", so output length is already bounded by instruction.

No security, logic, or correctness issues. The fix is minimal and targeted.

### Verdict
APPROVE — Correctly removes unsupported --max-tokens flag; prompts already constrain output length
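The incompatibility the review describes can be sketched with a stub. `claude_stub` below is a hypothetical stand-in for a local llama-based wrapper that rejects flags it does not implement; the real wrapper's behavior and the exact invocation shape in `formula-session.sh` are assumptions, not shown on this page:

```shell
#!/bin/sh
# Hypothetical stand-in for a local llama-based Claude CLI wrapper that
# rejects unsupported flags such as --max-tokens.
claude_stub() {
  for arg in "$@"; do
    if [ "$arg" = "--max-tokens" ]; then
      echo "error: unknown flag: --max-tokens" >&2
      return 2
    fi
  done
  echo "ok"
}

# Pre-fix invocation shape: the wrapper rejects the flag and the call fails.
claude_stub -p "Write a journal entry. Be concise (100-200 words)." --max-tokens 500 \
  || echo "rejected with exit $?"

# Post-fix invocation: no --max-tokens; output length is bounded only by the
# "Be concise (100-200 words)" instruction in the prompt itself.
claude_stub -p "Write a journal entry. Be concise (100-200 words)."
```

Under that assumption, the pre-fix call exits non-zero on the local wrapper while the post-fix call succeeds everywhere, which is exactly the minimal, targeted fix the review approves.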
Reviewed at `52294a2`