fix(cli): write tool output to stdout via fd 1 instead of stderr (#324) #346
Conversation
…shing (#321)
- Wrap impact() in try-catch to return structured error JSON instead of a process crash (SIGSEGV/exit 139)
- Extract core logic to _impactImpl() for a clean error boundary
- Break out of the depth-traversal loop on query failure and return the partial results collected so far (previously errors were silently swallowed)
- Add a 'partial' flag to the response when traversal was interrupted
- Add try-catch in the CLI impactCommand with structured error output
- Improve formatImpactResult to show suggestion text and a partial-results warning
- Add 3 new unit tests for error/suggestion/partial scenarios

Fixes #321
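The error-boundary pattern this commit describes could be sketched roughly as follows. This is a hedged illustration, not the real GitNexus code: queryLevel, the node names, and the failing depth are invented stand-ins to show the partial-results flow.

```typescript
// Illustrative sketch of the commit's error-boundary pattern: wrap the
// traversal, return structured errors and partial results instead of crashing.

interface ImpactResult {
  nodes: string[];
  partial?: boolean;
  error?: string;
}

// Simulated per-depth query; throws at depth 2 to exercise the partial path.
function queryLevel(depth: number): string[] {
  if (depth === 2) throw new Error('query failed');
  return [`node-at-depth-${depth}`];
}

function impact(maxDepth: number): ImpactResult {
  try {
    return _impactImpl(maxDepth);
  } catch (err) {
    // Structured error JSON instead of a process crash
    return { nodes: [], error: err instanceof Error ? err.message : String(err) };
  }
}

function _impactImpl(maxDepth: number): ImpactResult {
  const nodes: string[] = [];
  for (let depth = 0; depth < maxDepth; depth++) {
    try {
      nodes.push(...queryLevel(depth));
    } catch {
      // Break out of the traversal and return what was collected so far
      return { nodes, partial: true };
    }
  }
  return { nodes };
}

console.log(JSON.stringify(impact(4)));
// → {"nodes":["node-at-depth-0","node-at-depth-1"],"partial":true}
```

The key property is that a mid-traversal failure degrades to a flagged partial result rather than either crashing or silently pretending the traversal completed.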
LadybugDB's native module captures Node.js process.stdout during init, but the underlying OS file descriptor 1 remains intact. By using fs.writeSync(1, ...) we bypass the Node.js stream layer and write directly to the real stdout fd.

This restores standard Unix pipeline behavior:

gitnexus cypher 'MATCH (n) RETURN n LIMIT 3' | jq .

Falls back to stderr if the fd write fails (e.g., broken pipe on Windows).

Fixes #324
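A minimal sketch of the mechanism this commit describes, assuming a helper shaped like the output() function mentioned later in this thread (the actual tool.ts code may differ):

```typescript
import * as fs from 'fs';

// Hedged sketch, not the real tool.ts: fs.writeSync(1, ...) targets the
// OS-level stdout descriptor directly, bypassing the process.stdout stream
// object that LadybugDB's native module captures during init.
function output(text: string): void {
  const payload = text + '\n';
  try {
    fs.writeSync(1, payload); // fd 1: the real stdout, still intact
  } catch {
    // e.g. broken pipe on Windows — keep the old stderr behavior
    process.stderr.write(payload);
  }
}

output(JSON.stringify({ ok: true }));
```

Because fs.writeSync operates on the raw descriptor, whatever the native module did to the process.stdout stream object is irrelevant to this write path.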
@ShunsukeHayashi is attempting to deploy a commit to the NexusCore Team on Vercel. A member of the Team first needs to authorize it.
@claude ultra-think while looking at this PR and find potential bugs and performance issues. |
Claude finished @magyargergo's task in 2m 38s
PR Review: fix(cli): write tool output to stdout via fd 1
This PR is a 2-commit bundle:
Here's my full analysis:
🔴 Bug: Inconsistent stdout fix in
| Severity | Issue | File |
|---|---|---|
| 🔴 High | eval-server uses process.stdout.write() for ready signal — same LadybugDB bug as #324 | eval-server.ts:411 |
| 🔴 High | Stale comment says output goes to stderr | tool.ts:13 |
| 🟡 Med | EPIPE writes full JSON to stderr instead of exiting cleanly | tool.ts:47 |
| 🟡 Med | parseInt without NaN guard on user inputs | tool.ts:70,115 / eval-server.ts:307 |
| 🟡 Med | No per-request timeout — hung queries block sockets forever | eval-server.ts:333 |
| 🟡 Med | Race condition: new request during idle shutdown hits disconnected backend | eval-server.ts:326 |
| 🟢 Low | text + '\n' double-allocates large string | tool.ts:46 |
| 🟢 Low | Always pretty-prints JSON even when piped | tool.ts:44 |
| 🟢 Low | formatCypherResult drops extra columns for heterogeneous rows | eval-server.ts:197 |
The core fix (writeSync(1, ...)) is sound — the approach of bypassing the Node.js stream layer to write directly to fd 1 is correct. The most impactful oversight is not applying the same fix to the GITNEXUS_EVAL_SERVER_READY signal in eval-server.ts, which would cause silent failures in the Docker eval environment this server is designed for.
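For the medium-severity parseInt item flagged in the table, a guard along these lines avoids propagating NaN from user input. The helper name and fallback policy are illustrative, not from the PR:

```typescript
// Illustrative NaN guard for user-supplied numeric flags: fall back to a
// default instead of letting NaN (or a nonsensical non-positive value)
// flow into query limits or port numbers.
function parsePositiveInt(raw: string, fallback: number): number {
  const n = Number.parseInt(raw, 10);
  return Number.isNaN(n) || n <= 0 ? fallback : n;
}

console.log(parsePositiveInt('25', 10));  // → 25
console.log(parsePositiveInt('abc', 10)); // → 10
```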
Could you please tackle this? 🙏
…PE handling

Per @claude's review:
- [RED] Fix stale file header comment in tool.ts (now references fd 1, not stderr)
- [RED] Apply the same writeSync(1, ...) fix to the eval-server.ts READY signal (LadybugDB captures process.stdout, breaking the Docker eval harness)
- [YELLOW] Handle EPIPE in output() — exit cleanly per Unix convention instead of dumping the full JSON to stderr on a broken pipe
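The EPIPE handling described in the last bullet could be sketched like this. It is a hedged illustration under the assumption that the fd-1 write raises an error with code 'EPIPE' when the consumer closes the pipe; the real output() in tool.ts may be structured differently:

```typescript
import * as fs from 'fs';

// Sketch of EPIPE handling on the fd-1 write path. If the pipe consumer
// closed early (e.g. `gitnexus cypher ... | head -1`), the write fails
// with EPIPE; per Unix convention we exit 0 quietly instead of dumping
// the full JSON payload to stderr.
function writeOutput(text: string): void {
  try {
    fs.writeSync(1, text + '\n');
  } catch (err) {
    if ((err as NodeJS.ErrnoException).code === 'EPIPE') {
      process.exit(0); // consumer is gone; a quiet exit is correct
    }
    process.stderr.write(text + '\n'); // other failures: stderr fallback
  }
}

writeOutput(JSON.stringify({ rows: [] }));
```

Exiting 0 on EPIPE matches what standard Unix tools do when a downstream consumer such as head stops reading.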
Great review @claude. Addressed the high-severity items:
Fixed:
Acknowledged but not in this PR (scope):
Happy to submit separate PRs for the pre-existing issues if you'd like.
Awesome work! Thank you for your contribution!
CI Report
✅ All checks passed
Pipeline Status
Test Results
✅ All 2711 tests passed, 20 test(s) skipped
Integration:
Unit:
Code Coverage
Combined (Unit + Integration)
Coverage breakdown by test suite
Unit Tests
Integration Tests
📋 View full run · Generated by CI
@claude can you recommend a testing strategy for this PR? I want to see an integration test basically. |
Claude finished @magyargergo's task in 1m 33s
Tasks
Integration Test Strategy for PR #346
The core behaviors introduced by this PR that need integration coverage are:
The right home for these is
Test 1 — Tool output on stdout, not stderr
This is the core regression test. After
// Add AFTER the analyze test, so .gitnexus/ already exists
describe('tool output goes to stdout (fd 1)', () => {
// Wrapper so each command runs against the indexed mini-repo
function runToolCmd(args: string[], timeoutMs = 15000) {
return spawnSync(
process.execPath,
['--import', 'tsx', cliEntry, ...args],
{
cwd: MINI_REPO,
encoding: 'utf8',
timeout: timeoutMs,
stdio: ['pipe', 'pipe', 'pipe'],
env: {
...process.env,
NODE_OPTIONS: `${process.env.NODE_OPTIONS || ''} --max-old-space-size=8192`.trim(),
},
},
);
}
it('cypher: JSON appears on stdout, not stderr', () => {
const result = runToolCmd(['cypher', 'MATCH (n) RETURN n.name LIMIT 3']);
if (result.status === null) return; // CI timeout tolerance
expect(result.status).toBe(0);
// stdout must contain valid JSON (array or object)
expect(() => JSON.parse(result.stdout.trim())).not.toThrow();
// stderr must NOT contain JSON — only human-readable diagnostics are allowed there
expect(() => JSON.parse(result.stderr.trim())).toThrow();
});
it('query: JSON appears on stdout, not stderr', () => {
const result = runToolCmd(['query', 'handler']);
if (result.status === null) return;
expect(result.status).toBe(0);
expect(() => JSON.parse(result.stdout.trim())).not.toThrow();
expect(() => JSON.parse(result.stderr.trim())).toThrow();
});
it('impact: JSON appears on stdout, not stderr', () => {
const result = runToolCmd(['impact', 'handleRequest', '--direction', 'upstream']);
if (result.status === null) return;
expect(result.status).toBe(0);
expect(() => JSON.parse(result.stdout.trim())).not.toThrow();
expect(() => JSON.parse(result.stderr.trim())).toThrow();
});
it('stdout is pipeable: cypher output parses as valid JSON', () => {
const result = runToolCmd(['cypher', 'MATCH (n:Function) RETURN n.name LIMIT 5']);
if (result.status === null) return;
expect(result.status).toBe(0);
// Simulate what jq does: parse stdout as JSON
const parsed = JSON.parse(result.stdout.trim());
expect(Array.isArray(parsed) || typeof parsed === 'object').toBe(true);
});
});
Why this matters: Before this PR,
Test 2 — EPIPE causes clean exit (code 0)
This requires
import { spawn } from 'child_process';
it('cypher: EPIPE exits with code 0, not stderr dump', (done) => {
// Produce enough output to fill the pipe buffer so EPIPE triggers
const child = spawn(
process.execPath,
['--import', 'tsx', cliEntry, 'cypher', 'MATCH (n) RETURN n LIMIT 500'],
{
cwd: MINI_REPO,
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
NODE_OPTIONS: `${process.env.NODE_OPTIONS || ''} --max-old-space-size=8192`.trim(),
},
},
);
let stderrOutput = '';
child.stderr.on('data', (chunk) => { stderrOutput += chunk.toString(); });
// Destroy stdout immediately — this simulates `| head -0` (consumer closes early)
child.stdout.once('data', () => {
child.stdout.destroy(); // triggers EPIPE on the next write
});
child.on('close', (code) => {
// Clean EPIPE exit: code 0, not a stderr dump of the full JSON
expect(code).toBe(0);
// No JSON payload should appear on stderr
expect(() => JSON.parse(stderrOutput.trim())).toThrow();
done();
});
}, 20000);
Why this matters: Before the EPIPE fix, the full JSON payload would be dumped to stderr when the pipe consumer closed early (e.g.,
Test 3 — eval-server READY signal on stdout
import { spawn } from 'child_process';
it('eval-server: READY signal appears on stdout', (done) => {
const child = spawn(
process.execPath,
['--import', 'tsx', cliEntry, 'eval-server', '--port', '0', '--idle-timeout', '2'],
{
cwd: MINI_REPO,
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
NODE_OPTIONS: `${process.env.NODE_OPTIONS || ''} --max-old-space-size=8192`.trim(),
},
},
);
let stdoutBuffer = '';
child.stdout.on('data', (chunk) => {
stdoutBuffer += chunk.toString();
if (stdoutBuffer.includes('GITNEXUS_EVAL_SERVER_READY:')) {
// Signal arrived on stdout ✅
child.kill('SIGTERM');
done();
}
});
child.stderr.on('data', (chunk) => {
const text = chunk.toString();
// READY signal must NOT appear on stderr
if (text.includes('GITNEXUS_EVAL_SERVER_READY:')) {
child.kill('SIGTERM');
done(new Error('READY signal appeared on stderr instead of stdout'));
}
});
child.on('close', () => {
// If child exited without emitting the signal, fail
if (!stdoutBuffer.includes('GITNEXUS_EVAL_SERVER_READY:')) {
done(new Error('eval-server exited without emitting READY signal on stdout'));
}
});
}, 30000);
Note: This test depends on the mini-repo being indexed (requires the analyze test to run first, or a pre-indexed
Suggested placement
All three tests belong in
If you'd like me to open a follow-up commit implementing these tests directly, just say the word.
Could you please add these extra tests? 🙏 |
Per @magyargergo's request via @claude's testing strategy:

1. Tool output on stdout (4 tests):
   - cypher: JSON on stdout, not stderr
   - query: JSON on stdout
   - impact: JSON on stdout
   - pipeable: parsed as valid JSON
2. EPIPE clean exit (1 test):
   - Destroys stdout pipe mid-write, verifies exit code 0
   - No JSON dump on stderr
3. eval-server READY signal (1 test):
   - GITNEXUS_EVAL_SERVER_READY appears on stdout
   - Does NOT appear on stderr

All tests gracefully handle CI timeouts (accept null exit code).
Added all 3 integration test categories per @claude's testing strategy:
1. Tool output on stdout (4 tests)
2. EPIPE clean exit (1 test)
3. eval-server READY signal (1 test)
All tests gracefully handle CI timeouts (accept null exit code).
Problem
The gitnexus cypher command (and all other CLI tool commands) outputs JSON to stderr instead of stdout (#324). This breaks standard Unix pipeline conventions.
This affects everyone building scripts or automation around GitNexus CLI.
Root Cause
The output() function in tool.ts uses process.stderr.write() because LadybugDB's native module captures Node.js process.stdout during initialization. The comment in the code explicitly notes this:
Fix
Use fs.writeSync(1, ...) to write directly to file descriptor 1 (stdout) at the OS level, bypassing Node.js's process.stdout stream, which LadybugDB has captured.
Key insight: LadybugDB captures the Node.js stream layer (process.stdout), but the underlying OS file descriptor 1 remains intact. By writing to fd 1 directly, we restore standard pipeline behavior.
After this fix:
Backward compatibility
- If writeSync(1, ...) fails (e.g., broken pipe, Windows edge cases), falls back to stderr (existing behavior)
- console.error()
Testing
Verified locally on macOS (Apple Silicon) with GitNexus 1.4.5 across 25 indexed repositories.
Context
We've been running GitNexus in production and built gitnexus-stable-ops as an operational toolkit. Our gni wrapper script has been using 2>&1 as a workaround for this issue — this PR fixes it at the source.
Related: #344