The implementation is correct. This is a clean, well-scoped bug fix. No actionable issues. LGTM
…o-decode

The node:http ClientRequest uses fetch() internally, which auto-decompresses gzip/brotli responses by default. However, it preserves the Content-Encoding header, causing callers that follow the Node.js convention of manually handling Content-Encoding to attempt double decompression. Pass encodeResponseBody: 'manual' to fetch() so raw compressed bytes are delivered to IncomingMessage, matching Node.js http behavior where callers handle decompression themselves.
Force-pushed from 7a7ffd7 to d775c19
Fixes cloudflare/workers-sdk#11169
Summary
- Pass `encodeResponseBody: "manual"` to the internal `fetch()` call in the Node.js `http` `ClientRequest` so that response bodies are not auto-decompressed by workerd's fetch implementation
- `http.ClientRequest` now delivers raw compressed bytes with the `Content-Encoding` header intact, leaving decompression to the caller

Problem

The `node:http` module uses `fetch()` under the hood. When a response has `Content-Encoding: gzip` (or `br`), workerd's `fetch()` automatically decompresses the body but preserves the `Content-Encoding` header. This causes a mismatch: callers following the standard Node.js pattern of piping through `zlib.createGunzip()` based on the header would attempt to decompress already-decompressed data, resulting in errors or corruption.
Fix

workerd's `fetch()` supports an `encodeResponseBody` option:

- `"automatic"` (default) — decompresses gzip/brotli bodies
- `"manual"` — passes through raw compressed bytes

Setting `encodeResponseBody: "manual"` in the `ClientRequest`'s fetch call disables auto-decompression, matching Node.js semantics.
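A sketch of the shape of the change (the `buildFetchInit` helper is hypothetical; the exact init object `ClientRequest` constructs internally differs, but the option it now carries is the one described above):

```javascript
// Hypothetical helper illustrating the init object passed to fetch().
function buildFetchInit(method, headers) {
  return {
    method,
    headers,
    // "manual" delivers raw compressed bytes and leaves the
    // Content-Encoding header intact; "automatic" (the default)
    // would decompress gzip/brotli bodies before IncomingMessage
    // ever sees them.
    encodeResponseBody: "manual",
  };
}

const init = buildFetchInit("GET", { accept: "*/*" });
console.log(init.encodeResponseBody); // manual
```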
Test

Added a test with a sidecar server that returns a gzip-compressed response. The test verifies that:

- the `Content-Encoding: gzip` header is present on the response
- the body is still compressed (it can be decompressed with `zlib.gunzipSync()`)
Note

Tests could not be run locally due to a Bazel registry certificate issue in the dev environment. CI should validate.