
fix: prevent double decompression in node:http by disabling fetch auto-decode#6289

Merged
anonrig merged 2 commits into main from pbacondarwin/fix-http-node-compat-decompression on Mar 10, 2026

Conversation

petebacondarwin (Contributor) commented Mar 10, 2026

Fixes cloudflare/workers-sdk#11169

Summary

  • Pass encodeResponseBody: "manual" to the internal fetch() call in the Node.js http ClientRequest so that response bodies are not auto-decompressed by workerd's fetch implementation
  • This matches Node.js behavior where http.ClientRequest delivers raw compressed bytes with the Content-Encoding header intact, leaving decompression to the caller

Problem

The node:http module uses fetch() under the hood. When a response has Content-Encoding: gzip (or br), workerd's fetch() automatically decompresses the body but preserves the Content-Encoding header. This causes a mismatch: callers following the standard Node.js pattern of piping through zlib.createGunzip() based on the header would attempt to decompress already-decompressed data, resulting in errors or corruption.

Fix

workerd's fetch() supports an encodeResponseBody option:

  • "automatic" (default) — decompresses gzip/brotli bodies
  • "manual" — passes through raw compressed bytes

Setting encodeResponseBody: "manual" in the ClientRequest's fetch call disables auto-decompression, matching Node.js semantics.

Test

Added a test with a sidecar server that returns a gzip-compressed response. The test verifies that:

  1. The Content-Encoding: gzip header is present on the response
  2. The body is still compressed (can be manually decompressed with zlib.gunzipSync())

Note

Tests could not be run locally due to a Bazel registry certificate issue in the dev environment. CI should validate.

ask-bonk bot (Contributor) commented Mar 10, 2026

The implementation is correct. The encodeResponseBody: "manual" option is well-supported in workerd's fetch. Let me review the complete picture.

Summary of the changes:

  1. Core fix (internal_http_client.ts): Adds encodeResponseBody: 'manual' to the fetch call, preventing auto-decompression. This correctly matches Node.js http.ClientRequest behavior. The type cast as unknown as RequestInit is a necessary workaround since the @types/node RequestInit doesn't include workerd-specific properties.

  2. Test sidecar (http-client-nodejs-server.js): Adds a gzip server that returns pre-compressed data with proper Content-Encoding: gzip header.

  3. Test (http-client-nodejs-test.js): Verifies the Content-Encoding header is preserved and the body is still compressed (can be manually gunzipped).

  4. Build/config (BUILD.bazel, .wd-test): Properly adds the GZIP_SERVER_PORT binding.

Analysis:

  • Backward compatibility: The node:http module is gated behind enable_nodejs_http_modules (implied by nodeJsCompat after 2025-08-15). This is a bug fix aligning with Node.js semantics, so no additional compat flag is needed. Users relying on the old (incorrect) behavior would be doing something wrong anyway.
  • Security: No concerns -- this just changes encoding behavior on the client side.
  • API correctness: Matches Node.js behavior where http.ClientRequest delivers raw bytes.
  • Test quality: Good test that verifies both header preservation and body compression. Could be slightly improved with an error handler on the request, but this matches existing test patterns in the file.

This is a clean, well-scoped bug fix. No actionable issues.

LGTM


@petebacondarwin petebacondarwin marked this pull request as ready for review March 10, 2026 18:02
@petebacondarwin petebacondarwin requested review from a team as code owners March 10, 2026 18:02
fix: prevent double decompression in node:http by disabling fetch auto-decode

The node:http ClientRequest uses fetch() internally, which auto-decompresses
gzip/brotli responses by default. However, it preserves the Content-Encoding
header, causing callers that follow the Node.js convention of manually handling
Content-Encoding to attempt double decompression.

Pass encodeResponseBody: 'manual' to fetch() so raw compressed bytes are
delivered to IncomingMessage, matching Node.js http behavior where callers
handle decompression themselves.
petebacondarwin force-pushed the pbacondarwin/fix-http-node-compat-decompression branch from 7a7ffd7 to d775c19 on March 10, 2026 18:09
@anonrig anonrig enabled auto-merge (squash) March 10, 2026 19:22
@anonrig anonrig merged commit a629fd2 into main Mar 10, 2026
39 of 40 checks passed
@anonrig anonrig deleted the pbacondarwin/fix-http-node-compat-decompression branch March 10, 2026 19:39


Development

Successfully merging this pull request may close these issues.

axios doesn't work with enable_nodejs_http_modules if axios is in another chunk

3 participants