3 changes: 1 addition & 2 deletions doc/api/index.md
@@ -35,13 +35,13 @@
* [HTTPS](https.md)
* [Inspector](inspector.md)
* [Internationalization](intl.md)
* [Iterable Streams API](stream_iter.md)
* [Modules: CommonJS modules](modules.md)
* [Modules: ECMAScript modules](esm.md)
* [Modules: `node:module` API](module.md)
* [Modules: Packages](packages.md)
* [Modules: TypeScript](typescript.md)
* [Net](net.md)
* [OS](os.md)
* [Path](path.md)
* [Performance hooks](perf_hooks.md)
@@ -71,7 +71,6 @@
* [Web Streams API](webstreams.md)
* [Worker threads](worker_threads.md)
* [Zlib](zlib.md)

<hr class="line"/>

21 changes: 10 additions & 11 deletions doc/api/stream_iter.md
@@ -416,7 +416,7 @@ if (writer.endSync() < 0) await writer.end();
writer.fail(err); // Always synchronous, no fallback needed
```

#### `writer.desiredSize`

* {number|null}

@@ -425,7 +425,7 @@ Returns `null` if the writer is closed or the consumer has disconnected.

The value is always non-negative.

#### `writer.end([options])`

* `options` {Object}
* `signal` {AbortSignal} Cancel just this operation. The signal cancels only
@@ -434,7 +434,7 @@ The value is always non-negative.

Signal that no more data will be written.

#### `writer.endSync()`

* Returns: {number} Total bytes written, or `-1` if the writer is not open.

@@ -448,7 +448,7 @@ if (result < 0) {
}
```

#### `writer.fail(reason)`

* `reason` {any}

@@ -457,7 +457,7 @@ or errored, this is a no-op. Unlike `write()` and `end()`, `fail()` is
unconditionally synchronous because failing a writer is a pure state
transition with no async work to perform.

#### `writer.write(chunk[, options])`

* `chunk` {Uint8Array|string}
* `options` {Object}
@@ -467,15 +467,15 @@ transition with no async work to perform.

Write a chunk. The promise resolves when buffer space is available.

#### `writer.writeSync(chunk)`

* `chunk` {Uint8Array|string}
* Returns: {boolean} `true` if the write was accepted, `false` if the
buffer is full.

Synchronous write. Does not block; returns `false` if backpressure is active.

#### `writer.writev(chunks[, options])`

* `chunks` {Uint8Array\[]|string\[]}
* `options` {Object}
@@ -485,7 +485,7 @@ Synchronous write. Does not block; returns `false` if backpressure is active.

Write multiple chunks as a single batch.

#### `writer.writevSync(chunks)`

* `chunks` {Uint8Array\[]|string\[]}
* Returns: {boolean} `true` if the write was accepted, `false` if the
@@ -1422,7 +1422,7 @@ added: v25.9.0

Compression and decompression transforms for use with `pull()`, `pullSync()`,
`pipeTo()`, and `pipeToSync()` are available via the [`node:zlib/iter`][]
module. See the [`node:zlib/iter` documentation][`node:zlib/iter`] for details.

## Classic stream interop

@@ -2069,8 +2069,7 @@ console.log(textSync(stream)); // 'hello world'
[`bytes()`]: #bytessource-options
[`from()`]: #frominput
[`fromSync()`]: #fromsyncinput
[`node:zlib/iter`]: zlib.md#iterable-compression
[`pipeTo()`]: #pipetosource-transforms-writer-options
[`pull()`]: #pullsource-transforms-options
[`pullSync()`]: #pullsyncsource-transforms-options
255 changes: 254 additions & 1 deletion doc/api/zlib.md
@@ -1746,11 +1746,261 @@ added:

Decompress a chunk of data with [`ZstdDecompress`][].

## Iterable Compression

<!-- YAML
added: v25.9.0
-->

> Stability: 1 - Experimental

The `node:zlib/iter` module provides compression and decompression transforms
for use with the [`node:stream/iter`][] iterable streams API.

This module is available only when the `--experimental-stream-iter` CLI flag
is enabled.

Each algorithm has both an async variant (stateful async generator, for use
with [`pull()`][] and [`pipeTo()`][]) and a sync variant (stateful sync
generator, for use with `pullSync()` and `pipeToSync()`).

The async transforms run compression on the libuv threadpool, overlapping
I/O with JavaScript execution. The sync transforms run compression directly
on the main thread.

> Note: The defaults for these transforms are tuned for streaming throughput,
> and differ from the defaults in `node:zlib`. In particular, gzip/deflate
> default to level 4 (not 6) and memLevel 9 (not 8), and Brotli defaults to
> quality 6 (not 11). These choices match common HTTP server configurations
> and provide significantly faster compression with only a small reduction in
> compression ratio. All defaults can be overridden via options.

```mjs
import { from, pull, bytes, text } from 'node:stream/iter';
import { compressGzip, decompressGzip } from 'node:zlib/iter';

// Async round-trip
const compressed = await bytes(pull(from('hello'), compressGzip()));
const original = await text(pull(from(compressed), decompressGzip()));
console.log(original); // 'hello'
```

```cjs
const { from, pull, bytes, text } = require('node:stream/iter');
const { compressGzip, decompressGzip } = require('node:zlib/iter');

async function run() {
const compressed = await bytes(pull(from('hello'), compressGzip()));
const original = await text(pull(from(compressed), decompressGzip()));
console.log(original); // 'hello'
}

run().catch(console.error);
```

```mjs
import { fromSync, pullSync, textSync } from 'node:stream/iter';
import { compressGzipSync, decompressGzipSync } from 'node:zlib/iter';

// Sync round-trip
const compressed = pullSync(fromSync('hello'), compressGzipSync());
const original = textSync(pullSync(compressed, decompressGzipSync()));
console.log(original); // 'hello'
```

```cjs
const { fromSync, pullSync, textSync } = require('node:stream/iter');
const { compressGzipSync, decompressGzipSync } = require('node:zlib/iter');

const compressed = pullSync(fromSync('hello'), compressGzipSync());
const original = textSync(pullSync(compressed, decompressGzipSync()));
console.log(original); // 'hello'
```

### `compressBrotli([options])`

### `compressBrotliSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `params` {Object} Key-value object where keys and values are
`zlib.constants` entries. The most important compressor parameters are:
* `BROTLI_PARAM_MODE` -- `BROTLI_MODE_GENERIC` (default),
`BROTLI_MODE_TEXT`, or `BROTLI_MODE_FONT`.
* `BROTLI_PARAM_QUALITY` -- ranges from `BROTLI_MIN_QUALITY` to
`BROTLI_MAX_QUALITY`. **Default:** `6` (not `BROTLI_DEFAULT_QUALITY`
which is 11). Quality 6 is appropriate for streaming; quality 11 is
intended for offline/build-time compression.
* `BROTLI_PARAM_SIZE_HINT` -- expected input size. **Default:** `0`
(unknown).
* `BROTLI_PARAM_LGWIN` -- window size (log2). **Default:** `20` (1 MB).
The Brotli library default is 22 (4 MB); the reduced default saves
memory without significant compression impact for streaming workloads.
* `BROTLI_PARAM_LGBLOCK` -- input block size (log2).
See the [Brotli compressor options][] in the zlib documentation for the
full list.
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a Brotli compression transform. Output is compatible with
`zlib.brotliDecompress()` and `decompressBrotli()`/`decompressBrotliSync()`.

### `compressDeflate([options])`

### `compressDeflateSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `level` {number} Compression level (`0`-`9`). **Default:** `4`.
* `windowBits` {number} **Default:** `Z_DEFAULT_WINDOWBITS` (15).
* `memLevel` {number} **Default:** `9`.
* `strategy` {number} **Default:** `Z_DEFAULT_STRATEGY`.
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a deflate compression transform. Output is compatible with
`zlib.inflate()` and `decompressDeflate()`/`decompressDeflateSync()`.

### `compressGzip([options])`

### `compressGzipSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `level` {number} Compression level (`0`-`9`). **Default:** `4`.
* `windowBits` {number} **Default:** `Z_DEFAULT_WINDOWBITS` (15).
* `memLevel` {number} **Default:** `9`.
* `strategy` {number} **Default:** `Z_DEFAULT_STRATEGY`.
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a gzip compression transform. Output is compatible with `zlib.gunzip()`
and `decompressGzip()`/`decompressGzipSync()`.

### `compressZstd([options])`

### `compressZstdSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `params` {Object} Key-value object where keys and values are
`zlib.constants` entries. The most important compressor parameters are:
* `ZSTD_c_compressionLevel` -- **Default:** `ZSTD_CLEVEL_DEFAULT` (3).
* `ZSTD_c_checksumFlag` -- generate a checksum. **Default:** `0`.
* `ZSTD_c_strategy` -- compression strategy. Values include
`ZSTD_fast`, `ZSTD_dfast`, `ZSTD_greedy`, `ZSTD_lazy`,
`ZSTD_lazy2`, `ZSTD_btlazy2`, `ZSTD_btopt`, `ZSTD_btultra`,
`ZSTD_btultra2`.
See the [Zstd compressor options][] in the zlib documentation for the
full list.
* `pledgedSrcSize` {number} Expected uncompressed size (optional hint).
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a Zstandard compression transform. Output is compatible with
`zlib.zstdDecompress()` and `decompressZstd()`/`decompressZstdSync()`.

### `decompressBrotli([options])`

### `decompressBrotliSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `params` {Object} Key-value object where keys and values are
`zlib.constants` entries. Available decompressor parameters:
* `BROTLI_DECODER_PARAM_DISABLE_RING_BUFFER_REALLOCATION` -- boolean
flag affecting internal memory allocation.
* `BROTLI_DECODER_PARAM_LARGE_WINDOW` -- boolean flag enabling "Large
Window Brotli" mode (not compatible with [RFC 7932][]).
See the [Brotli decompressor options][] in the zlib documentation for
details.
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a Brotli decompression transform.

### `decompressDeflate([options])`

### `decompressDeflateSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `windowBits` {number} **Default:** `Z_DEFAULT_WINDOWBITS` (15).
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a deflate decompression transform.

### `decompressGzip([options])`

### `decompressGzipSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `windowBits` {number} **Default:** `Z_DEFAULT_WINDOWBITS` (15).
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a gzip decompression transform.

### `decompressZstd([options])`

### `decompressZstdSync([options])`

<!-- YAML
added: v25.9.0
-->

* `options` {Object}
* `chunkSize` {number} Output buffer size. **Default:** `65536` (64 KB).
* `params` {Object} Key-value object where keys and values are
`zlib.constants` entries. Available decompressor parameters:
* `ZSTD_d_windowLogMax` -- maximum window size (log2) the decompressor
will allocate. Limits memory usage against malicious input.
See the [Zstd decompressor options][] in the zlib documentation for
details.
* `dictionary` {Buffer|TypedArray|DataView}
* Returns: {Object} A stateful transform.

Create a Zstandard decompression transform.

[Brotli compressor options]: #compressor-options
[Brotli decompressor options]: #decompressor-options
[Brotli parameters]: #brotli-constants
[Cyclic redundancy check]: https://en.wikipedia.org/wiki/Cyclic_redundancy_check
[Memory usage tuning]: #memory-usage-tuning
[RFC 7932]: https://www.rfc-editor.org/rfc/rfc7932.html
[Streams API]: stream.md
[Zstd compressor options]: #compressor-options-1
[Zstd decompressor options]: #decompressor-options-1
[Zstd parameters]: #zstd-constants
[`.flush()`]: #zlibflushkind-callback
[`Accept-Encoding`]: https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.3
@@ -1769,6 +2019,9 @@ Decompress a chunk of data with [`ZstdDecompress`][].
[`ZstdDecompress`]: #class-zlibzstddecompress
[`buffer.kMaxLength`]: buffer.md#bufferkmaxlength
[`deflateInit2` and `inflateInit2`]: https://zlib.net/manual.html#Advanced
[`node:stream/iter`]: stream_iter.md
[`pipeTo()`]: stream_iter.md#pipetosource-transforms-writer-options
[`pull()`]: stream_iter.md#pullsource-transforms-options
[`stream.Transform`]: stream.md#class-streamtransform
[convenience methods]: #convenience-methods
[zlib documentation]: https://zlib.net/manual.html#Constants