lib/buffer.js (9 additions, 0 deletions)

@@ -593,6 +593,15 @@ Buffer.concat = function concat(list, length) {
validateOffset(length, 'length');
}

// If there is only one FastBuffer, return it without copying
if (
list.length === 1 &&
list[0] instanceof FastBuffer &&
list[0].length === length
) {
return Buffer.from(list[0]);
Member:

If list[0] is a TypedArray (like another Buffer) then this still copies.

Member (Author):

I'm wondering if it makes sense to return list[0] directly. Since that would avoid the copy, I assume I should update the test accordingly?
https://github.com/nodejs/node/blob/main/test/parallel/test-buffer-concat.js#L43

Member:

Buffer.concat should always return a new buffer object:

node/doc/api/buffer.md, lines 1033 to 1034 in 91d2400:

> Returns a new `Buffer` which is the result of concatenating all the `Buffer`
> instances in the `list` together.

Member (Author):

As I understand it, at the end of the day we should always create a new buffer.

Member (Author):

I will try an optimization further down in the internals instead. I am closing this PR now; thank you very much for your comments.

}

const buffer = Buffer.allocUnsafe(length);
let pos = 0;
for (let i = 0; i < list.length; i++) {
test/parallel/test-buffer-concat.js (12 additions, 0 deletions)

@@ -44,6 +44,18 @@ assert.notStrictEqual(flatOne, one[0]);
assert.strictEqual(flatLong.toString(), check);
assert.strictEqual(flatLongLen.toString(), check);

{
const original = Buffer.from('fast');
const result = Buffer.concat([original], original.length);

const msgRef =
'Buffer.concat should return a new Buffer instance even with single FastBuffer';
const msgEq = 'Buffer.concat output should equal original content';

assert.notStrictEqual(result, original, msgRef);
assert.deepStrictEqual(result, original, msgEq);
}

[undefined, null, Buffer.from('hello')].forEach((value) => {
assert.throws(() => {
Buffer.concat(value);