Replies: 6 comments 34 replies
-
I had missed this message. Anyway, I have added experimental support for this now; you can try the latest binaries with the latest commit.
-
I benchmarked on Windows with a body size of 5 MB (11 chunks), comparing four strategies:
- Concat on every call (/base)
- Concat on final call (/concat)
- Pre-allocated buffer (/allocated)
- onFullData function (/fullData)
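The three userland strategies above can be sketched as follows. This is a minimal illustration of what each benchmark variant does with the incoming chunks; the function names are mine, not part of the uWS.js API:

```javascript
// Strategy 1 (/base): grow the body on every chunk -- O(n^2) total copying,
// since the whole accumulated body is re-copied each time.
function concatEveryCall(chunks) {
  let body = Buffer.alloc(0);
  for (const chunk of chunks) {
    body = Buffer.concat([body, chunk]);
  }
  return body;
}

// Strategy 2 (/concat): collect chunks, concatenate once on the final call.
function concatOnFinalCall(chunks) {
  return Buffer.concat(chunks);
}

// Strategy 3 (/allocated): pre-allocate from Content-Length and copy each
// chunk into place -- one allocation, one copy per byte.
function preAllocated(chunks, contentLength) {
  const body = Buffer.allocUnsafe(contentLength);
  let offset = 0;
  for (const chunk of chunks) {
    chunk.copy(body, offset);
    offset += chunk.length;
  }
  return body.subarray(0, offset);
}
```

The pre-allocated variant only works when Content-Length is known up front, which is why the chunked fallback still needs one of the concat strategies.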
-
The thing is, I could easily make an entirely different JSON parser that works in a completely different way: it takes an ArrayBuffer, a query, and an output buffer. You give it a "query language" like "[root.a, root.a.b]" and then access the extracted properties as offsets in the output buffer. This would make it essentially free of cost and faster than any other JSON parser for JS. It would have zero memory allocations and just give you pointer offsets into the input data (zero copy). It would be vastly faster than JSON.parse, which returns a dynamic, garbage-collected object with copied string keys (holy shit, that sentence makes me puke).

Then it would get hype. People would say "nice module, it is so fast... but it has a complex interface, can we make it return a garbage-collected object instead?" And then someone wraps it like that, making it worse than JSON.parse, which also returns a garbage-collected object, and now we have negated the entire endeavour and we're back to square one, where brainlets and hype always erode the peak down to the lowest common denominator. That's basically the summary of uWS.js itself, just on a smaller (imaginary) project. The idea immediately dies because I can already see the end result becoming as shitty as everything else already is.
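The zero-copy idea above can be sketched in a few lines: given a JSON buffer and a key, return [start, end) byte offsets of the value instead of a parsed, garbage-collected object. The function name and the flat, single-level lookup are assumptions for illustration; a real implementation would be a proper parser, not a substring scan:

```javascript
// Return [start, end) offsets of a top-level key's value in a JSON buffer,
// so the caller can slice the input directly -- no object is allocated.
function valueOffsets(buf, key) {
  const needle = Buffer.from(`"${key}":`);
  const at = buf.indexOf(needle);
  if (at === -1) return null;
  let start = at + needle.length;
  while (buf[start] === 0x20) start++;          // skip spaces
  let end = start;
  if (buf[start] === 0x22) {                    // string value: scan to closing quote
    end = buf.indexOf(0x22, start + 1) + 1;
  } else {                                      // number/literal: scan to , or }
    while (end < buf.length && buf[end] !== 0x2c && buf[end] !== 0x7d) end++;
  }
  return [start, end];
}
```

The caller reads `buf.subarray(start, end)` on demand; nothing is copied until (and unless) the value is actually needed.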
-
By the way, why does "onFullData" overwrite the "res->onAborted" callback? I guess since it is not a pub/sub system, only one callback can be registered at a time, so it replaces the res.onAborted set from JS.
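The single-callback behavior being asked about can be illustrated with a mock: if the response holds one abort callback rather than a listener list, registering a new one silently replaces the previous, which is why a native onFullData that installs its own abort handling would clobber a handler set from JS. `FakeResponse` is a stand-in for illustration, not the uWS.js HttpResponse:

```javascript
// Mock with last-writer-wins callback semantics: there is a single slot,
// not a listener list, so each onAborted() call replaces the previous one.
class FakeResponse {
  onAborted(cb) {
    this.abortedCb = cb;   // overwrite, do not append
    return this;
  }
  _abort() {               // test hook simulating a client abort
    if (this.abortedCb) this.abortedCb();
  }
}
```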
-
I updated the results again.
-
There is one more possible improvement, but it needs new (or an extension of existing) functionality. The idea is currently in the process of being proposed to TC39, but demonstration tests already show benefits.

-
To parse the request body, we currently have to use the "onData" function.
While it provides full control over the receiving stream and works well for uploads or client-side streaming, it is overly complicated and less efficient for simpler requests such as form posts or JSON APIs.
So I propose adding an "onFullData" function that automatically concatenates the received chunks and returns the complete body as an ArrayBuffer to Node.js. The performance benefits are shown in the benchmark results in the replies.
The function signature could be:

onFullData(maxSize: number, timeout: number, handler: (fullBody: ArrayBuffer) => void): HttpResponse;

On timeout or when maxSize is exceeded, the handler is invoked with the partial body.
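The proposed behavior can be approximated in userland today on top of onData, which is essentially what the /concat benchmark variant does. The sketch below uses a mock response so it runs standalone; the real uWS.js onData delivers (ArrayBuffer, isLast), but the timeout handling from the proposed signature is omitted here as a simplification:

```javascript
// Userland approximation of the proposed onFullData, built on onData.
// Invokes the handler with the full body on the last chunk, or with the
// partial body as soon as maxSize is reached.
function onFullData(res, maxSize, handler) {
  const chunks = [];
  let total = 0;
  let done = false;
  res.onData((ab, isLast) => {
    if (done) return;                    // already delivered a partial body
    const chunk = Buffer.from(ab);
    total += chunk.length;
    chunks.push(chunk);
    if (isLast || total >= maxSize) {
      done = true;
      handler(Buffer.concat(chunks));
    }
  });
  return res;
}

// Mock standing in for the uWS.js HttpResponse, for demonstration only.
class MockResponse {
  onData(cb) { this.dataCb = cb; return this; }
  _push(str, isLast) {                   // test hook simulating arriving chunks
    this.dataCb(new TextEncoder().encode(str).buffer, isLast);
  }
}
```

The native version proposed above would still beat this by skipping the per-chunk JS callback and the final concat copy, which is the point of moving it into the core.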