
Parsing socket #255

Open · sirenkovladd wants to merge 2 commits into master from parsing-socket

Conversation

sirenkovladd
Contributor

migrate to class syntax (Chain, request, response)
add a mock socket class (parse headers, body, trailers)
change emitting and subscription
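
As a rough, generic illustration of the mock-socket idea above (not the code in this PR; the MockSocket name and the parsing details are made up for the sketch): a duplex stream that buffers whatever the client writes and, once the request ends, splits a well-formed raw request into request line, headers and body.

const { Duplex } = require('node:stream')

class MockSocket extends Duplex {
  constructor () {
    super()
    this._chunks = []
  }

  _write (chunk, encoding, callback) {
    this._chunks.push(chunk) // buffer the raw request bytes written by the client
    callback()
  }

  _read () {
    // nothing is pushed back to the client in this sketch
  }

  // called when the client finishes the request: split it into its parts
  _final (callback) {
    const raw = Buffer.concat(this._chunks).toString('utf8')
    const headEnd = raw.indexOf('\r\n\r\n') // assumes a well-formed request
    const headLines = raw.slice(0, headEnd).split('\r\n')
    this.requestLine = headLines[0]
    this.headers = {}
    for (const line of headLines.slice(1)) {
      const sep = line.indexOf(':')
      this.headers[line.slice(0, sep).toLowerCase()] = line.slice(sep + 1).trim()
    }
    this.body = raw.slice(headEnd + 4) // trailer handling omitted for brevity
    callback()
  }
}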

this is the sequel to #252

bun.js support
update usage of deprecated attribute

some tests were changed due to unexpected behavior
the main tests were verified against a real http.Server/http.request round trip:

const http = require('node:http')

// inj, payload and cb come from the surrounding test case
http.createServer(inj).listen(3000, () => {
  const r = http.request({ port: 3000, method: 'POST' }, (res) => cb(null, res)).on('error', cb)
  if (payload) payload.pipe(r) // pipe() ends the request once the payload stream finishes
  else r.end()
})


@sirenkovladd
Contributor Author

It will take some time to optimize,
but I'm generally open to discussion

@sirenkovladd
Contributor Author

sirenkovladd commented Sep 3, 2023

light-my-request on branch parsing-socket (package v5.10.0, node v20.5.0, us-west-2) took 5s
❯ npm run benchmark

> [email protected] benchmark
> node benchmark/benchmark.js

Request x 160,977 ops/sec ±3.86% (80 runs sampled)
Custom Request x 14,371 ops/sec ±4.12% (81 runs sampled)
Request With Cookies x 136,650 ops/sec ±1.94% (87 runs sampled)
Request With Cookies n payload x 134,341 ops/sec ±2.71% (81 runs sampled)
ParseUrl x 1,207,543 ops/sec ±1.39% (87 runs sampled)
ParseUrl and query x 114,550 ops/sec ±2.00% (90 runs sampled)
light-my-request on branch master (package v5.10.0, node v20.5.0, us-west-2) took 11s
❯ npm run benchmark

> [email protected] benchmark
> node benchmark/benchmark.js

Request x 267,087 ops/sec ±14.67% (72 runs sampled)
Custom Request x 21,130 ops/sec ±3.09% (85 runs sampled)
Request With Cookies x 203,067 ops/sec ±7.45% (74 runs sampled)
Request With Cookies n payload x 205,931 ops/sec ±2.17% (83 runs sampled)
ParseUrl x 1,168,402 ops/sec ±5.19% (80 runs sampled)
ParseUrl and query x 141,309 ops/sec ±2.21% (94 runs sampled)

@mcollina mcollina (Member) left a comment

There seems to be quite a significant performance regression with this PR.

I think it's due to the addition of the internal Socket, so this might be a no-go.

I also wouldn't claim Bun support without the tests actually running on Bun.

@Eomm Eomm (Member) left a comment

It is hard to tell the old code apart from the new features.

I looked at the commits to follow the changes step by step, but the two commits did not help.

I really suggest splitting this PR into smaller pieces.

@@ -14,7 +14,7 @@ declare namespace inject {

export type DispatchFunc = http.RequestListener

-export type CallbackFunc = (err: Error, response: Response) => void
+export type CallbackFunc = (err: Error | undefined, response: Response | undefined) => void
Member
This could be a dedicated PR
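
For context, a minimal call-site sketch of what the widened CallbackFunc signature means for consumers (the dispatch handler below is illustrative, not code from this PR): exactly one of err/res is expected to be defined, so callers should guard on err first.

const inject = require('light-my-request')

const dispatch = (req, res) => { res.end('hello') }

inject(dispatch, { method: 'GET', url: '/' }, (err, res) => {
  // with (err: Error | undefined, response: Response | undefined)
  if (err) return console.error(err)
  console.log(res.statusCode, res.payload)
})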

Comment on lines +753 to +754
t.equal(err instanceof inject.errors.ContentLength, true)
t.error(res)
Member
The test says "can override stream payload content-length header" - are we blocking this feature now?

Contributor Author
Yes:

  1. it is not supported by the real http.Server/http.request setup used in the test example above
  2. I can't call the dispatch function until I have read everything from the Readable (see the sketch below)
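
A rough sketch of the constraint in point 2, using plain Node streams (collectPayload is a made-up helper name): with a stream payload the whole Readable has to be consumed before the request can be handed to the dispatch function, so its real length, and therefore the effective content-length, is only known at that point.

async function collectPayload (payload) {
  const chunks = []
  for await (const chunk of payload) {
    chunks.push(Buffer.from(chunk))
  }
  // the concatenated body's length is the only length available once the stream is drained
  return Buffer.concat(chunks)
}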

@@ -806,14 +831,14 @@ test('simulates split', (t) => {
})
})

-test('simulates error', (t) => {
+t.skip('simulates error', (t) => {
Member
Suggested change
-t.skip('simulates error', (t) => {
+test('simulates error', (t) => {

Contributor Author
This was done on purpose, as the behavior could not be reproduced with a real server and request.
Maybe you have ideas on how to do it?


const errorMessage = 'The dispatch function has already been invoked'

class Chain {
Member
This refactor could be another PR; smaller PRs help to get the code merged faster.
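
For orientation, a usage-level sketch of the chaining API that a class like this backs, based on the library's existing README rather than this PR's internals; the error message above exists because the chain must invoke the dispatch function only once.

const inject = require('light-my-request')

// dispatch stands in for any request listener under test
const dispatch = (req, res) => { res.end(req.url) }

// the chain records method/url and dispatches once when end() is called
inject(dispatch)
  .get('/hello')
  .end((err, res) => {
    if (err) throw err
    console.log(res.payload) // '/hello'
  })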

@mcollina
Member

mcollina commented Sep 3, 2023

I don't think we agreed on #252; this PR seems a bit premature too.

@sirenkovladd
Contributor Author

> I don't think we agreed on #252; this PR seems a bit premature too.

Yes, so I made another PR with the long term in mind.

@sirenkovladd
Contributor Author

This PR simply allows light-my-request to be more native, with support for most of the events, attributes, and methods that are currently available on a native Node request but missing from light-my-request.

That's why I used extends Socket and IncomingMessage.
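
As a rough illustration of that approach, using only generic Node APIs (MockRequest is a made-up name, not the PR's actual class):

const { IncomingMessage } = require('node:http')
const { Socket } = require('node:net')

// a request object that really is an http.IncomingMessage, so consumers get the
// native surface (events, socket, headers, piping) instead of a partial re-implementation
class MockRequest extends IncomingMessage {
  constructor (method, url) {
    super(new Socket()) // IncomingMessage expects a socket to bind to
    this.method = method
    this.url = url
  }
}

const req = new MockRequest('GET', '/')
console.log(req instanceof IncomingMessage) // true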
