perf: Reduce object generation and optimize performance. #95
Conversation
Wow, ultra fast! I'll review this later. @tangye1234, what are your thoughts on this? Are there any problems?
src/listener.ts
Outdated
get body() {
  return this.getRequestCache().body
},
This would make a huge difference for devs migrating from Express to Hono in the next few weeks. 🚀
Would it be possible to also support a built-in middleware such as body-parser (or a how-to guide) via config? cc @yusukebe
Thanks!
Hi @rafaell-lycan!
Thanks for your comment.
Do you think this will cause problems? If so, please let us know what specific issues you see.
> This would make a huge difference for devs migrating from Express to Hono in the next few weeks. 🚀
Sorry, I added this after the review started. I would also like to include the following commits. In hono-base.ts, there is https://github.com/honojs/hono/blob/864fae668c7505e523cfe9d57e4e0778ddaa2fe5/src/hono-base.ts#L302 … This produced the following results in my environment:
In modern JavaScript, Object.setPrototypeOf is said to perform poorly:
* https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/create
* https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/setPrototypeOf
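As a rough illustration of the point above (not code from this PR; the names are made up), creating an object with its prototype already in place avoids mutating the prototype after creation, which engines handle less well on hot paths:

```ts
type Greeter = { name: string; greet(): string }

const proto = {
  greet(this: Greeter) {
    return `hello ${this.name}`
  },
}

// Fast path: the prototype is fixed at creation time.
const a: Greeter = Object.assign(Object.create(proto), { name: 'a' })

// Slower pattern: changing the prototype after creation can deoptimize the object.
const b = { name: 'b' }
Object.setPrototypeOf(b, proto)

console.log(a.greet()) // "hello a"
```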
Incidentally, if we proceed with this implementation, I would like to merge the following two additional PRs after this:
After extensive testing, I found that the … The story is: in the past we deliberated over whether to use @remix-run/fetch or the Node.js native fetch. Node.js undici seemed heavy, slower than @remix-run/fetch, and it creates more internal objects, which kills performance. However, the performance differences were marginal. We deemed that a more conventional implementation would be neat and compatible with more cases, so we decided to stick with the native fetch.

Now we see a more brilliant implementation taking place here. Personally, I am not sure whether we should work around it with this PR. By this I mean: should we make extensive efforts to avoid using the native Request / Response constructors, aiming at incredibly high speeds for simple use cases like a hello-world benchmark? This could disrupt the simplicity of the project, as sometimes magic is just that: magic. The original purpose behind this PR was to shed the conversion between Node.js and web standards, aiming to employ the simplest methods possible.

Maybe yes, it's appropriate to leverage this concept. @yusukebe, the decision lies with you. I think it has reached a stage where there's not much coding work left; it's more about the direction you want to take.
Super thanks for your great work. I'll consider this deeply this weekend.
Hi, @tangye1234! Thanks for your thoughtful and detailed review. I fully agree with your comments. (I also learned a lot from the comments about the Node.js Response object submitted before that.)

While there is no doubt that this PR will achieve a very significant speedup on small benchmarks, it will compromise the simplicity of the implementation. Whether we should focus our efforts there or not is a matter of judgment.

For my part, even if this PR were to be merged, it is the effort of the node-server implementation up to this point that makes the result viable. If the bodyInit is not a string or ReadableStream, this PR's Response will fall back to the Node.js Response object, which requires the existing implementation.

As an aside, this is similar to the relationship between Hono's RegExpRouter and TrieRouter. RegExpRouter is complex and large in implementation but very fast; where it cannot handle some routing, TrieRouter, a straightforward implementation, takes over as the fallback.

Anyway, I have no objection to leaving it to the decision of @yusukebe. Best regards!
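To make the fallback idea concrete, here is a minimal sketch with hypothetical names (not the PR's actual code): cheap properties are served from the stored init, and anything that needs full spec behavior materializes a native Response exactly once.

```ts
class LazyResponse {
  private realResponse?: Response

  constructor(
    private bodyInit: BodyInit | null = null,
    private init: ResponseInit = {}
  ) {}

  // Fast path: answered from the stored init without building a Response.
  get status(): number {
    return this.realResponse?.status ?? this.init.status ?? 200
  }

  // Fast path: the listener can write strings / ReadableStreams directly.
  get rawBody(): BodyInit | null {
    return this.bodyInit
  }

  // Fallback: build the spec-compliant Response once, when it is actually needed.
  private materialize(): Response {
    return (this.realResponse ??= new Response(this.bodyInit, this.init))
  }

  get headers(): Headers {
    return this.materialize().headers
  }

  arrayBuffer(): Promise<ArrayBuffer> {
    return this.materialize().arrayBuffer()
  }
}
```

In this shape, a GET handler returning a string never pays for a native Response, while a Blob or FormData body simply falls through to the existing implementation.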
P.S. If someone asks me whether this PR is a "cheat against benchmark scripts", I would answer "No". Most requests expected to be accelerated in a production website will likely be GET requests that return either a string or a ReadableStream. Since this PR is an optimization for those (which is annoying considering GraphQL), it will also contribute to performance gains in production, although the effect becomes smaller as the response size increases.

RegExpRouter performs amazingly well in the benchmark scripts. Still, it is also faster in production than other routers when there are lots of handler registrations and lots of middleware registrations. These characteristics are similar to this case. In short, the content of this PR should produce results consistent with Hono's policy of being "speedy in benchmark scripts, but also fast in production environments".
Let's make Hono one of the fastest frameworks for Node.js!

For Hono, being fast and small are the most important things. So, let's work on making Hono the fastest framework for Node.js! I've deeply considered the PR by @usualoma and the comments by @tangye1234, and after running the app with the Node.js Adapter with this PR, I've decided to proceed with it. The reason is simple: it makes Hono faster.

My investigation revealed that the implementation isn't overly complex or "magical". The logic is simple: it makes operations that aren't immediately necessary lazy. This approach is already applied in some parts of Hono's core.

How much faster will Hono be with this PR?

I'm comparing it with other frameworks using these scripts:
The scripts you can use: https://gist.github.com/yusukebe/895f2ff55014c10f3e249a87e592ca49

One test is accessing … According to these tests, Hono is either the fastest or second, behind Fastify. We need to document these results, so I will do that.

Next Actions

Could you run the benchmarks and look for more tuning points? I'll review and comment later. Your contribution so far is appreciated, and I hope you will continue. Thank you.
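For anyone who wants to reproduce this kind of measurement locally, the sketch below is one possible way; it assumes autocannon as the load generator and a server already listening on port 3000, which may differ from the gist's actual scripts.

```ts
import autocannon from 'autocannon'

// Hammer the root route for 10 seconds with 100 concurrent connections.
const result = await autocannon({
  url: 'http://localhost:3000/',
  connections: 100,
  duration: 10,
})

console.log(`${result.requests.average} req/sec on average`)
```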
src/listener.ts
Outdated
  outgoing: ServerResponse | Http2ServerResponse
) => {
  try {
    res = await res
Can we make it skip the `await` if it's not necessary?
@usualoma Ah, you already did it!
https://github.com/usualoma/node-server/pull/2/files
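Roughly, the suggestion amounts to something like this sketch (not the code from the linked PR):

```ts
// `await` on a plain value still defers to a microtask, so check for a real
// Promise first and keep a fully synchronous fast path when possible.
function handleResponse(
  res: Response | Promise<Response>,
  write: (res: Response) => void
): void | Promise<void> {
  if (res instanceof Promise) {
    return res.then(write)
  }
  write(res) // synchronous fast path: no await, no extra microtask
}
```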
  __body: body,
  __init: init,
  __cache: [body, (init?.headers || {}) as Record<string, string>],
})
Hono does not return a Content-Type header when returning a response with a method like `c.text('hi')`. We have to set Content-Type to `text/plain;charset=UTF-8` if it's not set, here or somewhere nearby.
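A sketch of that default, using a hypothetical helper name (not the PR's actual code):

```ts
// If the cached headers carry no content-type, fall back to the value Hono
// uses for c.text(): "text/plain;charset=UTF-8".
const ensureContentType = (
  headers: Record<string, string>
): Record<string, string> => {
  const hasContentType = Object.keys(headers).some(
    (key) => key.toLowerCase() === 'content-type'
  )
  return hasContentType
    ? headers
    : { ...headers, 'content-type': 'text/plain;charset=UTF-8' }
}
```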
Thanks!
P.S. As we have discussed, being fast in benchmarks is not the same as being fast in the real world. But being fast in benchmarks is a good metric, and as @usualoma said, in many cases this improvement will improve performance in the real world.
Thank you. I will now merge the following and do some refactoring. I will ask for a re-review here when I am done.
perf: Response via cache also for a readable stream or a promise.
refactor: Rewrite to a more preferred style of JavaScript
…st.ts and response.ts.
If you're interested, you might want to check out this repository: https://github.com/the-benchmarker/web-frameworks

This is another popular benchmark. Hono is tested on Bun, but the results show it as slow, possibly because it runs with a single thread on Bun; all the frameworks for Bun seem to do poorly in this benchmark. I think Hono has the potential to be the fastest among JavaScript frameworks.
The /json performance is so good with Fastify because the serializer is prepared in advance. This is indeed a good idea. The npm package that builds the serializer is fast-json-stringify, which will also speed up Hono's /json. (It's already fast, though, and it isn't easy to stabilize the benchmark.) I have seen improved performance with the following code:

```js
import fastJson from 'fast-json-stringify'
import { serve } from '../node-server/dist/index.mjs'
import { Hono } from './dist/index.js'
const stringify = fastJson({
  type: 'object',
  properties: {
    hello: {
      type: 'string'
    }
  }
})

const app = new Hono()
app.get('/', (c) => c.text('Hi'))
app.get('/json', (c) =>
  new Response(stringify({ hello: 'world' }), {
    headers: {
      'content-type': 'application/json; charset=UTF-8'
    }
  })
)

const port = 3001
console.log(port)
serve({
  fetch: app.fetch,
  port
})
```

Creating a custom serializer is not that difficult, and if it is a straightforward object, it can be implemented as follows, making it faster. However, supporting all kinds of schemas would be a big job.

```js
import { serve } from '../node-server/dist/index.mjs'
import { Hono } from './dist/index.js'
const jsonEscapeRe = /["\\\n\r]/ // no /g flag, so .test() stays stateless across calls
const isJsonEscapeRequired = (str) => jsonEscapeRe.test(str)

const buildSerializer = (schema) => {
  const list = []
  const keys = []
  for (const [k, v] of Object.entries(schema.properties)) {
    if (v.type === 'string') {
      // Close the previous value and add a comma before every key after the first.
      list.push(`${list.length ? '",' : ''}"${k}":"`)
      keys.push(k)
    }
  }
  list[0] = '{' + list[0]
  list.push('"}')
  return (data) => {
    let res = list[0]
    for (let i = 0; i < keys.length; i++) {
      const value = data[keys[i]]
      res += (isJsonEscapeRequired(value) ? JSON.stringify(value).replace(/^"|"$/g, '') : value) + list[i + 1]
    }
    return res
  }
}
const serializer = buildSerializer({
  type: 'object',
  properties: {
    hello: {
      type: 'string'
    }
  }
})

const app = new Hono()
app.get('/', (c) => c.text('Hi'))
app.get('/json', (c) =>
  new Response(serializer({ hello: 'world' }), {
    headers: {
      'content-type': 'application/json; charset=UTF-8'
    }
  })
)

const port = 3001
console.log(port)
serve({
  fetch: app.fetch,
  port
})
```

Considering the possibility of introducing this to Hono, it would be nice if it could be tied to jsonT or integrated in some way. However, it sounds like a challenging task.
My work is done. Would you please review this? Although only some comments were added, 661a123 moved the Request and Response definitions into independent files for better visibility. As for the tests, I have added some for the Request and Response fallback pattern; the existing tests cover the others.
LGTM! Awesome work!
  __body: body,
  __init: init,
  __cache: [body, (init?.headers || {}) as Record<string, string>],
})
Thanks!
const app = new Hono()

app.get('/json-blob', async () => {
  return new Response(new Blob([JSON.stringify({ foo: 'blob' })]), {
Nice tests!
Sorry to ping you again, but could you review this? If I don't hear back from you soon, I'll go ahead and merge it.
I am testing an implementation that can fully share the pros of this PR but is cleaner, while also satisfying these cases:

```js
// response name not changed
Response.name === 'Response'
// response prototype chain not changed
new Response() instanceof (await fetch('https://www.google.com')).constructor
// can only use new operator
expect(Response).toThrow()
// support Response static method
typeof Response.error === 'function'
typeof Response.json === 'function'
typeof Response.redirect === 'function'
// support other class to extends from Response
class NextResponse extends Response {}
// support ReadableStream / URLSearchParams / string / ArrayBuffer quick proxy (without constructing a real Response)
```

But I don't know when it could be finished.
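For illustration only, here is one way the prototype-chain and static-method requirements above could be satisfied; this is a sketch under assumptions, not the implementation in the branch linked below.

```ts
const GlobalResponse = Response

class LightweightResponse {
  // ...lazy body/init handling, as in this PR, would live here...
}

// Done once at module load, so the per-request setPrototypeOf cost is avoided:
// instances satisfy `new LightweightResponse() instanceof Response`, and
// subclasses such as `class NextResponse extends LightweightResponse {}` work.
Object.setPrototypeOf(LightweightResponse.prototype, GlobalResponse.prototype)

// Statics such as Response.json / redirect / error become reachable as well.
Object.setPrototypeOf(LightweightResponse, GlobalResponse)
```

In a real implementation, accessors inherited from Response.prototype would also need overrides or lazy materialization, since the native getters perform brand checks that a lightweight instance would fail.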
Since `Response.json` is not defined in 4.8.3.
I also did the same tests, based completely on your branch: https://github.com/tangye1234/honojs-node-server/tree/perf/proxy-object

I did not open a PR here, because I respect @usualoma's performant and beautiful work. In my code snippets, there are 3 important things which should be clear:
Additionally, I did some work to support stream / ArrayBuffer bodies and to support reading the native Node.js Response state for a more direct proxy. I have also preserved the static methods on Response (json / redirect / error). The job may not be completely over; I tested this implementation and it is slightly slower than @usualoma's, which I guess is due to the more complex code. Speed tests:
Thanks, @tangye1234. In this case, I prefer @usualoma's implementation (I think it's because I am familiar with reading @usualoma's code). Therefore, I'd like to proceed with this PR.
@tangye1234 I have confirmed your branch.

Yes, indeed, my branch was very messy in its implementation in this regard. It was fixed in e0c57d2. It would also be reasonable to override global.Request via globals.ts; that was fixed in 5e28a91. However, this is a jest issue, since …

I think getResponseInternalBody is an exciting implementation. It would help speed things up and reduce conditional branching in the relevant part of the current implementation, and I find it very attractive. However, since it depends on the internal implementation of Node.js, it is debatable whether it should be introduced to node-server, so it is better to make it a separate PR. A separate PR is also better because it will make clear that it is the work of @tangye1234.

As for me, I would like to consider performance first here. The content of this PR is the fastest at the moment, as indicated by #95 (comment), so I would like to merge the current HEAD. After that, improving with additional PRs as needed would be good.
There is an important case you may have missed:

```js
const app = new Hono()
app.get('/', c => {
  const isRealRequest = c.req.raw instanceof Request
  // should return true
  return c.json(isRealRequest)
})
```

In the listener, the incoming message should be converted to a Request that is an instance of the real Node.js Request.
Thank you again. It was fixed in a9c3cd2.
Sorry for the delay! Is this ready to be merged? Okay?? I'll release v1.3.0, which includes this PR, after merging.
@yusukebe
Okay! Let's gooooooooooo!!
This is an optimization in a way that does not use the web standard Request and Response, so it may be out of line with the implementation policy in this repository, but this change made it 1.6 times faster.
Currently, only the simplest case, where a string is returned, is optimized, but with additional PRs we can also optimize the pattern where a stream is returned.
(Benchmark table: the app on the main branch vs. the perf/proxy-object branch.)