Feature: grammar helper #85
I wanted to know if this approach was faster or slower - it is slower, and I did attempt some optimizations, but I'm not an expert, and the benchmark is really unreliable on my system for some reason... I'm seeing anywhere from 1% to 10% overhead compared with the existing approach.

On this branch, you can also see what the JSON parser would look like in this format: mindplay-dk/sigma@add-core-folder...mindplay-dk:sigma:grammar-helper

I quite like the encapsulation, the type-safety, and removing the potential for uninitialized parsers - but I'm not sure what the actual overhead is, so I can't really say whether I think it's worthwhile, or whether it could be optimized. There might be some inherent overhead due to the defer-like indirection for each member.
Here's a simplified version:

```ts
export function grammar<T extends GrammarType>(init: GrammarInit<T>): Grammar<T> {
  const grammar = {} as { [key: string]: Parser<unknown> }

  // First pass: create a placeholder parser for every rule, so every rule
  // object exists before any of the factories run.
  for (const key in init) {
    grammar[key] = {
      parse() {
        throw new Error()
      }
    }
  }

  // Second pass: call each factory with the grammar as `this` and patch the
  // real parse() onto the placeholder.
  for (const key in init) {
    grammar[key].parse = init[key].apply(grammar).parse
  }

  return grammar as Grammar<T>
}
```

Just putting in "dummy" parsers for everything in the first loop, so that we get objects with a `parse` method in place from the start. Logically, this should be faster, because we're no longer doing anything dynamic - but it's quite possible that replacing a function like this breaks some sort of optimization in V8, I really can't say without a benchmark.

What's really suspicious is that, if I swap the order of the benchmarks, the results change too.
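Incidentally, here's a minimal sketch of the kind of supporting types the snippet above assumes (simplified stand-ins, not necessarily the exact definitions on the branch), plus a toy usage with two mutually referencing rules:

```ts
// Simplified stand-ins for the types the snippet above refers to.
interface Parser<T> {
  parse(): T
}

type GrammarType = Record<string, unknown>

// The finished grammar: one parser per rule.
type Grammar<T extends GrammarType> = { [K in keyof T]: Parser<T[K]> }

// The init object: each rule is a factory, and ThisType<Grammar<T>> is what
// lets each factory reach its sibling rules through `this`, fully typed.
type GrammarInit<T extends GrammarType> = {
  [K in keyof T]: () => Parser<T[K]>
} & ThisType<Grammar<T>>

// Toy usage: `b` refers to `a` through `this`. When `b`'s factory runs, `this.a`
// is still the dummy parser, but its parse() gets patched by the second loop,
// so the circular reference resolves correctly by the time anything is parsed.
const g = grammar<{ a: string; b: string }>({
  a: () => ({ parse: () => 'a' }),
  b() {
    const a = this.a
    return { parse: () => a.parse() + 'b' }
  },
})

console.log(g.b.parse()) // "ab"
```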
For the record, I opened an issue here to find out if this benchmark is still valid.
So yeah, I have to conclude the benchmark itself is unreliable. Not sure how to address this. Either way, having run the benchmarks in isolation, I'm now certain the overhead is acceptable. So I think it makes sense to add this feature? If you agree, I'll try to get it wrapped up with tests and docs soon 🙂
Yeah, benchmarking in Node/JS is super flaky. As for the feature itself, it does look interesting and the implementation is kinda clever! Let's give it a go.
Yeah, I was already tempted, and started mucking about with some maths here: https://github.com/mindplay-dk/sigma/blob/crazy-math/benchmarks/src/json/index.ts

There isn't really anything maths can do to fix this problem though - after all the testing I did over the last couple of days, I'm fairly confident there is some sort of V8/JS engine optimization that applies only to the first module that loads.

Not sure where I'm going with this - if I decide to do something about it, it would need process isolation somehow (forking or launching a dedicated process for each test), which probably means it'll have limitations on what you can test (e.g. Workers don't expose the DOM) or it'll work on Node only. I might do another round at some point and scout for an existing benchmark library that works...
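For what it's worth, the process-isolation idea could be as simple as forking a fresh Node process per benchmark module - a minimal sketch, with made-up file names:

```ts
// Minimal sketch of per-test process isolation in Node: each candidate module
// runs in its own forked process, so every candidate is "the first module that
// loads" in a fresh V8 instance. File names below are hypothetical.
import { fork } from 'node:child_process'

const candidates = ['./bench/defer.js', './bench/grammar.js']

function runIsolated(modulePath: string): Promise<number> {
  return new Promise((resolve, reject) => {
    const child = fork(modulePath)
    // Each benchmark module is expected to process.send() its ops/sec figure.
    child.once('message', (message) => resolve(Number(message)))
    child.once('error', reject)
    child.once('exit', (code) => {
      if (code !== 0) reject(new Error(`${modulePath} exited with code ${code}`))
    })
  })
}

for (const path of candidates) {
  // Run sequentially so the processes don't compete for CPU.
  console.log(path, 'ops/sec:', await runIsolated(path))
}
```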
I will try to get this wrapped up soon. 🙂👍
... aha! 🤔
Nope, couldn't get that to work. We'll see if the author responds - it's a very new project, but it does sound really promising. 🙂
I updated https://github.com/Llorx/iso-bench with an API that's easier to implement 🙂 I just commented on the issue there with your code, @mindplay-dk, working with the old API and also with the new API.
Sure, okay. |
What problem does this feature solve?
This is mainly for convenience - but it does solve the problem with the existing `defer` function, which relies on initialization being separate from creation. My proposed `grammar` helper would be statically type-checked, and wouldn't need an error-handler.

Describe the solution
If we look at the example for `defer`:

sigma/docs/docs/content/parsers/defer.md, lines 33 to 64 in 0d96737
Here is that example implemented with the `grammar` helper, along with a test demonstrating how to use the resulting grammar:
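To give a rough idea of the intended shape - assuming sigma-style combinators like `choice`, `sequence` and `string`, which may not be exactly what the defer.md example uses - something along these lines:

```ts
// Illustrative sketch only: the combinators (choice, sequence, string) and the
// run(...).with(...) entry point are assumed here, not copied from the branch.
const math = grammar({
  expr() {
    // `this.group` is already a fully-typed Parser here, courtesy of ThisType.
    return choice(this.group, string('1'))
  },
  group() {
    return sequence(string('('), this.expr, string(')'))
  },
})

// Each member of the resulting grammar is an ordinary parser, so a test could
// do something like:
//
//   expect(run(math.expr).with('((1))').isOk).toBe(true)
```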
And here is my preliminary implementation:
Here is a screenshot demonstrating IDE support:
As you can see, this works with the circular references, which is possible with the magical `ThisType` in TS. It's doing basically the same thing as `defer` for each member, so of course this works with circular references at run-time as well.

I didn't benchmark it against `defer`, and it might need some optimization, and the types could probably use a little work. But what do you think, would you welcome a PR for this feature? 🙂