The Syntax Bikeshedding Dojo, round 10: Symbolic strings #1039
#994 introduced symbolic strings, as described in #948. The current syntax is temporary, using an `s` prefix, which is not very specific (it could mean special, string, symbolic, etc.). Let's try to find a better surface syntax.

The team has proposed to support a different prefix depending on the use case, for example `nix%"{inputs.coreutils}/bin/bash"%` or `tf%"{resource-foo.id}"%`. Two questions follow, mostly orthogonal:

Prefix syntax

We could reserve `m` for multiline strings: every other prefix would then be interpreted as a symbolic string prefix. However, we may want to introduce other special strings in the future, such as raw strings `r%".."%`, or language snippets (like the language option of a markdown code block: the semantics would be the same as a normal string, but editors could highlight the snippet differently), probably written `cpp%"..."%` or `cpp-lang%"..."%`. Such additions could conflict with custom prefixes. The solution is to have a more restricted scheme for symbolic strings, such as requiring them to end with `-s`: `nix-s%".."%`, `tf-s%".."%`, or any scheme which tells them apart from other special strings.

Prefix semantics

The other question is what to do with this prefix.

1. Pass the prefix as data: `nix-s%"foo"%` would be parsed e.g. as ``{prefix = `nix, chunks = ["foo"]}`` instead of `["foo"]` as currently. This solution is cheap and lets library authors take care of checking that the prefix matches, or failing otherwise. We could provide a combinator to check the prefix and otherwise raise blame with a standardized error. The library implementer could also decide to totally ignore the prefix, which is not great.
2. Desugar to a function application: `nix-s%"foo"%` would be desugared to `nix s%"foo"%` (or whatever expression we want involving `nix`). This is how Scala does custom string interpolation (see the Advanced usage section of its documentation). The user then has to make sure that the relevant function is in scope, and this approach might require specialized error reporting to reach a reasonable user experience. This is to be contrasted with the previous approach coupled with a contract, where the said contract is responsible for selecting the parsing function and performing the checks, while the user writing symbolic strings doesn't have to know or care.

Comments
Re-stating what I've said when we've discussed this in meetings: I'm very pro finding some way to allow end users to think about "a Nix string" as a distinct thing from "a Terraform string" or "a Nickel string" when writing configurations. I think the main components for doing that are: …
The two error-handling cases there are likely a lot easier said than done. I think trying to just desugar into function application, while nice and simple from an implementation perspective, probably puts too much burden on the user in terms of making sure the correct "type" of string is in scope. The "tagged" approach, plus a stdlib contract, could look something like the sketch below.
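A minimal sketch of the idea, with heavy assumptions: `string.tag_of` and `string.unseal` are invented names for the proposed seal/unseal primitives, and the exact blame message and shape are just one way it could look.

```nickel
# Hypothetical sketch, not real stdlib code. The parser would seal each
# symbolic string with its prefix; the contract unseals it, and blames
# with a detailed error when the prefix doesn't match.
let SymbolicStringOf = fun prefix => fun label value =>
  if string.tag_of value == prefix then
    string.unseal value
  else
    contract.blame_with "string prefix mismatch" label
in
# a library like Nixel could then simply export:
{ NixString = SymbolicStringOf `nix }
```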
I'm just sketching this out quickly, so maybe there are reasons this wouldn't work that I'm not seeing, but it seems like this would allow us to raise pretty detailed errors in all of the situations mentioned above, thereby making the feature a lot simpler to use, at the relatively minimal cost of a pair of new seal/unseal primitives.
An alternative solution is to take the approach that Rust took with keywords and reserve a bunch of string prefixes in advance that we think we might want to use in the future. We could reserve all single-character prefixes, as well as …
I don't have a full understanding of what you're talking about, but I just want to make sure: will we be able to mix Nix-like interpolations and normal interpolations, like we can do in Nix?

```nix
let simple_var = "something"; in
"${pkgs.some-drv}/bin/${simple_var}"
```

Here, `${pkgs.some-drv}` interpolates a derivation, while `${simple_var}` is a normal string interpolation.
@bew Yes, you'll be able to mix the two. As of right now, when you write such a mixed string, we essentially desugar the whole expression into an array of chunks; a rough illustration follows.
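A hedged illustration, using the current temporary `s` prefix (the exact desugaring may differ in detail, and `pkgs.some-drv` / `simple_var` are assumed to be in scope):

```nickel
# writing the symbolic string
#   s%"%{pkgs.some-drv}/bin/%{simple_var}"%
# essentially desugars to an array of chunks, roughly:
[ pkgs.some-drv, "/bin/", simple_var ]
```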
In the current WIP iteration of Nix interop, we then check that each element of that array is either a derivation or a string, and handle them accordingly.
Thanks 👍
(hot reaction, might be helpful for the bikeshedding)

Two comments: …
My pet peeve with the function semantics is that it puts the burden of making the corresponding function available in scope on the end user. Because we don't have any namespacing or global overloading resolution mechanism, this is very sensitive to variable names and shadowing; see the sketch below.
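A hedged sketch of the hazard (the file name, the `nix` bindings, and `inputs.bash` are all illustrative):

```nickel
# With the function semantics, nix-s%"..."% desugars to `nix s%"..."%`,
# so the meaning depends on whatever `nix` denotes at the usage site.
let nix = import "nixel.ncl" in    # the intended parsing function
let nix = { version = "2.12" } in  # an innocent shadowing binding...
nix-s%"%{inputs.bash}/bin/bash"%   # ...which silently breaks the desugaring
```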
Now, this means that a binding with the right name and the right meaning must be in scope at every site where such a string is written, and any shadowing silently changes the behavior.
Now, if for some reason we need to add a new kind of symbolic string, every configuration writer has to bring yet another function into scope. I personally find the opposite approach easier in Nickel: use a contract which is responsible for applying the parsing function. Then you can design your library in such a way that the user writing the config doesn't have anything to put in scope; see the sketch below.
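A hedged sketch of that design (the file names, `parse_chunks`, and the schema are all illustrative):

```nickel
# lib.ncl: the library ships a contract which applies the parsing function.
{
  # parse_chunks stands in for the library's actual parser.
  NixString = fun label value => parse_chunks value,
  Schema = { build_command | NixString },
}
```

```nickel
# config.ncl: the end user writes symbolic strings with nothing special
# in scope; the contract is applied by the code consuming this record.
{ build_command = nix-s%"%{inputs.bash}/bin/bash"% }
```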
The glue code will import the library and apply the contract to the user's configuration. This is in fact how it is done today in the nickel-nix prototype. In the end, I think the function semantics can be nice, but I'm afraid it won't work very well given the very simple scoping model of Nickel.
One possibility is to have two versions of special strings: a single-line one, and a multiline one. The multiline one, besides the indentation handling and all, is also useful because the escaping mechanism is different, so I tend to think we should have it anyway. But for short strings, we could use the C++ syntax …
This is a good point. Yet it's not necessarily completely true: when using, say, the nickel-nix thing, I assume that the Nickel file(s?) will be evaluated in a context that contains more than the standard library. Or maybe not, but then there would be a standard import at the top of the file. Either way, the extra context may very well include the Nix-string-evaluation function.
A solution to this problem could be to give a very intentional name to string evaluators, a name specific enough that accidental shadowing becomes unlikely.
This is clever. However, it makes the contract do real processing, parsing and transforming the value, rather than mere validation. I'm not saying that it's not the way to go (it's clearly more economical in terms of language design), but I'm not sold.
Yes, that's a general philosophical question about the extent of processing that contracts should ideally be doing. I tend to think that "normalization" is fine (e.g. you have a union contract, but your output value is normalized to always have one canonical form corresponding to one of the branches).
The clash part doesn't apply, because there isn't any constraint on the name of an identifier holding a contract: if your preferred name is taken, you can simply bind the contract to another one. The contract also doesn't have to be in scope for the end user, as the library code consuming the data may apply it. If you split your package into 10 Nickel files, each with symbolic strings, the function approach mandates that each one must import the relevant parsing function. With a contract, they might be pure Nickel without any import, if desired. Additionally, we'll have to have a contract on the field anyway, because the user may fill it with an arbitrary value. I'm not sold either, but I feel the contract approach only leverages known and usual concepts of Nickel (besides the funny string syntax for symbolic strings), while the implicit function requires familiarity with an additional idea: making sure the corresponding parsing function is in scope.
It makes sense indeed. It's in the same spirit as validation functions producing structured data in Haskell as well.
I must be missing something here, because I don't see how the two options differ with respect to this point.
I see where you're coming from now. It is indeed less material, which is valuable.
Typically, the current nickel-nix prototype generates glue code to evaluate the main Nickel file, something like the sketch below.
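A hedged sketch (the file names and the `NixConfig` contract are illustrative, not the prototype's actual code):

```nickel
# generated glue code: import the user's file and apply the library's
# top-level contract, which takes care of the symbolic strings.
let nixel = import "nixel.ncl" in
(import "main.ncl") | nixel.NixConfig
```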
In particular, the user's files don't need to import the parsing function themselves: the glue code applies the top-level contract. It's not entirely true in the case of …, but still, the point is that some "root" contract (whatever that means exactly), eventually applied, is sufficient to propagate the required parsing everywhere it is needed. In a non Nix-related example, you could have something like the sketch below.
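A hedged, non Nix-related sketch (the `tf-lib.ncl` library, its `Schema` contract, and all field names are made up):

```nickel
let tf = import "tf-lib.ncl" in
let vars = { prefix = "prod" } in
{
  storage = {
    bucket_name = tf-s%"%{vars.prefix}-bucket"%,
  },
} | tf.Schema  # the root contract propagates parsing down to bucket_name
```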
Now, with the parsing function approach, you have to make sure the parsing function is in scope at each usage site, that is, in every file that contains a symbolic string.
It would work nicely in a language with a mechanism to implicitly resolve the definition of the parsing function (as Scala's interpolators do), but Nickel has no such mechanism.
I understand your point now, thanks. I imagine, though, that if you use contracts to enforce the nixiness of strings, the prefix on the string literal itself becomes somewhat redundant.
We could imagine a mechanism to evaluate a Nickel file in a context. We already do this somehow: the standard library is injected around every file. We could generalise this mechanism, maybe with command-line arguments to the `nickel` invocation. That being said, it's not clear to me how we can pass this context to the LSP.
One of the proposals was to do this nonetheless, but just pass the prefix as data in the resulting value, something like the sketch below.
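A hedged sketch (the chunk contents and `inputs.bash` are illustrative):

```nickel
# nix-s%"%{inputs.bash}/bin/bash"% would evaluate to plain data carrying
# its prefix, roughly:
{ prefix = `nix, chunks = [inputs.bash, "/bin/bash"] }
```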
The parsing function from Nixel is then responsible for checking the prefix and acting accordingly (probably blaming with a nice error message if the prefix isn't supported). I think what @matthew-healy dislikes about this solution is that the parsing function can just ignore the prefix, while the implicit function approach somehow forces you to choose one or a finite number of prefixes.
Yes, in this case this would work pretty well. But as you mention, the mechanism must be standard and understood by the LSP (which probably requires a notion of "workspace", "module", "project" or whatnot). |
Yes, that's my main problem with just passing a record containing the prefix into the library function, but I think it might be possible to work around that using something like the seal/unseal primitives sketched above.
I think there is a path which is forward-compatible (at least for the consumers of the library; the library authors of e.g. Nixel may have to adapt, but that's less of a problem): start by simply passing the prefix as data, and, if needed, later tighten the checking of the prefix (through a stdlib combinator, or even sealing). This shouldn't make much difference on the user side, while imposing increasing constraints on the libraries.
The latter approach has been approved. As this issue is mostly a bikeshedding issue, and not a tracking one, I'm closing.