Feature request: allow additional predicates in join_by #6970
+1, I think, as I have a use case that seems relevant here. I have grouped data and need to compare the individuals within different instances of groups of the same type. This results in a many-to-many join where groups can be matched exactly, but
So it'd be nice to do more of the filtering in the join and less of it post hoc. I'm not seeing another way to request that dplyr 1.1.3 repeat rows of |
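For concreteness, a minimal sketch of the current workaround described above: an exact many-to-many equality self-join on the group type, followed by post hoc filtering. The table name (`groups`), column names (`type`, `id`, `score`), and tolerance `tol` are all assumptions, not from the original comment:

```r
library(dplyr)

# Match groups of the same type exactly (many-to-many), ...
pairs <- inner_join(
  groups, groups,
  by = join_by(type),
  relationship = "many-to-many",
  suffix = c("_a", "_b")
)

# ... then do the remaining comparison as a post hoc filter,
# which is the step it would be nicer to express inside join_by().
pairs |>
  filter(id_a < id_b, abs(score_a - score_b) < tol)
```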
Hit the same issue: repeatedly joining all rows in |
`join_by()` refuses to handle any predicates it doesn't recognize. This is understandable in a narrow sense, but it interferes with dplyr's ability to work with external databases like duckdb that are far more expressive. It is also inconsistent with most other dplyr functions, which are quite happy to pass such functions along. Consider manipulating the following `lazy_tbl` objects using the explicitly spatial functions that duckdb recognizes. Note that
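The original setup code was lost from this page; a hedged reconstruction of what it might look like, assuming the duckdb spatial extension and hypothetical table names (`countries`, `cities`) with a `geom` column:

```r
library(dplyr)
library(DBI)
library(duckdb)

# Connect to duckdb and load its spatial extension
con <- dbConnect(duckdb::duckdb())
dbExecute(con, "INSTALL spatial; LOAD spatial;")

# lazy_tbl objects backed by duckdb tables (names are assumptions)
countries <- tbl(con, "countries")
cities    <- tbl(con, "cities")

# dplyr::filter passes the st_* function it doesn't know straight
# through to duckdb, which evaluates it as a spatial predicate:
cities |>
  filter(st_within(geom, st_point(2.35, 48.85)))
```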
`dplyr::filter` (and most other such functions) has no problem passing these `st_*` functions on to duckdb, allowing us to do spatial operations. Now a very natural operation is a spatial join, e.g. in SQL / DBI we can do this directly. All I'd like is for dplyr to construct this syntax with its usual wonderful magic, e.g. I want to write:
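The SQL example and the desired dplyr syntax were stripped from this page; a hedged sketch of both, continuing the hypothetical `countries`/`cities` tables with `geom` columns from above (all names are assumptions):

```r
# The raw SQL / DBI version of a spatial join works today:
DBI::dbGetQuery(con, "
  SELECT cities.name, countries.iso
  FROM cities
  JOIN countries
    ON st_contains(countries.geom, cities.geom)
")

# ... and the dplyr syntax one would like join_by() to accept,
# passing the unrecognized predicate through to duckdb:
cities |>
  inner_join(countries, by = join_by(st_contains(y$geom, x$geom)))
```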
Wouldn't that be nice? I think it could do so if only `join_by()` were not so fastidious in asserting that it doesn't understand anything but a hard-coded set of functions. Unfortunately, this obviously errors with the message:
Would it be possible to just relax this assertion, e.g. when the dataset in question is a remote src?