SPARKC-577: Removal of Driver Duplicate Classes #1245
Closed
…ere as well. Also added a few utility methods to help with table/view interactions.
absurdfarce
commented
May 6, 2020
```diff
- val maxIndex = maxCol.componentIndex.get
- val requiredColumns = tableDef.clusteringColumns.takeWhile(_.componentIndex.get <= maxIndex)
+ val maxIndex = tableDef.clusteringColumns.indexOf(maxCol)
+ val requiredColumns = tableDef.clusteringColumns.take(maxIndex + 1)
```
Wanted to highlight this for review. I'm pretty sure the logic I have in there now mirrors what was being done, but I wanted to make sure this was looked at more closely.
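The two versions in the diff above should agree whenever the clustering column list is ordered by component index, which is how the connector builds it. A minimal, self-contained sketch of that equivalence, using a hypothetical stand-in for the connector's `ColumnDef` (the real class carries much more state):

```scala
// Hypothetical minimal stand-in for the connector's ColumnDef, just to
// illustrate the equivalence under review.
case class ColumnDef(name: String, componentIndex: Option[Int])

object ClusteringPrefixDemo extends App {
  // Clustering columns as the connector stores them: ordered by componentIndex.
  val clusteringColumns = Seq(
    ColumnDef("c1", Some(0)),
    ColumnDef("c2", Some(1)),
    ColumnDef("c3", Some(2))
  )
  val maxCol = clusteringColumns(1) // pretend the deepest referenced column is c2

  // Old approach: walk the list while componentIndex <= maxCol's componentIndex.
  val oldRequired =
    clusteringColumns.takeWhile(_.componentIndex.get <= maxCol.componentIndex.get)

  // New approach: take the prefix up to maxCol's position in the ordered list.
  val newRequired =
    clusteringColumns.take(clusteringColumns.indexOf(maxCol) + 1)

  // Equal as long as the list is sorted by componentIndex.
  assert(oldRequired == newRequired)
  println(oldRequired.map(_.name).mkString(",")) // c1,c2
}
```

The `indexOf` form avoids unwrapping `componentIndex` (an `Option`) entirely, which matters once the driver's metadata types no longer expose that field directly.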
* ColumnSelector modified to work with both TableDef and TableDescriptor
  * Need an IT for the TableDef case as it isn't really covered anymore
* DatasetFunctions.createCassandraTable() modified to take table options + per-clustering-column ordering
Will likely wind up closing this in favor of #1250
Description
How did the Spark Cassandra Connector Work or Not Work Before this Patch
The connector was using internal serializable representations for keyspace/table metadata because the corresponding Java driver classes weren't serializable. This changed in driver v4.6.0.
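Serializability matters here because Spark ships task closures, and anything they capture such as table metadata, to executors via Java serialization. A minimal round-trip sketch using `TableMeta`, a hypothetical stand-in for the driver's metadata (Scala case classes are `Serializable` by default):

```scala
import java.io._

// Hypothetical stand-in for driver table metadata; case classes
// extend java.io.Serializable automatically.
case class TableMeta(keyspace: String, table: String)

object SerializationDemo extends App {
  val original = TableMeta("test_ks", "kv")

  // Serialize, as Spark does when shipping a closure to an executor.
  val bytes = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(bytes)
  out.writeObject(original)
  out.close()

  // Deserialize on the "executor" side.
  val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
  val copy = in.readObject().asInstanceOf[TableMeta]

  assert(copy == original) // round-trips cleanly because the class is Serializable
  println(copy) // TableMeta(test_ks,kv)
}
```

Before driver 4.6.0 this round-trip would have thrown `NotSerializableException` for the driver's own metadata classes, which is why the connector maintained duplicate serializable copies.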
General Design of the patch
Replaced internal refs with Java driver types directly wherever possible. Internal classes within the connector were used for several different functions:
In some cases (1) above could be handled with direct replacement, but the connector was also storing multiple layers of metadata within a single object (e.g. some table-level information was stored in ColumnDef). Unfortunately the 4.x driver doesn't allow traversing the metadata tree in this way. To make this information available without too much breakage to the existing API, some of the old classes are preserved in a new role: containers for all metadata types on a "branch" of this tree. Thus TableDef stores keyspace + table metadata, ColumnDef stores keyspace + table + column metadata, etc.
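A sketch of the "branch container" idea, using hypothetical simplified stand-ins for the driver's metadata types (the real interfaces live in `com.datastax.oss.driver.api.core.metadata.schema`, and the real `TableDef`/`ColumnDef` carry much more):

```scala
// Hypothetical, simplified stand-ins for the Java driver's metadata types.
case class KeyspaceMetadata(name: String)
case class TableMetadata(name: String)
case class ColumnMetadata(name: String)

// Each *Def keeps every metadata level on its branch of the tree, so
// callers can still walk "upward" even though the driver's own types
// don't link child metadata back to its parent.
case class TableDef(keyspace: KeyspaceMetadata, table: TableMetadata)
case class ColumnDef(
    keyspace: KeyspaceMetadata,
    table: TableMetadata,
    column: ColumnMetadata) {
  def tableDef: TableDef = TableDef(keyspace, table)
}

object BranchDemo extends App {
  val col = ColumnDef(KeyspaceMetadata("test_ks"), TableMetadata("kv"), ColumnMetadata("key"))
  // A ColumnDef can recover its table-level context without re-querying the driver.
  println(col.tableDef.table.name) // kv
}
```

The cost of this design is some duplication of parent references across sibling `ColumnDef`s; the benefit is that existing API consumers that asked a `ColumnDef` about its table keep working.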
Fixes: SPARKC-577
How Has This Been Tested?
Still a WIP; hasn't been tested meaningfully yet.
Checklist: