aws-glue-alpha: extra_jars parameter in PySparkEtlJob #33225
Comments
Yes, this should simply expose the L1 parameter on the L2 surface. We welcome PRs.
This is not what L2s are for. They are opinionated constructs, and exposing every L1 parameter defeats the purpose of creating interfaces as contracts and enforcing best practices.
Comments on closed issues and PRs are hard for our team to see.
…3238)

### Issue # (if applicable)

Closes aws#33225.

### Reason for this change

PySpark jobs with extra JAR dependencies cannot be defined with the new L2 constructs introduced in [v2.177.0](https://github.com/aws/aws-cdk/releases/tag/v2.177.0).

### Description of changes

Add the `extraJars` parameter in the PySpark job L2 constructs.

### Checklist

- [x] My code adheres to the [CONTRIBUTING GUIDE](https://github.com/aws/aws-cdk/blob/main/CONTRIBUTING.md) and [DESIGN GUIDELINES](https://github.com/aws/aws-cdk/blob/main/docs/DESIGN_GUIDELINES.md)

----

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
Describe the feature

Include the `extra_jars` parameter in the new `PySparkEtlJob` L2 construct introduced in v2.177.0.

Use Case
We need this parameter in order to use the spark-xml package JAR in our PySpark Glue jobs.
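A minimal sketch of what the requested usage could look like, assuming a future `aws-glue-alpha` release where `PySparkEtlJob` accepts `extra_jars` (the subject of this request, so this does not work in v2.177.0; the asset paths and JAR file name are illustrative placeholders):

```python
# Hypothetical sketch: assumes PySparkEtlJob gains an extra_jars
# parameter, as requested in this issue. Not valid in v2.177.0.
from aws_cdk import App, Stack
from aws_cdk import aws_glue_alpha as glue
from aws_cdk import aws_iam as iam

app = App()
stack = Stack(app, "GlueJobStack")

glue.PySparkEtlJob(
    stack, "XmlEtlJob",
    role=iam.Role(
        stack, "JobRole",
        assumed_by=iam.ServicePrincipal("glue.amazonaws.com"),
    ),
    script=glue.Code.from_asset("jobs/etl.py"),          # placeholder path
    glue_version=glue.GlueVersion.V4_0,
    # The requested parameter: extra JARs (e.g. spark-xml) to put on
    # the job's classpath, mapping to Glue's --extra-jars argument.
    extra_jars=[glue.Code.from_asset("jars/spark-xml.jar")],  # placeholder
)

app.synth()
```

Until the L2 construct exposes this, the equivalent behavior requires dropping to the L1 `CfnJob` and setting `--extra-jars` in `default_arguments` manually.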
Proposed Solution
No response
Other Information
No response
Acknowledgements
CDK version used
v2.177.0
Environment details (OS name and version, etc.)
MacOS