
Add ability to pass extra parameters to Hive Client Wrapper connection #45071

Open
tomwit-nx wants to merge 2 commits into main from tomwit-nx:apache-hive-extra-beeline-params
Conversation


@tomwit-nx tomwit-nx commented Dec 19, 2024

Closes: #45049

Add extra fields to the Hive Client Wrapper connection so that it is possible to pass extra parameters which are used to construct the JDBC connection string. The extra fields are SSL Trust Store, SSL Trust Store Password, and Transport Mode.
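
As a rough sketch of the intended behaviour (not the provider's actual code; the extra-field names below are illustrative), the new values could be folded into the standard HiveServer2 JDBC session parameters sslTrustStore, trustStorePassword, and transportMode roughly like this:

```python
# Illustrative sketch only - the helper and the extra-field names
# ("transport_mode", "ssl_trust_store", "ssl_trust_store_password")
# are hypothetical, not taken from the PR.
def build_jdbc_url(host: str, port: int, schema: str, extras: dict) -> str:
    """Assemble a HiveServer2 JDBC URL from connection fields and extras."""
    url = f"jdbc:hive2://{host}:{port}/{schema}"
    if extras.get("transport_mode"):
        url += f";transportMode={extras['transport_mode']}"
    if extras.get("ssl_trust_store"):
        # sslTrustStore only takes effect when SSL is enabled on the URL.
        url += f";ssl=true;sslTrustStore={extras['ssl_trust_store']}"
    if extras.get("ssl_trust_store_password"):
        url += f";trustStorePassword={extras['ssl_trust_store_password']}"
    return url


print(build_jdbc_url("hs2.example.com", 10000, "default",
                     {"transport_mode": "http",
                      "ssl_trust_store": "/etc/hive/truststore.jks"}))
# jdbc:hive2://hs2.example.com:10000/default;transportMode=http;ssl=true;sslTrustStore=/etc/hive/truststore.jks
```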


^ Add meaningful description above
  • Read the Pull Request Guidelines for more information.
  • In case of fundamental code changes, an Airflow Improvement Proposal (AIP) is needed.
  • In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
  • In case of backwards incompatible changes, please leave a note in a newsfragment file named {pr_number}.significant.rst or {issue_number}.significant.rst in newsfragments.


boring-cyborg bot commented Dec 19, 2024

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contributors' Guide (https://github.com/apache/airflow/blob/main/contributing-docs/README.rst).
Here are some useful points:

  • Pay attention to the quality of your code (ruff, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide. Consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker image, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
  • Always keep your Pull Requests rebased, otherwise your build might fail due to changes not related to your commits.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: [email protected]
    Slack: https://s.apache.org/airflow-slack

@tomwit-nx tomwit-nx force-pushed the apache-hive-extra-beeline-params branch 2 times, most recently from 201e71a to d01edef on December 21, 2024 at 16:38
@tomwit-nx tomwit-nx force-pushed the apache-hive-extra-beeline-params branch from 7a85968 to 1e4bf2d on December 22, 2024 at 10:14
Member

@potiuk potiuk left a comment


Actually - this is not secure enough.

The problem is that you can pass many, many bad things via the JDBC URL when the values are not properly escaped. It's enough to pass ; as the value of any of the parameters you pass from extra. And when that happens, a user who has permission to configure the connection via the UI might even perform RCE in a number of cases.

The recommendation here is:

  • only allow fixed values to be passed via extra (a bool, for example) to control how the JDBC URL is constructed
  • only allow "free-form" parameters to be passed via init parameters of the operator/hook - that can only be done by the DAG author, and the DAG author, by definition and by the Airflow security model https://airflow.apache.org/docs/apache-airflow/stable/security/security_model.html, can do more than a connection-editing user
  • sanitize the input (but this one is potentially very difficult)

While we already warn that a connection-editing user could potentially gain more capabilities - so this is not strictly a security issue, but rather part of "security hardening" - we should not add "easy" ways for those users to perform all kinds of attacks.
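
To make the concern concrete, here is a toy sketch (not the hook's real code) of how an unescaped value containing ; silently injects additional session parameters into the connection string; hive.server2.proxy.user is used purely as an example of something an attacker could smuggle in:

```python
# Toy illustration of the ";" injection risk - naive_url is hypothetical,
# not the provider's implementation.
def naive_url(base: str, transport_mode: str) -> str:
    return f"{base};transportMode={transport_mode}"


print(naive_url("jdbc:hive2://hs2:10000/default", "http"))
# jdbc:hive2://hs2:10000/default;transportMode=http

print(naive_url("jdbc:hive2://hs2:10000/default", "http;hive.server2.proxy.user=admin"))
# jdbc:hive2://hs2:10000/default;transportMode=http;hive.server2.proxy.user=admin
```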

@tomwit-nx
Author

Actually - this is not secure enough.

The problem is that you can pass many, many bad things via the JDBC URL when the values are not properly escaped. It's enough to pass ; as the value of any of the parameters you pass from extra. And when that happens, a user who has permission to configure the connection via the UI might even perform RCE in a number of cases.

The recommendation here is:

  • only allow fixed values to be passed via extra (a bool, for example) to control how the JDBC URL is constructed
  • only allow "free-form" parameters to be passed via init parameters of the operator/hook - that can only be done by the DAG author, and the DAG author, by definition and by the Airflow security model https://airflow.apache.org/docs/apache-airflow/stable/security/security_model.html, can do more than a connection-editing user
  • sanitize the input (but this one is potentially very difficult)

While we already warn that a connection-editing user could potentially gain more capabilities - so this is not strictly a security issue, but rather part of "security hardening" - we should not add "easy" ways for those users to perform all kinds of attacks.

I guess this also applies to the already existing Principal and Proxy User fields? I see that they only validate whether a ; character was passed.
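
For reference, the kind of check being referred to amounts to something like the sketch below (illustrative only; the provider's actual validation of these fields may differ):

```python
# Hedged sketch of a ";" rejection check for connection values; the function
# name and error wording are hypothetical, not copied from the provider.
def reject_semicolons(name: str, value: str) -> None:
    if ";" in value:
        raise ValueError(
            f"The {name!r} connection value must not contain ';' because it "
            "would be interpreted as an extra parameter in the Beeline/JDBC "
            "connection string."
        )


reject_semicolons("principal", "hive/[email protected]")  # passes silently
# reject_semicolons("proxy_user", "alice;hive.server2.proxy.user=admin")  # raises ValueError
```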

@potiuk
Member

potiuk commented Dec 27, 2024

I guess this also applies to the already existing Principal and Proxy User fields? I see that they only validate whether a ; character was passed.

I think more validation is needed. Passing an arbitrary parameter as a path to JDBC is dangerous (what happens if, for example, the JDBC driver displays the content of the file when it is wrong and you pass "/etc/passwd"?). This is just an example; it could be even more disastrous - printing secret keys and secret variables stored somewhere on the remote system. I am not sure you can make it "secure" when this parameter is passed via the UI as free-form input.

There are only a few values allowed for transportMode, I guess, so it is safer to enumerate them rather than pass them through directly. When it comes to the password, there is the question of how ; is going to be passed (i.e. what form of escaping should be there)?
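
A minimal sketch of the enumeration idea (illustrative, not the PR's code): HiveServer2 supports the transport modes binary and http, so the extra field could be checked against that fixed set instead of being interpolated free-form:

```python
# Allow-list sketch for transportMode - the helper and constant names are
# hypothetical; "binary" and "http" are the HiveServer2 transport modes.
ALLOWED_TRANSPORT_MODES = {"binary", "http"}


def resolve_transport_mode(value: str) -> str:
    mode = value.strip().lower()
    if mode not in ALLOWED_TRANSPORT_MODES:
        raise ValueError(
            f"Unsupported transportMode {value!r}; expected one of "
            f"{sorted(ALLOWED_TRANSPORT_MODES)}"
        )
    return mode


print(resolve_transport_mode("http"))  # http
# resolve_transport_mode("http;foo=bar") raises ValueError instead of leaking into the URL
```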

@tomwit-nx
Author

I guess this also applies to the already existing Principal and Proxy User fields? I see that they only validate whether a ; character was passed.

I think more validation is needed. Passing an arbitrary parameter as a path to JDBC is dangerous (what happens if, for example, the JDBC driver displays the content of the file when it is wrong and you pass "/etc/passwd"?). This is just an example; it could be even more disastrous - printing secret keys and secret variables stored somewhere on the remote system. I am not sure you can make it "secure" when this parameter is passed via the UI as free-form input.

There are only a few values allowed for transportMode, I guess, so it is safer to enumerate them rather than pass them through directly. When it comes to the password, there is the question of how ; is going to be passed (i.e. what form of escaping should be there)?

Thanks for the insight. I will see what I can do.

Development

Successfully merging this pull request may close these issues.

Add support for extra JDBC parameters for Hive Client Wrapper in apache-hive provider
2 participants