I was wondering if you could help me with using spark-sftp with Databricks. Firstly, I am struggling to import the library in Databricks — I can only see a (very few) examples in the documentation on loading in a dataframe, but nothing on how to import the library into the notebook itself. Secondly, is there a Python API for spark-sftp, or is the functionality only available in Scala? (I develop using PySpark but can get past this by loading in the dataframe using Scala and creating a temp view to access the dataframe with Python.) Thanks!
All you have to do is make sure the jar of this project is added to your Spark environment. I don't use Databricks, but I think you could start here: https://docs.databricks.com/libraries.html
As for the PySpark API, it is exactly the same. You can write:
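A sketch of what that read looks like in PySpark, assuming the spark-sftp jar is already attached to the cluster. The host, credentials, and remote path below are placeholders, not real values; `fileType` controls how the downloaded file is parsed.

```python
# Hypothetical SFTP connection details -- replace with your own.
sftp_options = {
    "host": "sftp.example.com",
    "username": "myuser",
    "password": "mypassword",
    "fileType": "csv",   # parse the fetched file as CSV
}

# With a SparkSession `spark` available and the spark-sftp jar on the
# classpath, the data source is addressed by its fully qualified name:
#
# df = (spark.read
#           .format("com.springml.spark.sftp")
#           .options(**sftp_options)
#           .load("/remote/path/sample.csv"))
#
# Writing back out uses the same format string with `df.write`.
print(sftp_options["fileType"])
```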