Save *.dfsu file with Databricks on Microsoft Azure Storage account #695
Unanswered
RenewAndPlay asked this question in Q&A
Replies: 2 comments
-
No, that is not something that I have encountered. And it is only a problem for writing dfs files, not for any other type of file? Curious to hear if anyone else has experienced anything similar.
-
We have successfully written other file formats, mainly *.txt files. We have not yet had any success writing a *.dfsu file through the mount to the Azure Storage account. One of the reasons for doing this is to handle the very large file sizes and other post-processing of the dfsu files. Just fyi.
-
We are using Databricks to handle a large number of DFSU files. The files can easily be read from Azure Storage. However, after some processing, new dfsu files should be saved back to Azure Storage.
We are using the to_dfs() command to save, where a mount has been set up on top of the Azure Storage.
The command runs successfully, and the newly saved file can be seen from Databricks with an os.listdir() command. However, when entering the Azure Storage account, the files are not visible.
Have you stumbled upon this issue, or is there another kind of dfsu format we could save the files to?
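A pattern that is sometimes suggested for this kind of symptom (file visible via os.listdir() on the driver but absent from the storage account) is to write the file to local scratch disk first and then copy the finished file to the mounted path in a single sequential pass, since FUSE-style mounts may not handle the random-access writes that binary writers perform. This is a minimal sketch of that workaround, not a confirmed fix; `save_then_copy` and `write_fn` are hypothetical helper names, and the `ds.to_dfs(...)` call mentioned in the comment is the MIKE IO writer from the thread:

```python
import shutil
import tempfile
from pathlib import Path


def save_then_copy(write_fn, dest_path: str) -> str:
    """Run write_fn against a local scratch path, then copy the
    finished file to dest_path (e.g. a mounted storage location)
    in one sequential pass."""
    dest = Path(dest_path)
    with tempfile.TemporaryDirectory() as tmp:
        local = Path(tmp) / dest.name
        # Write locally first, e.g. lambda p: ds.to_dfs(p) with MIKE IO.
        write_fn(str(local))
        dest.parent.mkdir(parents=True, exist_ok=True)
        # Plain sequential copy; avoids random-access writes on the mount.
        shutil.copyfile(local, dest)
    return str(dest)
```

On Databricks the destination would be something like a path under the mount (e.g. `/dbfs/mnt/...`, hypothetical here); whether the mount then shows the file in the storage account is exactly what would need verifying.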