I already have a Spark server running on my machine with 1 master and 1 worker node. However, every time I run anything in Scala, it creates its own Spark cluster.
How can I make it use the existing Spark servers that are running? I can do this with pyspark, but not with spylon-kernel.
Failed with Spylon-kernel -
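If I understand spylon-kernel's configuration correctly, the session can be pointed at an existing standalone cluster with the `%%init_spark` cell magic before any Scala code runs, by setting `launcher.master` to the cluster's master URL. This is a sketch under that assumption; the host and port (`spark://localhost:7077`) are placeholders for whatever your running master actually reports:

```
%%init_spark
# Point the kernel's launcher at the already-running standalone master
# instead of letting it spin up a local cluster.
# Replace localhost:7077 with your master's URL (shown in the master web UI).
launcher.master = "spark://localhost:7077"
launcher.conf.spark.app.name = "spylon-existing-cluster"
```

After this cell, subsequent Scala cells should use the pre-created `spark` session attached to that cluster, much like passing `.master("spark://...")` to `SparkSession.builder` in pyspark.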