Issues: databricks/sbt-spark-package
#51: java.lang.NoSuchMethodError: sbt.UpdateConfiguration.copy$default$1()Lscala/Option
opened May 6, 2021 by LantaoJin
#49: This JAR is hosted on Bintray, which is going down permanently in a few days...
opened Apr 26, 2021 by DCameronMauch
#43: Performing `sbt run` with `spIgnoreProvided := true` still issues classpath errors
opened Dec 8, 2017 by gregnwosu
#36: `spDependencies += "databricks/spark-avro:3.2.0"` doesn't seem to be working
opened Oct 27, 2017 by nemo83
#32: Accessing the ml Spark component (without importing all the mllib stuff)
opened Mar 12, 2017 by MrPowers
#30: There is no sbt-spark-package 0.2.4/0.2.5; how can I configure the right version?
opened Dec 25, 2016 by littleJava
#17 (bug): Package from spPublishLocal not usable due to the Scala version appearing in the Ivy module name
opened Jan 15, 2016 by frensjan
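Several of the issue titles above reference sbt-spark-package setting keys (`spIgnoreProvided` in #43, `spDependencies` in #36, the ml/mllib components in #32). As context, a minimal `build.sbt` sketch using those keys; the setting names are taken from the issue titles and the plugin's documented keys, while the project name and version numbers are purely illustrative:

```scala
// build.sbt for a project using the sbt-spark-package plugin (sketch)
spName := "myorg/my-spark-lib"        // Spark Packages name; hypothetical value
sparkVersion := "2.2.0"               // Spark version to build against; illustrative
sparkComponents += "ml"               // request only the ml component (cf. issue #32)
spIgnoreProvided := true              // keep provided Spark deps on the run classpath (cf. issue #43)
spDependencies += "databricks/spark-avro:3.2.0"  // Spark Package dependency (cf. issue #36)
```

This is a build-configuration fragment, not a definitive recipe; whether each setting behaves as intended is exactly what some of the issues above dispute.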