spDependencies += "databricks/spark-avro:3.2.0" doesn't seem to be working #36

Open
nemo83 opened this issue Oct 27, 2017 · 1 comment

nemo83 commented Oct 27, 2017

As per the documentation, I should be able to add Spark Package dependencies via:

spDependencies += "databricks/spark-avro:3.2.0"
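
For context, spDependencies is a key provided by the sbt-spark-package plugin, so the build also needs that plugin wired in. A minimal sketch of the assumed plugin setup follows; the plugin version below is an illustrative assumption, not taken from this issue (the resolver URL matches the "Spark Packages Repo" shown in the log further down):

// project/plugins.sbt (minimal sketch; the plugin version is an assumption)
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")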

But I get the following error:

[info] Loading global plugins from /Users/XXX/.sbt/0.13/plugins
[info] Loading project definition from /Users/XXX/Development/workspace/job-cerebro-spark/project
[info] Set current project to job-cerebro-spark (in build file:/Users/XXX/Development/workspace/job-cerebro-spark/)
[info] Executing in batch mode. For better performance use sbt's shell
[info] Updating {file:/Users/XXX/Development/workspace/job-cerebro-spark/}job-cerebro-spark...
[info] Resolving com.databricks#spark-avro;3.2.0 ...
[warn] 	module not found: com.databricks#spark-avro;3.2.0
[warn] ==== local: tried
[warn]   /Users/XXX/.ivy2/local/com.databricks/spark-avro/3.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/com/databricks/spark-avro/3.2.0/spark-avro-3.2.0.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   /Users/XXX/.sbt/preloaded/com.databricks/spark-avro/3.2.0/ivys/ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:////Users/XXX/.sbt/preloaded/com/databricks/spark-avro/3.2.0/spark-avro-3.2.0.pom
[warn] ==== Spark Packages Repo: tried
[warn]   https://dl.bintray.com/spark-packages/maven/com/databricks/spark-avro/3.2.0/spark-avro-3.2.0.pom
[info] Resolving com.github.scopt#scopt_2.11;3.3.0 ...
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 	::          UNRESOLVED DEPENDENCIES         ::
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 	:: com.databricks#spark-avro;3.2.0: not found
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 
[warn] 	Note: Unresolved dependencies path:
[warn] 		com.databricks:spark-avro:3.2.0 ((sbtsparkpackage.SparkPackagePlugin) SparkPackagePlugin.scala#L309)
[warn] 		  +- com.my_company:job-cerebro-spark_2.11:0.1.0-SNAPSHOT
sbt.ResolveException: unresolved dependency: com.databricks#spark-avro;3.2.0: not found
	at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:313)
	at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:191)
	at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:168)
	at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:156)
	at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:156)
	at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:133)
	at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57)
	at sbt.IvySbt$$anon$4.call(Ivy.scala:65)
	at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
	at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
	at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
	at xsbt.boot.Using$.withResource(Using.scala:10)
	at xsbt.boot.Using$.apply(Using.scala:9)
	at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
	at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
	at xsbt.boot.Locks$.apply0(Locks.scala:31)
	at xsbt.boot.Locks$.apply(Locks.scala:28)
	at sbt.IvySbt.withDefaultLogger(Ivy.scala:65)
	at sbt.IvySbt.withIvy(Ivy.scala:128)
	at sbt.IvySbt.withIvy(Ivy.scala:125)
	at sbt.IvySbt$Module.withModule(Ivy.scala:156)
	at sbt.IvyActions$.updateEither(IvyActions.scala:168)
	at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1541)
	at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1537)
	at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$121.apply(Defaults.scala:1572)
	at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$121.apply(Defaults.scala:1570)
	at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:37)
	at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1575)
	at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1569)
	at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:60)
	at sbt.Classpaths$.cachedUpdate(Defaults.scala:1592)
	at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1519)
	at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1471)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:237)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
[error] (*:update) sbt.ResolveException: unresolved dependency: com.databricks#spark-avro;3.2.0: not found
[error] Total time: 8 s, completed 27-Oct-2017 10:49:26

Any suggestions?

brkyvz (Contributor) commented Dec 6, 2017

@nemo83 I believe you're missing the Scala version suffix. Have you tried:

spDependencies += "databricks/spark-avro:3.2.0-s_2.11"
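
For completeness, here is a minimal build.sbt sketch with the suffixed coordinate in place. The Scala 2.11 binary version matches the job-cerebro-spark_2.11 artifact in the log above; the exact scalaVersion and sparkVersion values are illustrative assumptions, not taken from this issue:

// build.sbt (minimal sketch; version values are illustrative assumptions)
name := "job-cerebro-spark"

scalaVersion := "2.11.8"

// sparkVersion and sparkComponents are keys from sbt-spark-package;
// sparkComponents adds the matching Spark module (e.g. spark-sql) as "provided"
sparkVersion := "2.2.0"
sparkComponents += "sql"

// Spark Packages coordinates carry a Scala binary-version suffix
spDependencies += "databricks/spark-avro:3.2.0-s_2.11"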
