Don't shade dependencies #3
Not dead! If you make any features, I'd happily merge them. What should we do instead of shading? I shaded the dependencies because the jar wouldn't build otherwise.
I'm not sure, I just know libraries should not shade their dependencies, because you will never run the jar itself; it's a library. Maven projects referencing this one should detect this project's dependencies and pull them in as needed. The problem is that right now I can't remove the shaded dependencies of spark-debug-tools from my own jar, because they're "all in one". From my project's point of view, these are not dependencies of spark-debug-tools, they're part of spark-debug-tools. If you don't shade the dependencies, the warning will still be shown if versions don't match, but I will be able to choose between the available versions instead of being forced into the version spark-debug-tools chose. Until this gets fixed, Maven will choose a pseudo-random version from those available in the dependency tree for each dependency... which isn't great!
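For context, here is a minimal sketch of what the commenter is describing: when the library declares (rather than shades) its dependencies, the consuming project can pin whichever version it wants via `dependencyManagement`. The `com.sparkjava:spark-core` coordinates and version below are illustrative.

```xml
<!-- Consumer pom.xml: dependencyManagement lets the consuming project decide
     which version of a transitive dependency wins, instead of relying on
     Maven's nearest-wins mediation. Coordinates and version are illustrative. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.sparkjava</groupId>
      <artifactId>spark-core</artifactId>
      <version>2.9.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With shading, the dependency classes are already baked into the spark-debug-tools jar, so no amount of version pinning or exclusion in the consumer's pom can swap them out.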
As it stands now, one can't use this helpful tool with Spark (which surprisingly just works) on Java 9. I'm hoping the Maven folks figure something out.
It appears that this shading is causing a weird issue when running a Spark app as a plain Java application. I discovered the issue while using the Gradle Docker plugin, which packs the app into a jar and launches it. Steps to reproduce:
With this setup, if you run
The application will start with stack traces like
and won't serve any requests. The application starts working if
With more dependencies in the project it's even weirder. E.g., for these dependencies:
If I run
I just submitted a PR for this issue: a low-tech solution... removed the Maven Shade Plugin and documented the dependencies in the README.
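For reference, a sketch of what the library's pom.xml could look like once the shade plugin is removed: the jar contains only the library's own classes, and dependencies are resolved by Maven on the consumer side. The coordinates, version, and scope below are illustrative; the scopes actually chosen by the PR may differ.

```xml
<!-- Library pom.xml sketch after dropping maven-shade-plugin. With provided
     scope, consumers supply their own Spark version (as documented in the
     README); with the default compile scope, it flows to them transitively. -->
<dependencies>
  <dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.9.4</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```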
Running into the same issue. I'm no expert, but if you want to keep the dependencies shaded, I think it's possible by making sure to relocate them at the same time (with a relocation rule in the shade plugin configuration). This way they could still be packaged with the library but would not override other imports. Is that correct?
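A minimal sketch of the relocation idea, assuming the maven-shade-plugin; the `org.apache.commons` pattern is hypothetical, and the packages spark-debug-tools actually bundles may differ.

```xml
<!-- maven-shade-plugin with a relocation: bundled classes are moved to a
     private package so they cannot clash with the consumer's own copies.
     The relocated package pattern below is hypothetical. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.commons</pattern>
            <shadedPattern>sparkdebugtools.shaded.org.apache.commons</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that relocation only helps for dependencies used purely internally; types that appear in the library's public API (such as Spark's own classes) can't be relocated without breaking interop with the consumer's Spark instance.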
IMO a library should not bundle its dependencies. Additionally, Spark and the other library versions are continually being updated, so bundling them could also cause incompatibility issues with newer versions.
Hmm, that's why its size is so big. It's >5 MB, while spark-core is only 155 KB. This debug page would be useful, but I'm trying to reduce the overall installation size. Now, as an alternative, I fancied using a JSON transformer for exceptions.
I don't think this project should shade all of its dependencies. As expected, it produces tons of warnings when used with SparkJava itself:
This is my second issue on this nice project; I hope it's not dead!