
[8pt] Open JDK 17 re-appeared #1278

Closed
RobHanna-NOAA opened this issue Sep 4, 2024 · 8 comments · Fixed by #1291
Assignees
Labels
dependencies Pull requests that update a dependency file FIM4 High Priority

Comments

@RobHanna-NOAA
Contributor

RobHanna-NOAA commented Sep 4, 2024

On Aug 8, our security team detected that the OpenJDK 17 issue has re-appeared.

The security tool flagged this folder on our servers:
Path              : /var/lib/docker/overlay2/c1e9b64d8e3f562048f29614d8adbdcd42966e1cda566edc74fc0fd9aa0bafef/diff/usr/lib/jvm/java-17-openjdk-amd64/

We no longer need OpenJDK 17 at all.

After some experimenting, we found that when we do a docker build, this folder re-appears, each time with a random ID after the overlay2 folder.

Some experimenting was done on how this gets there.

Let's look to see whether our Dockerfile is building two images and one of those is leaving Java 17 behind.
Maybe something is simply being left behind, and it might just be a matter of uninstalling Java 17 before installing Java 21. It is possible and common to have multiple JDKs on one image or server.
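As a rough sketch of that idea (assuming the image is Debian/Ubuntu based and the JDKs come from apt; the package names below are illustrative, not taken from our actual Dockerfile):

# Hypothetical Dockerfile fragment: purge any leftover OpenJDK 17 packages
# before installing OpenJDK 21, all within the same RUN layer.
RUN apt-get update && \
    (apt-get purge -y openjdk-17-jdk-headless openjdk-17-jre-headless || true) && \
    apt-get install -y --no-install-recommends openjdk-21-jdk-headless && \
    apt-get autoremove -y && \
    rm -rf /var/lib/apt/lists/*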

Let's check out one of our AWS EC2 instances to see if some docker builds can re-create the folder above; if so, we can debug and fix it. Without that, we might have to get creative.

@RobHanna-NOAA RobHanna-NOAA added High Priority dependencies Pull requests that update a dependency file FIM4 labels Sep 4, 2024
@mluck
Contributor

mluck commented Sep 5, 2024

I did a find on the current Docker image and found two additional locations that still had java-17-openjdk-amd64.

root@c1bb74b278b0:/# find / -type d -name java-17-openjdk-amd64
/usr/lib/debug/usr/lib/jvm/java-17-openjdk-amd64
/usr/share/gdb/auto-load/usr/lib/jvm/java-17-openjdk-amd64

@RobHanna-NOAA
Contributor Author

oh, very cool. Do we have any for 18 by chance?

In our Dockerfile, we have code that removes some Java folders, but it looks like it needs more.

Great catch.

@mluck
Contributor

mluck commented Sep 5, 2024

None for 18. dev-rm-java-17 adds code to the Dockerfile to remove the two that were found.
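Roughly, the idea is a cleanup step along these lines (a sketch of the approach, not the exact diff on the branch), using the two paths from the find above:

# Remove the two leftover OpenJDK 17 directories that the find turned up.
RUN rm -rf /usr/lib/debug/usr/lib/jvm/java-17-openjdk-amd64 \
           /usr/share/gdb/auto-load/usr/lib/jvm/java-17-openjdk-amd64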

@RobHanna-NOAA
Contributor Author

Oops... update. There is no problem (I think) with what is inside the Docker container; it's what the host is doing. Hmm. I think we have proven that it is docker build, and not the docker run command, that is putting those files on the host machine. Might be worth checking.
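One way to check that on the host (a sketch; it needs root on the host, and the image tag is a placeholder for whatever we actually build):

# Count java-17 overlay2 diff folders on the host before the build
sudo find /var/lib/docker/overlay2 -type d -name java-17-openjdk-amd64 | wc -l

# Rebuild the image without running any container
docker build -t fim:test .

# Count again; if the number went up, docker build alone is what puts
# those files on the host
sudo find /var/lib/docker/overlay2 -type d -name java-17-openjdk-amd64 | wc -l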

@RobHanna-NOAA
Contributor Author

It occurred to me that a post-build cleanup script would not work, as we would have permission issues on the OWP servers. But... two thoughts.

  1. What if this is simply a caching thing by Docker during the build? Maybe there is a command we can run after docker build ... ; {other command} that just cleans up the Docker build cache (see the commands sketched after this list). docker system prune might be a bit heavy, but docker image prune might work, or maybe there is something specifically for clearing the Docker host cache after builds. Even a server reboot after docker build might give us a clue. Reason: this problem was solved for a while and then reappeared, and maybe that is because OWP stopped rebooting the servers weekly.

  2. If that doesn't work: the real requirement is not that OpenJDK 17 be removed, but that it is flagged for being at an older version. We don't actually need it, so maybe we can solve this by simply upgrading it in our Docker build script.
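For thought 1, these are the kinds of cleanup commands we could try after a build (all standard Docker CLI; which of them, if any, actually clears the leftover overlay2 diffs would need testing):

# Lighter options first: dangling images, then the BuildKit build cache
docker image prune -f
docker builder prune -f

# Heavier: stopped containers, unused networks, dangling images, and build cache
docker system prune -f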

@mluck
Contributor

mluck commented Sep 6, 2024

The java-17-openjdk-amd64 folders are being created by Docker during docker build in order to track diffs (e.g., for docker container diff) regardless of what's in the final image.
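A minimal illustration of why that is (a hypothetical Dockerfile, not ours): every build step gets its own overlay2 diff directory on the host, so files installed in one step stay on disk there even if a later step deletes them from the image's filesystem view.

FROM ubuntu:22.04
# Step 1: installs OpenJDK 17; its files land in this layer's overlay2 diff dir on the host
RUN apt-get update && apt-get install -y openjdk-17-jdk-headless
# Step 2: hides the files from the final image, but step 1's diff dir stays on the host
RUN apt-get purge -y openjdk-17-jdk-headless && apt-get autoremove -y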

@RobHanna-NOAA
Contributor Author

Ya. It feels like there must be a reason the problem disappeared for a while and then re-appeared, which is why I am now wondering whether it is some sort of Docker host cache. Weird. If we can't find something in the docker build to upgrade it, or remove it from the host (maybe it is just in one of the two base images in our Dockerfile?), or some sort of appendage to our docker command, I guess we can always see whether a reboot after a build helps. The reboot idea, if it even works, would be one of my last options, as I don't want to have to keep bugging IT after each docker build, but it is an option (just not a good one, if it even works).

@mluck
Contributor

mluck commented Sep 11, 2024

I sent a message to the gdal-dev mailing list and this is the response that I got:

Several potential solutions:

  1. Regenerate the Docker image from sources:
git clone https://github.com/OSGeo/gdal

cd gdal

./docker/ubuntu-full/build.sh
  2. Same as 1), but before that, edit ./docker/ubuntu-full/Dockerfile to remove all traces of java/jdk from it

  3. Use the existing image, remove the openjdk package, and "flatten" the Docker layers with docker export / docker import (cf https://forums.docker.com/t/how-to-flatten-an-image-with-127-parents/1600/2), so that the layer where it was installed disappears (see the sketch after this list)

  4. Wait a couple of hours while I'm regenerating it to be updated to 17.0.12+7-1ubuntu2~24.04
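A rough sketch of option 3 above (container and image names are placeholders, and note that docker import drops image metadata such as CMD and ENV, which would have to be re-applied):

# Remove the package inside a throwaway container started from the existing image
docker run --name jdk_cleanup <existing-image> \
    sh -c "apt-get purge -y openjdk-17-jre-headless openjdk-17-jdk-headless || true"

# Flatten all layers into a single new image so the layer that installed
# OpenJDK 17 no longer exists anywhere on disk
docker export jdk_cleanup | docker import - fim-flattened:latest
docker rm jdk_cleanup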
