
Keep only the two most recent builds rather than three #506

Closed
petefoth opened this issue Nov 27, 2023 · 28 comments

@petefoth
Contributor

petefoth commented Nov 27, 2023

We are currently using 99% of the available disk space on the download server (554GB used, 6.2GB free, out of 590GB). This caused a couple of problems with the latest build run, where copying of some files from the build server to the download server failed due to lack of disk space (see #505), and had to be completed manually after some unneeded files were removed.

The lack of space is partly due to the fact that we now publish loads of .img files in addition to the main ROM zip files and recovery images - see #405 & #467.

Zipping up these images (see #483) caused the 'Update recovery' function of the Updater to fail (see #487), so it was reverted - see PR #490.

Adding more disk space is problematic for a number of reasons, so I plan to reduce the number of builds we make available for each device from three to two (by setting DELETE_OLD_ZIPS=2 in the docker run command). Unless there are any strong objections, I will do this for the next build run, which starts early Friday morning. If anyone has any concerns about this, please let me know before then.
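
For anyone unfamiliar with the setting, it is just an environment variable passed to the build container. A minimal sketch of the relevant part of the docker run invocation (the volume paths and other options here are placeholders, not our actual configuration):

    # Sketch only: paths and the other options are placeholders, not our real setup
    docker run \
        -e "DELETE_OLD_ZIPS=2" \
        -e "DELETE_OLD_LOGS=2" \
        -v "/home/user/zips:/srv/zips" \
        -v "/home/user/logs:/srv/logs" \
        lineageos4microg/docker-lineage-cicd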

@petefoth
Contributor Author

@bananer I'd be interested to know your thoughts on this if you have time

@petefoth
Contributor Author

petefoth commented Dec 2, 2023

Sat 02 Dec 23

  • After 24 hours and 10 completed builds, now 543G used, 17G free (98%) on /mnt/archive

@st3rox

st3rox commented Dec 7, 2023

Looks like this now only leaves the current build and the last previous major version build
So there's no way to roll back a month

@petefoth
Contributor Author

petefoth commented Dec 7, 2023

Looks like this now only leaves the current build and the last previous major version build. So there's no way to roll back a month

It should leave the two most recently built versions, which would normally be the same lineage version - this month's and the previous month's builds. I have vague memories of wrongly triggering a build of an earlier major version for one device, but I can't remember which.

@st3rox

st3rox commented Dec 7, 2023

Looking at bramble, the only builds available are this month's build and the 19.1 build from 12/16/2022

Same for enchilada

@petefoth
Contributor Author

petefoth commented Dec 7, 2023

OK

It looks like there's a bug in the cleanup code (which happens here) that doesn't do the right thing when a device is moved to the next Android version. I'm not sure whether the fix belongs in that shell code or in the clean-up.py Python code.

It's not new: in previous build runs we will have had 2 x 20.0 builds and 1 x 19.1 build. It will happen a lot in this build run, as a lot of devices have moved from 19.1 to 20.0 :(

I don't have the time (or the skills) to start debugging either. Short term I will do the following:

  1. Warn users in the XDA forum about the possible issue
  2. Work through all the 20.0 devices and delete any 17.1, 18.1 or 19.1 builds for those devices

@petefoth
Contributor Author

petefoth commented Dec 7, 2023

On the build server, for those devices which

  • already have a 20.0 build and
  • have not been built yet this month and
  • have 17.1, 18.1 or 19.1 builds still present

I have deleted the 1?.1 files, so that after this month's build we will have the two most recent 20.0 builds. On completion of the build for each device, the device zips directory on the build server will be rsynced to the download server.

This applies to the following devices:

h990 haydn heart hotdog hotdogb instantnoodle instantnoodlep joan kane kebab kiev lake lancelot lemonade lemonadep lemonades lisa lmi ls997 lynx m5 m5_tab m52xq marlin mars martini mata merlinx messi Mi439 Mi8917 Mi8937 miatoll nairo nash nio nx563j nx606j nx609j nx611j nx619j nx651j nx659j ocean odroidc4 odroidc4_tab onclite oriole panther payton pdx214 pdx215 PL2 polaris pro1 pro1x pstar pyxis racer radxa0 radxa0_tab raven redfin renoir rhode river rosemary rs988 sagit sailfish sake sargo Spacewar starlte star2lte sunfish surya sweet taimen tangorpro tissot TP1803 troika ursa us996 us996d us997 vela violet vs995 wade walleye X00TD xmsirius xz2c Z01R z2_plus zippo
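
For the record, the deletion was done with something along these lines on the build server (a rough sketch only - the zips path is an assumption, and the loop covers the full device list above):

    # Sketch only: the path and the (truncated) device list are illustrative
    cd /srv/zips
    for dev in h990 haydn heart hotdog; do     # ...and the rest of the list above
        # old-version zips (and, similarly, their .img and checksum files)
        rm -f "$dev"/lineage-1?.1-*-microG-"$dev".zip*
    done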

Next, the devices which were built for 19.1 last month should all be built for 20.0 this month (discovery, gauguin, gemini, kirin, liber, mermaid, natrium, parker, pioneer, vayu, voyager). When this month's build completes, we will have the new 20.0 build and last month's 19.1 build, which is correct.

That leaves the following problems to solve:

  1. 20.0 devices already built this month which had 20.0 builds last month and have 17.1, 18.1 or 19.1 builds still present (akari alioth apollon bardockpro beckham berlin berlna beryllium beyond0lte beyond1lte beyond2lte beyondx blueline bonito bramble channel cheeseburger cheryl chiron coral crosshatch crownlte d1 d2s d2x davinci devon dipper dre DRG dubai dumpling enchilada equuleus evert f62 fajita flame FP3 grus gta4xl gta4xlwifi gts4lv gts4lvwifi guacamole guacamoleb h830 h850 h870 h872 h910 h918 h932 )
  2. How to deal with version increases in future build runs

I will think about 1, and maybe ask others for ideas on 2.

@st3rox Thanks for spotting this. It's not a new problem, but it's more urgent with only keeping two builds instead of three

@petefoth
Contributor Author

petefoth commented Dec 7, 2023

More investigation:

  • According to the python code here and

  • the shell code here

    we should be keeping DELETE_OLD_ZIPS (currently 2, previously 3) builds of the current lineage version, plus 1 build of each previous version. It seems that, if old versions are present, we are keeping DELETE_OLD_ZIPS - 1 (currently 1, previously 2) builds of the latest version (a sketch of the intended rule is below).

  • I need to debug - by code inspection :( - the python code and see why this is happening.

  • In the meantime, my actions in the previous post mean that we no longer have any builds of older lineage versions. I don't know how much of a problem that really is, and whether there is any need to keep old versions lying around (certainly LineageOS don't do it)
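
For reference, here's a minimal sketch of the retention rule as it's meant to work (the directory layout, variable names and globs are assumptions for illustration - this is not the actual clean-up.py logic):

    # Sketch of the intended behaviour only, not the real clean-up code
    KEEP=${DELETE_OLD_ZIPS:-2}
    cd "/srv/zips/$DEVICE" || exit 1
    # newest lineage version present for this device (e.g. 20.0)
    latest=$(ls lineage-*-microG-*.zip | sed 's/^lineage-\([0-9.]*\)-.*/\1/' | sort -V | tail -n 1)
    # keep the $KEEP newest zips of the latest version, delete the rest
    ls lineage-"$latest"-*-microG-*.zip | sort | head -n -"$KEEP" | xargs -r rm -f
    # keep only the single newest zip of every older version
    for ver in $(ls lineage-*-microG-*.zip | sed 's/^lineage-\([0-9.]*\)-.*/\1/' | sort -Vu); do
        [ "$ver" = "$latest" ] && continue
        ls lineage-"$ver"-*-microG-*.zip | sort | head -n -1 | xargs -r rm -f
    done

The behaviour reported above looks like an off-by-one in exactly this kind of logic whenever more than one version is present.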

Time to stop now before I do any more damage :)

@petefoth
Contributor Author

petefoth commented Dec 7, 2023

Damn! Removing the old builds didn't fix the problem for h990: now we only have today's build, lineage-20.0-20231207-microG-h990. Last month's 20.0 build is gone!

@st3rox

st3rox commented Dec 7, 2023

  • In the meantime, my actions in the previous post mean that we no longer have any builds of older lineage versions. I don't know how much of a problem that really is, and whether there is any need to keep old versions lying around (certainly LineageOS don't do it)

I didn't realize LineageOS doesn't keep old versions

I see no problem with getting rid of previous versions to keep storage usage down

For future version increases, my suggestion would be to keep the last previous-version build for some time, until the current version is daily-driver quality

@ctag

ctag commented Dec 11, 2023

I'd like to echo the other offers to help buy more space. Or offer space on my home storage server.

I just got bit by an update that breaks calling, and there's no clear way to access the previous update file.

@st3rox

st3rox commented Dec 11, 2023

I'd like to echo the other offers to help buy more space. Or offer space on my home storage server.

I just got bit by an update that breaks calling, and there's no clear way to access the previous update file.

If your device has A/B partitions you can switch to the other slot to get back to your previous install
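
Roughly, from the bootloader (assuming the device is in fastboot mode; the exact flag spelling varies a little between fastboot versions):

    # check which slot is currently active (a or b)
    fastboot getvar current-slot
    # make the other slot active, then reboot into it
    fastboot set_active b        # older fastboot versions use: fastboot --set-active=b
    fastboot reboot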

@ctag

ctag commented Dec 12, 2023

Thank you for the troubleshooting advice, st3rox. When I change the active slot the system fails to boot with a warning about corruption. If there's a user step to enable a/b booting I must have not set it up.

For the topic of having older builds available, would a solution like the link below be permissible to this project's hosting provider? I only downloaded one phone's listing as an example, but I could keep the last handful of builds available for all the phones.

If the main repo kept the old sha files too, then users could confirm what they download from me is correct. Those sha files are small, but I understand that keeping more of them is still the opposite of freeing up space.

https://mirror.berocs.com/download.lineage.microg.org/guacamole/

@petefoth
Contributor Author

I'd like to echo the other offers to help buy more space. Or offer space on my home storage server.

For the topic of having older builds available, would a solution like the link below be permissible to this project's hosting provider?

Thanks for these offers, but I don't think that's the way to go. We definitely have enough storage to keep the two most recent builds, and I think that will be enough. The fact that, for some devices, we now only have one build is down to a bug, either in the code that keeps a specified number of builds - previously three, changed to two for this build run - or in my understanding of that code. I hope to resolve that before the next build run. From then on, two builds should be available for each device, and I don't think we need any more than that.

@petefoth
Contributor Author

The right thing is happening with the 18.1 builds: before this build run, two builds - the October & November builds - were available for the 18.1 devices; after this month's build we still have two builds - the November & December builds.

@vanMacG

vanMacG commented Dec 15, 2023

The right thing is happening with the 18.1 builds: before this build run, two builds - the October & November builds - were available for the 18.1 devices; after this month's build we still have two builds - the November & December builds.

Not for every device. klte, for example, has only December builds now...

@petefoth
Contributor Author

Not for every device. klte, for example, has only December builds now...

Good spot!

Looking at the logs, I think we had a 17.1 build for that device. So it looks like the 'bug' is in the handling of devices where a build of a previous Android / LOS version was present. Probably not helped by my removing all the 17.1 builds I could find - see my earlier comments.

@vanMacG

vanMacG commented Dec 17, 2023

I don't have proof for this (screenshots), but as far as I remember, there was no 17.1 build before. There was the current monthly 18.1 build including recovery and boot image and corresponding checksums. Additionally there was the previous monthly 18.1 image and corresponding checksum (no recovery and boot).
For me, this is fine. I always keep the previous image simply on the smartphone... So no blame ❤️

@petefoth
Contributor Author

petefoth commented Dec 20, 2023

At some point before the January build run, I will check how many / which devices now have two builds present and which have only one.

I don't see the point of keeping builds for old Android versions once a device has been successfully 'promoted' to a higher Android version (except for devices which were only promoted in the most recent build run).

Ideally we should have two builds for each device available. These will usually be the latest Android version. For recently promoted devices, we will have the most recent build which is the new version, and the previous build of the previous version. So users of those devices have a month to check that the new Android version is working OK before the old Android version disappears.

@petefoth petefoth mentioned this issue Jan 2, 2024
@petefoth
Contributor Author

petefoth commented Jan 20, 2024

See Tidy up old 17.1, 18.1 & 19.1 builds on both servers #525

After the January build run we have 263 zip files for 230 separate devices, so only 30 of our build targets have two builds as they should.
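
A quick way to flag the devices that are down to a single build across the whole tree (a sketch only; it assumes the device directories sit directly under the mount point shown below):

    # list devices with fewer than two zips present
    cd /mnt/archive
    for d in */ ; do
        n=$(ls "$d"lineage-*-microG-*.zip 2>/dev/null | wc -l)
        [ "$n" -lt 2 ] && echo "${d%/}: $n build(s)"
    done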

  • free space on download server
df -h
Filesystem      Size  Used Avail Use% Mounted on
...
/dev/vdb1       590G  249G  311G  45% /mnt/archive
18.1 Builds

 for name in a5y17lte a7y17lte addison ahannah albus bacon castor castor_windy cedric debx ether flox foster foster_tab FP2 ginkgo griffin hannah hlte hltechn hltekor hltetmo jactivelte jflteatt jfltespr jfltevzw jfltexx jfvelte klte klteactivexx klteaio kltechn kltechnduo klteduos kltedv kltekdi kltekor montana oneplus3 porg porg_tab rhannah s3ve3gds s3ve3gjv s3ve3gxx shamu sirius victara x2 z3 z3c zl1 ; do ls $name/*.zip ; done
  a5y17lte/lineage-18.1-20231214-microG-a5y17lte.zip
  a5y17lte/lineage-18.1-20240117-microG-a5y17lte.zip
  a7y17lte/lineage-18.1-20231214-microG-a7y17lte.zip
  a7y17lte/lineage-18.1-20240117-microG-a7y17lte.zip
  addison/lineage-18.1-20231214-microG-addison.zip  addison/lineage-18.1-20240117-microG-addison.zip
  ahannah/lineage-18.1-20231214-microG-ahannah.zip  ahannah/lineage-18.1-20240117-microG-ahannah.zip
  albus/lineage-18.1-20231214-microG-albus.zip  albus/lineage-18.1-20240117-microG-albus.zip
  bacon/lineage-18.1-20231214-microG-bacon.zip  bacon/lineage-18.1-20240117-microG-bacon.zip
  castor/lineage-18.1-20231214-microG-castor.zip	castor/lineage-18.1-20240117-microG-castor.zip
  castor_windy/lineage-18.1-20231214-microG-castor_windy.zip
  castor_windy/lineage-18.1-20240117-microG-castor_windy.zip
  cedric/lineage-18.1-20231214-microG-cedric.zip	cedric/lineage-18.1-20240117-microG-cedric.zip
  debx/lineage-18.1-20231214-microG-debx.zip  debx/lineage-18.1-20240117-microG-debx.zip
  ether/lineage-18.1-20240118-microG-ether.zip
  flox/lineage-18.1-20240118-microG-flox.zip
  foster/lineage-18.1-20240118-microG-foster.zip
  foster_tab/lineage-18.1-20240118-microG-foster_tab.zip
  FP2/lineage-18.1-20240118-microG-FP2.zip
  ginkgo/lineage-18.1-20240118-microG-ginkgo.zip
  griffin/lineage-18.1-20240120-microG-griffin.zip
  hannah/lineage-18.1-20240118-microG-hannah.zip
  hlte/lineage-18.1-20240118-microG-hlte.zip
  hltechn/lineage-18.1-20240118-microG-hltechn.zip
  hltekor/lineage-18.1-20240118-microG-hltekor.zip
  hltetmo/lineage-18.1-20240118-microG-hltetmo.zip
  jactivelte/lineage-18.1-20240118-microG-jactivelte.zip
  jflteatt/lineage-18.1-20240118-microG-jflteatt.zip
  jfltespr/lineage-18.1-20240118-microG-jfltespr.zip
  jfltevzw/lineage-18.1-20240118-microG-jfltevzw.zip
  jfltexx/lineage-18.1-20240120-microG-jfltexx.zip
  jfvelte/lineage-18.1-20240118-microG-jfvelte.zip
  klte/lineage-18.1-20240118-microG-klte.zip
  klteactivexx/lineage-18.1-20240118-microG-klteactivexx.zip
  klteaio/lineage-18.1-20240118-microG-klteaio.zip
  kltechn/lineage-18.1-20240118-microG-kltechn.zip
  kltechnduo/lineage-18.1-20240119-microG-kltechnduo.zip
  klteduos/lineage-18.1-20240119-microG-klteduos.zip
  kltedv/lineage-18.1-20240119-microG-kltedv.zip
  kltekdi/lineage-18.1-20240119-microG-kltekdi.zip
  kltekor/lineage-18.1-20240119-microG-kltekor.zip
  montana/lineage-18.1-20240119-microG-montana.zip
  oneplus3/lineage-18.1-20240119-microG-oneplus3.zip
  porg/lineage-18.1-20240119-microG-porg.zip
  porg_tab/lineage-18.1-20240119-microG-porg_tab.zip
  rhannah/lineage-18.1-20240119-microG-rhannah.zip
  s3ve3gds/lineage-18.1-20240119-microG-s3ve3gds.zip
  s3ve3gjv/lineage-18.1-20240120-microG-s3ve3gjv.zip
  s3ve3gxx/lineage-18.1-20240119-microG-s3ve3gxx.zip
  shamu/lineage-18.1-20240119-microG-shamu.zip
  sirius/lineage-18.1-20240119-microG-sirius.zip
  victara/lineage-18.1-20240119-microG-victara.zip
  x2/lineage-18.1-20240119-microG-x2.zip
  z3/lineage-18.1-20240119-microG-z3.zip
  z3c/lineage-18.1-20240119-microG-z3c.zip
  zl1/lineage-18.1-20240119-microG-zl1.zip

Devices a5y17lte - debx have two builds; ether - zl1 have only one. I've looked at the build logs - nothing obvious (and I've tidied up the 18.1 device logs 😄).

It may be a consequence of the cleanup, though I don't think so.

I'm going to do another 18.1 build run with DELETE_OLD_ZIPS (and LOGS) set to 3 instead of 2, and see what happens.

If there's no change then ????

@vanMacG

vanMacG commented Jan 22, 2024

I'm going to do another 18.1 build run with DELETE_OLD_ZIPS (and LOGS) set to 3 instead of 2, and see what happens.

With the re-run it is again like I wrote in a comment above:

I don't have proof for this (screenshots), but as far as I remember, there was no 17.1 build before. There was the current monthly 18.1 build including recovery and boot image and corresponding checksums. Additionally there was the previous monthly 18.1 image and corresponding checksum (no recovery and boot).

@petefoth
Contributor Author

petefoth commented Jan 22, 2024

I'm going to do another 18.1 build run with DELETE_OLD_ZIPS (and LOGS) set to 3 instead of 2, and see what happens.

With the re-run it is again like I wrote in a comment above:

I don't have proof for this (screenshots), but as far as I remember, there was no 17.1 build before. There was the current monthly 18.1 build including recovery and boot image and corresponding checksums. Additionally there was the previous monthly 18.1 image and corresponding checksum (no recovery and boot).

Thanks. This is caused by our post-build.sh, which rsyncs the zip directories between the build and download servers: as currently coded, it removes from both servers any files that are not present on the other. As I deleted the .img and .sha256sum files on one of the servers in my initial tidy-up, they are now missing from both :)

I'll go and learn a bit more about how rsync works 😄
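
The difference comes down to whether rsync is allowed to delete on the receiving side (the paths and exclude handling here are illustrative, not our actual post-build.sh):

    # --delete removes files from the destination that no longer exist on the
    # source; --delete-excluded also removes files matching --exclude patterns.
    # Run in both directions, this propagates a deletion on either server to both.
    rsync -av --delete --delete-excluded "$ZIPS_DIR/$DEVICE/" download-server:"$ZIPS_DIR/$DEVICE/"

    # Without those flags rsync only adds and updates files on the destination;
    # it never removes anything that is already there.
    rsync -av "$ZIPS_DIR/$DEVICE/" download-server:"$ZIPS_DIR/$DEVICE/"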

@petefoth
Contributor Author

petefoth commented Jan 22, 2024

OK, I've made some changes to post-build.sh (removing the --delete and --delete-excluded args from the call to rsync) which seem to have the desired effect. A test build on klteactivexx resulted in two zips and the full set of .img and .sha256sum files. (In the test build DELETE_OLD_ZIPS is set to 3, so we have two complete builds (20240121 & 20240122) and the partial 20240118 build with no .img or .sha256sum files.)

For the February build run I will keep this change, and keep DELETE_OLD_ZIPS at 2. This should leave us with two complete builds per device. We can see how much space that takes up on the download server, and make a decision whether to go back to 3 builds in the March build run.

@petefoth
Contributor Author

petefoth commented Feb 2, 2024

All the 20.0 devices built so far in the February 24 build run now have 2 builds present :)

@petefoth
Contributor Author

All the 20.0 devices built so far in the February 24 build run now have 2 builds present :)

Except for devices which were successfully built for the first time (radxa0, odroidc4, m5)

To ensure we keep two builds available for these devices (and any devices which get built for the first time in future build runs), we should change our rsync logic so the --delete and --delete-excluded args are only set if there is only a single build present in the zips directory.
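
Something along these lines, perhaps (a sketch only - the variable names and the exact test are assumptions, not a patch to post-build.sh):

    # only allow rsync to prune the destination when this device has a single
    # (i.e. first) build present locally
    zip_count=$(ls "$ZIPS_DIR/$DEVICE"/lineage-*-microG-*.zip 2>/dev/null | wc -l)
    delete_args=""
    if [ "$zip_count" -le 1 ]; then
        delete_args="--delete --delete-excluded"
    fi
    # intentionally unquoted so the two flags split into separate arguments
    rsync -av $delete_args "$ZIPS_DIR/$DEVICE/" download-server:"$ZIPS_DIR/$DEVICE/"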

@ArchangeGabriel

There is another exception: pdx203 and pdx206, which had been disabled since July, were enabled again during the last month and rebuilt for the first time in February. They now have three images: https://download.lineage.microg.org/pdx203/ and https://download.lineage.microg.org/pdx206/.

@petefoth
Contributor Author

OK, the idea of keeping only two builds, and deleting builds for devices which are no longer supported, is too simplistic. Please see the new issue 'Build retention strategy #573' for a strategy which is more flexible, and more friendly to devices which are no longer officially supported by LOS.

I'd be very interested in your comments there. (I'll close this issue now)

@petefoth
Contributor Author

petefoth commented Feb 20, 2024

The free disk space numbers after the February build run, with two builds stored per device:

Filesystem      Size  Used Avail Use% Mounted on
/dev/vdb1       590G  491G   70G  88% /mnt/archive

The figures include the new archive folder containing the final builds for devices which are no longer supported by LOS (see Build retention strategy #573), which takes up a total of 24G. Even if we removed the archive, we would still not have space to keep three builds per device rather than two. This is mainly due to the fact that we now keep several .img files for each device build.
