
[HACKTOBERFEST2021] [PYTHON] Memory Leak #32

Open · 1 of 3 tasks
kboshold opened this issue Oct 1, 2021 · 7 comments

@kboshold
Contributor

kboshold commented Oct 1, 2021

Type of report:

  • [x] BUG
  • [ ] FEATURE
  • [ ] OTHER

Actual behavior:
If you use the service for a longer time, RAM consumption increases significantly. This should not be the case.

I assume that files/streams are not being closed properly somewhere and therefore remain in RAM indefinitely.

[screenshot of RAM consumption attached]
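A minimal way to check this assumption (not part of the original report; just a sketch using the standard-library tracemalloc module) would be to compare two allocation snapshots while the service is handling requests:

```python
import tracemalloc

tracemalloc.start(25)                      # record up to 25 stack frames per allocation
baseline = tracemalloc.take_snapshot()

# ... let the service handle a batch of print requests here ...

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.compare_to(baseline, "lineno")[:10]:
    print(stat)                            # top 10 lines whose allocations grew since the baseline
```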

@Jimmy-653
Contributor

After a first analysis, the results pointed to 3 test methods that were not cleaning up properly:
test_post_print_foreign: 34 False <= 0
test_post_print_png: 87 False <= 0
test_post_print_png_css_asset: False 41 <= 0

A few more tests also increased RAM on each run; a more detailed log will be added after a longer test run.

Interestingly, two tests consistently reduced their RAM usage:
test_post_print_mode_as_argument
test_post_print_html_without_css_assets
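The thread does not show how these per-test numbers were collected; a minimal sketch of one possible approach (assuming psutil is installed; the test name in the comment is just a placeholder) is:

```python
import gc
import psutil

def rss_delta(test_func):
    """Run one test function and return the change in resident set size, in bytes."""
    process = psutil.Process()
    gc.collect()
    before = process.memory_info().rss
    test_func()
    gc.collect()
    after = process.memory_info().rss
    return after - before

# e.g. rss_delta(test_post_print_png)  -- a positive delta on every run suggests a leak
```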

@kboshold
Contributor Author

kboshold commented Oct 2, 2021

@J-Jimmy Based on this analysis, I assume the problem lies in the "Templates". These may inherit from another template and may not clean up correctly after printing.
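Purely as an illustration of that suspicion (the class and method names below are invented, not taken from the project): if a derived template accumulates rendered data and cleanup is not guaranteed after printing, memory grows with every request unless cleanup is forced, e.g. with try/finally:

```python
class BaseTemplate:
    def __init__(self):
        self._buffers = []              # intermediate rendering data

    def cleanup(self):
        self._buffers.clear()


class InheritedTemplate(BaseTemplate):
    def render(self, payload):
        self._buffers.append(payload)   # kept alive until cleanup() runs
        return b"...rendered document..."


def print_document(template, payload):
    try:
        return template.render(payload)
    finally:
        template.cleanup()              # release buffers even if rendering fails
```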

@Jimmy-653
Contributor

Out of 2068 tests, these are the results:
ram.1.log

  • test_may_update_result: 1967/2068 (95.11%)
  • test_get_health_status: 1347/2068 (65.13%)
  • test_get_health_timestamp: 2050/2068 (99.12%)
  • test_post_print_png: 1/2068 (0.04%)
  • test_post_print_png_css_asset: 1/2068 (0.04%)
  • test_post_print_pdf: 0/2068 (0%)
  • test_post_print_no_mode: 0/2068 (0%)
  • test_post_print_mode_as_argument: 2066/2068 (99.90%)
  • test_post_print_foreign: 0/2068 (0%)
  • test_post_print_foreign_url_deny: 0/2068 (0%)
  • test_post_print_foreign_url_allow: 1/2068 (0.04%)
  • test_post_print_access_deny: 2066/2068 (99.90%)
  • test_post_print_html_missing_params: 2066/2068 (99.90%)
  • test_post_print_html_without_css_assets: 2068/2068 (100%)

The percentage is the share of test runs with no RAM increase.
Out of those tests, the relevant ones are:

  • test_post_print_png: 1/2068 (0.04%)
  • test_post_print_png_css_asset: 1/2068 (0.04%)
  • test_post_print_pdf: 0/2068 (0%)
  • test_post_print_no_mode: 0/2068 (0%)
  • test_post_print_foreign: 0/2068 (0%)
  • test_post_print_foreign_url_deny: 0/2068 (0%)
  • test_post_print_foreign_url_allow: 1/2068 (0.04%)

To verify my results, I will run each test 100 and 1000 times and measure the RAM before and after again. This test won't be checked in; it is just to verify that a single run of those tests is enough to detect leaks.
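One throwaway way to wire such a check into the suite (not the project's actual code; a sketch assuming pytest and psutil) is an autouse fixture that logs the RSS delta of every test:

```python
# conftest.py -- throwaway diagnostic, not meant to be checked in
import gc
import psutil
import pytest

@pytest.fixture(autouse=True)
def log_rss_delta(request):
    process = psutil.Process()
    gc.collect()
    before = process.memory_info().rss
    yield                                    # the test runs here
    gc.collect()
    after = process.memory_info().rss
    print(f"{request.node.name}: RSS delta {after - before} bytes")
```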

Further tests narrowed it down to the PNG conversion:

  • test_post_print_png
  • test_post_print_png_css_asset

However, if the service is started normally and a PDF is requested, the RAM still increases constantly.
I'm mostly guessing right now :D
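If the PNG path goes through Pillow (an assumption; the thread does not name the imaging library), a common cause of exactly this pattern is image objects that are opened but never closed. Using them as context managers releases the underlying buffers deterministically; a sketch with an invented helper:

```python
import io
from PIL import Image

def reencode_as_png(raw_bytes):
    """Hypothetical helper: re-encode image bytes as PNG without leaking buffers."""
    out = io.BytesIO()
    with Image.open(io.BytesIO(raw_bytes)) as img:   # closed even if save() raises
        img.save(out, format="PNG")
    return out.getvalue()
```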

@Jimmy-653
Contributor

Actually, I got it worked around/fixed.
Please add permission for me to create branches, too lazy to fork :P

We will have to discuss this solution tomorrow, but I think it's practicable ^^

@kboshold
Contributor Author

kboshold commented Oct 6, 2021

@J-Jimmy You should now have write access

@Jimmy-653
Contributor

I did the boring fix too....

Jimmy-653 reopened this Oct 8, 2021
@Jimmy-653
Contributor

So it seems that after running the container for 2 days, the RAM still increased by 1.6 GiB.
Guess I didn't fix it yet. Maybe someone else can take a look at this :)

I will change my PR to a docker-container RAM limit, which will force Python to free memory and keep it low.
