Reduce memory leak by using model() instead of model.predict() #125

Open · wants to merge 3 commits into master

Conversation

colindean (Collaborator)

In a long-running process invoking `nsfw_detector.predict.classify_nd()` often, we observed memory leakage consistent with keras-team/keras#13118, which suggests it can be avoided by predicting via `model()` instead of `model.predict()`, combined with some memory management after each invocation:

```python
import gc
import tensorflow.keras as keras
from nsfw_detector import predict
# …
predict.classify_nd(model, image)
gc.collect()
keras.backend.clear_session()
```

An alternative suggestion was to use `model.predict_on_batch(x)` instead of `model()`, but we didn't try that because using `model()` solved our memory leak problem adequately.

If this fix limits a use case, an alternative implementation may be to let the caller specify which of the three methods to use, or to pass a callable that will receive the `nd_images` parameter.
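
For context, here is a minimal sketch of what the change amounts to inside `predict.classify_nd()`, assuming the current implementation obtains predictions via `model.predict(nd_images)`; the category labels and surrounding structure are illustrative rather than the project's exact code:

```python
import numpy as np

def classify_nd(model, nd_images):
    # Call the model directly instead of model.predict(). model(...) returns a
    # tf.Tensor rather than a NumPy array, so convert explicitly to keep the
    # rest of the function unchanged.
    model_preds = np.asarray(model(nd_images))

    # Illustrative labels only; the real category list lives in the project.
    categories = ['drawings', 'hentai', 'neutral', 'porn', 'sexy']

    probs = []
    for single_preds in model_preds:
        single_probs = {categories[i]: float(p) for i, p in enumerate(single_preds)}
        probs.append(single_probs)
    return probs
```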

colindean and others added 2 commits February 23, 2023 10:48
@colindean (Collaborator, Author)

I'm going to hold off on merging this for a bit. After putting this into production ourselves yesterday afternoon, we've seen elevated disk space usage. I want to be sure this change is not the cause; there are a few other changes that went into production that could be affecting it, though.

@colindean (Collaborator, Author) commented Mar 26, 2023

We've not yet solved the disk usage problem, but we did improve upon this function by enabling more flexibility in the `model()` or `model.predict()` call. I'll add to this PR when I'm able.

Edit: Whoops, I already put up #127 which has some of that!
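
For illustration only, a rough sketch of the kind of flexibility described above; the `predict_fn` parameter name is an assumption here and not necessarily what #127 implements:

```python
import numpy as np

def classify_nd(model, nd_images, predict_fn=None):
    # Let the caller choose how predictions are obtained: model(),
    # model.predict(), or model.predict_on_batch(). Default to calling the
    # model directly, per this PR.
    if predict_fn is None:
        predict_fn = model
    model_preds = np.asarray(predict_fn(nd_images))
    # ... build the per-image probability dicts as before ...
    return model_preds
```

A caller wanting a different method could then pass it explicitly, e.g. `classify_nd(model, images, predict_fn=model.predict_on_batch)`, without changing the library.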

@frankjoshua

We also had a memory leak. This fixed it completely. I haven't seen any issues with disk usage so far but I'll keep an eye out.

@Artucuno commented Aug 2, 2023

Hey, is this pull ready for production use?

@colindean (Collaborator, Author)

> Hey, is this pull ready for production use?

In theory, yes. I've not merged it because I wanted more feedback before doing so, since my case might have been isolated. My team stopped running the service that uses it a few weeks ago, but we hadn't encountered any additional problems in the meantime.

If it passes tests, I'll consider merging it. @GantMan, thoughts?

@frankjoshua

I have been using this pull in production. It's run over 10 million times with no issues. Before this, we were rebooting the servers every couple of days due to out-of-memory errors.
