💡 Feature Proposal: Add a way to clean up after a lazy Source.get_records() generator is abandoned
Today, the `Source.get_records()` method returns a `LazyDataset` that can be iterated over to get records. Given a source declared like this:
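A minimal sketch, assuming the `source-faker` connector with placeholder config values:

```python
import airbyte as ab

# Declare a source (connector name and config here are illustrative).
source = ab.get_source(
    "source-faker",
    config={"count": 10_000},
    install_if_missing=True,
)
source.check()
source.select_all_streams()
```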
You can iterate over records lazily like this:
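Again a sketch, assuming a `users` stream and an arbitrary cutoff of 10 records:

```python
# Iterate lazily and stop after the first 10 records.
# The generator is abandoned here, but the underlying
# connector process keeps running.
for i, record in enumerate(source.get_records("users")):
    print(record)
    if i >= 9:
        break
```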
This approach uses the dataset as an iterator and stops after the desired number of records has been read. However, the connector process itself is not shut down when we stop iterating over it.
Improvement Proposal
Ideally, we'd add a callback on the lazy dataset to close the connection, and/or support use as a context manager that automatically cleans up the process on exit.
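A minimal sketch of the context-manager idea, assuming a hypothetical `on_close` callback; this is not PyAirbyte's actual `LazyDataset` API:

```python
from contextlib import AbstractContextManager
from typing import Callable, Iterator


class LazyDataset(AbstractContextManager):
    """Hypothetical sketch: a lazy dataset that owns a cleanup callback."""

    def __init__(
        self,
        records: Iterator[dict],
        on_close: Callable[[], None],
    ) -> None:
        self._records = records
        self._on_close = on_close  # e.g. terminates the connector process

    def __iter__(self) -> Iterator[dict]:
        return self._records

    def __exit__(self, *exc_info) -> None:
        # Runs when the `with` block exits, even after an early break.
        self._on_close()


# Usage sketch: cleanup happens even if we abandon the iterator early.
# with source.get_records("users") as dataset:
#     for i, record in enumerate(dataset):
#         if i >= 9:
#             break
```

With this shape, abandoning the iterator early still triggers the cleanup callback when the `with` block exits.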
In practice, this has not caused any problems for our use cases, but it would be good to improve the handling here.