Dynamically modifying the `.electron_object.executor` from an imported electron does not always work as expected #1889
Linked pull request: "…g job attributes (e.g. executor)" (#1359), a continuation of #1322.

Challenges:
- Mainly Dask, which does not play nicely when `functools.partial()` is applied to a `Delayed` object. See dask/dask#10707. There is now a workaround.
- There is also a minor issue with dynamic switching of Covalent executors, but there is a workaround. See AgnostiqHQ/covalent#1889.

Remaining issue: need to be able to parallelize the following code on Dask.

```python
from dask.distributed import Client
from ase.build import bulk
from quacc.recipes.emt.slabs import bulk_to_slabs_flow

client = Client()
atoms = bulk("Cu")
delayed = bulk_to_slabs_flow(atoms)
result = client.gather(client.compute(delayed))
```

Hint: To monitor, set `"logfile": ""`.
Thanks for this @Andrew-S-Rosen. Ideally, all kwargs of electrons should be properties as well, settable after creation. I will update the issue as a bug/feature-enhancement. Cc: @kessler-frost / @jimmylism
CC: @cjao, any quick idea on why this would be the case for packages?
@Andrew-S-Rosen I think this may be due to how the objects are imported: https://stackoverflow.com/a/3536638
@Andrew-S-Rosen This is a fun question and boils down to how cloudpickle (Covalent's serialization mechanism) works. The key difference between your examples,

(1) non-working:

```python
import covalent as ct
from mypackage import increment, subflow

@ct.lattice
def workflow(x, op):
    return subflow(x, op)

increment.electron_object.executor = "local"
ct.dispatch(workflow)(1, increment)
```

(2) working:

```python
import covalent as ct
from mypackage import increment

@ct.lattice
def workflow(x, op):
    return op(x)

increment.electron_object.executor = "local"
ct.dispatch(workflow)(1, increment)
```

is that (1) pickles the wrapped function by reference rather than by value.

Example:

`mypackage.py`:

```python
class Electron:
    def __init__(self):
        self.executor = "dask"

def my_func():
    pass

my_func.electron_object = Electron()
```

`main.py`:

```python
import cloudpickle
import mypackage

mypackage.my_func.electron_object.executor = "local"

with open("my_func.pkl", "wb") as f:
    cloudpickle.dump(mypackage.my_func, f)
```

`exec.py`:

```python
import pickle

with open("my_func.pkl", "rb") as f:
    my_func = pickle.load(f)

print(my_func.electron_object.executor)
```

Running `main.py` and then `exec.py` prints `dask`: the function was pickled by reference, so `mypackage` is re-imported at load time and the runtime change to the executor is lost.

But now if we instruct cloudpickle to serialize the `mypackage` module by value:

```python
import cloudpickle
import mypackage

mypackage.my_func.electron_object.executor = "local"
cloudpickle.register_pickle_by_value(mypackage)

with open("my_func.pkl", "wb") as f:
    cloudpickle.dump(mypackage.my_func, f)
```

we get `local`.
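The by-reference behavior described above can be reproduced with the standard-library pickler alone; here `importlib.reload` stands in for the fresh worker process that `exec.py` simulates. The file and module names (`mypkg_demo`) are illustrative, not from the issue:

```python
# Plain pickle also stores module-level functions by reference
# (module name + qualified name), so attribute changes made at
# runtime never enter the pickle stream.
import importlib
import os
import pickle
import sys
import tempfile
import textwrap

pkg_dir = tempfile.mkdtemp()
with open(os.path.join(pkg_dir, "mypkg_demo.py"), "w") as f:
    f.write(textwrap.dedent("""\
        class Electron:
            def __init__(self):
                self.executor = "dask"

        def my_func():
            pass

        my_func.electron_object = Electron()
    """))

sys.path.insert(0, pkg_dir)
import mypkg_demo

# Mutate the imported function's attribute, then pickle the function.
mypkg_demo.my_func.electron_object.executor = "local"
blob = pickle.dumps(mypkg_demo.my_func)  # stored as a (module, name) reference

# Reloading the module plays the role of a fresh worker process:
# unpickling re-resolves my_func from the module, so the mutation is gone.
importlib.reload(mypkg_demo)
restored = pickle.loads(blob)
print(restored.electron_object.executor)  # -> dask
```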
@wjcunningham7, @cjao: Thank you both for the very insightful replies!

That's the main takeaway for sure. I didn't realize that. Subtle! In any case, this isn't a major issue, but I did want to flag it as a potential "gotcha." With this response, I feel pretty comfortable knowing what's going on now. Feel free to close if you wish!
Environment
What is happening?

Modifying the `.electron_object.executor` field of an imported electron does not always carry over in select circumstances.

How can we reproduce the issue?
Alright, bear with me for a moment! This is a pretty niche issue report.
Take the following example code:
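Presumably something like the following, with the electron and sublattice defined in the driver script itself (a sketch inferred from the package-based variant quoted in the comments; the exact decorator arrangement is an assumption):

```python
import covalent as ct

@ct.electron
def increment(x):
    return x + 1

# A sublattice: a lattice wrapped as an electron (arrangement assumed).
@ct.electron
@ct.lattice
def subflow(x, op):
    return op(x)

@ct.lattice
def workflow(x, op):
    return subflow(x, op)

# Dynamically switch the electron's executor before dispatching.
increment.electron_object.executor = "local"
ct.dispatch(workflow)(1, increment)
```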
This works perfectly. The `increment` job runs with the local executor rather than the Dask executor.

However, if you move the electron and sublattice into a Python package and import them, the executor is not updated appropriately:
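As quoted in @cjao's comment above, this is the non-working, package-based version (his example (1)):

```python
import covalent as ct
from mypackage import increment, subflow

@ct.lattice
def workflow(x, op):
    return subflow(x, op)

# Modify the imported electron's executor; this change is lost on dispatch.
increment.electron_object.executor = "local"
ct.dispatch(workflow)(1, increment)
```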
Here, the `increment` job will run via the Dask executor. This was as minimal of an example as I could make to reproduce the issue (see below).

What should happen?
The modified `.executor` field should be retained.

Any suggestions?
The following alternative approach works fine:
Note that simpler examples do work. For instance, the following example works fine:
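The simpler working version appears in @cjao's comment above as example (2): the imported electron is passed into the lattice and called directly:

```python
import covalent as ct
from mypackage import increment

@ct.lattice
def workflow(x, op):
    return op(x)

# Here the modified executor is honored.
increment.electron_object.executor = "local"
ct.dispatch(workflow)(1, increment)
```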
Maybe this has something to do with the mutability of `increment`... could very well be me.