Problems with mixing diverse with optimization #2

Open
daemontus opened this issue Nov 11, 2023 · 5 comments

Comments

@daemontus

Hi!

I'm not sure if my approach is correct, but I'm trying to do a diverse sampling of optimized networks using the following:

bo.maximize_nodes()
bo.maximize_strong_constants()
for bn in bo.diverse_boolean_networks():
    print(bn)

However, this fails with:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[7], line 31
     29 diverse = bonesis.DiverseBooleanNetworksView(bo)
     30 print(diverse.standalone())
---> 31 for bn in diverse:
     32     print(bn)

File ~/miniconda/envs/celloracle_env/lib/python3.8/site-packages/bonesis/views.py:173, in BonesisView.__next__(self)
    171             self._progressbar.close()
    172 elif self.mode == "optN":
--> 173     while not self.cur_model.optimality_proven:
    174         self.cur_model = next(self._iterator)
    175         self._progress_tick()

AttributeError: 'MPBooleanNetwork' object has no attribute 'optimality_proven'

Is this a bug, or is the combination of diverse sampling and optimization currently unsupported?

I am also trying to run the ASP program returned by bonesis.DiverseBooleanNetworksView(bo).standalone(), and it is indeed returning something, but I haven't yet validated the results thoroughly.
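
(A minimal sketch of one way to run such a standalone program with clingo, reusing the bo object from the snippet above; the file name and the trailing "0" are assumptions, not taken from this thread.)

# Sketch: dump the standalone ASP program and solve it with clingo directly.
# The file name is arbitrary; "0" asks clingo to enumerate all answer sets.
asp = bonesis.DiverseBooleanNetworksView(bo).standalone()
with open("diverse.lp", "w") as f:
    f.write(str(asp))
# then, from a shell:
#   clingo diverse.lp 0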

@daemontus
Author

Also, I might have missed this somewhere, but is there a way to "decode" the output of the .standalone() ASP program assuming I know the configuration of bonesis that generated the ASP?

@pauleve
Member

pauleve commented Nov 15, 2023

Hi,
It is not supported for now. The workaround (and actually the recommended way to do it) is to first enumerate the sets of nodes resulting from the optimization (with NonConstantNodesView); then, for each solution, prune the influence graph accordingly and perform a diverse enumeration of Boolean networks without any optimization.
This is recommended because, in the first stage, you can relax the canonicity constraint (canonic=False of the InfluenceGraph domain) and then enforce it in the second stage on a much smaller network (so much better performance).
You can find some related notebooks here:
https://github.com/StephanieChevalier/notebooks_for_bonesis
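
(A rough sketch of this two-stage workflow, under a few assumptions not stated in this thread: that NonConstantNodesView is exposed at the package top level like DiverseBooleanNetworksView and yields the set of non-constant nodes of each optimal solution, that edge signs are given via a sign attribute, and that the limit keyword is accepted; observation constraints are omitted for brevity.)

import networkx as nx
import bonesis

# Toy influence graph; replace with your own PKN (sign: 1 activation, -1 inhibition).
pkn = nx.DiGraph()
pkn.add_edge("a", "b", sign=1)
pkn.add_edge("b", "c", sign=-1)
pkn.add_edge("c", "a", sign=1)

# Stage 1: relaxed domain (no canonicity), optimization, and enumeration of the
# optimal sets of non-constant nodes. Add your observation constraints here.
dom = bonesis.InfluenceGraph(pkn, canonic=False)
bo = bonesis.BoNesis(dom, {})
bo.maximize_nodes()
bo.maximize_strong_constants()

for nodes in bonesis.NonConstantNodesView(bo):
    # Stage 2: prune the influence graph to the selected nodes, re-enable
    # canonicity, and do a diverse enumeration without any optimization.
    sub_dom = bonesis.InfluenceGraph(pkn.subgraph(nodes).copy(), canonic=True)
    bo2 = bonesis.BoNesis(sub_dom, {})
    for bn in bo2.diverse_boolean_networks(limit=10):
        print(bn)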

Regarding decoding the output, you have several ..._of_facts methods in bonesis0.asp_encoding that may be helpful.

@daemontus
Author

Awesome, thank you! I looked at Stephanie's notebooks but I couldn't find much about the diverse sampling, hence the question.

Maybe one more thing, just to be sure... The difference between exact and canonic is that exact means "all regulations of the influence graph should be essential (i.e. used by the BN)" while canonic means "all resulting networks are not only syntactically but also semantically unique", right?

@pauleve
Member

pauleve commented Nov 15, 2023

Yes. exact means that each edge of the influence graph must be used with the same sign. By default, this forbids influence graphs with unsigned edges (because only locally-monotone BNs are supported for learning). In the git version (to be released soon), exact="unsigned" ignores the sign, so it can work with influence graphs with undetermined signs.
In applications, we do not use exact much; it is essentially for small/synthetic case studies.
canonic ensures a canonical representation of the BN, so that any two solutions have different global functions. It requires a number of variables quadratic in the number of regulators and clauses, so we try to avoid it when doing the initial pruning of the GRN.
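
(A small illustration of the two options; a sketch only, where pkn stands for a networkx DiGraph with signed edges, and exact="unsigned" requires the git version mentioned above.)

# First-stage pruning: relax canonicity for speed.
dom_pruning = bonesis.InfluenceGraph(pkn, canonic=False)

# Strict domain: canonical BN representation, and every edge of the influence
# graph must be used with its given sign.
dom_strict = bonesis.InfluenceGraph(pkn, canonic=True, exact=True)

# Git version only: require every edge to be used, but ignore its sign.
dom_unsigned = bonesis.InfluenceGraph(pkn, exact="unsigned")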

@daemontus
Author

Cool, I understand then. Thank you!
