For nodes derive from HasChannel, but node.channel raises an error #603
I was trying to connect a for node with another node using the "call-like" syntax sugar, but that triggers an error as below, though explicitly connecting inputs/outputs works fine.

MWE:

Error:

Comments
Ok, so this is not a case of a bug in execution or performance, but just a very uninformative error message. There is nothing wrong with the "call-like" syntax here; the trouble is with the "node-as-channel" syntax. In this case, it's because the for-node built with `output_as_dataframe=False` exposes more than one output channel:

```python
import pyiron_workflow as pwf

@pwf.as_function_node
def Square(x: int) -> int:
    y = x**2
    return y

sf_channels = Square.for_node(iter_on='x', output_as_dataframe=False)
print(sf_channels.outputs.labels)
>>> ['x', 'y']
```

On the other hand, if you left the outputs packaged as a single dataframe channel, you'd be able to access that single output without trouble:

```python
sf_df = Square.for_node(iter_on='x', output_as_dataframe=True)
print(sf_df.channel.full_label)
>>> /ForSquareIterXDataOut.df
```

You can then leverage the entire node as though it were output, and you can even convert the dataframe column to a list entirely within the graph:

```python
@pwf.as_function_node
def Sum(xs: list[int]) -> int:
    s = sum(xs)
    return s

sf_df.inputs.x = [1, 2, 3]
xcol = pwf.standard_nodes.GetItem(sf_df, "y")
tolist = pwf.standard_nodes.GetAttr(xcol, "tolist")
df_col_as_list = pwf.standard_nodes.PureCall(tolist)
sum_from_dataframe = Sum(df_col_as_list)
print(sum_from_dataframe())
>>> 14
```

This is why I introduced `output_as_dataframe=False` -- the outputs can then be wired up individually:

```python
sf_channels.inputs.x = [1, 2, 3]
sum_from_channels = Sum(sf_channels.outputs.y)
print(sum_from_channels())
```

Note that in both cases the looped-over input `x` shows up alongside the output, as the labels above show. So, ultimately, I think the only problem here is that the error message is "I can't find 'channel'" instead of "You tried to get 'channel' when there's more than one output". Let me poke around and see if I can get it to be more explicit in this case.
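As a purely illustrative sketch (not pyiron_workflow's actual implementation), the kind of guard meant here would check the number of output channels before resolving `channel` and complain descriptively when there is more than one:

```python
# Hypothetical stand-ins (FakeChannel/FakeNode) -- only to show the shape of
# a clearer error message, not the real pyiron_workflow classes.
from dataclasses import dataclass


@dataclass
class FakeChannel:
    label: str


@dataclass
class FakeNode:
    label: str
    outputs: list  # list of FakeChannel

    @property
    def channel(self) -> FakeChannel:
        # Fail loudly and descriptively when the node is ambiguous as a channel.
        if len(self.outputs) != 1:
            raise AttributeError(
                f"{self.label} has {len(self.outputs)} output channels "
                f"({[c.label for c in self.outputs]}), so it cannot be treated "
                "as a single channel; connect one of its outputs explicitly."
            )
        return self.outputs[0]


node = FakeNode("ForSquareIterX", [FakeChannel("x"), FakeChannel("y")])
try:
    node.channel
except AttributeError as err:
    print(err)
```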
Aha, so I understand the issue, and I agree that throwing some kind of error when doing what I did makes sense. I kind of wish, though, that it didn't return the input again. In the dataframe case I see the utility, but when outputting just values it seems counterproductive. In the case where I need the input and output together, I can easily wire the input of the for node to whatever consumer also needs its output, but in the case where I do not need the input, the workflow just leaked a bunch of memory. In a way I'd prefer an option to not get the inputs back at all.

On a more general note: this opened up the design question for me of whether it's better to stitch together a number of for nodes (which is where this example originally came from) or to stitch the individual nodes together and wrap the collection into a single for node. The first was more natural to me, especially in terms of checkpointing, but maybe the latter is more ergonomic.
I figured that out later in the afternoon after reading the code a bit, though in the beginning it felt somewhat natural for me to instantiate it first. It may have just been me playing with the singular node first.

Tangentially, there is an asymmetry between the two steps in

```python
@Workflow.wrap.as_function_node
def MyNode(...):
    pass

MyNodeLoop = MyNode.for_node(iter_on=...)
```

because the second one gives me an instance, unless I misunderstood something somewhere. I could easily write the loop node by hand using the unwrapped function, but it'd be neater if I didn't need to.

I'm not sure.
In principle, I agree. I'm even ok moving there in the long run if we find that the memory becomes a problem. In the meantime, it is not necessarily trivial to match inputs to outputs, so I think it's convenient to give them to users all the time. Consider:

```python
import pyiron_workflow as pwf

n = pwf.standard_nodes.Add.for_node(
    iter_on="obj",
    zip_on="other",
    obj=["a", "b", "c"],
    other=["A", "B", "C"],
    output_as_dataframe=False,
)
n()
```
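For context, a plain-Python illustration (not pyiron_workflow's internals): once product-style iteration over one input is mixed with zipping over others, the only reliable way for a downstream consumer to know which result belongs to which input combination is to carry the inputs along in each output row.

```python
# Plain Python only -- illustrating why echoing inputs into the output is handy.
xs = [1, 2]            # iterated (nested / product style)
ys = [10, 20, 30]      # zipped with zs
zs = [100, 200, 300]

rows = []
for x in xs:
    for y, z in zip(ys, zs):
        # Keeping the inputs next to each result is what lets the consumer
        # tell which combination produced which value.
        rows.append({"x": x, "y": y, "z": z, "result": x * y + z})

for row in rows:
    print(row)
```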
We might be able to get a decorator working like that, but already you can just subclass `For`:

```python
from typing import ClassVar

import pyiron_workflow as pwf
from pyiron_workflow.nodes.for_loop import For
from pyiron_workflow.nodes.static_io import StaticNode


class AddFor(For):
    _body_node_class: ClassVar[type[StaticNode]] = pwf.standard_nodes.Add
    _iter_on: ClassVar[tuple[str, ...]] = ("obj",)
    _zip_on: ClassVar[tuple[str, ...]] = ("other",)
    _output_as_dataframe: ClassVar[bool] = False


n = AddFor(
    obj=["a", "b", "c"],
    other=["A", "B", "C"],
)
n()
```
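As a possible follow-up, assuming the `AddFor` class above behaves like any other node class: since it is a class rather than an instance, it can be instantiated as many times as needed, which is roughly what the `MyNodeLoop` tangent above was after.

```python
# Assuming the AddFor definition from the snippet above -- this just reuses
# the class for two independent loop instances.
first = AddFor(obj=["a", "b"], other=["A", "B"])
second = AddFor(obj=["x", "y"], other=["X", "Y"])
first()
second()
```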
That's a valid complaint. Your understanding is correct. To go into more detail, it's a little bit of an edge case.
I did, and it's a pain. I couldn't manage to get the ambiguous output exception into the stack trace for composites with multiple outputs without causing other trouble. Not a good sign 👎 I don't think I have time to work on it more this week.

@pmrv, I'd like to keep the issue open while the error message sucks, but IMO it's a documentation issue rather than a bug -- behaviour is basically what we want, we're just not doing a good job describing to the user what happened. I switched the issue label accordingly, but lmk if you object.

A-ok, I agree that the functionality works. I'd just like to be better informed by the error.

I had seen this class, but I honestly was a bit scared to derive from something with a lot of underscores. Your snippet worked perfectly fine for me though, so it's ok for now.

Right, when taking the product of inputs it becomes tricky. Ok for now, but maybe offering a dedicated map in the future would be nice UX.
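To make that suggestion concrete, here is a purely hypothetical sketch (this helper does not exist in pyiron_workflow) of the kind of map-style UX meant above: apply a body to each value and get back only the results, without the inputs echoed into the output.

```python
# Hypothetical sketch only -- illustrating the suggested UX, not an existing API.
from typing import Callable, Iterable, TypeVar

T = TypeVar("T")
R = TypeVar("R")


def map_outputs_only(body: Callable[[T], R], values: Iterable[T]) -> list[R]:
    """Apply `body` to each value and return just the results."""
    return [body(v) for v in values]


def square(x: int) -> int:
    return x**2


print(map_outputs_only(square, [1, 2, 3]))  # [1, 4, 9]
```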