Replies: 1 comment
-
Hi @bcingram, thanks for the detailed explanation; we will take a look at this. For `$nodes` we are planning to add a global context to the graph, meaning that every node that is about to evaluate will also have access to the outputs of all previous nodes. This will let graphs get by without complex connections, keeping only the minimum required.
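To make the "global context" idea concrete, here is a minimal sketch of the pattern being described: nodes record their output once into a shared map that any later node can read, instead of each node carrying forward copies of all prior outputs. All names here (`GraphContext`, `record`, `lookup`) are illustrative, not the actual zen-engine API.

```rust
use std::collections::HashMap;

// Hypothetical sketch of a shared graph context; node outputs are stored
// once (keyed by node name) rather than duplicated into every payload.
struct GraphContext {
    nodes: HashMap<String, String>, // node name -> serialized output
}

impl GraphContext {
    fn new() -> Self {
        Self { nodes: HashMap::new() }
    }

    // A node records its output exactly once; nothing is copied downstream.
    fn record(&mut self, node: &str, output: String) {
        self.nodes.insert(node.to_string(), output);
    }

    // Any later node can look up any previous node's output on demand.
    fn lookup(&self, node: &str) -> Option<&String> {
        self.nodes.get(node)
    }
}
```

The design point is that lookup is lazy: a downstream node pays only for the upstream outputs it actually reads, so payloads no longer grow with graph depth.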
-
Hey team, fantastic work with gorules, I love what you guys have done and where you are headed with this!
I'm working on some complex graphs where it makes more sense to use the new Switch node than to translate some of the decision logic into a decision table. When a graph contains numerous Switch nodes, performance seems to slow to a crawl once a certain critical mass of Switches is reached.
After investigating, it appears that each Switch passes all accumulated node data ($nodes) as part of its input to the next node. The next Switch then does the same, seemingly doubling the size of the input payload, and so on. After a while this bogs down the parser to the point where a graph that ran in a handful of milliseconds can take 10, 20, or more seconds to run.
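As a rough back-of-the-envelope model of why this blows up (an assumption about the growth pattern, not measured zen-engine behavior): if each Switch's outgoing payload embeds a copy of everything accumulated so far, the payload roughly doubles per Switch, so growth is exponential in the number of chained Switches.

```rust
// Illustrative model only: assumes each hop carries its own data plus a
// full copy of everything accumulated so far, i.e. the payload doubles
// per chained switch.
fn payload_size_after(switches: u32, base_bytes: usize) -> usize {
    let mut total = base_bytes;
    for _ in 0..switches {
        total *= 2; // the copied $nodes data doubles the payload
    }
    total
}

fn main() {
    // Under this model, 1 KiB of input reaches 1 MiB after 10 chained
    // switches, which would explain millisecond runs turning into seconds.
    for n in [1, 5, 10, 20] {
        println!("{n} switches -> {} bytes", payload_size_after(n, 1024));
    }
}
```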
It looks like there's a simple fix, but I'm not sure what the impact, if any, might be. In traversal.rs, around line 125, a hard-coded boolean (true) is passed to the function incoming_node_data. When I compile the library myself with that boolean set to false, the graphs execute efficiently once more. The flag appears to control the feature where Switches forward all traversed $nodes, duplicating the data to the point that it affects processing time.
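For readers following along, here is a hedged sketch of the pattern being described: a boolean deciding whether every upstream node's output gets folded into the payload handed to the next node. The signature and the `$nodes.<i>.` key prefix are illustrative guesses, not the real traversal.rs internals.

```rust
use std::collections::HashMap;

type NodeData = HashMap<String, String>;

// Illustrative only: mirrors the reported behavior where a boolean
// controls whether all upstream node outputs are merged into the input
// handed to the next node. Not the actual zen-engine function.
fn incoming_node_data(own: &NodeData, upstream: &[NodeData], with_nodes: bool) -> NodeData {
    let mut merged = own.clone();
    if with_nodes {
        // Copy every prior node's data under a "$nodes.<i>." prefix;
        // this is the duplication that compounds across chained Switches.
        for (i, node) in upstream.iter().enumerate() {
            for (k, v) in node {
                merged.insert(format!("$nodes.{i}.{k}"), v.clone());
            }
        }
    }
    merged
}
```

With `with_nodes` set to false, the merged payload is just the node's own data, which matches the efficient behavior observed after the recompile.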
The easy fix was to recompile and use my modified libzen.so, but I'm curious whether there's a better way to resolve this problem. Am I missing an easier workaround? Is all of that redundant node data necessary? In my testing it makes zero difference to the output of my decision models.
If this data isn't needed, is disabling it something you could offer in the official cloud version of your service? It would make working on these larger Switch-based graphs so much nicer.
Once again, appreciate what you've done here!