In order to reduce the amount of work that goes into identifying changes made to a graph, we can track changes as we make them, rather than sorting, serializing, and writing ALL contexts to disk as we do now.
Three approaches occur to me:
1. When triples are added to a context graph, mark it as 'dirty'. Then we only ever need to serialize dirty graphs.
2. Like 1, but compute a sort of hash function, h, of adds and removes where h([add(t1), add(t2), remove(t1)]) == h([add(t2)]). Typically we don't do many removals, but if we did, this would detect them. Technically, we would also need to handle hash collisions. (A sketch of one such hash is below the list.)
3. Record the actual adds and removes to each context literally in a journal. Updates would then just be writes to this journal, and commits would read from it. (A journal sketch is also below.)
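A minimal sketch of option 2, assuming triples are plain hashable tuples; the `ChangeHash` name and shape are hypothetical, not anything in owmeta today. XOR-ing a per-triple digest in on add and out again on remove gives the cancellation property above, because XOR is its own inverse:

```python
import hashlib


def _triple_digest(triple):
    # Hash one triple to a fixed-width integer (128 bits here).
    data = "\x1f".join(str(term) for term in triple).encode("utf-8")
    return int.from_bytes(hashlib.blake2b(data, digest_size=16).digest(), "big")


class ChangeHash:
    """Order-insensitive hash of the adds/removes applied to one context."""

    def __init__(self):
        self.value = 0

    def add(self, triple):
        self.value ^= _triple_digest(triple)

    def remove(self, triple):
        # XOR is self-inverse, so removing a previously added triple cancels it:
        # h([add(t1), add(t2), remove(t1)]) == h([add(t2)])
        self.value ^= _triple_digest(triple)


# Cancellation property:
a = ChangeHash()
a.add(("s1", "p", "o"))
a.add(("s2", "p", "o"))
a.remove(("s1", "p", "o"))

b = ChangeHash()
b.add(("s2", "p", "o"))

assert a.value == b.value
```

A value of 0 would mean "no net change", so a commit could skip that context entirely; the collision handling mentioned above could be a fallback to a full comparison when hashes happen to match.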
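And a sketch of option 3, again with hypothetical names: updates append (op, context, triple) records to a journal file, and commit replays the journal to find which contexts actually have a net change to re-serialize.

```python
import json


class ChangeJournal:
    """Append-only journal of adds/removes, replayed per context at commit time."""

    def __init__(self, path):
        self.path = path

    def record(self, op, context, triple):
        # op is 'add' or 'remove'; one JSON record per line.
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps({"op": op, "ctx": context, "triple": list(triple)}) + "\n")

    def changed_contexts(self):
        # Replay the journal and keep only the net change per context,
        # so commit only touches contexts where something actually changed.
        changes = {}
        with open(self.path, encoding="utf-8") as f:
            for line in f:
                rec = json.loads(line)
                ops = changes.setdefault(rec["ctx"], set())
                entry = (rec["op"], tuple(rec["triple"]))
                inverse = ("remove" if rec["op"] == "add" else "add", tuple(rec["triple"]))
                if inverse in ops:
                    ops.discard(inverse)  # add then remove (or the reverse) cancels out
                else:
                    ops.add(entry)
        return {ctx: ops for ctx, ops in changes.items() if ops}
```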
Relates to openworm/owmeta#350