Synapse trace #61
base: main
Conversation
```diff
-for key_behavior in synapse.src.behavior:
-    if isinstance(synapse.src.behavior[key_behavior], SpikeTrace):
+for key_behavior in synapse.behavior:
+    if isinstance(synapse.behavior[key_behavior], SpikeTrace):
```
Could you explain the changes?
This specific learning method needs the variable `tau_s` to perform its operation. Since the trace now resides on the synapse, accessing the src neuron group is wrong.
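A minimal sketch of the corrected lookup, assuming `synapse.behavior` is a dict-like mapping of behaviors and `SpikeTrace` is importable from the project's trace module (the helper name `find_tau_s` is hypothetical):

```python
def find_tau_s(synapse):
    """Read tau_s from the synapse's own SpikeTrace behavior, if any."""
    for key_behavior in synapse.behavior:
        if isinstance(synapse.behavior[key_behavior], SpikeTrace):
            # The trace now lives on the synapse itself, so tau_s must be
            # read here rather than from synapse.src (the source group).
            return synapse.behavior[key_behavior].tau_s
    return None
```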
```python
).expand(synapse.dst.size)
if synapse.src is not None:
    synapse.src_shape = (1, 1, synapse.src.size)
    if hasattr(synapse.src, "depth"):
```
Although it is safe by design, imho a comment is needed for the existence guarantee of `.height` and `.width` too.
I agree. There are going to be changes in Pymonntorch which would require refactoring these in a more general way.
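A sketch of what such a comment could look like; the `height`/`width` branch here is an assumption about the intended shape handling, not the PR's exact code:

```python
if synapse.src is not None:
    if hasattr(synapse.src, "depth"):
        # Safe by design: a neuron group that defines `depth` is guaranteed
        # to define `height` and `width` as well.
        synapse.src_shape = (synapse.src.depth, synapse.src.height, synapse.src.width)
    else:
        synapse.src_shape = (1, 1, synapse.src.size)
```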
```python
    spike_scale (float): the increase effect of spikes on the trace.
    """

    def __init__(self, tau_s, *args, spike_scale=1.0, **kwargs):
```
Pre-trace is a new behavior; I think the following would be more descriptive:

```python
def __init__(self, tau_s, /, *, spike_scale=1.0, **kwargs):
```
As far as I know, using the `/` symbol in the function signature prevents specifying `tau_s` by keyword, which is not desired.
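A quick self-contained demonstration of that objection:

```python
def f(tau_s, /, *, spike_scale=1.0):
    return tau_s, spike_scale

print(f(10.0, spike_scale=2.0))  # OK: tau_s passed positionally
try:
    f(tau_s=10.0)  # "/" makes tau_s positional-only
except TypeError as err:
    print(err)  # f() got some positional-only arguments passed as keyword arguments: 'tau_s'
```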
conex/helpers/transforms/masks.py (outdated)
```diff
@@ -15,12 +15,14 @@ class GridEraseMask:
     m (int): The grid row size.
     n (int): The grid column size.
     random (bool): If true, shuffles the order of the masks.
+    gap (tuple(int)): left, bottom, up, right gaps for the cell.
```
Is it possible to change the order to a more universal version, or to the torch standard for padding: `(padding_left, padding_right, padding_top, padding_bottom)`?
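For reference, a minimal example of the suggested order; `torch.nn.functional.pad` uses exactly this convention for the last two dimensions:

```python
import torch
import torch.nn.functional as F

img = torch.ones(1, 4, 4)           # (C, H, W)
# 4-tuple order: (padding_left, padding_right, padding_top, padding_bottom)
padded = F.pad(img, (1, 2, 3, 4))
print(padded.shape)                 # torch.Size([1, 11, 7]): H = 4+3+4, W = 4+1+2
```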
Done; too bad vim keybindings are not considered universal.
conex/helpers/transforms/masks.py (outdated)
```diff
@@ -33,7 +35,20 @@ def __call__(self, img):
         )
         for index, ij in enumerate(product(range(self.m), range(self.n))):
             i, j = ij
-            result.append(TF.erase(img, i * h_grid, j * w_grid, h_grid, w_grid, v=0))
+            h_cor = i * h_grid + self.gap[0]
```
Deconstruct the gap and use local variables:

```python
gap_left, gap_bottom, gap_up, gap_right = self.gap
```

or define constants:

```python
class Direction:
    LEFT = 0
    BOTTOM = 1
    UP = 2
    RIGHT = 3

h_cor = i * h_grid + self.gap[Direction.LEFT]
```
Defining a class is overkill; deconstructing is reasonable enough that a new commit has adopted this style.
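A sketch of the destructured style in context; the helper below is hypothetical and assumes the gap order was switched to the torch convention `(left, right, top, bottom)` per the earlier comment:

```python
from itertools import product
import torch
import torchvision.transforms.functional as TF

def grid_erase(img, m, n, gap):
    """Erase an m-by-n grid of cells from img, each shrunk inward by gap."""
    gap_left, gap_right, gap_top, gap_bottom = gap
    h_grid, w_grid = img.shape[-2] // m, img.shape[-1] // n
    result = []
    for i, j in product(range(m), range(n)):
        h_cor = i * h_grid + gap_top            # top edge of the erased area
        w_cor = j * w_grid + gap_left           # left edge of the erased area
        h_len = h_grid - gap_top - gap_bottom   # erased height within the cell
        w_len = w_grid - gap_left - gap_right   # erased width within the cell
        result.append(TF.erase(img, h_cor, w_cor, h_len, w_len, v=0))
    return result
```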
This PR has many changes; the main one moves the trace behavior onto synapses instead of neurons, which allows for more flexibility.
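A minimal sketch of the idea, not the PR's actual classes: a pre-synaptic trace stored on the synapse that decays with time constant `tau_s` and jumps by `spike_scale` on each source spike.

```python
import torch

class SynapseSpikeTrace:
    """Illustrative synapse-side spike trace (class and method names are assumptions)."""

    def __init__(self, tau_s, spike_scale=1.0):
        self.tau_s = tau_s
        self.spike_scale = spike_scale
        self.trace = None

    def step(self, src_spikes, dt=1.0):
        if self.trace is None:
            self.trace = torch.zeros(src_spikes.shape)
        # Exponential decay toward zero, plus an additive kick for each spike:
        self.trace += dt * (-self.trace / self.tau_s)
        self.trace += self.spike_scale * src_spikes.float()
        return self.trace

# Usage: a trace driven by the source group's boolean spike vector.
trace = SynapseSpikeTrace(tau_s=10.0, spike_scale=1.0)
spikes = torch.tensor([True, False, True])
print(trace.step(spikes))  # tensor([1., 0., 1.])
```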