
Synapse trace #61

Open

wants to merge 9 commits into main
Conversation

saeedark (Collaborator)

This PR includes many changes; the main one moves trace behaviour into synapses instead of neurons, which allows for more flexibility.

for key_behavior in synapse.src.behavior:
    if isinstance(synapse.src.behavior[key_behavior], SpikeTrace):
        synapse.behavior[key_behavior]
for key_behavior in synapse.behavior:
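The filtering pattern in this diff can be sketched in isolation. Here `SpikeTrace` is a stand-in class, not the real Conex behavior, and the integer keys are illustrative priority keys as used by Pymonntorch behavior dicts:

```python
class SpikeTrace:  # stand-in for the Conex behavior class
    pass

# a behavior dict maps priority keys to behavior objects
behavior = {1: SpikeTrace(), 2: object(), 3: SpikeTrace()}

# collect the keys whose behavior is a SpikeTrace instance
trace_keys = [k for k, b in behavior.items() if isinstance(b, SpikeTrace)]
# trace_keys == [1, 3]
```

After the move described in this PR, such filtering would run over `synapse.behavior` directly instead of `synapse.src.behavior`.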
Member

could you explain the changes?

Collaborator Author

This specific learning method needs the variable tau_s to perform its operation; since the trace now resides on the synapse, accessing the src neuron group is wrong.
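The role of tau_s can be illustrated with a standard exponential spike-trace update. This is a hedged, dependency-free sketch of the textbook rule, not necessarily the exact Conex implementation:

```python
def spike_trace_step(trace, spikes, tau_s, spike_scale=1.0, dt=1.0):
    # decay each trace value toward zero with time constant tau_s,
    # then bump it by spike_scale wherever a spike occurred
    return [x - (x / tau_s) * dt + spike_scale * s
            for x, s in zip(trace, spikes)]

trace = spike_trace_step([0.0, 2.0], [1, 0], tau_s=10.0)
# trace == [1.0, 1.8]
```

A rule like this is why the behavior that owns the trace must also own (or be able to reach) tau_s.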

conex/behaviors/synapses/learning.py
).expand(synapse.dst.size)
if synapse.src is not None:
    synapse.src_shape = (1, 1, synapse.src.size)
    if hasattr(synapse.src, "depth"):
Member

Although it is safe by design, IMHO a comment is needed to document the existence guarantee of .height and .width too.

Collaborator Author

I agree. There are going to be changes in Pymonntorch which will require refactoring these in a more general way.
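The guarantee under discussion can be sketched with a hypothetical stand-in for the neuron group; `NeuronGroupLike`, its attribute values, and the shape tuple below are illustrative, not Conex API:

```python
class NeuronGroupLike:
    # assumed invariant under discussion: whenever .depth exists,
    # .height and .width exist too
    depth, height, width = 1, 28, 28

src = NeuronGroupLike()
if hasattr(src, "depth"):
    # safe by design, but worth a comment in the real code:
    # .height and .width are guaranteed alongside .depth
    src_shape = (src.depth, src.height, src.width)
# src_shape == (1, 28, 28)
```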

spike_scale (float): the increase effect of spikes on the trace.
"""

def __init__(self, tau_s, *args, spike_scale=1.0, **kwargs):
Member

Pre-trace is a new behavior; I think the following would be more descriptive:

def __init__(self, tau_s, /, *, spike_scale=1.0, **kwargs):

Collaborator Author

As far as I know, using the "/" symbol in the function signature prevents specifying "tau_s" by keyword, which is not desired.
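This point can be checked directly. The sketch below uses hypothetical `trace_init` helpers rather than the actual PR code; only the parameter markers matter:

```python
def trace_init(tau_s, *, spike_scale=1.0):
    # tau_s may be passed positionally or by keyword
    return tau_s, spike_scale

def trace_init_pos_only(tau_s, /, *, spike_scale=1.0):
    # the "/" makes tau_s positional-only
    return tau_s, spike_scale

ok = trace_init(tau_s=10.0)            # fine: keyword form allowed
also_ok = trace_init_pos_only(10.0)    # fine: positional form allowed
try:
    trace_init_pos_only(tau_s=10.0)    # refused by the "/" marker
    rejected = False
except TypeError:
    rejected = True
# rejected is True
```

So "/" tightens the call convention, while "*" alone keeps `tau_s` usable both ways and only forces `spike_scale` to be a keyword.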

@@ -15,12 +15,14 @@ class GridEraseMask:
m (int): The grid row size.
n (int): The grid column size.
random (bool): If true, shuffles the order of the masks.
gap (tuple(int)): left, bottom, up, right gaps for the cell.
Member

Is it possible to change the order to a more universal version, or to the torch standard for padding:
(padding_left, padding_right, padding_top, padding_bottom)?
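The torch convention referenced here is the one `torch.nn.functional.pad` uses for the last two dimensions: (left, right, top, bottom). A dependency-free sketch of that ordering, with a hypothetical `pad2d` helper:

```python
def pad2d(rows, padding):
    # padding follows torch's F.pad order for a 2-D image:
    # (left, right, top, bottom)
    left, right, top, bottom = padding
    width = len(rows[0]) + left + right
    out = [[0] * width for _ in range(top)]
    for r in rows:
        out.append([0] * left + list(r) + [0] * right)
    out += [[0] * width for _ in range(bottom)]
    return out

padded = pad2d([[1, 1], [1, 1]], (1, 2, 3, 4))
# 2 rows + 3 top + 4 bottom -> 9 rows; 2 cols + 1 left + 2 right -> 5 cols
```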

Collaborator Author

Done. Too bad vim keybindings are not considered universal.

@@ -33,7 +35,20 @@ def __call__(self, img):
    )
    for index, ij in enumerate(product(range(self.m), range(self.n))):
        i, j = ij
        result.append(TF.erase(img, i * h_grid, j * w_grid, h_grid, w_grid, v=0))
        h_cor = i * h_grid + self.gap[0]
Member

Deconstruct the gap and use local variables:

gap_left, gap_bottom, gap_up, gap_right = self.gap

or define constants:

class Direction:
    LEFT = 0
    BOTTOM = 1
    UP = 2
    RIGHT = 3

h_cor = i * h_grid + self.gap[Direction.LEFT]

Collaborator Author

Defining a class is overkill; deconstructing is pretty reasonable, so a new commit has adopted that style.
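The adopted style can be sketched with illustrative values; the gap order (left, bottom, up, right) is taken from the docstring above, and the numbers are made up:

```python
gap = (1, 2, 3, 4)  # (left, bottom, up, right), per the docstring
gap_left, gap_bottom, gap_up, gap_right = gap

i, h_grid = 2, 10
# same value as i * h_grid + self.gap[0], but self-documenting
h_cor = i * h_grid + gap_left
# h_cor == 21
```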
