Defined classifier inputs and outputs #7
Conversation
This is a good start! I've left some suggestions and comments here. Also wondering if @chris-soelistyo has any comments?
```python
input = K.Input(shape=(2,))
encoder = layers.Encoder2D()(input)
dense = K.layers.Dense(512, activation="relu")(encoder)
output = K.layers.Dense(4, activation=tf.nn.softmax)(dense)
classifier_model = K.Model(inputs=input, outputs=output)
```
I think we should make this a function for now (and transition to a full model later), e.g.:

```python
def build_classifier(shape: tuple, outputs: int, dense_filters: int = 512):
    """add docstrings :)"""
    input = K.Input(shape=shape)
    ...
    return classifier_model
```
Co-authored-by: Alan R Lowe <[email protected]>
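For reference, a minimal runnable sketch of the suggested function. The `Conv2D`/`MaxPool2D` stack here is a hypothetical stand-in for cellx's `layers.Encoder2D` (which is not fully specified in this thread), and it already includes the `Flatten` that the later comment points out is needed:

```python
from tensorflow import keras as K


def build_classifier(shape: tuple, outputs: int, dense_filters: int = 512) -> K.Model:
    """Build a simple image classifier.

    The Conv2D/MaxPool2D block below is an assumed placeholder for
    cellx's ``layers.Encoder2D``.
    """
    inputs = K.Input(shape=shape)
    x = K.layers.Conv2D(32, 3, activation="relu")(inputs)
    x = K.layers.MaxPool2D()(x)
    # collapse the spatial dimensions before the dense head
    x = K.layers.Flatten()(x)
    x = K.layers.Dense(dense_filters, activation="relu")(x)
    output = K.layers.Dense(outputs, activation="softmax")(x)
    return K.Model(inputs=inputs, outputs=output)


model = build_classifier((64, 64, 1), outputs=4)
```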
```python
input = K.Input(shape=(2,))
encoder = layers.Encoder2D()(input)
dense = K.layers.Dense(512, activation="relu")(encoder)
```
Also, I think you'll need a `Flatten` here:

```python
flat = K.layers.Flatten()(encoder)
dense = ...(flat)
```
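To illustrate why: a convolutional encoder emits a 4D feature map, and `Dense` applied to it would only act on the channel axis. `Flatten` collapses the spatial dimensions so the dense layer sees the whole feature map. A small sketch, with an assumed `(8, 8, 16)` encoder output shape:

```python
from tensorflow import keras as K

# stand-in for the encoder output: batch x 8 x 8 x 16 feature map
features = K.Input(shape=(8, 8, 16))

# Flatten collapses 8 * 8 * 16 = 1024 features into a single axis
flat = K.layers.Flatten()(features)

# Dense now mixes information from every spatial location
dense = K.layers.Dense(512, activation="relu")(flat)
```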
```python
from cellx import layers

input = K.Input(shape=(2,))
encoder = layers.Encoder2D()(input)
```
Might perhaps be worth specifying the encoder layer when the `build_classifier` function is called as well, e.g.:

```python
encoder = layers.Encoder2D()

def build_classifier(shape: tuple, outputs: int, dense_filters: int = 512, encoder=encoder):
    """add docstrings :)"""
    input = K.Input(shape=shape)
    ...
    return classifier_model
```

It can be helpful to experiment with various instances of `Encoder2D` (with different numbers of layers, units per layer, etc.), and that would be served by being able to construct an `Encoder2D` outside of `build_classifier` and then incorporate it afterwards.
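A hedged sketch of that idea, using a keyword argument with a `None` default rather than a module-level instance (so each call gets a fresh encoder unless one is passed in); the small `Sequential` fallback is an assumed stand-in for `layers.Encoder2D`:

```python
from tensorflow import keras as K


def build_classifier(shape: tuple, outputs: int, dense_filters: int = 512, encoder=None) -> K.Model:
    """Sketch: accept a pre-built encoder so different instances can be swapped in."""
    if encoder is None:
        # assumed placeholder for cellx's layers.Encoder2D
        encoder = K.Sequential([K.layers.Conv2D(16, 3, activation="relu")])
    inputs = K.Input(shape=shape)
    x = encoder(inputs)
    x = K.layers.Flatten()(x)
    x = K.layers.Dense(dense_filters, activation="relu")(x)
    output = K.layers.Dense(outputs, activation="softmax")(x)
    return K.Model(inputs=inputs, outputs=output)


# experiment with a different encoder instance built outside the function
small_encoder = K.Sequential([K.layers.Conv2D(8, 3, activation="relu")])
model = build_classifier((32, 32, 1), outputs=4, encoder=small_encoder)
```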
Great point @chris-soelistyo ! Why don't we save this for a follow-up PR? Perhaps you can do that once we merge this.