
Why is the backward output reversed across first axes in the bidirectional RNN? #32

Open
ReallyCoolName opened this issue Jun 14, 2016 · 3 comments


@ReallyCoolName

Apologies if I'm missing something obvious,
but when you get the output for the bidirectional RNN:

    Xf = self.forward.get_output(train)
    Xb = self.backward.get_output(train)
    Xb = Xb[::-1]

It seems you're reversing the first (batch) dimension, aren't you?
(I assume I'm wrong, since this part of the file hasn't changed in its history and no one else seems to be asking about it. But as I understand it, Keras performs every operation on batches, so the first axis should be the batch axis.)
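To illustrate the concern: assuming the RNN output has shape `(batch, timesteps, features)`, as is standard in Keras, the two slicings behave very differently. A minimal NumPy sketch (the array `X` here is just a hypothetical stand-in for the layer output):

```python
import numpy as np

# Hypothetical RNN output with shape (batch=2, timesteps=3, features=1).
X = np.arange(2 * 3 * 1).reshape(2, 3, 1)

batch_reversed = X[::-1]     # flips axis 0: swaps whole samples in the batch
time_reversed = X[:, ::-1]   # flips axis 1: reverses each sample's timesteps

# Reversing axis 0 swaps samples but leaves each sequence intact.
print((batch_reversed[0] == X[1]).all())      # True
# Reversing axis 1 puts the last timestep of each sample first.
print((time_reversed[0, 0] == X[0, -1]).all())  # True
```

So `Xb[::-1]` shuffles which sample is which, while `Xb[:, ::-1]` is what actually undoes the time reversal of a backward RNN.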

@EderSantana
Owner

I think you are right. Did you run a unit test to confirm?

Also, would you have time to PR the change `Xb = Xb[:, ::-1]` along with some tests to prove correctness?
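Such a test could check that re-reversing the backward output along the time axis realigns it with the forward direction. A minimal sketch with NumPy standing in for the layer outputs (`forward_out` and `backward_out` are hypothetical names, not part of this code base):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-ins for the two RNNs' outputs, shape (batch, timesteps, features).
forward_out = rng.standard_normal((4, 5, 3))
# A backward RNN sees the sequence reversed in time; simulate its output here.
backward_out = forward_out[:, ::-1]

# The proposed fix re-reverses along the time axis (axis 1)...
aligned = backward_out[:, ::-1]
assert np.allclose(aligned, forward_out)

# ...whereas reversing axis 0 shuffles the batch and does not realign.
misaligned = backward_out[::-1]
assert not np.allclose(misaligned, forward_out)
```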

@ReallyCoolName
Author

I didn't run any tests; I just noticed it while looking over the code, trying to figure out why `build` was complaining about getting 2 arguments. I gave up and went back to figuring out how to use the Graph model in Keras.
(Not sure whether the whole `build` thing is a problem with your implementation; Keras has been complaining a lot.)
I'll probably have more time tomorrow to try to debug this; right now I'm trying to pull off a miracle :(

@EderSantana
Owner

By the way, note that this code base uses an older version of Keras; there are other ways to build a biRNN with keras_1.
