
Setting Activations on layers #260

Open
ZadravecM opened this issue Aug 27, 2018 · 3 comments


@ZadravecM commented Aug 27, 2018

Hi,

I am working with the seq2seq library and I've run into a problem.

My training vectors contain integer values from 1 to 11399, e.g. [1, 200, 1235, 11300, ...].

But the predicted values I get back are always between 0 and 1, e.g. [0, 0.2, 0.3, 1, ...].

I guess this is due to an activation function (softmax?). Is there a way to define the activation function for the seq2seq model layers?

Marko

@mladl commented Sep 27, 2018

I ran into the same problem. Did you solve it?

@yyb1995 commented Nov 4, 2018

I think it's because there's a W2 dense layer after the normal LSTM layer. Line 45 of cell.py is y = Activation(self.activation)(W2(h)): h is the output of the LSTM cell, and y is the output of W2. So you can try changing the activation function on this line.
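For anyone who wants to see what that change amounts to, here is a minimal standalone sketch of the same pattern in plain Keras. This is not the library's actual cell.py; the layer sizes and wiring are assumptions for the demo.

```python
# Illustrative sketch only: a Dense layer (W2) applied to an LSTM output h,
# followed by an explicit Activation, mirroring the pattern on line 45 of cell.py.
from keras.models import Model
from keras.layers import Input, LSTM, Dense, Activation

TIMESTEPS, UNITS, OUTPUT_DIM = 10, 32, 1  # demo values, not the library defaults

inp = Input(shape=(TIMESTEPS, 1))
h = LSTM(UNITS)(inp)              # h: the output of the LSTM cell
W2 = Dense(OUTPUT_DIM)
y = Activation('tanh')(W2(h))     # bounded: squashes y into [-1, 1]
# y = Activation('linear')(W2(h)) # unbounded outputs, suitable for regression

model = Model(inp, y)
model.compile(optimizer='adam', loss='mse')
```

With 'tanh' (or 'sigmoid'/'softmax') the outputs can never reach targets like 11399; swapping in 'linear' removes that bound.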

@mladl commented Nov 4, 2018

> I think it's because there's a W2 dense layer after the normal LSTM layer. […] So you can try changing the activation function on this line.

I think you are right. The default activation function is tanh. Normalizing the data is another way to solve the problem.
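A minimal sketch of the normalization approach, assuming scikit-learn's MinMaxScaler (any equivalent rescaling works): scale the targets into the activation's output range before training, then invert the transform on the predictions.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Targets like the ones in the original post: integers in [1, 11399].
y = np.array([[1], [200], [1235], [11300]], dtype=float)

# Squash into [0, 1] so a bounded output activation can reach every target.
scaler = MinMaxScaler(feature_range=(0, 1))
y_scaled = scaler.fit_transform(y)

# ... train the seq2seq model against y_scaled ...

# Map predictions back to the original scale.
preds_scaled = np.array([[0.0], [0.2], [0.3], [1.0]])  # stand-in for model.predict(...)
preds = scaler.inverse_transform(preds_scaled)
print(preds.ravel())  # values are back in the original [1, 11399] range
```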
