Feature/relu #18


Open · wants to merge 48 commits into main

Conversation

Nereuxofficial (Collaborator)

I finally had time to implement this. This time I implemented it as a feature, which should be nicer to use, since it lets users opt in like this: `neat-gru-rust = { version = "1.1.1", features = ["relu"] }`; without the feature, the tanh function will be used.
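For context, a compile-time switch like this is typically wired up with `cfg` attributes. A minimal sketch, assuming the `num-traits` crate; the function name here is illustrative, not the crate's actual internals:

```rust
use num_traits::Float;

// With `features = ["relu"]` the relu variant is compiled in;
// without it, tanh remains the default.
#[cfg(feature = "relu")]
fn activation<T: Float>(x: T) -> T {
    x.max(T::zero()) // ReLU: max(0, x)
}

#[cfg(not(feature = "relu"))]
fn activation<T: Float>(x: T) -> T {
    x.tanh() // default: tanh
}
```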

Nereuxofficial marked this pull request as ready for review on November 22, 2022 11:09
Nereuxofficial requested a review from sakex on November 22, 2022 14:51
sakex (Owner) left a comment:


Thanks for the PR, I really like the idea. However, I'm wondering if we shouldn't go further and allow users to provide their own activation functions by creating a trait that the user implements to choose their own methods.

Also, I can't merge it in its current state because tanh and sigmoid should be separated in the compute method; otherwise it will break for users.
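One possible shape for that trait, purely as a sketch (the trait and type names below are made up for illustration, assuming the `num-traits` crate, and are not an actual API proposal):

```rust
use num_traits::Float;

/// Illustrative user-implementable activation trait.
pub trait ActivationFunction<T> {
    fn activate(x: T) -> T;
}

pub struct Tanh;
impl<T: Float> ActivationFunction<T> for Tanh {
    fn activate(x: T) -> T {
        x.tanh()
    }
}

pub struct Relu;
impl<T: Float> ActivationFunction<T> for Relu {
    fn activate(x: T) -> T {
        x.max(T::zero())
    }
}
```

A user could then supply their own type implementing the trait instead of being limited to the built-in activations.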

@@ -95,20 +95,20 @@ where
    #[replace_numeric_literals(T::from(literal).unwrap())]
    #[inline]
    pub fn get_value(&mut self) -> T {
        let update_gate = fast_sigmoid(self.update);
        let reset_gate = fast_sigmoid(self.reset);
        let current_memory = fast_tanh(self.input + self.memory * reset_gate);
sakex (Owner):

We should have options for tanh, sigmoid and relu for all possible activations, as it will bias the output.

Nereuxofficial (Collaborator, PR author) commented Dec 5, 2022

Thanks for the PR, I really like the idea. However, I'm wondering if we shouldn't go further and allow users to provide their own activation functions by creating a trait that the user implements to choose their own methods.

Also, I can't merge it in its current state because tanh and sigmoid should be separated in the compute method; otherwise it will break for users.

Yeah, that sounds like a better idea: make the NN generic over its activation function. I will try that later.
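As a rough, self-contained sketch of that direction (the struct, fields, and update formula below are simplified stand-ins, not the crate's actual GRU code):

```rust
use num_traits::Float;
use std::marker::PhantomData;

// Same illustrative trait as in the sketch above, repeated so this
// snippet stands alone.
pub trait ActivationFunction<T> {
    fn activate(x: T) -> T;
}

pub struct Tanh;
impl<T: Float> ActivationFunction<T> for Tanh {
    fn activate(x: T) -> T {
        x.tanh()
    }
}

// Simplified stand-in for a GRU gate, generic over the memory activation.
pub struct Gate<T, A: ActivationFunction<T>> {
    input: T,
    memory: T,
    update: T,
    reset: T,
    _activation: PhantomData<A>,
}

impl<T: Float, A: ActivationFunction<T>> Gate<T, A> {
    fn sigmoid(x: T) -> T {
        T::one() / (T::one() + (-x).exp())
    }

    pub fn get_value(&self) -> T {
        // The sigmoid gates stay fixed; only the memory activation is
        // pluggable, so swapping in relu does not change the gating.
        let update_gate = Self::sigmoid(self.update);
        let reset_gate = Self::sigmoid(self.reset);
        let current_memory = A::activate(self.input + self.memory * reset_gate);
        update_gate * self.memory + (T::one() - update_gate) * current_memory
    }
}
```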
