
About IPA alphabet #1

Open
loretoparisi opened this issue Apr 13, 2018 · 2 comments
@loretoparisi

Thanks a lot for this amazing and inspiring work. I'm currently working on a Tensor2Tensor-like LSTM encoder/decoder G2P model, but using the CMU-to-IPA dictionary/alphabet. Your model uses the standard CMU/ARPAbet; what about using IPA instead? See https://github.com/loretoparisi/docker/tree/master/g2p-seq2seq

@viveksck
Owner

Thanks for letting me know! When we were writing the paper, we used a pre-trained model (on the standard CMU dataset), mainly due to time constraints. Indeed, we note in our paper that using IPA might improve the models, since stress is accounted for. We will look at the IPA-based models you pointed out as well! Thanks!
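For context, the CMU/ARPAbet-to-IPA conversion being discussed can be sketched as below. This is an editor's illustration, not code from either project: the phoneme table is an illustrative subset of the full ARPAbet inventory, and the stress handling simply turns CMUdict's vowel stress digits into IPA stress marks placed before the vowel.

```python
# Minimal sketch: converting a CMUdict (ARPAbet) phone sequence to IPA,
# preserving lexical stress. Illustrative subset of the phoneme table.
ARPABET_TO_IPA = {
    # vowels (stress digit is stripped before lookup)
    "AA": "ɑ", "AE": "æ", "AH": "ʌ", "AO": "ɔ", "AW": "aʊ", "AY": "aɪ",
    "EH": "ɛ", "ER": "ɝ", "EY": "eɪ", "IH": "ɪ", "IY": "i", "OW": "oʊ",
    "OY": "ɔɪ", "UH": "ʊ", "UW": "u",
    # consonants
    "B": "b", "CH": "tʃ", "D": "d", "DH": "ð", "F": "f", "G": "ɡ",
    "HH": "h", "JH": "dʒ", "K": "k", "L": "l", "M": "m", "N": "n",
    "NG": "ŋ", "P": "p", "R": "ɹ", "S": "s", "SH": "ʃ", "T": "t",
    "TH": "θ", "V": "v", "W": "w", "Y": "j", "Z": "z", "ZH": "ʒ",
}
STRESS_MARKS = {"1": "ˈ", "2": "ˌ"}  # primary / secondary stress; 0 = unstressed

def arpabet_to_ipa(phones):
    """Convert a CMUdict phone list, e.g. ['HH', 'AH0', 'L', 'OW1'], to IPA."""
    out = []
    for phone in phones:
        stress = ""
        if phone[-1].isdigit():  # CMUdict vowels carry a trailing stress digit
            phone, digit = phone[:-1], phone[-1]
            stress = STRESS_MARKS.get(digit, "")
        out.append(stress + ARPABET_TO_IPA[phone])
    return "".join(out)

print(arpabet_to_ipa(["HH", "AH0", "L", "OW1"]))  # hʌlˈoʊ
```

Training a seq2seq G2P model on such IPA targets would expose the stress distinctions that plain ARPAbet-without-digits targets discard.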

@loretoparisi
Author

loretoparisi commented Apr 24, 2018

@viveksck You're welcome! I'll let you know the results of the model training. We are now working on the Tensor2Tensor seq2seq architecture by CMU and the CMU-to-IPA dictionary.
