Experiments in Natural Language Generation using RNNs (character- and word-level) and a GAN (character-level), with an evaluation of their performance
The Big Bang Theory S1-S10: https://github.com/skashyap7/TBBTCorpus/tree/master/preprocessing
Crawled from: https://bigbangtrans.wordpress.com/
Training data used for the experiments: all of Sheldon's transcripts, extracted from raw_corpus
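As a rough illustration of the extraction step, the sketch below pulls one speaker's utterances out of a transcript. It assumes each transcript line has the form "Speaker: utterance"; the actual format of raw_corpus in this repo may differ, and `extract_speaker_lines` is a hypothetical helper, not code from the repo.

```python
# Hypothetical sketch: extract Sheldon's lines from a raw transcript.
# Assumes each line is formatted "Speaker: utterance"; the real
# raw_corpus format may differ.

def extract_speaker_lines(transcript_lines, speaker="Sheldon"):
    """Return the utterances spoken by `speaker`."""
    utterances = []
    prefix = speaker + ":"
    for line in transcript_lines:
        line = line.strip()
        if line.startswith(prefix):
            utterances.append(line[len(prefix):].strip())
    return utterances

raw = [
    "Sheldon: I'm not crazy, my mother had me tested.",
    "Leonard: Our babies will be smart and beautiful.",
    "Sheldon: Bazinga!",
]
print(extract_speaker_lines(raw))
```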
All related reports and presentations
Code for crawling and preprocessing the data
Code for character- and word-level generation using RNNs
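The decoding step common to char-level RNN generation can be sketched as temperature sampling: the network's raw scores (logits) over the character vocabulary are turned into a probability distribution and the next character is drawn from it. This is a minimal standalone sketch, not the repo's implementation; the logits and vocabulary below are made up for illustration.

```python
# Minimal sketch of temperature sampling for char-level generation.
# Lower temperature sharpens the distribution (more greedy output);
# higher temperature flattens it (more diverse, noisier output).
import math
import random

def sample_char(logits, vocab, temperature=1.0):
    """Apply a temperature-scaled softmax to `logits`, then sample one char."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = random.random()
    cum = 0.0
    for ch, p in zip(vocab, probs):
        cum += p
        if r <= cum:
            return ch
    return vocab[-1]                     # guard against float round-off

vocab = ['a', 'b', 'c']
print(sample_char([2.0, 1.0, 0.1], vocab, temperature=0.8))
```

At generation time this step is applied once per output character, feeding each sampled character back into the RNN as the next input.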
Sample outputs and training logs from training the character-level GAN with a sequence length of 80; the code is in the references below
- RNN model:
- GAN:
- Crawler: