This repository includes all labs/assignments required in the CS224N Winter 2019 Stanford NLP course. In addition, it contains a personal project based on the guidance provided.
- Practical tips - Some practical tips for future NLP-related side projects
- CS231n notes on backprop
- CS224N Home - Homepage for CS224N
- CS124: From Language to Information - A great course by Dan Jurafsky. It is mainly about extracting meaning, information, and structure from human language text, speech, web pages, genome sequences, social networks, or other less structured information. Methods include string algorithms, edit distance, language modeling, naive Bayes, inverted indices, and vector semantics. Applications include information retrieval, question answering, text classification, social network models, chatbots, genomic sequence alignment, word meaning extraction, and recommender systems.
- stanfordNLP
- NLTK
- Apache OpenNLP
- Kaldi, HTK, HTS
- Hadoop, Mahout, word2vec
- word segmentation
- POS tagging
- named entity recognition
- knowledge graph
- text classification
- information extraction
- keyword tagging
- automatic summarization
- LSTM
- CNN
- RNN
- seq2seq
- BERT
- Scan the sentence efficiently using a prefix dictionary structure, building a directed acyclic graph (DAG) of all possible word combinations.
- Use dynamic programming to find the most probable combination based on the word frequency.
- For unknown words, an HMM-based model with the Viterbi algorithm is used.
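The first two steps above (DAG construction, then dynamic programming over word frequencies) can be sketched in Python. This is a minimal illustration with a made-up toy dictionary, not a real segmenter; the HMM/Viterbi fallback for out-of-vocabulary words is omitted:

```python
import math

def build_dag(sentence, freq):
    """For each start index i, list every end index j such that
    sentence[i:j+1] is a dictionary word (the DAG of candidate words)."""
    n = len(sentence)
    dag = {}
    for i in range(n):
        ends = [j for j in range(i, n) if sentence[i:j + 1] in freq]
        if not ends:          # fall back to the single character
            ends = [i]
        dag[i] = ends
    return dag

def segment(sentence, freq):
    """Most probable segmentation via right-to-left dynamic programming,
    maximizing the sum of log word frequencies (a unigram model)."""
    n = len(sentence)
    dag = build_dag(sentence, freq)
    logtotal = math.log(sum(freq.values()))
    # route[i] = (best log-probability of sentence[i:], end index of the
    # first word on that best path)
    route = {n: (0.0, 0)}
    for i in range(n - 1, -1, -1):
        route[i] = max(
            (math.log(freq.get(sentence[i:j + 1], 1)) - logtotal
             + route[j + 1][0], j)
            for j in dag[i]
        )
    # Walk the route forward to recover the chosen words.
    words, i = [], 0
    while i < n:
        j = route[i][1]
        words.append(sentence[i:j + 1])
        i = j + 1
    return words

# Toy frequency dictionary (illustrative values only).
FREQ = {"t": 1, "h": 1, "e": 1, "c": 1, "a": 1,
        "the": 60, "cat": 40, "heca": 2}
print(segment("thecat", FREQ))  # → ['the', 'cat']
```

Working right to left means `route[j + 1]` is already computed when position `i` is evaluated, so each position is visited once and the whole search runs in time proportional to the number of DAG edges.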