Feb. 2020 – Aug. 2020
1. Derive the following by hand: MLE, MAP, XGBoost, SVM, dual SVM,
Expectation–Maximization (EM) Algorithm, K-means, Gaussian Mixture Model
(GMM), Viterbi Algorithm, Hidden Markov Model (HMM), Conditional Random
Field (CRF), Log-Linear Model, Probabilistic Graphical Model (PGM),
Skip-Gram, Latent Dirichlet Allocation (LDA), Gibbs Sampling.
2. NLP fundamentals, including but not limited to the Noisy Channel
Model, N-Gram, CBoW, Skip-Gram, RNN, LSTM, Seq2Seq, GloVe, ELMo,
POS tagging, dependency parsing, and Named Entity Recognition.
3. Knowledge Graphs, including bootstrapping, entity disambiguation, entity
resolution, co-reference resolution, etc.
4. Transformer, BERT, RoBERTa, XLNet, GPT, ALBERT.
5. Graph Neural Networks, including Recurrent Graph Neural Networks (RecGNNs),
Gated Graph Neural Networks (GGNNs), Spatial Convolutional Neural Networks
(including Message Passing Neural Network, GraphSAGE, PATCHY-SAN), Spectral
Convolutional Neural Networks (including Graph Fourier Transform, Chebyshev
Spectral CNN), and Graph Attention Networks (GATs).