This schedule is subject to change. A slightly more detailed list of topics (as PDF) can be found here.
Day | Date | Topic/slides | Lab | Reading |
1 | 6 July | Introduction, probability estimation, entropy | | |
2 | 9 July | n-gram language models | distributions, estimation, likelihood | JM 4.1-4.4, 4.10-4.11 |
3 | 13 July | Smoothing, part-of-speech tagging | | JM 4.5-4.7, 5.1-5.3 |
4 | 16 July | Hidden Markov models | POS tagging | JM 5.5, 6.1-6.5 |
5 | 20 July | Parsing algorithms, human parsing | | JM 12.5, 13.1-13.4 |
6 | 23 July | Treebanks and statistical parsing | parsing | JM 12.4, 14.1-14.6, 14.10 |
7 | 27 July | Word senses, distributional semantics, classification | | JM 19.1-19.3, 20.1-20.3, 20.7 |
8 | 30 July | Learning (Sharon's research) | Sentiment on Twitter, data | |