Grading Scheme: Evaluation will be based on a set of assignments, a midterm, and a final exam. Important: you must pass the final in order to pass the course. The instructor reserves the right to adjust this grading scheme during the term, if necessary.
If your grade improves substantially from the midterm to the final, defined as a final exam grade that is at least 20% higher than the midterm grade, then the following grade breakdown will be used instead.
Assignments -- 15%
Readings: Questions and Summaries -- 10%
Midterm -- 15%
Final -- 60%
Your overall assignment grade will be computed by adding up the number of points you earn across all assignments, dividing this number by the total number of possible points, and multiplying by the assignment weight (20 under the regular breakdown, or 15 under the alternative breakdown above). Assignments will not all be graded out of the same number of points; this means that they will not be weighted equally.
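To make this computation concrete, here is a small Python sketch (illustrative only, not an official tool); the assignment point values are made-up examples, and the weight is 20 or 15 depending on which breakdown applies.

```python
# Illustrative sketch of the overall assignment grade described above.
# Point values are hypothetical; weight is 20 (regular breakdown) or 15 (alternative).
def overall_assignment_grade(points_earned, points_possible, weight):
    """Total earned points divided by total possible points, times the weight."""
    return sum(points_earned) / sum(points_possible) * weight

earned = [30, 45, 75]      # hypothetical scores on assignments worth 40, 60, and 100 points
possible = [40, 60, 100]
print(overall_assignment_grade(earned, possible, 20))  # 15.0 (out of 20)
```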
Working with a partner on assignments is highly recommended. To promote this kind of collaboration, you will receive a 5% bonus on any assignment where you work with a partner. For example, if an assignment is out of 100 points, you will receive 5 bonus points on it if you work with a partner (for a maximum of 105/100, which can "spill over" onto other assignments but cannot be used to bring your overall assignment grade over 100%); a small illustrative calculation appears after the formatting requirements below. Note: to optimize your learning, you should actively collaborate with your partner, rather than simply having each partner work on part of the assignment. Your partnership should submit only one copy of the assignment; if both members submit, you will not receive the partnership bonus.

Assignments are to be typed (not handwritten) and submitted electronically on Canvas before the start of lecture on the due date. For each assignment, your submission must be formatted as a single PDF file. That means:
No Word files (all modern word processors have the option to save as PDF)
No zip/tar/rar/etc. files
No submissions with multiple files
Submissions failing to meet these formatting requirements will not be graded.
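To illustrate how the partnership bonus and the 100% cap interact, here is a small Python sketch (illustrative only); the two assignment point values are hypothetical.

```python
# Illustrative sketch of the 5% partnership bonus and the 100% cap described above.
def overall_assignment_percentage(earned, possible, partnered):
    """Add 5% of an assignment's point value when it was done with a partner,
    then cap the overall assignment grade at 100%."""
    total = sum(e + (0.05 * p if with_partner else 0)
                for e, p, with_partner in zip(earned, possible, partnered))
    return min(total / sum(possible), 1.0) * 100

# Full marks plus the bonus on both (hypothetical) assignments still gives 100%, not 105%:
print(overall_assignment_percentage([40, 60], [40, 60], [True, True]))  # 100.0
```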
Late Assignments: Assignments are to be handed in BEFORE the start of lecture on the due date. However, every student is allotted four "late days", which allow assignments to be handed in late without penalty on up to four days or parts of days during the term. The purpose of late days is to give students the flexibility to manage unexpected obstacles to coursework that arise during the term, such as travel, moderate illness, conflicts with other courses, extracurricular obligations, job interviews, etc. Thus, additional late days will NOT be granted except under truly exceptional circumstances. If an assignment is submitted late and a student has used up all of her/his late days, 20% will be deducted for every day the assignment is late. (E.g., an assignment 2 days late and graded out of 100 points will be awarded a maximum of 60 points.)
How late does something have to be to use up a late day? A day is defined as a 24-hour block of time beginning at 1PM on the day an assignment is due. To use a late day, write the number of late days claimed on the first page of your assignment.
Examples: an assignment submitted at 2PM on the due date uses one late day; an assignment submitted at 2PM the following day uses two late days.
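To make the rule concrete, here is a small Python sketch (illustrative only); the due date and submission times are hypothetical, and the way remaining late days offset the 20%-per-day deduction is one reading of the policy above.

```python
# Illustrative sketch of counting late days (24-hour blocks starting at 1PM on the
# due date) and the 20%-per-day deduction described above.
from datetime import datetime, timedelta
import math

def late_days_used(due_at_1pm, submitted):
    """Number of 24-hour blocks, starting at 1PM on the due date, the submission falls into."""
    if submitted <= due_at_1pm:
        return 0
    return math.ceil((submitted - due_at_1pm) / timedelta(days=1))

def max_points(out_of, days_late, late_days_remaining):
    """Deduct 20% of the assignment's value for each late day not covered by remaining
    late days (assumption: leftover late days offset the penalty day for day)."""
    penalized = max(days_late - late_days_remaining, 0)
    return max(out_of * (1 - 0.20 * penalized), 0)

due = datetime(2021, 1, 25, 13, 0)                         # hypothetical due date, 1PM
print(late_days_used(due, datetime(2021, 1, 26, 10, 0)))   # 1 (within the first block)
print(late_days_used(due, datetime(2021, 1, 26, 14, 0)))   # 2 (the second block has started)
print(max_points(100, 2, 0))                               # 60.0, matching the example above
```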
Assignments can be handed in electronically using Canvas; this is the only way to hand in late assignments over a weekend. Written work can also be put in Giuseppe's mailbox in the main CS office (room 201); ask the secretary to time-stamp it.
Missing Deadlines or Exams: In truly exceptional circumstances, when accompanied by a note from Student Health Services or a Department Advisor, special arrangements will be made.
Submitting the work of another person as your own (i.e. plagiarism) constitutes academic misconduct, as does communication with others (either as donor or recipient) in ways other than those permitted for homework and exams. Such actions will not be tolerated.
You may ask questions about assignment questions on discussion boards. However, you may not publicly post your work or solutions (whether complete or partial).
You may fully collaborate with your partner. You may also discuss the assignments with other students; however:
You may not show your work to other students or look at other students' work (the same applies to sharing answers or checking whether you got the same answers as other students)
You may not take away any written record of your discussions with other students
After discussions with other students, you must wait at least half an hour before working on the assignment, to help ensure that you are working from your own understanding of the material.
On the first page of each assignment submission, you must acknowledge any students outside your partnership with whom you discussed the assignment.
All work for this course is required to be new work and cannot be submitted as part of an assignment in another course without the approval of all instructors involved.
Violations of these rules constitute very serious academic misconduct, and they are subject to penalties ranging from a grade of zero on the current and *all* previous assignments to indefinite suspension from the University. More information on procedures and penalties can be found in the Department's Policy on Plagiarism and Collaboration. If you are in any doubt about the interpretation of any of these rules, consult the instructor or a TA!
For information on department policies related to student well-being, please visit https://www.cs.ubc.ca/students/undergrad/resources/equity-inclusion-wellness.
Readings: what to do
Here is where you can find the course schedule and the PPT and PDF files from the lectures. Some dates will change during the term, but this schedule will be kept up to date. Assignment due dates are provided to give you a rough sense of timing; however, they are also subject to change. I will try to always post the slides in advance (by 3PM). After class, I will post the same slides inked with the notes I add in class.
Date | Lecture | Notes |
1 Mon, Jan 11 | Course Overview [pdf] | |
2 Wed, Jan 13 | Value of Info and Control - start Markov Decision Processes (MDPs) [pdf] | "322" Slides on Decision Networks and on Markov Chains s1, s2, s3, s4 |
3 Fri, Jan 15 | MDP example start Value Iteration [pdf] | Practice Ex 9.C (removed - not consistent with latest 2nd edition) |
4 Mon, Jan 18 | MDPs Value Iteration [pdf] | FYI (not a required reading!) Planning with Markov Decision Processes: An AI Perspective, Synthesis Lectures on Artificial Intelligence and Machine Learning, June 2012, 210 pages |
5 Wed, Jan 20 | Finish MDPs and start Partially Observable MDPs (POMDPs) [pdf] | |
6 Fri, Jan 22 | POMDPs (cont') [pdf] | - Assignment 1 out - Blackjack.xml. See http://www.cs.uwaterloo. See also DESPOT JAIR 2017 |
7 Mon, Jan 25 | Reinforcement Learning (RL) [pdf] | Practice Ex 11.A |
8 Wed, Jan 27 | Reinforcement Learning (RL)(cont') [pdf] | |
9 Fri, Jan 29 | Paper Discussion MDP for scheduling (Medicine) [ppt] [pdf] YOUR QUESTIONS | A Markov decision process approach to multi-category patient scheduling in a diagnostic facility, Artificial Intelligence in Medicine Journal, 2011 [pdf] - MDPs vs. Heuristic Methods |
10 Mon, Feb 1 | Finish RL - SARSA [pdf] | Ex 11.B |
11 Wed, Feb 3 | Recap BNets - Start Approximate Inference in BNets [pdf] | Practice Ex 6.E, BN Company NorSys - assignment 1 due / assignment 2 out |
12 Fri, Feb 5 | Approx. Inference - Likelihood Weighting, MCMC (Gibbs Sampling) [pdf] | BNet tool (with approx. inference algorithms) GeNIe |
13 Mon, Feb 8 | Paper Discussion (ITS) - Paper on application of a relatively large BNet (where approx. inference is needed) slides [pdf] YOUR QUESTIONS | Using Bayesian Networks to Manage Uncertainty in Student Modeling. Journal of User Modeling and User-Adapted Interaction, 2002 [pdf] - Dynamic BN (required only up to page 400) |
14 Wed, Feb 10 | Temporal Inference - HMM (Filtering, Prediction) [pdf] | |
15 Fri, Feb 12 | HMM (Smoothing, just start Viterbi) [pdf] | |
Mon, Feb 15 | Family Day - University closed | |
Mon, Feb 15 - Fri, Feb 19 | Winter Session Term 2 mid-term break (February 15 to 19 inclusive) | |
16 Mon, Feb 22 | Finish Viterbi - Approx. Inference in Temporal Models (Particle Filtering) [pdf] | |
17 Wed, Feb 24 | Intro Graphical Models -Undirected Graphical Models - Markov Networks [pdf] | |
18 Fri, Feb 26 | Inference in Markov Networks, Conditional Random Fields (CRFs) - Naive Markov [pdf] | FYI (not a required reading!) An Introduction to Conditional Random Fields. Charles Sutton, Andrew McCallum. Foundations and Trends in Machine Learning 4 (4), 2012. |
19 Mon, Mar 1 | Linear Chain CRFs - NLP applications [pdf] | MALLET - assignment 2 due |
20 Wed, Mar 3 | Full Propositional Logics, Language and Inference [pdf] | |
21 Fri, Mar 5 | Finish Resolution, Satisfiability, WalkSAT [pdf] | |
Mon, Mar 8 | Midterm exam (Zoom) - will start at 4 sharp | |
22 Wed, Mar 10 | SAT encoding example - First Order Logics (FOL) [pdf] | |
23 Fri, Mar 12 | Ontologies/Description Logics: Wordnet, UMLS, Yago, Probase, ... [pdf] | assignment 3 out |
24 Mon, Mar 15 | Similarity Measures: Concepts in Ontologies and Distributional for Words [pdf] | - Wordnet and YAGO (Wikipedia + Wordnet + GeoNames). See also MS Research Probase, Google Knowledge Graph and Freebase, and MS Concept Graph - (Domain-specific thesaurus) Medical Subject Headings (MeSH) |
25 Wed, Mar 17 | NLP: Context-Free Grammars and Parsing [pdf] | (for the next three lectures) Russell and Norvig's Artificial Intelligence: A Modern Approach (third or fourth edition) [webpage], Part VI Communicating, Perceiving, and Acting |
26 Fri, Mar 19 | Probabilistic Context Free Grammar (1) [pdf] | |
27 Mon, Mar 22 | CANCELLED | |
28 Wed, Mar 24 | Probabilistic Context Free Grammar (2) [pdf] | - Berkeley Parser with demo |
29 Fri, Mar 26 | Markov Logics (1) Representation [pdf] | Markov Logic: An Interface Layer for Artificial Intelligence. P. Domingos (University of Washington) and D. Lowd (University of Oregon), 2009 |
30 Mon, Mar 29 | Markov Logics (2) Inference [pdf] | Alchemy is a software package providing a series of algorithms for statistical relational learning and probabilistic logic inference, based on the Markov logic representation. assignment 3 due - assignment 4 out |
31 Wed, Mar 31 | Finish Markov Logics Review + Applications [pdf] | |
Fri, Apr 2 | Good Friday - University closed | |
Mon, Apr 5 | Easter Monday - University closed | |
32 Wed, Apr 7 | Probabilistic Relational Models (1) Representation [pdf] | Sample application to recommender systems |
33 Fri, Apr 9 | Probabilistic Relational Models (2) Parameters and Inference [pdf] | (only if we cover plate notation) Practice Ex. 14.A |
34 Mon, Apr 12 | Discuss EMNLP 2020 paper: MEGA RST Discourse Treebanks with Structure and Nuclearity from Scalable Distant Sentiment Supervision (guest speaker: PhD student Patrick Huber, first author of the paper!). Material to review before the reading: CKY, Exploration/Exploitation trade-off in RL, Beam Search (from 322), Recurrent Neural Networks (if you have seen them in 340 or other courses). PATRICK's SLIDES | MEGA-DT Paper pdf - FYI: paper on our discourse parser that can be trained on MEGA-DT (COLING 2020) |
Wed, Apr 14 | Beyond 3/422, AI research, Watson etc. [pdf] | assignment 4 due - Some relevant papers from IUI-15: Modeling users' interests |
Fri, Apr 23 | Final Exam, 12:00 PM - 2:30 PM (2.5 hours), Zoom? | |
MAYBE Paper Discussion (NLP) on PCFG and CRFs [ppt] [pdf] student-questions [pdf] | Portions of CL paper: CODRA: A Novel Discriminative Framework for Rhetorical Analysis. Computational Linguistics (CL), Vol. 41, No. 3: 385–435, 2015, MIT Press. Only sections 1, 3 and 4 are mandatory. CODE - DEMO |