diochnos/teaching

Fall 2017: CS 4710 - Artificial Intelligence

Table of Contents

[Course Description] [Basic Information] [Teaching Assistants] [Assigned Readings] [Optional Readings] [Events] [Homework Assignments] [Class Log]

Course Description

Introduces artificial intelligence. Covers fundamental concepts and techniques and surveys selected application areas. Core material includes state space search, logic, and resolution theorem proving. Application areas may include expert systems, natural language understanding, planning, Bayesian networks, Markov models, multi-agent systems, machine learning, or machine perception. Provides exposure to AI implementation methods.

[Course Description] [Table of Contents] [Top]

Basic Information

Syllabus

The syllabus is available here.

Time and Location

Mondays and Wednesdays, 2:00pm – 3:15pm, 341 Mechanical Engineering Building.

Contact Information

Please see here.

Office Hours

I hold my office hours in 414 Rice Hall.

Mondays: 12pm-1pm
Thursdays: 11am-1pm
Exceptions to the Regular Schedule of Office Hours

If you want to meet me outside of my office hours, the most reliable method is to send an email and arrange an appointment.

For possible exceptions to the regular office hours that the TAs hold, please see exceptions for TAs below.

[Basic Information] [Table of Contents] [Top]

Teaching Assistants

The following undergraduate students, listed alphabetically by last name, are the teaching assistants for the course.

Office Hours

Teaching assistants hold their office hours in room 436 Rice Hall. A tentative schedule is shown below starting from the week commencing on Monday, September 4, 2017.

Mondays: 11am-12pm (Brittany); 12pm-1pm (Dimitris @ 414RH)
Wednesdays: 11am-1pm (Alyson)
Thursdays: 11am-1pm (Dimitris @ 414RH)
Fridays: 11am-1pm (Akshay)
Additional weekly slots: 1pm-2pm (Derrick); 4pm-5pm (Tushar); 5pm-6pm (Derrick); 6pm-7pm (Shijia)
Exceptions to the Regular Schedule of Office Hours for Teaching Assistants

Friday, September 8, 11am-1pm: Moving earlier to Tuesday, September 5, 12:30pm-2:00pm, 436 Rice Hall.

Friday, September 15, 11am-1pm: Moving earlier to Tuesday, September 12, 12:30pm-2:00pm, 436 Rice Hall.

Wednesday, September 20, 11am-1pm: Only one hour will be held between 11am-12pm (same place: 436 Rice Hall).

Monday, September 25, 11am-12pm: Moving later to Thursday, September 28, 2-3pm, Thornton stacks.

Monday, October 9, 11am-12pm: Moving earlier by one hour; that is, October 9, 10am-11am, 436 Rice Hall.

[Teaching Assistants] [Table of Contents] [Top]

Assigned Readings

[Assigned Readings] [Table of Contents] [Top]

Optional Readings

[Optional Readings] [Table of Contents] [Top]

Events

Below we will post announcements about events that take place on campus and are related to artificial intelligence.

[Events] [Table of Contents] [Top]

Homework Assignments

Assignments will be announced through Collab as we make progress. There will be 4-5 assignments, worth 50% of your grade in total.

Assignment 1: Announced Monday, September 4. Due Friday, September 22.

Assignment 2: Announced Monday, September 25. Due Monday, October 9.

Assignment 3: Announced Monday, October 23. Due Tuesday, November 7.

Assignment 4: Announced Wednesday, November 8. Due Wednesday, November 29.

[Homework Assignments] [Table of Contents] [Top]

Class Log

Class 1 (Aug 23, 2017)

Introduction to the course. Syllabus and policies.

Assigned reading: Computer Science as Empirical Inquiry: Symbols and Search, by Allen Newell and Herbert A. Simon

Class 2 (Aug 28, 2017)

Introduction to artificial intelligence (AI). Typical problems and approaches. Different sub-fields of AI and philosophical foundations and concerns.

Relevant material from the book: Sections 1.1 and 1.2, as well as 26.1 and 26.2. Also, Section 1.4 is a one-page summary of state-of-the-art achievements in AI.

Class 3 (Aug 30, 2017)

Introduction to knowledge representation and the different ways in which we can represent knowledge. Two main approaches: logic-based representations, and associations using semantic networks (primarily expressed using frames).

Optional reading: Commonsense knowledge bases and network analysis.

Class 4 (Sep 4, 2017)

Announcing the first homework assignment through Collab; due: Friday, September 22, 2017.

Mention the Piazza forum.

Brief review of discrete mathematics: predicate logic, connectives, conditionals, propositional functions, quantifiers, and inference using modus ponens. An example with a rule-based system.

Deriving new knowledge using forward chaining. Discussed conflict resolution. Reason maintenance.

Querying using backward chaining and obtaining explanations.

CLIPS and examples on CLIPS.

Overview of expert systems' architecture. Automatic knowledge acquisition. MYCIN as an example of an expert system.

Relevant material from the book: Sections 2.1 and 2.2, and Chapter 7 up to Section 7.5.1, where modus ponens is discussed. Note, however, that it may be more convenient to refer to your own resource on discrete mathematics for such a review; this is only indicative.
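As a small illustration of the forward-chaining loop from this class, here is a sketch in Python. The rule base is a made-up animal-identification example (not one of the CLIPS examples we saw), and the encoding is only one of many possibilities.

```python
# Minimal forward chaining over propositional Horn rules.
# Each rule is (premises, conclusion); facts is a set of known atoms.
# Repeatedly fire any rule whose premises all hold until nothing new is derived.

def forward_chain(rules, facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)  # derive a new piece of knowledge
                changed = True
    return facts

# Hypothetical rule base: a tiny animal-identification example.
rules = [
    ({"has_fur", "gives_milk"}, "mammal"),
    ({"mammal", "eats_meat"}, "carnivore"),
    ({"carnivore", "has_stripes"}, "tiger"),
]

derived = forward_chain(rules, {"has_fur", "gives_milk", "eats_meat", "has_stripes"})
```

Conflict resolution does not arise here because the loop fires every applicable rule; a production system such as CLIPS would instead pick one rule per cycle according to a strategy.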

Class 5 (Sep 6, 2017)

Classical search. Breadth-first search (BFS) and depth-first search (DFS). Depth-limited search and iterative-deepening DFS.

Relevant material from the book: Sections 3.3 - 3.4.5.

Optional reading: The Science of Brute Force, by Marijn J.H. Heule and Oliver Kullmann.
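For reference, BFS and iterative-deepening DFS can be sketched in a few lines of Python; the graph below is a made-up example, not a problem from class.

```python
from collections import deque

# Breadth-first search over an explicit graph, returning a shortest path.

def bfs(graph, start, goal):
    frontier = deque([[start]])   # FIFO queue of partial paths
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

# Depth-limited DFS, and iterative deepening on top of it.

def depth_limited(graph, node, goal, limit):
    if node == goal:
        return [node]
    if limit == 0:
        return None
    for nxt in graph.get(node, []):
        rest = depth_limited(graph, nxt, goal, limit - 1)
        if rest is not None:
            return [node] + rest
    return None

def iddfs(graph, start, goal, max_depth=10):
    for limit in range(max_depth + 1):
        path = depth_limited(graph, start, goal, limit)
        if path is not None:
            return path
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
```

Iterative deepening repeats shallow work, but on trees the cost is dominated by the deepest level, which is why it combines the memory of DFS with the optimality of BFS on unit-cost problems.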

Class 6 (Sep 11, 2017)

Heuristic search. Best-first search, greedy search, and A* (A-star) search.

Discussion on admissible heuristics, consistent heuristics, and optimality of A*.

Relevant material from the book: Sections 3.4.4 - 3.4.5 and 3.4.7. Further, Sections 3.5 - 3.5.2. Section 3.6 is optional as it has a discussion on heuristics.
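A* can be sketched as follows; the grid world, the 4-connected moves of unit cost, and the Manhattan-distance heuristic are illustrative assumptions, not a specific example from class.

```python
import heapq

# A* on a small grid. Manhattan distance is admissible (and consistent)
# for 4-connected unit-cost moves, so the first time the goal is popped
# from the priority queue, the path found is optimal.

def astar(start, goal, walls, width, height):
    def h(p):  # Manhattan distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # entries: (f, g, node, path)
    best_g = {start: 0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dx, node[1] + dy)
            if not (0 <= nxt[0] < width and 0 <= nxt[1] < height):
                continue
            if nxt in walls:
                continue
            ng = g + 1
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None

# A 4x3 grid with a two-cell wall forcing a detour.
path = astar((0, 0), (3, 0), {(1, 0), (1, 1)}, 4, 3)
```

Greedy best-first search would order the queue by h alone, and uniform-cost search by g alone; A* combines the two.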

Class 7 (Sep 13, 2017)

Take-home midterm exam.

Class 8 (Sep 18, 2017)

Local search methods.

Random walk and hill climbing. One vs several starting points.

Simulated annealing: using temperature to shift between random walk and hill climbing.

May discuss introductory notions on genetic algorithms or other models based on evolution.

Relevant material from the book: Chapter 4 up to Section 4.1.3. Note that Section 4.1.3 is about local beam search and contains an interesting optional discussion of that method.
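A minimal sketch of simulated annealing, assuming a toy one-dimensional objective and a geometric cooling schedule (both made up for illustration):

```python
import math
import random

# Simulated annealing for maximization. At high temperature the walk is
# nearly random; as the temperature cools, it behaves like hill climbing.

def simulated_annealing(objective, x0, steps=5000, t0=2.0, seed=0):
    rng = random.Random(seed)
    x = x0
    best = x
    for k in range(steps):
        t = t0 * (0.999 ** k)  # geometric cooling schedule
        candidate = x + rng.uniform(-0.5, 0.5)
        delta = objective(candidate) - objective(x)
        # Always accept improvements; accept worsening moves with
        # probability exp(delta / t), which shrinks as t cools.
        if delta > 0 or rng.random() < math.exp(delta / max(t, 1e-9)):
            x = candidate
        if objective(x) > objective(best):
            best = x
    return best

# Maximize a bumpy function whose global maximum is at x = 0.
f = lambda x: -x * x + 0.5 * math.cos(5 * x)
x_best = simulated_annealing(f, x0=4.0)
```

Restarting the same procedure from several starting points is the one-vs-several-starting-points idea from class applied verbatim.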

Class 9 (Sep 20, 2017)

A glimpse at evolutionary algorithms.

Class 10 (Sep 25, 2017)

Second homework assignment announced through Collab; due: Monday, October 9, 2017.

Genetic algorithms: basic characteristics.

Mutation and recombination operators.

Applications to the N-queens problem.

Relevant material from the book: Section 4.1.4.
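The genetic-algorithm ingredients for N-queens can be sketched as follows. The board encoding (one queen per column, entry i giving the row of the queen in column i) follows the textbook; the specific operators shown are just one common choice.

```python
import random

# Fitness counts non-attacking pairs of queens; a solution for n queens
# attains the maximum of n*(n-1)//2.

def fitness(board):
    n = len(board)
    ok = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = board[i] == board[j]
            same_diag = abs(board[i] - board[j]) == j - i
            if not same_row and not same_diag:
                ok += 1
    return ok

def crossover(a, b, rng):
    # Single-point recombination: prefix of one parent, suffix of the other.
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(board, rng):
    # Move the queen in a random column to a random row.
    child = list(board)
    child[rng.randrange(len(child))] = rng.randrange(len(child))
    return child

solution = [1, 3, 5, 0, 2, 4]  # a known 6-queens solution
```

A full GA would wrap these in a loop with fitness-proportional selection; these three pieces are the operators discussed in class.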

Introduction to search problems that are related to games.

Minimax and alpha-beta pruning.

Relevant material from the book: Sections 5.1 and 5.2.

Optional reading: Artificial Intelligence Poised to Ride a New Wave, by Gary Anthes.
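Minimax with alpha-beta pruning can be sketched on an explicit game tree; the three-branch tree below is the usual textbook shape, not a position from an actual game.

```python
# Minimax with alpha-beta pruning. Leaves are numbers (static evaluations);
# internal nodes are lists of children.

def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    if not isinstance(node, list):  # leaf: return its evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: MIN will never allow this branch
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break  # alpha cutoff: MAX already has something better
        return value

# MAX to move at the root, three MIN nodes below; the minimax value is 3.
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
```

Pruning never changes the value returned at the root; it only avoids exploring branches that provably cannot influence it.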

Class 11 (Sep 27, 2017)

Continuation of our discussion on games: using heuristics to deal with large state spaces and the (finite) horizon problem.

Review on Probability Theory and introduction to Bayesian networks.

Relevant material from the book: Sections 5.2, 5.3, and 5.4. We also went briefly through Section 5.5. Finally, Section 5.7 has an interesting discussion on state-of-the-art game programs. Note that AlphaGo (see the relevant article from Science) is missing from the list in Section 5.7, as that feat was accomplished only last year.

If you need to refresh your probability theory, Chapter 13 has a review.

For Bayesian networks: Sections 14.1, 14.2, and 14.3 (discrete variables).

Finally, note that a brief summary of Bayesian Networks is given by the additional book chapter Bayesian Networks by Irad Ben-Gal (available through Collab).

Class 12 (Oct 2, 2017)

No class; reading days.

Class 13 (Oct 4, 2017)

Inference with Bayesian networks.

Relevant material from the book: Sections 14.4.1 (exact inference by enumeration), 14.4.3, and 14.5.1 (approximate inference by direct sampling, rejection sampling, and likelihood weighting).
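Rejection sampling can be sketched on a two-node network; the Rain -> WetGrass structure and its probabilities are made up for illustration, and the exact answer from enumeration is computed alongside for comparison.

```python
import random

# Approximate inference by rejection sampling: draw full samples from the
# network, throw away those inconsistent with the evidence, and estimate
# P(Rain | WetGrass) from the samples that remain.

P_RAIN = 0.2
P_WET_GIVEN_RAIN = 0.9
P_WET_GIVEN_DRY = 0.1

def sample_once(rng):
    rain = rng.random() < P_RAIN
    wet = rng.random() < (P_WET_GIVEN_RAIN if rain else P_WET_GIVEN_DRY)
    return rain, wet

def estimate_rain_given_wet(n, seed=0):
    rng = random.Random(seed)
    kept = rainy = 0
    for _ in range(n):
        rain, wet = sample_once(rng)
        if wet:              # reject samples inconsistent with the evidence
            kept += 1
            rainy += rain
    return rainy / kept

# Exact answer by enumeration: 0.18 / (0.18 + 0.08).
exact = (P_RAIN * P_WET_GIVEN_RAIN) / (
    P_RAIN * P_WET_GIVEN_RAIN + (1 - P_RAIN) * P_WET_GIVEN_DRY)
```

The weakness discussed in class is visible here: most samples are rejected when the evidence is unlikely, which is what likelihood weighting fixes.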

Introduction to Markov Chains.

Class 14 (Oct 9, 2017)

Markov chains, Markov Decision Processes (MDPs), Value Iteration.

Relevant material from the book: Chapter 17 up to Section 17.2.2.
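Value iteration can be sketched as follows; the three-state MDP is made up for illustration, with transitions[s][a] holding (probability, next_state, reward) triples.

```python
# Value iteration: repeatedly apply the Bellman optimality update
#   V(s) <- max_a sum_{s'} P(s'|s,a) [ R + gamma * V(s') ]
# until the largest change falls below a threshold.

def value_iteration(transitions, gamma=0.9, eps=1e-6):
    V = {s: 0.0 for s in transitions}
    while True:
        delta = 0.0
        for s, actions in transitions.items():
            if not actions:        # terminal state: value stays 0
                continue
            best = max(
                sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for outcomes in actions.values()
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

# A two-state chain plus a terminal state: "go" advances with probability
# 0.8 and stays put otherwise; reaching state 2 pays reward 1 and ends.
mdp = {
    0: {"go": [(0.8, 1, 0.0), (0.2, 0, 0.0)]},
    1: {"go": [(0.8, 2, 1.0), (0.2, 1, 0.0)]},
    2: {},
}
V = value_iteration(mdp)
```

For this chain the fixed point is easy to check by hand: V(1) = 0.8/0.82 and V(0) = 0.72 V(1)/0.82.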

Class 15 (Oct 11, 2017)

Review for midterm.

Class 16 (Oct 16, 2017)

In-class midterm.

Class 17 (Oct 18, 2017)

Discussion of a probably (exactly) correct solution for the robot finding its way out of the maze in the second homework assignment. Tools discussed and used: the union bound, and Markov's and Chebyshev's inequalities. The analysis can serve as an example of the kind of results that we will be looking at in the CS6501 Learning Theory course.

Relevant material available through Collab under Resources / Readings (IntroLearning.pdf).

About two months ago I realized that some closely related papers from the past use essentially our techniques to solve very similar problems, and in one case an identical problem, where a robot tries to find its way out of a maze when its sensors are faulty. The notes have been updated to include these references.

Optional reading (solving the problem as in the homework): Inferring Finite Automata with Stochastic Output Functions and an Application to Map Learning, by Thomas L. Dean, Dana Angluin, Kenneth Basye, Sean P. Engelson, Leslie Pack Kaelbling, Evangelos Kokkevis, and Oded Maron.

Optional reading (a noisy framework for membership and equivalence queries): On Learning from Queries and Counterexamples in the Presence of Noise, by Yasubumi Sakakibara.

Optional reading (an overview of some noise frameworks within the PAC model of learning): Four Types of Noise in Data for PAC Learning, by Robert H. Sloan.

Class 18 (Oct 23, 2017)

Finished the discussion on the solution of the second homework in the case with uncertainty.

Went through the solutions of some of the exercises from the second midterm.

Class 19 (Oct 25, 2017)

Finished the discussion on the solutions for the second midterm.

Partially Observable Markov Decision Processes (POMDPs) and Value Iteration.

Tutorial online: POMDPs for Dummies (up to General Form of a POMDP solution)

Class 20 (Oct 30, 2017)

Introduction to machine learning.

No free-lunch theorems.

Version spaces.

Optional reading: A Few Useful Things to Know about Machine Learning, by Pedro Domingos.

Class 21 (Nov 1, 2017)

Candidate-elimination algorithm on version spaces (using conjunctions).

k-nearest neighbors.

Relevant material: Chapter 8 up until Section 8.2.3 from Tom Mitchell's book Machine Learning.
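k-nearest neighbors fits in a few lines; the two-dimensional data set below is made up for illustration, and plain squared Euclidean distance is assumed.

```python
from collections import Counter

# k-NN classification: find the k training points closest to the query
# and return the majority label among them.

def knn_predict(train, query, k=3):
    # train: list of ((x, y), label) pairs
    dist = lambda p: (p[0][0] - query[0]) ** 2 + (p[0][1] - query[1]) ** 2
    nearest = sorted(train, key=dist)[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
```

There is no training phase at all; all the work happens at query time, which is why k-NN is the standard example of a lazy (instance-based) learner.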

Class 22 (Nov 6, 2017)

Decision trees.

Class 23 (Nov 8, 2017)

Overfitting and validation.

Naive Bayes.
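A sketch of multinomial Naive Bayes with add-one (Laplace) smoothing, on a made-up spam/ham data set; both the data and the smoothing choice are illustrative assumptions.

```python
import math
from collections import Counter, defaultdict

# Training just counts: class frequencies and per-class word frequencies.

def train_nb(docs):
    # docs: list of (list_of_words, label) pairs
    class_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for words, label in docs:
        word_counts[label].update(words)
        vocab.update(words)
    return class_counts, word_counts, vocab

# Prediction sums log prior and smoothed log likelihoods per class.

def predict_nb(model, words):
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    scores = {}
    for label, c in class_counts.items():
        score = math.log(c / total)  # log prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            score += math.log((word_counts[label][w] + 1) / denom)
        scores[label] = score
    return max(scores, key=scores.get)

docs = [(["win", "money", "now"], "spam"),
        (["cheap", "money", "offer"], "spam"),
        (["meeting", "schedule", "today"], "ham"),
        (["lunch", "today", "meeting"], "ham")]
model = train_nb(docs)
```

Working in log space avoids underflow, and add-one smoothing keeps unseen words from zeroing out an entire class.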

Class 24 (Nov 13, 2017)

Finished our discussion on Naive Bayes.

Introduction to neural networks. Discussed perceptrons (thresholded units) and unthresholded units.
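The perceptron learning rule for a thresholded unit can be sketched as follows, trained here on the linearly separable AND function; the data set and learning rate are illustrative choices.

```python
# Perceptron: a thresholded linear unit trained by the mistake-driven rule
#   w <- w + lr * (target - output) * x,   b <- b + lr * (target - output).
# Weights change only on misclassified examples.

def train_perceptron(data, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

Since AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches a separating boundary; for XOR it would cycle forever, which motivates multi-layer networks.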

Class 25 (Nov 15, 2017)

Conclude the discussion on neural networks.

Brief discussion on the contents of the third midterm.

Class 26 (Nov 20, 2017)

Third midterm (take-home).

Class 27 (Nov 22, 2017)

No class; Thanksgiving break.

Classes 28-29 (Nov 27 and Nov 29, 2017)

Other topics in AI. Some possibilities are the following.

Regression.

Clustering.

More on the theory of machine learning.

Multi-agent systems.

Class 30 (Dec 4, 2017)

Last class. We may use some of the time, say the first 20 minutes, to finish with the material.

Overview of the course and format of the final (similar to the midterms).

Ask me anything.

Class 31 (Tue, Dec 12, 2017)

Final exam, in-class, 2pm-5pm.

[Class Log] [Table of Contents] [Top]