Artificial Intelligence

CSCI 440, Fall Semester, 2016

 


 

Instructor:                 Dr. Shieu-Hong Lin

Class:                        MW 1:30-2:45 pm at LIB 141

 

Office Hours:             M-Th 8:30-10:30 am and MW 11:30 am-1:00 pm, Math & CS Department, Grove 8

Contact Dr. Lin to set up an appointment in advance

 

TAs: Alvin Suh, William Tan            

 

Course Syllabus

 

Submission of all your work: go to Biola Canvas

Your grades: see them under Biola Canvas

 

*************************************************************************************************

 

Week 1. An Introduction to Hidden Markov Models (HMMs) and Bayesian Networks as Probabilistic Models for Reasoning

 

Reading 1: Report due Wednesday, Aug. 31.

 

Lab #1: Report due Wednesday, Aug. 31.

How intelligently can a computer interact with people, given a collection of historical data about past interactions? (i) Play rock-paper-scissors online against the computer in the novice mode for at least 50 matches. Record the actions of both players in each match. What percentage of the matches did you win? (ii) Play again with the computer set to the veteran mode for at least another 50 matches. Record the actions of both players in each match. What percentage of the matches did you win? (iii) Submission: Record your findings from the two steps above in a Word document, together with an approach describing how you might develop a program that plays the game as intelligently as the online program (see the sketch below for one possible starting point). Upload this file under Canvas.
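
For part (iii), one plausible starting point is a minimal sketch (not the required solution; the function name and strategy here are illustrative assumptions) that predicts the opponent's next move from the empirical frequencies of their past moves and then plays the counter-move:

    import random
    from collections import Counter

    # Each move is beaten by the value mapped to it.
    BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

    def counter_move(opponent_history):
        """Predict the opponent's most frequent move so far and
        return the move that beats that prediction."""
        if not opponent_history:
            return random.choice(list(BEATS))
        predicted = Counter(opponent_history).most_common(1)[0][0]
        return BEATS[predicted]

    # Example: the opponent has favored "rock", so play "paper".
    print(counter_move(["rock", "rock", "scissors", "rock"]))  # paper

A natural refinement, closer to the HMM theme of Week 1, is to condition the prediction on the opponent's previous move (a first-order Markov model of their behavior) rather than on raw frequencies.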

 

*************************************************************************************************

 

Week 2. Applications of HMMs: From Spelling Recognition to Speech Recognition  |  Overview of Machine Learning

 

Reading 2: Report due Wednesday, Sept. 7.

 

Homework #1 (first attempt on spelling recognition): due Wednesday, Sept. 7.

 

Programming #1A: Wednesday, Sept. 7.

 

*************************************************************************************************

 

Week 3. Formulation of Hidden Markov Models (HMMs)   |  Ingredients of Machine Learning I

 

Reading 3: Wednesday, Sept. 14.

 

Programming #1B: Wednesday, Sept. 14.

 

*************************************************************************************************

 

Week 4. Formulation of Hidden Markov Models (HMMs)  |  Ingredients of Machine Learning II

 

Reading 4: Wednesday, Sept. 21.

 

Programming #2A: due Wednesday, Sept. 21.

 

 

*************************************************************************************************

 

Week 5. Simulation Using Hidden Markov Models (HMMs)  |  Binary Classification in Machine Learning

 

Reading 5: Wednesday, Sept. 28.

 

 

Programming #2B: due Wednesday, Sept. 28.

 

 

Lab #2 (WEKA for machine learning / data mining): Wednesday, Sept. 28.

 

*************************************************************************************************

 

Week 6. Probabilistic Reasoning Using Hidden Markov Models (HMMs) I |  More on Classification in Machine Learning

 

Reading 6: Wednesday, Oct. 5.

 

 

Programming #2C: due Wednesday, Oct. 5.

 

Homework #2A: due Wednesday, Oct. 5.

On the brute-force enumeration algorithm for probabilistic reasoning
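
As a concrete illustration, here is a minimal sketch of brute-force enumeration on a hypothetical two-state HMM (the states, observations, and probabilities below are made-up toy values): it sums P(path, observations) over every possible hidden-state path, which takes time exponential in the sequence length.

    from itertools import product

    # Toy two-state HMM; all numbers are illustrative assumptions.
    states = ("A", "B")
    start = {"A": 0.6, "B": 0.4}                                    # P(s1)
    trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}  # P(st | st-1)
    emit  = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}  # P(ot | st)

    def prob_obs_brute_force(obs):
        """P(obs) as a sum of P(path, obs) over all |states|^T hidden paths."""
        total = 0.0
        for path in product(states, repeat=len(obs)):
            p = start[path[0]] * emit[path[0]][obs[0]]
            for t in range(1, len(obs)):
                p *= trans[path[t - 1]][path[t]] * emit[path[t]][obs[t]]
            total += p
        return total

    print(prob_obs_brute_force(["x", "y", "x"]))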

 

 

*************************************************************************************************

 

Week 7. Probabilistic Reasoning Using Hidden Markov Models (HMMs) II  |  Tree Models

 

Reading 7: Wednesday, Oct. 12.

 

Homework #2B (updated 3:30 pm, Monday Oct. 10): due Wednesday, Oct. 12.

On the forward algorithm for probabilistic reasoning
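
For comparison with the Homework #2A approach, here is a minimal sketch of the forward algorithm on the same kind of toy HMM (the tables are made-up values): it computes the same P(obs) in time linear in the sequence length by caching the alpha values.

    # Toy two-state HMM; all numbers are illustrative assumptions.
    states = ("A", "B")
    start = {"A": 0.6, "B": 0.4}
    trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
    emit  = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}

    def forward(obs):
        """alpha[t][s] = P(o_1..o_t, S_t = s), filled in left to right."""
        alpha = [{s: start[s] * emit[s][obs[0]] for s in states}]
        for t in range(1, len(obs)):
            alpha.append({s: emit[s][obs[t]] *
                          sum(alpha[t - 1][r] * trans[r][s] for r in states)
                          for s in states})
        return alpha

    alpha = forward(["x", "y", "x"])
    print(sum(alpha[-1].values()))  # P(obs): matches brute-force enumeration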

 

 

*************************************************************************************************

 

Week 8. Test #1 | Torrey Conference

 

Test #1 open-book test on HMMs: Monday, Oct. 17.

 

 

 

*************************************************************************************************

 

Week 9. Probabilistic Reasoning Using Hidden Markov Models (HMMs) III  |  Rule Models

 

Reading 9: Wednesday, Oct. 26.

 

Programming #3A: due Wednesday, Oct. 26.

 

*************************************************************************************************

 

Week 10. Probabilistic Reasoning Using Hidden Markov Models (HMMs) IV  |  Distance-Based Models

 

Reading 10: Wednesday, Nov. 2.

 

Homework #2C: due Wednesday, Nov. 2.

More practice on the forward algorithm for probabilistic reasoning

 

 

*************************************************************************************************

 

Week 11. Probabilistic Reasoning Using Hidden Markov Models (HMMs) V  |  Probabilistic Models

 

Reading 11: Wednesday, Nov. 9.

 

 

Programming #3B: due Wednesday, Nov. 9.

 

About the real authors of documents A to H in Homework#1 and Experiment 3B for Programming #3B.

 

Lab #3 (download the new dataset on Nov. 7): due Wednesday, Nov. 9.

1)    Knowing more about the WEKA explorer: Read the manual of WEKA 3.7.10 to see (i) how you can open datasets in CSV (comma-separated values) files and save them in ARFF format for classification tasks, and (ii) how you can apply classifiers to datasets to learn predictive models and run cross-validation experiments. Read the description of cross validation here.

2)    Getting the retention datasets and signing the agreement: Log into Canvas to download 2016_Project.zip under Files (a new version was uploaded on Nov. 7). Unzip the file and explore the contents. First of all, carefully read the enclosed confidentiality agreement and sign it before you use the retention datasets for this lab assignment. Copy and paste the agreement and your signature into your report.

3)    Learning tree models using J48: Use WEKA to apply the J48 method (under decision trees) to each of the training datasets in the Data folder separately, learning decision trees as predictive models. Put the resulting decision trees in your report.

4)    Cross-validation experiments using J48: Repeat (3) and conduct 10-fold cross-validation experiments accordingly. Based on the results of the cross-validation experiments, put the expected precision and recall of the prediction model from (3) in your report (see the sketch after this list).

5)    Cross-validation experiments using IBk: Instead of the J48 classifier under decision trees, use the IBk classifier under lazy and repeat (4), conducting 10-fold cross-validation experiments accordingly. Based on the results of the cross-validation experiments, put the expected precision and recall of the IBk prediction model in your report.

6)    Submission: Upload your report for Lab #3 with the results from Step 2 to Step 5 under Canvas.
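
For intuition about steps 4 and 5, here is a hedged sketch of the same comparison outside WEKA, using scikit-learn: DecisionTreeClassifier stands in for J48 (a C4.5-style learner) and KNeighborsClassifier for IBk. The CSV file name and the "retained" label column are illustrative assumptions, and the lab itself should still be done in WEKA.

    import pandas as pd
    from sklearn.model_selection import cross_validate
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical file and label column; assumes a binary 0/1 label.
    df = pd.read_csv("retention_training.csv")
    X, y = df.drop(columns=["retained"]), df["retained"]

    for name, clf in [("J48-like decision tree", DecisionTreeClassifier()),
                      ("IBk-like 1-NN", KNeighborsClassifier(n_neighbors=1))]:
        scores = cross_validate(clf, X, y, cv=10,
                                scoring=("precision", "recall"))
        print(name,
              "precision:", scores["test_precision"].mean(),
              "recall:", scores["test_recall"].mean())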

 

 

Homework #3A (updated Nov. 2): due Wednesday, Nov. 9.

On the backward algorithm for probabilistic reasoning
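
A matching sketch of the backward algorithm on the same style of toy HMM (made-up tables again): the beta values are filled in right to left, and P(obs) can be recovered from the first column as a consistency check.

    # Toy two-state HMM; all numbers are illustrative assumptions.
    states = ("A", "B")
    start = {"A": 0.6, "B": 0.4}
    trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
    emit  = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}

    def backward(obs):
        """beta[t][s] = P(o_{t+1}..o_T | S_t = s), filled in right to left."""
        T = len(obs)
        beta = [None] * T
        beta[T - 1] = {s: 1.0 for s in states}
        for t in range(T - 2, -1, -1):
            beta[t] = {s: sum(trans[s][r] * emit[r][obs[t + 1]] * beta[t + 1][r]
                              for r in states)
                       for s in states}
        return beta

    obs = ["x", "y", "x"]
    beta = backward(obs)
    # Consistency check: P(obs) = sum_s P(S_1 = s) P(o_1 | s) beta[0][s]
    print(sum(start[s] * emit[s][obs[0]] * beta[0][s] for s in states))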

 

 

*************************************************************************************************

 

Week 12. Probabilistic Reasoning Using Hidden Markov Models (HMMs) VI  |  Features and Transformations of Features

 

Reading 12: Wednesday, Nov. 16.

 

Homework #3B (updated Nov. 14): due Wednesday, Nov. 16.

On the forward-backward algorithm for probabilistic reasoning using α values, β values, and γ values
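
Combining the two gives the γ values, the posterior probability of each hidden state at each time step. A minimal sketch (it reuses the forward() and backward() functions and the toy HMM tables from the Homework #2B and #3A sketches above, so it is not self-contained on its own):

    def gammas(obs):
        """gamma[t][s] = P(S_t = s | obs) = alpha[t][s] * beta[t][s] / P(obs).
        Assumes forward() and backward() from the sketches above."""
        alpha, beta = forward(obs), backward(obs)
        p_obs = sum(alpha[-1].values())
        return [{s: alpha[t][s] * beta[t][s] / p_obs for s in states}
                for t in range(len(obs))]

    for t, g in enumerate(gammas(["x", "y", "x"])):
        print(t, g)  # each row of gamma values sums to 1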

 

 

*************************************************************************************************

 

Week 13. Probabilistic Reasoning Using Hidden Markov Models (HMMs) VII  |  Linear Models

 

Reading 13: Wednesday, Nov. 23.

 

 

Programming #4A: Wednesday, Nov. 23.

 

Note: About the real authors of documents A to H in Homework#1 and Experiment 3B for Programming #3B.

 

Homework #3C: Wednesday, Nov. 23.

On the Viterbi algorithm for probabilistic reasoning using δ values
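
A minimal Viterbi sketch on the same style of toy HMM (made-up tables): the δ values carry the probability of the best path ending in each state, and back-pointers recover the most likely hidden-state sequence.

    # Toy two-state HMM; all numbers are illustrative assumptions.
    states = ("A", "B")
    start = {"A": 0.6, "B": 0.4}
    trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
    emit  = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}

    def viterbi(obs):
        """delta[t][s] = max over paths ending in s of P(path, o_1..o_t)."""
        delta = [{s: start[s] * emit[s][obs[0]] for s in states}]
        back = [{}]
        for t in range(1, len(obs)):
            delta.append({})
            back.append({})
            for s in states:
                best = max(states, key=lambda r: delta[t - 1][r] * trans[r][s])
                delta[t][s] = delta[t - 1][best] * trans[best][s] * emit[s][obs[t]]
                back[t][s] = best
        # Trace the back-pointers to recover the most likely path.
        last = max(states, key=lambda s: delta[-1][s])
        path = [last]
        for t in range(len(obs) - 1, 0, -1):
            path.append(back[t][path[-1]])
        return list(reversed(path))

    print(viterbi(["x", "y", "x"]))  # most likely hidden-state sequence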

 

Homework#4 (Naïve Bayes classification): Wednesday, Nov. 23.
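
For orientation on Homework #4, a minimal Naïve Bayes sketch (the toy data, names, and add-one smoothing details are illustrative assumptions, not the assignment's required format):

    import math
    from collections import Counter, defaultdict

    def train_nb(examples):
        """examples: list of (features, label) pairs; returns count tables."""
        label_counts = Counter(label for _, label in examples)
        feat_counts = defaultdict(Counter)  # feat_counts[label][(i, value)]
        for feats, label in examples:
            for i, v in enumerate(feats):
                feat_counts[label][(i, v)] += 1
        return label_counts, feat_counts

    def predict_nb(label_counts, feat_counts, feats):
        """argmax over labels of log P(label) + sum_i log P(feat_i | label),
        with add-one smoothing (+2 in the denominator assumes two possible
        values per feature)."""
        n = sum(label_counts.values())
        best, best_score = None, float("-inf")
        for label, c in label_counts.items():
            score = math.log(c / n)
            for i, v in enumerate(feats):
                score += math.log((feat_counts[label][(i, v)] + 1) / (c + 2))
            if score > best_score:
                best, best_score = label, score
        return best

    data = [(("sunny", "hot"), "no"), (("rainy", "cool"), "yes"),
            (("sunny", "cool"), "yes"), (("sunny", "hot"), "no")]
    model = train_nb(data)
    print(predict_nb(*model, ("sunny", "cool")))  # yes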

 

 

*************************************************************************************************

 

Week 14. Probabilistic Reasoning Using Hidden Markov Models (HMMs) VIII  |  Machine Learning Experiments

 

No class on Monday, Dec. 5: Dr. Lin is out of town for a conference.

 

Reading 14: Wednesday, Dec. 7.

 

Homework #5 (Linear Regression and Linear Models): due Wednesday, Dec. 7.
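
A worked toy example for the linear-regression part (a minimal sketch; the data points are made up): fitting y ≈ w·x + b for a single feature by the closed-form least-squares solution.

    # Toy data, illustrative only: roughly y = 2x.
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.1, 3.9, 6.2, 8.1]

    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Closed-form least squares for one feature:
    # w = cov(x, y) / var(x), b = mean_y - w * mean_x
    w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - w * mean_x
    print(w, b)  # roughly w = 2.03, b = 0.0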

 

Homework #6 (Decision tree induction based on entropy and information gain): due Wednesday, Dec. 14.
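
A worked sketch for the entropy and information-gain computations (toy labels, illustrative only): the gain of splitting on an attribute is the entropy of the label set minus the size-weighted entropies of the subsets the split induces.

    import math
    from collections import Counter, defaultdict

    def entropy(labels):
        """H(S) = -sum over classes of p * log2(p), from class proportions."""
        n = len(labels)
        return -sum((k / n) * math.log2(k / n)
                    for k in Counter(labels).values())

    def information_gain(values, labels):
        """Gain(S, A) = H(S) - sum_v (|S_v|/|S|) * H(S_v), where values[i]
        is attribute A's value on example i and labels[i] its class."""
        subsets = defaultdict(list)
        for v, y in zip(values, labels):
            subsets[v].append(y)
        n = len(labels)
        remainder = sum(len(sub) / n * entropy(sub)
                        for sub in subsets.values())
        return entropy(labels) - remainder

    # Toy check: a perfectly predictive attribute has gain equal to H(S).
    labels = ["yes", "yes", "no", "no"]
    print(information_gain(["a", "a", "b", "b"], labels))  # 1.0
    print(information_gain(["a", "b", "a", "b"], labels))  # 0.0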

 

Programming 4B: Spelling Recognition with Training Data. Due Wednesday, Dec. 14 (submission open till Dec. 19).

-       Updated demo executable: Please download and carefully play with the new demo executable (updated Dec. 14, 2016) for automatic recovery of a message X, described below. The previous version did not correctly rank the 4 most likely candidate words for each corrupted word in descending order of probability; this new version fixes that bug. It also correctly calculates the accuracy rates of the recovered message according to the top-1 list, the top-2 list, the top-3 list, and the top-4 list respectively.

-       Demo executable and the programming task: Please download and carefully play with options L, R, T, and U provided in the new demo executable (updated Dec. 14, 2016). In your Programming #4A you have already implemented option L; now you need to implement the new options R, T, and U so that you can go through the steps to recover the unknown message recorded in messageX.txt from corruptedMessage1.txt and corruptedMessage2.txt, which are the results of Mr. X trying to type that message twice, given that all the original words are in vocabulary.txt. (A sketch of the top-k accuracy bookkeeping follows below.)
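
For concreteness, the top-1 through top-4 accuracy bookkeeping might look like the following sketch (the data and names are illustrative assumptions; each corrupted word is assumed to already have its candidates ranked by descending probability):

    def top_k_accuracy(true_words, ranked_candidates, k):
        """Fraction of positions where the true word appears among the
        first k ranked candidates for the corresponding corrupted word."""
        hits = sum(1 for truth, cands in zip(true_words, ranked_candidates)
                   if truth in cands[:k])
        return hits / len(true_words)

    # Illustrative data: candidates sorted by descending probability.
    truth = ["hello", "world"]
    cands = [["hello", "hallo", "hullo", "cello"],
             ["word", "world", "would", "wild"]]
    for k in range(1, 5):
        print("top-" + str(k), "accuracy:", top_k_accuracy(truth, cands, k))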

 

Final Test (Test #2): Take-home open-book test, due Monday, Dec. 19.

 

*************************************************************************************************

TA Hours:  T/Th 1:00-4:00 pm (Alvin Suh, William Tan), MATH/CS Alcove lab

 

**************************************************************************************************************

 

Links to online resources

 

 


 

 
