Artificial Intelligence

CSCI 440, fall semester, 2018

 

To the Bottom of the Page

 

Instructor:                 Dr. Shieu-Hong Lin (LinEmail)   Course Syllabus

 

Class:                        MW 12:00-1:15 pm in Busn 210

 

Office Hours:             Dr. Lin (Lim 137): MW TR 3:00-5:00 pm; email (LinEmail) to confirm an appointment in advance

 

Submission of all your work: go to Biola Canvas               Your grades: see them under Biola Canvas

 

*************************************************************************************************

 

Week 1. Overview of the Landscape of Machine Learning

 

Reading 1: Report due Wednesday, Sept. 12

 

 

Showcase: Application of Hidden Markov Models (HMMs) as Bayesian Networks

 

Lab #1 Rock-Paper-Scissors: Report due Wednesday, Sept. 12

1.      Collecting data: Download, unzip, and run rock-paper-scissors Agent #1 (or this alternative x64 executable) a few times. On each run, the program asks you to play 100 matches against the agent and writes a transcript file RPS_transcript.txt recording the outcomes of those 100 matches in the same folder. Rename these text transcripts and combine them into a single transcript file of all matches. What percentage of the matches did you win? What percentage did you lose?

2.      Learning from data: Try to learn from the results in the transcript of matches how to improve your chance of winning the game. Then play against Agent #1 again, applying what you learned from the data in Step #1. Record in a Word or text document (i) what you learned from the data and (ii) whether it helped you improve your chance of winning.

3.      Submission of your work: Upload the combined transcript from Step #1 and the document of your thoughts and exploration from Step #2 to Canvas.
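For Step #2, one simple starting point is to tally the agent's move frequencies from the combined transcript and always play the counter-move. The sketch below assumes, hypothetically, that the transcript records the agent's move as a single letter R, P, or S per token; adjust the parsing to match the actual format of RPS_transcript.txt.

```python
from collections import Counter

def count_agent_moves(transcript_path):
    """Tally how often the agent played each move.

    Assumes (hypothetically) that the transcript contains the agent's
    moves as the letters R, P, or S; adapt the parsing to the real
    RPS_transcript.txt layout.
    """
    counts = Counter()
    with open(transcript_path) as f:
        for line in f:
            for token in line.split():
                if token in ("R", "P", "S"):
                    counts[token] += 1
    return counts

def best_response(counts):
    """Play the move that beats the agent's most frequent move."""
    beats = {"R": "P", "P": "S", "S": "R"}
    most_common = counts.most_common(1)[0][0]
    return beats[most_common]
```

A more refined strategy would condition on the previous match (e.g. count which move the agent tends to play after each of its own moves), which connects this lab to the Markov models covered in Weeks 2-4.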

 

 

*************************************************************************************************

 

Week 2. Probabilistic Models for Reasoning: An Introduction to Hidden Markov Models (HMMs) and Bayesian Networks

 

Reading 2: Report due Wednesday, Sept. 19

 

Programming #1A: due Wednesday, Sept. 19.

 

 

*************************************************************************************************

 

Week 3. Hidden Markov Models (HMMs) for Spelling Recognition: Implementation of the keyboard model and the spelling model

 

Reading 3: Report due Wednesday, Sept. 26

 

Programming #1B: due Wednesday, Sept. 26.

 

*************************************************************************************************

 

Week 4. Simulation of Typing using Hidden Markov Models (HMMs) + Basics of Probabilistic Reasoning Using HMMs

 

Reading 4: Wednesday, Oct. 3.

 

Programming #2A: due Wednesday, Oct. 3.

 

Homework #1: due Wednesday, Oct. 3.

 

 

*************************************************************************************************

 

Weeks 5-6. Hidden Markov Models (HMMs): Simulation of Typing + More on Probabilistic Reasoning

 

Reading 5-6: Wednesday, Oct. 17.

 

Programming #2B: due Wednesday, Oct. 17.

 

Lab #2 (Supervised learning for classification using WEKA): Wednesday, Oct. 17.

 

 

*************************************************************************************************

 

Week 7. Supervised Learning: Naïve Bayes Classification | Forward Algorithm for Probabilistic Reasoning on HMMs

 

Reading 7: Report due Wednesday, Oct. 24.

 

Homework #2 (Forward algorithm for solving the first HMM problem): Wednesday, Oct. 24.

On the forward algorithm for probabilistic reasoning
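The forward algorithm computes the probability of an observation sequence under an HMM by summing over all state paths with dynamic programming: alpha[t][s] accumulates the probability of the first t observations ending in state s. A minimal sketch, with illustrative parameter names (not the course's required interface):

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: P(observation sequence | HMM parameters).

    alpha[t][s] = P(o_1..o_t, state_t = s); summing the final column
    over all states gives the total probability of the observations.
    """
    # Base case: start probability times emission of the first symbol.
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    # Recursive case: sum over all predecessor states.
    for t in range(1, len(obs)):
        alpha.append({
            s: emit_p[s][obs[t]]
               * sum(alpha[t - 1][r] * trans_p[r][s] for r in states)
            for s in states
        })
    return sum(alpha[-1][s] for s in states)
```

As a sanity check, summing the returned probability over every possible observation sequence of a fixed length should give exactly 1.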

 

Programming #2C: due Wednesday, Oct. 24.

 

 

*************************************************************************************************

 

Week 8. Supervised Learning: Decision Trees |  Implementation of the Forward Algorithm

 

Reading 8: Wednesday, Oct. 31.

 

Homework #3 (Naïve Bayes classification): Wednesday, Oct. 31.
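Naïve Bayes scores each class by its prior probability times the product of per-attribute likelihoods, assuming attributes are conditionally independent given the class. A minimal sketch for categorical attributes with add-one (Laplace) smoothing; the function names and data layout here are illustrative, not the assignment's required interface:

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(examples):
    """examples: list of (features, label) pairs, where features is a
    dict mapping attribute -> value. Records class counts and per-class
    attribute-value counts; smoothing is applied at prediction time."""
    priors = Counter(label for _, label in examples)
    cond = defaultdict(Counter)   # (label, attr) -> Counter of values
    values = defaultdict(set)     # attr -> set of observed values
    for features, label in examples:
        for attr, val in features.items():
            cond[(label, attr)][val] += 1
            values[attr].add(val)
    return priors, cond, values, len(examples)

def predict(features, model):
    """Return the label maximizing log prior + sum of log likelihoods,
    with add-one smoothing to avoid zero probabilities."""
    priors, cond, values, n = model
    best_label, best_score = None, float("-inf")
    for label, count in priors.items():
        score = math.log(count / n)
        for attr, val in features.items():
            num = cond[(label, attr)][val] + 1           # Laplace smoothing
            den = count + len(values[attr])
            score += math.log(num / den)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

Working in log space keeps the products of many small probabilities from underflowing.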

 

Programming #3A: due Wednesday, Nov. 7.

 

 

*************************************************************************************************

 

Week 9. Supervised Learning: Basics of Linear Models |  Identity Recognition Based on Typing/Spelling Behaviors

 

Reading 9: Wednesday, Nov. 7.

 

Homework #4 (Decision tree induction based on entropy and information gain): Wednesday, Nov. 7.
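Decision tree induction (e.g. ID3) picks the split attribute that maximizes information gain: the entropy of the class labels minus the expected entropy after splitting on that attribute. A small sketch of both quantities, using hypothetical helper names:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy, in bits, of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(labels).values())

def information_gain(examples, attr):
    """Reduction in label entropy from splitting (features, label)
    examples on attribute attr."""
    labels = [label for _, label in examples]
    n = len(examples)
    # Group the labels by the value of the split attribute.
    groups = {}
    for features, label in examples:
        groups.setdefault(features[attr], []).append(label)
    # Expected entropy after the split, weighted by group size.
    remainder = sum(len(subset) / n * entropy(subset)
                    for subset in groups.values())
    return entropy(labels) - remainder
```

A perfectly separating attribute on a 50/50 binary dataset yields a gain of exactly 1 bit; an uninformative attribute yields a gain of 0.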

 

Programming #3B: due Wednesday, Nov. 14.

 

 

*************************************************************************************************

 

Week 10. Supervised Learning: Support Vector Machines and More on Linear Models |  Identity Recognition Based on Typing/Spelling Behaviors

 

Reading 10: Wednesday, Nov. 14.

 

 

*************************************************************************************************

 

Weeks 11-12.  Neural Networks and Deep Learning  |  Learning HMM Models I

 

Faith and Learning Integration Assignment on Creation and Computer Science due: Monday, Nov. 19

•         Dr. Lin will be out of town for a conference on Nov. 19. Please use the class time for the reflection needed to do this assignment.

•         Write down what you have from the reflection process according to the requirements in the assignment.

•         Submit your reflection report accordingly through Canvas.

 

Reading 11: Wednesday, Nov. 21. (submission open till Nov. 26 without penalty)

 

Homework #5 (Linear regression and linear models): Wednesday, Nov. 21. (submission open till Nov. 26 without penalty)
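For one-dimensional ordinary least squares, the slope is the covariance of x and y divided by the variance of x, and the intercept makes the fitted line pass through the point of means. A minimal sketch of the closed-form fit (illustrative only, not the homework's required method):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = w*x + b in one dimension,
    using the closed-form covariance/variance formulas."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y over variance of x.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    # Intercept: the line passes through (mean_x, mean_y).
    b = mean_y - w * mean_x
    return w, b
```

The same idea generalizes to multiple features via the normal equations, where the slope and intercept become a weight vector.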

 

Reading 12: Wednesday, Nov. 28.

 

Programming #4A: due Wednesday, Nov. 28.

 

 

*************************************************************************************************

Links to online resources

 

*************************************************************************************************

 

 

 

To the Top of the Page                               

 

 
