Q520: Mathematics and Logic for Cognitive Science
Spring 2008, TuTh 11:15A-12:30P in BH 340
Instructor: Larry Moss
Assisted by: Gabi Teodoru
Office: Rawles Hall 323
Office Hours: after class, M 2:30 - 3:30, and W 10:00 - 11:00 AM
Phone: 855-8281
E-mail: lsm@cs.indiana.edu

The syllabus.


Links to sections on various course topics:

Hopfield Nets

Probability theory and Bayesian Nets

Markov Chains and Hidden Markov Models

Markov Decision Processes and Q-Learning

Entropy

Linear Algebra, the Singular Value Decomposition, and Latent Semantic Analysis

Logic


I'll keep class resources on this site. Please note that I frequently make corrections, additions, and changes to the course slides. What I post here is therefore often an improved version of what I presented in class.


My lecture slides on Hopfield nets as a pdf file.

The first homework is here.

Some sources on this topic include the web pages from the Program for Research into Intelligent Systems in Singapore.

Also, a useful book chapter available on the web is Chapter 13 in Raul Rojas' book Neural Networks - A Systematic Introduction. Parts of some of the other chapters will be covered in our course as well.
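
For anyone who wants to experiment on a computer, here is a minimal Python sketch of a binary Hopfield net with Hebbian weights and asynchronous updates. The network size and the stored pattern are invented for illustration; this is not code from the slides or from Rojas' book.

    import numpy as np

    def train(patterns):
        # Hebbian learning: W = (1/n) * sum over stored patterns of p p^T,
        # with a zero diagonal so no unit feeds back to itself.
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0)
        return W

    def recall(W, state, steps=200, seed=0):
        # Asynchronous dynamics: repeatedly pick one unit at random and
        # set it to the sign of its net input.  The energy never increases.
        state = state.copy()
        rng = np.random.default_rng(seed)
        for _ in range(steps):
            i = rng.integers(len(state))
            state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    # Store one 8-unit pattern, then recover it from a corrupted probe.
    p = np.array([1, -1, 1, -1, 1, 1, -1, -1])
    W = train(p[None, :])
    probe = p.copy()
    probe[:2] *= -1                      # flip the first two bits
    print(recall(W, probe))              # prints the stored pattern p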



My first lecture slides on the basics of probability are here.

The first homework set on probability, due on January 24.

My lectures on Bayesian Nets are here. The first homework on this topic is here, due Thursday, January 31. The next homework on this topic is here, due Thursday, February 7.

The homework due February 14 is here. The third lecture set on topics in probability covers variance and related topics. I will also be adding sections on entropy in later lectures.

The probability calculator used in class.

A very nice on-line book on probability, by the late Richard Jeffrey.

A web site called the Layman's Guide to Probability Theory. It has a gentle discussion of probability, along with examples.

For Bayesian nets, the best source on the web for the material is probably the course notes of Kathryn Blackmond Laskey of George Mason University, especially unit 2.

Here is a tutorial web site from www.norsys.com that has the example "Chest Clinic" slides that I used in class. This site doesn't explain the math, but it does give an idea of how Bayesian nets might be used in practice. It also has lots of other examples of Bayesian nets.
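
The Chest Clinic network is too big to work through by hand here, but the same kind of computation can be sketched on a tiny invented net. The following Python sketch does exact inference by enumeration in the familiar Rain/Sprinkler/Wet-grass example; all of the conditional probability tables are made up for illustration.

    # Net: Rain -> Sprinkler, and Rain, Sprinkler -> Wet.
    P_rain = {True: 0.2, False: 0.8}                 # P(R)
    P_sprinkler = {True:  {True: 0.01, False: 0.99}, # P(S | R=True)
                   False: {True: 0.40, False: 0.60}} # P(S | R=False)
    P_wet = {(True, True): 0.99, (True, False): 0.90,
             (False, True): 0.80, (False, False): 0.00}  # P(W=True | S, R)

    def joint(r, s, w):
        # The chain rule for this net: P(r) P(s|r) P(w|s,r).
        pw = P_wet[(s, r)]
        return P_rain[r] * P_sprinkler[r][s] * (pw if w else 1 - pw)

    # P(Rain | Wet) by summing out the hidden variable Sprinkler.
    num = sum(joint(True, s, True) for s in (True, False))
    den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
    print("P(Rain | Wet) =", num / den)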



My lectures on Hidden Markov Models are here.

An old version of my lectures on Markov chains: pdf, to be updated. (The unit on Markov chains will actually come later.)

The main introductory survey article on HMMs, by L. R. Rabiner and B. H. Juang. My lectures are based on their article, and also the survey article on the EM algorithm by Detlef Prescher. What I am trying to do in my lectures is to take that second paper's clear and expansive presentation and then adapt it back to the case of HMMs. I highly recommend Prescher's paper to Linguistics students.

My example write-up of the Baum-Welch algorithm for a small HMM and corpus is here.
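
Baum-Welch is built on the forward (and backward) probabilities, and the forward pass alone is easy to sketch. Here is a minimal Python version for an invented two-state, two-symbol HMM; the matrices are not the ones in the write-up.

    import numpy as np

    pi = np.array([0.6, 0.4])        # pi[i]   = P(start in state i)
    A  = np.array([[0.7, 0.3],       # A[i, j] = P(j at t+1 | i at t)
                   [0.4, 0.6]])
    B  = np.array([[0.5, 0.5],       # B[i, k] = P(emit symbol k | state i)
                   [0.1, 0.9]])

    def forward(obs):
        # alpha[i] = P(observations so far, current state = i).
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()           # P(the whole observation sequence)

    print(forward([0, 1, 1, 0]))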

A nice tutorial web site on HMMs is this one by R.D. Boyle. It has a good treatment of the Viterbi algorithm, for example.
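
In the same spirit, here is a short Python sketch of the Viterbi algorithm on the same kind of invented two-state HMM: dynamic programming in log space, followed by a traceback through the stored backpointers.

    import numpy as np

    pi = np.array([0.6, 0.4])
    A  = np.array([[0.7, 0.3],
                   [0.4, 0.6]])
    B  = np.array([[0.5, 0.5],
                   [0.1, 0.9]])

    def viterbi(obs):
        # delta[i] = log probability of the best path ending in state i.
        delta = np.log(pi) + np.log(B[:, obs[0]])
        back = []
        for o in obs[1:]:
            scores = delta[:, None] + np.log(A)     # scores[i, j]: i -> j
            back.append(scores.argmax(axis=0))      # best predecessor of j
            delta = scores.max(axis=0) + np.log(B[:, o])
        path = [int(delta.argmax())]                # best final state
        for bp in reversed(back):
            path.append(int(bp[path[-1]]))
        return path[::-1]

    print(viterbi([0, 1, 1, 0]))    # most likely state sequence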

There are lots of other HMM resources on the web. If anyone wants to read a textbook presentation, one source is Rabiner and Juang's book on speech recognition.

The homework on HMMs is here, and it is due on February 21.



My notes on Markov Decision Processes are here.

The value iteration handout, based on the Little Prince on a toroidal planet.
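
Value iteration itself fits in a few lines. Here is a Python sketch on an invented one-dimensional version of that world: a ring of cells (so the planet wraps around), deterministic left/right moves, and a single rewarding cell. The numbers are not those of the handout.

    import numpy as np

    n, gamma = 8, 0.9
    reward = np.zeros(n)
    reward[3] = 1.0                  # one rewarding cell on the ring

    V = np.zeros(n)
    for _ in range(100):             # iterate the Bellman optimality update
        left  = reward + gamma * np.roll(V, 1)    # value of moving left
        right = reward + gamma * np.roll(V, -1)   # value of moving right
        V = np.maximum(left, right)
    print(np.round(V, 3))            # values fall off with distance from cell 3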

Andrew Moore's MDP slides, used in class. Here are his slides on Reinforcement Learning.

Notes from a course on Reinforcement Learning given by Professor Yishay Mansour of Tel Aviv University are here. I found these to be very helpful, because the theory is worked out in a clear way.

The book Reinforcement Learning by Richard S. Sutton and Andrew G. Barto.

The online matrix calculator used in class.

Blackjack and reinforcement learning.
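
Since Q-learning is one of the topics of this unit, here is a small tabular Q-learning sketch in Python. It uses the same kind of invented ring world as above, with an epsilon-greedy policy; reaching the goal cell pays 1 and ends the episode.

    import random

    n, goal, gamma, alpha, eps = 8, 3, 0.9, 0.5, 0.1
    Q = {(s, a): 0.0 for s in range(n) for a in (-1, 1)}  # actions: left, right
    random.seed(0)

    for _ in range(2000):                     # episodes from random starts
        s = random.randrange(n)
        while s != goal:
            a = (random.choice((-1, 1)) if random.random() < eps
                 else max((-1, 1), key=lambda b: Q[(s, b)]))
            s2 = (s + a) % n
            r = 1.0 if s2 == goal else 0.0
            best_next = 0.0 if s2 == goal else max(Q[(s2, -1)], Q[(s2, 1)])
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2

    print([round(max(Q[(s, -1)], Q[(s, 1)]), 2) for s in range(n)])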

The homework on this topic, due March 6.



My lectures on entropy: pdf, to be updated.
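
The basic formula is easy to compute with. Here is a small Python sketch of the Shannon entropy H(p) = -sum_i p_i log2 p_i of a finite distribution, with the usual convention that 0 log 0 = 0; the example distributions are invented.

    import math

    def entropy(p):
        # Shannon entropy in bits, with the convention 0 * log 0 = 0.
        return -sum(x * math.log2(x) for x in p if x > 0)

    print(entropy([0.5, 0.5]))       # 1 bit: a fair coin
    print(entropy([0.9, 0.1]))       # about 0.469 bits: a biased coin
    print(entropy([0.25] * 4))       # 2 bits: a fair four-sided die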

A preliminary version of my lectures on linear algebra and the Singular Value Decomposition is here. These were posted on Thursday morning, March 20. I need to fix up the example pertaining to weighted sums and Hopfield-like updates.

The homework on entropy and linear algebra, due March 27.

A discussion of the Google PageRank algorithm that I hope to go over is posted here.
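
The heart of PageRank is a power iteration, which can be sketched in a few lines of Python. The four-page "web" below is invented; d is the usual damping factor.

    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}   # page -> pages it links to
    n, d = 4, 0.85

    # Column-stochastic link matrix: M[j, i] = 1/outdegree(i) if i links to j.
    M = np.zeros((n, n))
    for i, outs in links.items():
        for j in outs:
            M[j, i] = 1 / len(outs)

    r = np.ones(n) / n               # start from the uniform distribution
    for _ in range(100):             # power iteration
        r = (1 - d) / n + d * (M @ r)
    print(np.round(r, 3))            # the ranks sum to 1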

A paper which connects PageRank with human memory is posted here. It is called "Google and the Mind: Predicting Fluency with PageRank", and the authors are Thomas L. Griffiths, Mark Steyvers, and Alana Firl.

The web site by Todd Will on the Singular Value Decomposition is here.

The main website on Latent Semantic Analysis. You might click on "What is LSA?".

One paper by Landauer and Dumais on Latent Semantic Analysis is here. And another introductory paper by Landauer, Foltz, and Laham is posted here as a .ps file.

A paper by Sheldon Axler called Down with Determinants! has a proof of the Spectral Theorem and lots more. If you have seen some linear algebra but not the Spectral Theorem, this would be a good source. If you have not seen any linear algebra, there are lots of books on introductory topics. One that I like is Murray Spiegel's book Linear Algebra in the Schaum's Outline Series. In fact, I review from the book each time I teach linear algebra.

Another resource that covers the linear algebra involved in the SVD is posted here. It's by Jody S. Hourigan and Lynn V. McIndoo.
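
To see the SVD side of LSA concretely, here is a miniature Python sketch: take a small term-by-document count matrix (invented here), compute its SVD, and keep the top k singular values to get the best rank-k approximation, which is what LSA uses as its "semantic" space.

    import numpy as np

    A = np.array([[2, 1, 0, 0],      # rows are terms, columns are documents
                  [1, 2, 0, 0],
                  [0, 0, 1, 2],
                  [0, 0, 2, 1]], dtype=float)

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = 2
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation
    print(np.round(s, 3))            # singular values, largest first
    print(np.round(A_k, 2))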

The second homework covering topics from linear algebra, due April 3.

The final homework covering topics from linear algebra, due April 10. Here's the paper "The $25,000,000,000 Eigenvector: The Linear Algebra Behind Google". I passed out the first six pages with the homework, but in case you need a copy, here it is.



The first part of my lectures on logic: pdf, with the part on natural logic first and the part on the logic of knowledge second.

The first homework on logic is here.

The final lectures on logic are here. These cover default reasoning, systems P and Z, and the preferential and probabilistic semantics. You can also find the lecture here as a ps file. The arrows in the trees go upwards.

A slower presentation of the systems of natural logic may be found here.

Incidentally, the logical systems that we started with are fairly close to the very first works in logic, due to Aristotle. What we studied is essentially the syllogistic logic developed from a more modern point of view. For a good overview of Aristotle's work on logic, see this article by Robin Smith in the Stanford Encyclopedia of Philosophy.
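
The simplest fragment, the logic of "All", can even be run on a computer: from assumptions of the form "All p are q", the sentence "All x are y" is derivable exactly when y is reachable from x in the graph of the assumptions, using reflexivity plus the rule Barbara (transitivity). Here is a small Python sketch; the nouns are invented.

    def follows(assumptions, x, y):
        # Reachability in the graph of "All a are b" assumptions:
        # reflexivity gives x itself, and Barbara gives transitive chains.
        reached, frontier = {x}, [x]
        while frontier:
            p = frontier.pop()
            for (a, b) in assumptions:
                if a == p and b not in reached:
                    reached.add(b)
                    frontier.append(b)
        return y in reached

    gamma = [("sparrows", "birds"), ("birds", "animals")]
    print(follows(gamma, "sparrows", "animals"))   # True, by chaining with Barbara
    print(follows(gamma, "animals", "sparrows"))   # False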

The material on epistemic logic is mostly taken from a paper by Alexandru Baltag and me. You can find a version of it here.

A shameless plug: if you enjoy this subject, you might consider taking Philosophy 550 next spring. The course will be about modal logic, the system related to the examples that we are playing with in class. Feel free to ask me if you are considering that class but have questions.

The final exam is here.